Remote Senior + Data Jobs in Nov 2019

6 remote Senior + Data jobs at companies like Intrinio, Thorn, and Creative Commons; the most recent was posted 20 days ago.



Last 30 days

Intrinio

Senior Ruby Engineer (verified)

🇺🇸 US-only · posted 20 days ago

Tags: dev, ruby, api, data
**Why Work at Intrinio**

We are a fast-paced and well-funded startup creating new technology in the financial data market. Our team is highly experienced and productive. We enjoy working together and advancing in our craft. Our goal is to produce world-class software that will significantly disrupt the world of finance, creating new efficiencies and encouraging innovation by smaller players.

**About the Job**

We are looking for a senior-level Ruby software engineer. In this position, you will be actively contributing to the design and development of our financial data platform and products. If you have the skills and ability to build high-quality, innovative, and fully functional software in line with modern coding standards and solid technical architecture, we want to talk to you. Intrinio is a startup (20+ people), so you should be comfortable working on a small team, moving fast, breaking things, committing code several times a day, and delivering working software weekly.

**Ideal candidates will have several of the following:**
* Mastery of the Ruby programming language and its major frameworks
* Knowledge of data stores and their use cases: SQL databases, Redis, and Elasticsearch
* Experience with API development and usage
* A history of learning new technology stacks and methodologies
* Interest (or experience) in the financial markets
* A track record of public and/or private contributions on GitHub

# Responsibilities
* Write well-designed, testable, documented, and performant code
* Commit code several times a day
* Deliver working features on a weekly basis
* Review the code of, and help to mentor, junior and mid-level developers
* Communicate clearly and promptly with managers and team members (we use a Kanban process with Monday and Slack)

# Requirements
* Significant time working remotely (please do not apply otherwise)
* 5+ years of software engineering experience
* Significant experience developing web applications and APIs
* Strong knowledge of relational databases, SQL, and ORM libraries

# Location
* 🇺🇸 US-only
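The posting emphasizes SQL and API work over relational data. As a minimal sketch of the kind of parameterized lookup an ORM-backed API endpoint performs (in Python with stdlib `sqlite3` for illustration; the schema, tickers, and prices are invented, not Intrinio's actual data model):

```python
import sqlite3

def build_db() -> sqlite3.Connection:
    # Hypothetical schema: daily closing prices per ticker.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE prices (ticker TEXT, day TEXT, close REAL)")
    conn.executemany(
        "INSERT INTO prices VALUES (?, ?, ?)",
        [("AAPL", "2019-11-01", 255.82),
         ("AAPL", "2019-11-04", 257.50),
         ("MSFT", "2019-11-01", 143.72)],
    )
    return conn

def latest_close(conn: sqlite3.Connection, ticker: str) -> float:
    # A parameterized query, the kind an ORM would generate for an
    # API endpoint like GET /prices/<ticker>/latest.
    row = conn.execute(
        "SELECT close FROM prices WHERE ticker = ? ORDER BY day DESC LIMIT 1",
        (ticker,),
    ).fetchone()
    return row[0]

conn = build_db()
print(latest_close(conn, "AAPL"))  # 257.5
```

The parameter placeholder (`?`) rather than string interpolation is what keeps a query like this safe from injection, which is why ORMs generate SQL this way.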

See more jobs at Intrinio

# How do you apply?

Visit https://about.intrinio.com/careers and click "Apply" next to "Senior Software Engineer". Our CTO will reach out to you shortly after for next steps.
Apply for this Job

👉 Please reference you found the job on Remote OK, this helps us get more companies to post here!

When applying for jobs, you should NEVER have to pay to apply. That is a scam! Posts that link to pages with "how to work online" are also scams. Don't use them or pay for them. Also always verify you're actually talking to the company in the job post and not an imposter. Scams in remote work are rampant, be careful! When clicking on the button to apply above, you will leave Remote OK and go to the job application page for that company outside this site. Remote OK accepts no liability or responsibility as a consequence of any reliance upon information on there (external sites) or here.

Thorn

Senior Data Engineer

🇺🇸 US-only · posted 22 days ago

Tags: data, engineering, engineer, work from home
Thorn is a non-profit focused on building technology to defend children from sexual abuse. Working at Thorn gives you the opportunity to apply your skills, expertise, and passions to directly impact the lives of vulnerable and abused children. Our staff solves dynamic, quickly evolving problems with our network of partners from tech companies, NGOs, and law enforcement agencies. If you are able to bring clarity to complexity and lightness to heavy problems, you could be a great fit for our team.

Earlier this year, we took the stage at TED and shared our audacious goal of eliminating child sexual abuse material from the internet. A key aspect of our work is partnering with the National Center for Missing & Exploited Children and building technology to optimize the broader ecosystem combating online child sexual abuse.

**What You'll Do**

* Collaborate with other engineers on your team to build a data pipeline and client application from end to end.
* Prototype, implement, test, deploy, and maintain stable data engineering solutions.
* Work closely with the product manager and engineers to define product requirements.
* Present possible technical solutions to various stakeholders, clearly explaining your decisions and how they address real user needs, incorporating feedback in subsequent iterations.

**What We're Looking For**

* You have a commitment to putting the children we serve at the center of everything you do.
* You have proficient software development knowledge, with experience building, growing, and maintaining a variety of products, and a love for creating elegant applications using modern technologies.
* You're experienced with devops (Docker, AWS, microservices) and can launch and maintain new services.
* You are experienced with distributed data storage systems/formats such as MemSQL, Snowflake, Redshift, Druid, Cassandra, Parquet, etc.
* You have worked with real-time systems using various open source technologies like Spark, MapReduce, NoSQL, Hive, etc.
* You have knowledge of data modeling, data access, and data storage techniques for big data platforms.
* You have an ability and interest in learning new technologies quickly.
* You can work with shifting requirements and collaborate with internal and external stakeholders.
* You have experience prototyping, implementing, testing, and deploying code to production.
* You have a passion for product engineering and an aptitude for working in a collaborative environment, and can demonstrate empathy and strong advocacy for our users while balancing the vision and constraints of engineering.
* You communicate clearly, efficiently, and thoughtfully. We're a highly distributed team, so written communication is crucial, from Slack to pull requests to code reviews.

**Technologies We Use**

*You should have experience with at least a few of these, and a desire and ability to learn the rest.*

* Python
* Elasticsearch / PostgreSQL
* AWS / Terraform
* Docker / Kubernetes
* Node / Typescript

# Salary
100000-150000

# Location
* 🇺🇸 US-only
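The role centers on building data pipelines end to end. As a minimal sketch of the ingest → transform → load shape such pipelines take (pure Python; the record fields and feed names are invented for illustration, since the posting names no concrete schema):

```python
from dataclasses import dataclass

@dataclass
class Record:
    source: str
    payload: dict

def ingest(raw_rows):
    # Extract: wrap raw rows from a hypothetical upstream feed.
    return [Record(source=r["src"], payload=r) for r in raw_rows]

def transform(records):
    # Transform: drop malformed rows, normalize the id field.
    return [
        {"source": rec.source, "id": rec.payload["id"].strip().lower()}
        for rec in records
        if "id" in rec.payload
    ]

def load(rows, sink):
    # Load: append to a sink (a list stands in for a warehouse table).
    sink.extend(rows)
    return len(rows)

sink = []
raw = [{"src": "feed_a", "id": " ABC123 "}, {"src": "feed_b"}]
n = load(transform(ingest(raw)), sink)
print(n, sink)  # 1 [{'source': 'feed_a', 'id': 'abc123'}]
```

In production each stage would be a separate task (e.g. a Spark job or a containerized service), but the separation of concerns is the same: each stage takes plain data in and puts plain data out, which is what makes the pipeline testable stage by stage.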

See more jobs at Thorn

Apply for this Job



This year

Creative Commons

Senior Data Engineer (verified)

🌏 Worldwide · posted 3 months ago

Tags: data, engineering, engineer, senior
Creative Commons is building a "front door" to the growing universe of openly licensed and public domain content through CC Search and the CC Catalog API. The Senior Data Engineer reports to the Director of Engineering and is responsible for CC Catalog, the open source catalog that powers those products. This project will unite billions of records for openly-licensed and public domain works and metadata, across multiple platforms, diverse media types, and a variety of user communities and partners.

**Diversity & inclusion**

We believe that diverse teams build better organizations and better services. Applications from qualified candidates from all backgrounds, including those from under-represented communities, are very welcome. Creative Commons works openly as part of a global community, guided by collaboratively developed codes of conduct and anti-harassment policies.

**Work environment and location**

Creative Commons is a fully-distributed organization: we have no central office. You must have reasonable mobility for travel to twice-annual all-staff meetings and the CC Global Summit (a total of 3 trips per year). We provide a subsidy towards high-speed broadband access. Laptop/desktop computer and necessary resources are supplied.

# Responsibilities
**Primary responsibilities**

Architect, build, and maintain the existing CC Catalog, including:
* Ingesting content from new and existing sources of CC-licensed and public domain works.
* Scaling the catalog to support billions of records and various media types.
* Implementing resilient, distributed data solutions that operate robustly at web scale.
* Automating data pipelines and workflows.
* Collaborating with the Backend Software Engineer and Front End Engineer to support the smooth operation of the CC Catalog API and CC Search.

Augment and improve the metadata associated with content indexed into the catalog using one or more of the following: machine learning, computer vision, OCR, data analysis, web crawling/scraping.

Build an open source community around the CC Catalog, including:
* Restructuring the code and workflows such that community contributors can identify new sources of content and add new data to the catalog.
* Guiding new contributors and potentially participating in projects such as Google Summer of Code as a mentor.
* Writing blog posts, maintaining documentation, reviewing pull requests, and responding to issues from the community.

Collaborate with other outside communities, companies, and institutions to further Creative Commons' mission.

# Requirements
* Demonstrated experience building and deploying large scale data services, including database design and modeling, ETL processing, and performance optimization
* Proficiency with Python
* Proficiency with Apache Spark
* Experience with cloud computing platforms such as AWS
* Experience with Apache Airflow or other workflow management software
* Experience with machine learning or interest in picking it up
* Fluent in English
* Excellent written and verbal communication skills
* Ability to work independently, build good working relationships, and actively communicate, contribute, and speak up in a remote work structure
* Curiosity and a desire to keep learning
* Commitment to consumer privacy and security

Nice to have (but not required):
* Experience with contributing to or maintaining open source software
* Experience with web crawling
* Experience with Docker

# Salary
100000-120000

# Location
* 🌏 Worldwide
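The core of the role is ingesting records from many platforms into one catalog, which in practice means normalizing and deduplicating provider metadata. A minimal sketch of that pattern in Python; the field names and the license-to-URL mapping here are assumptions for illustration, not CC Catalog's actual schema:

```python
# Map short license codes to canonical URLs (illustrative subset).
LICENSE_URLS = {
    "by": "https://creativecommons.org/licenses/by/4.0/",
    "by-sa": "https://creativecommons.org/licenses/by-sa/4.0/",
    "cc0": "https://creativecommons.org/publicdomain/zero/1.0/",
}

def normalize(records):
    seen = set()
    out = []
    for rec in records:
        # A (provider, foreign_id) pair identifies a work uniquely,
        # so repeats within or across ingestion batches are dropped.
        key = (rec["provider"], rec["foreign_id"])
        if key in seen:
            continue
        seen.add(key)
        out.append({
            "provider": rec["provider"],
            "foreign_id": rec["foreign_id"],
            "license_url": LICENSE_URLS[rec["license"].lower()],
        })
    return out

batch = [
    {"provider": "flickr", "foreign_id": "42", "license": "BY"},
    {"provider": "flickr", "foreign_id": "42", "license": "BY"},
    {"provider": "met", "foreign_id": "7", "license": "cc0"},
]
print(len(normalize(batch)))  # 2
```

At catalog scale this logic would run inside a Spark job orchestrated by Airflow rather than a plain loop, but the invariant is the same: every record leaves ingestion with a stable identity and a canonical license URL.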

See more jobs at Creative Commons

# How do you apply?

This job post is older than 30 days and the position is probably filled. Try applying to jobs posted recently instead.
Apply for this Job



Good Eggs

Senior Data Platform Engineer (verified)

🇺🇸 US-only · posted 7 months ago

Tags: data, snowflake, dbt, devops
At Good Eggs, we believe feeding your family well shouldn't come with a trade-off, be it your time, your standards, or your wallet. We're pioneering a new way to fill your fridge, by sourcing the best food from producers we know and trust and bringing it straight to you, all at a price the same as or less than your grocery store.

We run a healthy agile engineering process with:

* pair programming
* test-driven development
* continuous deployment

# Responsibilities
We're looking for a Data Platform Engineer who is interested in a multidisciplinary engineering environment and is excited to support the culture of data alongside a passionate, mission-driven team.

As a Data Platform Engineer, you'll work on ingest, modeling, warehousing, and BI tools, and have significant influence over the tools and processes we deliver to our customers (analysts, engineers, business leaders). We have a modern data platform and a strong team of DevOps Engineers and Full-Stack Data Analysts to collaborate with.

Some of the tech involved:

* custom code written in multiple languages (primarily Node.js/Typescript, but also Python and Go)
* Fivetran & Segment
* Snowflake
* dbt
* Mode Analytics
* a modern, AWS-based, containerized application platform

# Requirements
**Ideal candidates will have:**
* A desire to use their talents to make the world a better place
* 2+ years of agile software development experience, including automated testing and pair programming
* 3+ years of full-time data experience (ETL, warehousing, modeling, supporting analysts)
* An interest in learning and adopting new tools and techniques
* A bachelor's degree in computer science or computer engineering, or equivalent experience

**Experience in some of the following areas:**
* Node.js/Typescript, Go, Python, SQL
* DevOps, cloud infrastructure, developer tools
* Container-based deployments, microservice architecture

**Bonus points for:**
* Previous work experience involving e-commerce, physical operations, finance, or BizOps
* Being data-driven: the ability to get insights from data
* Experience with dimensional modeling and/or BEAM*

# Location
* 🇺🇸 US-only
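The stack above pairs a warehouse (Snowflake) with dbt, whose models are essentially named `SELECT` statements materialized as views or tables. A minimal sketch of that pattern using stdlib `sqlite3` in place of Snowflake; the table and column names are invented for illustration:

```python
import sqlite3

# A raw source table, the kind a loader like Fivetran would land.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_orders (id INTEGER, status TEXT, total REAL)")
conn.executemany(
    "INSERT INTO raw_orders VALUES (?, ?, ?)",
    [(1, "delivered", 42.0), (2, "cancelled", 10.0), (3, "delivered", 8.0)],
)

# A dbt-style model: a SELECT over the raw table, materialized as a view.
conn.execute("""
    CREATE VIEW fct_delivered_revenue AS
    SELECT COUNT(*) AS orders, SUM(total) AS revenue
    FROM raw_orders
    WHERE status = 'delivered'
""")

orders, revenue = conn.execute(
    "SELECT orders, revenue FROM fct_delivered_revenue"
).fetchone()
print(orders, revenue)  # 2 50.0
```

Keeping business logic in SQL models like this, rather than in application code, is what lets analysts and engineers share one version-controlled definition of metrics such as "delivered revenue".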

See more jobs at Good Eggs

# How do you apply?

This job post is older than 30 days and the position is probably filled. Try applying to jobs posted recently instead.
Apply for this Job



Saagie

Senior Back-end Developer (verified)

France or between GMT-1 to GMT+3 · posted 8 months ago

Tags: kotlin, spring, data, governance
We're looking for a senior back-end developer to join our product team!

## Why work at Saagie?
* **Real agile organization**: human before process
* Work on **open source** projects
* Flexible work schedules and **remote work** allowed
* **Conference lover**? We can sponsor you! (Limited to Europe)

## Additional Information
* Location: Rouen or Paris office (France), or anywhere (full-time remote work)
* Contract: permanent
* EU work permit needed

# Responsibilities
You'll be in charge of developing modules and connectors integrated into our product's micro-services architecture (running on Kubernetes), especially our **data governance** and **security** modules. You'll be working in a feature team with direct relations with the SRE team.

Your team is responsible for its own architectural and technological choices, and you are committed to:
* 🤟 **Contribute** to improving the Saagie platform's **quality**
* 🛠 **Improve** its **maintainability**
* 👮 **Guarantee** its operational condition
* 🏭 **Industrialize** your developments so that they are integrated as soon as possible into our daily deliveries to production

# Requirements
* Minimum **5 years' experience** as a back-end developer
* Interested in **data**, big data, privacy, and GDPR
* Skills in **Java** (or Kotlin) development with Spring Boot
* You know how to properly **test** your code
* **Docker** holds no secrets for you
* **Automation** and **continuous integration** are standard practice for you
* You know how, and want, to share your knowledge with your teammates
* Resourceful and **open-minded**: keen to enhance your skills and pick up new tools quickly
* **Autonomous**, but can also work with teammates
* **Pragmatic** and **delivery-oriented**
* Fluent English at minimum (French appreciated)

### Nice to have
Knowledge of:
* **Kubernetes**
* Hadoop
* Front-end development (Angular/React)

# Salary
38K€ - 55K€

# Location
* France or between GMT-1 to GMT+3
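The role combines building connectors for a micro-services platform with a strong emphasis on testing your code. A minimal sketch of why those two go together (the posting's stack is Kotlin/Spring Boot; Python is used here only for illustration, and all names are invented):

```python
from abc import ABC, abstractmethod

class Connector(ABC):
    """The interface a platform module depends on."""
    @abstractmethod
    def fetch(self) -> list: ...

class InMemoryConnector(Connector):
    """A fake connector, the kind you swap in to test module code
    without a live data source."""
    def __init__(self, rows):
        self.rows = rows

    def fetch(self):
        return list(self.rows)

def count_rows(connector: Connector) -> int:
    # Module code depends on the Connector interface, not a concrete
    # backend; that inversion is what makes it unit-testable.
    return len(connector.fetch())

print(count_rows(InMemoryConnector([{"id": 1}, {"id": 2}])))  # 2
```

In Spring Boot the same idea appears as injecting an interface-typed bean and substituting a mock in tests; the design choice, not the language, is what keeps connector-heavy code properly tested.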

See more jobs at Saagie

# How do you apply?

This job post is older than 30 days and the position is probably filled. Try applying to jobs posted recently instead.
Apply for this Job




Synergy Sports Technology

Senior Backend Data Platform Engineer (contract)
**The Company**

Synergy Sports Technology, named by Fast Company as one of the world's top 10 most innovative companies in sports, seeks talented **Senior Backend Data Platform Engineers** to join our team on a long-term contract basis.

This position offers a tremendous opportunity to work with the only company that delivers on-demand professional-level basketball, baseball, and hockey analytics linked to supporting video to nearly 1,500 college, professional, and international teams. Our systems are highly complex and contain petabytes of data and video, requiring extremely talented engineers to maintain the scale and efficiency of its products.

As a member of Synergy's engineering team, you will contribute to the ongoing development of Synergy's revolutionary online sports data and video delivery solutions, building applications such as:
* Client analytic tools
* Video editing and capture tools
* Data logging tools
* Operational game, data, and video pipeline tools
* Backend data and video platforms

Synergy's work environment is geographically distributed, with employees working from home offices. The successful candidate must be comfortable working in a virtual office, using online collaboration tools for all communication and interaction in conversational English. Synergy development staff work in a deadline-oriented, demanding, non-standard environment in which personal initiative and a strong work ethic are rewarded. Good communication skills, self-motivation, and the ability to work effectively with minimal supervision are crucial. Nonstandard working hours may be required, as Synergy operates a 24x7 system for clients, with associated deadlines and requirements.

Pay rate is dependent on experience.

Information for all positions:
* All positions last roughly a year; talented engineers may stay longer, as we keep them on for future projects (contracts renew every year).
* Engineers should be available for phone calls M-F from 7am to 10am Pacific time. There will usually be 1 or 2 phone calls each week, of 30 to 90 minutes each. All other working hours are up to the engineer, balancing communication with their team against personal commitments outside of work.
* Working an average of 40 hours per week is expected, except in rare or temporary circumstances. Each week is flexible: it is fine to work heavier and lighter weeks based on the engineer's preference of when and how to work, but the target average is 40 hours per week.
* No travel is required

# Responsibilities
**Team Objectives**

A candidate joining the Data Platform team can expect to work on the following types of projects:
* Creating internal and external APIs to support both data and video
* Building complex data models supporting the business rules of sports
* Developing algorithms that ingest and transform multiple streams of data, collapsing the data into a single event structure
* Refactoring code to a .NET Core environment
* Scaling out current systems to support new sports
* Building build and test automation systems
* Building complex reporting data structures for analytical systems

# Requirements
**Required Skill Sets**
* NoSQL database (MongoDB preferred)
* C# (latest version, with a preference for .NET Core)
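One responsibility above is collapsing multiple streams of data into a single event structure. A minimal sketch of that merge in Python (the stack here is C#/.NET Core; the event fields and sport-specific stream names are invented for illustration):

```python
import heapq

def collapse(*streams):
    # Each input stream is assumed already sorted by timestamp;
    # heapq.merge yields one time-ordered sequence lazily, without
    # loading every stream into memory at once.
    return list(heapq.merge(*streams, key=lambda e: e["ts"]))

# Two hypothetical per-source event streams from one game.
pitches = [{"ts": 1, "kind": "pitch"}, {"ts": 4, "kind": "pitch"}]
hits = [{"ts": 2, "kind": "hit"}]

timeline = collapse(pitches, hits)
print([e["ts"] for e in timeline])  # [1, 2, 4]
```

The same k-way merge shape applies whatever the storage layer: each source (video markers, logged stats, sensor feeds) contributes a sorted stream, and the collapsed timeline becomes the single event structure downstream analytics consume.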

See more jobs at Synergy Sports Technology

# How do you apply?

This job post is older than 30 days and the position is probably filled. Try applying to jobs posted recently instead.
Apply for this Job

