Remote Engineer + Data Jobs in Nov 2019 📈 Open Startup

8 remote Engineer + Data jobs at companies like Intrinio, Kraken Digital Asset Exchange, and Thorn, last posted 20 days ago

Get an email of all new remote Engineer + Data jobs

Subscribe


👉 Hiring for a remote Engineer + Data position?

Post a Job - $299 on the 🏆 #1 remote jobs board

Last 30 days

Intrinio

Senior Ruby Engineer
verified · 🇺🇸 US-only · posted 20d ago

Tags: dev, ruby, api, data
**Why Work at Intrinio**

We are a fast-paced and well-funded startup creating new technology in the financial data market. Our team is highly experienced and productive. We enjoy working together and advancing in our craft. Our goal is to produce world-class software that will significantly disrupt the world of finance, creating new efficiencies and encouraging innovation by smaller players.

**About the Job**

We are looking for a senior-level Ruby software engineer. In this position, you will be actively contributing to the design and development of our financial data platform and products. If you have the skills and ability to build high-quality, innovative, and fully functional software in line with modern coding standards and solid technical architecture, we want to talk to you. Intrinio is a startup (20+ people), so you should be comfortable working on a small team, moving fast, breaking things, committing code several times a day, and delivering working software weekly.

**Ideal candidates will have several of the following:**
* Mastery of the Ruby programming language and its major frameworks
* Knowledge of data stores and their use cases: SQL databases, Redis, and Elasticsearch
* Experience with API development and usage
* A history of learning new technology stacks and methodologies
* Interest (or experience) in the financial markets
* A track record of public and/or private contributions on GitHub

# Responsibilities
* Write well-designed, testable, documented, and performant code
* Commit code several times a day
* Deliver working features on a weekly basis
* Review the code of junior and mid-level developers and help mentor them
* Communicate clearly and promptly with managers and team members (we use a Kanban process with Monday and Slack)

# Requirements
* Significant time working remotely (please do not apply otherwise)
* 5+ years of software engineering experience
* Significant experience developing web applications and APIs
* Strong knowledge of relational databases, SQL, and ORM libraries

# Location
- 🇺🇸 US-only

See more jobs at Intrinio

# How do you apply?

Visit https://about.intrinio.com/careers and click "Apply" next to "Senior Software Engineer". Our CTO will reach out to you shortly after for next steps.
Apply for this Job

👉 Please mention that you found the job on Remote OK; this helps us get more companies to post here!

When applying for jobs, you should NEVER have to pay to apply. That is a scam! Posts that link to pages with "how to work online" are also scams. Don't use them or pay for them. Also, always verify that you're actually talking to the company in the job post and not an imposter. Scams in remote work are rampant, so be careful! When clicking the button above to apply, you will leave Remote OK and go to that company's job application page outside this site. Remote OK accepts no liability or responsibility as a consequence of any reliance upon information on external sites or here.

Kraken Digital Asset Exchange

Data Engineer - Cryptowatch
North America or Europe · posted 21d ago

Tags: data, engineer, growth, digital nomad
You will be an instrumental part of a small team with a mandate to understand how Cryptowatch visitors and clients use the product. Succeeding in this role requires knowledge of architecting data systems, a deep understanding of how to measure user behavior, and the ability to translate raw data into easy-to-understand dashboards. You will work closely with marketers and product managers on the Growth team to design and build user behavior measurement infrastructure and translate this data into insights. By structuring and helping build measurement pipelines, you'll help the team learn about customers and drive growth. Your work will directly impact the product roadmap and bottom line of the Cryptowatch business.

You will also help establish measurement of key conversion and retention metrics, then use them to identify opportunities for improvement in the product experience. As a full-stack developer passionate about driving towards business goals, you will work up and down the stack and pick up new tools and frameworks quickly.

# Responsibilities
* Design and help implement data pipelines that collect, transform, and curate data to help our team understand user behavior on the site, using data from external tools and internal databases.
* Work with the Cryptowatch Growth team to design lightweight experiments that help us learn about customers and drive key growth metrics.
* Create structure and process around growth experimentation, data collection, and user research from the ground up.
* Work with Business Operations, Strategy, Marketing, and Product to collectively grow our understanding of our customer base.

# Requirements
* 5+ years of work experience in a relevant field (Data Engineer, DW Engineer, Software Engineer, etc.).
* You are comfortable speccing analytics from the end user's dashboard down to the events in our UI.
* You have expertise with the React.JS framework.
* You have experience with Golang and PostgreSQL.
* You are quick to pick up new tools and frameworks.
* You have a strong ability to search a large codebase to find what you're looking for.
* You are able to communicate effectively with businesspeople, designers, developers, marketers, product managers, and the customer.
* You always ask "why?" and love searching for ways to answer your questions quantitatively.
* You are skilled in data visualisation and web analytics tools like Grafana and MixPanel.

# Location
- North America or Europe
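
As a loose illustration of the pipeline work described above (collecting raw UI events and curating them into a warehouse-friendly shape), here is a minimal Python sketch; the event fields, schema, and names are hypothetical, not taken from the job post.

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional

@dataclass
class CuratedEvent:
    """Flattened, typed record ready to load into a warehouse table."""
    user_id: str
    event_name: str
    occurred_at: datetime
    market: Optional[str]
    pair: Optional[str]

def curate(raw: dict) -> CuratedEvent:
    """Transform one raw UI event into the curated schema (fields are assumed)."""
    props = raw.get("props", {})
    return CuratedEvent(
        user_id=raw["user_id"],
        event_name=raw["event"],
        occurred_at=datetime.fromtimestamp(raw["ts"], tz=timezone.utc),
        market=props.get("market"),
        pair=props.get("pair"),
    )

# Hypothetical raw event as it might arrive from a web analytics tool.
raw_event = {
    "user_id": "u_123",
    "event": "chart_zoom",
    "ts": 1574076000,  # Unix seconds
    "props": {"market": "kraken", "pair": "BTC/USD"},
}

print(curate(raw_event))
```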

See more jobs at Kraken Digital Asset Exchange

# How do you apply?

Apply [here](https://jobs.lever.co/kraken).
Apply for this Job

Thorn

Senior Data Engineer
🇺🇸 US-only · posted 22d ago

Tags: data, engineering, engineer, work from home
Thorn is a non-profit focused on building technology to defend children from sexual abuse. Working at Thorn gives you the opportunity to apply your skills, expertise, and passions to directly impact the lives of vulnerable and abused children. Our staff solves dynamic, quickly evolving problems with our network of partners from tech companies, NGOs, and law enforcement agencies. If you are able to bring clarity to complexity and lightness to heavy problems, you could be a great fit for our team.

Earlier this year, we took the stage at TED and shared our audacious goal of eliminating child sexual abuse material from the internet. A key aspect of our work is partnering with the National Center for Missing & Exploited Children and building technology to optimize the broader ecosystem combating online child sexual abuse.

**What You'll Do**

* Collaborate with other engineers on your team to build a data pipeline and client application from end to end.
* Prototype, implement, test, deploy, and maintain stable data engineering solutions.
* Work closely with the product manager and engineers to define product requirements.
* Present possible technical solutions to various stakeholders, clearly explaining your decisions and how they address real user needs, incorporating feedback in subsequent iterations.

**What We're Looking For**

* You have a commitment to putting the children we serve at the center of everything you do.
* You have proficient software development knowledge, with experience building, growing, and maintaining a variety of products, and a love for creating elegant applications using modern technologies.
* You're experienced with devops (Docker, AWS, microservices) and can launch and maintain new services.
* You are experienced with distributed data storage systems/formats such as MemSQL, Snowflake, Redshift, Druid, Cassandra, Parquet, etc.
* You have worked with real-time systems using various open source technologies like Spark, MapReduce, NoSQL, Hive, etc.
* You have knowledge of data modeling, data access, and data storage techniques for big data platforms.
* You have an ability and interest in learning new technologies quickly.
* You can work with shifting requirements and collaborate with internal and external stakeholders.
* You have experience prototyping, implementing, testing, and deploying code to production.
* You have a passion for product engineering and an aptitude for working in a collaborative environment, and you can demonstrate empathy and strong advocacy for our users while balancing the vision and constraints of engineering.
* You communicate clearly, efficiently, and thoughtfully. We're a highly distributed team, so written communication is crucial, from Slack to pull requests to code reviews.

**Technologies We Use**

*You should have experience with at least a few of these, and a desire and ability to learn the rest.*

* Python
* Elasticsearch / PostgreSQL
* AWS / Terraform
* Docker / Kubernetes
* Node / Typescript

# Salary
100000-150000

# Location
- 🇺🇸 US-only
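
The stack above pairs Python with Spark-style batch processing. As a hedged sketch of that kind of work (the S3 paths and column names are assumptions, not Thorn's actual data), a small PySpark aggregation might look like:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Start a local Spark session; in production this would run on a cluster.
spark = SparkSession.builder.appName("example-aggregation").getOrCreate()

# Hypothetical Parquet dataset of ingested reports (path and schema assumed).
reports = spark.read.parquet("s3://example-bucket/reports/")

# Count reports per source per day -- a typical curation step.
daily_counts = (
    reports
    .withColumn("day", F.to_date("created_at"))
    .groupBy("source", "day")
    .agg(F.count("*").alias("n_reports"))
)

daily_counts.write.mode("overwrite").parquet("s3://example-bucket/curated/daily_counts/")
```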

See more jobs at Thorn

Apply for this Job

This year

Doximity

Doximity is transforming the healthcare industry. Our mission is to help doctors be more productive, informed, and connected. As a software engineer focused on our data stack, you'll work within cross-functional delivery teams alongside other engineers, designers, and product managers in building software to help improve healthcare.

Our [team](https://www.doximity.com/about/company#theteam) brings a diverse set of technical and cultural backgrounds, and we like to think pragmatically in choosing the tools most appropriate for the job at hand.

**About Us**
* We rely heavily on Python, Airflow, Spark, MySQL and Snowflake for most of our data pipelines
* We have over 350 private repositories in Github containing our pipelines, our own internal multi-functional tools, and [open-source projects](https://github.com/doximity)
* We have worked as a distributed team for a long time; we're currently [about 65% distributed](https://blog.brunomiranda.com/building-a-distributed-engineering-team-85d281b9b1c)
* Find out more information on the [Doximity engineering blog](https://engineering.doximity.com/)
* Our [company core values](https://work.doximity.com/)
* Our [recruiting process](https://engineering.doximity.com/articles/engineering-recruitment-process-doximity)
* Our [product development cycle](https://engineering.doximity.com/articles/mofo-driven-product-development)
* Our [on-boarding & mentorship process](https://engineering.doximity.com/articles/software-engineering-on-boarding-at-doximity)

**Here's How You Will Make an Impact**

* Collaborate with product managers, data analysts, and data scientists to develop pipelines and ETL tasks in order to facilitate the extraction of insights from data.
* Build, maintain, and scale data pipelines that empower Doximity's products.
* Establish data architecture processes and practices that can be scheduled, automated, replicated, and serve as standards for other teams to leverage.
* Spearhead, plan, and carry out the implementation of solutions while self-managing.

**About You**

* You have at least three years of professional experience developing data processing, enrichment, transformation, and integration solutions
* You are fluent in Python, an expert in SQL, and can script your way around Linux systems with bash
* You are no stranger to data warehousing and designing data models
* Bonus: You have experience building data pipelines with Apache Spark in a multi-database ecosystem
* You are foremost an engineer, making you passionate about high code quality, automated testing, and other engineering best practices
* You have the ability to self-manage, prioritize, and deliver functional solutions
* You possess advanced knowledge of Unix, Git, and AWS tooling
* You agree that concise and effective written and verbal communication is a must for a successful team
* You are able to maintain a minimum of 5 hours overlap with 9:30 AM to 5:30 PM Pacific time
* You can dedicate about 18 days per year to travel to company events

**Benefits**

Doximity has industry-leading benefits. For an updated list, see our careers page.

**More info on Doximity**

We're thrilled to be named the Fastest Growing Company in the Bay Area, and one of Fast Company's Most Innovative Companies. Joining Doximity means being part of an incredibly talented and humble team. We work on amazing products that over 70% of US doctors (and over one million healthcare professionals) use to make their busy lives a little easier. We're driven by the goal of improving inefficiencies in our $3.5 trillion U.S. healthcare system and love creating technology that has a real, meaningful impact on people's lives. To learn more about our team, culture, and users, check out our careers page, company blog, and engineering blog. We're growing steadily, and there are plenty of opportunities for you to make an impact.

*Doximity is proud to be an equal opportunity employer, and committed to providing employment opportunities regardless of race, religious creed, color, national origin, ancestry, physical disability, mental disability, medical condition, genetic information, marital status, sex, gender, gender identity, gender expression, pregnancy, childbirth and breastfeeding, age, sexual orientation, military or veteran status, or any other protected classification. We also consider qualified applicants with criminal histories, consistent with applicable federal, state and local law.*

# Location
- North America
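
Since the stack described above centers on Airflow-orchestrated Python pipelines, here is a minimal sketch of a scheduled ETL task; the DAG id, schedule, and task body are illustrative assumptions (using Airflow 2.x-style imports), not Doximity's actual code.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract_and_load() -> None:
    """Placeholder ETL step: pull rows from a source and load a warehouse table."""
    # A real task would connect to MySQL/Snowflake here; omitted in this sketch.
    print("extracted and loaded")

# A daily pipeline with a single task, purely illustrative.
with DAG(
    dag_id="example_daily_etl",      # hypothetical name
    start_date=datetime(2019, 11, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    PythonOperator(
        task_id="extract_and_load",
        python_callable=extract_and_load,
    )
```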

See more jobs at Doximity

# How do you apply?

This job post is older than 30 days and the position is probably filled. Try applying to jobs posted recently instead.
Apply for this Job

Creative Commons

Senior Data Engineer
verified · 🌏 Worldwide · posted 3mo ago

Tags: data, engineering, engineer, senior
Creative Commons is building a "front door" to the growing universe of openly licensed and public domain content through CC Search and the CC Catalog API. The Senior Data Engineer reports to the Director of Engineering and is responsible for CC Catalog, the open source catalog that powers those products. This project will unite billions of records for openly-licensed and public domain works and metadata, across multiple platforms, diverse media types, and a variety of user communities and partners.

**Diversity & inclusion**

We believe that diverse teams build better organizations and better services. Applications from qualified candidates from all backgrounds, including those from under-represented communities, are very welcome. Creative Commons works openly as part of a global community, guided by collaboratively developed codes of conduct and anti-harassment policies.

**Work environment and location**

Creative Commons is a fully-distributed organization; we have no central office. You must have reasonable mobility for travel to twice-annual all-staff meetings and the CC Global Summit (a total of 3 trips per year). We provide a subsidy towards high-speed broadband access. A laptop/desktop computer and necessary resources are supplied.

# Responsibilities

**Primary responsibilities**

Architect, build, and maintain the existing CC Catalog, including:
* Ingesting content from new and existing sources of CC-licensed and public domain works.
* Scaling the catalog to support billions of records and various media types.
* Implementing resilient, distributed data solutions that operate robustly at web scale.
* Automating data pipelines and workflows.
* Collaborating with the Backend Software Engineer and Front End Engineer to support the smooth operation of the CC Catalog API and CC Search.

Augment and improve the metadata associated with content indexed into the catalog using one or more of the following: machine learning, computer vision, OCR, data analysis, web crawling/scraping.

Build an open source community around the CC Catalog, including:
* Restructuring the code and workflows so that community contributors can identify new sources of content and add new data to the catalog.
* Guiding new contributors and potentially participating in projects such as Google Summer of Code as a mentor.
* Writing blog posts, maintaining documentation, reviewing pull requests, and responding to issues from the community.

Collaborate with other outside communities, companies, and institutions to further Creative Commons' mission.

# Requirements
* Demonstrated experience building and deploying large scale data services, including database design and modeling, ETL processing, and performance optimization
* Proficiency with Python
* Proficiency with Apache Spark
* Experience with cloud computing platforms such as AWS
* Experience with Apache Airflow or other workflow management software
* Experience with machine learning or interest in picking it up
* Fluent in English
* Excellent written and verbal communication skills
* Ability to work independently, build good working relationships, and actively communicate, contribute, and speak up in a remote work structure
* Curiosity and a desire to keep learning
* Commitment to consumer privacy and security

Nice to have (but not required):
* Experience with contributing to or maintaining open source software
* Experience with web crawling
* Experience with Docker

# Salary
100000-120000

# Location
- 🌏 Worldwide
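
To make the ingestion responsibility concrete, here is a small hypothetical Python sketch of normalizing one provider record into a common catalog row; the payload and field names are assumptions, not CC Catalog's real schema.

```python
from typing import Optional, Tuple

# Hypothetical payload from one provider's API.
provider_record = {
    "id": "abc-123",
    "title": "Sunset over the bay",
    "license_url": "https://creativecommons.org/licenses/by/4.0/",
    "url": "https://example.org/images/abc-123.jpg",
}

def parse_license(license_url: str) -> Optional[Tuple[str, str]]:
    """Extract (license, version) from a creativecommons.org license URL."""
    if "creativecommons.org" not in license_url:
        return None
    parts = license_url.rstrip("/").split("/")
    if len(parts) < 2:
        return None
    return parts[-2], parts[-1]  # e.g. ("by", "4.0")

def normalize(record: dict, provider: str) -> dict:
    """Map a provider-specific record onto a common catalog row (assumed schema)."""
    parsed = parse_license(record["license_url"])
    if parsed is None:
        raise ValueError("unrecognized license URL")
    license_, version = parsed
    return {
        "foreign_identifier": record["id"],
        "title": record["title"],
        "image_url": record["url"],
        "license": license_,
        "license_version": version,
        "provider": provider,
    }

print(normalize(provider_record, provider="example_provider"))
```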

See more jobs at Creative Commons

# How do you apply?

This job post is older than 30 days and the position is probably filled. Try applying to jobs posted recently instead.
Apply for this Job

Good Eggs

Senior Data Platform Engineer
verified · 🇺🇸 US-only · posted 7mo ago

Tags: data, snowflake, dbt, devops
At Good Eggs, we believe feeding your family well shouldn't come with a trade-off, be it your time, your standards, or your wallet. We're pioneering a new way to fill your fridge, by sourcing the best food from producers we know and trust and bringing it straight to you, all at a price the same as or less than your grocery store.

We run a healthy agile engineering process with:

* pair programming
* test-driven development
* continuous deployment

# Responsibilities

We're looking for a Data Platform Engineer who is interested in a multidisciplinary engineering environment and is excited to support the culture of data alongside a passionate, mission-driven team.

As a Data Platform Engineer, you'll work on ingest, modeling, warehousing, and BI tools, and you'll have significant influence over the tools and processes we deliver to our customers (Analysts, Engineers, Business Leaders). We have a modern data platform and a strong team of DevOps Engineers and Full-Stack Data Analysts to collaborate with. Some of the tech involved:

* custom code written in multiple languages (primarily Node.js/Typescript, but also Python and Go)
* Fivetran & Segment
* Snowflake
* dbt
* Mode Analytics
* a modern, AWS-based, containerized application platform

# Requirements

**Ideal candidates will have:**
* A desire to use their talents to make the world a better place
* 2+ years of agile software development experience, including automated testing and pair programming
* 3+ years of full-time data experience (ETL, warehousing, modeling, supporting Analysts)
* Interest in learning and adopting new tools and techniques
* A bachelor's degree in computer science or computer engineering, or equivalent experience

**Experience in some of the following areas:**
* Node.js/Typescript, Go, Python, SQL
* DevOps, cloud infrastructure, developer tools
* Container-based deployments, microservice architecture

**Bonus points for:**
* Previous work experience involving e-commerce, physical operations, finance, or BizOps
* Being data-driven - the ability to get insights from data
* Experience with dimensional modeling and/or BEAM*

# Location
- 🇺🇸 US-only

See more jobs at Good Eggs

# How do you apply?

This job post is older than 30 days and the position is probably filled. Try applying to jobs posted recently instead.
Apply for this Job


Synergy Sports Technology

**The Company**

Synergy Sports Technology, named by Fast Company as one of the world's top 10 most innovative companies in sports, seeks talented **Senior Backend Data Platform Engineers** to join our team on a long-term contract basis.

This position offers a tremendous opportunity to work with the only company that delivers on-demand professional-level basketball, baseball, and hockey analytics linked to supporting video to nearly 1500 college, professional, and international teams. Our systems are highly complex and contain petabytes of data and video, requiring extremely talented engineers to maintain the scale and efficiency of our products.

As a member of the Synergy engineering team, you will contribute to the ongoing development of Synergy's revolutionary online sports data and video delivery solutions, building applications such as:
* Client Analytic Tools
* Video Editing and Capture Tools
* Data Logging Tools
* Operational Game, Data and Video Pipeline Tools
* Backend Data and Video Platforms

Synergy's work environment is geographically distributed, with employees working from home offices. The successful candidate must be comfortable working in a virtual office, using online collaboration tools for all communication and interaction in conversational English. Synergy development staff work in a deadline-oriented, demanding, non-standard environment in which personal initiative and a strong work ethic are rewarded. Good communication skills, self-motivation, and the ability to work effectively with minimal supervision are crucial. Nonstandard working hours may be required, as Synergy operates on a 24x7 system for clients, with associated deadlines and requirements. Pay rate is dependent on experience.

Information for all positions:
* All positions last roughly a year; talented engineers may be kept on for future projects (contracts renew every year).
* Engineers should be available for phone calls M-F from 7am to 10am Pacific time. There will usually be 1 or 2 phone calls each week, 30 to 90 minutes each. All other working hours are up to the engineer, to best balance communication with their team against personal commitments outside of work.
* Working an average of 40 hours per week is expected, except in rare or temporary circumstances. Each week can be flexible, and it is up to the engineer when and how much to work per day. Heavier and lighter weeks are fine based on the engineer's preference, but the target is an average of 40 hours per week.
* No travel is required.

# Responsibilities

**Team Objectives**

A candidate joining the Data Platform team can expect to work on the following types of projects:
* Creating internal and external APIs to support both data and video
* Building complex data models supporting the business rules of sports
* Developing algorithms that ingest and transform multiple streams of data, collapsing them into a single event structure
* Refactoring code to a .NET Core environment
* Scaling out current systems to support new sports
* Building build and test automation systems
* Building complex reporting data structures for analytical systems

# Requirements

**Required Skill Sets**
* NoSQL database (MongoDB Preferred)
* C# (Latest version with a preference to .NET Core)

See more jobs at Synergy Sports Technology

# How do you apply?

This job post is older than 30 days and the position is probably filled. Try applying to jobs posted recently instead.
Apply for this Job

Doximity

Doximity is transforming the healthcare industry. Our mission is to help doctors save time so they can provide better care for patients.

We value diversity, in backgrounds and in experiences. Healthcare is a universal concern, and we need people from all backgrounds to help build the future of healthcare. Our data team is deliberate and self-reflective about the kind of team and culture that we are building, seeking data engineers and scientists who are not only strong in their own aptitudes but care deeply about supporting each other's growth. We have one of the richest healthcare datasets in the world, and our team brings a diverse set of technical and cultural backgrounds.

You will join a small team of Software Engineers focused on Data Engineering Infrastructure to build and maintain all aspects of our data pipelines, ETL processes, data warehousing, ingestion, and overall data stack.

**How you'll make an impact:**

* Help establish robust solutions for consolidating data from a variety of data sources.
* Establish data architecture processes and practices that can be scheduled, automated, replicated, and serve as standards for other teams to leverage.
* Collaborate extensively with the DevOps team to establish best practices around server provisioning, deployment, maintenance, and instrumentation.
* Build and maintain efficient data integration, matching, and ingestion pipelines.
* Build instrumentation, alerting, and error-recovery systems for the entire data infrastructure.
* Spearhead, plan, and carry out the implementation of solutions while self-managing.
* Collaborate with product managers and data scientists to architect pipelines to support delivery of recommendations and insights from machine learning models.

**What we're looking for:**

* Fluency in Python, mastery of SQL.
* Ability to write efficient, resilient, and evolvable ETL pipelines.
* Experience with data modeling, entity-relationship modeling, normalization, and dimensional modeling.
* Experience building data pipelines with Spark and Kafka.
* Comprehensive experience with Unix, Git, and AWS tooling.
* An astute ability to self-manage, prioritize, and deliver functional solutions.

**Nice to have:**

* Experience with MySQL replication, binary logs, and log shipping.
* Experience with additional technologies such as Hive, EMR, Presto, or similar technologies.
* Experience with MPP databases such as Redshift and working with both normalized and denormalized data models.
* Knowledge of data design principles and experience using ETL frameworks such as Sqoop or equivalent.
* Experience designing, implementing, and scheduling data pipelines on workflow tools like Airflow or equivalent.
* Experience working with Docker, PyCharm, Neo4j, Elasticsearch, or equivalent.

**About Doximity**

We're thrilled to be named the Fastest Growing Company in the Bay Area, and one of Fast Company's Most Innovative Companies. Joining Doximity means being part of an incredibly talented and humble team. We work on amazing products that over 70% of US doctors (and over one million healthcare professionals) use to make their busy lives a little easier. We're driven by the goal of improving inefficiencies in our $2.5 trillion U.S. healthcare system and love creating technology that has a real, meaningful impact on people's lives. To learn more about our team, culture, and users, check out our careers page, company blog, and engineering blog. We're growing fast, and there are plenty of opportunities for you to make an impact. Join us!

*Doximity is proud to be an equal opportunity employer, and committed to providing employment opportunities regardless of race, religious creed, color, national origin, ancestry, physical disability, mental disability, medical condition, genetic information, marital status, sex, gender, gender identity, gender expression, pregnancy, childbirth and breastfeeding, age, sexual orientation, military or veteran status, or any other protected classification. We also consider qualified applicants with criminal histories, consistent with applicable federal, state and local law.*

# Requirements

Use the apply button.
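
This posting calls out pipelines built with Spark and Kafka. As a hedged illustration of the Kafka side (topic name, broker address, and handling logic are all assumptions), a minimal consumer loop using the kafka-python library could look like:

```python
import json

from kafka import KafkaConsumer  # kafka-python package

# Hypothetical topic carrying raw ingestion events.
consumer = KafkaConsumer(
    "raw-events",                        # assumed topic name
    bootstrap_servers="localhost:9092",  # assumed broker
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
    auto_offset_reset="earliest",
    enable_auto_commit=True,
)

for message in consumer:
    event = message.value
    # A real pipeline would validate, enrich, and load the event here;
    # this sketch just prints it.
    print(event.get("type"), event.get("id"))
```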

See more jobs at Doximity

# How do you apply?

This job post is older than 30 days and the position is probably filled. Try applying to jobs posted recently instead.
Apply for this Job