Remote Data Engineer, Product Integration at Doximity
The largest collection of Remote Jobs for Digital Nomads online. Get a remote job you can do anywhere at Remote Companies like Toptal, Zapier and Automattic who embrace the future. There are 43,500+ jobs that allow you to work anywhere and live everywhere.


Data Engineer, Product Integration

verified closed

Doximity


Tags: elasticsearch, git, python, engineer
This job post is closed and the position is probably filled. Please do not apply.
Why work at Doximity?

Doximity is the leading social network for healthcare professionals, with over 70% of U.S. doctors as members. We have strong revenues, real market traction, and we're putting a dent in the inefficiencies of our $2.5 trillion U.S. healthcare system. After the iPhone, Doximity is the fastest adopted product by doctors of all time. Our founder, Jeff Tangney, is the founder and former President and COO of Epocrates (IPO in 2010), and Nate Gross is the founder of the digital health accelerator RockHealth. Our investors include top venture capital firms who've invested in Box, Salesforce, Skype, SpaceX, Tesla Motors, Twitter, Tumblr, Mulesoft, and Yammer. Our beautiful offices are located in SoMa, San Francisco.

You will join a small team of four data infrastructure engineers to build and maintain all aspects of our data pipelines, ETL processes, data warehousing, ingestion, and overall data infrastructure. We have one of the richest healthcare datasets in the world, and we're not afraid to invest in all things data to enhance our ability to extract insight.

Job Summary

- Collaborate with product managers, data analysts, and data scientists to develop pipelines and ETL tasks that facilitate extracting insights from data.
- Build and maintain efficient data integration, matching, and ingestion pipelines.
- Establish data architecture processes and practices that can be scheduled, automated, and replicated, and that serve as standards for other teams to leverage.
- Spearhead, plan, and carry out the implementation of solutions while self-managing.

Required Experience & Skills

- At least two years of professional experience developing data infrastructure solutions.
- Fluency in Python and SQL.
- Experience building data pipelines with Spark is a big plus.
- A passion for clean code and testing with Pytest, FactoryBoy, or equivalent.
- An astute ability to self-manage, prioritize, and deliver functional solutions.
- Working knowledge of Unix, Git, and AWS tooling.

Our Data Stack

- Python, Kafka, Spark, MySQL, Redshift, Presto, Airflow, Neo4j, Elasticsearch

Fun Facts About the Team

- We have one of the richest healthcare datasets in the world.
- Business decisions at Doximity are driven by our data, analyses, and insights.
- Hundreds of thousands of healthcare professionals will use the products you build.
- Our R&D team makes up about half the company, and the product is led by the R&D team.
- Our Data Science team comprises about 20 people.
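To make the "pipelines and ETL tasks" in the Job Summary concrete, here is a minimal, hypothetical sketch of an extract-transform-load step in plain Python. It is not Doximity's actual code; all names (`extract_records`, `normalize`, the NPI-keyed "warehouse") are illustrative assumptions, and a real pipeline would read from systems like Kafka or MySQL and write to Redshift, typically orchestrated by Airflow.

```python
def extract_records():
    # Extract: stand-in for reading from a source system (API, DB, etc.).
    return [
        {"npi": "1234567890", "name": " Jane Doe ", "specialty": "Cardiology"},
        {"npi": "0987654321", "name": "John Roe", "specialty": "ONCOLOGY"},
    ]


def normalize(record):
    # Transform: trim whitespace and standardize casing so records from
    # different sources can be matched and deduplicated consistently.
    return {
        "npi": record["npi"],
        "name": record["name"].strip(),
        "specialty": record["specialty"].lower(),
    }


def load(records, target):
    # Load: here the "warehouse" is just a dict keyed by NPI; a real task
    # would upsert into a warehouse table instead.
    for rec in records:
        target[rec["npi"]] = rec


def run_pipeline():
    warehouse = {}
    load((normalize(r) for r in extract_records()), warehouse)
    return warehouse
```

Small, pure functions like these are easy to unit-test with Pytest (listed in the requirements above), since each stage can be exercised in isolation.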


See more jobs at Doximity

Visit Doximity's website

# How do you apply?

This job post has been closed by the poster, which means they probably have enough applicants now. Please do not apply.