Remote Data Engineering + Engineer Jobs in April 2021


Browse 7+ remote Data Engineering + Engineer jobs in April 2021 at companies like Shopify, Doximity, and dataroots, with salaries from $60,000/year to $135,000/year, working as a Sr Software Engineer Data Infra, Senior Data Engineer, or Staff Software Developer Data Platform.

Join 92,126+ people and get an email of all new remote Data Engineering + Engineer jobs


👉 Hiring for a remote Data Engineering + Engineer position? Post a job on the 🏆 #1 remote jobs board.

Today's remote Data Engineering + Engineer jobs

Shopify (verified)
Canada, United States

Senior Data Engineer

Tags: senior data engineer, data engineering, data platform engineering, spark
**Company Description**

Shopify is the leading omni-channel commerce platform. Merchants use Shopify to design, set up, and manage their stores across multiple sales channels, including mobile, web, social media, marketplaces, brick-and-mortar locations, and pop-up shops. The platform also provides merchants with a powerful back office and a single view of their business, from payments to shipping. The Shopify platform was engineered for reliability and scale, making enterprise-level technology available to businesses of all sizes.

**Job Description**

Our Data Platform Engineering group builds and maintains the platform that delivers accessible data to power decision-making at Shopify for over a million merchants. We're hiring high-impact developers across teams:

* The Engine group organizes all merchant and Shopify data into our data lake in highly optimized formats for fast query processing, and maintains the security and quality of our datasets.
* The Analytics group builds products that leverage the Engine primitives to deliver simple and useful products that power scalable transformation of data at Shopify in batch, in streaming, or for machine learning. This group is focused on making it really simple for our users to answer three questions: What happened in the past? What is happening now? And what will happen in the future?
* The Data Experiences group builds end-user experiences for experimentation, data discovery, and business intelligence reporting.
* The Reliability group operates the data platform efficiently in a consistent and reliable manner. They build tools for other Data Platform teams to leverage to encourage consistency, and they champion reliability across the platform.

**Qualifications**

While our teams value specialized skills, they also have a lot in common. We're looking for:

* A high-energy self-starter with experience in and passion for data and big-data-scale processing. You enjoy working in fast-paced environments and love making an impact.
* An exceptional communicator with the ability to translate technical concepts into easy-to-understand language for our stakeholders.
* Excitement for working with a remote team; you value collaborating on problems, asking questions, delivering feedback, and supporting others in their goals whether they are in your vicinity or entire cities apart.
* A solid software engineer, experienced in building and maintaining systems at scale.

**A Senior Data Developer at Shopify typically has 4-6 years of experience in one or more of the following areas:**

* Working with the internals of a distributed compute engine (Spark, Presto, DBT, or Flink/Beam)
* Query optimization, resource allocation and management, and data lake performance (Presto, SQL)
* Cloud infrastructure (Google Cloud, Kubernetes, Terraform)
* Security products and methods (Apache Ranger, Apache Knox, OAuth, IAM, Kerberos)
* Deploying and scaling ML solutions using open-source frameworks (MLflow, TFX, H2O, etc.)
* Building full-stack applications (Ruby/Rails, React, TypeScript)
* Background and practical experience in statistics and/or computational mathematics (Bayesian and frequentist approaches, NumPy, PyMC3, etc.)
* Modern big-data storage technologies (Iceberg, Hudi, Delta)

**Additional information**

At Shopify, we are committed to building and fostering an environment where our employees feel included, valued, and heard. Our belief is that a strong commitment to diversity and inclusion enables us to truly make commerce better for everyone. We strongly encourage applications from Indigenous people, racialized people, people with disabilities, people from gender and sexually diverse communities, and/or people with intersectional identities.

#Location
Canada, United States
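As a loose illustration of the kind of work the Engine group describes above (landing data in the lake in query-friendly, partitioned formats), here is a minimal PySpark sketch. The bucket paths, schema fields, and partition column are hypothetical assumptions for the example, not Shopify's actual pipeline.

```python
# Minimal sketch: compacting raw merchant events into partitioned Parquet
# for faster downstream queries. Paths, fields, and the partition column
# are illustrative assumptions, not Shopify's real data model.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("merchant-events-compaction").getOrCreate()

raw = spark.read.json("gs://example-bucket/raw/merchant_events/2021-04-01/")

cleaned = (
    raw.dropDuplicates(["event_id"])                       # drop replayed events
       .withColumn("event_date", F.to_date("occurred_at"))  # derive a partition key
)

(cleaned.repartition("event_date")
        .write.mode("overwrite")
        .partitionBy("event_date")                          # enables partition pruning at query time
        .parquet("gs://example-bucket/lake/merchant_events/"))
```

Partitioning by date like this is one common way a lake layout keeps "fast query processing" cheap: engines such as Spark or Presto can skip whole partitions instead of scanning every file.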


See more jobs at Shopify

# How do you apply?

Click here to apply: https://smrtr.io/5kRRR
Apply for this position

This month's remote Data Engineering + Engineer jobs

Shopify (verified)
United States, Canada

Staff Software Developer Data Platform

Tags: staff software developer, data platform engineering, data engineering, spark
**Company Description**

Shopify is the leading omni-channel commerce platform. Merchants use Shopify to design, set up, and manage their stores across multiple sales channels, including mobile, web, social media, marketplaces, brick-and-mortar locations, and pop-up shops. The platform also provides merchants with a powerful back office and a single view of their business, from payments to shipping. The Shopify platform was engineered for reliability and scale, making enterprise-level technology available to businesses of all sizes.

**Job Description**

Our Data Platform Engineering group builds and maintains the platform that delivers accessible data to power decision-making at Shopify for over a million merchants. We're hiring high-impact developers across teams:

* The Engine group organizes all merchant and Shopify data into our data lake in highly optimized formats for fast query processing, and maintains the security and quality of our datasets.
* The Analytics group leverages the Engine primitives to build and deliver simple and useful products that power scalable transformation of data at Shopify in batch, in streaming, or for machine learning. This group is focused on making it really simple for our users to answer three questions: What happened in the past? What is happening now? And what will happen in the future?
* The Data Experiences group builds end-user experiences for experimentation, data discovery, and business intelligence reporting.
* The Reliability group operates the data platform in a consistent and reliable manner. They build tools for other teams on Data Platform to leverage and encourage consistency as they champion reliability across the platform.

**Qualifications**

* An experienced technical leader with a proven track record of delivering impactful results.
* A technical engineering background in one or more areas in the next section.
* Experience with technical mentoring, coaching, and improving the technical output of the people around you.
* Exceptional communication skills and the ability to translate technical concepts into easy-to-understand language for our stakeholders.
* Excitement for working with a remote team; you value collaborating on problems, asking questions, delivering feedback, and supporting others in their goals whether they are in your vicinity or entire cities apart.

**A Staff Data Developer would typically have 6-10 years of experience in one or more of the following areas:**

* Experience with the internals of a distributed compute engine (Spark, Presto, DBT, or Flink/Beam)
* Experience in query optimization, resource allocation and management, and data lake performance (Presto, SQL)
* Experience with cloud infrastructure (Google Cloud, Kubernetes, Terraform)
* Experience with security products and methods (Apache Ranger, Apache Knox, OAuth, IAM, Kerberos)
* Experience deploying and scaling ML solutions using open-source frameworks (MLflow, TFX, H2O, etc.)
* Experience building full-stack applications (Ruby/Rails, React, TypeScript)
* Background and practical experience in statistics and/or computational mathematics (Bayesian and frequentist approaches, NumPy, PyMC3, etc.)
* Modern big-data storage technologies (Iceberg, Hudi, Delta)

**Additional information**

At Shopify, we are committed to building and fostering an environment where our employees feel included, valued, and heard. Our belief is that a strong commitment to diversity and inclusion enables us to truly make commerce better for everyone. We strongly encourage applications from Indigenous people, racialized people, people with disabilities, people from gender and sexually diverse communities, and/or people with intersectional identities.

#Location
United States, Canada
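One of the experience areas listed above is deploying ML solutions with open-source frameworks such as MLflow. As a rough, self-contained sketch (not Shopify's tooling), this is roughly what tracking and logging a model with MLflow looks like; the experiment name and toy dataset are assumptions for the example.

```python
# Minimal MLflow tracking sketch: train a toy model, log parameters, metrics,
# and the model artifact. Experiment name and data are illustrative only.
import mlflow
import mlflow.sklearn
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=1_000, n_features=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

mlflow.set_experiment("churn-model-demo")  # hypothetical experiment name

with mlflow.start_run():
    model = LogisticRegression(max_iter=200).fit(X_train, y_train)
    acc = accuracy_score(y_test, model.predict(X_test))
    mlflow.log_param("max_iter", 200)
    mlflow.log_metric("accuracy", acc)
    mlflow.sklearn.log_model(model, artifact_path="model")  # artifact can later be registered or served
```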


See more jobs at Shopify

# How do you apply?

Click here to apply: https://smrtr.io/5kR_7
Apply for this position

Previous remote Data Engineering + Engineer jobs

This job post is closed and the position is probably filled. Please do not apply.
Doximity is transforming the healthcare industry. Our mission is to help doctors be more productive, informed, and connected. Achieving this vision requires a multitude of disciplines, areas of expertise, and perspectives. One of our core pillars has always been data. As a software engineer focused on the infrastructure aspect of our data stack, you will work on improving healthcare by advancing our data capabilities, best practices, and systems. Our team brings a diverse set of technical and cultural backgrounds, and we like to think pragmatically in choosing the tools most appropriate for the job at hand.

**About Us**

Our data teams schedule over 1,000 Python pipelines and over 350 Spark pipelines every 24 hours, resulting in over 5,000 data processing tasks each day. Additionally, our data endeavors leverage datasets ranging in size from a few hundred rows to a few hundred billion rows. The Doximity data teams rely heavily on Python 3, Airflow, Spark, MySQL, and Snowflake. To support this large undertaking, the data infrastructure team uses AWS, Terraform, and Docker to manage a high-performing and horizontally scalable data stack. The data infrastructure team is responsible for enabling and empowering the data analysts, machine learning engineers, and data engineers at Doximity. We provide and evolve a foundation on which to build, and ensure that incidental complexities melt into our abstractions. Doximity has worked as a distributed team for a long time; pre-pandemic, Doximity was already about 65% distributed.

Find out more information on the Doximity engineering blog:
* Our [company core values](https://work.doximity.com/)
* Our [recruiting process](https://technology.doximity.com/articles/engineering-recruitment-process-doximity)
* Our [product development cycle](https://technology.doximity.com/articles/mofo-driven-product-development)
* Our [on-boarding & mentorship process](https://technology.doximity.com/articles/software-engineering-on-boarding-at-doximity)

**Here's How You Will Make an Impact**

As a data infrastructure engineer you will work with the rest of the data infrastructure team to design, architect, implement, and support data infrastructure, systems, and processes impacting all other data teams at Doximity. You will solidify our CI/CD pipelines, reduce production-impacting issues, and improve monitoring and logging. You will support and train data analysts, machine learning engineers, and data engineers on new or improved data infrastructure systems and processes. A key responsibility is to encourage data best practices through code by continuing the development of our internal data frameworks and libraries. It is also your responsibility to identify and address performance, scaling, or resource issues before they impact our product. You will spearhead, plan, and carry out the implementation of solutions while self-managing your time and focus.

**About you**

* You have professional data engineering or operations experience with a focus on data infrastructure
* You are fluent in Python and SQL, and feel at home in a remote Linux server session
* You have operational experience supporting data stacks through tools like Terraform and Docker, and continuous integration through tools like CircleCI
* You are foremost an engineer, making you passionate about high code quality, automated testing, and engineering best practices
* You have the ability to self-manage, prioritize, and deliver functional solutions
* You possess advanced knowledge of Linux, Git, and AWS (EMR, IAM, VPC, ECS, S3, RDS Aurora, Route53) in a multi-account environment
* You agree that concise and effective written and verbal communication is a must for a successful team

**Benefits & Perks**

* Generous time off policy
* Comprehensive benefits including medical, vision, dental, generous paternity and maternity leave, Life/ADD, 401k, flex spending accounts, commuter benefits, equipment budget, and continuous education budget
* Pre-IPO stock incentives
* ...and much more! For a full list, see our career page

**More info on Doximity**

We're thrilled to be named the Fastest Growing Company in the Bay Area, and one of Fast Company's Most Innovative Companies. Joining Doximity means being part of an incredibly talented and humble team. We work on amazing products that over 70% of US doctors (and over one million healthcare professionals) use to make their busy lives a little easier. We're driven by the goal of improving inefficiencies in our $3.5 trillion U.S. healthcare system and love creating technology that has a real, meaningful impact on people's lives. To learn more about our team, culture, and users, check out our careers page, company blog, and engineering blog. We're growing steadily, and there are plenty of opportunities for you to make an impact.

*Doximity is proud to be an equal opportunity employer and committed to providing employment opportunities regardless of race, religious creed, color, national origin, ancestry, physical disability, mental disability, medical condition, genetic information, marital status, sex, gender, gender identity, gender expression, pregnancy, childbirth and breastfeeding, age, sexual orientation, military or veteran status, or any other protected classification. We also consider qualified applicants with criminal histories, consistent with applicable federal, state, and local law.*

#Location
🇺🇸 US-only
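To make the scale described above a little more concrete, here is a minimal Airflow DAG sketch (Airflow 2-style imports) of the kind of scheduled Python pipeline the posting mentions. The DAG id, schedule, and task bodies are hypothetical assumptions, not Doximity's actual pipelines.

```python
# Minimal sketch of a daily extract/load pipeline scheduled by Airflow.
# All names and task logic are illustrative assumptions.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**context):
    # e.g. pull one day of rows from MySQL into cloud storage
    print("extracting", context["ds"])


def load(**context):
    # e.g. copy the extracted files into Snowflake
    print("loading", context["ds"])


with DAG(
    dag_id="example_daily_etl",
    start_date=datetime(2021, 4, 1),
    schedule_interval="@daily",
    default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> load_task  # run load only after extract succeeds
```

Multiply a DAG like this by a thousand, and the infrastructure work the role describes (CI/CD, monitoring, shared frameworks) is what keeps the whole fleet healthy.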


See more jobs at Doximity

# How do you apply?

This job post is older than 30 days and the position is probably filled. Try applying to jobs posted recently instead.

dataroots (verified, closed)
European Economic Area

Senior Data Engineer

Tags: data engineering, cloud native, cicd, devops
This job post is closed and the position is probably filled. Please do not apply.
Dataroots researches, designs, and codes robust data solutions and platforms for various sectors. Our dedicated in-house team of data specialists uses state-of-the-art techniques to optimise traditional company processes. With a strong focus on DataOps and MLOps, we firmly believe that robust and production-ready solutions are an essential part of our work. The result? Our teams provide clients with a reliable foundation to make data-driven decisions.

As a Senior Data Engineer you excel in building data-driven solutions and infrastructure. You're an architectural genius who effortlessly designs, develops, and deploys data pipelines and machine learning models. As a senior you will build and share knowledge with your colleagues.

**💪 The Skills**

* You know that Kubernetes, Terraform, and CI/CD aren't Star Wars characters. You have significant know-how of data architecture and data infrastructure.
* You know your way around tools like Apache Spark, Beam, and/or Kafka.
* You're at ease programming in Scala and Python.
* You understand how machine learning works and can support the deployment of machine learning models on an on-prem or cloud-native infrastructure.
* You know the ins and outs of cloud platforms like AWS, GCP, or Azure.
* You apply the KISS principle day to day; we strive to use the right tool for the right job.
* You have an analytical and solution-oriented mindset.
* You enjoy wrangling large amounts of data.
* You have strong communication skills.
* You speak English fluently; a word of Dutch or French is a plus.

Was this list a bit much? Don't worry, we don't expect you to tick every box from the beginning. Most of all we are looking for colleagues with the same passion for personal and team development as the rest of us.

**💰 The Offer**

Ready to roll into your next data engineering adventure? Let's meet! At dataroots we combine 100% focus with fresh and fun ideas. Our offices are the ideal workplace for people with a knack for innovation. Our languages? Terraform, Kubernetes, Cloud, Containers, Python, Scala, and Go. What do we speak when we're not coding? English, because with colleagues from Brazil or Serbia you'll join an inspiring and internationally focused team.

**🎉 What you can count on @ dataroots**

* An attractive salary package and competitive vacation plan
* A personal and team training budget to further your professional career
* Trainings and seminars with your team and external experts to further sharpen your technical knife
* Foosball matches or sushi dates... we let off steam during our (virtual) fries & beers and monthly [email protected] events #workhardplayhard
* A constantly changing environment where everyone can contribute and shape the company for the better
* Your own innovation budget to work on cool, educational, challenging, and/or open-source projects within your guild
* Nicolas beating you at CS:GO

Ready for a new chapter as a Senior Data Engineer at dataroots? Super! Apply now! 🎉

#Salary or Compensation
$60,000 - $100,000/year

#Location
European Economic Area


See more jobs at dataroots

# How do you apply?

This job post is older than 30 days and the position is probably filled. Try applying to jobs posted recently instead.

Doximity (verified, closed)
North America

Senior Data Engineering Manager

Tags: data engineering, python, sql, healthcare tech
This job post is closed and the position is probably filled. Please do not apply.
Doximity is transforming the healthcare industry. Our mission is to help doctors be more productive, informed, and connected. As a software engineer focused on our data stack, you'll work within cross-functional delivery teams alongside other engineers, designers, and product managers in building software to help improve healthcare.

Our team brings a diverse set of technical and cultural backgrounds, and we like to think pragmatically in choosing the tools most appropriate for the job at hand.

***One of Doximity's core values is stretching ourselves. Even if you don't check off all the boxes below, we encourage you to apply. Doximity is full of exceptional people who don't fit a mold. Join us!***

**About The Job**

As a Data Engineering Manager at Doximity, we expect you to support and enable 3-7 data engineers in doing the best work of their careers. You will do this through strong leadership skills combined with technical expertise and product familiarity. We expect you to be an excellent manager, but also to roll up your sleeves and make technical contributions. Our belief in technical leaders is so strong that your first six months at Doximity will consist of ramping up as an individual contributor. During this time, you will develop your product context and learn the data team's engineering patterns before tackling any leadership challenges.

* You will play a liaison role alongside Data, Product, and Engineering teams, fostering relationships with other engineering managers, product managers, and technical leads.
* You will contribute to the technical direction of software products.
* You will aid in iteration planning and retrospectives, and devise effective delegation strategies for you and your team.
* You will review code, write technical proposals, and help your team grow by recruiting, training, and aiding in career progression plans.
* You will strive to keep the team pragmatic, shipping, and focused on adding business value while maintaining a cohesive software stack.
* You will be contributing to at least one large or medium-sized data project at any given time.

**About you**

* You are a data engineer at heart, excited to roll up your sleeves and help with hands-on data engineering problems.
* You have at least 2 years of experience as an Engineering Manager for a data-focused team of at least 3 members.
* You are passionate about helping others improve and grow their careers by exercising coaching, mentorship, and delegation.
* You know what it takes to push major data projects through a development lifecycle.
* You are passionate about using data to make a large impact on the product and the business.
* You are fluent in Python and SQL.
* You have professional experience architecting, developing, and shipping data and ETL pipelines of all sizes. You do so through the lens of treating pipelines themselves as software.
* You understand the importance of carefully crafting and designing data models using methods like object-relational, entity-relationship, or dimensional modeling.
* You default to concise and effective written and verbal communication.
* Your leadership style is servant to the team and starts with empathy; you trust your team to make the right decision and empower them to execute.

**Benefits & Perks**

* Generous time off policy
* Comprehensive benefits including medical, vision, dental, Life/ADD, 401k, flex spending accounts, commuter benefits, equipment budget, educational resources, and conference access
* Family support and planning benefits
* Pre-IPO stock incentives
* ...and much more! For a full list, see our career page

**About us**

* Here are some of the ways [we bring value to doctors](https://drive.google.com/file/d/1qimYh0mG3i1nTJe6jDCDepJt2i4o8MEB/view)
* Our web applications are built primarily using Ruby, Rails, JavaScript (Vue.js), and a bit of Golang
* Our data engineering stack runs on Python, Snowflake, MySQL, Spark, and Airflow
* Our production application stack is hosted on AWS and we deploy to production on average 65 times per day
* We have over 450 private repositories in GitHub containing our applications, forks of gems, our own internal gems, and [open-source projects](https://github.com/doximity)
* We have worked as a distributed team for a long time; we're currently about [75%+ distributed](https://blog.brunomiranda.com/building-a-distributed-engineering-team-85d281b9b1c)
* Find out more information on the [Doximity technology blog](https://technology.doximity.com/)
* Our [company core values](https://work.doximity.com/)
* Our [recruiting process](https://technology.doximity.com/articles/engineering-recruitment-process-doximity)
* Our [product development cycle](https://technology.doximity.com/articles/mofo-driven-product-development)
* Our [on-boarding & mentorship process](https://technology.doximity.com/articles/software-engineering-on-boarding-at-doximity)

We're thrilled to be named the [Fastest Growing Company in the Bay Area](https://www.prnewswire.com/news-releases/doximity-is-fastest-growing-company-in-bay-area-per-deloittes-2016-technology-fast-500-300367390.html), and one of [Fast Company's Most Innovative Companies](https://www.fastcompany.com/most-innovative-companies/2018/sectors/social-media). Joining Doximity means being part of an incredibly talented and humble team. We work on amazing products that over 70% of US doctors (and over one million healthcare professionals) use to make their busy lives a little easier. We're driven by the goal of improving inefficiencies in our $3.5 trillion U.S. healthcare system and love creating technology that has a real, meaningful impact on people's lives. To learn more about our team, culture, and users, check out our careers page, company blog, and engineering blog. We're growing steadily, and there are plenty of opportunities for you to make an impact.

*Doximity is proud to be an equal opportunity employer, and committed to providing employment opportunities regardless of race, religious creed, color, national origin, ancestry, physical disability, mental disability, medical condition, genetic information, marital status, sex, gender, gender identity, gender expression, pregnancy, childbirth and breastfeeding, age, sexual orientation, military or veteran status, or any other protected classification. We also consider qualified applicants with criminal histories, consistent with applicable federal, state, and local law.*

#Location
North America


See more jobs at Doximity

# How do you apply?

This job post is older than 30 days and the position is probably filled. Try applying to jobs posted recently instead.

Doximity (verified, closed)
North America

Python Automation Engineer

Tags: python, data engineering, ci pipeline, aws
This job post is closed and the position is probably filled. Please do not apply.
Doximity is transforming the healthcare industry. Our mission is to help doctors be more productive, informed, and connected. As a software engineer, you'll work within cross-functional delivery teams alongside other engineers, designers, and product managers in building software to help improve healthcare.

Our [team](https://www.doximity.com/about/company#theteam) brings a diverse set of technical and cultural backgrounds, and we like to think pragmatically in choosing the tools most appropriate for the job at hand.

This position is for an experienced Python software engineer, with a passion for writing tests, to join our 5-person Test Automation team. We're looking for someone with a strong track record of putting Python to work in data-oriented products.

**About Us**

* Here are some of the ways [we bring value to doctors](https://drive.google.com/file/d/1qimYh0mG3i1nTJe6jDCDepJt2i4o8MEB/view)
* Our web applications are built primarily using Ruby, Rails, JavaScript (Vue.js), and a bit of Golang
* Our data engineering stack runs on Python, MySQL, Spark, and Airflow
* Our production application stack is hosted on AWS and we deploy to production on average 50 times per day
* We have over 350 private repositories in GitHub containing our applications, forks of gems, our own internal gems, and [open-source projects](https://github.com/doximity)
* We have worked as a distributed team for a long time; we're currently [about 65% distributed](https://blog.brunomiranda.com/building-a-distributed-engineering-team-85d281b9b1c)
* Find out more information on the [Doximity engineering blog](https://engineering.doximity.com/)
* Our [company core values](https://work.doximity.com/)
* Our [recruiting process](https://engineering.doximity.com/articles/engineering-recruitment-process-doximity)
* Our [product development cycle](https://engineering.doximity.com/articles/mofo-driven-product-development)
* Our [on-boarding & mentorship process](https://engineering.doximity.com/articles/software-engineering-on-boarding-at-doximity)

**Here's How You Will Make an Impact**

* Design and build a CI pipeline for the Data Engineering Python codebase.
* Spearhead the process of establishing a modern CI/CD suite for data pipelines.
* Deploy and maintain CI servers within AWS, CircleCI, and other partners.
* Write documentation and guides; be an advocate and mentor to the Data team with regards to test automation.

**About you**

* Minimum 2-4 years of professional experience developing software using Python.
* Experience writing unit and integration tests in pytest.
* Experience with shell scripting (bash, zsh).
* Experience with SQL.
* Able to troubleshoot test failures and build consistency issues.
* Able to investigate intermittent CI server failures due to infrastructure shortcomings.
* Able to communicate effectively.
* Able to effectively manage time and balance failure investigation with completing sprint tasks.
* Experience building and maintaining Docker images.
* CircleCI experience is a plus.
* Work remotely, provided you have 5 hours of overlap with the team in the U.S. Our core hours are 9:30 AM to 5:30 PM PST.

**Benefits & Perks**

* Generous time off policy
* Comprehensive benefits including medical, vision, dental, Life/ADD, 401k, flex spending accounts, commuter benefits, equipment budget, and continuous education budget
* Pre-IPO stock incentives
* ...and much more! For a full list, see our career page

**More info on Doximity**

We're thrilled to be named the [Fastest Growing Company in the Bay Area](https://www.prnewswire.com/news-releases/doximity-is-fastest-growing-company-in-bay-area-per-deloittes-2016-technology-fast-500-300367390.html), and one of [Fast Company's Most Innovative Companies](https://www.fastcompany.com/most-innovative-companies/2018/sectors/social-media). Joining Doximity means being part of an incredibly talented and humble team. We work on amazing products that over 70% of US doctors (and over one million healthcare professionals) use to make their busy lives a little easier. We're driven by the goal of improving inefficiencies in our $2.5 trillion U.S. healthcare system and love creating technology that has a real, meaningful impact on people's lives. To learn more about our team, culture, and users, check out our careers page, company blog, and engineering blog. We're growing steadily, and there's plenty of opportunity for you to make an impact.

*Doximity is proud to be an equal opportunity employer, and committed to providing employment opportunities regardless of race, religious creed, color, national origin, ancestry, physical disability, mental disability, medical condition, genetic information, marital status, sex, gender, gender identity, gender expression, pregnancy, childbirth and breastfeeding, age, sexual orientation, military or veteran status, or any other protected classification. We also consider qualified applicants with criminal histories, consistent with applicable federal, state, and local law.*

#Salary or Compensation
$135,000/year

#Location
North America
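Since the role centers on test automation with pytest, here is a minimal sketch of the kind of unit tests the posting describes. The helper function under test is hypothetical, invented for the example; it is not Doximity's codebase.

```python
# Minimal pytest sketch: unit tests for a small, hypothetical pipeline helper.
import pytest


def chunk_rows(rows, size):
    """Split a list of rows into fixed-size batches (hypothetical helper)."""
    if size <= 0:
        raise ValueError("size must be positive")
    return [rows[i:i + size] for i in range(0, len(rows), size)]


def test_chunk_rows_splits_evenly():
    assert chunk_rows([1, 2, 3, 4], 2) == [[1, 2], [3, 4]]


def test_chunk_rows_keeps_remainder():
    assert chunk_rows([1, 2, 3], 2) == [[1, 2], [3]]


def test_chunk_rows_rejects_bad_size():
    with pytest.raises(ValueError):
        chunk_rows([1, 2, 3], 0)
```

Running `pytest -q` against a file like this is the sort of step that would sit inside the CircleCI workflow the posting mentions.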


See more jobs at Doximity

# How do you apply?

This job post is older than 30 days and the position is probably filled. Try applying to jobs posted recently instead.

Catch Co. (closed)
🇺🇸 US-only

Data Engineer

Tags: data engineering, engineer
This job post is closed and the position is probably filled. Please do not apply.
**Position Overview:**

The Catch Company is looking for a Data Engineer to support our Analytics team. The Analytics team is looking to grow and improve our analytics tech stack to enable smarter and faster business decisions, automated processes, and personalized customer experiences.

In this role, you will own the development and maintenance of our analytics tech stack and will be instrumental in identifying and implementing new technologies and tools to support our goals. While we already have a robust analytics / business intelligence ecosystem in place, we believe the right Data Engineer can push our team to become industry leaders in enabling smarter decision-making and personalizing our customer experience via new technologies and approaches. Our current tech stack includes Redshift (warehouse), Fivetran/Stitch Data/custom pipelines (ETL), dbt (transformation), Looker (visualization), and a variety of other services that support one-off tools (e.g., Jupyter notebooks, Amazon EC2, etc.).

Additionally, we welcome both local (Chicago) and remote candidates for this role! Our analytics team is partially remote and our engineering team is fully remote. **Travel is not required for interviews or the job itself.**

**What makes this a special opportunity:**

* You will have broad freedom to change and improve the way we do things as the only Data Engineer on the team
* You will have the opportunity to be a thought leader when it comes to selecting new technologies; you will be responsible for identifying and implementing new tools and technologies
* You will work with people who are eager to use data to improve our product offerings, our customer experience, and other key components of the business
* We place a premium on building a great culture made up of great people
* You will work with and learn from experienced leaders who have a track record of building successful companies

**Benefits:**

* "Take what you need" PTO policy
* 4 additional paid days off specifically to enjoy the outdoors
* Flexible working schedule
* Ability to work from home if there is a need
* Medical, Dental and Vision Insurance - we cover 85% of your premium and 50% for dependents
* Health Savings Account
* 401(K) plan
* Pre-Tax Commuter Benefits
* Unlimited fruit snacks

***Unfortunately, visa sponsorship is not available at this time***

# Responsibilities

**What you will do:**

* Own the maintenance and development of our analytics tech stack, including identifying and implementing new tools, managing utilization, and improving performance
* Model and architect our data in a way that will scale with the increasingly complex ways we're analyzing it
* Restructure our processes for ingesting and analyzing website event data to (a) capture more usable, relevant data and (b) use technologies like Spark that allow for faster data transformation
* Build custom data pipelines that reliably provide clean, ready-to-analyze data, and develop systems that monitor those pipelines to ensure their health
* Work closely with our software engineers to identify new opportunities for data collection (with a focus on personalization/recommendation systems) and build the processes to make that data available in our data warehouse
* Identify use cases for real-time/streaming analytics and select and implement tools to support those use cases
* Research and surface new ideas and approaches, whether new technologies, tools, frameworks, or process improvements for the team

# Requirements

**What experience you need:**

* Experience working in data engineering, data architecture, or another similar field
* Extensive experience manipulating data using SQL
* Experience using Git to version/manage code
* Fluency in one or more programming languages such as Python, Java, Go, etc.
* Experience working with relational databases/data warehouses
* Familiarity with ETL tools
* Familiarity with business intelligence/visualization tools
* [Optional/Preferred]: Experience building custom data pipelines
* [Optional/Preferred]: Experience structuring and analyzing high volumes of website event data (e.g., impressions, views, clicks, etc.)
* You must be eligible to work in the United States; visa sponsorship is not available

**What will make you successful:**

* Curiosity: always seeking to understand "why", always looking to make things better
* Passion: you are driven by a love for what you do
* Optimism: the ability to bounce back quickly when something doesn't work
* Action: knowing when to shift from planning to doing
* Honesty: transparency with customers, partners, and teammates
* Entrepreneurial spirit
* Data-driven mindset
* An interest in / passion for the outdoors (fishing knowledge not required!)

#Location
🇺🇸 US-only
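One responsibility above is building custom pipelines and the systems that monitor their health. As a rough, self-contained sketch (not Catch Co.'s actual stack), a freshness check like the one below is a common minimal form of such monitoring; the table name, lag threshold, and warehouse query stand-in are assumptions for the example.

```python
# Minimal sketch of a pipeline health check: verify a target table was
# refreshed recently. Names and thresholds are illustrative assumptions.
from datetime import datetime, timedelta, timezone


def is_fresh(last_loaded_at: datetime, max_lag: timedelta = timedelta(hours=2)) -> bool:
    """Return True if the most recent load finished within the allowed lag."""
    return datetime.now(timezone.utc) - last_loaded_at <= max_lag


def check_table(fetch_last_loaded_at, table: str) -> None:
    """fetch_last_loaded_at would query the warehouse (e.g. Redshift) in practice."""
    last_loaded_at = fetch_last_loaded_at(table)
    if not is_fresh(last_loaded_at):
        # A real pipeline might page on-call or post to Slack instead of raising.
        raise RuntimeError(f"{table} is stale; last load at {last_loaded_at:%Y-%m-%d %H:%M} UTC")


if __name__ == "__main__":
    # Stand-in for a warehouse query so the sketch runs on its own.
    fake_fetch = lambda table: datetime.now(timezone.utc) - timedelta(minutes=30)
    check_table(fake_fetch, "analytics.orders")
    print("analytics.orders looks fresh")
```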


See more jobs at Catch Co.

# How do you apply?

This job post has been closed by the poster, which means they probably have enough applicants now. Please do not apply.