Remote Data Pipeline Jobs
Browse 3+ remote Data Pipeline jobs in September 2020 at companies like InReach Ventures, vidIQ and ChartMogul, with salaries ranging from $70,000/year to $75,000/year, working as a Product Manager - Data Platform, Senior Data Engineer or ML Engineer.

InReach Ventures

ML Engineer

UK or Italy

This job post is closed and the position is probably filled. Please do not apply.
InReach is changing how VC in Europe works, for good. Through data, software and Machine Learning, we are building an in-house platform to help us find, reach out to and invest in early-stage European startups, regardless of the city or country they're based in.

We are looking for a machine learning engineer to lead the continued development of InReach's machine learning capabilities. This involves cleaning, wrangling, merging and processing the data on companies and founders from across Europe, building algorithms to find new opportunities, and building the pipelines for continuous improvement.

It is important to us that candidates be passionate about helping entrepreneurs and startups. This is our bread and butter and we want you to be involved.

This is a remote-first role, whether you're in the office in London or working remotely, so we are looking for someone with excellent written and spoken communication skills. InReach is a remote-first employer and we are looking to this hire to help us become an exceptional place to work for remote employees.

**Background Reading**

* [InReach Ventures, the 'AI-powered' European VC, closes new €53M fund](https://techcrunch.com/2019/02/11/inreach-ventures-the-ai-powered-european-vc-closes-new-e53m-fund/?guccounter=1)
* [The Full-Stack Venture Capital](https://medium.com/entrepreneurship-at-work/the-full-stack-venture-capital-8a5cffe4d71)
* [Roberto Bonanzinga starts InReach Ventures with DIG platform](https://www.businessinsider.com/roberto-bonanzinga-starts-inreach-ventures-with-dig-platform-2015-11?r=US&IR=T)
* [Exceptional Communication; our guidelines for remote working](https://www.craft.do/s/Isrjt4KaHMPQ)

**Interview Process**

* 15m video chat with Ben, CTO, to find out more about InReach and the role
* 2h data pipeline technical test working alongside Ben
* 2h data science technical test working alongside Ghyslain, Product Manager
* 30m architectural discussion with Ben, talking through the work you did on the pipeline
* 30m data science discussion with Ghyslain, talking through the data science work
* 2h interview with different team members from across InReach. We're a small company so it's important we see how we'll all work together - not just the tech team!

# Responsibilities

* Creatively and quickly coming up with effective solutions to undefined problems
* Choosing technology that is modern but not hype-driven
* Developing features and tests quickly with good, clean code
* Researching and experimenting with algorithms in a structured fashion, using engineering discipline
* Being part of the wider development team, reviewing code and participating in architecture from across the stack
* Communicating exceptionally, both asynchronously (written) and synchronously (spoken)
* Helping to shape InReach as a remote-first organization

# Requirements

**Skills**

* Excellent spoken and written English
* Experience working for a remote organization - or the ability to compellingly describe why you'll be great at it!
* Great time management and communication

**Technologies**

* Python 3
* Jupyter Notebooks
* Pipenv
* Python unittest
* Postgres
* Pandas

None of these are prerequisites, but they help:

* SQS
* DynamoDB
* Scikit-learn
* Pandas
* AWS Lambda
* Docker
* NumPy
* PyTorch

#Salary
$70,000

#Location
UK or Italy
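To make the listed stack concrete, here is a minimal sketch of the kind of cleaning-and-merging step the posting describes, using Python 3, pandas and unittest from the required list. The table layout, column names and the `merge_company_founders` helper are hypothetical illustrations, not InReach's actual pipeline code.

```python
# Hypothetical sketch: normalise company records, then attach founders.
# Column names and data are invented for illustration only.
import unittest

import pandas as pd


def merge_company_founders(companies: pd.DataFrame, founders: pd.DataFrame) -> pd.DataFrame:
    """Clean company names, drop duplicates, and left-join founders."""
    cleaned = companies.assign(name=companies["name"].str.strip().str.lower())
    deduped = cleaned.drop_duplicates(subset="name")
    return deduped.merge(founders, on="company_id", how="left")


class MergeCompanyFoundersTest(unittest.TestCase):
    def test_duplicates_are_dropped_before_merge(self):
        companies = pd.DataFrame({"company_id": [1, 2], "name": [" Acme ", "acme"]})
        founders = pd.DataFrame({"company_id": [1], "founder": ["Ada"]})
        merged = merge_company_founders(companies, founders)
        self.assertEqual(len(merged), 1)
        self.assertEqual(merged.iloc[0]["founder"], "Ada")


if __name__ == "__main__":
    unittest.main()
```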


See more jobs at InReach Ventures

# How do you apply?

This job post is older than 30 days and the position is probably filled. Try applying to jobs posted recently instead.

vidIQ

Senior Data Engineer

Timezones: from UTC-2 to UTC+8

python

scala

etl

aws


This job post is closed and the position is probably filled. Please do not apply.
**About vidIQ:**

vidIQ helps YouTube creators and brands generate more views and subscribers while saving time. With over 1 million active weekly users, we are the #1 Chrome extension for YouTube creators, with clients including Red Bull, BuzzFeed, PBS, TMZ and the BBC, as well as hundreds of thousands of the largest YouTube creators in the world. We're backed by top Silicon Valley investors including Scott Banister and Mark Cuban. vidIQ is profitable, with a fully remote team of over 25 employees and growing.

**Role & Responsibilities**

vidIQ is seeking a highly motivated Senior Data Engineer with 5+ years of hands-on data engineering experience to join our growing team. The ideal candidate will be a go-getter with the ability to work independently. In this role, you will have oversight of partitioning data, building an ETL pipeline, data compaction, and AWS optimization.

You must be highly collaborative and a self-starter who is able to work in a fast-paced environment. Strong communication skills are essential in this role: you will communicate to the back-end team where and how to implement data integration and persistence, and communicate to the team and management the volumes of data we are gathering, the data access points, and how to use this data.

# Responsibilities

**You will be a good fit for this role if the following are true:**

* You love building things. You like new challenges and strive to ship new features to customers on a regular basis.
* You love to learn. You enjoy keeping up with the latest trends. If a project uses a tool that's new to you, you dive into the docs and tutorials to figure it out.
* You act like an owner. When bugs appear, you document and fix them. When projects are too complex, you work with others to refine the scope until it's something you believe can be built in a reasonable amount of time and maintained in the long run.
* You care about code quality. You believe simple is better and strive to write code that is easy to read and maintain.
* You consider edge cases and write tests to handle them. When you come across legacy code that is difficult to understand, you add comments or refactor it to make it easier for the next person.
* You understand balance. Great products must balance performance, customer value, code quality, dependencies, and so on. You know how to consider all of these concerns while keeping your focus on shipping things.
* You over-communicate by default. If a project is off-track, you bring it up proactively and suggest ways to simplify and get things going. You proactively share status updates without being asked and strive to keep things as honest and transparent as possible.

# Requirements

**Minimum experience:**

* 5+ years of experience using *Python* for internal data pipelines (moving data inside the *AWS* account): *numpy*, *pandas*.
* Experience with *Scala* for external data pipelines (moving data from outside the *AWS* account into the *AWS* account): *FS2*, *http4s*.
* Additional experience with *DynamoDB*, *Lambda*, *Athena*, *S3* and *AWS Glue*. Familiarity with *Spark* (currently Scala only) preferred.
* Hands-on experience with data workflow orchestration (*Airflow*).

#Salary
$70,000

#Location
Timezones: from UTC-2 to UTC+8
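Since the listing asks for hands-on Airflow experience, here is a minimal sketch of the kind of two-step DAG that requirement refers to, assuming Airflow 2-style imports. The DAG id, task names and step bodies are hypothetical; nothing in the post describes vidIQ's actual pipelines.

```python
# Hypothetical sketch of a daily ETL DAG (Airflow 2-style imports assumed).
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    """Pull raw event data from an external source (placeholder step)."""


def compact():
    """Partition and compact the raw data for cheaper queries (placeholder step)."""


with DAG(
    dag_id="example_events_pipeline",  # hypothetical name
    start_date=datetime(2020, 9, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    compact_task = PythonOperator(task_id="compact", python_callable=compact)

    # extract must finish before compact runs
    extract_task >> compact_task
```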


See more jobs at vidIQ

# How do you apply?

This job post is older than 30 days and the position is probably filled. Try applying to jobs posted recently instead.
ChartMogul

Product Manager - Data Platform

🌏 Worldwide

This job post is closed and the position is probably filled. Please do not apply.
As Product Manager for Data Platform you will play a central role in ChartMogul's overall success. The Data Platform team is responsible for improving and maintaining the backend data ingestion pipeline, the public APIs and most of the data output features of ChartMogul.

# Responsibilities

In this role, you will lead execution of the Data Platform product roadmap by prioritizing projects in collaboration with all relevant stakeholders. You will work side by side with the engineers in your squad to design, build, test and roll out new features and updates. Last, but not least, you will actively seek out feedback from customers to truly understand their problems and find simple, innovative solutions to those problems.

You will become a domain expert in different subscription business models, the analytics and features that are meaningful to each, the solutions currently in the market, and the problems people face trying to gain insight into their businesses.

# Requirements

* Due to the highly technical nature of this product role we are looking for someone with a background in software engineering who has made the jump into Product, so a BA in Computer Science or equivalent work experience is required
* You are able to isolate user pain points and devise simple solutions to complex problems
* You have passion and intuition for product quality, and strong critical thinking and analytical skills to match
* You have a knack for articulating and distilling complex topics into simple, plain English
* Your attention to detail borders on the obsessive
* You have several years of experience in product management, ideally for a SaaS product in the B2B market
* Your technical aptitude and interpersonal skills enable you to work productively with engineering teams
* You can understand our entire stack - from the front end to the API to the data model
* You are an excellent communicator (verbal and written) and can adapt to a range of different audiences
* You have people management or coaching/mentoring experience and enjoy seeing people develop
* You are self-aware and humble - you recognize that the team's success is your success

**Nice to have**

* Experience in the Analytics/BI market
* Past experience as an engineer working with Ruby on Rails, Postgres and JavaScript
* Experience working with a variety of data technologies in past roles (MySQL, Redshift, S3, Snowflake, HBase, Kafka, Spark) as a user and/or operationally

**Compensation**

* In addition to your salary you will receive stock options and an education budget regardless of location; other benefits depend on location

#Salary
$75,000

#Location
🌏 Worldwide


See more jobs at ChartMogul

# How do you apply?

This job post is older than 30 days and the position is probably filled. Try applying to jobs posted recently instead.