Remote Data Jobs in November 2019 📈 Open Startup

get a remote job
you can do anywhere

14 Remote Data Jobs at companies like Intrinio, Kraken Digital Asset Exchange, and Thorn, last posted 21 days ago



Last 30 days

Intrinio

Senior Ruby Engineer

verified
🇺🇸US-only

dev

ruby

api

data


🇺🇸US-only · 21d
**Why Work at Intrinio**

We are a fast-paced and well-funded startup creating new technology in the financial data market. Our team is highly experienced and productive. We enjoy working together and advancing in our craft. Our goal is to produce world-class software that will significantly disrupt the world of finance, creating new efficiencies and encouraging innovation by smaller players.

**About the Job**

We are looking for a senior-level Ruby software engineer. In this position, you will be actively contributing to the design and development of our financial data platform and products. If you have the skills and ability to build high-quality, innovative, and fully functional software in line with modern coding standards and solid technical architecture, we want to talk to you. Intrinio is a startup (20+ people), so you should be comfortable working on a small team, moving fast, breaking things, committing code several times a day, and delivering working software weekly.

**Ideal candidates will have several of the following:**
* Mastery of the Ruby programming language and its major frameworks
* Knowledge of data stores and their use cases: SQL databases, Redis, and Elasticsearch
* Experience with API development and usage
* A history of learning new technology stacks and methodologies
* Interest (or experience) in the financial markets
* A track record of public and/or private contributions on GitHub

# Responsibilities

* Write well-designed, testable, documented, and performant code
* Commit code several times a day
* Deliver working features on a weekly basis
* Review the code of, and help to mentor, junior and mid-level developers
* Communicate clearly and promptly with managers and team members (we use a Kanban process with Monday and Slack)

# Requirements

* Significant time working remotely (please do not apply otherwise)
* 5+ years of software engineering experience
* Significant experience developing web applications and APIs
* Strong knowledge of relational databases, SQL, and ORM libraries

#Location
- 🇺🇸US-only

See more jobs at Intrinio

# How do you apply?

Visit https://about.intrinio.com/careers and click "Apply" next to "Senior Software Engineer". Our CTO will reach out to you shortly after for next steps.
Apply for this Job

👉 Please mention that you found the job on Remote OK; this helps us get more companies to post here!

When applying for jobs, you should NEVER have to pay to apply. That is a scam! Posts that link to pages with "how to work online" are also scams. Don't use them or pay for them. Also always verify you're actually talking to the company in the job post and not an imposter. Scams in remote work are rampant, be careful! When clicking on the button to apply above, you will leave Remote OK and go to the job application page for that company outside this site. Remote OK accepts no liability or responsibility as a consequence of any reliance upon information on there (external sites) or here.

Kraken Digital Asset Exchange

Data Engineer - Cryptowatch

North America or Europe

data

engineer

growth

digital nomad


North America or Europe · 22d
You will be an instrumental piece of a small team with a mandate to understand how Cryptowatch visitors and clients are using the product. Succeeding in this role requires knowledge of architecting data systems, a deep understanding of how to measure user behavior, and the ability to translate raw data into easy-to-understand dashboards. You will work closely with marketers and product managers on the Growth team to design and build user behavior measurement infrastructure and translate this data into insights. By structuring and helping build measurement pipelines, you'll help the team learn about customers and drive growth. Your work will directly impact the product roadmap and bottom line of the Cryptowatch business.

You will also help establish measurement of key conversion and retention metrics, then use them to identify opportunities for improvement in the product experience. As a full-stack developer passionate about driving towards business goals, you will work up and down the stack and pick up new tools and frameworks quickly.

# Responsibilities

* Design and help implement data pipelines that collect, transform, and curate data to help our team understand user behavior on the site, using data from external tools and internal databases.
* Work with the Cryptowatch Growth team to design lightweight experiments that help us learn about customers and drive key growth metrics.
* Create structure and process around growth experimentation, data collection, and user research from the ground up.
* Work with Business Operations, Strategy, Marketing, and Product to collectively grow our understanding of our customer base.

# Requirements

* 5+ years of work experience in a relevant field (Data Engineer, DW Engineer, Software Engineer, etc.).
* You are comfortable spec'ing analytics from the end user's dashboard down to the events in our UI.
* You have expertise with the React.js framework.
* You have experience with Golang and PostgreSQL.
* You are quick to pick up new tools and frameworks.
* You have a strong ability to search a large codebase to find what you're looking for.
* You are able to communicate effectively with businesspeople, designers, developers, marketers, product managers, and customers.
* You always ask "why?" and love searching for ways to answer your questions quantitatively.
* You are skilled in data visualisation and web analytics tools like Grafana and Mixpanel.

#Location
- North America or Europe

See more jobs at Kraken Digital Asset Exchange

# How do you apply?

Apply [here](https://jobs.lever.co/kraken).
Apply for this Job


Thorn

Senior Data Engineer

🇺🇸US-only

data

engineering

engineer

work from home


🇺🇸US-only · 23d
Thorn is a non-profit focused on building technology to defend children from sexual abuse. Working at Thorn gives you the opportunity to apply your skills, expertise, and passions to directly impact the lives of vulnerable and abused children. Our staff solves dynamic, quickly evolving problems with our network of partners from tech companies, NGOs, and law enforcement agencies. If you are able to bring clarity to complexity and lightness to heavy problems, you could be a great fit for our team.

Earlier this year, we took the stage at TED and shared our audacious goal of eliminating child sexual abuse material from the internet. A key aspect of our work is partnering with the National Center for Missing & Exploited Children and building technology to optimize the broader ecosystem combating online child sexual abuse.

**What You'll Do**

* Collaborate with other engineers on your team to build a data pipeline and client application from end to end.
* Prototype, implement, test, deploy, and maintain stable data engineering solutions.
* Work closely with the product manager and engineers to define product requirements.
* Present possible technical solutions to various stakeholders, clearly explaining your decisions and how they address real user needs, incorporating feedback in subsequent iterations.

**What We're Looking For**

* You have a commitment to putting the children we serve at the center of everything you do.
* You have strong software development knowledge, with experience building, growing, and maintaining a variety of products, and a love for creating elegant applications using modern technologies.
* You're experienced with DevOps (Docker, AWS, microservices) and can launch and maintain new services.
* You are experienced with distributed data storage systems/formats such as MemSQL, Snowflake, Redshift, Druid, Cassandra, Parquet, etc.
* You have worked with real-time systems using various open source technologies like Spark, MapReduce, NoSQL, Hive, etc.
* You have knowledge of data modeling, data access, and data storage techniques for big data platforms.
* You have an ability and interest in learning new technologies quickly.
* You can work with shifting requirements and collaborate with internal and external stakeholders.
* You have experience prototyping, implementing, testing, and deploying code to production.
* You have a passion for product engineering and an aptitude for working in a collaborative environment; you can demonstrate empathy and strong advocacy for our users while balancing the vision and constraints of engineering.
* You communicate clearly, efficiently, and thoughtfully. We're a highly distributed team, so written communication is crucial, from Slack to pull requests to code reviews.

**Technologies We Use**

*You should have experience with at least a few of these, and a desire and ability to learn the rest.*

* Python
* Elasticsearch / PostgreSQL
* AWS / Terraform
* Docker / Kubernetes
* Node / TypeScript

#Salary
100000-150000

#Location
- 🇺🇸US-only

See more jobs at Thorn

Apply for this Job


This year

Doximity is transforming the healthcare industry. Our mission is to help doctors be more productive, informed, and connected. As a software engineer focused on our data stack, you'll work within cross-functional delivery teams alongside other engineers, designers, and product managers to build software that helps improve healthcare.

Our [team](https://www.doximity.com/about/company#theteam) brings a diverse set of technical and cultural backgrounds, and we like to think pragmatically in choosing the tools most appropriate for the job at hand.

**About Us**
* We rely heavily on Python, Airflow, Spark, MySQL, and Snowflake for most of our data pipelines
* We have over 350 private repositories on GitHub containing our pipelines, our own internal multi-functional tools, and [open-source projects](https://github.com/doximity)
* We have worked as a distributed team for a long time; we're currently [about 65% distributed](https://blog.brunomiranda.com/building-a-distributed-engineering-team-85d281b9b1c)
* Find out more information on the [Doximity engineering blog](https://engineering.doximity.com/)
* Our [company core values](https://work.doximity.com/)
* Our [recruiting process](https://engineering.doximity.com/articles/engineering-recruitment-process-doximity)
* Our [product development cycle](https://engineering.doximity.com/articles/mofo-driven-product-development)
* Our [on-boarding & mentorship process](https://engineering.doximity.com/articles/software-engineering-on-boarding-at-doximity)

**Here's How You Will Make an Impact**

* Collaborate with product managers, data analysts, and data scientists to develop pipelines and ETL tasks that facilitate the extraction of insights from data.
* Build, maintain, and scale data pipelines that empower Doximity's products.
* Establish data architecture processes and practices that can be scheduled, automated, and replicated, and that serve as standards for other teams to leverage.
* Spearhead, plan, and carry out the implementation of solutions while self-managing.

**About You**

* You have at least three years of professional experience developing data processing, enrichment, transformation, and integration solutions
* You are fluent in Python, an expert in SQL, and can script your way around Linux systems with bash
* You are no stranger to data warehousing and designing data models
* Bonus: You have experience building data pipelines with Apache Spark in a multi-database ecosystem
* You are foremost an engineer, with a passion for high code quality, automated testing, and other engineering best practices
* You have the ability to self-manage, prioritize, and deliver functional solutions
* You possess advanced knowledge of Unix, Git, and AWS tooling
* You agree that concise and effective written and verbal communication is a must for a successful team
* You are able to maintain a minimum of 5 hours overlap with 9:30 AM to 5:30 PM Pacific time
* You can dedicate about 18 days per year to travel to company events

**Benefits**

Doximity has industry-leading benefits. For an updated list, see our careers page.

**More info on Doximity**

We're thrilled to be named the Fastest Growing Company in the Bay Area, and one of Fast Company's Most Innovative Companies. Joining Doximity means being part of an incredibly talented and humble team. We work on amazing products that over 70% of US doctors (and over one million healthcare professionals) use to make their busy lives a little easier. We're driven by the goal of improving inefficiencies in our $3.5 trillion U.S. healthcare system and love creating technology that has a real, meaningful impact on people's lives. To learn more about our team, culture, and users, check out our careers page, company blog, and engineering blog. We're growing steadily, and there are plenty of opportunities for you to make an impact.

*Doximity is proud to be an equal opportunity employer, committed to providing employment opportunities regardless of race, religious creed, color, national origin, ancestry, physical disability, mental disability, medical condition, genetic information, marital status, sex, gender, gender identity, gender expression, pregnancy, childbirth and breastfeeding, age, sexual orientation, military or veteran status, or any other protected classification. We also consider qualified applicants with criminal histories, consistent with applicable federal, state, and local law.*

#Location
- North America

See more jobs at Doximity

# How do you apply?

This job post is older than 30 days and the position is probably filled. Try applying to jobs posted recently instead.
Apply for this Job


Creative Commons

Senior Data Engineer

verified
🌏Worldwide

data

engineering

engineer

senior


🌏Worldwide · 3mo
Creative Commons is building a "front door" to the growing universe of openly licensed and public domain content through CC Search and the CC Catalog API. The Senior Data Engineer reports to the Director of Engineering and is responsible for CC Catalog, the open source catalog that powers those products. This project will unite billions of records for openly licensed and public domain works and metadata, across multiple platforms, diverse media types, and a variety of user communities and partners.

**Diversity & inclusion**

We believe that diverse teams build better organizations and better services. Applications from qualified candidates from all backgrounds, including those from under-represented communities, are very welcome. Creative Commons works openly as part of a global community, guided by collaboratively developed codes of conduct and anti-harassment policies.

**Work environment and location**

Creative Commons is a fully distributed organization with no central office. You must have reasonable mobility for travel to twice-annual all-staff meetings and the CC Global Summit (a total of 3 trips per year). We provide a subsidy towards high-speed broadband access. Laptop/desktop computer and necessary resources are supplied.

# Responsibilities

**Primary responsibilities**

Architect, build, and maintain the existing CC Catalog, including:
* Ingesting content from new and existing sources of CC-licensed and public domain works.
* Scaling the catalog to support billions of records and various media types.
* Implementing resilient, distributed data solutions that operate robustly at web scale.
* Automating data pipelines and workflows.
* Collaborating with the Backend Software Engineer and Front End Engineer to support the smooth operation of the CC Catalog API and CC Search.

Augment and improve the metadata associated with content indexed into the catalog using one or more of the following: machine learning, computer vision, OCR, data analysis, web crawling/scraping.

Build an open source community around the CC Catalog, including:
* Restructuring the code and workflows so that community contributors can identify new sources of content and add new data to the catalog.
* Guiding new contributors and potentially participating in projects such as Google Summer of Code as a mentor.
* Writing blog posts, maintaining documentation, reviewing pull requests, and responding to issues from the community.

Collaborate with other outside communities, companies, and institutions to further Creative Commons' mission.

# Requirements

* Demonstrated experience building and deploying large-scale data services, including database design and modeling, ETL processing, and performance optimization
* Proficiency with Python
* Proficiency with Apache Spark
* Experience with cloud computing platforms such as AWS
* Experience with Apache Airflow or other workflow management software
* Experience with machine learning, or interest in picking it up
* Fluent in English
* Excellent written and verbal communication skills
* Ability to work independently, build good working relationships, and actively communicate, contribute, and speak up in a remote work structure
* Curiosity and a desire to keep learning
* Commitment to consumer privacy and security

Nice to have (but not required):
* Experience with contributing to or maintaining open source software
* Experience with web crawling
* Experience with Docker

#Salary
100000-120000

#Location
- 🌏Worldwide

See more jobs at Creative Commons

# How do you apply?

This job post is older than 30 days and the position is probably filled. Try applying to jobs posted recently instead.
Apply for this Job


Good Eggs

Senior Data Platform Engineer

verified
🇺🇸US-only

data

snowflake

dbt

devops


🇺🇸US-only · 7mo
At Good Eggs, we believe feeding your family well shouldn't come with a trade-off, be it your time, your standards, or your wallet. We're pioneering a new way to fill your fridge, by sourcing the best food from producers we know and trust, and bringing it straight to you, all at a price the same as or less than your grocery store.

We run a healthy agile engineering process with:

* pair programming
* test-driven development
* continuous deployment

# Responsibilities

We're looking for a Data Platform Engineer who is interested in a multidisciplinary engineering environment and is excited to support the culture of data alongside a passionate, mission-driven team.

As a Data Platform Engineer, you'll work on ingest, modeling, warehousing, and BI tools, and have significant influence over the tools and processes we deliver to our customers (analysts, engineers, business leaders). We have a modern data platform and a strong team of DevOps Engineers and Full-Stack Data Analysts to collaborate with.

Some of the tech involved:

* custom code written in multiple languages (primarily Node.js/TypeScript, but also Python and Go)
* Fivetran & Segment
* Snowflake
* dbt
* Mode Analytics
* a modern, AWS-based, containerized application platform

# Requirements

**Ideal candidates will have:**
* A desire to use their talents to make the world a better place
* 2+ years of agile software development experience, including automated testing and pair programming
* 3+ years of full-time data experience (ETL, warehousing, modeling, supporting analysts)
* Interest in learning and adopting new tools and techniques
* Bachelor's degree in computer science or computer engineering, or equivalent experience

**Experience in some of the following areas:**
* Node.js/TypeScript, Go, Python, SQL
* DevOps, cloud infrastructure, developer tools
* Container-based deployments, microservice architecture

**Bonus points for:**
* Previous work experience involving e-commerce, physical operations, finance, or BizOps
* Being data-driven: the ability to get insights from data
* Experience with dimensional modeling and/or BEAM*

#Location
- 🇺🇸US-only

See more jobs at Good Eggs

# How do you apply?

This job post is older than 30 days and the position is probably filled. Try applying to jobs posted recently instead.
Apply for this Job


In Marketing We Trust

Conversion Optimisation Data Specialist

verified

cro

digital marketing

data

analytics


7mo
**The Company**

We are a digital analytics agency and we are very good at Analytics, SEO, CRO, and Data Science. [Find out more about our services](https://www.inmarketingwetrust.com.sg/services/)

> Our motto: Traffic is nice. Revenue is better.

We have the most fun doing it in competitive environments such as finance, education, jobs, insurance, automotive, and many more.

We strive to be an extension of our client's team and to provide tailor-made, return-focused work.

We are picky about the clients we bring on board and very protective of our reputation. We won't engage with a firm unless we believe we can make a difference to their business and help them Get Stuff Done!

This stops us from taking on demoralising projects that would leave us hating our jobs and our clients.

We think the current agency model is broken and we really believe we can build a better alternative. But we need great people to help make this happen, and that's where you come in!

We are a bootstrapped company; we grew exponentially in the last few years and are now at 60 people.

We were built to be remote from day 1, but if you want to say 'Hello', our office is 78 meters from the beach in Sydney. Make sure you bring your swimwear!

**The Opportunity**

We are looking for a passionate Conversion Rate Optimisation (CRO) Specialist to join our growing team. The CRO Specialist will work on multiple client projects in support of our broader team to contribute to and grow our conversion research and testing services.

You will need a mix of creativity and strong analytical skills to identify website optimisation opportunities for our clients. You will support the development and execution of strategies and help turn those ideas into an optimisation and testing plan.

You should have enough statistical and analytical ability to help us understand data in a way that results in actionable suggestions for our clients.

If you have a great understanding of how traffic and audiences are built, and of what it takes to incrementally drive improvements to UX and CRO for lead generation and online sales, we want you!

# Responsibilities

You'll have a data-driven approach to optimisation and a solid technical understanding of web analytics and CRO tools. Your day-to-day activities will include:

* Analysing quantitative and qualitative data, including web, CRM, user testing, polls, etc.
* Using your findings to present actionable recommendations and test ideas to drive the best possible user experience
* Interpreting data, identifying key findings, and making recommendations based on testing results
* Understanding customer journeys and struggles through analysis such as purchase funnel analysis and heat mapping
* Identifying potential areas of improvement on the website based on Adobe/Google Analytics data
* Helping define and recommend measurements, strategies, and reporting, using data to drive valuable business decisions
* Collaborating with the CRO team on strategy, testing roadmaps, and test hypotheses
* Supporting the design of A/B and multivariate tests, including the quality assurance process
* Monitoring and continuously evaluating your ideas and designs through development and production phases
* Effectively communicating and demonstrating your ideas and concepts to all stakeholders, including both technical and non-technical audiences

# Requirements

**The ideal candidate**
* Must have experience running analysis using web analytics tools like Google Analytics (Adobe Analytics is a plus)
* Passion for understanding user behaviour and pain points using data
* Experience in extracting and interpreting tests
* Must have a good understanding of the CRO framework
* Excellent analytical skills
* Ability to clearly communicate data
* Detail-oriented and organised
* Willingness and ability to learn

**Attributes and behaviours we love to see**
* A can-do attitude
* Ownership of your work
* Inquisitive
* Analytical
* Creative in context
* Results orientated
* Collegiate and supportive
* A sense of humour

**What we offer you**

Career progression is based on your ability to deliver and drive ideas and difference for both the client and the company, not your ability to play politics or the cut of your suit (in fact, we have a no-suit policy).

A highly collaborative remote working environment where teamwork is championed and ideas are shared. You will be someone who is unafraid of contributing ideas and happy to work as part of a remote team; in exchange for your hard work, we can give you a unique opportunity to shape and contribute to a flourishing business and achieve your lifestyle goals.

We even fly the team each year to our awesome [TrustEDConf](http://www.trustedconf.com/) event. The last one was in October in Borneo.

**A 100% remote team from day 1**

An important point that is often overlooked: you will be truly part of a team. Many remote workers easily feel isolated from what's happening in the business.

We take good care of our teammates. So much so that a person who joined our team in late November told us recently:

> I felt more welcomed, and truly part of a team, even with colleagues 10,000 km away, than when I was working in an office, sitting next to my former teammates.

Don't take our word for it. [Check what our teammates are saying on Glassdoor](https://www.glassdoor.com.au/Reviews/In-Marketing-We-Trust-Reviews-E694140.htm) (yep, 2 reviews are not great, but most are).

#Salary
7,000 AUD

See more jobs at In Marketing We Trust

# How do you apply?

This job post is older than 30 days and the position is probably filled. Try applying to jobs posted recently instead.
Apply for this Job


Saagie

Senior Back-end Developer

verified
France or between GMT-1 to GMT+3

kotlin

spring

data

governance


France or between GMT-1 to GMT+3 · 9mo
We're looking for a senior back-end developer to join our product team!\n\n## Why working at Saagie?\n* **Real agile organization**. Human before process\n* Work on **open source** projects\n* Flexible work schedules and **remote work** allowed\n* **Conferences lover**? We can sponsor you! (Limited to Europe)\n\n## Additional Information\n* Location: Rouen or Paris office (France), or anywhere (full-time remote work)\n* Contract: permanent\n* EU work permit needed\n\n# Responsibilities\n You'll be in charge of developing modules and connectors integrated into our product micro-services architecture (running on Kubernetes), and especially on our **data governance** and **security** modules.\nYou'll be working in a feature team with direct relations with the SRE team.\n\nYour team is responsible for his own architectural and technological choices and you are committed to:\n* **🤟 Contribute** to improving Saagie's platform **quality** \n* **🛠 Improve** its **maintainability** \n* **👮 guarantee** its operational condition\n* **🏭 Industrialize** your developments so that they are integrated as soon as possible into our daily deliveries to production. 
\n\n# Requirements\n* Minimum **5 years' experience** as a back-end developer\n* Interested in **Data**, big data, privacy, GDPR\n* You have development skills in **Java** (or Kotlin) and Spring Boot\n* You know how to properly **test** your code\n* **Docker** holds no secrets for you\n* **Automation** and **continuous integration** are standard practice for you\n* You know how - and want - to share your knowledge with your teammates\n* Resourceful and **open-minded**: you're keen to enhance your skills and pick up new tools quickly\n* **Autonomous**, but you can also work with teammates\n* You are **pragmatic** and **delivery-oriented**\n* Fluent in English at a minimum (French appreciated)\n\n### Nice to have\nKnowledge of:\n* **Kubernetes**\n* Hadoop\n* Front-end development (Angular/React)\n \n\n#Salary\n38K€ - 55K€\n \n\n#Location\n- France or between GMT-1 to GMT+3

See more jobs at Saagie

# How do you apply? This job post is older than 30 days and the position is probably filled. Try applying to jobs posted recently instead.
Apply for this Job

👉 Please reference you found the job on Remote OK, this helps us get more companies to post here!



**The Company**\nSynergy Sports Technology, named by Fast Company as one of the world's top 10 most innovative companies in sports, seeks talented **Senior Backend Data Platform Engineers** to join our team on a long-term contract basis. \nThis position offers a tremendous opportunity to work with the only company that delivers on-demand professional-level basketball, baseball, and hockey analytics linked to supporting video to nearly 1,500 college, professional, and international teams. Our systems are highly complex, contain petabytes of data and video, and require extremely talented engineers to maintain the scale and efficiency of our products.\nAs a member of the Synergy engineering team, you will contribute to the ongoing development of Synergy’s revolutionary online sports data and video delivery solutions, building applications such as:\n* Client Analytic Tools\n* Video Editing and Capture Tools\n* Data Logging Tools\n* Operational Game, Data and Video Pipeline Tools\n* Backend Data and Video Platforms\n\nSynergy’s work environment is geographically distributed, with employees working from home offices. The successful candidate must be comfortable working in a virtual office, using online collaboration tools for all communication and interaction in conversational English. Synergy development staff work in a deadline-oriented, demanding, non-standard environment in which personal initiative and a strong work ethic are rewarded. Good communication skills, self-motivation, and the ability to work effectively with minimal supervision are crucial. Nonstandard working hours may be required, as Synergy operates on a 24x7 system for clients, with associated deadlines and requirements. 
Pay rate is dependent on experience.\nInformation for all positions:\n* All positions last roughly a year; contracts renew annually, and talented engineers are kept on for future projects\n* Engineers should be available for phone calls M-F from 7am to 10am Pacific Time. There will usually be 1 or 2 phone calls each week of 30 to 90 minutes each. Outside of those calls, engineers choose when to work, balancing communication with their team against their personal commitments.\n* Working an average of 40 hours per week is expected except in rare or temporary circumstances. Each week is flexible: heavier and lighter weeks are fine if desired, as long as the average stays around 40 hours.\n* No travel is required\n\n# Responsibilities\n **Team Objectives**\n\nA candidate joining the Data Platform team can expect to work on the following types of projects:\n* Creating internal and external APIs to support both data and video\n* Building complex data models supporting the business rules of sports\n* Developing algorithms that ingest and transform multiple streams of data, collapsing them into a single event structure\n* Refactoring code to a .NET Core environment\n* Scaling out current systems to support new sports\n* Building build and test automation systems\n* Building complex reporting data structures for analytical systems \n\n# Requirements\n**Required Skill Sets**\n* NoSQL database (MongoDB preferred)\n* C# (latest version with a preference for .NET Core)\n

See more jobs at Synergy Sports Technology


Stencila


Product Designer


design

ux

ui

data

11mo
Join a team developing the next generation of tools for data-driven scientific discovery. Stencila aims to lower the barriers to reproducible research and enable collaboration between researchers with different levels of technical expertise. We're looking for a full-time UX/UI designer to lead the design and testing of novel user interfaces for data exploration, visualization and analysis.\n\n* [Stencila](https://stenci.la) is building a toolbox of modular, interoperable software components for researchers, ranging from Stencila [Dockter](https://github.com/stencila/dockter) (a command line tool that makes it easier to create Docker images) to Stencila [Hub](https://hub.stenci.la) (a web app that integrates our tools with third party services like Github and Dropbox) - and plenty of others in between!\n* We're passionate about making data-driven discovery more accessible to more people and closing the gaps in collaboration between coders and non-coders.\n* We’re obsessed with building tools that make simple things easy and complex things possible.\n* With funding from the Alfred P. Sloan Foundation, we are building a core team to take Stencila from a prototype to production.\n* We are a small, diverse, 100% remote, 100% open-source team linked to a broad open source and open science community.\n\n# Responsibilities\n * Lead the design of our user interfaces from research, concept and validation through to documentation, implementation, testing and deployment.\n* Organise and facilitate collaborative design sessions with users.\n* Research the user interface (UI), user experience (UX), and developer experience (DX) of our tools and other similar tools. Prepare your research as a report for the team and a blog post that can be shared openly.\n* Users are telling us they have problem X or want feature Y. How do other tools solve this problem or provide this feature? 
Summarise your research and share it as a mocked-up proposal to the team and the community.\n* Do usability testing to truly understand users' needs and problems. What is confusing to them? What gets in their way? What things should we take away? What should we add?\n* Mock up new (and refined) user interfaces, and get feedback from users, the rest of the team and the broader community. Lots of bonus points if you can start implementing the feature in coordination with our front-end engineer!\n* What’s our developer onboarding experience like? Talk to 3 - 5 open source contributors and find out what friction points exist in our API and documentation.\n \n\n# Requirements\n#### About you\n\n* You are passionate about understanding how our users think, what their problems are, and building engaging user experiences that solve those problems.\n* You are a designer who knows what it takes to build modern, responsive user interfaces - your thinking doesn’t stop at mock-ups, it extends into build and deploy.\n* You are keen to join in-person user workshops, value user feedback, and act upon it.\n* You are committed to open source and know from experience what it takes to grow a community around open-source software.\n* You have a strong bias towards getting it done; you choose completion over perfection. You get a buzz out of getting things shipped.\n* You take initiative, ownership and responsibility; you don’t need to wait for permission, you don’t mind admitting you were wrong, and you fix things when you are.\n* You want to work as part of a diverse team with complementary skills; you give support and take advice.\n* You are comfortable working across time zones as part of a small, remote, geographically distributed team. You review asynchronous feedback and get to work independently. 
\n* You are comfortable regularly communicating about wins, progress, and roadblocks.\n\n#### Your Skills & Experience\n\n* You are experienced in designing, deploying and testing user interfaces, preferably those for data-driven discovery.\n* You are an excellent communicator and facilitator, able to translate users' needs into implementable mock-ups and actionable technical requirements.\n* You are proficient with HTML and CSS and have a beautiful portfolio that shows it.\n* You have excellent English communication skills, both written and verbal.\n* It’s a nice-to-have if you have a research or science background working with data.\n\n#### Compensation\n\nUS $100,000\n

See more jobs at Stencila


Doubledot Media is a New Zealand-based company creating online tools and training for people looking to start their own online businesses. We’re based in Christchurch, but the majority of our team work remotely from various places around the world. \n\nWe have a full-time Product Analyst position available at [SaleHoo.com](https://www.salehoo.com) (Doubledot is the parent company).\n\nSaleHoo is a carefully curated directory of more than 8,000 dropship and wholesale suppliers. Our customers use our directory as a safer and easier way to find wholesale suppliers of goods to sell on eBay, Amazon or in their own stores.\n\nOur new Product Analyst/Analytics Expert will play a vital role in shaping our existing analytics stack, building and documenting the data pipeline, providing product insights and generally making sure we are rigorously testing our assumptions with data. \n\nWe're looking for someone with great analysis skills, a fair amount of cleverness, the ability to get things done, and a genuine desire to understand the customer behind the data.\n\nThis is the perfect position for someone who is knowledgeable and experienced with SQL, Google Analytics and Customer.io, loves data, and is a team-oriented, positive person.\n\n**Neat things about working with us:**\n* Flexible hours (we can discuss full-time, contract, in-house or part-time)\n* Work from home, a coffee shop or a co-op space.\n* Relaxed, ego-free, family-friendly work culture\n* Need a few more reasons? [Read what current and former staff have to say](https://www.glassdoor.com/Reviews/Doubledot-Media-Reviews-E1029356.htm)\n\n# Responsibilities\n You'll be responsible for:\n* Working with the Product and Management teams to understand KPIs and business goals\n* A complete audit of current reporting and analytics tools. 
\n* Making suggestions to improve our tools and encouraging seamless sharing across tools and departments\n* Developing analytics documentation - reference materials outlining what data is needed, all terminology and the reporting schema\n* Working with the product and development teams to scope out necessary changes to the analytics stack\n* Auditing any changes to ensure data accuracy and normalization\n* Working with the product and engineering teams to create ongoing reporting for product releases - including sprint-specific dashboards, benchmarking and cohort analysis\n* Working to develop meaningful segmentation models and behavioural analysis of key user segments to analyze churn, including elements like origin source, onboarding, plan types and feature retention\n* Completing critical analysis that will drive growth and provide insights \n\n# Requirements\nYou'll need:\n* An excellent understanding of SQL or other query languages\n* To love everything data - designing models, reporting, driving results/growth \n* Excellent written and spoken English and a friendly manner.\n* Meticulous attention to detail. Your spelling and grammar should be top-notch.\n* Good time management skills. Since this is a remote working position, you'll need to be organized, motivated, and (dare we say) a "self-starter".\n* A background in cross-functional data analytics on distributed teams.
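For readers wondering what the cohort-analysis work above involves in practice, here is a minimal sketch in Python using an in-memory SQLite database as a stand-in warehouse. The `users`/`events` tables, column names, and data are illustrative assumptions, not SaleHoo's actual schema:

```python
import sqlite3

conn = sqlite3.connect(":memory:")  # stand-in for a real analytics database
conn.executescript("""
CREATE TABLE users  (id INTEGER PRIMARY KEY, signed_up TEXT);
CREATE TABLE events (user_id INTEGER, occurred TEXT);
INSERT INTO users  VALUES (1,'2019-01-05'), (2,'2019-01-20'), (3,'2019-02-03');
INSERT INTO events VALUES (1,'2019-02-10'), (3,'2019-02-15');
""")

# For each signup-month cohort: cohort size, and how many members were
# still active (had any event) in the month after they signed up.
retention = conn.execute("""
    SELECT strftime('%Y-%m', u.signed_up) AS cohort,
           COUNT(DISTINCT u.id)           AS size,
           COUNT(DISTINCT e.user_id)      AS active_next_month
    FROM users u
    LEFT JOIN events e
      ON e.user_id = u.id
     AND strftime('%Y-%m', e.occurred) =
         strftime('%Y-%m', date(u.signed_up, 'start of month', '+1 month'))
    GROUP BY cohort
    ORDER BY cohort
""").fetchall()
```

The same query shape extends to churn and segmentation by adding plan type or origin source to the `GROUP BY`.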

See more jobs at SaleHoo.com


Drops


Growth Data Analyst


data

analyst

growth hacker

product analytics


1yr
If you have a soft spot for bootstrapped, profitable companies with a meaningful product, and you want to use your data analyst / growth skills for good, you’ll like this.\n\nAbout us:\n\nDrops’ goal is to turn language learning into a delightful game while ensuring effective learning. Our app has been in the App Store for 3 years, teaches 31 languages, and was featured by both the App Store and Play Store multiple times - Editors' Choice on the Play Store - and the company is still run by the founders. We are a small, super-capable remote team mainly spread across Europe. We work synchronously, so time zones matter for us. We communicate via Slack, Git and Trello and release twice a week. We want to be the no. 1 app for vocabulary learning, and we are getting there quickly with our current user base of 7 million, more than 800,000 monthly actives and an average store rating of 4.7.\n\nYou can find us here: [http://drops.app.link/](http://drops.app.link/)\n\nAbout you:\n\nYou’ll be responsible for improving the quantity, quality and reliability of the multivariate tests we run.\n\nYou’re a no-nonsense person who is comfortable taking ownership of all aspects of our analytics & growth funnel, who has experience working at a product company, and who can bring us insights, initiatives and execution that will help us grow.\n\nWe want everyone to see the big picture: this means you have already pushed your boundaries outside of the strict data realm, and are knowledgeable about mobile and web growth frameworks, ASO, SEO, and best practices regarding retention and monetization.\n\nWe’re building a small but super-capable team. You’re naturally more interested in the fate of the product and driven to grow professionally than in managing people.\n\nWe are looking for a missionary rather than a mercenary.\n\nWe value clear and honest communication and transparency; it’s the linchpin of our culture and our current success and freedom. 
You will be involved in both high- and low-level decision making and will be available during European working hours (9AM - 6PM GMT).\n\nWe offer:\n\n* An awesomely compact 13-person team\n* Educational allowance\n* Fitness allowance\n* All the perks of remote working\n* 30 days of holiday per year (including Christmas and other holidays)\n* Quarterly team gathering somewhere in the world\n* Stock options from a high-growth, profitable company\n\n# Responsibilities\n You will:\n\n* Work cross-functionally with our engineers, designer and marketing teams on opportunities to initiate, manage and analyze the A/B tests we’re running across multiple platforms.\n\n* Make sense of all the data that comes in (mobile, web, iTunes, Play Store, our own users database).\n\n* Gather insights from the existing data we have, and initiate projects to help improve our KPIs.\n\n* Prefer a minimal set of simple tools to complex ones. (This is important for us)\n\n* Communicate effectively and often to ensure that the team is aligned.\n\n* Help the engineers structure the events on existing and new platforms and define our KPIs. \n\n# Requirements\nYou have:\n\n* At least 2 years of experience working in product analytics, managing multivariate tests and their results.\n\n* At least 3 years of experience using various analytics / BI tools (from Amplitude to SQL queries)\n\n* Experience driving product growth in the consumer mobile app space.\n\n* Some experience with qualitative testing methodologies. You’re not afraid of engaging with end-users if needed.\n\n* Knowledge of fundamental mobile and web growth mental models\n\n* Project management experience (everyone manages projects at Drops)\n\n* Strong verbal and written communication skills and the ability to work well cross-functionally.\n\n* (Preferably) experience in all 4 pillars of the growth funnel: Acquisition, Activation, Retention, Monetization
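As a rough illustration of the kind of A/B-test analysis this role involves, here is a minimal two-proportion z-test using only the Python standard library. The conversion numbers are invented for the example; the post does not specify Drops' actual testing tooling:

```python
import math

def ab_test(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test for an A/B experiment.

    Returns (z, p) where p is the two-sided p-value under the
    pooled-proportion null hypothesis of no difference.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF via erf.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical experiment: variant B converts 460/4000 vs control A 400/4000.
z, p = ab_test(400, 4000, 460, 4000)
```

In practice a tool like Amplitude (named in the requirements) reports this for you; the point is knowing what the number means before shipping the winning variant.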

See more jobs at Drops


Doximity is transforming the healthcare industry. Our mission is to help doctors save time so they can provide better care for patients.\n\nWe value diversity — in backgrounds and in experiences. Healthcare is a universal concern, and we need people from all backgrounds to help build the future of healthcare. Our data team is deliberate and self-reflective about the kind of team and culture that we are building, seeking data engineers and scientists that are not only strong in their own aptitudes but care deeply about supporting each other's growth. We have one of the richest healthcare datasets in the world, and our team brings a diverse set of technical and cultural backgrounds.\n\nYou will join a small team of Software Engineers focusing on Data Engineering Infrastructure to build and maintain all aspects of our data pipelines, ETL processes, data warehousing, ingestion and overall data stack.\n\n**How you’ll make an impact:**\n\n* Help establish robust solutions for consolidating data from a variety of data sources.\n* Establish data architecture processes and practices that can be scheduled, automated, replicated and serve as standards for other teams to leverage. \n* Collaborate extensively with the DevOps team to establish best practices around server provisioning, deployment, maintenance, and instrumentation.\n* Build and maintain efficient data integration, matching, and ingestion pipelines.\n* Build instrumentation, alerting and error-recovery system for the entire data infrastructure.\n* Spearhead, plan and carry out the implementation of solutions while self-managing.\n* Collaborate with product managers and data scientists to architect pipelines to support delivery of recommendations and insights from machine learning models.\n\n**What we’re looking for:**\n\n* Fluency in Python, SQL mastery.\n* Ability to write efficient, resilient, and evolvable ETL pipelines. 
\n* Experience with data modeling, entity-relationship modeling, normalization, and dimensional modeling.\n* Experience building data pipelines with Spark and Kafka.\n* Comprehensive experience with Unix, Git, and AWS tooling.\n* Astute ability to self-manage, prioritize, and deliver functional solutions.\n\n**Nice to have:**\n\n* Experience with MySQL replication, binary logs, and log shipping.\n* Experience with additional technologies such as Hive, EMR, Presto or similar technologies.\n* Experience with MPP databases such as Redshift and working with both normalized and denormalized data models.\n* Knowledge of data design principles and experience using ETL frameworks such as Sqoop or equivalent. \n* Experience designing, implementing and scheduling data pipelines on workflow tools like Airflow, or equivalent.\n* Experience working with Docker, PyCharm, Neo4j, Elasticsearch, or equivalent. \n\n**About Doximity**\n\nWe’re thrilled to be named the Fastest Growing Company in the Bay Area, and one of Fast Company’s Most Innovative Companies. Joining Doximity means being part of an incredibly talented and humble team. We work on amazing products that over 70% of US doctors (and over one million healthcare professionals) use to make their busy lives a little easier. We’re driven by the goal of improving inefficiencies in our $2.5 trillion U.S. healthcare system and love creating technology that has a real, meaningful impact on people’s lives. To learn more about our team, culture, and users, check out our careers page, company blog, and engineering blog. 
We’re growing fast, and there are plenty of opportunities for you to make an impact. Join us!\n\n*Doximity is proud to be an equal opportunity employer, and committed to providing employment opportunities regardless of race, religious creed, color, national origin, ancestry, physical disability, mental disability, medical condition, genetic information, marital status, sex, gender, gender identity, gender expression, pregnancy, childbirth and breastfeeding, age, sexual orientation, military or veteran status, or any other protected classification. We also consider qualified applicants with criminal histories, consistent with applicable federal, state and local law.* \n\n# Requirements\nUse the apply button
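To make the "efficient, resilient, and evolvable ETL pipelines" requirement concrete, here is a minimal sketch of an idempotent load step in Python, using an in-memory SQLite table as a stand-in for a real warehouse. The table, fields, and records are illustrative assumptions, not Doximity's schema:

```python
import sqlite3

def load_users(conn, records):
    """Idempotent load step: validate and normalize, then upsert, so
    replaying a batch after a partial failure cannot create duplicates."""
    rows = [
        (r["id"], r["email"].strip().lower(), r["specialty"])
        for r in records
        if r.get("email")  # drop records that fail basic validation
    ]
    # UPSERT requires SQLite >= 3.24 (bundled with modern Python builds).
    conn.executemany(
        """INSERT INTO users (id, email, specialty) VALUES (?, ?, ?)
           ON CONFLICT(id) DO UPDATE SET
               email = excluded.email, specialty = excluded.specialty""",
        rows,
    )
    conn.commit()

conn = sqlite3.connect(":memory:")  # stand-in for the real warehouse
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, email TEXT, specialty TEXT)")
batch = [
    {"id": 1, "email": " Ada@Example.com ", "specialty": "cardiology"},
    {"id": 2, "email": None, "specialty": "oncology"},  # rejected: no email
]
load_users(conn, batch)
load_users(conn, batch)  # replaying the same batch is safe
```

The same extract-validate-upsert shape is what makes a pipeline safe to schedule, automate, and replicate, whatever the actual stack (Spark, Kafka, Airflow) underneath.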

See more jobs at Doximity



\nWe have lots of data organised in different databases, on different platforms and in different formats. We need someone who is super interested in financial data and can build out time series models of how our expenditure and revenue will evolve. If you’re an organised data guru who can think outside the box, join our data team.\n\nLocation:\nAnywhere / remote\nWe prefer that you are between UTC-3 and UTC-8\nIf you prefer working in an office: Toronto or Grand Cayman are the options\n\nResponsibilities:\nYou’ll be working closely with Finance and executives to answer key financial questions. The data you own can be trusted, and you’re diligent at pointing out any caveats or issues in the data.\nYou’ll write Python scripts that pull data from APIs, websites via scraping, and Google buckets, and dump it into our Google BigQuery data warehouse.\nYou’ll work closely with different stakeholders across the organisation to make us more efficient by automating reports and dashboards. \nYou’re flexible and interested in different kinds of problems. While your main area of expertise will be Finance, you might also enjoy working on an impact study for Marketing. \nYou can extract valuable insights from large pools of data, and articulate these insights to other staff members and executives in a non-technical way.\nYou’ll build predictive models and answer questions such as: How do our acquisitions perform? Which acquisition will be the next superstar? What should our budget be?\n\nMust-haves:\nUber-analytical\nSQL experience, even better: BigQuery. 
We’re not a fan of MS SQL :)\nPython scripting experience well beyond print('Hello World'); pandas\nExperience with APIs & web scraping (Selenium!)\nStats background; t-tests and regressions should be no-brainers, and random forests fun to walk through\nSome dashboarding experience, such as Google Data Studio\nYou’re creative, both in making visualisations of complex questions and in solving problems\nStrong communicator\nSuper organised\nA couple of years of experience with data wrangling and financial data, particularly on the cost side\n\nNice-to-haves:\nMS Great Plains / Dynamics experience\nKnowledge of Aria, Zuora or other subscription billing platforms\nExperience with payment gateway sources such as Braintree or Vantiv\nRemote working experience\n\nWho we are:\nWith over 1,300 websites attracting more than 100 million monthly unique visitors, Toronto-based VerticalScope is one of the largest and fastest growing online publishers in North America. Our unique combination of wholly-owned community (forum) websites and professional content sites offers shoppers critical information that they can’t get from OEM sites or other interest-based portals, attracting a unique audience and blue-chip advertisers seeking to reach them. These sites are organized into 9 different verticals including automotive, outdoors, powersports, pets, home, consumer technology and health & wellness, to name a few.\n\n\nHow to apply:\n\nSend a brief email to [email protected] explaining why you’d like to work with us and why we should hire you. Make sure the subject line is “Data Analyst and Engineer - Finance.”\n\n
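Since the post expects t-tests to be no-brainers, here is what Welch's two-sample t-statistic looks like computed from scratch with only the Python standard library (the spend figures are invented for illustration; in day-to-day work you would reach for `scipy.stats.ttest_ind` instead):

```python
import math
from statistics import mean, variance

def welch_t(sample_a, sample_b):
    """Welch's two-sample t-statistic and approximate degrees of freedom
    (the unequal-variances test, not Student's pooled-variance version)."""
    na, nb = len(sample_a), len(sample_b)
    va, vb = variance(sample_a), variance(sample_b)  # sample variances (n-1)
    se2 = va / na + vb / nb
    t = (mean(sample_a) - mean(sample_b)) / math.sqrt(se2)
    # Welch-Satterthwaite approximation for the degrees of freedom.
    df = se2 ** 2 / ((va / na) ** 2 / (na - 1) + (vb / nb) ** 2 / (nb - 1))
    return t, df

# Hypothetical: average daily cost per acquisition before and after a change.
cost_before = [12.0, 11.5, 13.2, 12.8, 11.9]
cost_after = [14.1, 13.8, 14.9, 13.5, 14.4]
t, df = welch_t(cost_before, cost_after)
```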

See more jobs at VerticalScope Inc.

Visit VerticalScope Inc.'s website
