Daniel J Edelman Holdings is hiring a Remote Platform Engineer
We are seeking a skilled Platform Engineer to join our team. This role will be instrumental in developing and optimizing our cloud architecture as well as managing our development and analytics tooling. The ideal candidate has a proven track record of building and maintaining cloud-based architecture that supports large-scale data initiatives for data science and software development teams.

The Platform Engineer will be responsible for designing, implementing, and managing cloud architectures to meet business requirements. The ideal candidate possesses expertise in translating business needs into scalable cloud solutions while ensuring security, reliability, and cost-effectiveness.

Responsibilities:
* Collaborate with business stakeholders to understand product requirements and translate them into scalable, resilient cloud architectures.
* Work closely with Data Engineering, Data Science and Software Development teams to contribute to the design of cloud solutions for our products.
* Own end-to-end implementation of optimized, secure cloud-based/cloud-native architecture.
* Develop and document cloud architecture designs, ensuring alignment with industry best practices.
* Provide expertise in cloud and platform engineering to the Product Data team, ensuring alignment with the company's strategic goals.
* Contribute to the selection and integration of cloud-based vendors, tools and frameworks.
* Keep up with emerging trends in cloud engineering and introduce new technologies or practices that can benefit the organization.
* Implement security measures to safeguard cloud environments, including identity and access management, encryption, and compliance controls.
* Conduct regular security assessments and address vulnerabilities promptly.
* Monitor and optimize cloud infrastructure for performance, cost, and reliability.
* Implement performance tuning strategies to enhance overall system efficiency.
* Drive continuous improvement and innovation in cloud and platform engineering.
* Implement and manage the provisioning of cloud resources based on project requirements.
* Maintain and support cloud-based and cloud-native architecture, including access controls, security and networking.
* Configure and fine-tune cloud infrastructure components for optimal performance.
* Perform audits and assessments of cloud environments to ensure compliance with security and regulatory standards.
* Provide recommendations for continuous improvement and adherence to best practices.
* Lead the deployment of applications onto our cloud platform, ensuring seamless integration and functionality.
* Manage and monitor cloud applications to maintain performance, availability, and scalability.

Qualifications:
* 3-5 years of proven experience as a Platform Engineer or similar role in designing, implementing, and managing cloud architectures.
* Expertise in constructing, installing, and maintaining large-scale cloud-native and cloud-based architecture.
* Database management expertise: Postgres, Snowflake, Lucene-based search engines (Apache Solr/AWS OpenSearch/Elasticsearch).
* Cloud-native tooling expertise: Amazon S3, AWS EMR, Amazon EC2, Amazon RDS, Amazon SageMaker, Amazon ECS, Amazon ECR, Amazon VPC, AWS IAM (alternatives from other cloud providers are acceptable).
* Cloud-based application tooling: Databricks administration.
* Strong communication in English; ability to communicate technical concepts to non-technical audiences.
* Cloud certifications from cloud providers (AWS, GCP, Azure).
* Experience with streaming technologies such as Apache Kafka and AWS Kinesis.
* Experience productionizing ML-based cloud solutions.

#LI-RT9

Edelman Data & Intelligence (DXI) is a global, multidisciplinary research, analytics and data consultancy with a distinctly human mission.

We use data and intelligence to help businesses and organizations build trusting relationships with people: making communications more authentic, engagement more exciting and connections more meaningful.

DXI brings together and integrates the necessary people-based PR, communications, social, research and exogenous data, as well as the technology infrastructure to create, collect, store and manage first-party data and identity resolution. DXI is comprised of over 350 research specialists, business scientists, data engineers, behavioral and machine-learning experts, and data strategy consultants based in 15 markets around the world.

To learn more, visit: https://www.edelmandxi.com

#Salary and compensation
No salary data was published by the company, so we estimated the salary based on similar Design, Cloud and Engineer jobs:

$70,000 — $100,000/year
#Benefits

* 401(k)
* Distributed team
* Async
* Vision insurance
* Dental insurance
* Medical insurance
* Unlimited vacation
* Paid time off
* 4 day workweek
* 401k matching
* Company retreats
* Coworking budget
* Learning budget
* Free gym membership
* Mental wellness budget
* Home office budget
* Pay in crypto
* Pseudonymous
* Profit sharing
* Equity compensation
* No whiteboard interview
* No monitoring system
* No politics at work
* We hire old (and young)
Please reference that you found the job on Remote OK; this helps us get more companies to post here, thanks!
When applying for jobs, you should NEVER have to pay to apply. You should also NEVER have to pay for equipment that the company promises to reimburse later, and never pay for required trainings. Those are scams! NEVER PAY FOR ANYTHING! Posts that link to pages about "how to work online" are also scams; don't use them or pay for them. Always verify that you're actually talking to the company in the job post and not an imposter: a good check is whether the domain name of the site or email matches the company's main domain name. Scams in remote work are rampant, so be careful! Read more to avoid scams. When clicking the button to apply above, you will leave Remote OK and go to that company's job application page outside this site. Remote OK accepts no liability or responsibility as a consequence of any reliance upon information on external sites or here.
Benchling is hiring a Remote Software Engineer Backend
ROLE OVERVIEW

Benchling's mission is to unlock the power of biotechnology. The world's most innovative biotech companies use Benchling's R&D Cloud to power the development of breakthrough products and accelerate time to milestone and market. Benchling's customers generate a rich variety of scientific data. To keep up this pace of innovation, Benchling needs a highly scalable and extensible data platform that can serve both its customers and its internal application teams.

As one of Benchling's Data Platform engineers, you'll join a rapidly growing, premier engineering team and form the foundation of our data pillar, encompassing customer-facing data products, internal analytics, and the customer-facing data warehouse. You will build the next generation of our Data Platform services, enabling internal developers to easily build multi-tenant data applications and analytical products. Benchling is growing quickly, and you'll be setting the bar for high-quality data and a metrics-driven culture as we scale. You'll serve as a key thought leader and work closely with the product teams to deliver data-driven capabilities to our internal and external customers.

RESPONSIBILITIES

* Build and operate a high-throughput distributed messaging platform such as Kafka/Kinesis to enable change data capture and data integration across Benchling.
* Build the next-generation warehouse and compute platform with scalable data ingress/egress for internal and external customers.
* Build a DSL and schema registry for internal and external customers to build custom data models.
* Develop async data migrations over billions of records with zero downtime while maintaining our data integrity guarantees.
* Define and design data transformations and pipelines for cross-functional datasets, ensuring that data integrity and data privacy are first-class concerns addressed proactively, not reactively.
* Define the right Service Level Objectives for the batch and streaming pipelines, and optimize their performance.
* Design and create CI/CD pipelines for platform provisioning and full lifecycle management; build the platform control plane to operate the fleet of systems efficiently.
* Work closely with teams across Application and Platform to establish best practices around usage of our data platform.

QUALIFICATIONS

* 5+ years of experience or a proven track record in software engineering
* Experience with data analytics and warehouse solutions such as Snowflake, Delta Lake, AWS Redshift, etc.
* Experience with data processing technologies such as Kafka, Kinesis, Spark, Flink, or other open-source or commercial software
* Experience in schema design, SQL, and schema registries
* Strong experience with a scripting language (such as Python)
* Experience with deployment and configuration management frameworks such as Terraform, Ansible, or Chef, and container management systems such as Kubernetes or Amazon ECS
* Driven by creating positive impact for our customers and Benchling's business, and ultimately accelerating the pace of research in the life sciences
* Comfortable with complexity in the short term but able to build towards simplicity in the long term
* Strong communicator with both words and data: you understand what it takes to go from raw data to something a human understands
* Willing to work onsite in our SF office 3 days a week

SALARY RANGE

Benchling takes a market-based approach to pay. The candidate's starting pay will be determined based on job-related skills, experience, qualifications, interview performance, and work location. For this role the base salary range is $177,735 to $240,465.

To help you determine which zone applies to your location, please see this resource. If you have questions regarding a specific location's zone designation, please contact a recruiter for additional information.

Total Compensation includes the following:

* Competitive salary and equity
* Broad range of medical, dental, and vision plans for employees and their dependents
* Fertility healthcare and family-forming benefits
* Four months of fully paid parental leave
* 401(k) + Employer Match
* Commuter benefits for in-office employees and a generous home office setup stipend for remote employees
* Mental health benefits, including therapy and coaching, for employees and their dependents
* Monthly wellness stipend
* Learning and development stipend
* Generous and flexible vacation
* Company-wide Summer & Winter holiday shutdown
* Sabbaticals for 5-year and 10-year anniversaries

#LI-Hybrid
#BI-Hybrid
#LI-GP1

#Salary and compensation
No salary data was published by the company, so we estimated the salary based on similar Design, Recruiter, Cloud, Engineer and Backend jobs:

$70,000 — $100,000/year
#Location
San Francisco, California, United States
About This Role

Hello prospective pickle! Design Pickle is looking for a Data Engineer to join our team and help us develop new ways to inspire our customers and streamline processes for our global network of creatives. You will be tasked with building and maintaining the right data pipelines and data models to power our decision-making and enable actionable insights.

The ideal candidate will be able to create efficient, flexible, extensible, and scalable data models, ETL designs, and data integration services. They will also be required to support and manage the growth of these data solutions. Given our aspirational vision, to be the most helpful creative platform in the world, and the nature of our products, this role requires entrepreneurial drive and thinking, comfort with ambiguity, and the ability to break down and solve complex problems.

If you have ever wanted to make a significant contribution and help shape the trajectory of a startup, this role is for you!

Reports to: Director, Data Science & Analytics

On a daily basis, works closely with: Engineering, Product Management, Product Marketing and Global Operations.

Location: Design Pickle is a fully remote company with a Company Hub in Scottsdale, Arizona.

Who We Are Looking For

First, Design Pickle is anything but typical. We're a group of hard-working, creativity-loving individuals from around the world.

Do we love pickles, too? Most of us! But don't stress if pickles aren't your thing. It's not a deal-breaker. We do look for a passion and interest in something, though, because our employees' uniqueness is what helped make us the great company we are today.

We stand by our vision, purpose, and values, and these are mission-critical to how you show up every single day.

Specific to your role, we're looking for individuals who have...

* A robust background with at least two years dedicated to software development, encompassing the full spectrum of the product lifecycle: ideation, development, deployment, and iteration.
* A minimum of three years' expertise in crafting and optimizing SQL queries; candidates should be well-versed in manipulating and extracting data to meet business needs.
* Over two years of hands-on experience with ETL (Extract, Transform, Load) processes, showcasing proficiency in designing, implementing, and maintaining robust ETL pipelines.
* At least two years of programming experience with a focus on object-oriented languages, such as Python.
* A minimum of two years in database schema design and dimensional data modeling, illustrating a deep understanding of how to structure and model data effectively for scalability and performance.
* Proven experience in the data warehousing field, indicating a solid foundation in managing large-scale data storage solutions.
* Demonstrated ability to analyze datasets to uncover discrepancies and inconsistencies, thereby ensuring data quality and reliability.
* Practical experience with Amazon Web Services (AWS), including but not limited to S3, Redshift, and Machine Learning services; candidates should be comfortable leveraging these services to enhance data storage, processing, and analytics capabilities.
* Expertise in managing and clearly communicating plans for data sourcing and pipeline development to stakeholders within the organization, ensuring alignment and understanding across teams.
* Exceptional problem-solving abilities, with a knack for navigating unclear requirements and delivering effective solutions.

Bonus Pickle Points:

* A Bachelor's or Master's degree in Computer Science, a related technical field, or equivalent practical experience.
* Additional experience with AWS, specifically in managing Data Lakes, is highly regarded.
* Familiarity with building and utilizing reports in business intelligence tools such as Power BI and Tableau, enhancing decision-making and insights.
* Proficiency in Ruby on Rails, adding value through versatile web development skills.
* A proven track record of working independently within globally distributed teams, showcasing effective communication and collaboration across different time zones.
* Demonstrated capacity to leverage data in influencing pivotal business decisions, underlining the strategic use of insights in driving outcomes.

Key Objectives and Responsibilities

As a fast-growing company, our roles are always evolving. However, we want you to know exactly what you're walking into. In the first 90 days, here is a preview of what's expected:

* Conceptualize and own the data architecture for our suite of tools and analytics platform.
* Create and contribute to frameworks that improve the efficiency of logging data, while working with data infrastructure to troubleshoot and resolve issues.
* Collaborate with engineers, product managers, product design and product marketing to understand data needs, representing key data insights in a meaningful and actionable way.
* Define and manage SLAs for all data sets.
* Determine and implement the security model based on security and privacy requirements, confirm safeguards are followed, address data quality issues and evolve governance processes.
* Design, build and launch sophisticated data models and visualizations that support our products and global operational processes.
* Solve data integration problems, utilizing ETL patterns, frameworks, and query techniques, sourcing from structured and unstructured data sources.
* Optimize pipelines, dashboards, frameworks, and systems to streamline development of data artifacts.
* Mentor team members on best practices in the data engineering space.
* Maintain a commitment to documentation.

$100,000 - $115,000 a year

The compensation range for this position is $100,000 to $115,000 annually. The actual salary offered to a candidate will be determined with mindful consideration of many factors, including but not limited to skills, qualifications, education/knowledge, experience, and alignment with market data for a given location within the US. In addition to base salary, some positions may be eligible for additional forms of compensation such as bonuses or commissions. This salary data is for our US-based positions only.

#Salary and compensation
No salary data was published by the company, so we estimated the salary based on similar Design, Ruby, Marketing and Engineer jobs:

$77,500 — $117,500/year
#Location
Scottsdale, Arizona, United States
NPR is hiring a Remote Senior Software Engineer Content Services Data
Application Deadline: Monday, November 20th at 5:00 PM EST

Each day you will work with a cross-functional team of brilliant professionals combining business, design, product, user experience and engineering expertise, working relentlessly to push the boundaries of what's possible and paving the road for the future of news and entertainment media.

The Audience Technology group is looking for an experienced, talented and knowledgeable Senior Software Engineer to join the Data/Content Services team responsible for developing, supporting and maintaining our data and analytics products and services. These services are used to highlight key trends and insights across podcasts, web, mobile apps and social media. These datasets are used to extract insights from complex media usage in order to inform stakeholders both at NPR and at Member Stations across the country.

As a senior software engineer on our team, you will be met with exciting challenges to iterate on existing systems and build new datasets, dashboards and pipelines for analyzing trends in audience engagement.

In addition, the team is responsible for core backend APIs and other services that handle podcast distribution, as well as content delivery for the NPR.org homepage, topic stories, and local and national newscasts. Our stakeholders range from local member stations around the country to key business stakeholders inside of NPR. Come join us and make an impact for the NPR mission!

This is a union-represented role covered under the terms of a collective bargaining agreement with DMU.

RESPONSIBILITIES

* Support the NPR Content Services and Analytics team in data analytics, dashboarding, and pipelining
* Write clean, efficient and reusable code based on product requirements
* Participate in all phases of quality assurance and defect resolution
* Aid in the development and maintenance of CI/CD pipeline implementations
* Share knowledge, write technical designs and participate in code reviews
* Mentor and coach mid-level engineers on code quality and best practices
* Consult with lead and senior engineers while designing comprehensive solutions
* Provide input on system design and architecture within the feature areas and services owned by the team
* Work closely with other software engineers, partner teams, infrastructure engineers, product designers, QA engineers, engineering managers and product managers
* Improve team/development processes
* Join agile ceremonies, including daily stand-ups, sprint retros, sprint reviews and more
* Join our on-call rotation
* Other duties as assigned

The above duties and responsibilities are not an exhaustive list of required responsibilities, duties and skills. Other duties may be assigned, and this job description can be modified at any time.

MINIMUM QUALIFICATIONS

* Fluency in Python, LookML and other data-oriented languages
* Working knowledge of BigQuery or similar (Redshift, Azure, Snowflake, etc.)
* Prior experience working with business intelligence tools like Looker or similar (Tableau, Power BI, Mode, etc.)
* Familiarity with SQL/DML and RDBMS technologies
* Fluency in JavaScript/TypeScript
* Experience in developing and working with RESTful APIs that utilize cloud infrastructure such as AWS
* Ability to develop software that is scalable and performant under high loads
* Strong object-oriented programming skills
* Familiarity with deploying and monitoring production systems
* Experience writing unit and other automated tests using tools like Postman and Jest
* Knowledge of web development best practices: coding standards, code reviews, source control management, build processes, deployment, rollback, testing and monitoring

PREFERRED QUALIFICATIONS

* Familiarity with R for advanced data analysis
* Experience using APIs to retrieve analytics data
* Excellent problem solving, analysis and data interpretation skills with a keen sense for data inconsistencies
* Experience with NoSQL databases (e.g. Elasticsearch, DynamoDB)
* Familiarity with the Salesforce platform
* Advanced experience with Amazon AWS or an equivalent cloud computing platform, including Lambda, SNS, EC2, ASGs, ElastiCache, DynamoDB, RDS and CodeDeploy
* Advanced experience with Google Cloud Platform, including BigQuery Omni, Cloud Functions, Dataplex and Composer
* Experience with CI/CD pipelines (GitHub Actions, Jenkins, Codefresh, TravisCI or equivalent)
* Experience using observability and log aggregation platforms (Datadog, CloudWatch)
* Familiarity with the different layers of caching (browser, DNS, web server, application, etc.) and caching technologies/services (Redis, ElastiCache, CDNs, AWS CloudFront)
* A passion for NPR's content and/or familiarity with our digital products

WORK LOCATION

Remote Permitted: This is a remote-permitted role. This role is based out of our Washington, DC office, but the employee may choose to work on a remote basis from a location that NPR approves.

JOB TYPE

This is a full-time, exempt position.

COMPENSATION

Salary Range: The U.S.-based anticipated salary range for this opportunity is $126,541 - $134,248 plus benefits. The range displayed reflects the minimum and maximum salaries NPR expects to provide for new hires for the position across all US locations.

Benefits: NPR offers access to comprehensive benefits for employees and dependents. Regular, full-time employees scheduled to work 30 hours or more per week are eligible to enroll in NPR's benefits options. Benefits include access to health and wellness, paid time off, and financial well-being. Plan options include medical, dental, vision, life/accidental death and dismemberment, long-term disability, short-term disability, and voluntary retirement savings for all eligible NPR employees.

Does this sound like you? If so, we want to hear from you.

#Salary and compensation
No salary data was published by the company, so we estimated the salary based on similar Design, Salesforce, JavaScript, Cloud, NoSQL, Mobile, Senior, Engineer and Backend jobs:

$50,000 — $105,000/year
#Location
Washington, District of Columbia, United States