
get a remote job
you can do anywhere

The largest collection of Remote Jobs for Digital Nomads online. Get a remote job you can do anywhere at Remote Companies like Buffer, Zapier and Automattic who embrace the future. There are 30,600+ jobs that allow you to work anywhere and live everywhere.

Scrapinghub

Erlang Developer

Tags: dev, erlang, digital nomad

Posted 2 months ago

About the Job:

Crawlera is a smart downloader designed specifically for web crawling and scraping, removing the headaches of proxy management. It is part of the Scrapinghub platform, the world's most comprehensive web crawling stack, which powers crawls of over 8 billion pages per month.

As an Erlang developer, you will help ensure the robustness of our services. You will learn to investigate production issues on a server executing customer requests. You will be able to navigate a large codebase and find the least obtrusive place for extensions. Beyond the technicalities, you will gain a holistic view of the product and improve the usability of the system with every task you complete. In this role, you will take part in brainstorming and delivering improvements to the core of Crawlera.

Job Responsibilities:

* Develop, maintain and support a high-load distributed system.
* Analyze current and historical Crawlera usage to augment and enhance its routing and rotation logic.
* Leverage the Scrapinghub platform to provide extended functionality, both to end users and for internal purposes.
* Identify and resolve performance and scalability issues with distributed crawling at scale.
* Liaise with other platform teams to give Crawlera the best possible integration with the growing Scrapinghub platform.

Required Skills:

* 2+ years of production experience with Erlang.
* Strong written and spoken English.
* Strong knowledge of Linux/UNIX, HTTP and networking.

Desired Skills:

* Python or Golang experience.
* Familiarity with techniques and tools for crawling, extracting, and processing data.
* Knowledge of ELK, Graylog, Docker and Mesos.
* Strong record of open source activity.
* Experience working with Lean principles and a Scrum SDLC.

See more jobs at Scrapinghub

Apply for this Job

👉 Please mention that you found the job on Remote OK; this helps us get more companies to post here!

When applying for jobs, you should NEVER have to pay to apply; that is a scam! Always verify that you are actually talking to the company in the job post and not an imposter. Scams in remote work are rampant, so be careful! When you click the button to apply above, you will leave Remote OK and go to that company's job application page outside this site. Remote OK accepts no liability or responsibility as a consequence of any reliance upon information found on external sites or here.