Remote Erlang Developer at Scrapinghub
The largest collection of Remote Jobs for Digital Nomads online. Get a remote job you can do anywhere at Remote Companies like Toptal, Zapier and Automattic who embrace the future. There are 43,050+ jobs that allow you to work anywhere and live everywhere.

Scrapinghub

 

Erlang Developer


dev · erlang · digital nomad
This job post is closed and the position is probably filled. Please do not apply.
Crawlera is a smart downloader designed specifically for web crawling and scraping, removing the headaches of proxy management. It is part of the Scrapinghub platform, the world’s most comprehensive web crawling stack, which powers crawls of over 8 billion pages per month.

As an Erlang developer, you will help ensure the robustness of our services. You will learn to investigate production issues on a server executing customer requests. You will be able to navigate a large codebase and find the least obstructive place for extensions. Beyond the technicalities, you will gain a holistic view of the product and improve the usability of the system with every task you complete. In this role, you will take part in brainstorming and delivering improvements to the core of Crawlera.

Job Responsibilities:

* Develop, maintain, and support a high-load distributed system.
* Analyze current and historical Crawlera usage to augment and enhance its routing and rotation logic.
* Leverage the Scrapinghub platform to provide extended functionality, both to end users and for internal purposes.
* Identify and resolve performance and scalability issues with distributed crawling at scale.
* Liaise with other platform teams to give Crawlera the best possible integration with the growing Scrapinghub platform.

Job Requirements:

* 2+ years of production experience with Erlang.
* Strong written and spoken English.
* Strong knowledge of Linux/UNIX, HTTP, and networking.

Desired Skills:

* Python or Golang experience.
* Familiarity with techniques and tools for crawling, extracting, and processing data.
* Knowledge of ELK, Graylog, Docker, and Mesos.
* Strong record of open source activity.
* Experience working with Lean principles and a Scrum SDLC.
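For context, the proxy rotation that a smart downloader like Crawlera automates can be sketched in a few lines. The following is a simplified, hypothetical illustration in Python (not Scrapinghub's actual Erlang implementation): a round-robin rotator that skips proxies after repeated failures. The class name, proxy URLs, and failure threshold are all made up for the example.

```python
import itertools

class ProxyRotator:
    """Minimal round-robin proxy rotator with failure tracking.

    Illustrative sketch only -- a production smart downloader also
    handles bans, per-domain throttling, sessions, retries, etc.
    """

    def __init__(self, proxies, max_failures=3):
        self.proxies = list(proxies)
        self.failures = {p: 0 for p in self.proxies}
        self.max_failures = max_failures
        self._cycle = itertools.cycle(self.proxies)

    def next_proxy(self):
        # Walk the cycle, skipping proxies that have failed too often.
        for _ in range(len(self.proxies)):
            proxy = next(self._cycle)
            if self.failures[proxy] < self.max_failures:
                return proxy
        raise RuntimeError("all proxies exhausted")

    def mark_failure(self, proxy):
        # Record a failed request through this proxy.
        self.failures[proxy] += 1

# Example usage with made-up proxy endpoints:
rotator = ProxyRotator(["http://p1:8010", "http://p2:8010"])
print(rotator.next_proxy())  # http://p1:8010
print(rotator.next_proxy())  # http://p2:8010
```

The "routing and rotation logic" mentioned in the responsibilities above is, at scale, a far more sophisticated version of this idea: choosing which outbound proxy serves each customer request.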


See more jobs at Scrapinghub

# How do you apply?

This job post is older than 30 days and the position is probably filled. Try applying to jobs posted recently instead.