Remote Erlang Developer at Scrapinghub
The largest collection of Remote Jobs for Digital Nomads online. Get a remote job you can do anywhere at Remote Companies like Buffer, Zapier and Automattic who embrace the future. There are 32,450+ jobs that allow you to work anywhere and live everywhere.

Scrapinghub


Erlang Developer

dev

erlang

digital nomad


Posted 8 months ago
About the Job:

Crawlera is a smart downloader designed specifically for web crawling and scraping, removing the headaches of proxy management. It is part of the Scrapinghub platform, the world's most comprehensive web crawling stack, which powers crawls of over 8 billion pages per month.

As an Erlang developer you will help ensure the robustness of our services. You will learn to investigate production issues on a server executing customer requests. You will be able to navigate a large codebase and find the least obtrusive place for extensions. Beyond the technicalities, you will gain a holistic view of the product and improve the usability of the system with every task you complete. In this role, you will take part in brainstorming and delivering improvements to the core of Crawlera.

Job Responsibilities:

* Develop, maintain and support a high-load distributed system.
* Analyze current and historical Crawlera usage to augment and enhance its routing and rotation logic.
* Leverage the Scrapinghub platform to provide extended functionality, both to end users and for internal purposes.
* Identify and resolve performance and scalability issues with distributed crawling at scale.
* Liaise with other platform teams to give Crawlera the best possible integration with the growing Scrapinghub platform.

Required Skills:

* 2+ years of production experience with Erlang.
* Strong written and spoken English.
* Strong knowledge of Linux/UNIX, HTTP and networking.

Desired Skills:

* Python or Go experience.
* Familiarity with techniques and tools for crawling, extracting and processing data.
* Knowledge of ELK, Graylog, Docker and Mesos.
* Strong record of open-source activity.
* Experience working with Lean principles and a Scrum SDLC.
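From the client's side, a "smart downloader" like Crawlera sits between a crawler and the target sites as an HTTP proxy, so the crawler delegates IP rotation and retries to the service. A minimal sketch of what that looks like, assuming the conventional pattern of an API key passed as the proxy username (the hostname, port and key below are illustrative placeholders, not values from this posting):

```python
from urllib.request import ProxyHandler, build_opener

# Illustrative values only; a real endpoint and API key would come from
# your account with the proxy service (hostname/port are assumptions).
API_KEY = "<YOUR_API_KEY>"
PROXY = f"http://{API_KEY}:@proxy.crawlera.com:8010"

# Route all HTTP(S) traffic through the proxy; the service handles
# proxy rotation and retry logic on the server side.
opener = build_opener(ProxyHandler({"http": PROXY, "https": PROXY}))

# opener.open("http://example.com/")  # uncomment to issue a real request
```

Because the integration point is a standard HTTP proxy, any client stack (Python, Erlang, Go, curl) can use the service without a dedicated SDK.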


# How do you apply?

This job post is older than 30 days and the position is probably filled. Try applying to jobs posted recently instead.

👉 Please mention that you found the job on Remote OK; this helps us get more companies to post here!

When applying for jobs, you should never have to pay to apply. That is a scam! Posts that link to pages about "how to work online" are also scams; don't use them or pay for them. Always verify that you are actually talking to the company in the job post and not an impostor. Scams in remote work are rampant, so be careful! When you click the apply button, you will leave Remote OK and go to that company's job application page outside this site. Remote OK accepts no liability or responsibility for any reliance upon information on external sites or here.