Remote Senior Erlang Developer at Scrapinghub 📈 Open Startup
The largest collection of Remote Jobs for Digital Nomads online. Get a remote job you can do anywhere at Remote Companies like Buffer, Zapier and Automattic who embrace the future. There are 32,500+ jobs that allow you to work anywhere and live everywhere.

Scrapinghub

Senior Erlang Developer

dev · erlang · senior · digital nomad
Scrapinghub is looking for an Erlang software developer to join our Crawlera team.

Crawlera is a smart downloader designed specifically for web crawling and scraping. It allows crawler developers to crawl quickly and reliably by managing thousands of proxies internally. It is part of the Scrapinghub platform, the world's most comprehensive web crawling stack, which powers crawls of over 4 billion pages a month.

Scrapinghub helps companies, ranging from Fortune 500 enterprises to early-stage startups, turn web content into useful data with a cloud-based web crawling platform, off-the-shelf datasets, and turnkey web scraping services.

Join us in making the world a better place for web crawler developers, alongside a team of talented engineers working remotely from more than 30 countries.

Responsibilities

* Develop, maintain, and support a high-load distributed system.
* Analyze current and historical Crawlera usage to augment and enhance its routing and rotation logic.
* Leverage the Scrapinghub platform to provide extended functionality, both to end users and for internal purposes.
* Identify and resolve performance and scalability issues with distributed crawling at scale.
* Liaise with other platform teams to give Crawlera the best possible integration with the growing Scrapinghub platform.
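Crawlera's actual routing and rotation logic is not public (and the system itself is written in Erlang), but to give a flavor of the kind of proxy-rotation problem the role involves, here is a minimal, purely illustrative round-robin sketch in Python. All names and the failure threshold are hypothetical, not Scrapinghub's code:

```python
from itertools import cycle


class ProxyRotator:
    """Illustrative round-robin proxy rotation with a simple failure threshold.

    Real systems like Crawlera track far more signals (ban detection,
    per-site throttling, response latency); this sketch only shows the
    basic rotate-and-skip-unhealthy pattern.
    """

    def __init__(self, proxies, max_failures=3):
        self.max_failures = max_failures
        self.failures = {p: 0 for p in proxies}  # failure count per proxy
        self._cycle = cycle(proxies)             # endless round-robin iterator

    def next_proxy(self):
        # Try each proxy at most once per call, skipping unhealthy ones.
        for _ in range(len(self.failures)):
            proxy = next(self._cycle)
            if self.failures[proxy] < self.max_failures:
                return proxy
        raise RuntimeError("no healthy proxies left")

    def report_failure(self, proxy):
        # Called by the crawler when a request through this proxy fails.
        self.failures[proxy] += 1
```

A caller would fetch a proxy per request with `next_proxy()` and call `report_failure()` on errors; once a proxy exceeds the threshold it is skipped on subsequent rotations.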

See more jobs at Scrapinghub

# How do you apply?

This job post is older than 30 days and the position is probably filled. Try applying to jobs posted recently instead.