Remote Scraping Best Practices Investigator at Scrapinghub
The largest collection of Remote Jobs for Digital Nomads online. Get a remote job you can do anywhere at Remote Companies like Buffer, Zapier and Automattic who embrace the future. There are 32,500+ jobs that allow you to work anywhere and live everywhere.

Scrapinghub


Scraping Best Practices Investigator

Posted 10 months ago
About the Job:

Your key objective will be to advance Scrapinghub's knowledge of web technologies and web scraping best practices.

This is not a production role. Instead, you'll be given the time and resources to iteratively, and with scientific rigor, test hypotheses and produce a research-backed knowledge base for other developers at Scrapinghub.

Although you won't work on specific customer projects, your work will help fuel growth across all of Scrapinghub's Data business (Professional Services & Data on Demand). Your measures of success will be your ability to iterate quickly and produce assets that are useful to other Shubbers.

Job Responsibilities:

* Create and execute well-designed experiments (repeatable, multiple treatments, testable variables, controls, replication) to learn how best to complete web scraping projects
* Produce well-written, indexed reports of your findings (similar to publishing in an academic journal, though not nearly as lengthy)
* Propose new experiments to run
* Work with the Team Lead to prioritize the backlog of experiments
* Maintain best-practice guides for other Shubbers who will implement client solutions based on your findings
* Propose changes to Scrapinghub's other products (Crawlera, Scrapy Cloud, etc.) or to Scrapy itself based on your findings

Job Requirements:

* Excellent written English communication.
* A strong understanding of the scientific method and the ability to continuously implement a process that follows it with rigor.
* A logical, measurement-backed approach to prioritizing projects, and enjoyment of working with others who do the same.
* Familiarity with techniques and tools for crawling, extracting, and processing data, asynchronous communication, and distributed systems.
* Strong knowledge of Python along with a broad general programming background; a strong problem solver.
* Comfort working across several teams and communicating with your end customers (other Shubbers).
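The "repeatable, multiple treatments, replication" experiment loop described under Job Responsibilities can be sketched in a few lines of Python. This is a minimal illustration only: the treatment names, the success metric, and the `run_experiment` helper are assumptions for the sake of example, not Scrapinghub's actual tooling.

```python
import statistics


def run_experiment(treatments, replications=5):
    """Run each treatment several times and summarize its success metric.

    A minimal sketch of a controlled, replicated experiment: every
    treatment is a zero-argument callable returning one numeric outcome.
    """
    summary = {}
    for name, treatment in treatments.items():
        outcomes = [treatment() for _ in range(replications)]
        summary[name] = {
            "mean": statistics.mean(outcomes),
            "stdev": statistics.stdev(outcomes) if replications > 1 else 0.0,
            "n": replications,
        }
    return summary


# Hypothetical treatments: each returns a success rate in [0, 1].
# In a real experiment these would run a crawl variant and measure it.
results = run_experiment(
    {"plain_get": lambda: 0.80, "with_headers": lambda: 0.95},
    replications=3,
)
```

In practice each treatment would wrap a real crawl configuration (headers, concurrency, proxy strategy) and the summary would feed the written report for other Shubbers.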

See more jobs at Scrapinghub

# How do you apply?

This job post is older than 30 days and the position is probably filled. Try applying to jobs posted recently instead.