This job post is closed and the position is probably filled. Please do not apply.
**About IPinfo**

IPinfo is a leading provider of IP address data. Our API handles over 40 billion requests a month, and we also license our data for use in many products and services you might have used. We started as a side project back in 2013, offering a free geolocation API, and we've since bootstrapped ourselves into a profitable business with a global team of 14 and grown our data offerings to include geolocation, IP-to-company, carrier detection, and VPN detection. Our customers include T-Mobile, Nike, DataDog, DemandBase, Clearbit, and many more.

**How We Work**

We have a small, ambitious team spread all over the globe. We sync up on a monthly all-hands Zoom call, and most teams do a call together every 2 weeks. Everything else happens asynchronously via Slack, GitHub, Linear, and Notion. That means you can pick the hours that work best for you, so you can be at your most productive.

To thrive in this environment you'll need high levels of autonomy and ownership. You have to be resourceful and able to work effectively in a remote setup.

**The Role**

We're looking to add an experienced engineer to our 4-person data team. You'll work on improving our data, maintaining our data pipelines, defining and creating new data sets, and helping us cement our position as an industry leader. Some things we've recently been working on in the data team:

* Building out our global network of probe servers and doing internet-wide data collection (ping, traceroute, etc.).
* Finding, analyzing, and incorporating existing data sets into our pipeline to improve our quality and accuracy.
* Building ML models to classify IP address usage as consumer ISP, hosting provider, or business.
* Inventing and implementing scalable algorithms for IP geolocation and other big data processing.

Here are some of the tools we use.
Experience with these is great, but if not, we'd expect you to ramp up quickly:

* BigQuery
* Google Composer / Apache Airflow
* Python / Bash / SQL
* ElasticSearch

Any IP address domain knowledge would be useful too, but we can help get you up to speed here:

* ASN / BGP / CIDR / Ping / Traceroute / Whois, etc.

**What We Offer**

* 100% remote team and work environment
* Flexible working hours
* Minimal meetings
* Competitive salary
* Flexible vacation policy
* Interesting and challenging work

#Salary and compensation
$90,000 — $140,000/year
#Benefits
⏰ Async
#Location

Worldwide