Toptal

This job is receiving a high number of applications right now (13% of viewers clicked Apply)

verified closed
🌏 Worldwide
 
πŸ’° $60k - $200k

data engineering

 

business intelligence

 
This job post is closed and the position is probably filled. Please do not apply.
***Design your full-time freelance career as a top freelance data engineer with Toptal.***

Freelance work is defining developer careers in exciting new ways. If you’re passionate about finding rapid career growth potential working with leading Fortune 500 brands and innovative Silicon Valley startups, Toptal could be a great fit for your next career shift.

Toptal is an elite talent network made up of the world’s top 3% of developers, connecting the best and brightest freelancers with top organizations. Unlike a 9-to-5 job, you’ll choose your own schedule and work from anywhere. **Jobs come to you, so you won’t bid for projects against other developers in a race to the bottom.** Plus, Toptal takes care of all the overhead, empowering you to focus on successful engagements while getting paid on time, at the rate you decide, every time. Our screening process matches you with top clients without additional overhead and helps you maximize the potential of your full-time freelance career. Joining the Toptal network also gives you access to technical training programs, mentors, and coaching, so you can connect with a global community of experts like you to share peer-to-peer knowledge and expand your network.

As a freelance developer, you can become part of an ever-expanding community of experts in over 120 countries, working remotely on projects that meet your career ambitions.

That’s why the world’s top 3% of developers choose Toptal. Data Engineers in our elite network share:
* At least 3 years of professional experience in Data Engineering
* Professional experience with Business Intelligence and Data Engineering (required)
* Project management skills
* A keen attention to detail
* Experience with cloud technologies (AWS / Azure) is a strong advantage
* Full-time availability is a strong advantage

**Curious to know how much you could make? Check out our developer rate calculator: [https://topt.al/aVcvDx](https://topt.al/aVcvDx)**

**If you’re interested in pursuing an engaging career working on full-time freelance jobs for exclusive clients, take the next step by clicking apply and filling out the short form: [https://topt.al/m2cD75](https://topt.al/m2cD75)**

#Salary and compensation
$60,000 — $200,000/year

#Location
🌏 Worldwide


See more jobs at Toptal

# How do you apply?

This job post has been closed by the poster, which means they probably have enough applicants now. Please do not apply.

Toptal

This job is receiving a high number of applications right now (15% of viewers clicked Apply)

verified closed
Worldwide, US Preferred
 
πŸ’° $50k - $300k

dev ops

 

devops

This job post is closed and the position is probably filled. Please do not apply.
***Design your lifestyle as a top freelance developer, with the freedom to work however, wherever, on your terms.***

Freelance work is defining the careers of today’s developers in exciting new ways. If you’re passionate about working flexibly with leading Fortune 500 brands and innovative Silicon Valley startups, Toptal could be a great fit for your next career shift.

Toptal is an elite talent network for the world’s top 3% of developers, connecting the best and brightest freelancers with top organizations. Unlike a 9-to-5 job, you’ll choose your own schedule and work from anywhere. **Jobs come to you, so you won’t bid for projects against other developers in a race to the bottom.** Plus, Toptal takes care of all the overhead, empowering you to focus on successful engagements while getting paid on time, at the rate you decide, every time.

As a freelance developer, you could join an ever-expanding community of experts in over 120 countries, working remotely on the projects that meet your career ambitions.

That’s why the world’s top 3% of developers choose Toptal. Developers in our elite network share:

* English language proficiency
* 3+ years of professional experience
* Project management skills
* A keen attention to detail

If you’re interested in becoming part of the Toptal network, take the next step by clicking apply and filling out the short form: **[https://topt.al/GNcPQj](https://topt.al/GNcPQj)**

## Requirements
* After passing our screening process, you will have access to our network of clients across the globe, including leading Fortune 500s and innovative Silicon Valley start-ups.
* You will have full flexibility to set your weekly working hours and your rate. There are no mandatory hours.
* You will have visibility into all published projects that fit your specialization. Our matching team is here to help you identify the projects that are the best fit for your skills and preferences.
* As a client-oriented company, we empower you to focus fully on client objectives. We ensure that you always get paid on time for the hours you spend working with clients.

#Salary and compensation
$50,000 — $300,000/year

#Location
Worldwide, US Preferred


See more jobs at Toptal

# How do you apply?

This job post is older than 30 days and the position is probably filled. Try applying to jobs posted recently instead.
This job post is closed and the position is probably filled. Please do not apply.
# We're building the Data Platform of the Future
Join us if you want to rethink the way organizations interact with data. We are a **developer-first company**, committed to building around open protocols and delivering the best experience possible for data consumers and publishers.

Splitgraph is a **seed-stage, venture-funded startup hiring its initial team**. The two co-founders are looking to grow the team to five or six people. This is an opportunity to make a big impact on an agile team while working closely with the founders.

Splitgraph is a **remote-first organization**. The founders are based in the UK, and the company is incorporated in both the USA and the UK. Candidates are welcome to apply from any geography. We want to work with the most talented, thoughtful and productive engineers in the world.

# Open Positions
**Data Engineers welcome!** The job titles have "Software Engineer" in them, but at Splitgraph there's a lot of overlap between data and software engineering. We welcome candidates from all engineering backgrounds.

[Senior Software Engineer - Backend (mainly Python)](https://www.notion.so/splitgraph/Senior-Software-Engineer-Backend-2a2f9e278ba347069bf2566950857250)

[Senior Software Engineer - Frontend (mainly TypeScript)](https://www.notion.so/splitgraph/Senior-Software-Engineer-Frontend-6342cd76b0df483a9fd2ab6818070456)

→ [**Apply to Job**](https://4o99daw6ffu.typeform.com/to/ePkNQiDp) ← (same form for both positions)

# What is Splitgraph?
## **Open Source Toolkit**

[Our open-source product, sgr,](https://www.github.com/splitgraph/splitgraph) is a tool for building, versioning and querying reproducible datasets. It's inspired by Docker and Git, so it feels familiar. And it's powered by PostgreSQL, so it works seamlessly with existing tools in the Postgres ecosystem. Use Splitgraph to package your data into self-contained data images that you can share with other Splitgraph instances.

## **Splitgraph Cloud**

Splitgraph Cloud is a platform for data cataloging, integration and governance. Users can upload data, connect live databases, or "push" versioned snapshots to it. We give them a unified SQL interface to query that data, a catalog to discover and share it, and tools to build/push/pull it.

# Learn More About Us

- Listen to our interview on the [Software Engineering Daily podcast](https://softwareengineeringdaily.com/2020/11/06/splitgraph-data-catalog-and-proxy-with-miles-richardson/)

- Watch our co-founder Artjoms present [Splitgraph at the Bay Area ClickHouse meetup](https://www.youtube.com/watch?v=44CDs7hJTho)

- Read our HN/Reddit posts ([one](https://news.ycombinator.com/item?id=24233948) [two](https://news.ycombinator.com/item?id=23769420) [three](https://news.ycombinator.com/item?id=23627066) [four](https://old.reddit.com/r/datasets/comments/icty0r/we_made_40k_open_government_datasets_queryable/))

- [Read our blog](https://www.splitgraph.com/blog)

- Read the slides from our early (2018) presentations: ["Docker for Data"](https://www.slideshare.net/splitgraph/splitgraph-docker-for-data-119112722), [AHL Meetup](https://www.slideshare.net/splitgraph/splitgraph-ahl-talk)

- [Follow us on Twitter](https://www.twitter.com/splitgraph)

- [Find us on GitHub](https://www.github.com/splitgraph)

- [Chat with us in our community Discord](https://discord.gg/eFEFRKm)

- Explore the [public data catalog](https://www.splitgraph.com/explore) where we index 40k+ datasets

# How We Work: What's our stack look like?

We prioritize developer experience and productivity. We resent repetition and inefficiency, and we never hesitate to automate the things that cause us friction.
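The Docker- and Git-inspired versioning model behind sgr can be sketched in a few lines of Python. This is an illustrative stand-in, not sgr's actual API: the `snapshot` helper and its hashing scheme are assumptions for exposition, showing how content-addressing lets identical data deduplicate, the way image layers do.

```python
import hashlib
import json

def snapshot(rows):
    """Content-address a dataset snapshot, Git-style: the same
    data always hashes to the same short "image ID"."""
    payload = json.dumps(sorted(rows, key=json.dumps), sort_keys=True)
    return hashlib.sha256(payload.encode()).hexdigest()[:12]

# Two "commits" of the same table, then one with changed data
v1 = snapshot([{"city": "London", "pop": 9000000}])
v2 = snapshot([{"city": "London", "pop": 9000000}])
v3 = snapshot([{"city": "London", "pop": 9100000}])

assert v1 == v2  # identical data -> same image ID (deduplicates)
assert v1 != v3  # changed data -> new image ID
```

Because snapshots are addressed by content rather than by name, two peers exchanging images can skip transferring any object whose hash the receiver already has.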
Here's a sampling of the languages and tools we work with:

- **[Python](https://www.python.org/) for the backend.** Our [core open source](https://www.github.com/splitgraph/splitgraph) tech is written in Python (with [a bit of C](https://github.com/splitgraph/Multicorn) to make it more interesting), as is most of our backend code. The Python code powers everything from authentication routines to database migrations. We use the latest version and tools like [pytest](https://docs.pytest.org/en/stable/), [mypy](https://github.com/python/mypy) and [Poetry](https://python-poetry.org/) to help us write quality software.

- **[TypeScript](https://www.typescriptlang.org/) for the web stack.** We use TypeScript throughout our web stack. On the frontend we use [React](https://reactjs.org/) with [next.js](https://nextjs.org/). For data fetching we use [apollo-client](https://www.apollographql.com/docs/react/) with fully-typed GraphQL queries auto-generated by [graphql-codegen](https://graphql-code-generator.com/) based on the schema that [Postgraphile](https://www.graphile.org/postgraphile) creates by introspecting the database.

- **[PostgreSQL](https://www.postgresql.org/) for the database, because of course.** Splitgraph is a company built around Postgres, so of course we use it for our own database. In fact, we have three databases: `auth-db` for storing sensitive data, `registry-db` which acts as a [Splitgraph peer](https://www.splitgraph.com/docs/publishing-data/push-data) so users can push Splitgraph images to it using [sgr](https://www.github.com/splitgraph/splitgraph), and `cloud-db` where we store the schemata that Postgraphile uses to autogenerate the GraphQL server.

- **[PL/pgSQL](https://www.postgresql.org/docs/current/plpgsql.html) and [PL/Python](https://www.postgresql.org/docs/current/plpython.html) for stored procedures.** We define a lot of core business logic directly in the database as stored procedures, which are ultimately [exposed by Postgraphile as GraphQL endpoints](https://www.graphile.org/postgraphile/functions/). We find this to be a surprisingly productive way of developing, as it eliminates the need for manually maintaining an API layer between data and code. It presents challenges for testing and maintainability, but we've built tools to help with database migrations and rollbacks, and an end-to-end testing framework that exercises the database routines.

- **[PostgREST](https://postgrest.org/en/v7.0.0/) for auto-generating a REST API for every repository.** We use this excellent library (written in [Haskell](https://www.haskell.org/)) to expose an [OpenAPI](https://github.com/OAI/OpenAPI-Specification)-compatible REST API for every repository on Splitgraph ([example](http://splitgraph.com/mildbyte/complex_dataset/latest/-/api-schema)).

- **Lua ([luajit](https://luajit.org/luajit.html) 5.x), C, and [embedded Python](https://docs.python.org/3/extending/embedding.html) for scripting [PgBouncer](https://www.pgbouncer.org/).** Our main product, the "data delivery network", is a single SQL endpoint where users can query any data on Splitgraph. Under the hood, it's a layer of PgBouncer instances orchestrating temporary Postgres databases and proxying queries to them, where we load and cache the data necessary to respond to a query. We've added scripting capabilities to enable things like query rewriting, column masking, authentication, ACL, orchestration, and firewalling.

- **[Docker](https://www.docker.com/) for packaging services.** Our CI pipeline builds every commit into about a dozen different Docker images, one for each of our services. A production instance of Splitgraph can be running over 60 different containers (including replicas).

- **[Makefile](https://www.gnu.org/software/make/manual/make.html) and [docker-compose](https://docs.docker.com/compose/) for development.** We use [a highly optimized Makefile](https://www.splitgraph.com/blog/makefile) and `docker-compose` so that developers can easily spin up a stack that mimics production in every way, while keeping it easy to hot reload, run tests, or add new services or configuration.

- **[Nomad](https://www.nomadproject.io/) for deployment and [Terraform](https://www.terraform.io/) for provisioning.** We use Nomad to manage deployments and background tasks. Along with Terraform, we're able to spin up a Splitgraph cluster on AWS, GCP, Scaleway or Azure in just a few minutes.

- **[Airflow](https://airflow.apache.org/) for job orchestration.** We use it to run and monitor jobs that maintain our catalog of [40,000 public datasets](https://www.splitgraph.com/blog/40k-sql-datasets), or ingest other public data into Splitgraph.

- **[Grafana](https://grafana.com/), [Prometheus](https://prometheus.io/), [ElasticSearch](https://www.elastic.co/), and [Kibana](https://www.elastic.co/kibana) for monitoring and metrics.** We believe it's important to self-host fundamental infrastructure like our monitoring stack. We use it to keep tabs on important metrics and the health of all Splitgraph deployments.

- **[Mattermost](https://mattermost.com/) for company chat.** We think it's absolutely bonkers to pay a company like Slack to hold your company communication hostage. That's why we self-host an instance of Mattermost for our internal chat. And of course, we can deploy and update it with Terraform.

- **[Matomo](https://matomo.org/) for web analytics.** We take privacy seriously, and we try to avoid including any third-party scripts on our web pages (currently we include zero). We self-host our analytics because we don't want to share our user data with third parties.

- **[Metabase](https://www.metabase.com/) and [Splitgraph](https://www.splitgraph.com) for BI and [dogfooding](https://en.wikipedia.org/wiki/Eating_your_own_dog_food).** We use Metabase as a frontend to a Splitgraph instance that connects to Postgres (our internal databases), MySQL (Matomo's database), and ElasticSearch (where we store logs and DDN analytics). We use this as a chance to dogfood our software and produce fancy charts.

- **The occasional best-of-breed SaaS service for organization.** As a privacy-conscious, independent-minded company, we try to avoid SaaS services as much as we can. But we still find ourselves unable to resist some of the better products out there. For organization we use tools like [Zoom](https://www.zoom.us) for video calls, [Miro](https://miro.com/) for brainstorming, [Notion](https://www.notion.so) for documentation (you're on it!), [Airtable](https://airtable.com/) for workflow management, [PivotalTracker](https://www.pivotaltracker.com/) for ticketing, and [GitLab](https://about.gitlab.com/) for dev-ops and CI.

- **Other fun technologies** including [HAProxy](http://www.haproxy.org/), [OpenResty](https://openresty.org/en/), [Varnish](https://varnish-cache.org/), and bash. We don't touch them much because they do their job well and rarely break.

# Life at Splitgraph
**We are a young company building the initial team.** As an early contributor, you'll have a chance to shape our initial mission, growth and company values.

**We think that remote work is the future**, and that's why we're building a remote-first organization. We chat on [Mattermost](https://mattermost.com/) and have video calls on Zoom. We brainstorm with [Miro](https://miro.com/) and organize with [Notion](https://www.notion.so).

**We try not to take ourselves too seriously**, but we are goal-oriented with an ambitious mission.

**We believe that as a small company, we can out-compete incumbents** by thinking from first principles about how organizations interact with data. We are very competitive.

# Benefits
- Fully remote

- Flexible working hours

- Generous compensation and equity package

- Opportunity to make high-impact contributions to an agile team

# How to Apply? Questions?
[**Complete the job application**](https://4o99daw6ffu.typeform.com/to/ePkNQiDp)

If you have any questions or concerns, feel free to email us at [[email protected]](mailto:[email protected])

#Location
🌏 Worldwide
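The query-rewriting and column-masking features mentioned in the stack list live in the Lua/Python scripting layer inside PgBouncer. Purely as an illustration of the masking idea, not Splitgraph's actual implementation, here is a minimal Python sketch (the `mask_columns` helper and its signature are hypothetical):

```python
def mask_columns(rows, masked, redaction="***"):
    """Redact sensitive columns from result rows before they are
    proxied back to the client, leaving other columns untouched."""
    return [
        {col: (redaction if col in masked else val) for col, val in row.items()}
        for row in rows
    ]

rows = [{"user": "alice", "ssn": "123-45-6789", "plan": "pro"}]
safe = mask_columns(rows, masked={"ssn"})
assert safe == [{"user": "alice", "ssn": "***", "plan": "pro"}]
```

In the real system this kind of transform would run per-query in the proxy, keyed on the caller's ACLs, so the masking policy never depends on client-side behavior.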


See more jobs at Splitgraph

# How do you apply?

This job post is older than 30 days and the position is probably filled. Try applying to jobs posted recently instead.

DataKitchen


verified closed
🌏 Worldwide
 
πŸ’° $50k - $85k

python

 

docker

 
This job post is closed and the position is probably filled. Please do not apply.
Job description

We are seeking a world-class Manager of Toolchain Software Engineering, whose charter is to create a technical design-and-build team that can rapidly integrate dozens of tools into DataKitchen’s DataOps platform. There are hundreds of tools that our customers use to do their day-to-day work: data science, data engineering, data visualization, and governance. We have integrated many of those tools, but our customers are better served by starting with example ‘content.’ For us, that content is Recipes/Pipelines with working tool integrations across the varied toolchains and clouds that our customers and prospects use to do data analytics. We want our customers to start from example content and be doing DataOps on their platform in less than 10 minutes.

This is your chance to create a team from scratch and build a capability that is essential to our company’s success. This is a technical role: we are looking for a person who will code as well as hire and manage a team of engineers to do the work. The position demands strong communication, planning, and management abilities.

PRINCIPAL DUTIES & RESPONSIBILITIES

- Lead and grow the Toolchain Software Engineering organization, building a highly professional and motivated group.
- Deliver example content and integrations with consistently high quality and reliability, in a timely and predictable manner.
- Own the overall toolchain and example life cycle, including testing, updates, design, open-source sharing, and documentation.
- Manage departmental resources and staffing, and build a best-in-class engineering team.
- Manage customer support issues in order to deliver timely resolutions to software issues.

ESSENTIAL KNOWLEDGE, SKILLS, AND EXPERIENCE

- BS or MS in Computer Science or a related field
- At least 3-5 years of development experience building software or software tools
- Minimum of 1 year of experience in a project manager or engineering lead position
- Excellent verbal and written communication skills
- Technical experience in the following areas preferred:
  - Python, Docker, SQL, AWS, Azure, or GCP
  - Understanding of data science, data visualization, data quality, or data integration
  - Jenkins, DevOps, CI/CD

PERSONALITY TRAITS

- Leadership with flexibility and self-motivation, with a problem solver's attitude
- Highly effective written and verbal communication skills with a collaborative work style
- Customer focus, and a keen desire to make every customer successful
- Ability to create an open environment conducive to freely sharing information and ideas

Our company is committed to being remote-first, with employees in Cambridge, MA, various other states, Buenos Aires (Argentina), Italy, and other countries. You must be located within GMT+2 (e.g. Italy) to GMT-8 (e.g. California). We will not consider candidates outside those time zones. We do not work with recruiters.

DataKitchen is profitable, self-funded, and headquartered in Cambridge, MA, USA.

#Salary and compensation
$50,000 — $85,000/year

#Location
🌏 Worldwide
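As a rough sketch of the Recipe/Pipeline idea described above (all names here are illustrative, not DataKitchen's actual APIs), a toolchain integration can be modeled as ordered steps that run with a fail-fast, CI/CD-style gate:

```python
def run_recipe(steps):
    """Run pipeline steps in order, collecting per-step status.
    Stops at the first failure, the way a CI/CD gate would."""
    results = []
    for name, step in steps:
        try:
            step()
            results.append((name, "ok"))
        except Exception as exc:
            results.append((name, f"failed: {exc}"))
            break  # fail fast: downstream steps never run
    return results

# Hypothetical three-step recipe; each lambda stands in for a tool integration
steps = [
    ("extract", lambda: None),
    ("test-data-quality", lambda: None),
    ("publish", lambda: None),
]
assert run_recipe(steps) == [
    ("extract", "ok"), ("test-data-quality", "ok"), ("publish", "ok"),
]
```

The point of shipping example recipes like this is that a customer only swaps the step bodies for their own tool calls, rather than designing the orchestration and quality gates from scratch.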


See more jobs at DataKitchen

# How do you apply?

This job post is older than 30 days and the position is probably filled. Try applying to jobs posted recently instead.