pganalyze is hiring a Remote Senior Backend Engineer

At pganalyze, we redefine the user experience for optimizing the performance of Postgres databases. Our product helps customers such as Atlassian, Robinhood and DoorDash understand complex Postgres problems and performance issues.

Application developers use pganalyze to get deep insights into complex database behaviors. Our product is heavy on automated analysis and custom visualizations, and makes automatic recommendations, such as suggesting the best index to create for a slow query.

You will enjoy working at pganalyze if you are a software craftsperson at heart who cares about writing tools for developers. You will take new features from idea to production deployment end-to-end within days. Your work will regularly involve writing or contributing to open-source components as well as the Postgres project.

We are a fully remote company, with the core team based in the San Francisco Bay Area. Our company is bootstrapped and profitable. We emphasize autonomy and focus time by having few meetings per week.

### About the role

Your core responsibility: to develop and optimize our Postgres statistics and analysis pipeline, end-to-end, and to work on the processes that generate automated insights from the complex data set. This work involves having a detailed understanding of the core data points that are collected from the source Postgres database as a time series, and optimizing how they get retrieved, transported to the pganalyze servers, and then processed and analyzed.

Today, this data pipeline is a combination of open-source Go code (in the [pganalyze collector](https://github.com/pganalyze/collector)) and statistics processing written in Ruby. You will be responsible for improving this pipeline and introducing new technologies, including a potential rewrite of the statistics processing in Rust (a short sketch of the snapshot-and-delta idea behind such a pipeline follows this job description).

Some of the work will lead into the depths of Postgres code, and you might need to compile some C code, or understand in detail how the pganalyze parser library, [pg_query](https://pganalyze.com/blog/pg-query-2-0-postgres-query-parser), works.

Your work is the foundation of the next generation of pganalyze, with a focus on the automatic insights we can derive from the workload of the monitored Postgres databases, and on giving fine-tuned recommendations such as which indexes to create or which config settings to tune.

#### At pganalyze, you will:

* Collaborate with other engineers on shipping new functionality end-to-end, and ensure features are performant and well implemented
* Be the core engineer for the foundational components of pganalyze, such as the statistics pipeline that processes all data coming into the product
* Develop new functionality that monitors additional Postgres statistics, or derives new insights from the existing time series information
* Write Ruby, Go or Rust code on the pganalyze backend and the pganalyze collector
* Evaluate and introduce new technologies, such as deciding whether we should use Rust in more places in the product
* Optimize the performance of pganalyze components, using language-specific profilers or Linux tools like "perf"
* Scale out our backend, which relies heavily on Postgres itself for statistics storage
* Contribute to our existing open-source projects, such as pg_query, or create new open-source projects in the Postgres space
* Work with upstream communities, such as the Postgres project, and contribute code back

#### Previously, you have:

* Worked professionally for at least 5 years as a software engineer
* Written complex, data-heavy backend code in Rust, Go, Ruby or Python
* Used Postgres for multiple projects, are comfortable writing SQL, and are familiar with "EXPLAIN"
* Created indexes on a Postgres database because a query was slow
* Read the source of a complex open-source project to chase a hard-to-understand bug
* Written code that fetches data from and/or interacts with cloud provider APIs
* Structured your work and set your schedule to optimize for your own productivity

#### Optionally, you may also have:

* Written low-level C code, for fun
* Used Protocol Buffers, FlatBuffers, msgpack or Cap'n Proto to build your own APIs
* Analyzed patterns in time series data and run statistical analysis on the data
* Experimented with ML frameworks to analyze complex data sets
* Optimized a data-heavy application built on Postgres
* Written your own Postgres extensions
* Used APM and tracing tools to understand slow requests end-to-end

#### You could also be familiar with:

* Building your own Linux system from scratch
* The many [regards](https://twitter.com/regardstomlane) of Tom Lane on the Postgres mailing list
* Reproducible builds, and why it would be really nice to have them, like yesterday

#Salary and compensation
$140,000 - $180,000/year

#Location
United States / Canada
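The statistics pipeline described in the role above follows a common monitoring pattern: periodically snapshot cumulative Postgres counters (for example from pg_stat_statements) and derive per-interval deltas before any further analysis. Below is a minimal, self-contained Rust sketch of that snapshot-and-delta idea only; every type, field and function name here is a hypothetical illustration, not pganalyze's actual collector or backend code.

```rust
use std::collections::HashMap;

/// Cumulative counters for a single query, roughly as they might be read
/// from pg_stat_statements (hypothetical, heavily simplified).
#[derive(Clone, Copy)]
struct QueryStats {
    calls: u64,
    total_exec_ms: f64,
}

/// One snapshot of all queries at a point in time, keyed by query fingerprint.
type Snapshot = HashMap<u64, QueryStats>;

/// Per-interval activity derived from two cumulative snapshots.
struct QueryDelta {
    fingerprint: u64,
    calls: u64,
    mean_exec_ms: f64,
}

/// Compute per-query deltas between an earlier and a later snapshot.
fn diff(prev: &Snapshot, curr: &Snapshot) -> Vec<QueryDelta> {
    curr.iter()
        .filter_map(|(&fingerprint, now)| {
            let before = prev
                .get(&fingerprint)
                .copied()
                .unwrap_or(QueryStats { calls: 0, total_exec_ms: 0.0 });
            // Counters going backwards usually mean the server reset its
            // statistics; drop the sample rather than report negative deltas.
            if now.calls < before.calls {
                return None;
            }
            let calls = now.calls - before.calls;
            let total_exec_ms = now.total_exec_ms - before.total_exec_ms;
            (calls > 0).then(|| QueryDelta {
                fingerprint,
                calls,
                mean_exec_ms: total_exec_ms / calls as f64,
            })
        })
        .collect()
}

fn main() {
    // Two snapshots taken one collection interval apart (made-up numbers).
    let prev = Snapshot::from([(0x2a, QueryStats { calls: 100, total_exec_ms: 1_500.0 })]);
    let curr = Snapshot::from([(0x2a, QueryStats { calls: 160, total_exec_ms: 2_700.0 })]);

    for d in diff(&prev, &curr) {
        println!(
            "query {:x}: {} calls, {:.1} ms avg in this interval",
            d.fingerprint, d.calls, d.mean_exec_ms
        );
    }
}
```

Skipping samples whose counters moved backwards is one simple way to survive a server-side statistics reset without emitting negative deltas; a real pipeline would also persist the deltas as a time series rather than print them.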


See more jobs at pganalyze

Previous Remote Amazon Web Services + Software Developer + Rust Jobs

Mediasmart.io

closed · Spain · πŸ’° $20k - $30k · nodejs · go
This job post is closed and the position is probably filled. Please do not apply.
We are looking for a person willing to learn our technology stack and our business. This is an entry-level position. We expect a good grasp of modern JavaScript, but we will teach you what is needed from our stack. Reporting to the head of backend, you will be part of our backend dev team.

**Key Responsibilities**

* Develop new functionalities and maintain our product.
* Deliver new code fast and reliably, and always ensure that it works in production.
* Be able to promptly solve issues directly in the existing code.
* Proactively analyse areas to improve and propose projects to make our product more effective.

**Desired Skills & Experience**

* Fluency in English and Spanish.
* Knowledge of JavaScript and some other language.
* Some experience with Git version control.
* Basic Linux administration.
* Good communication skills, oral and written.
* Proactivity, critical thinking and a good disposition for working in teams.

**Bonus Experience**

* Experience with Node.js.
* Experience with Go.
* Experience with NoSQL and big data databases: BigQuery, Redis, RocksDB...
* Experience with cloud computing such as AWS or GCP.
* Interest in machine learning and statistics.

Don't worry if you lack some of the requirements; we would rather teach you everything needed than pass on the right person.

**What We Offer**

* Competitive salary.
* Permanent contract.
* Career plan.
* Courses (related online or offline courses you might like).
* Remote work, also after COVID.
* Flexible hours.
* Join a motivated and innovative team using state-of-the-art technologies.
* Join a company with large expansion and growth projects.
* Do work that makes a difference.

**About mediasmart**

mediasmart started in January 2012 and is one of the most innovative platforms in the mobile programmatic advertising space. VC-funded at its origin, mediasmart is now part of Affle International (Singapore), a global consumer intelligence technology company that has Microsoft, D2C (an NTT DoCoMo, Dentsu & NTT Advertising JV), Itochu, and Bennett Coleman & Company (BCCL) as shareholders, and whose Indian subsidiary trades on the Indian stock exchanges (BSE: 542752 & NSE: AFFLE).

mediasmart has offices in Madrid, Paris, and New York. Since its inception, mediasmart has always been very clear about its position in the mobile advertising ecosystem: full focus on advertisers and the buying process. Today we are the first programmatic platform that allows advertisers to measure the incremental impact of their drive-to-store and app promotion campaigns in real time, so that they can invest more in what really helps them grow their business.

mediasmart was one of the first players to enter the programmatic mobile ecosystem, and to date our proprietary technology stack comprises a DSP, DMP & Ad-Server, as well as direct connections to more than 30 ad exchanges where we buy display, video, audio and native ads on mobile apps, web, smart TV and desktop.

#Salary and compensation
$20,000 - $30,000/year

#Location
Spain


See more jobs at Mediasmart.io

# How do you apply?

This job post is older than 30 days and the position is probably filled. Try applying to jobs posted recently instead.