About Us: We're a bootstrapped startup on a mission to help Walmart and Amazon sellers at all levels, from retail arbitrage to wholesale to private label, find profitable products. Our goal is to offer tools that reduce the time and effort needed for product and market research. While we're not the first in the space, we're striving to be the best by providing actionable insights and innovative solutions.
Position: Senior Data Engineer / Analytics Engineer (Part-time or Full-time)
Hours: No Set Schedule (You Manage Your Time)
What We Need from You:
A self-starter with strong analytical and problem-solving skills and the ability to extract insights and patterns from complex data.
Good command of English, both written and spoken.
Ability to work remotely with minimal supervision.
Meticulous attention to detail and a passion for delivering high-quality work.
Willingness to take short-term pain for long-term gain.
Ability to work in the PST timezone.
Key Responsibilities:
Pioneer, develop, and maintain our data infrastructure, ensuring scalability and efficiency.
Make foundational technical decisions regarding tools, data models, and data processing strategies. Evaluate options such as aggregating data within the database or externally, choosing between PostgreSQL, BigQuery, Redshift, or leveraging PySpark with AWS EMR for data processing. These decisions should be backed by data and consider future scalability and efficiency.
Collaborate with founding members to understand data needs and deliver insights that drive business decisions for our customers.
Design and implement robust data models that support our product and market research capabilities.
Work with various data sources, including third-party APIs, crowd-sourced data, and web scraping, to enrich our dataset.
Build and maintain database infrastructure to enable ad-hoc search, filter, and sort capabilities in our dashboard.
Create data infrastructure to support historical timelines for products, enabling trend analysis and market forecasting.
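One of the foundational decisions listed above, aggregating inside the database versus pulling raw rows and aggregating externally, can be sketched in miniature. This is an illustrative example only: sqlite3 stands in for PostgreSQL/BigQuery/Redshift, and the table and column names are invented.

```python
import sqlite3

# Toy stand-in for product-level sales data in the warehouse.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (product_id INTEGER, units INTEGER)")
conn.executemany("INSERT INTO sales VALUES (?, ?)", [(1, 10), (1, 5), (2, 7)])

# Option A: aggregate inside the database; only summary rows leave the engine.
in_db = conn.execute(
    "SELECT product_id, SUM(units) FROM sales "
    "GROUP BY product_id ORDER BY product_id"
).fetchall()

# Option B: pull raw rows and aggregate externally (pandas or PySpark at scale).
totals = {}
for product_id, units in conn.execute("SELECT product_id, units FROM sales"):
    totals[product_id] = totals.get(product_id, 0) + units

# Both paths agree; they differ in where the compute and data transfer happen.
assert in_db == sorted(totals.items())
```

At toy scale the two options are interchangeable; at scale, Option A pushes compute to the warehouse and minimizes data movement, while Option B (for example, PySpark on AWS EMR) accepts the transfer cost in exchange for more flexible processing.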
Required Experience:
Strong business acumen and the ability to translate business needs into data solutions.
Proven experience in data modeling and SQL.
Basic familiarity with git.
Experience with data visualization and dashboard development is a plus.
At least 3 years of experience.
Tech Stack Knowledge:
Experience with data processing frameworks (e.g., Apache Spark, AWS EMR, Airflow).
Experience with cloud providers (e.g., AWS, GCP, Azure)
Familiarity with PostgreSQL or similar relational databases.
Knowledge of front-end technologies such as Vue 3 and Chart.js for dashboard development is a plus.
If your experience matches our requirements, be ready for the next steps:
A video detailing a previous project, the problems you encountered, your approach to solving them, the outcome of those solutions, and the impact on performance
Culture-fit screening call with the Founder
Technical interview with the Lead Engineer
Compensation: Negotiable, based on experience and qualifications. Equity options are available.
How to Apply: Please complete our brief survey at Survey Link and share any relevant experience in data engineering or analytics roles that you believe will make you successful in this position.
Join Us: This is not just a job; it's a career opportunity to grow with us. We're looking for a dedicated individual who is ready to make a long-term commitment and contribute to our vision.
Please mention the word COMFORTABLE when applying to show you read the job post completely. This is a feature to avoid fake spam applicants. Companies can search these words to find applicants that read this and instantly see they're human.
Salary and compensation
$10,000 — $30,000/year
Benefits
Profit sharing
No whiteboard interview
No monitoring system
No politics at work
We hire old (and young)
๐ Please reference you found the job on Remote OK, this helps us get more companies to post here, thanks!
When applying for jobs, you should NEVER have to pay to apply. You should also NEVER have to pay to buy equipment which they then pay you back for later. Also never pay for trainings you have to do. Those are scams! NEVER PAY FOR ANYTHING! Posts that link to pages with "how to work online" are also scams. Don't use them or pay for them. Also always verify you're actually talking to the company in the job post and not an imposter. A good idea is to check the domain name for the site/email and see if it's the actual company's main domain name. Scams in remote work are rampant, be careful! Read more to avoid scams. When clicking on the button to apply above, you will leave Remote OK and go to the job application page for that company outside this site. Remote OK accepts no liability or responsibility as a consequence of any reliance upon information on there (external sites) or here.
KnowBe4 is hiring a Remote Senior Data Scientist (Python)
Remote positions are open to the US only.
Data scientists work closely with business stakeholders to understand their goals and identify data-driven strategies to achieve them. They design data modeling processes, create algorithms and predictive models to extract the insights the business needs, and help analyze the data to increase the productivity and efficiency of the business.
Responsibilities:
Expert working experience with programming languages like Python, R, and SQL
Solid understanding of statistics, probability, and machine learning
Research, design, and implement machine learning algorithms to solve complex problems
Communicate complex concepts and statistical models to non-technical audiences through data visualizations
Identify opportunities and formulate data science/machine learning projects to optimize business impact
Serve as a subject matter expert in data science and analytics research, and adopt new tooling and methodologies at KnowBe4
Required Skills:
BS or equivalent plus 8 years of experience
MS or equivalent plus 3 years of experience
Ph.D. or equivalent plus 2 years of experience
Professional experience with Python, including its data libraries (NumPy, pandas, Matplotlib, scikit-learn, etc.)
Build machine learning models (training, validation, and testing) with appropriate solutions for data reduction, sampling, feature selection, and feature engineering
Design and evaluate experiments (including hypothesis testing) by creating key data sets
Help grow the Data Science function by defining and socializing best practices, particularly within a DataOps and MLOps data ecosystem
Document every action in issue/MR templates or READMEs so your learnings turn into repeatable actions and then into automation
Familiarity with the CRISP-DM analytics development model
Experience with a variety of statistical and machine learning methods (time series analysis, regression, classification, clustering, survival analysis, etc.)
Deep understanding of SQL in data warehouses (Snowflake and dbt) and in business intelligence tools (Looker)
Extensive knowledge and experience in creating and implementing recommendation systems, machine learning, NLP, statistics, and deep learning
Ability to quantify improvements to business efficiency or customer experience based on research outcomes
Expert understanding of statistics and the math behind data science algorithms
Identify and spearhead new data science initiatives, projects, and collaborations that improve results
Willingness to experiment and confront the hardest or most complex problems
The base pay for this position ranges from $135,000 to $150,000 and will vary depending on how well an applicant's skills and experience align with the job description above.
Salary and compensation
No salary data published by the company, so we estimated the salary based on similar jobs related to Python, Design, and DataOps:
$70,000 — $100,000/year
Benefits
401(k)
Distributed team
Async
Vision insurance
Dental insurance
Medical insurance
Unlimited vacation
Paid time off
4 day workweek
401k matching
Company retreats
Coworking budget
Learning budget
Free gym membership
Mental wellness budget
Home office budget
Pay in crypto
Pseudonymous
Profit sharing
Equity compensation
No whiteboard interview
No monitoring system
No politics at work
We hire old (and young)
Location: Clearwater, Florida, United States
This job post is closed and the position is probably filled. Please do not apply. Work for Revelator and want to re-open this job? Use the edit link in the email when you posted the job!
ABOUT US:
Revelator is a leading provider of business solutions for the music industry. Our all-inclusive music distribution platform, API, protocol, and web3 infrastructure, enhances efficiency in music distribution, financial reporting and simplifies royalty operations. We offer a wide range of services, including catalog management, supply chain, income tracking, rights management, and business intelligence. By leveraging our innovative solutions, music businesses can easily navigate the evolving landscape and capitalize on new opportunities.
THE ROLE:
The Data Ops Engineer is responsible for the day-to-day technical development and delivery of data pipelines into Revelator's data and analytics platform. You will ensure delivery of solutions built on the backbone of good architecture and data engineering best practices around operational efficiency, security, reliability, performance, and cost optimization.
Key Responsibilities:
Design, build, and optimize data engineering pipelines to extract data from different sources and applications and feed it into the cloud data platform.
Build, test, and productionize data extraction, transformation, and reporting solutions within the cloud platform.
Provide accurate and timely information that can be used in day-to-day operational and strategic decision making.
Code, test, and document new or modified data models and ETL/ELT tools to create robust and scalable data assets for reporting and analytics.
Contribute to our ambition to develop a best practice Data and Analytics platform, leveraging next generation cloud technologies.
Define and build the data pipelines that will enable faster, better, data-informed decision-making within the business.
Ensure data integrity within reports and dashboards by reviewing data, identifying and resolving gaps and inconsistencies, and escalating as required to foster a partnered approach to data accuracy for business reporting purposes.
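The pipeline responsibilities above follow a standard extract/transform/load shape, which can be sketched minimally. All names here (the CSV royalty report, the track_income table) are hypothetical stand-ins for Revelator's actual sources and warehouse, and sqlite3 stands in for Azure SQL or Snowflake.

```python
import csv
import io
import sqlite3

# Hypothetical source: a CSV royalty report (schema is illustrative only).
raw_report = "track_id,territory,amount\n1,US,10.50\n1,DE,3.25\n2,US,7.00\n"

# Extract: read rows from the source report.
rows = list(csv.DictReader(io.StringIO(raw_report)))

# Transform: normalize types and aggregate income per track.
totals = {}
for row in rows:
    track = int(row["track_id"])
    totals[track] = totals.get(track, 0.0) + float(row["amount"])

# Load: write the aggregates into the warehouse stand-in.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE track_income (track_id INTEGER PRIMARY KEY, total REAL)")
conn.executemany("INSERT INTO track_income VALUES (?, ?)", totals.items())
loaded = conn.execute(
    "SELECT track_id, total FROM track_income ORDER BY track_id"
).fetchall()
```

A production version of this cycle would add the concerns the posting names: tests around each stage, CI/CD for the pipeline code, and integrity checks that reconcile loaded totals against the source before reports are published.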
Requirements:
Bachelor's degree in Computer Science, Engineering, or related field.
5+ years of relevant work experience as a Data Ops/Data Integration Engineer, including:
Building ETL/ELT solutions for large scale data pipelines.
General expertise with SQL and database management (Azure SQL Server), including performance optimization.
Experience with CI/CD for data pipelines
Using Data Ops to develop data flows and the continuous use of data.
Data modeling
Data analysis
Developing technical and support documentation, and translating business requirements into reporting and data models.
Required Technical Skills
Azure Data Factory
Power BI, including Power BI scripting and automation
Snowflake, Snowpipes
.NET / C#
Python
SQL, Stored Procedures
Other Skills
Excellent problem-solving skills and the ability to work independently.
Strong teamwork and collaboration skills with the ability to lead and mentor junior developers.
Exceptional communication skills, both written and verbal in English.
Please mention the word AFFECTION when applying to show you read the job post completely. This is a feature to avoid fake spam applicants. Companies can search these words to find applicants that read this and instantly see they're human.
Salary and compensation
$70,000 — $100,000/year
Benefits
Distributed team
How do you apply?
This job post has been closed by the poster, which means they probably have enough applicants now. Please do not apply.
Lightspeed Commerce is hiring a Remote Head of BI & Analytics
Lightspeed is looking to bring its data capabilities to the next level from a corporate perspective. We are assembling a new group, and we're looking for a Head of BI & Analytics to lead the team that will be at the forefront of advancing Lightspeed's data-driven decisions to accelerate its growth.
You start your day wondering how you and your team could move the needle and be the reference for how to use and leverage data in the organization. You excel at problem-solving, and your not-so-secret weapon is pragmatism. You know the data landscape and you're game to bring order to your craft; governance and processes are your secret weapons. You are, after all, responsible for enterprise-wide governance and utilization of information as an asset, using data processing, analysis, data mining, and other means.
You're great at supporting, mentoring, and managing a team, and you have a reputation for delivering. You will develop data procedures and policies and work closely with various departments to collect, prepare, organize, protect, and analyze data assets. You lead inter-disciplinary teams, formally and informally, improving and streamlining data systems within the company and driving innovation.
You pride yourself on being able to bridge the gap between business and technical stakeholders. Your track record speaks for itself on data management, leadership, and information technology systems and tools. You bring a strong background in IT, business, and mathematics. You know how to build sound foundations encompassing all aspects: technology, people, and process.
What you'll be responsible for:
Thoroughly understand the business strategy and its stakeholders.
Lead and facilitate the process of defining an enterprise-wide data and analytics strategy with stakeholders, in alignment with strategic goals.
Design and implement data strategies, data governance frameworks, and systems.
Guide, motivate, and grow a team of skilled data practitioners as they solve complex business problems. Identify and encourage areas for growth, education, and career development for your team.
Understand, manage, or influence the collection, storage, management, pipelines, quality, and protection of data.
Implement data privacy policies and ensure compliance with data protection regulations.
Enable data value extraction, for example, efficiencies or increased revenue based on insights derived from data.
Effectively communicate the status, value, and importance of data collection and governance to executive members and staff.
Create a culture that promotes data-driven decisions backed by investigation and collaboration around data.
Serve as a trusted partner to key business executives focused on the customer, enterprise risk management, regulatory compliance, and finance.
Establish and oversee a mechanism to monitor compliance with governance and standards.
Define KPIs and follow-up mechanisms to ensure alignment between strategy and execution.
Facilitate development planning in collaboration with functional managers, customer success, and other internal teams.
Help prioritize and decompose high-level requirements into development tasks and work items for delegation. If required, lead effort estimation for sprint and data roadmap deliverables, and play a leading role in various committees (e.g., data steering committee, community of practice).
Initiate and contribute to continuous improvement of our data pipelines, processes, and practices.
What you'll be bringing to the team:
15+ years of experience in a senior-level data management role
Strong management experience in a relevant context, with the ability to provide guidance, mentoring, and context to team members
Strong leadership and communication skills, both written and verbal (French and English)
Project management skills: a solid understanding of Agile development and continuous delivery best practices applied to data-related backlogs, and the ability to plan, organize, prioritize, and keep the team focused on the right things
An analytical, innovative, and tenacious mindset
Professionalism and ethical behavior
Experience with most data governance aspects and master data management
Though hands-on work is not expected, very good knowledge of technologies such as Python, PHP, Ruby, SQL, and NoSQL databases, as well as a good understanding of modern, cloud-based data systems architecture concepts
Ability to partner effectively with other teams, whether business functional groups (Finance, Marketing, Sales) or technical ones (back-office systems, technology foundations teams, or product teams)
Solid experience delivering automated dashboards with key KPIs and their underlying pipelines, backed by solid business acumen
Bachelor's degree in Computer Science, Software Engineering, or equivalent experience; Master's degree preferred
Even better if you have, but not necessary:
Knowledge of eCommerce and POS systems
Experience managing budgets and building business or investment cases
Change management experience
Experience with data capabilities in a cloud-first / SaaS context
Experience with BI and big data pipelines and tools such as Tableau, Looker, and Power BI
Experience with DataOps concepts, data science, and cloud infrastructure (GCP, AWS)
Experience with data governance solutions (Collibra)
Experience with research methodology, ideally with a strong background in statistical analysis
Familiarity with JIRA, Confluence, Monday, or similar tools
What's in it for you?
Join a fast-paced, high-growth company.
Work on systems that handle billions of dollars in transactions for our merchants globally.
Surround yourself with strong talent and enjoy continuous professional growth.
Develop in a modern and proven technology stack.
Great benefits and perks, including equity and flexible/hybrid remote work options, in a diverse and inclusive environment.
Development of very high-traffic products used at global scale.
Opportunities to learn and expand your skill set.
Become a valued part of the diverse and inclusive Lightspeed family.
... and enjoy a range of benefits that'll keep you happy, healthy, and (not) hungry:
Lightspeed equity scheme (we are all owners)
Flexible paid time off policy
Health insurance
Health and wellness benefit of $500 per year
Paid leave and assistance for new parents
Mental health online platform and counseling & coaching services
Volunteer day
Salary and compensation
No salary data published by the company, so we estimated the salary based on similar jobs related to SaaS, DataOps, Education, Cloud, NoSQL, and eCommerce:
$60,000 — $90,000/year
Benefits
401(k)
Distributed team
Async
Vision insurance
Dental insurance
Medical insurance
Unlimited vacation
Paid time off
4 day workweek
401k matching
Company retreats
Coworking budget
Learning budget
Free gym membership
Mental wellness budget
Home office budget
Pay in crypto
Pseudonymous
Profit sharing
Equity compensation
No whiteboard interview
No monitoring system
No politics at work
We hire old (and young)
Location: Toronto, Ontario, Canada
Location: Montreal, Quebec, Canada
This job post is closed and the position is probably filled. Please do not apply. Work for Variant Perception and want to re-open this job? Use the edit link in the email when you posted the job!
Data Engineer with a Passion for Financial Markets
Variant Perception is a small investment research team working with sophisticated investors, asset managers and funds. We are passionate about markets and looking for a data engineer to help us scale our models and capabilities. Formal experience in finance is helpful, but a passion for investing is key.
We are known for using innovative analysis to build predictive models. Our investment process is based on first principles thinking with empirical verification. We study history to understand how markets and economies work, and use a variety of data to build systematic models using these insights. This allows our small team to cover many markets and asset classes.
Your role on the team
We have many investment models that span time horizons and asset classes. We use Prefect for model and job orchestration and GCP as our cloud provider. Experience with these or similar tools (GCP, AWS, Azure, Kubernetes, Prefect, Airflow, etc.) is preferable.
As our models have grown in number, so has our data. We need someone who can architect and manage the data infrastructure. This includes helping the quant team with ingesting and structuring datasets, and removing bottlenecks in existing workflows. We currently use Snowflake but would like to experiment with moving our data to BigQuery. Experience with SQL, Snowflake, BigQuery and general DB administration is preferable.
We have built a proof-of-concept data API that lets clients use our data in their investment decisions and strategies. Expanding the API's capabilities and performance would be one of your first projects, with an immediate impact on the business.
Ultimately we want to use recent LLM breakthroughs to build a natural language interface to our APIs for data, charts, and research. We are looking for someone with experience in, or an interest in learning about, retrieval-augmented generation (RAG).
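As a toy illustration of the retrieval step behind a RAG interface: keyword overlap below stands in for the embedding search a production system would use, and the example notes and function names are hypothetical, not Variant Perception's actual stack.

```python
def tokenize(text: str) -> set[str]:
    """Split text into lowercase words, dropping trailing punctuation."""
    return {w.strip(".,?").lower() for w in text.split()}

def retrieve(query: str, documents: list[str], k: int = 2) -> list[str]:
    """Rank documents by how many words they share with the query."""
    q = tokenize(query)
    scored = sorted(documents, key=lambda d: len(q & tokenize(d)), reverse=True)
    return scored[:k]

def build_prompt(query: str, documents: list[str]) -> str:
    """Assemble retrieved context and the user question into one LLM prompt."""
    context = "\n".join(f"- {d}" for d in retrieve(query, documents))
    return f"Context:\n{context}\n\nQuestion: {query}\nAnswer using only the context."

# Hypothetical research notes standing in for a real document store.
notes = [
    "Copper prices tend to lead global industrial production.",
    "Our liquidity indicator turned positive in March.",
    "Equity breadth divergences often precede drawdowns.",
]
prompt = build_prompt("What does copper tell us about industrial production?", notes)
print(prompt.splitlines()[1])  # the highest-scoring note appears first in the context
```

The retrieved context is then sent to an LLM alongside the question, so answers stay grounded in the firm's own research rather than the model's training data.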
As our investment R&D grows, we want to ingest more datasets that can help us understand what is happening in economies, beyond traditional financial and econometric data. We need someone to orchestrate the ingestion, but also to gather unstructured data and extract meaning from it. Again, an interest in ML is a bonus.
Are you a good fit?
You must believe it's possible to build predictive models for markets. If you believe in efficient markets, you will not like this job.
We have big goals for our team and rely on each other to achieve them. To be successful at VP you must be self-motivated, have trust in teammates, be forgiving of mistakes, be open-minded and above all enjoy figuring out problems, often from scratch.
There are many ideas we want to explore, each requiring new skills. You should enjoy continuous learning.
Location
Remote, but if you already live in New York, London or Charlotte, that's an added bonus.
Salary and compensation
$110,000 — $150,000/year
Benefits
Distributed team
This job post is closed and the position has likely been filled; please do not apply.
Total Compensation Value:
$80k - $180k (Salary + equity)
NOTE: Actual total compensation offer will vary based on applicant location / cost of living, skillsets, and level of relevant experience
Time zones:
Eastern (UTC -05:00) and Central European (UTC +01:00)
About the company:
Tessera provides ownership of the world's most sought-after NFTs! Working at Tessera, you will be building on the cutting edge of art, finance, and blockchain technology to help shape the future of digital collecting experiences.
You will get to work, learn and grow with an experienced team supported by incredible partnerships and committed investments from developers, collectors, investors, and thought leaders deeply passionate about the decentralization ecosystem.
We are looking for an exceptional data engineer to join our team. You will work closely with our CTO and web stack team to build our databases in anticipation of future data needs, identify where to find that data, and build scalable backend infrastructure. The role involves discovering and aggregating data from different sources and blockchains; relevant skills include designing database architecture, data structures, pipelines, ETL processes, API/SDK development, and some backend software development as we build our foundation for rapid growth, future data science initiatives, and continued innovation in this exciting space! Experience with blockchain, NFTs, and DeFi is preferred.
Integrations: Various APIs, browser-based crypto wallets (e.g., MetaMask) etc.
What to expect
Make a HUGE impact helping to bring ideas to reality
Learn highly valuable and complex concepts related to art, finance, and technology in a full-time job in an exponentially growing industry, on a team with some of the leading NFT influencers
Become a core member of a very passionate team in a friendly environment
Work within a dynamic team that challenges the status quo and champions agile working and continuous improvement
In this role you will be expected to...
Work with our development team to continually release technical enhancements
Be responsible for identifying and anticipating our data needs, optimizing our backend software infrastructure, setting data priorities, and building a scalable foundation for our fast-evolving site tessera.co
Own the data lifecycle (from ingest and automated quality checks, to discovery and usage, and database setup)
Build custom integrations between cloud-based or blockchain-based systems using APIs and other data sources
Design efficient data structures, database schemas and ETL for long-term sustainability
Ship high-quality, well-tested, secure, and maintainable code
Write server scripts and APIs
Routinely inspect server code for speed optimization and practical trade-offs
Build a scalable NFT metadata backend infrastructure for tessera.co
Incorporate data processing and workflow management tools into pipeline design (AWS etc.)
Design, develop, and optimize data pipelines and backend services for real-time decisioning, reporting, data collection, and related features / functions
Drive strategic technology decisions related to the appropriate data stores for the job (e.g., warehouses etc.)
(Long-term) Architect, build, and launch new data models that provide intuitive analytics to the team
Wrangle large-scale data sets from the blockchain and other site APIs (e.g., OpenSea)
Build data expertise and own data quality for the pipelines you create
Strong technical and non-technical communication abilities, both verbal and written
Our fast-paced, agile development environment will require a penchant for task management and respect for efficient, best practice development principles as well!
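As a toy illustration of the schema-design, ETL, and data-quality responsibilities above (the table and field names are hypothetical, not Tessera's actual schema), a minimal pipeline might normalize raw marketplace events into a relational table:

```python
import sqlite3

# Hypothetical raw sale events, as they might arrive from a marketplace API.
raw_events = [
    {"token_id": "42", "collection": "punks", "price_eth": "12.5"},
    {"token_id": "42", "collection": "punks", "price_eth": "12.5"},   # duplicate
    {"token_id": "7", "collection": "squiggles", "price_eth": "bad"}, # malformed
]

conn = sqlite3.connect(":memory:")
conn.execute(
    """CREATE TABLE sales (
        token_id   TEXT NOT NULL,
        collection TEXT NOT NULL,
        price_eth  REAL NOT NULL,
        PRIMARY KEY (token_id, collection)  -- dedupe on the natural key
    )"""
)

def load(events):
    """Validate, coerce types, and upsert; skip rows that fail quality checks."""
    for e in events:
        try:
            price = float(e["price_eth"])
        except ValueError:
            continue  # drop rows whose price cannot be coerced to a number
        conn.execute(
            "INSERT OR REPLACE INTO sales VALUES (?, ?, ?)",
            (e["token_id"], e["collection"], price),
        )

load(raw_events)
rows = conn.execute("SELECT token_id, price_eth FROM sales").fetchall()
print(rows)  # one clean row survives deduplication and validation
```

A production pipeline would do the same validate-coerce-upsert work against a warehouse and at much larger scale, but the ownership the posting describes, from ingest through quality checks to the database, follows this shape.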
What we're looking for
6+ months of tinkering and/or participating somewhere in Web3 (DeFi, NFTs, DAOs, etc.)
Experience with querying and interacting with EVM & non-EVM Based blockchains
Understanding of low-level idiosyncrasies of popular blockchains including Ethereum, Solana
3 or more years of relevant software experience in a data or backend-focused role
Strong experience with two or more of the following languages: Python, SQL, Javascript, Scala
Experience designing data structures, database schemas and ETL pipelines from scratch
Experience with workflow systems such as Apache Airflow
2 or more years of professional work experience implementing ETL pipelines using services such as PySpark, Glue, Dataflow, Lambda, Athena, S3, GCS, SNS, PubSub, Kinesis, etc.
Experience with scalable cloud-based solutions
A pro-active and autonomous team player, self-starter with the ability to anticipate future needs
Capable of prioritizing multiple projects in order to meet goals without management oversight
Experience in communicating with users, other technical teams, and product management to understand requirements, describe data priorities and challenges, and technical design needs
Excellent writing skills and the ability to drive outcomes through influence
Proficiency in the English language, both written and verbal, sufficient for success in a remote and largely asynchronous work environment
Strong attention to detail
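For readers unfamiliar with workflow systems like the Airflow mentioned above, their core guarantee is dependency-ordered task execution. That idea can be sketched with the standard library alone; the task names here are hypothetical, and a real orchestrator adds scheduling, retries, and monitoring on top.

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline: task name -> set of upstream dependencies.
pipeline = {
    "extract": set(),
    "transform": {"extract"},
    "quality_check": {"transform"},
    "load": {"quality_check"},
}

def run(dag: dict[str, set[str]]) -> list[str]:
    """Execute tasks in dependency order, the ordering an orchestrator guarantees."""
    order = list(TopologicalSorter(dag).static_order())
    for task in order:
        pass  # a real orchestrator would run the task here, with retries and logging
    return order

print(run(pipeline))  # ['extract', 'transform', 'quality_check', 'load']
```

The dictionary is the same DAG an Airflow user would declare with operators and `>>` dependencies; the topological sort is what ensures `load` never runs before its upstream checks pass.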
Bonus points for
BS/MS in Computer Science, Computer Engineering, or a related technical field
Previous experience in a rapidly scaling start-up environment
Professional work experience using real-time streaming systems (Kafka/Kafka Connect, Spark, Flink or AWS Kinesis)
Previous experience building Analytics/BI systems from scratch
Previous experience building large-scale data architectures
Previous experience in BI or Data Science
What we're offering
Competitive salary (and equity) in an exciting space driving disruptive innovation
The opportunity to play a key voice in our growing organization
A remote work environment with competitive benefits and holidays
7 additional company holidays, including all-company week-long winter break
Medical, Dental, and Vision Insurance for US-based employees
Agile working environment with flexible working hours and location, career advancement, and competitive compensation package
Optional offsite social events to help our employees become familiar with each other and our culture
Applicants for employment in the US must have work authorization that does not now or in the future require sponsorship of a visa for employment authorization in the United States
If you're convinced you are the right fit and you can't wait to join our team, we look forward to hearing from you!
Once you've applied, please be patient :) it may take us up to 2-3 weeks to get back to you!
Don't meet every single requirement?
Studies have shown that women and people of color are less likely to apply to jobs unless they meet every single qualification. At Tessera we are dedicated to building a diverse, inclusive and authentic workplace, so if you're excited about this role but your past experience doesn't align perfectly with every qualification in the job description, we encourage you to apply anyway.
You may be just the right candidate for this or other roles.
Salary and compensation
$80,000 — $180,000/year
Benefits
Distributed team
Async
Vision insurance
Dental insurance
Medical insurance
Unlimited vacation
Company retreats
Equity compensation