Remote Jobs at Brightfield Group 📈 Open Startup
There are now 2 remote jobs at Brightfield Group tagged Engineer and Data Science, such as Data Engineer and Data Scientist.


Brightfield Group

Data Scientist

Tags: data science

Posted 10 days ago
Position Overview:

Our tech team is looking for a data scientist with excellent communication skills and demonstrated experience writing idiomatic Python code. You're comfortable fielding a question from a non-technical stakeholder about our dataset and then putting together a data visualization with the answer. You're also ready to troubleshoot a bug in one of our existing ETL scripts and make a pull request with a detailed write-up of the fix. We use Google BigQuery, PowerBI, spaCy, pandas, Airflow, and Docker.

The right candidate has experience with the Python data science stack as well as one or more BI tools such as Tableau or PowerBI, and is able to juggle competing priorities with finesse. In our fast-paced, flexible start-up environment, we welcome your adaptability, curiosity, passion, grit, and creativity as you contribute to our cutting-edge research of this growing, fascinating industry.

Key Responsibilities:

* Query and transform data with Standard SQL and pandas
* Build BI reports to answer questions of our data
* Work with our data engineering team to munge large datasets using our existing data pipelines for our existing BI reports

Qualifications & Skills:

REQUIRED:

* 1-3 years of experience working full-time with Python for data science; we use pandas, scikit-learn, and NumPy
* Intermediate-to-expert SQL experience; we use Standard SQL
* Experience with one or more natural language processing frameworks; we use spaCy
* Excellent communication skills and demonstrated ability to collaborate with non-technical stakeholders to create compelling answers to tough data questions
* Intermediate-to-expert skills with one or more interactive business intelligence tools like PowerBI or Tableau

PREFERRED:

* Experience with CI/CD tools like CircleCI; we use GitHub Actions
* Experience with Docker
* Experience with Airflow

BENEFITS:

* Choose your own laptop
* Health insurance
* 401(k)
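The "query and transform" responsibility above is a munge-and-summarize workflow. As a minimal sketch of what that looks like in pandas, assuming hypothetical topic and mention-count fields standing in for a BigQuery extract (the column names and data are illustrative, not from the posting):

```python
import pandas as pd

def mentions_per_topic(df: pd.DataFrame) -> pd.DataFrame:
    """Aggregate raw mention counts by topic and rank them,
    the kind of summary that would back a BI report."""
    return (
        df.groupby("topic", as_index=False)["mentions"]
          .sum()
          .sort_values("mentions", ascending=False)
          .reset_index(drop=True)
    )

# Hypothetical sample rows standing in for a warehouse query result.
raw = pd.DataFrame({
    "topic": ["cbd", "vape", "cbd", "edibles"],
    "mentions": [10, 4, 5, 7],
})

summary = mentions_per_topic(raw)
print(summary)  # cbd first with 15 total mentions
```

In practice the input would come from a Standard SQL query (e.g. via a BigQuery client) rather than an inline DataFrame, and the output would feed a PowerBI or Tableau report.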

See more jobs at Brightfield Group

Apply for this Job

👉 Please mention that you found the job on Remote OK; this helps us get more companies to post here!

When applying for jobs, you should NEVER have to pay to apply. That is a scam! Posts that link to pages about "how to work online" are also scams; don't use them or pay for them. Always verify that you're actually talking to the company in the job post and not an imposter. Scams in remote work are rampant, so be careful! When you click the Apply button above, you will leave Remote OK and go to that company's job application page outside this site. Remote OK accepts no liability or responsibility as a consequence of any reliance upon information on external sites or here.

Brightfield Group

Data Engineer

Tags: engineer

Posted 11 days ago
Position Overview:

The ideal candidate is an experienced data engineer. You will help us develop and maintain our data pipelines, built with Python, Standard SQL, pandas, and Airflow within Google Cloud Platform. We are in a transitional phase of refactoring our legacy Python data transformation scripts into iterable Airflow DAGs and developing CI/CD processes around these data transformations. If that sounds exciting to you, you'll love this job. You will be expected to build scalable data ingress and egress pipelines across data storage products, deploy new ETL pipelines, and diagnose, troubleshoot, and improve existing data architecture. In our fast-paced, flexible start-up environment, we welcome your adaptability, curiosity, passion, grit, and creativity as you contribute to our cutting-edge research of this growing, fascinating industry.

Key Responsibilities:

* Build and maintain ETL processes with our stack: Airflow, Standard SQL, pandas, spaCy, and Google Cloud
* Write efficient, scalable code to munge, clean, and derive intelligence from our data

Qualifications & Skills:

REQUIRED:

* 1-3 years of experience in a data-oriented Python role, including use of:
  * Google Cloud Platform (GCE, GBQ, Cloud Composer, GKE)
  * Airflow
  * CI/CD tools like GitHub Actions or CircleCI
  * Docker
* Fluency in the core tenets of the Python data science stack: SQL, pandas, scikit-learn, etc.
* Familiarity with modern NLP systems and processes, ideally spaCy

PREFERRED:

* Demonstrated ability to collaborate effectively with non-technical stakeholders
* Experience scaling data processes with Kubernetes
* Experience with survey and/or social media data
* Experience preparing data for one or more interactive data visualization tools like PowerBI or Tableau

BENEFITS:

* Choose your own laptop
* Health insurance
* 401(k)
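The refactoring work described above, turning one-shot legacy scripts into pipeline steps, typically means splitting a monolithic script into discrete, idempotent extract/transform/load functions, each of which can then become an Airflow task (e.g. via PythonOperator). A minimal sketch, with hypothetical data and field names (the real pipeline would read from and write to BigQuery):

```python
# Each function below is a candidate Airflow task; chaining them here
# stands in for wiring them into a DAG.

def extract() -> list[dict]:
    # Stand-in for pulling rows from a warehouse table.
    return [
        {"text": "Great CBD oil", "score": 5},
        {"text": "", "score": 3},  # empty record the pipeline should drop
        {"text": "Vape pen broke", "score": 1},
    ]

def transform(rows: list[dict]) -> list[dict]:
    # Drop empty texts and normalize case, the kind of cleanup a
    # legacy script might have done inline.
    return [
        {"text": r["text"].lower(), "score": r["score"]}
        for r in rows
        if r["text"]
    ]

def load(rows: list[dict], sink: list) -> None:
    # Stand-in for writing back to a warehouse table.
    sink.extend(rows)

warehouse: list[dict] = []
load(transform(extract()), warehouse)
print(len(warehouse))  # 2 cleaned rows survive
```

Keeping each step a pure function of its inputs is what makes the eventual DAG retryable: Airflow can rerun any failed task without corrupting downstream state.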

See more jobs at Brightfield Group

Apply for this Job
