📈 Open Startup
RSS
API
Post a Job

get a remote job
you can do anywhere

75 Remote Big Data Jobs at companies like Ahrefs, Causalens and Pixalate last posted 17 days ago. The median salary for Big Data jobs while working remotely is $60,000 as of June 2019.

Get an email of all new remote Big Data jobs

Subscribe


👉 Hiring for a remote Big Data position?

Post a Job - $299
on the 🏆 #1 remote jobs board

This month


Ahrefs

On-call DevOps SRE for Big Data Infrastructure




big data

devops

17d

Apply

What We Need

Ahrefs is looking for a Site Reliability Engineer to help take care of its distributed crawler, powered by 2,000 servers, and ensure all systems are up and running 24/7. If you possess a healthy desire to automate everything while being able to quickly resolve urgent issues manually, then we want you! We strive to keep humans away from repetitive jobs that can be done by computers, and focus instead on foreseeing problems and defining programmatic means to handle them.

Our system is in large part custom OCaml code, and it also employs third-party technologies - Debian, ELK, Puppet, Clickhouse, and anything else that will solve the task at hand. In this role, be prepared to deal with a 25-petabyte storage cluster, 2,000 bare-metal servers, experimental large-scale deployments, and all kinds of software bugs and hardware deviations on a daily basis.

Basic Requirements:

* Deep understanding of operating system and network fundamentals
* Practical knowledge of Linux userspace and kernel internals

The ideal candidate is expected to:

* Understand the whole technology stack at all levels: from network and user-space code to OS internals and hardware
* Independently investigate infrastructure issues on live production systems, including dealing with hardware problems and interacting with datacenters
* Develop internal automation - monitoring, setup, statistics
* Foresee potential problems and prevent them from happening; apply first-aid reactions to infrastructure failures when necessary
* Help developers with deployment and integration
* Participate in the on-call rotation
* Make well-reasoned technical choices and take responsibility for them
* Approach problems with a practical mindset and suppress perfectionism when time is a priority
* Set up automatic systems to control infrastructure
* Possess a healthy detestation for complex shell scripts

See more jobs at Ahrefs

Apply for this Job

👉 Please reference you found the job on Remote OK, this helps us get more companies to post here!

When applying for jobs, you should NEVER have to pay to apply. That is a scam! Always verify you're actually talking to the company in the job post and not an imposter. Scams in remote work are rampant, be careful! When clicking on the button to apply above, you will leave Remote OK and go to the job application page for that company outside this site. Remote OK accepts no liability or responsibility as a consequence of any reliance upon information on there (external sites) or here.

Last 30 days


causaLens

verified

Senior Python Developer



python

big data

machine learning

numpy

1mo

Apply

**Summary**

We are looking for a motivated and high-achieving Senior Python Developer, based anywhere in Europe, to join the team working on an exciting new Big Data/Machine Learning platform. This is a full-time placement with significant opportunities for growth and advancement as one of the first employees of the company.

**The Company**

causaLens is a deep-tech startup based in London, backed by prominent VCs. We are on a mission to develop a machine that predicts the global economy in real time. We develop the next generation of autonomous predictive technology for complex and dynamic systems. We call it the CLPU (causaLens Predictive Unit). Our technology helps large organisations optimise business outcomes at scale.
Visit www.causaLens.com to find out more.

**Benefits**

The successful candidate will have the opportunity to join a fast-growing, agile, and international team passionate about innovation and making a difference. We offer guidance, mentorship, and opportunities for professional development.

# Responsibilities

Developing an automated machine learning platform for quantitative finance. The application stack is Python, Numpy, Scipy, Sklearn, Keras, Django, Celery, Postgres, AWS, GCP.

# Requirements

* Smart, capable, and able to write clean code
* Ability to design and architect high-performance distributed software
* Development experience in at least one scripting language - preferably Python
* Interest in machine learning/big data (prior experience a plus)
* Knowledge of the Git version control system
* Good organisational skills, both in management of time and of code
* Ability to work effectively from a remote location (home or a coworking space)

See more jobs at causaLens

# How do you apply?

Please send your CV and application to our email below.
Apply for this Job


This year


Pixalate

Big Data Engineer




big data

engineer

9mo

Apply

Who are we?

Pixalate helps the digital advertising ecosystem become a safer and more trustworthy place to transact in, by providing intelligence on "bad actors" using our world-class data. Our products provide benchmarks, analytics, research, and threat intelligence solutions to the global media industry. We make this happen by processing terabytes of data and trillions of data points a day across desktop, mobile, tablet, and connected-TV, generated using Machine Learning and Artificial Intelligence based models.

We are the world's #1 decision-making platform for digital advertising. And don't just take our word for it -- Forrester Research consistently depends on our monthly indexes to make industry predictions.

What does the media have to say about us?

* Harvard Business Review
* Forbes
* NBC News
* CNBC
* Business Insider
* AdAge
* CSO Online
* Mediapost
* The Drum

How is it working at Pixalate?

* We believe in small teams that produce high output
* Slack is a way of life; short emails are encouraged
* A fearless attitude is held in high esteem
* Bold ideas are worshipped
* Chess players do really well
* Titles don't mean much; you attain respect by producing results
* Everyone's a data addict and an analytical thinker (you won't survive if you run away from details)
* Collaboration, collaboration, collaboration

What will you do?

* Support existing processes running in production
* Design, develop, and support various big data solutions at scale (hundreds of billions of transactions a day)
* Find smart, fault-tolerant, self-healing, cost-efficient solutions to extremely hard data problems
* Take ownership of the various big data solutions, troubleshoot issues, and provide production support
* Conduct research on new technologies that can improve current processes
* Contribute to publications of case studies and white papers delivering cutting-edge research in the ad fraud, security, and measurement space

What are the minimum requirements for this role?

* Bachelors, Masters, or PhD in Computer Science, Computer Engineering, Software Engineering, or another related technical field
* A minimum of 3 years of experience in a software or data engineering role
* Excellent teamwork and communication skills
* Extremely strong analytical, critical thinking, and problem-solving abilities
* Proficiency in Java
* Very strong knowledge of SQL and the ability to implement advanced queries to extract information from very large datasets
* Experience working with very large datasets using big data technologies such as Spark, BigQuery, Hive, Hadoop, Redshift, etc.
* Ability to design, develop, and deploy end-to-end data pipelines that meet business requirements
* Strong experience with the AWS and Google Cloud platforms is a big plus
* Deep understanding of computer science concepts such as data structures, algorithms, and algorithmic complexity
* Deep understanding of the foundations of statistics and machine learning algorithms is a huge plus
* Experience with machine learning big data technologies such as R, Spark ML, H2O, Mahout, etc. is a plus

What do we have to offer?

Located in sunny Palo Alto and Playa Vista, CA, the core of Pixalate's DNA lies in innovation. We focus on doing things differently and we challenge each other to be the best we can be. We offer:

* Experienced leadership and founding team
* Casual environment (as long as you wear clothes, we're good!)
* Flexible hours (yes, we mean it - you will never have to sit in traffic again!)
* FREE lunches! (You name it, we've got it)
* Fun team events
* A high-performing team who wants to win and have fun doing it
* Extremely competitive compensation
* OPPORTUNITY (Pixalate will be what you make it)

See more jobs at Pixalate

Apply for this Job



IQVIA The Human Data Science Company

Senior Engineer Big Data Spark Scala




big data

scala

senior

engineer

1yr

Apply

We are looking for creative, intellectually curious, and entrepreneurial Big Data Software Engineers to join our London-based team.

The team

Join a high-profile team to work on ground-breaking problems in health outcomes across disease areas including ophthalmology, oncology, neurology, chronic diseases such as diabetes, and a variety of very rare conditions. Work hand-in-hand with statisticians, epidemiologists, and disease area experts across the wider global RWE Solutions team, leveraging a vast variety of anonymous patient-level information from sources such as electronic health records. The data encompasses IQVIA’s access to over 530 million anonymised patients as well as bespoke, custom partnerships with healthcare providers and payers.

The role

As part of a highly talented Engineering and Data Science team, write highly performant and scalable code that will run on top of our Big Data platform (Spark/Hive/Impala/Hadoop). Collaborate with Data Science & Machine Learning experts on the ETL process, including the cohort-building efforts.

What to expect:

* Working in a cross-functional team – alongside talented Engineers and Data Scientists
* Building scalable and high-performance code
* Mentoring less experienced colleagues within the team
* Implementing ETL and feature extraction pipelines
* Monitoring cluster (Spark/Hadoop) performance
* Working in an Agile environment
* Refactoring and moving our current libraries and scripts to Scala/Java
* Enforcing coding standards and best practices
* Working in a geographically dispersed team
* Working in an environment with a significant number of unknowns – both technically and functionally

Our ideal candidate: essential experience

* BSc or MSc in Computer Science or a related field
* Strong analytical and problem-solving skills, with a personal interest in subjects such as math/statistics, machine learning, and AI
* Solid knowledge of data structures and algorithms
* Proficient in Scala, Java, and SQL
* Strong experience with Apache Spark, Hive/Impala, and HDFS
* Comfortable in an Agile environment using Test Driven Development (TDD) and Continuous Integration (CI)
* Experience refactoring code with scale and production in mind
* Familiar with Python, Unix/Linux, Git, Jenkins, JUnit, and ScalaTest
* Experience with integration of data from multiple data sources
* NoSQL databases, such as HBase, Cassandra, MongoDB
* Experience with any of the following distributions of Hadoop: Cloudera/MapR/Hortonworks

Bonus points for experience in:

* Other functional languages such as Haskell and Clojure
* Big Data ML toolkits such as Mahout, SparkML, and H2O
* Apache Kafka, Apache Ignite, and Druid
* Container technologies such as Docker
* Cloud platform technologies such as DCOS/Marathon/Apache Mesos, Kubernetes, and Apache Brooklyn

This is an exciting opportunity to be part of one of the world's leading Real World Evidence-based teams, working to help our clients answer specific questions globally, make more informed decisions, and deliver results.

Our team within the Real-World & Analytics Solutions (RWAS) Technology division is a fast-growing group of collaborative, enthusiastic, and entrepreneurial individuals. In our never-ending quest for opportunities to harness the value of Real World Evidence (RWE), we are at the centre of IQVIA’s advances in areas such as machine learning and cutting-edge statistical approaches. Our efforts improve retrospective clinical studies, under-diagnosis of rare diseases, personalized treatment response profiles, disease progression predictions, and clinical decision-support tools.

We invite you to join IQVIA™.

IQVIA is a strong advocate of diversity and inclusion in the workplace. We believe that a work environment that embraces diversity will give us a competitive advantage in the global marketplace and enhance our success. We believe that an inclusive and respectful workplace culture fosters a sense of belonging among our employees, builds a stronger team, and allows individual employees the opportunity to maximize their personal potential.

See more jobs at IQVIA The Human Data Science Company

Apply for this Job



Xponential Works

Back End & Full Stack Engineers for Big Data Business Applications




big data

backend

full stack

1yr

Apply


Stats (beta): 👍 699 views, ✍️ 0 applied (0%)
Xponential Works is seeking candidates who are excited about building a petabyte-scale data warehouse from the ground up using cutting-edge, open-source technologies, hosted on Google Cloud Platform (GCP) and leveraging Docker, Kubernetes, and App Engine. Our developers and engineers work together to design, implement, and continuously deploy features; continually look for the next problem to solve; and seek to constantly improve existing systems. The platform you will build will be used to enable Internet of Things architectures, which seek to connect everyday objects to the internet, as well as to monitor and improve business operations for new products. We are seeking to fill Back End Software Engineer positions as well as Full Stack Developer positions in our newly renovated Ventura office.

Xponential Works houses several startups side by side in an open office environment. You will have the opportunity to interact with driven, innovative developers working to push emerging products and processes to the market.

As a Back End Software Engineer, you will be implementing the Extract Load Transform pipeline, architecting the non-relational data store, and generally making data available for front end use by queries and requests from external systems.

As a Full Stack Developer, you will be building the components of our system that securely connect the front end interface with the back end processing, improving system performance, assisting with data acquisition, and collaborating with teammates to drive modularization and stability across the entire architecture.

Expectations

As part of this technical venture company, you will collaborate with a small (4-6 person) development team to design and build a non-relational data warehouse. Responsibilities include developing new features, improving existing features, testing, fixing bugs, and juggling hats.

Skills & Requirements

We are seeking applications from strong developers of all backgrounds and orientations but, due to the nature of our work, applicants must be US citizens.

* Ability to focus on customer-driven requirements and customer satisfaction
* Ability to balance multiple projects and priorities in an Agile environment
* Desire to conquer ever-changing problems and products in a start-up atmosphere
* Passion for automation and testing as a part of development
* Active curiosity and interest in continually learning new technologies and improving existing skills
* Minimal travel possible but not anticipated; local candidate required
* Willing to commute or relocate near Ventura, CA

Basic Qualifications

* 3+ years of experience designing and delivering production-quality code for backend systems
* Working knowledge of multiple programming languages: Go, Python, Java, C++, etc.

Preferred Qualifications

* Experience with AWS and Google Cloud Platform
* Familiarity with Docker, Kubernetes, Hadoop, SQL, PostgreSQL, MongoDB
* Knowledge of data engineering
* Experience with continuous integration and deployment

Local candidates only; relocation assistance is not available.
Must be eligible to work in the US; sponsorship is not available.

See more jobs at Xponential Works

Apply for this Job



Ultra Tendency

Big Data Developer




big data

dev

digital nomad

1yr

Apply


Stats (beta): 👍 393 views, ✍️ 0 applied (0%)
Big Data Developer (Moscow)

Ultra Tendency LLC

Required experience: 3-6 years

Full time, full day

Ultra Tendency is a Moscow-based software development and consulting company, fast growing and specializing in the fields of Big Data and Data Science. We design, develop, and support complex algorithms and applications that enable data-driven products and services for our customers.

Ultra Tendency is looking for a Big Data Developer who wants to join our fast-growing team. The ideal candidate is a creative problem solver, resourceful in getting things done, and able to work independently or in a team.

We are looking for a person with:

* 2+ years of experience with data ingestion, analysis, integration, and design
* 2+ years of experience utilizing relational concepts, RDBMS systems, and data design techniques
* Strong Java or Scala programming skills, with 2+ years of experience in software development
* Experience developing on Linux
* Experience working with and developing on Apache Hadoop and/or Apache Spark
* Sound knowledge of SQL
* Solid computer science fundamentals (algorithms, data structures, and programming skills)
* Strong analytical skills
* A Computer Science (or equivalent) degree preferred, or comparable years of experience

We expect you to:

* Convert specifications into detailed instructions and logical steps to follow for coding
* Build Java or Scala program code, test it, and deploy it on different cluster systems
* Provide user support and problem solving - research results and analyze logs in search of causes, and correct the problems identified in programs
* Enjoy being challenged and solving complex data problems on a daily basis
* Document the programs developed: logic, coding, and corrections

What we offer:

* Fascinating tasks and interesting projects
* A young and dynamic team
* An individual career development path (Big Data Administrator, Big Data Developer, or Data Scientist)
* Fair pay, determined upon interview outcome and experience

See more jobs at Ultra Tendency

Apply for this Job



Affectv

Java Developer - Learn Big Data & Machine Learning




big data

dev

java

machine learning

1yr

Apply

Do you want to learn Big Data & Machine Learning within your Java Developer role? Would you enjoy working on large data sets/data lakes? Do you want the flexibility of working from home?

If the answer’s yes, then this is the role for you!

The best part is that you need no prior experience working in a Data Engineering position – just a solid knowledge of Java and the passion for wanting to make the move into a more data-heavy role!

We’re a mature tech start-up based in Central London and we need an innovative, passionate, and enthusiastic Java Developer to create and maintain a highly scalable, intelligent Data Lake platform, making use of Big Data and Machine Learning concepts which you will get to learn on the job.

We have a unique platform and we’re totally revitalising the way mobile apps are promoted, advertised, and marketed. As a result, we’re helping thousands of developers manage and monetise their apps, making them more visible to the wider public, making them easier to download, increasing the number of downloads, and therefore increasing revenue streams.

With excellent investment and a team of experienced leaders, we now need to scale up the development team (starting with the Java team and then looking to scale up the Data Science function).

To succeed as Java Developer (with a focus on data), you’ll learn skills on the job to:

* Develop a complex, intelligent Data Lake platform
* Work collaboratively with Java and Data Science teams
* Design highly scalable Java 8 architecture
* Utilise big data and machine learning technology

We’re at the forefront of the industry and tech is everything to us. As Java Developer you’ll have the opportunity to work with the latest technologies in the market (Big Data, Machine Learning, Scala, Python, Spark), with scope to gain cross-training into other areas of the business, including Data Scientist roles or further careers within Java at Lead level in the future.

In order to do this, you’ll bring your experience, including:

* A passion for clean code and best engineering principles, and you’ll be able to discuss where and why you’ve implemented them in previous roles
* Conceptual knowledge of Dropwizard for microservices architecture
* Knowledge of Gradle for automation and Git for version control

Familiarity with Docker containers and AWS knowledge would be a big bonus, and an understanding of REST APIs & Guice for dependency injection is highly beneficial but definitely not essential.

Your work/life balance matters, and we know that you’ll do your best work when you’re comfortable. So, with that in mind, we use Slack, Zoom, and other tech to create a flexible working environment, enabling you to work remotely or with the option of flexible hours.

Once up to speed with the tech and applications, you can choose to work from home or come into the office after rush hour. Avoid that manic/uncomfortable commute!

We know you’ll be successful as our new Java Developer, so we’re offering a competitive benefits package including share options and an extremely fast rate of progression due to rapid business growth plans.

And that’s not all! As well as the experience you’ll gain in Big Data, Machine Learning & AWS, you’ll have unlimited holidays, work alongside an ambitious Data Science team, and progress rapidly. Oh, and as well as a healthy salary, we’ll keep you healthy with unlimited fruit, tea, and coffee (OK, caffeine isn’t that healthy, but it makes us happy…).

Are you ready to tackle this challenge?

Kick your Java Developer career up a gear and apply now!

Tapdaq work in partnership with Talent Point, a Hiring Communications business, who have designed this role on behalf of Tapdaq. When you apply, Giggs will give you a call and he’ll tell you even more details about the role and help you through every step of the process.

No terminology in this advert is designed to discriminate on grounds of gender, race, colour, religion, creed, disability, age, sex, or sexual orientation. We’re an equal-opportunity employer and do not discriminate against these or any other class protected by applicable law.

See more jobs at Affectv

Apply for this Job



Ultra Tendency

Big Data Administrator


Ultra Tendency


big data

admin


1yr

Apply

Your Responsibilities:

* Responsible for development and troubleshooting using various Hadoop technologies
* Deploy and extend large-scale Hadoop clusters for our clients
* Fine-tune existing clusters for higher performance and throughput
* Work in our offices and on-site at the premises of our clients
* Enjoy being challenged and solving complex problems on a daily basis
* Be part of our newly formed team in Berlin and help drive its culture and work attitude

Job Requirements

* You are a structured problem solver who is able to work in a team as well as independently
* Experience administrating a Linux environment (RHEL)
* Strong background in the Hadoop ecosystem and its tools
* Understanding of software development methodologies and processes
* Sound knowledge of SQL, relational concepts and RDBMS systems is a plus
* Knowledge of Docker and related tools is a plus
* Ability to work in an English-speaking, international environment

We offer:

* Fascinating tasks and interesting Big Data projects in various industries
* Benefit from 10 years of delivering excellence to our customers
* Work with our open-source Big Data gurus, such as our Apache HBase committer and Release Manager
* Work in the open-source community and become a contributor
* Learn from open-source enthusiasts you will find nowhere else in Germany!
* Fair pay and bonuses
* Enjoy our additional benefits, such as a free BVG ticket and fresh fruit in the office
* Possibility to work remotely or in one of our development labs throughout Europe
* Work with cutting-edge equipment and tools

See more jobs at Ultra Tendency

Apply for this Job



Ultra Tendency

Software Engineer Big Data


Ultra Tendency


big data

dev

engineer

digital nomad


1yr

Apply

Your Responsibilities:

* Deliver value to our clients in all phases of the project life cycle
* Convert specifications to detailed instructions and logical steps, followed by their implementation
* Build program code, test and deploy to various environments (Cloudera, Hortonworks, etc.)
* Enjoy being challenged and solving complex data problems on a daily basis
* Be part of our newly formed team in Berlin and help drive its culture and work attitude

Job Requirements

* Strong experience developing software using Java or comparable languages (e.g., Scala)
* Practical experience with data ingestion, analysis, integration, and design of Big Data applications using Apache open-source technologies
* Strong background in developing on Linux
* Proficiency with the Hadoop ecosystem and its tools
* Solid computer science fundamentals (algorithms, data structures and programming skills in distributed systems)
* Computer Science (or equivalent) degree preferred, or comparable years of experience
* Sound knowledge of SQL, relational concepts and RDBMS systems is a plus
* Ability to work in an English-speaking, international environment

We offer:

* Fascinating tasks and interesting Big Data projects in various industries
* Benefit from 10 years of delivering excellence to our customers
* Work with our open-source Big Data gurus, such as our Apache HBase committer and Release Manager
* Work in the open-source community and become a contributor
* Learn from open-source enthusiasts you will find nowhere else in Germany!
* Fair pay and bonuses
* Enjoy our additional benefits, such as a free BVG ticket and fresh fruit in the office
* Possibility to work remotely or in one of our development labs throughout Europe
* Work with cutting-edge equipment and tools

See more jobs at Ultra Tendency

Apply for this Job



Anchormen

Senior Big Data Engineer


Anchormen


big data

senior

engineer


1yr

Apply

Overview

Anchormen is growing rapidly! Therefore, we are looking for additional experienced Big Data Engineers to serve our customer base at the desired level. This entails giving advice, building and maintaining Big Data platforms, and employing data science solutions/models in enterprise environments.

We build and deliver data-driven solutions that do not depend on one specific tool or technology. As an independent consultant and engineer, your knowledge and experience will be a major contribution to our colleagues and customers. A diverse and challenging position, where technology is paramount. Are you joining our team?

Responsibilities

* You will be working on 1 to 3 different projects at any given time.
* On average you will work 50% of the time at the Anchormen office and 50% at the client's location.
* You work closely together with the business to achieve data excellence.
* You have a proactive attitude towards the needs of the client.
* You will be building test-driven software.
* You gather data from external APIs and internal sources and add value to the data platform.
* You work closely together with data scientists to bring machine learning algorithms into a production environment.

Your profile

* You work and think at a Bachelor's or Master's level.
* You have a minimum of two years' experience in a similar position.
* You have knowledge of OO and functional programming in languages such as Java, Scala and Python (knowledge of several languages is a plus).
* You have knowledge of and experience with building and implementing APIs on a large scale.
* You have thorough knowledge of SQL.
* You believe in the principle of "clean coding": you don't just write code for yourself or a computer, but for your colleagues as well.
* You have hands-on experience with technologies such as Hadoop, Spark, Kafka, Cassandra, HBase, Hive, Elastic, etc.
* You are familiar with the Agile principles.
* You are driven to keep developing yourself and following the latest technologies.

About Anchormen

We help our clients use Big Data in a smart way, which leads to new insights, knowledge and efficiency. We advise our clients on designing their Big Data platform. Our consultants provide advice, implement the appropriate products, and create complex algorithms to do the proper analyses and predictions.

Why Anchormen

Anchormen has an open working environment. Everyone is open to initiatives. You can be proactive in these, and have every freedom to allow your work to be part of our success. We don't believe in micro-management, but give our people the freedom to function optimally. Hard work naturally also plays a part, but with enjoyment!

What we offer

* Flexibility in working from home.
* Competitive market salary.
* Training and development budget for employees' personal growth.
* Being part of a fast-growing and innovative company.
* Travel allowance.
* Friendly and cooperative colleagues.
* Daily office fruit and snacks.
* All the coffee you can consume!

See more jobs at Anchormen

Apply for this Job



AdRoll

Lead Big Data Engineer


AdRoll


big data

exec

engineer


2yr

Apply

About the Role:

AdRoll's data infrastructure processes 100 TB of compressed data, 4 trillion events, and 100 billion real-time events daily on a scalable, highly available platform. As a member of the data & analytics team, you will work closely with data engineers, analysts, and data scientists to develop novel systems, algorithms, and processes to handle massive amounts of data using languages such as Python and Java.

Responsibilities:

* Develop and operate our data pipeline & infrastructure
* Work closely with analysts and data scientists to develop data-driven dashboards and systems
* Tackle some of the most challenging problems in high-performance, scalable analytics
* Be available for after-hours issues and on-call duty, while aiming incessantly to reduce after-hours incidents
* Communicate with Product and Engineering Managers
* Mentor junior engineers on the team

Qualifications:

* A BS or MS degree in Computer Science or Computer Engineering, or equivalent experience
* 4-6 years of experience, at least 2 of which include leading teams
* Experience with scalable systems, large-scale data processing, and ETL pipelines
* Experience with big data technologies such as Hadoop, Hive, Spark, or Storm
* Experience with NoSQL databases such as Redis, Cassandra, or HBase
* Experience with SQL and relational databases such as Postgres or MySQL
* Experience developing and deploying applications on Linux infrastructure

Bonus Points:

* Knowledge of Amazon EC2 or other cloud-computing services
* Experience with Presto (https://prestodb.io/)

Compensation:

* Competitive salary and equity
* Medical / Dental / Vision benefits
* Paid time off and generous holiday schedule
* The opportunity to win the coveted Golden Bagel award
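The pipeline work this role describes boils down to extract-transform-load stages. As a hedged, minimal sketch of that pattern in Python (hypothetical event fields, not AdRoll's actual schema or stack):

```python
import json

def extract(raw_lines):
    """Parse raw JSON event records, skipping malformed lines."""
    for line in raw_lines:
        try:
            yield json.loads(line)
        except json.JSONDecodeError:
            continue  # a production pipeline would count and log these

def transform(events):
    """Keep click events and normalize the fields downstream jobs need."""
    for e in events:
        if e.get("type") == "click":
            yield {"user": e["user_id"], "ts": int(e["timestamp"])}

def load(records, sink):
    """Append transformed records to an in-memory sink (stand-in for a real store)."""
    count = 0
    for r in records:
        sink.append(r)
        count += 1
    return count

raw = ['{"type": "click", "user_id": "u1", "timestamp": "1000"}',
       'not json',
       '{"type": "view", "user_id": "u2", "timestamp": "1001"}']
sink = []
loaded = load(transform(extract(raw)), sink)
```

Because each stage is a generator, the pipeline streams records one at a time, which is the same shape the batch frameworks named in the posting (Hadoop, Spark) impose at much larger scale.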

See more jobs at AdRoll

Apply for this Job



data Artisans

Solutions Architect With Big Data Distributed Systems Experience


data Artisans


big data

architecture


2yr

Apply

data Artisans is currently building a new team of Solutions Architects in Europe and the US. You’ll be part of a new and fast-growing team helping customers have a great experience using our products and Apache Flink. The role sits at the forefront of one of the most significant paradigm shifts in information processing and real-time architectures in recent history - stream processing - which sets the foundation to transform companies and industries for the on-demand services era.

You will work with engineering teams inside our customers' organizations to build the best possible stream processing architecture for their use cases. This includes reviewing their architecture, giving guidance on how they design their Flink applications, and helping them take their first steps with our products.

Some customer engagements will be carried out remotely via phone and screen share, but the position also includes traveling to customers to help them on-site.

And when you’re not working with our customers, there are plenty of opportunities at data Artisans to learn more about Flink, contribute to the products and open-source projects, and help evangelize Apache Flink to users around the world.

What you’ll do all day:

* Use your experience to solve challenging data engineering and stream processing problems for our customers
* Meet with customers, understand their requirements, and help guide them towards best-of-breed architectures
* Provide guidance and coding assistance during the implementation phase and make sure projects end in successful production deployments
* Become an Apache Flink and stream processing expert

You will love this job if you…

* … are experienced in building and operating solutions using distributed data processing systems in large-scale production environments (e.g. Hadoop, Kafka, Flink, Spark)
* … are fluent in Java and/or Scala
* … love to spend the whole day talking about Big Data technologies
* … have great English skills and like talking to customers
* … like traveling around Europe and the USA and visiting new places

What we offer:

* Competitive salary and stock options
* Tech gear of your choice
* International team environment (10 nationalities so far)
* Flexible working arrangements (home office, flexible working hours)
* Unlimited vacation policy, so take time off when you need it
* Spacious office in the Kreuzberg district of Berlin
* Snacks, coffee and beverages in the office
* Relocation assistance if needed
* Hackathons and weekly technical Lunch Talks to keep your head full of inspiration and ideas!
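The stream processing this role centers on is, at its core, keyed windowed aggregation over event time. As a hedged illustration of the concept (plain Python, deliberately not the Flink API, and ignoring watermarks and out-of-order events):

```python
from collections import defaultdict

def tumbling_window_counts(events, window_size):
    """Count (timestamp, key) events per fixed-size event-time window.

    Returns {window_start: {key: count}} -- the same aggregation a keyed
    tumbling window in a stream processor computes, minus the hard parts
    (watermarks, late data, state checkpointing).
    """
    windows = defaultdict(lambda: defaultdict(int))
    for ts, key in events:
        window_start = ts - (ts % window_size)  # assign event to its window
        windows[window_start][key] += 1
    return {w: dict(counts) for w, counts in windows.items()}

events = [(0, "a"), (3, "a"), (5, "b"), (12, "a")]
result = tumbling_window_counts(events, window_size=10)
```

In a real Flink job the input would be unbounded and the operator would emit each window's result when the watermark passes its end; this batch version only shows the windowing arithmetic.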

See more jobs at data Artisans

Apply for this Job



AdRoll

Senior Big Data Engineer


AdRoll


big data

senior

engineer


2yr

Apply


About the Role:

AdRoll's data infrastructure processes 100 TB of compressed data, 4 trillion events, and 100 billion real-time events daily on a scalable, highly available platform. As a member of the data & analytics team, you will work closely with data engineers, analysts, and data scientists to develop novel systems, algorithms, and processes to handle massive amounts of data using languages such as Python and Java.

Responsibilities:

* Develop and operate our data pipeline & infrastructure
* Work closely with analysts and data scientists to develop data-driven dashboards and systems
* Tackle some of the most challenging problems in high-performance, scalable analytics
* Be available for after-hours issues and on-call duty, while aiming incessantly to reduce after-hours incidents
* Be available to assist junior engineers on the team

Qualifications:

* A BS or MS degree in Computer Science or Computer Engineering, or equivalent experience
* 3+ years of experience
* Experience with scalable systems, large-scale data processing, and ETL pipelines
* Experience with big data technologies such as Hadoop, Hive, Spark, or Storm
* Experience with NoSQL databases such as Redis, Cassandra, or HBase
* Experience with SQL and relational databases such as Postgres or MySQL
* Experience developing and deploying applications on Linux infrastructure

Bonus Points:

* Knowledge of Amazon EC2 or other cloud-computing services
* Experience with Presto (https://prestodb.io/)

Compensation:

* Competitive salary and equity
* Medical / Dental / Vision benefits
* Paid time off and generous holiday schedule
* The opportunity to win the coveted Golden Bagel award

See more jobs at AdRoll

Apply for this Job



SmileDirectClub is looking for an experienced Data Engineer to help design and scale our data pipelines to help our engineers, operations team, marketing managers, and analysts make better decisions with data. We are looking for engineers who understand that simplicity and reliability are aspects of a system that can’t be tacked on, but are carefully calculated with every decision made. If you have experience working on ETL pipelines and love thinking about how data models and schemas should be architected, we want to hear from you.

SmileDirectClub was founded on a simple belief: everyone deserves a smile they love. We are the first digital brand for your smile. The company was built upon a realization that recent trends in 3D printing and telehealth could bring about disruptive change to the invisible aligner market. By leveraging proprietary cutting-edge technology, we’re helping customers avoid office visits and cutting their costs by up to 70 percent, because people shouldn’t have to pay a small fortune for a better smile.

You will:

* Design and build new dimensional data models and schema designs to improve accessibility, efficiency, and quality of internal analytics data
* Build, monitor, and maintain analytics data ETL pipelines
* Implement systems for tracking data quality and consistency
* Work closely with Analytics, Marketing, Finance, and Operations teams to understand data and analysis requirements
* Work with teams to continue to evolve data models and data flows to enable analytics for decision making (e.g., improve instrumentation, optimize logging, etc.)

We’re looking for someone who:

* Has a curiosity about how things work
* Is willing to roll up their sleeves to leverage Big Data and discover new key performance indicators
* Has built enterprise data pipelines and can craft clean and beautiful code in SQL, Python, and/or R
* Has built batch data pipelines with Hadoop or Spark as well as with relational database engines, and understands their respective strengths and weaknesses
* Has experience with ETL jobs, metrics, alerting, and/or logging
* Has expert knowledge of query optimization in MPP data warehouses (Redshift, Snowflake, Cloudera, Hortonworks, MapR, or similar)
* Has experience in the latest, cutting-edge design and development of big data solutions
* Is proficient in the latest trends in big data analytics and architecture
* Can jump into situations with few guardrails and make things better
* Possesses strong computer science fundamentals: data structures, algorithms, programming languages, distributed systems, and information retrieval
* Is a strong communicator. Explaining complex technical concepts to product managers, support, and other engineers is no problem for you
* When things break, and they will, is eager and able to help fix them
* Is someone that others enjoy working with, due to your technical competence and positive attitude
* Is ready to design and create ROLAP, MOLAP, and RDBMS data stores

How to stand out against the rest:

* Academic background in computer science or mathematics (BSc or MSc), or demonstrated industry hands-on experience
* Experience with agile development processes
* Experience building simple scripts and web applications using Python, Ruby, or PHP
* A solid grasp of basic statistics (regression, hypothesis testing)
* Experience in small start-up environments

Benefits:

* Competitive salary
* Health, vision and dental insurance
* 401(k) plan
* PTO
* Discounted SmileDirectClub aligner treatment

About SmileDirectClub:

SmileDirectClub is backed by Camelot Venture Group, a private investment group that has been pioneering the direct-to-consumer industry since the early ’90s, particularly in highly regulated industries. If you’ve heard of 1-800-CONTACTS, Quicken Loans, HearingPlanet, DiabetesCareClub or SongbirdHearing, then you’ve heard of Camelot. Their hands-on approach, extensive networking, and operational expertise ensure their portfolio companies reach their potential.

Having closed a $46.7 million capital raise in July 2016 led by Align Technology (NASDAQ: ALGN), owner of the Invisalign® brand, SmileDirectClub is now valued at $275 million and is continuing to grow its share of the U.S. orthodontics market.
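The "dimensional data models" this posting asks for usually mean a star schema: a fact table of measurements joined to dimension tables of descriptive attributes. As a hedged toy sketch (hypothetical table and column names, not SmileDirectClub's warehouse), using SQLite in place of an MPP warehouse:

```python
import sqlite3

# Toy star schema: one fact table keyed to one dimension table.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, category TEXT);
CREATE TABLE fact_sales (
    product_id INTEGER REFERENCES dim_product(product_id),
    amount REAL
);
INSERT INTO dim_product VALUES (1, 'aligner'), (2, 'whitening');
INSERT INTO fact_sales VALUES (1, 100.0), (1, 50.0), (2, 25.0);
""")

# The classic dimensional query: aggregate facts, grouped by a dimension attribute.
rows = conn.execute("""
    SELECT d.category, SUM(f.amount)
    FROM fact_sales f JOIN dim_product d USING (product_id)
    GROUP BY d.category
    ORDER BY d.category
""").fetchall()
```

The same shape scales up: in Redshift or Snowflake the fact table holds billions of rows and the dimensions stay small, which is what makes the group-by-dimension query pattern efficient.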

See more jobs at SmileDirectClub

Visit SmileDirectClub's website

# How do you apply? Please apply using the following link: http://smiledirectclub.applicantstack.com/x/apply/a2do6hunjpsb
Apply for this Job



Hotjar

Valletta

Big Data Engineer


Hotjar

Valletta

amazon

elasticsearch

python

engineer


Valletta

2yr

**Note: although this is a remote position, we are currently only seeking candidates in time zones between UTC-2 and UTC+7.**

Hotjar is looking for a driven and ambitious DevOps Engineer with Big Data experience to support and expand our cloud-based infrastructure, used by thousands of sites around the world. The Hotjar infrastructure currently processes more than 7,500 API requests per second, delivers over a billion pieces of static content every week, and hosts databases well into terabyte-size ranges, making this an interesting and challenging opportunity. As Hotjar continues to grow rapidly, we are seeking an engineer who has experience dealing with high-traffic cloud-based applications and can help Hotjar scale as our traffic multiplies.

This is an excellent career opportunity to join a fast-growing remote startup in a key position.

In this position, you will:

- Be part of our DevOps team building and maintaining our web application and server environment.
- Choose, deploy and manage tools and technologies to build and support a robust infrastructure.
- Be responsible for identifying bottlenecks and improving the performance of all our systems.
- Ensure all necessary monitoring, alerting and backup solutions are in place.
- Do research and keep up to date on trends in big data processing and large-scale analytics.
- Implement proof-of-concept solutions in the form of prototype applications.

#Location
- Valletta
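Figures like "7,500 API requests per second" come from exactly the kind of monitoring this role owns. As a hedged toy sketch of the underlying idea (a sliding-window request-rate counter, not Hotjar's actual tooling):

```python
from collections import deque

class RateCounter:
    """Sliding-window request-rate counter.

    A toy version of the metric real monitoring systems expose;
    production setups would use a time-series database instead.
    """
    def __init__(self, window_seconds):
        self.window = window_seconds
        self.timestamps = deque()

    def record(self, ts):
        """Record one request at unix time ts (assumed non-decreasing)."""
        self.timestamps.append(ts)

    def rate(self, now):
        """Requests per second over the trailing window ending at `now`."""
        # Evict events that fell out of the window, then average.
        while self.timestamps and self.timestamps[0] <= now - self.window:
            self.timestamps.popleft()
        return len(self.timestamps) / self.window

rc = RateCounter(window_seconds=10)
for t in range(5):       # five requests at t = 0..4
    rc.record(t)
r = rc.rate(now=5)       # 5 requests over a 10-second window
```

The eviction loop is what makes the rate "sliding": each query pays only for the events that expired since the last query, so the counter stays cheap even at thousands of events per second.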

See more jobs at Hotjar

Apply for this Job



SkyTruth

Big Data Engineer


SkyTruth


big data

engineer

big data

engineer

2yr

Apply


This is an extraordinary opportunity to use cutting-edge big data and machine learning tools while doing something good for the planet and open-sourcing all your code.

SkyTruth is seeking an engineer to join the team building Global Fishing Watch, a partnership of SkyTruth, Oceana and Google, supported by Leonardo DiCaprio, and dedicated to saving the world's oceans from ruinous overfishing [Wired]. Our team works directly with the Google engineers who support Cloud ML, TensorFlow and Dataflow, and we are a featured Google partner.

https://cloud.google.com/customers/global-fishing-watch/

https://environment.google/projects/fishing-watch/

https://blog.google/products/maps/mapping-global-fishing-activity-machine-learning/

Your job is to develop, improve and operationalize the multiple pipelines we use to process terabytes of vessel tracking data collected by a constellation of satellites. We have a data set containing billions of vessel position reports, from which we derive behaviors based on movement characteristics using Cloud ML, and publish a dynamically updated map of global commercial fishing activity.

You will join a fully distributed team of engineers, data scientists and designers who are building and open-sourcing the next generation of the product, and who are very committed to creating a positive impact in the world while also solving novel problems using cutting-edge tools.

The company is headquartered in Washington, DC, the data science team is in San Francisco, and we have engineers in the US, Europe, South America and Indonesia. Daily scrums are scheduled around the US east coast timezone (so that kind of sucks for the guy in Indonesia :-)

Because this is open to remote work, we will get a lot of applicants. We are not just looking for an engineer with great skills who wants to work with cool tech. We also want you to be inspired by the project, so please tell us something that excites you about what we're doing when you contact us.

Here's some more stuff you can read about the impact our work has:

New York Times: Palau vs the Poachers

Science: Ending hide and seek at sea

Washington Post: How Google is helping to crack down on illegal fishing — from space
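Deriving behavior from vessel position reports starts with basics like speed between consecutive fixes, since fishing vessels move differently while fishing than while transiting. As a hedged sketch of that first step (standard haversine geometry, not Global Fishing Watch's actual pipeline):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in km between two lat/lon points."""
    r = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def speeds_knots(reports):
    """Speed between consecutive (unix_ts, lat, lon) position reports, in knots."""
    out = []
    for (t1, la1, lo1), (t2, la2, lo2) in zip(reports, reports[1:]):
        hours = (t2 - t1) / 3600.0
        nautical_miles = haversine_km(la1, lo1, la2, lo2) / 1.852  # km -> nm
        out.append(nautical_miles / hours)
    return out

# One hour to cover 0.1 degrees of longitude at the equator: roughly 6 knots.
reports = [(0, 0.0, 0.0), (3600, 0.0, 0.1)]
v = speeds_knots(reports)
```

In the real system these per-segment features feed a learned classifier (Cloud ML in the posting) rather than a hand-written speed threshold, but the feature extraction step looks like this.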

See more jobs at SkyTruth

Apply for this Job

👉 Please mention that you found the job on Remote OK, as this helps us get more companies to post here!

When applying for jobs, you should NEVER have to pay to apply. That is a scam! Always verify you're actually talking to the company in the job post and not an imposter. Scams in remote work are rampant, so be careful! When clicking the button to apply above, you will leave Remote OK and go to that company's job application page outside this site. Remote OK accepts no liability or responsibility as a consequence of any reliance upon information there (on external sites) or here.


Spinn3r

Java 'Big Data' Engineer


Spinn3r


big data

java

engineer

2yr

Apply


Stats (beta): 473 views, 0 applied (0%)
Company

Spinn3r is a social media and analytics company looking for a talented Java "big data" engineer.

As a mature, ten (10) year old company, Spinn3r provides high-quality news, blogs and social media data for analytics, search, and social media monitoring companies. We've just recently completed a large business pivot, and we're in the process of shipping new products, so it's an exciting time to come on board!

Ideal Candidate

We're looking for someone with a passion for technology, big data, and the analysis of vast amounts of content; someone with experience aggregating and delivering data derived from web content, and someone comfortable with a generalist and devops role. We require that you have knowledge of standard system administration tasks and a firm understanding of modern cluster architecture.

We're a San Francisco company, and ideally there should be at least a 4-hour overlap with the Pacific Standard Time Zone (PST / UTC-8). If you don't have a natural time overlap with UTC-8, you should be willing to work an alternative schedule so you can communicate easily with the rest of the team.

Culturally, we operate as a "remote" company and require that you're generally available for communication, self-motivated, and able to remain productive.

We are open to either a part-time or full-time independent contractor role.

Responsibilities

* Understanding our crawler infrastructure;

* Ensuring top-quality metadata for our customers. There's a significant batch-job component to analyze the output to ensure top-quality data;

* Making sure our infrastructure is fast, reliable, fault tolerant, etc. At times this may involve diving into the source of tools like ActiveMQ to understand how the internals work. We contribute to Open Source development to give back to the community; and

* Building out new products and technology that will directly interface with customers. This includes cool features like full text search, analytics, etc. It's extremely rewarding to build something from the ground up and push it to customers directly.

Architecture

Our infrastructure consists of Java on Linux (Debian/Ubuntu), with the stack running on ActiveMQ, Zookeeper, and Jetty. We use Ansible to manage our boxes. We have a full-text search engine based on Elasticsearch, which also backs our Firehose API.

Here are all the cool products that you get to work with:

* A large Linux / Ubuntu cluster with the OS versioned using both Ansible and our own Debian packages for software distribution;

* Large amounts of data indexed from the web and social media. We index 5-20TB of data per month and want to expand to 100TB of data per month; and

* A SOLR / Elasticsearch migration / install. We're experimenting with bringing this up now, so it would be valuable to get your feedback.

Technical Skills

We're looking for someone with a number of the following:

* Experience in modern Java development and associated tools: Maven, IntelliJ IDEA, Guice (dependency injection);

* A passion for testing, continuous integration, and continuous delivery;

* ActiveMQ, which powers our queue server for scheduling crawl work;

* A general understanding of and passion for distributed systems;

* Ansible or equivalent experience with configuration management;

* Standard web API use and design (HTTP, JSON, XML, HTML, etc.); and

* Linux, Linux, Linux. We like Linux!

Cultural Fit

We're a lean startup and very driven by our interaction with customers, as well as their happiness and satisfaction. Our philosophy is that you shouldn't be afraid to throw away a week's worth of work if our customers aren't interested in moving in that direction.

We hold the position that our customers are our responsibility, and we try to listen to them intently and consistently:

* Proficiency in English is a requirement. Since you will have colleagues in various countries with various primary language skills, we all need to use English as our common company language. You must also be able to work with email, draft proposals, etc. Internally we work like a large distributed Open Source project and use tools like email, Slack, Google Hangouts, and Skype;

* Familiarity working with a remote team and the ability (and desire) to work for a virtual company. You should have a home workstation, fast Internet access, etc.;

* Must be able to manage your own time and your own projects. Self-motivated employees will fit in well with the rest of the team; and

* It goes without saying, but being friendly and a team player is very important.

Compensation

* Salary based on experience;

* We're a competitive, great company to work for; and

* We offer the ability to work remotely, allowing for a balanced work-life situation.
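The posting mentions a full-text search engine backing the Firehose API. The production system is built on Elasticsearch; as a toy illustration of the inverted-index idea underneath full-text search (all names here are hypothetical, not Spinn3r code), a minimal sketch:

```python
from collections import defaultdict

class InvertedIndex:
    """Toy inverted index: maps each token to the set of document ids containing it."""

    def __init__(self):
        self.postings = defaultdict(set)

    def add(self, doc_id, text):
        # Naive tokenization; real engines apply analyzers, stemming, etc.
        for token in text.lower().split():
            self.postings[token].add(doc_id)

    def search(self, query):
        """Return ids of documents containing every query token (AND semantics)."""
        tokens = query.lower().split()
        if not tokens:
            return set()
        result = self.postings[tokens[0]].copy()
        for token in tokens[1:]:
            result &= self.postings[token]
        return result
```

Elasticsearch layers sharding, relevance scoring and near-real-time refresh on top of this same basic structure.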

See more jobs at Spinn3r

Apply for this Job



PROEMION

Expert Java Developer Big Data for IoT Cloud Platform


PROEMION


big data

dev

java

cloud

2yr

Apply


Stats (beta): 1,009 views, 0 applied (0%)
We give our customers the technology they need to globally transmit and analyze CAN-based telemetry data from their mobile industrial machinery and thereby boost their efficiency. Some of the world's most respected OEMs rely on Proemion, and thousands of off-road vehicles use our solution daily.

With our telematics solution (hardware and software), machine data is transmitted globally to our PROEMION Data Platform, where it can be accessed and analyzed using our APIs (REST, Streaming) or our web portal. For the development of the next generation of this pioneering and exciting technology, we are always seeking highly motivated expert developers for our team.

We are looking for an Expert Java Developer, Big Data (m/f) as a full-time remote position (part-time is possible too) at the earliest possible date. We also offer the position on-site in Fulda, Germany (relocation support offered).

See more jobs at PROEMION

Apply for this Job



Packetwerk

Experienced Java Developer for Elasticsearch Flume Big Data Analytics


Packetwerk


big data

dev

elasticsearch

java

2yr

Apply


Stats (beta): 694 views, 0 applied (0%)
You will design and build a cutting-edge analysis and reporting platform for our network data security products, working in close collaboration with our product management. The role requires strong Java knowledge.

We are developing a high-performance solution for securing and optimizing computer networks. To this end, we analyze user and threat data using real-time stream processing and big data solutions. The result of this analysis enables protection even against novel, previously unknown threats. Your part in the team is the optimization and continuous development of both our components and the data storage, handling, and analysis algorithms used.

Join us to help protect networks.

You'll be free to work on a Mac or Linux PC for development and to use up-to-date tools like Mattermost, Status Hero and git.

See more jobs at Packetwerk

Apply for this Job



Carbon Black

Principal Big Data Architect


Carbon Black


big data

architecture

2yr

Apply


Stats (beta): 556 views, 0 applied (0%)
Why Carbon Black?

At Carbon Black, you'll have the opportunity to make a huge impact while working alongside a global community of passionate people who are leading the way in cutting-edge technology. Our valued employees across the world have made Carbon Black a Top Place to Work, as named by the Boston Globe for two consecutive years.

Our Engineering team is moving at lightning speed on the breaking edge of technology. You'll be pulling things apart and tinkering, building new platforms, or playing in the cloud. Here, the engineering opportunities are endless. With this fast-paced, synergetic group, you'll be working together and across the organization to ensure customer success, all while continuing to build a product that protects their dearest assets.

Why You Matter:

There's incredible excitement and urgency around cyber security software right now. Protecting the world's intellectual property is an imperative, and Carbon Black is at the forefront of this market, providing solutions that protect industry and thought leaders in the Commercial, Retail, Finance, Energy and Government sectors.

We're looking for a Big Data Architect to take ownership of the overall information architecture that will enable our Data Science and Threat Research teams to efficiently access the data they need to do their jobs. You will understand everything from data warehousing to modern NoSQL data stores and analytics technologies. You will also be a big-picture architect, able to design schemas and build an information architecture that extracts or streams data from source systems while maintaining data quality. You will have experience designing both ETL for batch processing and data streaming architectures, and you will have a strong background in analytics.

What You'll Do:

* Primary: Architecting storage of terabytes of raw and processed data for analysis

* Primary: Designing ETL from existing internal data sources

* Primary: Designing streaming data architectures

* Primary: Designing systems for cleaning, normalizing, and sampling data for use in machine learning algorithms

* Primary: Designing data schemas tuned for ad-hoc and scheduled analytics

* Secondary: Statistical analysis of data
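To make the "cleaning, normalizing, and sampling" responsibility concrete, here is a minimal batch-ETL sketch. It is an illustration only, not Carbon Black's pipeline; the record fields (`host`, `bytes`) and the uniform sampling scheme are invented for the example:

```python
import random

def clean(record):
    """Drop records missing required fields; normalize field types and values."""
    if record.get("host") is None or record.get("bytes") is None:
        return None
    return {"host": record["host"].strip().lower(), "bytes": int(record["bytes"])}

def etl_sample(records, sample_rate, seed=42):
    """Minimal batch ETL pass: clean each record, then down-sample for training.

    A fixed seed keeps the sample reproducible across reruns of the batch job.
    """
    rng = random.Random(seed)
    cleaned = (clean(r) for r in records)
    return [r for r in cleaned if r is not None and rng.random() < sample_rate]
```

At production scale the same clean/normalize/sample steps would typically run as distributed batch or streaming jobs rather than a single in-memory pass.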

See more jobs at Carbon Black

Apply for this Job



Hotjar

Big Data DevOps Engineer


Hotjar


big data

devops

engineer

devops

2yr

Apply


Stats (beta): 732 views, 0 applied (0%)
Note: Although this is a remote position, we are currently only seeking candidates in timezones between UTC-2 and UTC+7.

Hotjar is looking for a driven and ambitious DevOps Engineer with Big Data experience to support and expand our cloud-based infrastructure, which is used by thousands of sites around the world. The Hotjar infrastructure currently processes more than 7,500 API requests per second, delivers over a billion pieces of static content every week, and hosts databases well into terabyte-size ranges, making this an interesting and challenging opportunity. As Hotjar continues to grow rapidly, we are seeking an engineer who has experience dealing with high-traffic cloud-based applications and can help Hotjar scale as our traffic multiplies.

This is an excellent career opportunity to join a fast-growing remote startup in a key position.

In this position, you will:

* Be part of our DevOps team building and maintaining our web application and server environment.

* Choose, deploy and manage tools and technologies to build and support a robust infrastructure.

* Be responsible for identifying bottlenecks and improving the performance of all our systems.

* Ensure all necessary monitoring, alerting and backup solutions are in place.

* Do research and keep up to date on trends in big data processing and large-scale analytics.

* Implement proof-of-concept solutions in the form of prototype applications.

See more jobs at Hotjar

Apply for this Job



Stats (beta): 1,671 views, 0 applied (0%)
The Big Data Services engineering team is responsible for providing software tools, platforms and APIs for collecting and processing large datasets, complete with search, analytics and real-time pipeline processing capabilities to address the unique challenges of our industry. We are building large distributed systems that will be the heart of our data architecture: serving billions of requests, providing search and analytics across structured and semi-structured datasets, and scaling out to tens of terabytes while maintaining low latency, availability and immediate discoverability for clients. We are reimagining the way we architect our data infrastructure across the company and are looking for an experienced software engineer to help. If solving intricate engineering issues with distributed systems, platform APIs, real-time big-data pipelines and search and discovery query patterns are your calling, we would like to hear from you.

Major Responsibilities:
- Develop and maintain internal Big Data services and tools
- Leverage Service Oriented Architecture to create APIs, libraries and frameworks that our Studios will use
- Help build the real-time Data Platform to support our games
- Design and build data processing architecture in AWS
- Design, support and build data pipelines
- Develop ETL in a distributed processing environment

What You Need for this Position:
- Bachelor's degree in a technical field (e.g., MIS, Computer Science, Engineering, or a related field of study)
- Full-stack experience is ideal, as you'll be delivering data and analytics solutions for business, analytics and technology groups across the organization
- Minimum of 3 years of demonstrated experience with object-oriented programming (Java)
- Working knowledge of Python
- Experience in Go (Golang) is a huge plus
- Advanced skills in Linux shell and SQL are required
- Background with databases
- Experience in Data Modeling/Integration and designing REST-based APIs for consumer-facing services is a plus
- Good knowledge of open source technologies and the DevOps paradigm
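A minimal sketch of the kind of aggregation step a real-time pipeline like this performs: counting events per tumbling time window. This is illustrative only; the `(timestamp_s, event_name)` event shape is an assumption, not Peak Games' API:

```python
from collections import Counter, defaultdict

def tumbling_window_counts(events, window_s):
    """Count events per (window_start, event_name) over a stream of
    (timestamp_s, event_name) tuples, the shape of a minimal real-time
    aggregation step."""
    windows = defaultdict(Counter)
    for ts, name in events:
        # Align each timestamp to the start of its fixed-size window.
        window_start = ts - (ts % window_s)
        windows[window_start][name] += 1
    return windows
```

In a production pipeline this logic would run incrementally over a stream (e.g. on a framework like Spark or Kinesis consumers) rather than over an in-memory list, but the windowing arithmetic is the same.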

See more jobs at Peak Games

Visit Peak Games's website

How do you apply?
Apply for this Job



Hazelcast

Big Data Engineer


Hazelcast


engineer

big data

3yr

Apply


Stats (beta): 1,064 views, 0 applied (0%)
Would you like to work on a new and exciting big data project? Do you enjoy any of the following?

* Solving complex problems around distributed data processing.

* Implementing non-trivial infrastructure code.

* Creating well-crafted and thoroughly tested features, taking full responsibility from the design phase.

* Paying attention to all aspects of code quality, from clean code to allocation rates.

* Digging into mechanical sympathy concepts.

* Delivering a technical presentation at a conference.

At Hazelcast you will have the opportunity to work with some of the best engineers out there:

* Who delve into JVM code.

* Who implement and scrutinize garbage collection algorithms.

* Who take any piece of software and multiply its performance by applying deep technical understanding.

* Who regularly squash bugs in the depths of a JVM.

We are looking for people who can deliver solid production code. You may either work in one of our offices in London or Istanbul, or code remotely from a home office. It is also preferable that you are within a few hours of the CET timezone, as this is where most of the developers are based.

See more jobs at Hazelcast

Apply for this Job



SiteZeus

Full Stack Web Developer Big Data Machine Learning


SiteZeus


big data

full stack

dev

web dev

3yr

Apply


Stats (beta): 3,781 views, 0 applied (0%)
SiteZeus is a fast-growing startup disrupting the way growing companies select new locations, using predictive analytics based on big data and machine learning. We are seeking the best and brightest engineers to join our Product Development team to enhance our award-winning product suite.

See more jobs at SiteZeus

Apply for this Job



GridCell

Graduate/Junior Java Developer: Front End, Big Data, Deep Learning


GridCell


java

junior

front end

big data

3yr

Apply


Stats (beta): 1,601 views, 0 applied (0%)
You will work on really novel software applications: designing the architecture, planning tasks, implementing and testing code, and deploying it to production. We always work with the latest software platforms, build tools, libraries and frameworks, sometimes before they are released to the public.

There will be lots of opportunity to deploy and monitor cloud infrastructure, set up and run deep learning and machine learning pipelines, and work on both back-end and front-end sections. There will always be plenty of support and a balanced workload.

You will get to use really amazing tech in a supportive environment, with the possibility of training in relevant emerging technology. You will also get support and mentoring in developing robust production-ready software, and get an insight into managing customer relationships and carrying out business analytics.

Compared to mainstream graduate schemes and working in large consultancies, there will be no boring meetings, no seminars and no 'what colour best describes you' training programmes (yes, we've all done them). Just pure coding, learning and development, and getting the best experience and support you can get.

See more jobs at GridCell

Apply for this Job



Clear Returns

Big Data Engineer For Glasgow Based Start Up


Clear Returns


engineer

big data

3yr

Apply


Stats (beta): 396 views, 0 applied (0%)
This is an exciting opportunity that incorporates managing all technical and engineering aspects of the analytics infrastructure. You will maintain and improve the data warehouse and ensure that data obtained from diverse and varied sources is appropriately captured, cleaned, and utilised. As a member of a small team you could also get the opportunity to be involved in Data Science projects, though this is not a prerequisite of the role. You will have significant influence on the future of the technologies used by the team, which is central to the company's growth strategy.

See more jobs at Clear Returns

Apply for this Job



Stats (beta): 1,062 views, 0 applied (0%)
We're looking to bring on a new full stack developer to help us build out the next version [v3] of [AwesomeWall](http://awesomewallhq.com). We are analysing over 42 million posts a day, and we've got a sweet working relationship with Facebook & Instagram. An ex-Facebooker now works on the team, and we are a Facebook Media Solutions Partner. Customers include music festivals, brands, and worldwide events.

The current system is built with node, angular, mongo, redis and mysql.

Perks:
- Negotiable salary / payment; we're a profitable company
- Optional team outings, including Glastonbury Festival and skiing / snowboarding
- A super friendly, award-winning, internationally recognised team that gets on like a family
- Share options & profit share
- Forced quarterly vacation time
- Some worldwide travel available (but not necessary)
- Remote working (we believe this is a perk)

Extra tags: javascript, node, php, angular, social media, big data

See more jobs at We Make Awesome Sh

Visit We Make Awesome Sh's website

Apply for this Job



Divergent Strategies

Big Data Microservices Developer For Social App


Divergent Strategies


big data

dev

digital nomad

3yr

Apply

We believe in questioning paradigms and are developing an app that challenges widely held beliefs about social media. By daring to do 'social' differently, we solve many of the problems that are hindering social media's evolution. You will be joining a sophisticated team of developers, data scientists and entrepreneurs for the final stage of the app's development. We are located in Los Angeles and prefer candidates from the area.

See more jobs at Divergent Strategies

Apply for this Job



CSC

Big Data Product Manager


CSC


big data

product manager

exec

3yr

Apply


Stats (beta): 486 views, 0 applied (0%)
Seeking a dynamic, lean-inspired Offering Manager. This self-starter is a key hire for our fast-growing, execution-oriented company. We need a resourceful problem solver who brings aptitude, energy, and integrity, as well as a hunger for success, to the table. Success in this role depends critically on the ability to collaborate with many different customers and stakeholders, translate their needs into actionable product and service priorities, and ensure successful execution of those priorities and actions.

As Offering Manager, you will assist with planning and managing key offering development projects and service frameworks that help our customers leverage complex Big Data & Analytics platforms and tools to drive impactful business insight. You will become an expert on our offerings, the Big Data industry and its use cases, and our buyers. You will drive customer happiness and sales success with the best Big Data products and services on the market.

See more jobs at CSC

Apply for this Job



Linux Academy

Big Data Content Creator Instructor


Linux Academy


big data

teaching

3yr

Apply


Stats (beta): 544 views, 0 applied (0%)
Linux Academy is seeking an exceptional Big Data developer, and we know we're not alone in our search. Why choose Linux Academy? Linux Academy is a well-funded, four-year-old company that serves tens of thousands of students and thousands of companies. Your teaching will be in front of companies like Rackspace, Mailchimp, Mirantis, Media Temple and more. YOU will be teaching the experts how to be experts! You will have the opportunity to celebrate the success of other students and will be directly responsible for their success. Teaching at Linux Academy is a very rewarding career.

Benefits

* High compensation with great bonus opportunities
* Three weeks vacation immediately available
* Telecommuting optional
* Free drinks/snacks (delivered to your doorstep if you are remote)
* $3k worth of technology for your office setup, then a $2k/year home office/tech budget
* Paid training, paid conferences, and paid travel to any meetups you'd like to attend
* Flexible schedule
* Health benefits
* Simple IRA with match
* Satisfaction of helping others
* Benefits increase with each year of service
* Compensation increases proportionate to new knowledge gained while instructing at Linux Academy

See more jobs at Linux Academy

Apply for this Job

๐Ÿ‘‰ Please reference you found the job on Remote OK, this helps us get more companies to post here!



Crossover

Senior Big Data Software Engineer $90K 100% Position


Crossover


senior

engineer

big data

dev


3yr

Apply


Are you a Senior Software Engineer who has spent several years working with Big Data technologies? Have you created streaming analytics algorithms to process terabytes of real-time data and deployed Hadoop or Cassandra clusters over dozens of VMs? Have you been part of an organization driven by a DevOps culture, where the engineer has end-to-end responsibility for the product, from development to operating it in production? Are you willing to join a team of elite engineers working on a fast-growing analytics business? Then this role is for you!

Job Description

The Software and DevOps engineer will help build the analytics platform for Bazaarvoice data that will power our client-facing reporting, product performance reporting, and financial reporting. You will also help us operationalize our Hadoop clusters, Kafka and Storm services, and high-volume event collectors, and build out improvements to our custom analytics job portal in support of Map/Reduce and Spark jobs. The Analytics Platform is used to aggregate data sets to build out new product offerings related to analytics and media, as well as a number of pilot initiatives based on this data. You will need to understand the business cases of the various products and build a common platform and set of services that help all of our products move fast and iterate quickly. You will help us pick and choose the right technologies for this platform.

Key Responsibilities

In your first 90 days you can expect the following:

* An overview of our Big Data platform code base and development model
* A tour of the products and technologies leveraging the Big Data Analytics Platform
* 4 days of Cloudera training to provide a quick ramp-up on the technologies involved
* By the end of the 90 days, you will be able to complete basic enhancements to code supporting large-scale analytics using Map/Reduce, as well as contribute to the operational maintenance of a high-volume event collection pipeline.

Within the first year you will:

* Own the design, implementation, and support of major components of platform development. This includes working with the platform team's various stakeholders to understand their requirements and deliver high-leverage capabilities.
* Have a complete grasp of the technology stack, and help guide where we go next.

See more jobs at Crossover

Apply for this Job



Instructure

Senior Software Engineer Big Data


Instructure


senior

engineer

big data

dev


4yr

Apply


Instructure was founded to define, develop, and deploy superior, easy-to-use software. (And that’s what we did / do / will keep on doing.) We are dedicated to the fight against iffy, mothbally, shoddy software. We make better, more usable tools for teaching and learning (you know, stuff people will actually use). A better connected and more open edtech ecosystem. And more effective ways for everyone everywhere to access education, make discoveries, share knowledge, be inspired, and do big things. We accomplish all this by giving smart, creative, passionate people opportunities to create awesome. So here’s your opportunity.

We are hiring engineers passionate about using data to gain insight, drive behavior and improve our products. Our software helps millions of users learn and grow. Come help accelerate the learning process by developing data-centric features for K-12, higher education and corporate users.

WHAT YOU WILL BE DOING:

* The Instructure suite of SaaS applications produces terabytes of events and student information weekly. Your challenge will be to create the systems that organize this data and return insights to students, teachers and administrators. You will also work to integrate data-driven features into core Instructure products.
* This team engineers the data and analytics platform for the entire Instructure application portfolio. This is a growing team at Instructure with the opportunity to provide tangible positive impact to the business and end users. We are looking for creative, self-motivated, highly collaborative, extremely technical people who can drive a vision to reality.

See more jobs at Instructure

Apply for this Job



Convertro

Big Data Engineer


Convertro


big data

engineer


4yr

Apply


Do you want to solve real-world business problems with cutting-edge technology in a creative and exciting start-up? Are you a smart person who gets stuff done?

Convertro is looking for you. We are hiring an engineer with experience building analytical systems in MapReduce, Hadoop, HBase, or similar distributed systems programming. You will improve the scalability, flexibility, and stability of our existing Hadoop architecture, as well as help develop our next-generation data analytics platform. You will rapidly create prototypes and quickly iterate to a stable, production-quality release candidate.

See more jobs at Convertro

Apply for this Job



rFactr Providing the leading Enterprise Social Selling Platform

Front End Developer Social Networking Big Data Analytics


rFactr Providing the leading Enterprise Social Selling Platform


front end

big data

dev

stats


4yr

Apply


* Design and develop highly scalable front-end code for web applications using the Microsoft ASP.NET MVC stack
* Determine the architectural strategy for the front-end component of web applications
* Write clean, elegant code with an eye toward maintainability, flexibility and high performance; balance elegant solutions and a quality product with rapid time to market
* Integrate with social networks, service providers and clients via their published APIs
* Work in an Agile development environment with fast iterations and short cycles
* Work effectively with business, technology and our clients to prototype, design, code and deploy web and mobile applications

See more jobs at rFactr Providing the leading Enterprise Social Selling Platform

Apply for this Job



Zalando SE

Big Data Architect


Zalando SE


big data

architecture


4yr

Apply


The opportunity

As a Big Data Architect at Zalando, you will be in charge of building, scaling and architecting one of the largest big data platforms in ecommerce. You will develop big data solutions, services, and messaging frameworks to help us continuously process our data faster and more effectively. You will challenge our status quo and help us define best practices for how we work. And you will have the freedom to launch your own open source projects, contribute to others’ projects, build internal community around your interests, and strengthen your personal brand, while receiving meaningful support at every step.

What we are looking for

* A university degree in Computer Science, Mathematics, Statistics or a comparable subject, with an academic record of high achievement
* Fluency in at least one programming language, such as Python, Java and/or Scala
* Solid knowledge of data structures and applied machine learning techniques
* Demonstrated experience working on time-critical, mass data-processing, parallel data-processing and database initiatives
* Deep knowledge of Hadoop, Cassandra, Spark, MapReduce, Storm, Kafka, and/or other big data technologies and their capabilities
* A passion for working with relational (e.g. PostgreSQL) and NoSQL (e.g. Cassandra, Redis) databases
* Enthusiasm for microservices architectures and RESTful APIs
* Peerless analytical and critical thinking skills
* Creative thinkers who value accountability, goal-setting and focusing on solutions instead of problems
* English language fluency

Your responsibilities

* Prototype, design and build a mass data-processing and log data-processing architecture that will supplement our existing analysis and DWH architecture
* Align multiple teams to support new big data solutions
* Help evaluate and push for the adoption of technologies that are best suited for specific projects
* Share your knowledge via coaching, code reviews, articles and tech talks
* Demonstrate excellent communication skills and act as a liaison between your team and others

What you can expect from us

* Internal tech talks, skills-building courses and technical “People Leads” who help you achieve mastery
* Personal branding support: from preparing conference talks and blog posts to industry networking
* Community: hack weeks, movie nights, 70+ self-organized tech guilds and more
* Competitive salary
* 40% Zalando shopping discount and commuter discount
* Relocation assistance for internationals

About Zalando

Zalando is Europe’s leading online fashion platform, doing business in 15 markets. Delivering first-class shopping experiences to our 15+ million customers requires moving fast, with microservices, Agile processes and autonomous teams, and using cutting-edge, open source technologies. We are passionate about what we do and have fun while doing it. And we are willing to experiment and make mistakes: it’s how we grow.

Want to join us? Then go ahead and apply!

If you need guidance or have any questions about our hiring processes, please contact recruiter Matthias Rempel at [email protected]

See more jobs at Zalando SE

Apply for this Job



Datametica Solutions Pvt.

Exciting Opportunity Work On Big Data Technologies Analytics


Datametica Solutions Pvt.


big data

stats


4yr

Apply


Are you a technology buff? Does the Big Data - Hadoop buzz excite you? Do you want to travel to the world’s most successful companies to help them revolutionize the way they manage gigantic amounts of data, derive insights and make strategic decisions? Are you enthusiastic, self-motivated, innovative and multi-talented? Do you want to join the early adopters of Hadoop?

See more jobs at Datametica Solutions Pvt.

Apply for this Job



American Express

Big Data Engineer


American Express


engineer

big data


4yr

Apply


American Express is looking for energetic, high-performing software engineers to help shape our technology and product roadmap. You will be part of the fast-paced big data team. As part of the Customer Marketing and Big Data Platforms organization, which enables Big Data and batch/real-time analytical solutions leveraging transformational technologies (Hadoop, HDFS, MapReduce, Hive, HBase, Pig, etc.), you will be working on innovative platform and data science projects across multiple business units (e.g., RIM, GNICS, OPEN, CS, EG, GMS, etc.).

Offer of employment with American Express is conditioned upon the successful completion of a background verification check, subject to applicable laws and regulations.

Qualifications

* Hands-on expertise with application design, software development, and automated testing; experience collaborating with the business to drive requirements/Agile story analysis
* Ability to effectively interpret technical and business objectives and challenges, and articulate solutions
* Ability to think abstractly and deal with ambiguous/under-defined problems; ability to enable business capabilities through innovation
* Looks proactively beyond the obvious for continuous improvement opportunities
* High energy, demonstrated willingness to learn new technologies, and pride in how fast they develop working software
* Strong programming knowledge in C++ / Java
* Solid understanding of data structures and common algorithms
* Knowledge of RDBMS concepts and experience with SQL
* Understanding and experience with UNIX / Shell / Perl / Python scripting
* Experience with Big Data components/frameworks (Hadoop, HBase, HDFS, Pig, Hive, Sqoop, Flume, Oozie, Avro, etc.) and other AJAX tools/frameworks
* Database query optimization and indexing
* Bonus skills: object-oriented design and coding with a variety of languages (Java, J2EE), and parallel and distributed systems
* Machine learning/data mining
* Web services design and implementation using REST / SOAP; bug-tracking, source control, and build systems

American Express is an equal opportunity employer and makes employment decisions without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other status protected by law. Click here to view the 'EEO is the Law' poster.

ReqID: 15017390

See more jobs at American Express

Apply for this Job



INRIX

Senior Big Data Developer


INRIX


big data

dev

senior

digital nomad


4yr

Apply


The INRIX Mobile Team is looking for a Senior Big Data Developer to join our team. We are building a comprehensive new mobile SDK, innovative services, and new branded INRIX experiences, and helping our partners leverage our SDK into their experiences. We have a great team in place, and are looking for a senior developer to take things even further.

On the Mobile team at INRIX, we value open discussion, pushback, and management by objective and prioritization. You will be working closely with our Android, iOS, and QA teams.

How do we plan and build software on the Mobile Team?

It’s full Scrum and Agile. That means weekly sprint planning, poker planning, retrospectives, backlogs, stories -- the full solution. We work together as a small team, 100% scrum-based, and with as little process as possible.

If you love fast-paced development or planning, or you want to learn this skill, the INRIX Mobile team is the place to be. If you want to ship real experiences, you want to point at something and say “I did that”, and you want to work with customers at scale, INRIX is the place for you.

Must Haves:

* Customer focus: Our job at the end of the day is to delight customers through technology; you need to be 100% customer-focused.
* Experience: At least 5 years as a backend developer, with experience in software development, architecture and design, a passion for shipping services, and planning.
* Big Data: Strong background in cloud analytics and Big Data.
* Cloud technologies: Experience pushing AWS technologies to the limit, and a passion for keeping on top of the latest innovations in the cloud is a must.
* Scalability: Everything you do must work with customers at scale.
* Passion for detail: TDD, unit testing, automation, black box, white box, etc. If you have an obsession with producing high-quality code, you’ve come to the right place.
* Computer science: You should know the fundamentals of algorithm and data structure design/selection and be able to defend your choices.

Why make the move:

* Healthy compensation and great benefits, including medical, dental and vision coverage; bonuses and stock options; 401k with company match
* Bonuses 2x per year and stock options, because you like to be rewarded and recognized for your hard work
* INRIX motorboat: get your boat license and we’ll pay for the gas. You can walk from your office to the dock in 15 minutes
* Free parking: we have free parking in our covered garage, and enough of it to go around
* Free bus pass, if you don't have a car to park
* Free car... well, not really free. But we have test cars with the latest technology that you can take home. We pay for the gas
* Free food: onsite snacks, drinks, and lunches every day. The food comes to you!
* Open vacation policy: you’re an adult and can manage your own vacation time. Work hard; play hard
* Family: we don’t believe that you should miss your child’s play because you have to attend a meeting

The INRIX development team is a great place to grow your career and skillset. Our roadmap is bursting with opportunities for engineers to bring their best (and then some), solve intriguing challenges, and tangibly impact the customer experience.

Our space is dynamic, it changes all the time, and we are 100% focused on solving for our customers. Today we might be building the next great mobile app; tomorrow it might be a new traffic app in the new hot car. Our customers are literally building everything you can imagine, and lots you can't. Our data, SDK, apps, and cloud are used across a breadth of experiences and device types.

We take career development very seriously on the mobile development team. Part of our goal is to invest in you. We will support you and help you get better at your craft. We also expect you to teach us a few things along the way.

Are you ready to join a different kind of company?

INRIX is a leading provider of real-time traffic information, connected car services and analytics worldwide. The company leverages big data analytics to reduce the individual, economic and environmental toll of traffic congestion. Through cutting-edge data intelligence and predictive traffic technologies, INRIX helps leading automakers, fleets, governments and news organizations make it easier for drivers to navigate their world.

Our vision is simple: to solve traffic, empower drivers, inform planning and enhance commerce. INRIX is at the forefront of connecting cars to smarter cities and serves more than 350 blue-chip customers in more than 40 countries around the world.

Exciting things are happening all the time. Come join us!

See more jobs at INRIX

Apply for this Job



Hearst Corporation

Full Stack Big Data Engineer


Hearst Corporation


engineer

full stack

big data


4yr

Apply


We’re creating a game-changing modern content platform, built from the ground up. It will give our users, editors, and advertisers tools that enable them to react to the world in real time when making decisions around content publishing and revenue generation. We are doing this by working with Big Data scientists to build a modern information pipeline that enables intelligent and optimized media applications. We’re using modern web technologies to do this. We’re building an open, service-oriented platform driven by APIs, and believe passionately in crafting simple, elegant solutions to complex technological and product problems. Our day-to-day is much like that of a technology start-up, with the strong support of a large corporation that believes in what we're doing.

We’re hiring talented and passionate Software Engineers to be part of a corporate open-source movement in the company to build out our new platform. The ideal candidate has extensive experience writing clean object-oriented code, building and working with RESTful APIs, has worked in cloud-based environments like AWS, and likes being part of a collaborative tech team.

We consistently hold ourselves to high standards of software development, code review and deployment. Our workflow embraces automated testing and continuous integration. We work closely with our DevOps team to allow developers to focus on what they do best: creatively building innovative software solutions.

See more jobs at Hearst Corporation

Apply for this Job



Turbine WB Games

Senior Big Data Engineer


Turbine WB Games


senior

engineer

big data


4yr

Apply


WBPlay, a team within Turbine responsible for delivering key technology platforms that support games across WB, is seeking a Senior Big Data Engineer to provide hands-on development within our Core Analytics Platform team. As a key contributor reporting directly to the Director of Analytics Platform Development, this individual will work closely with developers and dev-ops engineers across multiple teams to build and operate a best-in-class game analytics platform.

The successful candidate will participate in software development and dev-ops projects, using Agile methodologies, to build scalable, reliable technologies and infrastructure for our cross-game data analytics platform. This big data platform powers analytics for WB’s games across multiple networks, devices and operating environments, including Xbox One, PS4, iOS, and Android. This is a role combining proven technical skills in various Big Data ecosystems, with a strong focus on open-source (Apache) software and cloud (AWS) infrastructure.

Our ideal candidate is fluent in several big data technologies, including Hadoop, Spark, MPP databases, and NoSQL databases, and has deep experience implementing complex distributed computing environments which ingest, process, and surface hundreds of terabytes of data from dozens of sources, in near real time, for analysis by data scientists and other stakeholders.

JOB RESPONSIBILITIES

* Responsible for the building, deployment, and maintenance of mission-critical analytics solutions that process data quickly at big data scales
* Contributes design, code, configurations, and documentation for components that manage data ingestion, real-time streaming, batch processing, data extraction, transformation, and loading across multiple game franchises
* Owns one or more key components of the infrastructure and works to continually improve it, identifying gaps and improving the platform’s quality, robustness, maintainability, and speed
* Cross-trains other team members on technologies being developed, while also continuously learning new technologies from other team members
* Interacts with engineering teams across WB and ensures that solutions meet customer requirements in terms of functionality, performance, availability, scalability, and reliability
* Performs development, QA, and dev-ops roles as needed to ensure total end-to-end responsibility for solutions
* Works directly with business analysts and data scientists to understand and support their use cases

See more jobs at Turbine WB Games

Apply for this Job



Verizon

Big Data Platform Engineer


Verizon


big data

engineer


4yr

Apply


Grow your IT career at one of the leading global technology companies. We offer hands-on exposure to state-of-the-art systems, applications and infrastructures.

Responsibilities

* Architect, design and build a fault-tolerant, scalable big data platform based primarily on the Hadoop ecosystem.

* Build a high-throughput messaging framework to transport high-volume data.

* Use different protocols as needed for different data services (NoSQL/JSON/REST/JMS).

* Develop a framework to deploy RESTful web services.

* Build ETL, distributed caching, transactional and messaging services.

* Architect and build a security-compliant user management framework for a multitenant big data platform.

* Build high-availability (HA) architectures and deployments, primarily using big data technologies.

* Create and manage data pipelines.
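The ETL and data-pipeline responsibilities above follow a common extract-transform-load shape, sketched below in miniature. The record layout, field names, and in-memory sink are invented for illustration; a real platform would read from collectors and write to HDFS or a database.

```python
# Hypothetical ETL sketch: parse raw events, normalize them, load them
# into a sink. All names and the record layout are invented.
import json

def extract(raw_lines):
    """Parse newline-delimited JSON events into dicts."""
    return [json.loads(line) for line in raw_lines if line.strip()]

def transform(events):
    """Keep well-formed events and normalize field types."""
    return [
        {"user": str(e["user_id"]), "bytes": int(e["bytes"])}
        for e in events
        if "user_id" in e and "bytes" in e
    ]

def load(records, sink):
    """Append records to a sink standing in for HDFS or a warehouse."""
    sink.extend(records)
    return len(records)

raw = ['{"user_id": 1, "bytes": "512"}', '{"malformed": true}', '']
sink = []
loaded = load(transform(extract(raw)), sink)
print(loaded, sink)
```

Keeping the three stages as separate pure functions is what makes a pipeline like this easy to test and to parallelize later.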

See more jobs at Verizon

Apply for this Job



Verizon

Big Data Application Engineer


Verizon


big data

engineer


4yr

Apply


Stay on the front lines of groundbreaking technology. We're committed to a dynamic, ever-evolving infrastructure and the hard work it takes to keep our reliable network thriving. Help support the growing demands of an interconnected world.

Responsibilities

Verizon Corporate Technology's Big Data Group is looking for big data engineers with expert-level experience in architecting and building our new Hadoop, NoSQL and in-memory platforms and data collectors. You will be part of the team building one of the world's largest big data platforms, able to ingest hundreds of terabytes of data that will be consumed for business analytics, operational analytics, text analytics and data services, and to build big data solutions for various Verizon business units.

This is a unique opportunity to be part of building disruptive technology where big data will be used as a platform to build solutions for analytics and data services.

Responsibilities:

* Hands-on contribution to business logic using the Hadoop ecosystem (Java MR, Pig, Scala, HBase, Hive)

* Work on technologies related to NoSQL, SQL and in-memory platforms
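The "Java MR" (MapReduce) skill named above boils down to writing map and reduce functions. Below is a toy, single-process word-count sketch of that programming model; a real job would run across a Hadoop cluster, so treat this purely as illustration of the contract.

```python
# Toy sketch of the MapReduce programming model: word count executed in
# one Python process instead of on a Hadoop cluster.
from collections import defaultdict
from itertools import chain

def mapper(line):
    # Emit (word, 1) pairs, as a Hadoop map task would.
    return [(w.lower(), 1) for w in line.split()]

def reducer(pairs):
    # Sum counts per key, as reduce tasks do after the shuffle phase.
    totals = defaultdict(int)
    for key, value in pairs:
        totals[key] += value
    return dict(totals)

lines = ["big data big plans", "data pipelines"]
counts = reducer(chain.from_iterable(mapper(l) for l in lines))
print(counts)
```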

See more jobs at Verizon

Apply for this Job



Verizon

Principal Big Data Platform Engineer


Verizon


big data

engineer


4yr

Apply


Grow your IT career at one of the leading global technology companies. We offer hands-on exposure to state-of-the-art systems, applications and infrastructures.

Responsibilities

Verizon Corporate Technology's Big Data Group is looking for big data engineers with expert-level experience in building our new Hadoop, NoSQL and in-memory platforms, data collectors and applications. You will be part of the team building one of the world's largest big data platforms, able to ingest hundreds of terabytes of data that will be consumed for business analytics, operational analytics, text analytics and data services, and to build big data solutions for various Verizon business units.

Responsibilities:

* Architect, design and build a fault-tolerant, scalable big data platform based primarily on the Hadoop ecosystem.

* Build a high-throughput messaging framework to transport high-volume data.

* Provide guidance to members of the team building complex high-throughput big data subsystems.

* Use different protocols as needed for different data services (NoSQL/JSON/REST/JMS).

* Develop a framework to deploy RESTful web services.

* Build ETL, distributed caching, transactional and messaging services.

* Architect and build a security-compliant user management framework for a multitenant big data platform.

* Build high-availability (HA) architectures and deployments, primarily using big data technologies.

* Expert-level experience with the Hadoop ecosystem (Spark, HBase, Solr).

* Create and manage data pipelines.

See more jobs at Verizon

Apply for this Job



Jet.com

Big Data Engineer


Jet.com


engineer

big data


4yr

Apply


“Engineers are Astronauts at Jet”
- Mike Hanrahan, Jet’s CTO

You'll be responsible for helping to build a world-class data platform to collect, process, and manage a vast amount of information generated by Jet's rapidly growing business.

About Jet

Jet’s mission is to become the smartest way to shop and save on pretty much anything. Combining a revolutionary pricing engine, a world-class technology and fulfillment platform, and incredible customer service, we’ve set out to create a new kind of e-commerce. At Jet, we’re passionate about empowering people to live and work brilliant.

About Jet’s Internal Engine

We’re building a new kind of company, and we’re building it from the inside out, which means that investing in hiring, developing, and retaining the brightest minds in the world is a top priority. Everything we do is grounded in three simple values: trust, transparency, and fairness. From our business model to our culture, we live our values to the extreme, whether we’re dealing with employees, retail partners, or consumers. We believe that happiness is the highest level of success and we want every person that crosses paths with Jet to achieve it.

If you’re an ambitious, smart, natural collaborator who likes taking risks, influencing, and innovating in a challenging hyper-growth environment, we’d love to talk to you about joining our team.

About the Job

We are looking for an exceptional Data Engineer to help build a world-class analytical platform to collect, store and expose both structured and unstructured data generated by a vastly growing system landscape at Jet.com.

You can expect a freewheeling, informal work environment, populated by a combination of folks from top companies that have produced many successful products, as well as some PhDs that have escaped the ivory tower.

We have lots of perks like free lunches, but you will be so engrossed with the challenges of the job that the free stuff will be more like icing on the cake.

Because we work on cutting-edge technologies, we need someone who is a creative problem solver, resourceful in getting things done, and productive working independently or collaboratively.

This person would take on the following responsibilities:

* Design, implement and manage a near real-time ingestion pipeline into a data warehouse and Hadoop data lake.

* Gather and process raw data at scale - collect data across all business domains (our functional-first, event-sourced, microservices backend) and expose mechanisms for large-scale parallel processing.

* Process unstructured data into a form suitable for analysis, and then empower state-of-the-art analysis for analysts, scientists, and APIs.

* Support business decisions with ad hoc analysis as needed.

* Evangelize an extremely high standard of code quality, system reliability, and performance.

* Influence cross-functional architecture in sprint planning.

About You

* Experience in running, using and troubleshooting the Apache big data stack, i.e. Hadoop FS, Hive, HBase, Kafka, Pig, Oozie, YARN.

* Programming experience, ideally in Scala or F#, but we are open to other experience if you’re willing to learn the languages we use.

* Proficient scripting skills, i.e. Unix shell and/or PowerShell.

* Experience processing large amounts of structured and unstructured data with MapReduce.

* We use Azure extensively, so experience with cloud infrastructure will help you hit the ground running.

Compensation Philosophy

Our compensation philosophy is simple but powerful: give everyone a meaningful stake in the company, the purest form of ownership. That’s why, on top of base salary, Jet’s comp structure is heavily weighted in equity. Our collective hard work, high performance, and tenure are rewarded as our equity builds in value.

Benefits & Perks

Competitive Salaries. Real Ownership in the form of Stock Options. Unlimited Vacation. Full Healthcare Benefits. Exceptional Work Environment. Learning & Development Opportunities. Just-for-fun Networking & Events.
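The "process unstructured data into a form suitable for analysis" responsibility above can be sketched with a tiny, hypothetical parser. The log format, field names, and sample lines below are invented for illustration and are not Jet's actual data.

```python
# Hypothetical sketch: turn unstructured log lines into structured
# records ready for analysis. The log format is invented.
import re

LOG = re.compile(r"(?P<ts>\S+) user=(?P<user>\w+) event=(?P<event>\w+)")

def parse(lines):
    """Return structured dicts for lines matching the expected shape."""
    out = []
    for line in lines:
        m = LOG.match(line)
        if m:  # silently drop malformed lines, as a lenient pipeline might
            out.append(m.groupdict())
    return out

rows = parse([
    "2016-01-01T00:00:00Z user=a1 event=purchase",
    "garbage line",
])
print(rows)
```

At scale the same per-line function would run inside a distributed framework (MapReduce, Spark) rather than over a Python list.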

See more jobs at Jet.com

Apply for this Job



Criteo

Senior Software Engineer Java Big Data


Criteo


java

senior

engineer

big data


4yr

Apply


CRITEO is looking to recruit senior software developers who turn it up to eleven for its R&D Center in Grenoble (South-East of France). Your main missions will be to:

- Build systems that make the best decision in 50 ms, half a million times per second. Across three continents and six datacenters, 24/7.

- Find the signal hidden in tens of TB of data, in one hour, using over a thousand nodes on our Hadoop cluster. And constantly keep getting better at it while measuring the impact on our business.

- Get stuff done. A problem partially solved today is better than a perfect solution next year. Have an idea during the night? Code it in the morning, push it at noon, test it in the afternoon and deploy it the next morning.

- High stakes, high rewards: a 1% increase in performance may yield millions for the company. But if a single bug goes through, the Internet goes down (we’re only half joking).

- Develop open source projects. Because we are working at the forefront of technology, we are dealing with problems that few have faced. We’re big users of open source, and we’d like to give back to the community.

See more jobs at Criteo

Apply for this Job



Amazon

Big Data


Amazon


big data


4yr

Apply


Amazon has built a reputation for excellence, with recent examples of being named #1 in customer service, #1 most trusted, and #2 most innovative. Amazon Web Services (AWS) is carrying on that tradition while leading the world in cloud technologies. As a member of the AWS Support team you will be at the forefront of this transformational technology, assisting a global list of companies that are taking advantage of a growing set of services and features to run their mission-critical applications. You will work with leading companies in this space and directly with the engineering teams within Amazon developing these new capabilities.

AWS Support provides global technical support to a wide range of external customers as they build mission-critical applications on top of AWS services such as Amazon S3 and Amazon EC2. We have a team of talented Support Engineers located in 8 countries around the world, and are growing rapidly.

We are seeking DevOps Support Engineers with an affinity for big data tools.

Every day will bring new and exciting challenges on the job while you:

* Learn and use groundbreaking technologies

* Apply advanced troubleshooting techniques to provide unique solutions to our customers' individual needs

* Interact with leading technologists around the world

* Work directly with Amazon Web Services architects to help reproduce and resolve customer issues

* Leverage your extensive customer support experience to provide feedback to internal AWS teams on how to improve our services

* Drive customer communication during critical events

* Drive projects that improve support-related processes and our customers’ technical support experience

* Write tutorials, how-to videos, and other technical articles for the customer community

* Work on critical, highly complex customer problems that may span multiple AWS services

* Bachelor’s degree in Information Science / Information Technology, Computer Science, Engineering, Mathematics, Physics, or a related field

* 3+ years of experience in a technical position

* Strong Linux system administration skills

* Excellent knowledge of Hadoop architecture, administration and support

* Proficient in MapReduce, ZooKeeper, HBase, HDFS, Pig, Hive, Python, and shell scripting; experience with Chef a plus

* Expert understanding of ETL principles and how to apply them within Hadoop

* Able to read Java code, with basic coding/scripting ability in Java, Perl, Ruby, C#, and/or PHP

* Experienced with Linux system monitoring and analysis

* Good understanding of distributed computing environments

* Technology-related Bachelor’s degree or equivalent work experience

* Excellent oral and written communication skills

* Customer service experience / strong customer focus

* Strong multi-tasking skills

* Strong analysis and troubleshooting skills and experience

* Self-starter who is excited about learning new technology

* Networking (DNS, TCP/IP)

* Databases (MySQL, Oracle, MSSQL)

* Exposure to virtualization (VMware, Xen, Hypervisor)

* Prior working experience with AWS - any or all of EC2, S3, EBS, ELB, RDS, DynamoDB, EMR

* Exposure to security concepts / best practices

* Expertise with IPsec, VPN, load balancing, iperf, MTR, routing protocols, SSH, network monitoring / troubleshooting tools

* Experience managing full application stacks from the OS up through custom applications

* Open to working non-standard hours (no night shifts), including a Sun-Mon or Fri-Sat weekend

Amazon Web Services is hiring. For more information: http://aws.amazon.com/careers/
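The "MapReduce ... Python, and shell scripting" combination in the qualifications above usually shows up in practice as Hadoop Streaming, where a mapper is just a script that reads lines and writes tab-separated key/value pairs. Below is a toy sketch of such a mapper; in a real cluster it would read `sys.stdin` and be wired up via the hadoop-streaming jar, which this in-memory demo only simulates.

```python
# Toy Hadoop Streaming-style mapper: emit "word<TAB>1" per token, the
# conventional format consumed by the shuffle/sort phase. A real mapper
# would read sys.stdin; here we simulate input with StringIO.
import io

def map_stream(lines, out):
    for line in lines:
        for word in line.split():
            out.write(f"{word}\t1\n")

demo = io.StringIO()
map_stream(["big data big"], demo)
print(demo.getvalue())
```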

See more jobs at Amazon

Apply for this Job



nugg.ad AG predictive behavioral targeting

Big Data Engineer


nugg.ad AG predictive behavioral targeting


engineer

big data


4yr

Apply


We are currently building our next-generation data management platform and are searching for enthusiastic developers eager to join our team and push back the frontiers of big data processing in high-throughput architectures. Take the unique opportunity to shape and grow an early-stage product which will have a significant impact across the advertising market.

As our Big Data Engineer you will:

* Design and build the core of our new platform

* Identify and deploy the latest big data technologies that suit our challenges

* Define new features and products together with our data science, consulting and sales teams

* Migrate existing solutions to our Spark/Scala-based architecture

See more jobs at nugg.ad AG predictive behavioral targeting

Apply for this Job



#ABOUT US
We're a London-based startup that is building an economy around people's data and attention. In short, we're creating a digital marketplace where consumers can dynamically license their personal data and attention to brands in return for a payment.

Our tech stack currently includes: Node (Heroku), ReactJS and AngularJS (Firebase), Express, Mongoose, SuperTest, MongoDB (MongoLab), npm (npmjs). Our distributed development team covers the development of the responsive web, mobile and browser extension products.

We've recently completed the functional MVP and will be pushing on towards our closed-beta launch at the end of January.

#ABOUT YOU
We're looking for a freelance DevOps person who has significant experience configuring, managing, and monitoring servers and backend services at scale to support our core development team.

#COME HELP US WITH PROJECTS LIKE...
- Review our platform architecture requirements and deploy a well-documented, secure and scalable cloud-based solution
- Tighten up security of our servers
- Set up autoscaling of our workers
- Make our deployments faster and safer
- Scale our MongoDB clusters to support our growing data sizes
- Improve API performance
- Automate more processes
- Make sure our backup and recovery procedures are well tested
- Implement a centralized logging system
- Instrument our application with more metrics and create dashboards
- Remove single points of failure in our architecture

#YOU SHOULD...
- Have real-world experience building scalable systems, working with large data sets, and troubleshooting various back-end challenges under pressure
- Have experience configuring monitoring, logging, and other tools to provide visibility and actionable alerts
- Understand the full web stack, networking, and low-level Unix computing
- Always be thinking of ways to improve the reliability, performance, and scalability of an infrastructure
- Be self-motivated and comfortable with responsibility

#WHY WORK WITH US?
Work remotely from anywhere in the world, or from our HQ in London, UK. Just be willing to do a bit of traveling every quarter for some face-to-face time with the whole team. Be involved in an early-stage, fast-growth startup that has already received national press coverage.

Extra tags: DevOps, AppSec, NodeJS, Cloud, MongoDB, API, Sys Admin, Engineer, Backend, Freelance, Consultant, security, big data, startup

See more jobs at C8

Visit C8's website

Apply for this Job



Teraviz

Big Data Visualization Software Architect Development Lead


Teraviz


data viz

big data

exec

architecture


4yr

Apply


We’re building a web application that will help people explore and visualize hundreds of millions of objects at interactive rates. We’re looking for somebody to architect the application, help hire a team, and then lead the development effort and bring the application into production. The implementation is from scratch, but is based on experience with an in-production legacy system, so the requirements are as solid as they come.

This is a full-time position offering a market-competitive salary and benefits. Teraviz is being built by a small (but growing) and very stable company that has been delivering big data solutions for decades. The team is distributed, but you will need to travel regularly (once or twice a month) to the East Coast for in-person meetings; candidates along the Boston-Washington Corridor may find this easier.

You will need to have a full-stack understanding of everything from how databases work to how to design a GUI, and will be expected to do significant coding. The database has been chosen, but you'll be responsible for picking the rest of the technology stack. Knowledge of web visualization technologies, for example D3.js, would be great. However, you won’t need to do it alone, as you will help choose 2-3 more people to round out your development team.

You will need to have led a small team before, be experienced enough to recognize “danger areas” in the plans and schedules you put together, and have reasoned opinions on architectural approaches and development methodologies. You’ll also need to feel comfortable communicating with company management about plans, including justifying budget proposals and updating them on progress.

Since the project involves integrating with new database technology (we’re one of the first few licensees), there’s no way to ask for experience with that aspect. Instead, we need somebody who can get up to speed quickly. We believe that requires an existing strong understanding of how databases work, down to a fairly low level. To be clear, you won’t need to do implementation work on the database (it’s a very new commercial off-the-shelf system), but you will need to know enough to recognize when things are going sideways and be able to communicate effectively with the vendor's field engineers.

If you're interested and think you can do the job, please contact us and we can answer any questions and get you additional details.

See more jobs at Teraviz

Apply for this Job



Infinario

JavaScript Developer For A Fast Growing Big Data Platform


Infinario


javascript

big data

dev

digital nomad


4yr

Apply


Are you dreaming of being involved in the creation of an international product with big potential? After 3 years and 2 investments, our solution is used on 4 continents. Come and grow with us!

* Be responsible for the front-end of a robust single-page app

* Test your brains while optimising the performance of the code

* Implement beautiful GUIs

* Create magic with dependencies

* And help us grow a killer programming team :)

Our engineering culture:

We work in fast iterations, creating a minimum viable product (MVP) and improving on the go. Initiatives that improve how we work - e.g. a new project management tool or source of know-how - are highly appreciated and put into practice as soon as possible. Initiative counts! As for the code, there is a review before every release and we put emphasis on performance and cleanliness.

At the start, you would mostly be working with a team of 3-4 dedicated programmers and a designer who builds the visual side of our products.

See more jobs at Infinario

Apply for this Job



SSentif

Help Make the Most of 'Big Data'


SSentif


big data


4yr

Apply


SSentif is developing a dedicated reporting solution using the full spectrum of modern .NET technologies to supplement our current reports functionality.

This reporting tool will allow NHS organisations and local councils to make better use of their data and help them improve their public services. This is an opportunity to make a real difference to the lives of almost everyone in the country!

An opportunity has arisen for a developer to join our team to implement and grow the solution.

The new solution will be developed using .NET 4.5, MVC 5, Bootstrap, CSS 3, Chart.js, jQuery 2.1, SQL Server 2014 and IIS 7, on top of an OLAP data source.

SSentif offers great scope for career progression and has an enviable reputation for developing talented individuals. This is a fantastic and rare opportunity to join this high-growth/high-tech company.

Part-time remote working is an option for this position.

See more jobs at SSentif

Apply for this Job


##Job description

Hugo is going to transform the festival industry. To do that we're looking for a great full stack developer to grow our team.

Hugo is still small and there are many, many things to do. You'll be working on front- and back-end, APIs and consumer apps, and your work will have a huge impact on our product. We work from our office in Amsterdam and anywhere else in the world.

We're looking for full stack developers versed in Node.js and PHP. Knowing other languages is a plus.

Our applications are developed in PHP, Node.js and Spark with Elasticsearch, OpenStreetMap, MySQL, S3 and SQS. We host our apps in a Mesosphere cluster using Docker images on Amazon EC2. For our apps we employ a microservices architecture. Our complete stack is listed here: http://stackshare.io/hugo-events/hugo-events.

We don't do fixed-length sprints; instead we deliver things fast, as they're done, using a continuous delivery process.

You know how to write clean code for the web. You're a front- or back-end developer and you're able to do both when the need arises. Preferably you have experience with pushing code to production and have some ops experience. We develop in PHP, Node.js, Meteor and Spark, and HTML, CSS and jQuery are used in our front end. You have experience working with these.

We're a young and growing team. You should be a good communicator and be as open to working in a team as working alone. We're a very agile and fast-paced team with complementary skill-sets, and you should be open to helping others and expanding your skills.

##Job requirements

What we're looking for in a full stack developer:

- Extensive knowledge of two or more of the programming languages in our stack
- Passionate about developing great apps
- Your code is cleanly written and easy to read
- Familiar with MySQL and Redis
- Good communicator, especially when working remotely

##Bonus points for

- Machine learning knowledge
- Analytics
- A/B testing
- Experience with Docker, Mesos and Marathon (Mesosphere)
- Experience working with Elasticsearch

##What to expect from us

- A good monthly payment
- The tools you need to get your job done
- A great, young and enthusiastic team to work with
- Being able to work where and when you want

##Interested?

Please send us a brief CV together with an explanation of why you are the perfect fit. We'd love to see some examples of your work.

Extra tags: php, node.js, scala, spark, elastic, aws, big data

See more jobs at Hugo

Visit Hugo's website

Apply for this Job

๐Ÿ‘‰ Please reference you found the job on Remote OK, this helps us get more companies to post here!

When applying for jobs, you should NEVER have to pay to apply. That is a scam! Always verify you're actually talking to the company in the job post and not an imposter. Scams in remote work are rampant, be careful! When clicking on the button to apply above, you will leave Remote OK and go to the job application page for that company outside this site. Remote OK accepts no liability or responsibility as a consequence of any reliance upon information on there (external sites) or here.


Crossover

Senior Big Data Software Engineer $90K


Crossover


senior

engineer

big data

dev

4yr

Apply


Stats (beta): ๐Ÿ‘ 663 views,โœ๏ธ 0 applied (0%)
Are you a Senior Software Engineer who has spent several years working with Big Data technologies? Have you created streaming analytics algorithms to process terabytes of real-time data and deployed Hadoop or Cassandra clusters over dozens of VMs? Have you been part of an organization driven by a DevOps culture, where the engineer has end-to-end responsibility for the product, from development to operating it in production? Are you willing to join a team of elite engineers working on a fast-growing analytics business? Then this role is for you!

Job Description

The Software and DevOps engineer will help build the analytics platform for Bazaarvoice data that will power our client-facing reporting, product performance reporting, and financial reporting. You will also help us operationalize our Hadoop clusters, Kafka and Storm services, and high-volume event collectors, and build out improvements to our custom analytics job portal in support of Map/Reduce and Spark jobs. The Analytics Platform is used to aggregate data sets to build out known new product offerings related to analytics and media, as well as a number of pilot initiatives based on this data. You will need to understand the business cases of the various products and build a common platform and set of services that help all of our products move fast and iterate quickly. You will help us pick and choose the right technologies for this platform.

Key Responsibilities

In your first 90 days you can expect the following:

* An overview of our Big Data platform code base and development model
* A tour of the products and technologies leveraging the Big Data Analytics Platform
* 4 days of Cloudera training to provide a quick ramp-up of the technologies involved
* By the end of the 90 days, you will be able to complete basic enhancements to code supporting large-scale analytics using Map/Reduce, as well as contribute to the operational maintenance of a high-volume event collection pipeline.

Within the first year you will:

* Own design, implementation, and support of major components of platform development. This includes working with the various stakeholders for the platform team to understand their requirements and deliver high-leverage capabilities.
* Have a complete grasp of the technology stack, and help guide where we go next.
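The Map/Reduce work the posting describes can be illustrated with a toy example. The sketch below is hypothetical (it is not Bazaarvoice code, and the `product_id,event_type` event format is invented); it only shows the map-shuffle-reduce pattern that such large-scale analytics enhancements build on.

```python
from collections import defaultdict

# Toy Map/Reduce: count page views per product from raw event lines.
# The "product_id,event_type" event format is hypothetical.

def map_phase(lines):
    """Emit (key, 1) pairs for every 'view' event."""
    for line in lines:
        product_id, event_type = line.strip().split(",")
        if event_type == "view":
            yield product_id, 1

def shuffle(pairs):
    """Group values by key, as the framework would between map and reduce."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    """Sum the counts for each key."""
    return {key: sum(values) for key, values in groups.items()}

events = ["p1,view", "p2,click", "p1,view", "p3,view"]
counts = reduce_phase(shuffle(map_phase(events)))
print(counts)  # {'p1': 2, 'p3': 1}
```

In a real Hadoop job the shuffle is done by the framework between mappers and reducers; writing the three phases as separate functions like this mirrors how the mapper and reducer classes are kept independent.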

See more jobs at Crossover

Apply for this Job

๐Ÿ‘‰ Please reference you found the job on Remote OK, this helps us get more companies to post here!

When applying for jobs, you should NEVER have to pay to apply. That is a scam! Always verify you're actually talking to the company in the job post and not an imposter. Scams in remote work are rampant, be careful! When clicking on the button to apply above, you will leave Remote OK and go to the job application page for that company outside this site. Remote OK accepts no liability or responsibility as a consequence of any reliance upon information on there (external sites) or here.


Crossover

Senior Big Data Software Engineer


Crossover


senior

engineer

big data

dev

4yr

Apply


Stats (beta): ๐Ÿ‘ 2,290 views,โœ๏ธ 0 applied (0%)
Are you a Senior Software Engineer who has spent several years working with Big Data technologies? Have you created streaming analytics algorithms to process terabytes of real-time data and deployed Hadoop or Cassandra clusters over dozens of VMs? Have you been part of an organization driven by a DevOps culture, where the engineer has end-to-end responsibility for the product, from development to operating it in production? Are you willing to join a team of elite engineers working on a fast-growing analytics business? Then this role is for you!

Job Description

The Software and DevOps engineer will help build the analytics platform for Bazaarvoice data that will power our client-facing reporting, product performance reporting, and financial reporting. You will also help us operationalize our Hadoop clusters, Kafka and Storm services, and high-volume event collectors, and build out improvements to our custom analytics job portal in support of Map/Reduce and Spark jobs. The Analytics Platform is used to aggregate data sets to build out known new product offerings related to analytics and media, as well as a number of pilot initiatives based on this data. You will need to understand the business cases of the various products and build a common platform and set of services that help all of our products move fast and iterate quickly. You will help us pick and choose the right technologies for this platform.

Key Responsibilities

In your first 90 days you can expect the following:

* An overview of our Big Data platform code base and development model
* A tour of the products and technologies leveraging the Big Data Analytics Platform
* 4 days of Cloudera training to provide a quick ramp-up of the technologies involved
* By the end of the 90 days, you will be able to complete basic enhancements to code supporting large-scale analytics using Map/Reduce, as well as contribute to the operational maintenance of a high-volume event collection pipeline.

Within the first year you will:

* Own design, implementation, and support of major components of platform development. This includes working with the various stakeholders for the platform team to understand their requirements and deliver high-leverage capabilities.
* Have a complete grasp of the technology stack, and help guide where we go next.

See more jobs at Crossover

Apply for this Job

๐Ÿ‘‰ Please reference you found the job on Remote OK, this helps us get more companies to post here!

When applying for jobs, you should NEVER have to pay to apply. That is a scam! Always verify you're actually talking to the company in the job post and not an imposter. Scams in remote work are rampant, be careful! When clicking on the button to apply above, you will leave Remote OK and go to the job application page for that company outside this site. Remote OK accepts no liability or responsibility as a consequence of any reliance upon information on there (external sites) or here.


Moz

Software Engineer Big Data


Moz


engineer

dev

big data

digital nomad

4yr

Apply


Stats (beta): ๐Ÿ‘ 646 views,โœ๏ธ 0 applied (0%)
Full Time: Sr. Software Engineer - Big Data at Moz in Seattle, WA or Remote

See more jobs at Moz

Apply for this Job

๐Ÿ‘‰ Please reference you found the job on Remote OK, this helps us get more companies to post here!

When applying for jobs, you should NEVER have to pay to apply. That is a scam! Always verify you're actually talking to the company in the job post and not an imposter. Scams in remote work are rampant, be careful! When clicking on the button to apply above, you will leave Remote OK and go to the job application page for that company outside this site. Remote OK accepts no liability or responsibility as a consequence of any reliance upon information on there (external sites) or here.


mCentric

Senior Developer Manager With Java SQL For Telco Big Data Pioneer


mCentric


java

senior

big data

exec

4yr

Apply


Stats (beta): ๐Ÿ‘ 671 views,โœ๏ธ 0 applied (0%)
Senior Developer / Manager with Java and SQL for Telco Big Data Pioneer

We are looking for a Senior Developer ready to confidently apply a solid technical and managerial skillset to create new or upgrade current solutions, proposing and defending innovations.

You will be using your experience as a Senior Developer to create and upgrade solutions which interact with the mobile telecommunications core network, familiarising yourself with a multitude of mobile telecommunications protocols, standards and infrastructure variations of those standards. mCentric solutions range from Data Plan Management, xDR Navigation, SIM Registration, Voucher Services, and other basic mobile network solutions, which in turn feed their data into a dynamic and intelligent promotions platform allowing for precise microsegmentation applied to intelligent and interactive promotions and reporting. Our solutions are responsible for dealing with billions of events processed per day and millions of concurrent user sessions, and are expected to deliver results in real time. To handle the demand, several of the solutions are built over a Big Data structure. In fact, mCentric solutions are running over the first BDA installed in Africa.

See more jobs at mCentric

Apply for this Job

๐Ÿ‘‰ Please reference you found the job on Remote OK, this helps us get more companies to post here!

When applying for jobs, you should NEVER have to pay to apply. That is a scam! Always verify you're actually talking to the company in the job post and not an imposter. Scams in remote work are rampant, be careful! When clicking on the button to apply above, you will leave Remote OK and go to the job application page for that company outside this site. Remote OK accepts no liability or responsibility as a consequence of any reliance upon information on there (external sites) or here.


mCentric

Senior Java C Developer Architect For Telco Big Data Pioneer


mCentric


java

c

c plus plus

senior

4yr

Apply


Stats (beta): ๐Ÿ‘ 1,431 views,โœ๏ธ 0 applied (0%)
In your day-to-day at mCentric you can expect to encounter a buzzing environment of professionals dedicated to creating and enhancing services in several mobile operators worldwide. You will be using your experience as a Senior Developer and Architect to create and upgrade solutions which interact with the mobile telecommunications core network, familiarising yourself with a multitude of mobile telecommunications protocols, standards and infrastructure variations of those standards. mCentric solutions range from Data Plan Management, xDR Navigation, SIM Registration, Voucher Services, and other basic mobile network solutions, which in turn feed their data into a dynamic and intelligent promotions platform allowing for precise microsegmentation applied to intelligent and interactive promotions and reporting. Our solutions are responsible for dealing with billions of events processed per day and millions of concurrent user sessions, and are expected to deliver results in real time. To handle the demand, several of the solutions are built over a Big Data structure. In fact, mCentric solutions are running over the first BDA installed in Africa.

See more jobs at mCentric

Apply for this Job

๐Ÿ‘‰ Please reference you found the job on Remote OK, this helps us get more companies to post here!

When applying for jobs, you should NEVER have to pay to apply. That is a scam! Always verify you're actually talking to the company in the job post and not an imposter. Scams in remote work are rampant, be careful! When clicking on the button to apply above, you will leave Remote OK and go to the job application page for that company outside this site. Remote OK accepts no liability or responsibility as a consequence of any reliance upon information on there (external sites) or here.


Crossover

Senior Big Data Software Engineer $60K


Crossover


senior

engineer

big data

dev

4yr

Apply


Stats (beta): ๐Ÿ‘ 662 views,โœ๏ธ 0 applied (0%)
Are you a Senior Software Engineer who has spent several years working with Big Data technologies? Have you created streaming analytics algorithms to process terabytes of real-time data and deployed Hadoop or Cassandra clusters over dozens of VMs? Have you been part of an organization driven by a DevOps culture, where the engineer has end-to-end responsibility for the product, from development to operating it in production? Are you willing to join a team of elite engineers working on a fast-growing analytics business? Then this role is for you!

Job Description

The Software and DevOps engineer will help build the analytics platform for Bazaarvoice data that will power our client-facing reporting, product performance reporting, and financial reporting. You will also help us operationalize our Hadoop clusters, Kafka and Storm services, and high-volume event collectors, and build out improvements to our custom analytics job portal in support of Map/Reduce and Spark jobs. The Analytics Platform is used to aggregate data sets to build out known new product offerings related to analytics and media, as well as a number of pilot initiatives based on this data. You will need to understand the business cases of the various products and build a common platform and set of services that help all of our products move fast and iterate quickly. You will help us pick and choose the right technologies for this platform.

Key Responsibilities

In your first 90 days you can expect the following:

* An overview of our Big Data platform code base and development model
* A tour of the products and technologies leveraging the Big Data Analytics Platform
* 4 days of Cloudera training to provide a quick ramp-up of the technologies involved
* By the end of the 90 days, you will be able to complete basic enhancements to code supporting large-scale analytics using Map/Reduce, as well as contribute to the operational maintenance of a high-volume event collection pipeline.

Within the first year you will:

* Own design, implementation, and support of major components of platform development. This includes working with the various stakeholders for the platform team to understand their requirements and deliver high-leverage capabilities.
* Have a complete grasp of the technology stack, and help guide where we go next.

Bazaarvoice is a network that connects brands and retailers to the authentic voices of people where they shop. Each month, more than 500 million people view and share authentic opinions, questions and experiences about tens of millions of products in the Bazaarvoice network. Our technology platform amplifies these voices into the places that influence purchase decisions. Network analytics help marketers and advertisers provide more engaging experiences that drive brand awareness, consideration, sales and loyalty. Headquartered in Austin, Texas, Bazaarvoice has offices in Chicago, London, Munich, New York, Paris, San Francisco, Singapore, and Sydney.

Total Compensation is $30 / hour

See more jobs at Crossover

Apply for this Job

๐Ÿ‘‰ Please reference you found the job on Remote OK, this helps us get more companies to post here!

When applying for jobs, you should NEVER have to pay to apply. That is a scam! Always verify you're actually talking to the company in the job post and not an imposter. Scams in remote work are rampant, be careful! When clicking on the button to apply above, you will leave Remote OK and go to the job application page for that company outside this site. Remote OK accepts no liability or responsibility as a consequence of any reliance upon information on there (external sites) or here.


MetricStory

Engineer #1 With Big Data Emphasis Funded Startup


MetricStory


javascript

node js

engineer

big data

4yr

Apply


Stats (beta): ๐Ÿ‘ 1,086 views,โœ๏ธ 0 applied (0%)
MetricStory is revolutionizing web analytics. Currently, it is painful to set up web analytics, create reports, and finally get insights out of those reports. Our goal is to make it easy for companies to capture and analyze customer data without having to code. To do this, we are storing and analyzing the full user clickstream. We are a recently Techstars-funded company and we have expert domain knowledge in analytics. You'll be our first engineer and have real ownership, responsibility, and impact on the business. The perfect candidate is a senior/lead-level engineer with a few years' experience in building product who loves architecting complex systems. This position requires solving hard problems and is focused on writing scalable code to capture and analyze big data.

We are looking for an engineer who has experience with, and is passionate about, storing large volumes of data and retrieving that data in seconds. The ideal candidate will have experience storing large amounts of event data in a NoSQL database like DynamoDB and exporting/cleaning it with Amazon EMR (HiveQL) to Redshift for fast access. This position requires working knowledge of the best database structures for speed, large data sets, data cleaning, and how to transfer NoSQL data to SQL. If you are up for a serious technical challenge to help build this company from the ground up, then contact us!

Our stack is NodeJS, DynamoDB, MongoDB, D3.js, Angular, Redis, Amazon Redshift, and plain vanilla JavaScript.
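The NoSQL-to-SQL transfer the posting mentions boils down to flattening nested event documents into fixed-schema rows that a columnar store like Redshift can load. The sketch below is a minimal, hypothetical illustration (the field names and document shapes are invented, and a CSV line stands in for a real COPY-stage file):

```python
# Hypothetical clickstream events as they might sit in a NoSQL store:
# nested documents with uneven fields. Columnar warehouses want flat,
# fixed-schema rows, so the transfer step flattens and fills defaults.

COLUMNS = ["user_id", "page", "event", "ts"]

def flatten_event(doc):
    """Flatten one nested event document into a fixed-schema row."""
    return {
        "user_id": doc.get("user", {}).get("id", ""),
        "page": doc.get("context", {}).get("page", ""),
        "event": doc.get("event", "unknown"),
        "ts": doc.get("ts", 0),
    }

def to_csv_lines(docs):
    """Render rows in the column order a load manifest would expect."""
    rows = [flatten_event(d) for d in docs]
    return [",".join(str(r[c]) for c in COLUMNS) for r in rows]

raw = [
    {"user": {"id": "u1"}, "context": {"page": "/home"}, "event": "view", "ts": 100},
    {"user": {"id": "u2"}, "event": "click", "ts": 101},
]
print(to_csv_lines(raw))  # ['u1,/home,view,100', 'u2,,click,101']
```

In practice this cleaning step would run as a Hive query on EMR over millions of documents rather than in-process Python, but the schema-fixing logic is the same.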

See more jobs at MetricStory

Apply for this Job

๐Ÿ‘‰ Please reference you found the job on Remote OK, this helps us get more companies to post here!

When applying for jobs, you should NEVER have to pay to apply. That is a scam! Always verify you're actually talking to the company in the job post and not an imposter. Scams in remote work are rampant, be careful! When clicking on the button to apply above, you will leave Remote OK and go to the job application page for that company outside this site. Remote OK accepts no liability or responsibility as a consequence of any reliance upon information on there (external sites) or here.


Showroom Logic

Big Data Engineer


Showroom Logic


engineer

big data

4yr

Apply


Stats (beta): ๐Ÿ‘ 586 views,โœ๏ธ 0 applied (0%)
Showroom Logic is the 26th fastest-growing company in America. It powers paid-search, display & retargeting campaigns for thousands of auto dealerships nationwide with its industry-leading AdLogic platform. Our dev team is an elite group of individuals who love creating solutions to complex technical problems. Our full-time devs enjoy benefits for them and their families, very competitive salaries, periodic trips to Miami or Southern California, the flexibility of telecommuting, an extremely high level of trust, and fun, skill-stretching projects. We are changing the way advertisers manage their digital marketing with our award-winning technology.

Position Summary:

We are looking for a Data Engineer to be responsible for retrieving, validating, analyzing, processing, cleansing, and managing external and internal data sources. This is not just a data warehousing position: a critical function of this job is to design and implement optimal ways to manage and analyze data. The Data Engineer is expected to learn existing processes, learn and apply 'Big Data' tools, and apply software development skills to automate processes, create tools, and modify existing processes for increased efficiency and scalability.

Key functions include:

* Developing tools for data processing and information retrieval (both batch processing and real-time querying)
* Supporting existing projects where evaluating and ensuring data quality is vital to the product development process
* Analyzing, processing, evaluating and documenting very large data sets
* Providing RESTful APIs that other teams can use to store and retrieve data
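The validating/cleansing side of the role can be pictured as a small batch pipeline: reject rows that fail checks, normalize the rest. The sketch below is purely illustrative; the field names (`dealer_id`, `clicks`, `spend`) and validation rules are hypothetical, not Showroom Logic's actual schema.

```python
# Hypothetical record-cleansing step: validate incoming campaign rows,
# drop the irrecoverable ones, and normalize the survivors.

REQUIRED = ("dealer_id", "clicks", "spend")

def clean_row(row):
    """Return a normalized row, or None if it fails validation."""
    if any(row.get(k) in (None, "") for k in REQUIRED):
        return None  # missing required field: reject
    try:
        clicks = int(row["clicks"])
        spend = float(row["spend"])
    except (TypeError, ValueError):
        return None  # non-numeric metrics: reject
    if clicks < 0 or spend < 0:
        return None  # negative metrics are impossible: reject
    return {"dealer_id": str(row["dealer_id"]).strip().upper(),
            "clicks": clicks, "spend": round(spend, 2)}

def clean_batch(rows):
    """Keep only rows that pass validation."""
    cleaned = [clean_row(r) for r in rows]
    return [r for r in cleaned if r is not None]

batch = [
    {"dealer_id": " d42 ", "clicks": "17", "spend": "102.499"},
    {"dealer_id": "d43", "clicks": "-1", "spend": "5"},
    {"dealer_id": "", "clicks": "3", "spend": "1"},
]
print(clean_batch(batch))  # only the first row survives
```

Keeping the per-row rule in one pure function makes the same logic reusable from a batch job and from a real-time API handler.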

See more jobs at Showroom Logic

Apply for this Job

๐Ÿ‘‰ Please reference you found the job on Remote OK, this helps us get more companies to post here!

When applying for jobs, you should NEVER have to pay to apply. That is a scam! Always verify you're actually talking to the company in the job post and not an imposter. Scams in remote work are rampant, be careful! When clicking on the button to apply above, you will leave Remote OK and go to the job application page for that company outside this site. Remote OK accepts no liability or responsibility as a consequence of any reliance upon information on there (external sites) or here.


Tapway

Malaysia

Big Data Engineer Python Developer


Tapway

Malaysia

python

engineer

big data

dev

Malaysia

4yr

Apply


Stats (beta): ๐Ÿ‘ 1,890 views,โœ๏ธ 0 applied (0%)
Job Description:

- Lead, design, develop and implement large-scale, real-time data processing systems, working with large structured and unstructured data from various complex sources.
- Design, implement and deploy ETL to load data into NoSQL / Hadoop.
- Fine-tune the performance of the data processing platform.
- Develop various APIs to interact with the front end and other data warehouses.
- Coordinate with web programmers to deliver a stable and highly available reporting platform.
- Coordinate with data scientists to integrate complex data models into the data processing platform.
- Have fun in a highly dynamic team and drive innovations to continue as a leader in one of the fastest-growing industries.

Job Requirements:

- Candidate must possess at least a Bachelor's Degree in Computer Science, Information Systems or a related discipline. MSc or PhD a plus.
- Proficiency in Python.
- A strong background in interactive query processing.
- Experience with Big Data applications/solutions such as Hadoop, HBase, Hive, Cassandra, Pig etc.
- Experience with NoSQL and handling large datasets.
- Passion and interest for all things distributed: file systems, databases and computational frameworks.
- Passionate, resourceful, self-motivated and highly committed; a team player who is able to motivate others.
- Strong leadership qualities.
- Good verbal and written communication.
- Must be willing to work in a highly dynamic and challenging startup environment.

#Salary
30000 - 45000

#Equity
30000 - 45000

#Location
- Malaysia
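The ETL bullet can be sketched in Python, the posting's language. Everything below is a hypothetical stand-in: the `sensor_id|reading` source format is invented, and a plain list plays the role of the NoSQL/Hadoop sink; it only shows the extract-transform-load shape.

```python
import json

# Minimal ETL sketch. The pipe-delimited source format and the
# in-memory "sink" are hypothetical stand-ins for a real feed and
# a NoSQL/Hadoop target.

def extract(lines):
    """Parse raw 'sensor_id|reading' lines, skipping blanks."""
    for line in lines:
        if line.strip():
            sensor_id, reading = line.split("|")
            yield {"sensor": sensor_id, "value": float(reading)}

def transform(records):
    """Keep plausible readings and tag each surviving record."""
    for rec in records:
        if 0.0 <= rec["value"] <= 100.0:
            rec["ok"] = True
            yield rec

def load(records, sink):
    """Append one JSON document per record to the sink."""
    for rec in records:
        sink.append(json.dumps(rec, sort_keys=True))
    return sink

sink = load(transform(extract(["a|12.5", "", "b|250.0", "c|99"])), [])
print(sink)  # two documents survive; b's out-of-range reading is dropped
```

Using generators end to end means the pipeline streams record by record instead of materializing the whole dataset, which is the same discipline a large-scale job needs.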

See more jobs at Tapway

Visit Tapway's website

Apply for this Job

๐Ÿ‘‰ Please reference you found the job on Remote OK, this helps us get more companies to post here!

When applying for jobs, you should NEVER have to pay to apply. That is a scam! Always verify you're actually talking to the company in the job post and not an imposter. Scams in remote work are rampant, be careful! When clicking on the button to apply above, you will leave Remote OK and go to the job application page for that company outside this site. Remote OK accepts no liability or responsibility as a consequence of any reliance upon information on there (external sites) or here.


Swimlane

Big Data Engineer


Swimlane


engineer

big data

4yr

Apply


Stats (beta): ๐Ÿ‘ 848 views,โœ๏ธ 0 applied (0%)
Swimlane is looking for a NoSQL engineer with C# experience to join our team. Our product enables Federal and Fortune 100 companies to do business intelligence on big data and to implement workflow procedure tasks around that data. We are looking for a software engineer to help build the next-generation security management application. This is a new product, not a legacy one; the technology stack is the latest and greatest, and you will learn and use groundbreaking technologies! You will have the ability to work from home, work on open-source projects, and have the opportunity to write articles on isolated components of your work.

See more jobs at Swimlane

Apply for this Job

๐Ÿ‘‰ Please reference you found the job on Remote OK, this helps us get more companies to post here!

When applying for jobs, you should NEVER have to pay to apply. That is a scam! Always verify you're actually talking to the company in the job post and not an imposter. Scams in remote work are rampant, be careful! When clicking on the button to apply above, you will leave Remote OK and go to the job application page for that company outside this site. Remote OK accepts no liability or responsibility as a consequence of any reliance upon information on there (external sites) or here.


ScalingData

Infrastructure Engineer Big Data


ScalingData


engineer

big data

4yr

Apply


Stats (beta): ๐Ÿ‘ 728 views,โœ๏ธ 0 applied (0%)
ScalingData's build infrastructure engineering team builds and maintains our internal build, test, continuous integration, packaging, release, and software delivery systems and infrastructure. Engineers who are interested in devops, configuration management, build systems, and distributed systems will feel at home thinking about developer efficiency and productivity, simplifying multi-language builds, automated testing of complex distributed systems, and how customers want to consume and deploy complex distributed systems in modern data centers. The build infrastructure team is a critical part of the larger engineering team. Distributed systems are hard, but building the infrastructure to develop them is harder.

By building on big data technologies such as Hadoop, help us create the essential solution for identifying and solving critical performance and compliance issues in data centers.

Some of the technology we use:

* Java, Go, C/C++
* Hadoop, Solr, Kafka, Impala, Hive, Spark
* AWS, Maven, Jenkins, GitHub, JIRA

See more jobs at ScalingData

Apply for this Job

๐Ÿ‘‰ Please reference you found the job on Remote OK, this helps us get more companies to post here!

When applying for jobs, you should NEVER have to pay to apply. That is a scam! Always verify you're actually talking to the company in the job post and not an imposter. Scams in remote work are rampant, be careful! When clicking on the button to apply above, you will leave Remote OK and go to the job application page for that company outside this site. Remote OK accepts no liability or responsibility as a consequence of any reliance upon information on there (external sites) or here.


CLARITY SOLUTION GROUP

Big Data


CLARITY SOLUTION GROUP


scala

api

engineer

big data

4yr

Apply

Note: For this position you will be required to travel throughout the country on a weekly basis.

DOES WORKING FOR AN ORGANIZATION WITH THESE BENEFITS APPEAL TO YOU?

* Working on complex transformation programs across many clients and many industries
* Unlimited paid time off
* Competitive compensation, including uncapped bonus potential based on individual contributions
* Mentor program
* Career development
* Tremendous growth opportunities (the company is growing at a rate of 35% or more annually)
* Strong work/life balance
* A smaller, nimble organization that is easy to work with
* Visibility to the leadership team on a daily basis
* Being part of an elite Data & Analytics team

IF YOU ANSWERED YES TO THE ABOVE ITEMS, KEEP READING!

We are looking for individuals with the ability to drive the architectural decision-making process, who are experienced with leading teams of developers, but who are also capable of, and enthusiastic about, implementing every aspect of an architecture themselves.

OUR DATA ENGINEERS:

* Are hands-on, self-directed engineers who enjoy working in collaborative teams
* Are data transformation engineers
* Design and develop highly scalable, end-to-end processes to consume, integrate and analyze large-volume, complex data from sources such as Hive, Flume and other APIs
* Integrate datasets and flows using a variety of open-source and best-in-class proprietary software
* Work with business stakeholders and data SMEs to elicit requirements and develop real-time business metrics, analytical products and analytical insights
* Profile and analyze complex and large datasets
* Collaborate and validate implementation with other technical team members

See more jobs at CLARITY SOLUTION GROUP

Apply for this Job

👉 Please mention that you found the job on Remote OK, as this helps us get more companies to post here!

When applying for jobs, you should NEVER have to pay to apply. That is a scam! Always verify you're actually talking to the company in the job post and not an imposter. Scams in remote work are rampant; be careful! When clicking on the button to apply above, you will leave Remote OK and go to that company's job application page outside this site. Remote OK accepts no liability or responsibility as a consequence of any reliance upon information there (on external sites) or here.


Logistadvise

Santa Monica

Big Data Architect


Logistadvise

Santa Monica

big data

architecture

Santa Monica · 5yr

Apply


Stats (beta): 👁 2,276 views, ✍️ 0 applied (0%)
We are looking for somebody who can engineer and manage existing databases. Logistics or supply chain experience is a plus.

#Salary
$40,000 - $80,000

#Equity
40000 - 80000

#Location
- Santa Monica

See more jobs at Logistadvise

Visit Logistadvise's website

Apply for this Job




The Shelf

New York City

Big Data Engineer


The Shelf

New York City

engineer

big data

New York City · 5yr

Apply


Stats (beta): 👁 964 views, ✍️ 0 applied (0%)
We're looking for a data nerd with an engineering bent who enjoys incorporating messy and varied data sets into a clean and efficient data analysis pipeline. Wrangling data to find meaningful insights is what drives you to work every day. You are relentless in getting things done. You don't need parental supervision (i.e., you don't like to be micro-managed). You want to take ownership of the code/features you're building.

Must have:
- Deep understanding of CS fundamentals as well as distributed systems
- At least 5 years of experience building production-level software (Python, Django required)
- At least 2 years in a big-data-related role at a data-driven company
- Continuous integration and deployment experience

Experience: you should have experience fetching, processing, and analyzing data in Python:
- Experience developing and maintaining the back end of a data-driven web app
- Extensive experience with web scraping (deep knowledge of Selenium a plus)
- Experience implementing a data collection and analysis pipeline, scaling up to larger data sets and optimizing as necessary
- Experience working with (non-)relational databases, particularly MongoDB
- (Not necessary, but we'd love you if) Experience with general data mining (NLTK) and machine learning techniques
- (Not necessary, but we'd love you if) Understanding of and experience maintaining and optimizing a PostgreSQL database

Our stack:
Python + Django
Amazon EC2, RDS (Postgres), Rackspace, RabbitMQ for messaging, Celery for queues

#Salary
$70,000 - $120,000

#Equity
70000 - 120000

#Location
- New York City
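The stack listed above (RabbitMQ for messaging, Celery for queues) implements a producer/consumer pipeline: scraped records go onto a queue and workers pull them off to process. As a rough, hypothetical sketch of that pattern using only the standard library (not The Shelf's actual code), a worker thread consumes records from a queue until it receives a shutdown sentinel:

```python
import queue
import threading

task_queue = queue.Queue()
results = []

def worker():
    # Consume records from the queue; a None sentinel signals shutdown.
    while True:
        record = task_queue.get()
        if record is None:
            break
        # Stand-in for real analysis work (cleaning a scraped record).
        results.append(record.strip().lower())
        task_queue.task_done()

t = threading.Thread(target=worker)
t.start()

# Producer side: enqueue some hypothetical scraped records.
for record in ["  Alpha", "BETA  ", " Gamma "]:
    task_queue.put(record)
task_queue.put(None)  # shut the worker down
t.join()

print(results)  # ['alpha', 'beta', 'gamma']
```

In the real stack, Celery plays the worker role and RabbitMQ replaces the in-process queue, so producers and consumers can live on separate machines.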

See more jobs at The Shelf

Visit The Shelf's website

Apply for this Job

