Cloud KPI Data Engineer
At Elastic, we have a simple goal: to pursue the world's data problems with products that delight and inspire. We help people around the world do exceptional things with their data. From stock quotes to Twitter streams, Apache logs to WordPress blogs, our products extend what's possible with data, delivering on the promise that good things come from connecting the dots. Often, what you can do with our products is limited only by what you can dream up. We believe that diversity drives our vibe. We unite employees across 30+ countries into one unified team, while the broader community spans over 100 countries.

Elastic's Cloud product allows users to build new clusters or expand existing ones easily. This product is built on a Docker-based orchestration system that makes it easy to deploy and manage multiple Elasticsearch clusters.

What You Will Do:

* Work cross-functionally with product managers, analysts, and engineering teams to extract meaningful data from multiple sources
* Develop analytical models to identify trends, calculate KPIs, and detect anomalies, then use these models to generate reports and data dumps that enrich our KPI efforts
* Design resilient data pipelines and ETL processes to collect, process, and index business and operational data
* Integrate several data sources, such as Salesforce, Postgres, and Elasticsearch, to create a holistic view of the Cloud business
* Manage data collection services in production with the SRE team
* Use Kibana and Elasticsearch to analyze business data, and help engineering and product teams make data-driven decisions
* Grow and share your interest in technical outreach (blog posts, tech papers, conference speaking, etc.)

What You Bring Along:

* You are passionate about developing software that delivers quality data to stakeholders
* Hands-on experience building data pipelines using technologies such as Elasticsearch, Hadoop, and Spark
* Experience developing models for KPIs such as user churn, trial conversion, etc.
* Ability to code in JVM-based languages or Python
* Experience with data modeling
* Experience doing ad-hoc data analysis for key stakeholders
* A working knowledge of Elasticsearch
* Experience building dashboards in Kibana
* Experience with ETL tools such as Logstash, Apache NiFi, or Talend is a plus
* A deep understanding of relational as well as NoSQL data stores is a plus
* A self-starter who has experience working across multiple technical teams and decision makers
* You love working with a diverse, worldwide team in a distributed work environment

Additional Information

* Competitive pay and benefits
* Equity
* Catered lunches, snacks, and beverages in most offices
* An environment in which you can balance great work with a great life
* Passionate people building great products
* Employees with a wide variety of interests
* Your age is only a number. It doesn't matter if you're just out of college or your children are; we need you for what you can do.

Elastic is an Equal Employment Opportunity employer committed to the principles of equal employment opportunity and affirmative action for all applicants and employees.
Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender perception or identity, national origin, age, marital status, protected veteran status, disability status, or any other basis protected by federal, state, or local law, ordinance, or regulation. Elastic also makes reasonable accommodations for disabled employees consistent with applicable law.