DevOps Engineer - Data (f/m/d)
About the team
In this position, you will be part of our Platform Engineering chapter. The chapter brings together all experts who deliver DevOps capabilities to our product teams.
These experts are organized together to strengthen their functional skill sets, improve E.ON’s DevOps tech stack and deliver a high level of cloud automation.
The products you will be working on belong to our team Digital Solutions | Future Energy Home. This team develops and operates cloud-based software to manage home energy devices such as photovoltaic systems, inverters, batteries, EV charging wall boxes, heat pumps and meters, so that our central and local units can roll out these solutions to end customers.
We integrate with numerous vendors’ devices and apply centralized insights, analytics and control mechanisms to these devices.
meaningful & challenging - Your tasks
Responsible for implementing CI/CD and DevOps practices for the Future Energy Home Data Team, focusing on ETL pipelines
Support the automation and maintenance of BI tools such as Power BI and Google Looker Studio for data quality metrics by implementing automated applications
Support data engineers in building and operationalizing self-serve data infrastructure across multiple cloud platforms (e.g. AWS, GCP, Azure)
Ensure the data security of deployed data products by implementing data access and anonymization methods (e.g. data masking, data pseudonymization) in compliance with the DPO’s recommendations
Operationalize analytical products (i.e. BI dashboards and data science ML models) and implement data quality metrics
Contribute to our Community of Practice to actively foster collaboration and exchange
authentic & ambitious - Your profile
Several years of experience in building enterprise-grade GitLab CI/CD pipelines and Data Version Control (DVC) for Python data pipeline applications (ETL)
Several years of experience in building BI dashboards and monitoring agents for Power BI and Google Looker Studio (or similar tools)
Profound experience with GCP BigQuery, Databricks (e.g. PySpark), Snowflake, and Python tools such as Dask, pandas, pytest and behave
First experience in implementing and using Attribute-Based Access Control (ABAC) tools such as Immuta or Segment, or RBAC tools (e.g. Okta, AWS IAM, AWS Cognito) to democratize data access
Proven knowledge of AWS, with hands-on usage of services such as SageMaker, CloudFormation, CDK, Step Functions and CloudWatch, as well as blue-green / canary deployment strategies
Good communication skills and the ability to help others and contribute to our Community of Practice
smart & useful - Our benefits
- We provide full flexibility: Do your work from home or any other place in Germany - of course including all our great offices from Hamburg to Munich. You want even more? Go on workation for up to 20 days per year within Europe.
- Recharge your battery: You have 30 holidays per year, plus Christmas and New Year's Eve on top. Your battery still needs charging? You can exchange part of your salary for more holidays or take a sabbatical.
- Your development: We grow and we want you to grow with us. Learning on the job, exchanging with others or taking part in individual training - our learning culture enables you to take your personal and professional development to the next level.
- Let’s empower each other: Take the opportunity to engage in our Digital Empowerment Communities for collaboration, learning, and network building.
- We elevate your mobility: From car and bike leasing offers to a subsidised Deutschland-Ticket - your way is our way.
- Let’s think ahead: With our company pension scheme and a great insurance package, we take care of your future.
- This is far from all: We look forward to speaking with you about further benefits during the recruiting process.