About the team
In this position, you will be part of our Platform Engineering chapter. The chapter brings together all experts who deliver DevOps capabilities to our product teams. These experts are organized together to strengthen their functional skillsets, improve E.ON's DevOps tech stack and deliver a high level of cloud automation.
The products you will be working on belong to our Digital Solutions | Future Energy Home team. This team develops and operates cloud-based software for managing home energy devices such as photovoltaic systems, inverters, batteries, EV charging wall boxes, heat pumps and meters, enabling our central and local units to roll out these solutions to end customers. We integrate with numerous vendors' devices and apply centralized insights, analytics and control mechanisms to these devices.
meaningful & challenging - Your tasks
Responsible for implementing CI/CD and DevOps practices for the Future Energy Home Data Team, with a focus on ETL pipelines
Support the automation and maintenance of BI tools such as Power BI and Google Looker Studio for data quality metrics by implementing automated applications
Support data engineers in building and operationalizing self-serve data infrastructure across multiple cloud platforms (e.g. AWS, GCP, Azure)
Ensure the security of deployed data products by implementing data access and anonymization methods (e.g. data masking, pseudonymization) in compliance with the DPO's recommendations
Operationalize analytical products (i.e. BI dashboards and data science ML models) and implement data quality metrics
Contribute to the "Community of Practice" to actively foster collaboration and exchange
authentic & ambitious - Your profile
Several years of experience building enterprise-grade GitLab CI/CD pipelines and Data Version Control (DVC) for Python data pipeline applications (ETL)
Several years of experience building BI dashboards and monitoring agents for Power BI and Google Looker Studio (or similar tools)
Profound experience with GCP BigQuery, Databricks (e.g. PySpark), Snowflake, Python (Dask and pandas), pytest and Behave
Initial experience in implementing and using attribute-based access control (ABAC) tools such as Immuta or Segment, or RBAC tools (e.g. Okta, AWS IAM, Amazon Cognito), to democratize data access
Proven knowledge of the AWS cloud, including hands-on use of services such as Amazon SageMaker, CloudFormation, CDK, AWS Step Functions and Amazon CloudWatch, as well as blue-green / canary deployment strategies
Strong communication skills and the ability to help others and contribute to the "Community of Practice"
smart & useful - Our benefits