Senior Data Engineer (f/m/d)
About the team
The chapter Technology & Engineering - Data & AI Engineering represents all experts who deliver data and AI engineering capabilities to our product teams.
These experts are organized together to strengthen their functional skill sets, improve E.ON’s MLOps & AIOps tech stack and deliver a high level of automation in data delivery and model operations.
The products you will be working on belong to our team Digital Solutions | Future Energy Home. This team develops and operates cloud-based software to manage home energy devices such as photovoltaic systems, inverters, batteries, EV charging wall boxes, heat pumps and meters, enabling our central and local units to roll out these solutions to end customers.
We integrate with numerous vendors’ devices and apply centralized insights, analytics and control mechanisms for these devices.
meaningful & challenging - Your tasks
Integrate cross-cloud (AWS & Azure) data platform pipelines using Data Mesh and Data Fabric architecture concepts
Implement data-sharing interfaces and connectors to share business data (e.g. solar telemetry, electric vehicle charging data) with our regional business data consumers
Build robust data pipeline applications with AWS and Azure data services using software principles such as Clean Code and SOLID principles
Work closely with Data Solutions Architects to understand and shape overarching Data Architecture for data sharing interfaces and connectors
Mentor and coach others, conduct pair-programming sessions, review merge requests and actively contribute to our Community of Practice
authentic & ambitious - Your profile
At least 5 years of experience in building enterprise-grade Python data pipeline (ETL) applications in AWS / Azure, following software best practices such as Clean Code and SOLID principles
At least 3 years of experience with the relevant AWS services for data engineering (e.g. Athena, Lambda, Glue, AWS IAM & CloudWatch), plus a background in Azure
Profound knowledge of Databricks (e.g. PySpark), Snowflake, and the Python ecosystem (pandas, pytest, behave)
Experience building DataOps pipelines with GitLab and CloudFormation, Terraform or CDK, and using orchestration tools (e.g. AWS Step Functions)
Preferably, experience in data modelling concepts such as Data Vault 2.0 and dimensional data modelling
Excellent communication skills and the ability to mentor and coach other developers
smart & useful - Our benefits
- We provide full flexibility: Do your work from home or any other place in Germany - of course including all our great offices from Hamburg to Munich. You want even more? Go on workation for up to 20 days per year within Europe.
- Recharge your battery: You have 30 holidays per year plus Christmas and New Year's Eve on top. Your battery still needs charging? You can exchange parts of your salary for more holidays or take a sabbatical.
- Your development: We grow and we want you to grow with us. Learning on the job, exchanging with others or taking part in individual training - our learning culture enables you to take your personal and professional development to the next level.
- Let’s empower each other: Take the opportunity to engage in our Digital Empowerment Communities for collaboration, learning, and network building.
- We elevate your mobility: From car and bike leasing offers to a subsidised Deutschland-Ticket - your way is our way.
- Let’s think ahead: With our company pension scheme and a great insurance package, we take care of your future.
- And that’s by far not all: We look forward to discussing further benefits with you during the recruiting process.