How can we change the world to make marketing both relevant and impactful? With your help! At Schwarz Media Platform, we are on a mission to build Europe's largest and most advanced ad network for retail - a real-life AdTech application with a big impact on consumers, stores, and advertisers. It is based on Europe's largest retail data pool from Europe's No. 1 retailer, Schwarz Group, and cutting-edge technology that understands individual consumer behavior at scale. If you are interested in this vision and are excited about how data and engineering excellence can help us get there, you will love Schwarz Media Platform.
What you'll do
- Work in a cross-functional product team to design and implement data-centered features for Europe's largest ad network
- Help scale our data stores, data pipelines, and ETLs, which handle terabytes of data from one of the largest retail companies
- Design and implement efficient data processing workflows
- Extend our reporting platform for external customers and internal stakeholders to measure advertising performance
- Continue to develop our custom data processing pipeline and continuously look for ways to improve our technology stack as our scale increases
- Work with machine learning engineers and software engineers to build and integrate fully automated, scalable reporting, targeting, and ML solutions
- Work in a fully remote setup, meeting your colleagues in person at company-wide and engineering-specific onsite events
What you’ll bring along
- 5+ years of professional experience working on data-intensive applications
- Fluency with Python and good knowledge of SQL
- Experience developing scalable data pipelines with Apache Spark
- Good understanding of efficient algorithms and how to analyze them
- Curiosity about how databases and other data processing tools work internally
- Familiarity with git
- Ability to write testable and maintainable code that scales
- Excellent communication skills and a team-player attitude
Great if you also have
- Experience with Kubernetes
- Experience with Google Cloud Platform
- Experience with Snowflake, BigQuery, Databricks, and Dataproc
- Knowledge of columnar databases and file formats like Apache Parquet
- Knowledge of "Big Data" technologies like Delta Lake
- Experience with workflow management solutions like Apache Airflow
- Affinity for data science tasks to prototype reporting and ML solutions
- Knowledge of Dataflow / Apache Beam
We look forward to receiving your application.
Schwarz Dienstleistung KG
Larissa Blümich
Reference no. 43709
Stiftsbergstraße 1
74172 Neckarsulm, Germany
www.careers.it.schwarz