Job description
Payla operates a cutting-edge Buy Now, Pay Later platform tailored for Payment Providers and Financial Institutions. We empower our partners to deliver seamless invoice and installment payment solutions, meeting the increasing demand for fast and frictionless transactions.
On this journey, we tackle exciting challenges in areas like risk management, payment processing, and accounts receivable management — all while prioritizing an exceptional customer experience. If you’re passionate about being part of an innovative and dynamic environment and eager to collaborate on shaping the future with us, your next opportunity is here!
Join our team! We’re excited to welcome a (Senior) Data Engineer (all genders) to our Data Platform team.
This position is – in the spirit of Payla – remote first.
What you will do:
- You will create, own, maintain, and develop our Delta Lake data warehouse platform, built on Dagster and Databricks using Python, Spark, and Polars, to ensure its robustness, scalability, and extensibility
- You will act as an important link between our DevOps Engineers, Data Scientists, Analysts, and Software Engineers
- Your ETL processes will make sure that reliable data is available to our data consumers, such as Risk Analysts, Data Scientists, and other departments, as quickly and efficiently as possible
- You will build the foundations for our Data Scientists to bring our Machine Learning models to the next level
- You will enjoy making your code as efficient and readable as possible
- You will take ownership of the quality of data being processed
- You will contribute your own ideas to make the flow of our data even better
Job requirements
What you will bring to the team:
- You have a bachelor’s degree in computer science or a related field (or good arguments why this was a waste of time)
- You have at least two years of experience as a Data Engineer or in a similar role
- You are comfortable working with Python, containers, and CI infrastructure
- You have working experience with scheduling tools such as Dagster or Apache Airflow
- You have experience with data modeling, ETL, data warehousing, and SQL
- You have already crafted data solutions and deployed data pipelines in production
- Experience with building reports and using reporting tools (e.g. Databricks, Apache Superset, Looker, Tableau) is a plus
- Knowledge of AWS, Kubernetes, and DevOps principles is a plus
- You are fluent in both written and spoken English; German is a plus
- You are a team player with a caring personality
- Most important to us is that you bring the right engineering mindset and are curious to tackle upcoming architectural challenges, while solving the details on your own or in consultation with our Software Engineers

What Payla offers:
- Team: An encouraging, passionate, and supportive team environment
- Flexibility: We will create a professional work environment for you at your remote location
- Vacation: 30 paid days per year
- Start-up spirit: Exciting challenges from day 1 with a focus on results, combined with fewer hierarchies and fewer meetings (as well as less bullshit bingo)