Data Engineer
paiqo GmbH
Hybrid
Remote job
Full Time
As a Data Platform Engineer with a focus on data engineering, you will be responsible for building and operating modern data platforms. You will develop scalable data pipelines and ETL/ELT processes on Microsoft Azure to make large volumes of data from different sources usable.
- You develop stable batch and streaming pipelines with Azure Data Factory and Databricks on Microsoft Azure, applying best practices such as Delta Lake, automated workflows and VNet integration to ensure reliability
- You take responsibility for sub-projects in data integration, e.g. building a data path from operational systems into a data lakehouse
- You implement and optimize ETL/ELT pipelines (with Azure Data Factory, Databricks/Spark, etc.) and ensure data quality
- You use Azure data services for storage and processing (Data Lake, Azure SQL, Databricks, Microsoft Fabric) efficiently
- You help set up CI/CD pipelines (Azure DevOps or GitHub) for automated deployments and collaborate closely with Data Scientists and Analytics teams to provide data for analytics and ML
Requirements
- 2-4 years of experience in data engineering or data platform development
- Solid knowledge of SQL and programming (Python or Scala) as well as experience with Azure Data Services (e.g. Azure Data Factory, Azure Databricks, Synapse)
- Familiarity with data modeling (e.g. Star Schema, Kimball) and with data platform monitoring & performance optimization
- Experience with version control (Git) and DevOps practices (Continuous Integration, Infrastructure as Code)
- Communication & Collaboration: You work with data scientists, analysts and clients and translate technical concepts for non-technical audiences
- Problem solving & analytical thinking: You optimize data streams, identify bottlenecks and find creative solutions
- Language skills: fluent German, good English
September 22, 2025