Senior Data Engineer (Databricks, Python, AWS/Azure)
Endava.com
Office
Chișinău, Republic of Moldova
Full Time
Company Description
Technology is our how. And people are our why. For over two decades, we have been harnessing technology to drive meaningful change.
By combining world-class engineering, industry expertise and a people-centric mindset, we consult and partner with leading brands from various industries to create dynamic platforms and intelligent digital experiences that drive innovation and transform businesses.
From prototype to real-world impact - be part of a global shift by doing work that matters.
Job Description
We are seeking a skilled Senior Data Engineer to join our team in Chișinău, Moldova. In this role, you will be responsible for designing, implementing, and maintaining robust data infrastructure and pipelines to support our organization's data-driven decision-making processes.
- Design, develop, and maintain scalable data pipelines using Databricks, Python, and cloud technologies (AWS, Azure, and/or GCP)
- Design, implement, and maintain complex data engineering solutions to acquire and prepare data
- Create and maintain data pipelines to connect data within and between data stores, applications and organizations
- Carry out complex data quality checking and remediation
- Optimize data storage and retrieval systems for improved performance and efficiency
- Collaborate with data scientists and analysts to understand data requirements and provide efficient solutions
- Ensure data quality, integrity, and security throughout the data lifecycle
- Develop and maintain documentation for data processes and architectures
- Stay up-to-date with emerging technologies and best practices in data engineering
- Troubleshoot and resolve data-related issues in a timely manner
Qualifications
- Bachelor's degree in Computer Science, Engineering, or a related field
- 3+ years of experience in data engineering with Databricks or in a similar role
- Strong experience with the Python programming language, Python-based pipeline jobs, Apache Spark, and PySpark
- Knowledge of AWS, Azure, and/or other cloud platforms
- Expertise in SQL and NoSQL databases (e.g., MongoDB)
- Proficiency in designing and implementing ETL processes
- Solid grasp of Data Lake/Data Warehouse principles, techniques, and technologies, including Star Schema, SQL, and data formats such as Apache Iceberg, Parquet, and Avro
- Knowledge of and experience working with APIs (designing with OpenAPI is desirable), web services, and CI/CD pipelines
- Some knowledge of Kubernetes and cloud-native practices, including containerized workloads with tools such as Docker
- Experience with modern software and data engineering patterns, including those used in highly scalable, distributed, and resilient systems
- Experience developing microservices-based architectures, including distributed messaging patterns, is a plus
- Experience in the high-volume IoT domain would be great but is not essential
- Knowledge of data security and compliance best practices
- Excellent problem-solving and analytical skills
- Strong communication and teamwork abilities
- Relevant cloud certifications (e.g., Databricks, AWS Certified Data Analytics, Azure Data Engineer Associate, Snowflake) are a plus
Additional Information
Discover some of the global benefits that empower our people to become the best version of themselves:
- Finance: Competitive salary package, share plan, company performance bonuses, value-based recognition awards, referral bonus;
- Career Development: Career coaching, global career opportunities, non-linear career paths, internal development programmes for management and technical leadership;
- Learning Opportunities: Complex projects, rotations, internal tech communities, training, certifications, coaching, online learning platforms subscriptions, pass-it-on sessions, workshops, conferences;
- Work-Life Balance: Hybrid work and flexible working hours, employee assistance programme;
- Health: Global internal wellbeing programme, access to wellbeing apps;
- Community: Global internal tech communities, hobby clubs and interest groups, inclusion and diversity programmes, events and celebrations.
At Endava, we’re committed to creating an open, inclusive, and respectful environment where everyone feels safe, valued, and empowered to be their best. We welcome applications from people of all backgrounds, experiences, and perspectives—because we know that inclusive teams help us deliver smarter, more innovative solutions for our customers. Hiring decisions are based on merit, skills, qualifications, and potential. If you need adjustments or support during the recruitment process, please let us know.