Data Engineer, Pune

Regnology.com

Office

Pune

Full Time

What You'll Do

We are seeking an experienced Data Engineer to help migrate Oracle workloads to a Trino/Iceberg lakehouse. The role involves designing ingestion pipelines, optimizing query performance, and ensuring data quality, governance, and cost efficiency at scale (50–300 TB workloads).

Key Responsibilities

Migration Strategy & Execution
  • Design and implement data ingestion pipelines to extract data from Oracle into GCS/Iceberg (an illustrative sketch follows this list).
  • Migrate and modernize existing Oracle schemas, partitions, and materialized views into Iceberg tables.
  • Define CDC (Change Data Capture) strategies using custom ETL.

Data Lakehouse Architecture
  • Configure and optimize Trino clusters (coordinator/worker topology, Helm charts, autoscaling).
  • Design partitioning, compaction, and clustering strategies for Iceberg tables.
  • Implement schema evolution, time-travel, and versioning capabilities.

Performance & Cost Optimization
  • Benchmark Trino query performance against existing Oracle workloads.
  • Tune Trino/Iceberg for large-scale analytical queries, minimizing query latency and storage costs.

Data Quality, Metadata & Governance
  • Integrate Iceberg datasets with metadata/catalog services (PostgreSQL/Hive Metastore, or Glue).
  • Ensure compliance with governance, observability, and lineage requirements.
  • Define and enforce standards for unit testing, regression testing, and data validation.

Collaboration & Delivery
  • Support existing reporting workloads (regulatory reporting, DWH) during and after the migration.
  • Document the architecture and migration steps, and provide knowledge transfer.
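For illustration only (not part of the role description): a minimal sketch of one way such an Oracle-to-Iceberg ingestion job could look in PySpark, reading a source table over JDBC and writing it to an Iceberg table. The catalog name ("lakehouse"), connection details, and table names are hypothetical placeholders, and the sketch assumes the Iceberg Spark runtime and an Oracle JDBC driver are on the classpath.

```python
# Illustrative sketch only; connection details, table names, and the
# "lakehouse" catalog are hypothetical placeholders.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder.appName("oracle-to-iceberg-ingest")
    # Register an Iceberg catalog backed by a Hive Metastore.
    .config("spark.sql.catalog.lakehouse", "org.apache.iceberg.spark.SparkCatalog")
    .config("spark.sql.catalog.lakehouse.type", "hive")
    .getOrCreate()
)

# Parallel JDBC extract: Spark splits the read into numPartitions
# ranges over the partition column.
df = (
    spark.read.format("jdbc")
    .option("url", "jdbc:oracle:thin:@//oracle-host:1521/ORCLPDB")  # placeholder
    .option("dbtable", "FINANCE.TRANSACTIONS")  # placeholder source table
    .option("user", "etl_user")
    .option("password", "***")
    .option("partitionColumn", "TXN_ID")
    .option("lowerBound", "1")
    .option("upperBound", "100000000")
    .option("numPartitions", "32")
    .load()
)

# Create (or replace) the target Iceberg table from the extracted data;
# Iceberg then provides snapshots, schema evolution, and time travel.
df.writeTo("lakehouse.finance.transactions").using("iceberg").createOrReplace()
```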

Why We Should Decide On You

  • 5+ years of experience in data engineering.
  • Prior experience migrating financial/regulatory datasets.
  • Experience with Regulatory Reporting or similar enterprise workloads.
  • Familiarity with large-scale performance benchmarking and cost modelling.

Required Skills & Experience

Core Expertise
  • Strong hands-on experience with Trino/Presto, Apache Iceberg, and Oracle SQL and PL/SQL (see the illustrative query sketch after this list).
  • Proven experience with data lakehouse migrations at scale (50 TB+).
  • Proficiency with the Parquet file format.

Programming & Tools
  • Solid coding skills in Java, Scala, or Python for ETL/ELT pipeline development.
  • Experience with distributed processing and orchestration frameworks (Spark).
  • Familiarity with CDC tools, JDBC connectors, or custom ingestion frameworks.

Cloud & DevOps
  • Strong background in GCP (preferred) or the AWS/Azure cloud ecosystems.
  • Experience with Kubernetes, Docker, and Helm charts for deploying Trino workers.
  • Knowledge of CI/CD pipelines and observability tools.

Soft Skills
  • Strong problem-solving mindset with the ability to manage dependencies and shifting scopes.
  • Clear documentation and stakeholder communication skills.
  • Ability to work to tight delivery timelines with global teams.
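Purely as an illustration of the Trino/Iceberg skill set above (not a requirement or company code): a minimal sketch of timing an analytical query and reading an earlier table snapshot through Trino's Python client. Host, catalog, schema, and table names are hypothetical, and the time-travel syntax assumes a recent Trino release with the Iceberg connector.

```python
# Illustrative sketch only; hostnames, catalog, schema, and table names
# are hypothetical. Requires the Trino Python client (pip install trino).
import time

import trino

conn = trino.dbapi.connect(
    host="trino-coordinator",  # placeholder coordinator hostname
    port=8080,
    user="benchmark",
    catalog="iceberg",
    schema="finance",
)
cur = conn.cursor()

# Wall-clock timing of a scan/aggregate query: the simplest form of the
# Trino-vs-Oracle benchmarking mentioned above.
start = time.perf_counter()
cur.execute(
    "SELECT count(*), sum(amount) "
    "FROM transactions "
    "WHERE txn_date >= DATE '2025-01-01'"
)
print(cur.fetchall(), f"elapsed: {time.perf_counter() - start:.2f}s")

# Iceberg time travel through Trino: read the table as of an earlier
# point in time.
cur.execute(
    "SELECT count(*) FROM transactions "
    "FOR TIMESTAMP AS OF TIMESTAMP '2025-01-01 00:00:00 UTC'"
)
print(cur.fetchall())
```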

Why You Should Decide On Us

  • Let's grow together: join a market-leading SaaS company whose agile character and culture of innovation enable you to design our future.
  • We provide you with the opportunity to take on responsibility and participate in international projects.
  • In addition to our buddy program, we offer numerous individual and wide-ranging training opportunities for exploring technical and functional areas.
  • Our internal mobility initiative encourages colleagues to transfer cross-functionally to gain experience and promotes knowledge sharing.
  • We are proud of our positive working atmosphere, characterized by supportive teams across locations and countries and by transparent communication at all levels.
  • Together we're better: meet your colleagues at our numerous team events.
To get a first impression, we only need your CV; we look forward to meeting you in a personal or virtual interview!
Recognizing the benefits of working in diverse teams, we are committed to equal employment opportunities regardless of gender, age, nationality, ethnic or social origin, disability, and sexual identity. 
Are you interested? Apply now! 
https://www.regnology.net

Job reference: R&D_N_2025_02 / R&D_N_2025_03

October 3, 2025
