
About CES
CES has 26+ years of experience delivering Software Product Development, Quality Engineering, and Digital Transformation Consulting Services to global SMEs and large enterprises. CES serves some of the leading Fortune 500 companies across industries including Automotive, AgTech, Bio Science, EdTech, FinTech, Manufacturing, Online Retail, and Investment Banking. These are long-term relationships of more than 10 years, nurtured not only by our commitment to the timely delivery of quality services but also by our investments and innovation in our clients' technology roadmaps. As an organization, we are in an exponential growth phase, with a consistent focus on continuous improvement, a process-oriented culture, and a true partnership mindset with our customers. We are looking for qualified and committed individuals to play an exceptional role in supporting our accelerated growth. You can learn more about us at: http://www.cesltd.com/
Job Title: Principal Data Engineer
About the Role:
We are looking for a highly experienced Principal Data Engineer to lead the architecture, design, and implementation of a scalable enterprise HCM data platform for K‑12 education clients. In this role, you will guide the end‑to‑end data engineering strategy—from ingestion and transformation pipelines to medallion architecture execution, governance, security, and performance optimization.
You will work closely with product, engineering, BI, and architecture teams, mentor senior engineers, drive best practices, and ensure the platform is built for resilience, scalability, multi‑tenant security, and high‑quality data delivery.
Ideal Candidate Profile:
10+ years of hands-on data engineering experience.
Demonstrated experience in architecting and scaling data platforms for enterprise clients.
Strong leadership, mentoring, and cross-functional collaboration experience.
Experience in fast-paced Agile teams delivering business-critical analytics platforms.
Key Responsibilities
Technical Leadership
- Define the technical vision, data architecture roadmap, and engineering standards for the enterprise data platform.
- Lead design decisions for ingestion frameworks, medallion architecture, orchestration patterns, and data modeling approaches.
- Provide hands-on technical leadership in Snowflake, SQL, Python, and ETL/ELT frameworks.
- Perform deep technical reviews of pipelines, SQL logic, dbt models, Snowflake objects, and code quality.
Data Platform Architecture & Pipeline Delivery
- Architect, design, and oversee implementation of scalable data ingestion pipelines using CDC/CT from SQL Server and other sources.
- Drive the buildout of Bronze → Silver → Gold layers with Snowflake best practices.
- Implement enterprise-grade standards for SCD Type 1 and Type 2, incremental loads, and data versioning.
- Architect and optimize Snowflake objects (secure views, streams, tasks, stored procedures, warehouses, clustering strategies).
- Ensure the Gold layer is optimized for downstream analytics—Power BI, Tableau, QuickSight, and operational reporting.
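To illustrate the SCD Type 2 and incremental-load patterns referenced above, a minimal Snowflake SQL sketch follows; the table and column names (bronze.employee_stg, silver.dim_employee, attr_hash) are hypothetical assumptions, not part of the actual platform.

```sql
-- Hypothetical SCD Type 2 sketch: expire changed rows, then insert new versions.

-- 1) Close out current dimension rows whose attributes changed in the staging load
UPDATE silver.dim_employee d
   SET is_current = FALSE,
       valid_to   = CURRENT_TIMESTAMP()
  FROM bronze.employee_stg s
 WHERE d.employee_id = s.employee_id
   AND d.is_current  = TRUE
   AND d.attr_hash  <> s.attr_hash;

-- 2) Insert new versions for brand-new or changed employees
INSERT INTO silver.dim_employee
       (employee_id, full_name, department, attr_hash, is_current, valid_from, valid_to)
SELECT s.employee_id, s.full_name, s.department, s.attr_hash,
       TRUE, CURRENT_TIMESTAMP(), NULL
  FROM bronze.employee_stg s
  LEFT JOIN silver.dim_employee d
    ON d.employee_id = s.employee_id
   AND d.is_current  = TRUE
 WHERE d.employee_id IS NULL;
```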
Data Governance, Quality & Security
- Establish and enforce data governance standards including lineage, documentation, retention, and metadata management.
- Implement Row-Level Security (RLS) and multi-tenant data access patterns at scale.
- Establish automated data quality frameworks with validation rules, alerts, and monitoring dashboards.
- Define audit, observability, and compliance frameworks across all data layers.
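As a minimal sketch of the multi-tenant RLS pattern above (the mapping table governance.tenant_role_map and the table gold.fact_assignment are hypothetical names used only for illustration):

```sql
-- Hypothetical row access policy: each role may only see rows for its mapped tenants.
CREATE OR REPLACE ROW ACCESS POLICY governance.tenant_policy
  AS (tenant_id VARCHAR) RETURNS BOOLEAN ->
    EXISTS (
      SELECT 1
        FROM governance.tenant_role_map m
       WHERE m.tenant_id = tenant_id
         AND m.role_name = CURRENT_ROLE()
    );

-- Attach the policy so every query against the table is filtered per tenant
ALTER TABLE gold.fact_assignment
  ADD ROW ACCESS POLICY governance.tenant_policy ON (tenant_id);
```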
Cross-Functional Collaboration
- Work closely with Product, Architecture, BI, and Engineering teams to translate business requirements into scalable data models.
- Collaborate with cloud engineering teams to optimize Snowflake performance, cost, and environment strategy.
- Drive performance tuning strategies at SQL, Snowflake, and pipeline orchestration layers.
Mentorship & Team Development
- Mentor Senior/Junior Data Engineers and uplift the team’s technical capabilities.
- Conduct technical training sessions, architecture walkthroughs, and code reviews.
- Guide the team on adopting modern tools (dbt, Snowpark, Airflow, etc.).
Required Skills & Qualifications
Snowflake Expertise (Must Have)
- 5+ years of hands-on Snowflake experience with strong expertise in:
  - Warehousing, micro-partitioning, clustering, and cost optimization
  - Streams & Tasks automation
  - Secure views, masking policies, and RLS implementation
  - Query performance tuning
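For illustration of the Streams & Tasks automation listed above, a minimal sketch (object names such as bronze.employee, silver.employee_delta, and transform_wh are hypothetical):

```sql
-- Hypothetical sketch: a stream on a Bronze table plus a task that runs only
-- when the stream has captured new changes.
CREATE OR REPLACE STREAM silver.employee_changes ON TABLE bronze.employee;

CREATE OR REPLACE TASK silver.load_employee_changes
  WAREHOUSE = transform_wh
  SCHEDULE  = '15 MINUTE'
  WHEN SYSTEM$STREAM_HAS_DATA('silver.employee_changes')
AS
  INSERT INTO silver.employee_delta
  SELECT employee_id, full_name, department, METADATA$ACTION, CURRENT_TIMESTAMP()
    FROM silver.employee_changes;

-- Tasks are created suspended; resume to start the schedule
ALTER TASK silver.load_employee_changes RESUME;
```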
Data Engineering Core Skills
- Advanced SQL proficiency (joins, window functions, CTEs, recursion, optimization).
- Strong experience implementing medallion architecture at scale.
- Expertise in SCD Types 1 & 2, incremental loads, CDC, and CT patterns.
- Hands-on experience with Snowflake transformations using:
  - dbt
  - Snowflake stored procedures & tasks
  - AWS Glue or equivalent ETL/ELT tools
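A small illustration of the dbt-based transformation style listed above; the model path, source names, and columns are hypothetical:

```sql
-- models/silver/stg_employee.sql  (hypothetical dbt model)
-- Incrementally materialized Silver-layer model reading from a Bronze source.
{{ config(materialized='incremental', unique_key='employee_id') }}

select
    employee_id,
    full_name,
    department,
    updated_at
from {{ source('bronze', 'employee') }}

{% if is_incremental() %}
  -- only process rows that arrived since the last successful run
  where updated_at > (select max(updated_at) from {{ this }})
{% endif %}
```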
Programming & Automation
- Strong Python skills for automation, data processing, validations, and API integrations.
Data Modeling
- Expert-level understanding of data modeling:
  - Dimensional modeling
  - Star/Snowflake schemas
  - Multi-tenant models
  - Performance-oriented semantic models
SQL Server
- Strong understanding of SQL Server source systems, including CDC and Change Tracking.
- Experience with ingestion tools: Snowflake OpenFlow, Fivetran, Airbyte, AWS DMS.
- Exposure to Snowpark, Snowpark Container Services (SPCS), or Python UDFs/UDAFs.
- Familiarity with BI consumption patterns (Power BI, Tableau, QuickSight).
- CI/CD and Git-based workflow experience for data projects.
- Orchestration experience with Airflow, Dagster, Prefect, or AWS Step Functions.
- AWS ecosystem experience: S3, Glue, Lambda, PrivateLink, IAM, VPC networking.
Certifications
- SnowPro Core
- SnowPro Advanced – Data Engineer (preferred)
What We Offer
- Flexible working hours to support work-life balance.
- Opportunity to work with advanced tools and technologies.
- Global exposure: collaborate with the team, connect with our client portfolio, and build professional relationships.
- Innovative ideas are highly encouraged, and we support you in executing them.
- Periodic and on-the-spot rewards and recognition for your performance.
- A strong platform for enhancing your skills through a wide range of L&D programs.
- An enabling and empowering atmosphere to work in.