Manager (AI Hub GTS)

KPMG India

Office

Bangalore, Karnataka, India

Full Time

Roles & Responsibilities

·Lead the design, development, and implementation of scalable data engineering solutions using platforms such as Databricks and Microsoft Fabric, aligning with enterprise architecture and business goals.
·Own the architectural vision for product and application development, ensuring alignment with organizational strategy and technical standards.
·Drive innovation by evaluating emerging technologies and integrating them into solution roadmaps.
·Establish and enforce coding standards and best practices across teams through structured code reviews and technical mentoring.
·Oversee the estimation process for solution development efforts, ensuring accuracy and alignment with delivery timelines and resource planning.
·Ensure comprehensive documentation of solutions, including technical specifications, testing protocols, and datasets, to support maintainability and audit readiness.
·Provide technical leadership and guidance to cross-functional teams, fostering a culture of excellence and continuous improvement.
·Collaborate with audit professionals and business stakeholders to understand regulatory, risk, and operational requirements, ensuring solutions are compliant and value-driven.
·Facilitate knowledge sharing through team meetings, brainstorming sessions, and technical workshops.
·Champion best practices in data engineering, architecture design, testing, and documentation to ensure high-quality deliverables.
·Stay hands-on with critical aspects of system and model design, development, and validation to ensure robustness and scalability.
·Monitor and optimize performance of deployed systems, proactively identifying areas for improvement.
·Lead initiatives within the Data Engineering and Architecture practice area, contributing to capability building, asset development, and strategic growth.
·Stay abreast of industry trends and advancements to maintain a competitive edge and drive continuous innovation.

Mandatory technical & functional skills

·Strong understanding of MPP databases and RDBMS fundamentals
·Hands-on experience with cloud platforms (SaaS/PaaS), preferably Azure
·Expertise in cloud databases and data warehouses, e.g. Azure SQL, Synapse
·Working knowledge of NoSQL databases, e.g. MongoDB, Cassandra, Redis
·In-depth knowledge of the Spark ecosystem and its APIs
·Exposure to Databricks and PySpark
·Clear understanding of data lakes and data lakehouses
·Good understanding of Unity Catalog, Delta Live Tables, MLflow, etc.
·Exposure to streaming
·Solid knowledge of building data pipelines using ADF, Synapse, Glue, etc.
·Familiarity with event-driven designs and messaging using Service Bus and Event Grid
·Exposure to serverless orchestrators, e.g. Logic Apps, Function Apps, Airflow
·Familiarity with CI/CD using GitHub Actions or Azure DevOps

Preferred technical & functional skills

·Backend frameworks: FastAPI, Django
·Machine learning frameworks: TensorFlow, PyTorch
·REST APIs
·Experience working with frameworks such as LangChain, LlamaIndex, LlamaParse, LlamaCloud, Semantic Kernel, etc.
·Certifications: relevant certifications such as Microsoft Certified AI-102, DP-700, DP-900, or AWS certifications

Key Behavioral Attributes/Requirements

·Strong analytical, problem-solving, and critical-thinking skills
·Excellent collaboration skills, with the ability to work effectively in a team-oriented environment
·Excellent written and verbal communication skills, with the ability to present complex technical concepts to non-technical audiences
·Willingness to learn new technologies and work on them

This role is for you if you have the following

Educational Qualifications

Minimum qualification required: B.Tech in Computer Science, M.Tech, or MCA (full-time education).

Work Experience

·7-9 years of experience in designing and developing data-centric applications using various tools and technologies, e.g. databases, reporting, ETL, NoSQL, etc.
·5+ years of experience in designing and architecting solutions using Microsoft data technologies such as ADF/Synapse
·Relevant data professional certifications – Databricks, AWS, GCP, or Azure

September 1, 2025

KPMG India