GCP Data Engineer | Cloud Data Pipelines, Python, BigQuery, Agile Methodology
Synechron.com
Office
Hyderabad, India
Full Time
Job Summary
Synechron is seeking a highly skilled GCP Data Engineer to join our dynamic team. In this role, you will be responsible for designing, developing, and maintaining data solutions leveraging Google Cloud Platform (GCP) technologies. Your expertise will support the organization's data infrastructure, enabling scalable analytics, data processing, and integration capabilities. This position plays a vital role in ensuring data quality, security, and performance, contributing directly to the organization’s strategic data initiatives.
Software Requirements
Required:
- Proficiency with Google Cloud Platform (GCP) services such as BigQuery, Cloud Dataflow, Cloud Storage, Cloud Composer, Pub/Sub, and Cloud SQL (2+ years of GCP experience); a minimal pipeline sketch follows this section.
- Programming skills in Python and/or Java (2+ years of experience).
- Familiarity with data engineering tools such as Apache Beam, Apache Airflow, and Data Studio (or other visualization tools).
- Experience with version control systems such as Git and project management tools such as JIRA and Confluence.
- Understanding of the software development life cycle (SDLC) and Agile methodologies.
Preferred:
- Knowledge of other cloud platforms (AWS, Azure) is a plus.
- Experience with containerization (Docker, Kubernetes).
- Familiarity with IoT, mobile, or blockchain data integrations.
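For context, the pipeline work this stack implies might look like the following minimal Apache Beam sketch in Python: a streaming job that reads JSON events from Pub/Sub and appends them to BigQuery. It is illustrative only; the project, topic, table, and schema names are hypothetical placeholders, not Synechron systems.

```python
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def run():
    # Hypothetical names throughout; swap in real project/topic/table ids.
    options = PipelineOptions(
        streaming=True,
        runner="DirectRunner",  # use "DataflowRunner" to run on GCP
    )
    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadEvents" >> beam.io.ReadFromPubSub(
                topic="projects/my-project/topics/events")
            | "ParseJson" >> beam.Map(json.loads)
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "my-project:analytics.events",
                schema="event_id:STRING,user_id:STRING,ts:TIMESTAMP",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            )
        )


if __name__ == "__main__":
    run()
```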
Overall Responsibilities
- Collaborate with cross-functional teams to gather and analyze data requirements, translating them into scalable cloud-based solutions.
- Design, develop, and optimize data pipelines and workflows within GCP to support analytics, reporting, and data science initiatives (an orchestration sketch follows this list).
- Create and maintain detailed technical documentation, including architecture diagrams, data models, and processes.
- Conduct code reviews to uphold code quality, security, and maintainability standards.
- Monitor and troubleshoot data workflows, resolving technical issues promptly to ensure system stability.
- Stay updated on emerging cloud and data engineering trends, recommending improvements and innovative solutions.
- Collaborate with stakeholders to ensure data solutions align with business objectives and compliance standards.
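As a rough illustration of the orchestration work above, a daily BigQuery rollup scheduled on Cloud Composer (Airflow) might look like the sketch below. The DAG id, project, dataset, and table names are hypothetical.

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import (
    BigQueryInsertJobOperator,
)

with DAG(
    dag_id="daily_sales_rollup",     # hypothetical pipeline name
    schedule_interval="@daily",      # newer Airflow versions use `schedule=`
    start_date=datetime(2025, 1, 1),
    catchup=False,
) as dag:
    rollup = BigQueryInsertJobOperator(
        task_id="rollup_sales",
        configuration={
            "query": {
                # Hypothetical source and destination tables.
                "query": """
                    SELECT sale_date, SUM(amount) AS total
                    FROM `my-project.analytics.sales`
                    GROUP BY sale_date
                """,
                "destinationTable": {
                    "projectId": "my-project",
                    "datasetId": "analytics",
                    "tableId": "sales_daily",
                },
                "writeDisposition": "WRITE_TRUNCATE",
                "useLegacySql": False,
            }
        },
    )
```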
Technical Skills (By Category)
Programming Languages (Required):
- Python, Java, or Node.js (minimum 2 years of hands-on experience)
Databases/Data Management:
- Google BigQuery, Cloud SQL, or similar RDBMS and NoSQL databases
Cloud Technologies:
- Google Cloud Platform (BigQuery, Dataflow, Pub/Sub, Cloud Storage, Cloud Composer), required
- Experience with other cloud providers is advantageous
Frameworks and Libraries:
- Apache Beam, Apache Airflow (used within GCP), Data Studio or similar visualization tools
Development Tools & Methodologies:
- Git, JIRA, Confluence
- Familiarity with Agile/Scrum workflows
Security Protocols:
- Basic understanding of cloud security principles, IAM policies, and data encryption practices (a parameterized-query sketch follows this section)
Experience Level:
- 2-4 years of professional experience in data engineering or cloud-based data solutions
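To ground the security bullet above, the sketch below runs a parameterized query with the google-cloud-bigquery Python client; binding values as query parameters keeps raw input out of the SQL string. Project and table names are hypothetical, and Application Default Credentials are assumed.

```python
from datetime import datetime, timezone

from google.cloud import bigquery

# Hypothetical project id; authentication via Application Default Credentials.
client = bigquery.Client(project="my-project")

job_config = bigquery.QueryJobConfig(
    query_parameters=[
        # The value is bound server-side rather than spliced into the SQL.
        bigquery.ScalarQueryParameter(
            "min_ts", "TIMESTAMP", datetime(2025, 1, 1, tzinfo=timezone.utc)),
    ]
)
query = """
    SELECT event_id, user_id
    FROM `my-project.analytics.events`
    WHERE ts >= @min_ts
"""
for row in client.query(query, job_config=job_config).result():
    print(row.event_id, row.user_id)
```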
Experience Requirements
- Proven experience designing and implementing data pipelines using GCP or comparable cloud platforms.
- Demonstrated ability to work with cross-disciplinary teams to deliver complex data solutions.
- Hands-on experience with Agile methodologies and modern development tools (Git, JIRA).
- Prior experience supporting data-driven decision-making in industries such as finance, healthcare, or technology is desirable but not mandatory.
- Alternative pathways: candidates with extensive experience in alternative cloud environments (AWS, Azure) and a proven ability to adapt to GCP are encouraged to apply.
Day-To-Day Activities
- Participate in daily stand-up meetings and sprint planning sessions.
- Analyze business data requirements and design cloud-based data pipelines.
- Develop, test, and deploy data processing workflows, ensuring efficiency and reliability (a unit-test sketch follows this list).
- Conduct peer code reviews, offering constructive feedback to team members.
- Monitor pipeline performance, troubleshoot issues, and optimize workflows.
- Document technical specifications, system architectures, and operational procedures.
- Collaborate with data analysts, data scientists, and business stakeholders to refine data solutions.
- Keep abreast of new GCP features, tools, and best practices, evaluating their applicability.
- Provide technical support and mentorship within the team when needed.
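As one example of the develop-and-test loop in this list, a Beam transform can be unit-tested with Beam's own testing utilities, roughly as below. The transform is a hypothetical stand-in, not a Synechron component.

```python
import apache_beam as beam
from apache_beam.testing.test_pipeline import TestPipeline
from apache_beam.testing.util import assert_that, equal_to


def normalize_user_id(event):
    # Hypothetical transform under test: upper-cases the user_id field.
    return {**event, "user_id": event["user_id"].upper()}


def test_normalize_user_id():
    # TestPipeline runs on the local DirectRunner when the block exits.
    with TestPipeline() as p:
        result = (
            p
            | beam.Create([{"user_id": "ab1"}, {"user_id": "cd2"}])
            | beam.Map(normalize_user_id)
        )
        assert_that(result, equal_to([{"user_id": "AB1"}, {"user_id": "CD2"}]))
```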
Qualifications
- Bachelor's or Master’s degree in Computer Science, Information Technology, Data Science, or a related discipline.
- Relevant industry certifications such as Google Professional Data Engineer, GCP Associate Cloud Engineer, or similar are preferred.
- Continuous learning through professional training and certification programs is expected.
Professional Competencies
- Strong analytical and problem-solving skills with attention to detail.
- Effective communication abilities for articulating technical concepts to non-technical stakeholders.
- Proven ability to work collaboratively within diverse teams.
- Adaptability to evolving technologies and project requirements.
- Proactive approach to learning and innovation.
- Excellent organizational skills to manage multiple priorities and meet deadlines.
Synechron’s Diversity & Inclusion Statement
Diversity & Inclusion are fundamental to our culture, and Synechron is proud to be an equal opportunity workplace and an affirmative action employer. Our Diversity, Equity, and Inclusion (DEI) initiative ‘Same Difference’ is committed to fostering an inclusive culture – promoting equality, diversity, and an environment that is respectful to all. As a global company, we strongly believe that a diverse workforce helps build stronger, more successful businesses. We encourage applicants across diverse backgrounds, races, ethnicities, religions, ages, marital statuses, genders, sexual orientations, and disabilities to apply. We empower our global workforce by offering flexible workplace arrangements, mentoring, internal mobility, learning and development programs, and more.
All employment decisions at Synechron are based on business needs, job requirements and individual qualifications, without regard to the applicant’s gender, gender identity, sexual orientation, race, ethnicity, disabled or veteran status, or any other characteristic protected by law.