Snowflake Data Engineer
Weekday
Remote
Full Time
This role is for one of Weekday's clients.
Min Experience: 3 years
Location: Remote (India)
Job Type: Full-time
We are seeking a highly motivated and detail-oriented Snowflake Data Engineer to join our data engineering team. The ideal candidate will have a strong background in Snowflake, DBT, ETL processes, and modern Lakehouse architecture, along with hands-on experience in SnapLogic and Python. You will be responsible for designing, developing, and optimizing data pipelines that support business intelligence, analytics, and reporting needs across the organization. This role requires a blend of technical expertise, problem-solving skills, and the ability to work collaboratively in a fast-paced environment.
Requirements
Key Responsibilities
- Data Modeling & Warehousing: Design, implement, and optimize scalable data models and data warehouses using Snowflake to meet analytical and business requirements.
- ETL Development: Build, maintain, and optimize ETL/ELT pipelines with SnapLogic, DBT, and Python, ensuring efficient ingestion and transformation of structured and unstructured data.
- Lakehouse Architecture: Contribute to the development of a modern Lakehouse architecture, enabling unified data storage and advanced analytics.
- Performance Optimization: Monitor, troubleshoot, and fine-tune queries, pipelines, and workflows to ensure high performance and reliability.
- Automation & Integration: Develop reusable frameworks for automation of data workflows, orchestration, and integration with enterprise systems.
- Data Quality & Governance: Implement data validation, error handling, and monitoring mechanisms to ensure data accuracy, integrity, and compliance with governance policies.
- Collaboration: Work closely with data scientists, analysts, and business stakeholders to understand requirements and deliver solutions that meet organizational goals.
- Documentation & Best Practices: Maintain clear documentation of designs, workflows, and processes while advocating best practices for data engineering.
Required Skills & Qualifications
- 3–5 years of professional experience in data engineering or data warehousing roles.
- Strong expertise in Snowflake, including performance tuning, query optimization, and warehouse management.
- Proficiency in dbt (data build tool) for modeling, transformations, and version-controlled data workflows.
- Hands-on experience with ETL/ELT pipeline design using SnapLogic or similar tools.
- Solid understanding of Lakehouse architecture and modern data platform concepts.
- Strong programming skills in Python for data transformation, automation, and API integrations.
- Knowledge of SQL and advanced query writing for analytics and reporting.
- Familiarity with data governance, data security, and compliance frameworks.
- Strong problem-solving skills with the ability to analyze complex data challenges and propose effective solutions.
- Excellent communication and collaboration skills, with the ability to work in cross-functional teams.
Preferred Qualifications
- Experience with cloud platforms such as AWS, Azure, or GCP.
- Exposure to orchestration tools such as Airflow, Prefect, or Dagster.
- Familiarity with CI/CD pipelines for data engineering.
- Experience working in an agile, fast-paced product or data-driven organization.
August 29, 2025