
About this role
Position Title: Senior Data Engineer
Base Salary: $114,668 to $154,216 annually DOE
Benefits: Medical, dental, vision, 401k, flexible spending account, paid sick leave and paid time off, parental leave, quarterly performance bonus, training, career growth and education reimbursement programs.
Ziply Fiber is a local internet service provider dedicated to elevating the connected lives of the communities we serve. We offer the fastest home internet in the nation, a refreshingly great customer experience, and affordable plans that put customers in charge.
As our state-of-the-art fiber network expands, so does our need for team members who can help us grow and realize our goals.
Our Company Values:
- Genuinely Caring: We treat customers and colleagues like neighbors, with empathy and full attention.
- Empowering You: We help customers choose what is best for them, and we support employees in implementing new ideas and solutions.
- Innovation and Improvement: We constantly seek ways to improve how we serve customers and each other.
- Earning Your Trust: We build trust through clear, honest, human communication.
Job Summary
The Senior Data Engineer will be responsible for designing, building, and maintaining scalable data pipelines, data models, and infrastructure that support business intelligence, analytics, and operational data needs. This role involves working with various structured and unstructured data sources, optimizing data workflows, and ensuring high data reliability and quality. The ideal candidate will be proficient in modern data engineering tools and cloud platforms, bringing innovative solutions to a fast-paced and diverse data infrastructure.
Essential Duties and Responsibilities:
The Essential Duties and Responsibilities listed below represent a range of duties performed by the employee and are not intended to reflect all duties performed.
Data Pipeline Engineering & Automation
· Design, develop, and maintain scalable data pipelines for ingestion, transformation, and storage of large datasets.
· Troubleshoot and resolve data pipeline and ETL failures, implementing robust monitoring and alerting systems.
· Automate data workflows to increase efficiency and reduce manual intervention.
Data Infrastructure, Modeling & Governance
· Optimize data models for analytics and business intelligence reporting.
· Build and maintain data infrastructure, ensuring performance, reliability, and scalability.
· Implement best practices for data governance, security, and compliance.
· Work with structured and unstructured data, integrating data from various sources including databases, APIs, and streaming platforms.
Cross‑Functional Collaboration, Leadership & Documentation
· Collaborate with data analysts, data scientists, and business stakeholders to understand data needs and design appropriate solutions.
· Mentor and train junior engineers, fostering a culture of learning and innovation.
· Develop and maintain documentation for data engineering processes and workflows.
Other Duties
· Performs other duties as required to support the business and evolving organization.
Required Qualifications:
- Bachelor’s degree in Computer Science, Engineering, or a related field.
- Minimum of eight (8) years of experience in data engineering, ETL development, or related fields.
- Strong proficiency in SQL and database technologies (PostgreSQL, MySQL, Oracle, SQL Server, etc.).
- Familiarity with Linux/Unix and scripting technologies utilized on them.
- Proficiency in programming languages such as Python for data engineering tasks.
- Hands-on experience with cloud platforms such as Microsoft Azure and its data services, including Azure Data Factory and Azure Synapse Analytics.
- Experience working with data warehouses such as Snowflake or Azure SQL Data Warehouse.
- Familiarity with workflow automation tools such as Autosys.
- Knowledge of data modeling, schema design, and data architecture best practices.
- Strong understanding of data governance, security, and compliance standards.
- Ability to work independently in a remote environment across different time zones and collaborate effectively across teams.
- Exposure to GraphQL and RESTful APIs for data retrieval and integration.
- Familiarity with NoSQL databases such as MongoDB.
- Experience with version control software such as GitLab.
Preferred Qualifications:
- Proven aptitude for independently managing complex procedures, even when encountered infrequently.
- Proactive approach to learning and optimizing operational workflows.
- Familiarity with DevOps practices and CI/CD pipelines for data engineering, including Azure DevOps.
- Proficient in designing, writing, and maintaining complex stored procedures and stored procedure–based ETL workflows for robust data processing.
- Comfortable working in complex ecosystems with heterogeneous data sources and diverse end-user requirements, adapting solutions to fit unique contexts.
- Working knowledge of data wrangling and ETL tools, including Alteryx or similar technologies.
- Understanding of data privacy regulations such as GDPR and CCPA.
Knowledge, Skills, and Abilities:
- Strong problem-solving and analytical skills.
- Ability to manage multiple priorities and work in a fast-paced environment.
- Excellent verbal and written communication skills.
- Ability to translate business requirements into scalable technical solutions.
- Strong attention to detail and a commitment to data quality.
- Ability to work with Agile methodologies and tools such as Jira, Confluence, and Azure DevOps.
- Strong collaboration skills with cross-functional teams including product managers, software engineers, and business analysts.
Work Authorization
Applicants must be currently authorized to work in the US for any employer. Sponsorship is not available for this position.
Physical Requirements
The physical demands described here are representative of those that must be met by an employee to successfully perform the essential functions of this job. Reasonable accommodations may be made to enable individuals with disabilities to perform the essential functions.
Essential and marginal functions may require maintaining physical condition necessary for bending, stooping, sitting, walking, or standing for prolonged periods of time; most of time is spent sitting in a comfortable position with frequent opportunity to move about. The employee must occasionally lift and/or move up to 25 pounds. Specific vision abilities required by the job include close vision, distance vision, color vision, peripheral vision, depth perception, and the ability to adjust focus.
Work Environment
Work is performed primarily in a modern office setting with exposure to computer screens and requires extensive use of a computer, keyboard, mouse, and multi-line telephone system.
At all times, Ziply Fiber must be your primary employer. Unless otherwise prohibited by law, employees may not hold outside employment nor be self-employed without obtaining approval in writing from Ziply Fiber. In holding outside employment or self-employment, employees should ensure that participation does not conflict with responsibilities to Ziply Fiber or its business interests.
Diverse Workforce / EEO:
Ziply Fiber requires a pre-employment background check as a condition of employment and may require a pre-employment drug screening.
Ziply Fiber is a drug free workplace.