AWS Data Architect
Fractal.com
Office
Bengaluru, India
Full Time
It's fun to work in a company where people truly BELIEVE in what they are doing!
We're committed to bringing passion and customer focus to the business.
AWS Data Architect
Fractal is a strategic AI and analytics partner to Fortune 500 companies, powering human decisions at scale by integrating AI, Engineering, and Design. With 5000+ consultants across 16 global locations, Fractal combines cutting-edge technology with human-centered design.
Responsibilities:
- Consult on, design, build, and operationalize large-scale enterprise data solutions using AWS data and analytics services (Spark on EMR, DynamoDB, Redshift, Kinesis, Lambda, Glue) in combination with third-party platforms such as Snowflake and Databricks.
- Analyze, re-architect, and re-platform on-premises data stores/databases to modern data platforms on AWS using AWS or third-party services.
- Design and build production data pipelines from ingestion to consumption within a big data architecture, using Java, Python, or Scala.
- Design and optimize data models on AWS using AWS data stores such as Redshift, DynamoDB, RDS, and S3.
- Design and implement data engineering, ingestion, and curation functions on AWS using AWS-native services or custom programming.
- Perform detailed assessments of current-state data platforms and create an appropriate transition path to the AWS cloud as part of customer consultation and business proposals.
- Participate in client design workshops and provide trade-off analyses and recommendations for building solutions.
- Mentor other engineers in coding best practices and problem-solving.
Requirements:
- 9 to 15 years of experience in the industry.
- Bachelor's degree in Computer Science, Information Technology, or another relevant field.
- Experience and knowledge of big data architectures, both in the cloud and on-premises.
- Proficiency in AWS data collection services: Kinesis, Kafka (Amazon MSK), and Database Migration Service (DMS).
- Proficiency in core AWS storage services: S3, EBS, and EFS.
- Proficiency in core AWS compute services: EC2, Lambda, ECS, and EKS.
- Proven experience in Java, Scala, Python, and shell scripting.
- Working experience with AWS Athena, Glue (PySpark), EMR, DynamoDB, Redshift, Kinesis, Lambda, S3, Apache Spark, Databricks on AWS, and Snowflake on AWS.
- AWS certification: AWS Certified Solutions Architect and/or AWS Certified Data Analytics.
- Working experience with Agile methodologies and Kanban.
- Good knowledge of SQL.
- Experience building and delivering proofs of concept that address specific business needs, using the most appropriate techniques, data sources, and technologies.
- Experience partnering with executive stakeholders as a trusted advisor, as well as enabling technical implementers.
- Working experience migrating workloads from on-premises to cloud environments.
- Experience monitoring distributed infrastructure using AWS tools or open-source alternatives such as CloudWatch, Prometheus, and the ELK stack is a big advantage.
If you like wild growth and working with happy, enthusiastic over-achievers, you'll enjoy your career with us!
Not the right fit? Let us know you're interested in a future opportunity by clicking Introduce Yourself in the top-right corner of the page or create an account to set up email alerts as new job postings become available that meet your interest!
