Software Engineer II, PySpark, Databricks
JPMorgan Chase & Co.
Office
Hyderabad, Telangana, India
Full Time
We have an exciting and rewarding opportunity for you to take your software engineering career to the next level.
As a Software Engineer II at JPMorgan Chase within Corporate Technology, you serve as a seasoned member of an agile team to design and deliver trusted market-leading technology products in a secure, stable, and scalable way. You are responsible for delivering critical technology solutions across multiple technical areas within various business functions in support of the firm’s business objectives.
Job responsibilities
- Executes software solutions, design, development, and technical troubleshooting with ability to think beyond routine or conventional approaches to build solutions or break down technical problems
- Creates secure and high-quality production code and maintains algorithms that run synchronously with appropriate systems
- Produces architecture and design artifacts for complex applications while being accountable for ensuring design constraints are met by software code development
- Gathers, analyzes, synthesizes, and develops visualizations and reporting from large, diverse data sets in service of continuous improvement of software applications and systems
- Proactively identifies hidden problems and patterns in data and uses these insights to drive improvements to coding hygiene and system architecture
- Contributes to software engineering communities of practice and events that explore new and emerging technologies
Required qualifications, capabilities, and skills
- Formal training or certification on software engineering concepts and 2+ years applied experience
- 1+ years of practical experience with Spark, SQL, Databricks, and the AWS cloud ecosystem
- Expertise in Apache NiFi, Lakehouse/Delta Lake architectures, system design, application development, testing, and operational stability
- Strong programming skills in PySpark and SparkSQL
- Proficiency in orchestration using Airflow
- In-depth knowledge of Big Data and data warehousing concepts
- Experience with CI/CD processes
- Solid understanding of agile methodologies, including DevOps practices, application resiliency, and security measures
- Demonstrated knowledge of software applications and technical processes within a technical discipline (e.g., cloud, artificial intelligence, machine learning, mobile, etc.)
Preferred qualifications, capabilities, and skills
- Familiarity with Snowflake, Terraform, and LLMs
- Exposure to cloud technologies such as AWS Glue, S3, SQS, SNS, Lambda, etc.
- AWS certifications such as Solutions Architect Associate, Developer Associate, or Data Analytics Specialty, or a Databricks certification
August 18, 2025