
Software Engineer III Data Engineer - Databricks, AWS, Python
at J.P. Morgan
Posted a month ago
- Compensation: Not specified
- City: Bengaluru
- Country: India
- Currency: Not specified
Join JPMorgan Chase as a Databricks Developer within the Data Engineering team to design, build, and migrate data workflows to cloud platforms (Databricks/AWS). You will develop and deploy scalable ETL processes, optimize data pipelines, ensure data integrity/security, and support production issues. The role requires collaboration with cross-functional teams and partnering with finance to surface analytics for financial trends. Strong Python, Spark, SQL, and Databricks expertise are essential.
Location: Bengaluru, Karnataka, India
Unlock the power of data with our expert Databricks Developer, transforming complex datasets into actionable insights with seamless efficiency. Elevate your business intelligence and drive innovation through cutting-edge data engineering solutions.
As a Databricks Developer at JPMorgan Chase within our Data Engineering team, you will be a pivotal player in our cloud transformation journey.
Job responsibilities
- Collaborate with cross-functional teams to understand business requirements and design cloud-based solutions.
- Lead the migration of existing SQL databases/ETL Layer and applications to Databricks and other cloud platforms.
- Drive the development and deployment of processes using the DBX framework.
- Develop, test, and deploy scalable and efficient cloud applications.
- Optimize data processing workflows while ensuring data integrity and security.
- Provide technical guidance and support to team members and stakeholders.
- Stay abreast of the latest cloud technologies and best practices.
- Support production tasks and resolve issues.
- Partner with finance teams to develop and optimize data mining and analytics for financial trends and initiatives.
Required Qualifications, Capabilities, and Skills:
- Formal training or certification in software engineering concepts and 3+ years of applied experience.
- Proficiency and hands-on experience with programming languages such as Python and Big Data technologies such as Spark and Kafka.
- Strong expertise in Databricks and cloud platforms such as AWS.
- Proven experience in SQL database management and development.
- Experience with data integration, ETL processes, and data warehousing.
- Excellent problem-solving skills and attention to detail.
- Strong communication and collaboration skills. Ability to work independently and efficiently.
Preferred Qualifications, Capabilities, and Skills:
- Certification in cloud technologies (e.g., AWS Certified Solutions Architect).
- Knowledge of ETL technologies such as Ab Initio, Informatica, or Prophecy would be a plus.
- Knowledge of DevOps practices, CI/CD pipelines, and containerization (Docker, Kubernetes).
- Databricks Certified Data Engineer credential or equivalent.
- Strong financial and business analytical skills.




