
Data Engineer II, Glass Liquidity and Cash Management
at J.P. Morgan
Posted 11 days ago
- Compensation: Not specified (SGD)
- City: Singapore
- Country: Singapore
- Currency: $ (SGD)
Join JPMorgan Chase's Corporate and Investment Banking Technology team as a Data Engineer II in Glass Liquidity and Cash Management in Singapore. You will design, build, and maintain robust data pipelines for extraction, transformation, and ingestion of large-scale datasets into a lakehouse, and develop data workflows using Java, Spring Boot, Apache Flink, Python, PySpark, and Databricks. You will integrate streaming data with Kafka and Flink, build Databricks pipelines for complex BI data, ensure data quality, security, and governance, monitor performance, and document data flows and architectures.
Location: Singapore
You thrive on diversity and creativity, and we welcome individuals who share our vision of making a lasting impact. Your unique combination of design thinking and experience will help us achieve new heights.
Job responsibilities
- Design, develop, and maintain robust data pipelines for data extraction, transformation, and ingestion of large-scale datasets into a lakehouse
- Build and optimize data workflows using Java, Spring Boot, Apache Flink, Python, PySpark, and Databricks
- Integrate and process streaming data using Kafka and Apache Flink
- Implement Databricks pipelines that build complex business data aggregates for BI reporting and analytics
- Ensure data quality, integrity, and security across all data platforms
- Monitor, troubleshoot, and optimize data pipeline performance
- Document data flows, architecture, and processes
Required qualifications, capabilities, and skills
- Bachelor’s degree in Computer Science, Engineering, or a related field
- Formal training or certification on software engineering concepts with 2+ years of experience
- Strong hands-on experience in Java, Apache Flink, Python, or PySpark
- Strong hands-on experience writing complex SQL and PL/SQL
- Proficient in Kafka for real-time data streaming
- Good understanding of data extraction, ingestion, and ETL processes
- Understanding of lakehouse architecture and working experience on the Databricks platform would be an advantage
- Experience building dashboards using Tableau would be an advantage
- Familiarity with cloud-based storage technologies such as AWS S3
- Excellent problem-solving and communication skills
