
Software Engineer III – Big Data & Analytics
at J.P. Morgan
Posted 19 hours ago
- Compensation: Not specified
- City: Columbus
- Country: United States
- Currency: Not specified
Senior software engineer role on the Consumer and Community Banking Data Technology team at JPMorgan Chase focused on Big Data and Analytics. Deliver scalable data and analytics solutions using Java, Python, Scala, Spark/PySpark, and AWS across batch and streaming use cases. Responsibilities include data strategy and governance, designing and deploying data platforms, CI/CD and cloud deployments, and applying data warehousing and ETL best practices. Work within an Agile feature team using TDD, code reviews, and continuous improvement practices.
Location: Columbus, OH, United States
We have an opportunity to impact your career and provide an adventure where you can push the limits of what's possible.
As a Software Engineer for Big Data and Analytics at JPMorgan Chase within Consumer and Community Banking Data Technology, you will be an integral part of an agile team that enhances, builds, and delivers trusted, market-leading technology products in a secure, stable, and scalable way. As a core technical contributor, you will be responsible for delivering critical technology solutions across multiple technical areas and business functions, supporting the firm’s objectives using Java, J2EE, Microservices, Python, Spark, Scala, and AWS for Business Banking Data Products.
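As a rough illustration of the batch and streaming use cases described above (not a prescribed implementation), the minimal PySpark sketch below applies one shared transformation to a batch Parquet source and to a Kafka stream. The bucket paths, topic name, and broker address are hypothetical placeholders.

```python
# Minimal, illustrative PySpark sketch: the same transformation applied to a
# batch source and a streaming source. All paths, the Kafka topic, and the
# broker address are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("batch-and-streaming-sketch").getOrCreate()

def enrich(df):
    # Shared business logic: derive a processing date column.
    return df.withColumn("processing_date", F.current_date())

# Batch use case: read a Parquet dataset, transform it, and write it back out.
batch_df = enrich(spark.read.parquet("s3://example-bucket/transactions/"))
batch_df.write.mode("overwrite").parquet("s3://example-bucket/curated/transactions/")

# Streaming use case: the same logic applied to a Kafka source
# (requires the Spark Kafka connector package on the classpath).
stream_df = enrich(
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "transactions")
    .load()
)
query = (
    stream_df.writeStream.format("parquet")
    .option("path", "s3://example-bucket/curated/transactions-stream/")
    .option("checkpointLocation", "s3://example-bucket/checkpoints/transactions/")
    .start()
)
```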
Job Responsibilities
- Oversee all aspects of data strategy, governance, data risk management, reporting, and analytics.
- Manage risks associated with data use, retention/destruction, and privacy.
- Design, develop, code, test, debug, and deploy scalable and extensible applications.
- Produce high-quality code utilizing Test Driven Development techniques (a minimal illustrative sketch follows this list).
- Participate in retrospectives to drive continuous improvement within the feature team.
- Participate in code reviews, ensuring all solutions align with pre-defined architectural specifications.
- Implement automation through Continuous Integration and Continuous Delivery.
- Manage cloud development and deployment, supporting applications in both private and public clouds.
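As a rough illustration of the Test Driven Development practice referenced above, the sketch below unit-tests a small PySpark transformation with pytest. The function, column names, and threshold are hypothetical examples, not requirements from the posting.

```python
# A minimal, hypothetical TDD-style sketch: a tiny PySpark transformation
# exercised by a pytest unit test against an in-memory DataFrame.
import pytest
from pyspark.sql import SparkSession, functions as F

def add_risk_flag(df, threshold):
    """Flag rows whose amount exceeds the given threshold."""
    return df.withColumn("high_risk", F.col("amount") > F.lit(threshold))

@pytest.fixture(scope="session")
def spark():
    # Local session, sufficient for fast unit tests.
    return SparkSession.builder.master("local[1]").appName("tdd-sketch").getOrCreate()

def test_add_risk_flag_marks_large_amounts(spark):
    # Arrange: a tiny in-memory DataFrame standing in for real input data.
    df = spark.createDataFrame([(1, 50.0), (2, 500.0)], ["id", "amount"])
    # Act
    result = {r["id"]: r["high_risk"] for r in add_risk_flag(df, 100.0).collect()}
    # Assert: only the row above the threshold is flagged.
    assert result == {1: False, 2: True}
```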
Required Qualifications, Capabilities, and Skills
- Formal training or certification in software engineering concepts and 3+ years of applied experience.
- Advanced knowledge of architecture, design, and business processes.
- Full Software Development Life Cycle experience within an Agile framework.
- Expert-level skills in Java, AWS, database technologies, Python, Scala, Spark/PySpark, or any ETL technology.
- Experience developing and decomposing complex SQL on RDBMS platforms.
- Experience with Data Warehousing concepts, including Star Schema (see the illustrative sketch after this list).
- Practical experience delivering projects in Data and Analytics, Big Data, Data Warehousing, and Business Intelligence; familiarity with relevant technological solutions and industry best practices.
- Strong understanding of data engineering challenges and proven experience with data platform engineering (batch and streaming, ingestion, storage, processing, management, integration, consumption).
- Familiarity with multiple Data & Analytics technology stacks.
- Awareness of various Data & Analytics tools and techniques (e.g., Python, data mining, predictive analytics, machine learning, data modeling, etc.).
- Experience with one or more leading cloud providers (AWS, Azure, GCP).
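As a rough illustration of the Star Schema and SQL skills listed above, the sketch below runs a Spark SQL query that joins a fact table to two dimension tables and aggregates the result. The table and column names are assumed for the example and would need to exist as registered tables or views.

```python
# A minimal Star Schema sketch: one fact table joined to two dimension tables,
# then aggregated. fact_transactions, dim_date, and dim_customer are assumed
# to be registered as tables/views (e.g., in a metastore) and are hypothetical.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("star-schema-sketch").getOrCreate()

summary = spark.sql("""
    SELECT d.calendar_month,
           c.segment,
           SUM(f.amount) AS total_amount
    FROM   fact_transactions f
    JOIN   dim_date     d ON f.date_key     = d.date_key
    JOIN   dim_customer c ON f.customer_key = c.customer_key
    GROUP BY d.calendar_month, c.segment
""")
summary.show()
```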
Preferred Qualifications, Capabilities, and Skills
- Ability to ramp up quickly on new technologies and strategies.
- Strong collaboration skills and ability to develop meaningful relationships to achieve common goals.
- Appreciation for controls and compliance processes for applications and data.
- In-depth understanding of data technologies and solutions.
- Ability to drive process improvements and implement necessary changes.
- Knowledge of industry-wide Big Data technology trends and best practices.
