
Software Engineer III - Full Stack + AWS + Elastic / Open Search

at J.P. Morgan

Bulge Bracket Investment Banks

Mid Level · No visa sponsorship · Data Engineering

Posted a day ago


Compensation: Not specified (USD)

City: Not specified
Country: United States

As a Software Engineer III on JPMorgan Chase's digital communications compliance team, you will design and implement the core backend services, streaming pipelines, and data flows that power detection logic, alert triage, reviewer workflows, and explainable audit trails for massive volumes of daily enterprise communications and content data. You will build scalable microservices and APIs, integrate ML/LLM-based processing pipelines for on-demand language translation, and develop streaming and batch data pipelines for content ingestion, indexing, and alerting, with upstream and downstream integrations. The role emphasizes collaboration with product managers, architects, data science, platform, and operations teams, as well as participation in software engineering communities exploring new technologies and best practices. It also requires building robust tests, CI/CD pipelines, and strong observability to reduce production toil.

Location: Plano, TX, United States

We have an exciting and rewarding opportunity for you to take your software engineering career to the next level. 

As a Software Engineer III at JPMorgan Chase within the digital communications compliance team in Enterprise Technology, you will have the opportunity to design and implement core backend services, streaming pipelines and data flows that power detection logic, alert triage, reviewer workflows and explainable audit trails on massive volumes of daily enterprise communications and content data for the Supervision & Surveillance product line. This role offers a chance to collaborate with product managers, architects, data science, platform and operational teams, while also engaging in software engineering communities to explore new and emerging technologies.
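Purely as an illustration of the shape such an ingest-detect-index flow can take, here is a minimal sketch. Every name is invented for this example, and in-memory stand-ins replace what a production system would build on Kafka and Elastic/OpenSearch:

```python
from dataclasses import dataclass
from typing import Callable

@dataclass(frozen=True)
class Message:
    """One piece of enterprise communication content (e.g. an email or chat line)."""
    msg_id: str
    channel: str
    text: str

def mentions_guarantee(m: Message) -> bool:
    """Toy detection rule: flag language promising guaranteed returns."""
    return "guaranteed" in m.text.lower()

def ingest(stream: list[Message],
           rules: list[Callable[[Message], bool]],
           index: dict) -> list[dict]:
    """Index every message and emit an alert record for each rule that fires."""
    alerts = []
    for msg in stream:
        index[msg.msg_id] = msg  # stand-in for an Elastic/OpenSearch document write
        for rule in rules:
            if rule(msg):
                alerts.append({"msg_id": msg.msg_id, "rule": rule.__name__})
    return alerts
```

In a real deployment the `stream` would be a Kafka consumer loop and `index` a search-cluster client, but the separation shown here (ingest, rule evaluation, alert emission) is the part the role description is pointing at.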

 

Job responsibilities

  • Design and develop scalable, fault-tolerant microservices and APIs that support rule-based and ML-based detection pipelines, after thorough evaluation of performance and cost trade-offs
  • Design, implement, and optimize supervision workflows by applying state-machine concepts
  • Integrate ML- and LLM-based processing pipelines for on-demand language translation
  • Develop streaming and batch data pipelines that ingest and index content and the alerts generated on it, along with APIs for upstream and downstream integrations
  • Design data models representing alerts, queues, policies, and audit artifacts, ensuring the system maintains immutability, lineage, and full traceability for audits
  • Build robust unit, integration, and performance tests aligned to an ideal test pyramid, following test-driven development
  • Build CI/CD pipelines that enforce the required quality-control gates throughout the automated delivery lifecycle
  • Implement observability hooks (metrics, tracing, logging) that reduce production toil through proactive monitoring and troubleshooting
  • Work with data scientists and ML engineers to operationalize models into detection pipelines
  • Partner with product management and compliance SMEs to monitor and improve the accuracy and reliability of generated alerts
  • Proactively identify hidden problems and patterns in data, and use the insights to drive product and process improvements
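The supervision-workflow-as-state-machine idea above can be sketched briefly. This is an illustrative toy, not the team's actual design; the states, transitions, and class names are all assumptions chosen to show why a state machine yields the auditable trail the posting asks for:

```python
from enum import Enum, auto

class AlertState(Enum):
    OPEN = auto()
    IN_REVIEW = auto()
    ESCALATED = auto()
    CLOSED = auto()

# Allowed transitions; any other move is rejected, so an alert's history
# is always a legal path through the workflow.
TRANSITIONS = {
    AlertState.OPEN: {AlertState.IN_REVIEW, AlertState.CLOSED},
    AlertState.IN_REVIEW: {AlertState.ESCALATED, AlertState.CLOSED},
    AlertState.ESCALATED: {AlertState.CLOSED},
    AlertState.CLOSED: set(),
}

class Alert:
    def __init__(self, alert_id: str):
        self.alert_id = alert_id
        self.state = AlertState.OPEN
        # Append-only record of (actor, new_state): the audit trail.
        self.audit_log = [("system", AlertState.OPEN)]

    def transition(self, new_state: "AlertState", actor: str) -> None:
        if new_state not in TRANSITIONS[self.state]:
            raise ValueError(f"illegal transition {self.state} -> {new_state}")
        self.audit_log.append((actor, new_state))
        self.state = new_state
```

Because every change goes through `transition`, the `audit_log` is a complete, ordered lineage of who moved the alert and when its state changed, which is the traceability property the data-model bullet calls for.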

 

Required qualifications, capabilities, and skills

  • Formal training or certification in software engineering concepts and 3+ years of applied experience
  • Expert Java, Python, and React programmer with front-end development experience in React; experience building resilient, scalable, cost-efficient, enterprise-grade cloud-native products, with 2+ years in compliance for the financial industry
  • Experience building cloud-native microservices for streaming and batch architectures using Spark, Storm, or Flink
  • Well versed in AWS services and its ecosystem, including but not limited to EC2, ECS, EKS, EMR, S3, and Glacier
  • Hands-on with Elastic/OpenSearch, Kafka, and PostgreSQL; hands-on with AI productivity tools such as Codium and GitHub Copilot
  • Experience with observability and monitoring tools such as Prometheus, Grafana, and OpenTelemetry
  • Experience building CI/CD pipelines with ArgoCD, Helm, Terraform, Jenkins, and GitHub Actions, along with experience building externally consumable APIs and highly responsive UI micro-frontends
  • Experience leveraging and integrating ML/LLM models and pipelines
  • Strong ownership mentality, good communication skills, and a collaborative mindset
  • Prior experience with test-driven development, delivering products with well-defined SLIs/SLOs/SLAs

 

Preferred qualifications, capabilities, and skills

  • Experience with or exposure to MLOps in the cloud
  • Experience building cost models aligned to SLIs/SLOs

