Lead Data Integration Engineer

at Raymond James

Investment Banking

Tech Lead · No visa sponsorship · Data Engineering

Posted a day ago

Compensation: Not specified
Currency: Not specified

City: Saint Petersburg
Country: United States

This role seeks a lead data engineer with deep expertise in MS SQL Server, SSIS, and Python to maintain a critical pre-trade clearance system and re-engineer the data stack into AWS. The position requires strong experience in data integration, ETL processes, and collaboration with business stakeholders within a hybrid work environment.

Are you an expert in MS SQL Server and SSIS development who learned Python and cloud engineering to stay ahead of the curve? We are seeking a lead data engineer to maintain and support our critical pre-trade clearance system while re-engineering our entire data stack into AWS using Python. This position follows our hybrid workstyle policy: associates are expected to be in a Raymond James office location a minimum of 10-12 days a month. Please note: this role is not eligible for work visa sponsorship, either currently or in the future.

Responsibilities:
- Deep expertise in Microsoft SQL Server, SSIS, and SQL development.
- Strong proficiency in writing and optimizing complex stored procedures, functions, and packages.
- Hands-on experience with Python for data manipulation, automation, and pipeline development.
- Familiarity with Oracle databases and PL/SQL development, required for cross-platform data integration.
- Experience implementing CI/CD pipelines and DevOps practices for data solutions.
- Understanding of data warehousing concepts, ETL methodologies, and data modeling techniques.
- Experience with Unix and shell scripting.
- Experience with job scheduler tools such as BMC Control-M.
- Proven track record working in both waterfall and agile SDLC frameworks.
- Knowledge of the financial services industry, including middle- and back-office functions.
- Experience collaborating with business counterparts to understand detailed requirements.
- Excellent verbal and written communication skills.
- Produce and maintain detailed technical documentation for all development efforts.

Skills:
- MS SQL Server & SQL proficiency: Deep expertise in writing and optimizing complex SQL queries, stored procedures, functions, and triggers is fundamental.
- SSIS expertise: In-depth knowledge of designing, developing, deploying, and maintaining ETL (Extract, Transform, Load) processes and packages using SQL Server Integration Services (SSIS), including robust error handling and logging mechanisms.
- ETL & data warehousing: Strong understanding of ETL methodologies, data warehousing concepts (e.g., Kimball methodology, star schemas), and data modeling techniques (normalization/denormalization).
- Performance tuning: Ability to identify, investigate, and resolve database and ETL performance issues, including capacity and scalability planning.
- Programming languages: Proficiency in additional programming/scripting languages, such as Python or PowerShell/shell scripting, for automation, data manipulation, and pipeline development.
- Cloud & DevOps (desired): Familiarity with cloud platforms (e.g., Azure Data Factory, AWS Glue, Google Cloud) and experience implementing CI/CD pipelines and DevOps practices for data solutions is a strong advantage.
- Exposure to streaming technologies such as Kafka is a plus.
- Experience in financial services or enterprise-scale applications is preferred.
- Excellent communication, analytical, and problem-solving skills.

Education: Bachelor's in Computer and Information Science (required); High School (HS) (required).
Work Experience: General experience, 6 to 10 years.
Certifications: None specified.
Travel: Less than 25%.
Workstyle: Hybrid.

At Raymond James, our associates use five guiding behaviors (Develop, Collaborate, Decide, Deliver, Improve) to deliver on the firm's core values of client-first, integrity, independence, and a conservative, long-term view.
We expect our associates at all levels to grow professionally and inspire others to do the same; work with and through others to achieve desired outcomes; make prompt, pragmatic choices and act with the client in mind; take ownership and hold themselves and others accountable for delivering results that matter; contribute to the continuous evolution of the firm. The Company is an equal opportunity employer and makes all employment decisions on the basis of merit and business needs.
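
For context on the kind of re-engineering work the posting describes (moving data out of MS SQL Server and into AWS with Python), a minimal extract-and-load step might look roughly like the sketch below. The server, database, table, and bucket names are hypothetical placeholders, not details from the posting, and the libraries (pyodbc, pandas, boto3) are one common choice among several.

```python
# Hypothetical sketch: extract one table from MS SQL Server and land it in S3 as CSV.
# All connection details and object names are placeholders, not from the posting.
import io

import boto3        # AWS SDK for Python
import pandas as pd
import pyodbc       # ODBC access to SQL Server


def extract_to_s3(table: str, bucket: str, key: str) -> int:
    """Pull one table from SQL Server and upload it to S3. Returns the row count."""
    conn = pyodbc.connect(
        "DRIVER={ODBC Driver 17 for SQL Server};"
        "SERVER=sqlserver.example.internal;"   # placeholder host
        "DATABASE=PreTradeClearance;"          # placeholder database
        "Trusted_Connection=yes;"
    )
    try:
        df = pd.read_sql(f"SELECT * FROM {table}", conn)   # extract
    finally:
        conn.close()

    buffer = io.StringIO()
    df.to_csv(buffer, index=False)                          # serialize

    s3 = boto3.client("s3")
    s3.put_object(Bucket=bucket, Key=key, Body=buffer.getvalue())  # load
    return len(df)


if __name__ == "__main__":
    rows = extract_to_s3("dbo.TradeRestrictions",           # placeholder table
                         "example-data-lake",               # placeholder bucket
                         "raw/trade_restrictions.csv")
    print(f"Transferred {rows} rows")
```

In practice this step would typically be scheduled (e.g., by Control-M or an AWS-native orchestrator) and extended with the error handling and logging the posting emphasizes.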
