At IBM, work is more than a job - it's a calling: To build. To design. To code. To consult. To think along with clients and sell. To make markets. To invent. To collaborate. Not just to do something better, but to attempt things you've never thought possible. Are you ready to lead in this new era of technology and solve some of the world's most challenging problems? If so, let's talk.
Your Role and Responsibilities
We are currently recruiting a Data Engineer - Data Integration for a 12-month fixed-term contract, based in Sydney.
The Data Engineer - Data Integration is responsible for supporting the full life cycle of development and enhancements. As a Data Engineer - Data Integration, your responsibilities include:
- Drive face-to-face solution architecture discussions with business stakeholders.
- Participate in client meetings to communicate status or give demos.
- Work with multiple upstream and downstream systems that have differing, complex integration patterns.
- Assume primary responsibility for decisions about the nature of the integrated solution.
- Deploy the solution on a high-availability shared service production environment.
- Work effectively, individually and with team members, toward customer satisfaction and success.
- Assist in creating solutions for client and/or internal review.
- Work in areas that adhere to state and federal regulatory standards or, where no such standards exist, to nationally accepted guidelines.
- Participate in client workshops to understand business requirements and produce architecture-related artifacts.
- Ensure smooth daily data loads, real-time trade booking, and start-of-day and end-of-day data load processes.
- Work closely with the client's architecture group, including Data Quality, Security, and Infrastructure architects, to ensure their concerns are reflected in the solution architecture.
- Analyse data-related systems integration opportunities and challenges, validate data models from a data integration perspective for current and other impacted projects, and drive quality improvements.
- Act as the focal point between the client and the offshore team during project execution.
- Abide by team coding standards, development processes and best practices.
- Perform code reviews and ensure that all solutions are aligned to pre-defined architectural standards, guidelines, best practices, and meet quality standards.
To ensure success in the role you will possess the following skills:
- 10-15 years of experience in the data warehouse industry, within banking and financial markets.
- Expertise in designing and developing batch and real-time applications.
- Experience integrating with multiple upstream and downstream applications.
- Strong ability to understand client systems and build solution designs.
- Ability to conceptualise creative ETL solutions, develop specifications, and present solutions to senior stakeholders.
- Good knowledge of trading systems such as Murex and Wallstreet, and of trading portals.
- Expertise in developing applications using ETL tools: Ab Initio GDE, EME, Express>It (ACE and BRE), continuous flows, Conduct>It.
- Additional skills in other ETL tools: AWS Glue, AWS Step Functions, Amazon Redshift, and SSIS.
- Expertise in deploying and operating applications in an AWS EKS cluster using Argo CD.
- Expertise in real-time applications developed using REST API calls, message queues (AWS SQS), JMS, RPC, etc.
- Scheduling tools: Control Centre.
- Expertise in Unix shell scripting.
- Experience with relational databases such as Oracle and Postgres.
- Strong written and verbal communication skills.
- Basic object-oriented programming skills in Python and Java.