About the role
At ANZ our purpose is to shape a world where people and communities thrive. We’re changing the way we run ANZ by adopting new ways of working so we can build a better ANZ for our customers today and in the future. The success of our Business Transformation depends on having great people to deliver quality outcomes for our customers, so we’re looking for people who are passionate about transformational change and want to help us redefine banking for the future.
A senior data engineer has a broad and deep understanding of data engineering principles and practices. They are responsible for designing, building, and maintaining the data pipelines and infrastructure that enable businesses to collect, store, process, and analyse data. They also work with application developers, stakeholders and analysts to ensure that data is accessible and adds value to the business.
As a Database Reliability Engineer (DBRE) / Data Engineer (DE), this role is accountable for:
• Designing and implementing data pipelines (the systems that move data from different sources into a data warehouse or data lake) so that data is moved efficiently and reliably
• Building and supporting data infrastructure (the hardware and software used to store and process data), and maintaining it so that it is scalable, performant and secure
• Working with application developers and analysts to ensure that data is accessible and usable for decision-making
• Supporting application developers and analysts in querying data, developing data models and creating reports
• Leading other engineers where required, including delegating tasks, providing training and mentoring junior engineers
What will your day look like?
‘Must have’ knowledge, skills and experience:
• Strong understanding of data engineering principles and practices
• Ability to design and manage automated, code-driven infrastructure provisioning on Google Cloud Platform (GCP)
• Demonstrated working experience with GCP
• Experience designing, building, and maintaining data pipelines and infrastructure.
• Experience managing Oracle and PostgreSQL databases
• Deep knowledge of Oracle, PostgreSQL, Cloud SQL, AlloyDB and BigQuery
• Knowledge of or experience working with Kafka and Kubernetes
• Knowledge of or experience with infrastructure, architecture and configuration-as-code tools such as Python, Terraform and Ansible
• Ability to work with large datasets and complex data models
• Excellent problem-solving, performance optimisation and debugging skills
• Strong communication and teamwork skills
‘Good to have’ knowledge, skills and experience:
• Strong understanding of the financial industry and its regulatory environment
• Familiarity with cloud data concepts and infrastructure
• Familiarity with data security and access controls
• Experience with large-scale logical data migration projects
Qualifications
• A degree or diploma in Information and Communication Technology, Computer Science or a relevant field is advantageous but not mandatory
• Google Cloud certification, with experience automating and orchestrating workloads on GCP
• Scripting experience with tools such as Bash or Python
So, why join us?
Job Posting End Date
19/02/2024, 11:59pm (Melbourne, Australia)