Our client is looking for an experienced Data Engineer who will report to the Data Lead. The Data Engineer is responsible for developing data models, data pipelines, and test automation solutions, and for mapping business requirements to systems/technical requirements to ensure they align with the enterprise architecture plan. This is expected to be a 6-month rolling contract based in northwest Melbourne.

Please note, to apply, applicants must have the following:
- The requisite skills and experience defined below;
- At least Australian Permanent Residence working rights;
- At least 4 years' relevant local working experience in the Data Engineering field.
Key responsibilities:
- Create comprehensive end-to-end data solutions for both structured and unstructured data, spanning ingestion and parsing, integration, auditing, logging, aggregation, normalisation, modelling, and error handling.
- Collaborate closely with business teams to understand complex business logic and translate it into actionable data assets that drive strategic decision-making.
- Work collaboratively with cross-functional teams to identify and resolve data quality issues and operational challenges, ensuring the integrity and reliability of data systems.
- Produce high-quality, secure, and maintainable code within an agile development environment, contributing to the continuous improvement of our data processes.
- Develop and maintain test automation frameworks as needed to support reliable and efficient data operations.
Required skills and experience:
- Strong understanding of well-architected Data Lakehouse architecture
- 4+ years of experience designing and developing solutions on Azure
- Experience working with Azure data products such as Data Factory and Databricks
- Experience building Data Lakehouses using Azure Databricks and open-source Delta Lake
- Experience with data modelling methods such as Data Vault and Kimball star schemas
- Ability to create Python/SQL notebooks to transform and load data using Databricks
- Experience working with Microsoft D365 data
- Experience with, and understanding of, large-scale cloud data platform deployments in enterprise-wide environments
- Hands-on experience with Microsoft Fabric is highly desirable
- Hands-on experience implementing DevOps practices is desirable
- A Databricks Developer certification is highly regarded
Other details:
- The daily rate will depend on skills and experience.
- The start date is ASAP.
- The role is full time and on-site, with the possibility of some flexible working arrangements.