Data Engineer | 12 Month Contract | $1326 p/day Incl. Super
- $1000 - $1200 p/day + Super
- 12 Month Contract
- Macquarie Park (hybrid)
This NSW Government agency is a major service provider that strives to deliver a more consistent and efficient experience within government. The agency uses data and behavioural insights to drive improvements within government and in providing services to NSW. We are looking for someone to start ASAP on a 12 month contract paying up to $1326 per day incl. Super. This role is located in Macquarie Park (hybrid).
About the Role
The NSW Government agency is looking for a Data Engineer with technical leadership potential. This is a hands-on role working with modern data architectures to design, build and deploy data lake, database and data warehouse solutions. The Data Engineer will implement foundational, robust and production-ready data platforms that enable business data discovery, self-service and data analytics functions, allowing us to do more with data in partnership with our stakeholders across the Infrastructure & Place division.
Key accountabilities include:
- Staying ahead of the curve by learning and adopting the latest data technologies and tools relevant to an enterprise data environment
- Ongoing optimisation, orchestration and administration of data platforms
- Modelling, building and deployment of data lakes and data warehouses, in line with modern data architecture principles
- Design and development of data ingestion, transformation, and presentation pipelines
Skills and experience required:
- Working knowledge of the different aspects of enterprise data environments, i.e. collect, store, process, verify and consume
- Demonstrated experience in designing, building, deploying and operating data warehouses and data lakes
- Ability to model use case scenarios and data landscapes, to help maximise business value for internal customers and stakeholders
- Hands-on data engineering experience and knowledge in Azure Data Factory, Snowflake, Databricks, and dbt
- Strong Python and SQL skills
- Understanding of data structures and formats, e.g. Parquet, Avro, JSON
- Data modelling knowledge and experience, e.g. Data Vault, Kimball
- Demonstrated knowledge and hands-on experience in utilising Azure cloud and data services, incl. infrastructure-as-code
- Data integration knowledge and experience, e.g. designing and interacting with APIs and data source endpoints
- Experience with CI/CD practices and version management tools