We bring planning, property and land data together into one place as your resource for making faster, insight-driven decisions. We pride ourselves on our culture and values, which include:
Generous and genuine
We provide a useful, user-oriented service that genuinely meets our members’ needs and enhances their day-to-day experience. It’s about them, not us: we constantly look for ways to create value for our community.
Going beyond good
We keep a relentless focus on achieving the highest standards for our brand: the most accurate, most convenient product in our field, excellent customer service and consistently positive feedback.
Ever restless
For us it’s all about agility and innovation: the hustle. Never complacent, we won’t rest on our laurels. Solution-focused and results-driven, we want to do things that haven’t been done, staying nimble, resourceful and persistent. Always pushing, learning and looking ahead.
Shared outcomes
We’re true individuals, but above all we’re a team. We are motivated to make an impact for our members, in our business and for each other. We are inclusive and relationship-focused, hold ourselves accountable to each other, and celebrate our collective success.
Job Overview
The successful candidate will apply their software development and data engineering knowledge to develop and maintain our data pipelines.
You will be responsible for developing, testing and optimising our Python-based data workflows in Airflow.
You will write clean, scalable code that is well tested and documented.
Responsibilities
Manage data requirements, design and develop dataset integrations, and ensure seamless and accurate data flows between client systems and our data platform.
Collaborate with other engineers to architect, develop, test, and support technical solutions to achieve business goals.
Work within a cross-functional team whose members bring different skills and experience.
Design, implement and maintain highly efficient, resilient and scalable automated data pipelines.
Develop, test and monitor our data workflows.
Automate data quality checks to ensure the accuracy and integrity of datasets.
Build reusable code and libraries for future use.
Stay up to date with the latest developments in data engineering technologies.
Implement monitoring and measurement of data quality across systems (accuracy, consistency, completeness, etc.).
Required:
5+ years of professional experience in data engineering, specialising in ETL and data warehousing.
4+ years of practical experience with Python and JavaScript, including the use of frameworks like Django and Node.js, and libraries such as Pandas and React, for data analysis, automation, and web application development.
3+ years of practical experience in GIS/spatial domains, working with geospatial libraries and tools such as PostgreSQL, PostGIS, GDAL, PROJ, ArcGIS REST Server, QGIS, and Mapbox.
VCS experience (Git/Gitflow).
Extensive knowledge of planning controls and geographic layers in Australia, with expertise in spatial datasets such as ABS data, Data Vic, NSW Spatial Services, TAS List, and other relevant geospatial resources.
Working experience with container engines and container orchestration services, specifically Docker and AWS Elastic Container Service.
Experience with web servers and frameworks (Flask, PHP/Laravel, Ruby on Rails, NextJS)
Experience with Data Orchestration / Workflow frameworks (Airflow, Dagster, Prefect)
Exposure to cloud technologies, particularly AWS services (Batch, ECS, EC2, Fargate, RDS, S3, and OpenSearch).
Experience in cloud infrastructure and networking (compute, storage, virtualisation, load balancing, caching, and routing).
Deep understanding of Linux and the ability to create automation scripts using tools such as Bash, Python or Terraform.
Desirable:
Relevant certifications in any of the skills listed (CompTIA, AWS, GCP, Azure, etc.)
Ability to automate provisioning and configuration of cloud infrastructure using AWS, Terraform, and Ansible
Experience developing CI/CD pipelines.
Experience with Agile methodologies.
Experience with integrating data monitoring and error reporting (Sentry, Datadog, ELK Stack)
Experience with automating web page and PDF content extraction.
Excellent problem-solving and communication skills.
A passion for continuous learning and innovation.
Please be advised that this position requires working on-site at our South Melbourne office. We kindly request that recruiters refrain from contacting us. Applicants must have valid working rights. Thank you for your understanding.
To apply for this role, please email your current CV along with a cover letter to *******@landchecker.com.au
Summary of role requirements:
- Looking for candidates available to work:
- Monday: Morning, Afternoon
- Tuesday: Morning, Afternoon
- Wednesday: Morning, Afternoon
- Thursday: Morning, Afternoon
- Friday: Morning, Afternoon
- More than 4 years of relevant work experience required for this role
- Work visa can be provided for this role
- Expected start date for role: 01 July 2024