About GOSOL:
GOSOL is a well-known provider of IT services, consulting, staffing, and on-hire work visa sponsorship. We help businesses enhance their technology, digital, project, and business transformation initiatives. Guided by integrity and trust, we believe in offering an unparalleled experience as your technology partner of choice.
Find us at - https://gosol.com.au/
Role name - Databricks Data Engineer
Job Overview: We are seeking a talented and experienced Databricks Data Engineer to join our dynamic team. The ideal candidate will have a strong background in implementing Data Engineering solutions using Databricks, with expertise in PySpark, Delta Live Tables, workflow management, and deployments. The role involves designing, developing, testing, deploying, and maintaining data pipelines and solutions within the Databricks Data Intelligence Platform.
Responsibilities:
- Design, develop, and implement scalable and efficient Data Engineering solutions on the Databricks platform.
- Use PySpark to process and transform large volumes of data, ensuring data quality, reliability, and performance.
- Implement and optimize Delta Live Tables for streaming and batch ETL processes, ensuring data consistency and reliability.
- Manage workflow orchestration and automation using Databricks, ensuring smooth execution and monitoring of data pipelines.
- Collaborate with cross-functional teams to understand data requirements and translate them into technical solutions.
- Perform code reviews, debugging, and troubleshooting to ensure high-quality deliverables.
- Develop and maintain documentation for data pipelines, workflows, and deployments.
Requirements:
- Bachelor's degree in Computer Science, Engineering, or a related field. Master's degree preferred.
- Proven experience as a Data Engineer with a focus on Databricks, PySpark, and Delta Live Tables.
- Strong proficiency in implementing and managing data workflows and deployments on Databricks.
- Experience with cloud platforms such as AWS, Azure, or Google Cloud Platform.
- Solid understanding of Data Engineering concepts, data modeling, and database systems.
- Excellent problem-solving skills and attention to detail.
- Strong communication and collaboration skills, with the ability to work effectively in a team environment.
- Databricks Certification (e.g., Databricks Certified Developer).
- Experience with other Big Data technologies beyond the Spark ecosystem, such as Apache Hadoop.
- Familiarity with DevOps practices and tools for CI/CD pipeline automation.
Only candidates with valid Australian work rights will be considered.