- World Leading Financial Institution
- Sydney CBD Location
- Excellent Salary and Bonus Scheme Offered
We are searching for a Mid-Level Data Engineer to join our global client, helping to develop innovative applications and analysis for a migration project in their commodities global markets team.
In this role you will build complex data models and work with the business to understand data requirements. You will deliver insights from a wide range of data sources and databases, as well as create reusable tools in Python/JavaScript.
A major part of the role will be documenting your findings using Confluence, Collibra and Alation. This will all be done in an Agile environment.
What you will be skilled in:
- Strong data modelling experience with a background in the banking and finance industry
- Good SQL experience
- Python experience
- Experience with Collibra and Alation
- Experience with Big Data querying tools like Hive, Spark, Presto
- Knowledge of NoSQL databases
- Experience managing data pipelines with tools like Apache Oozie or Airflow
- Knowledge of programming languages like Java, C++ or Scala in the context of Big Data technologies
- Prior working experience with AWS - any of EC2, S3, EBS, ELB, RDS, DynamoDB, EMR, Apache Parquet
- API integration experience
- Knowledge of real-time integration using Kafka, Spark Streaming or other technologies
If you tick these boxes and want to know more, please reach out to Steve Bielby on 0419588*** or email your CV to *****@clarrow.com.au.