We are seeking a highly skilled and motivated Machine Learning Engineer to join our team in Australia. The successful candidate will have a strong background in Machine Learning and software engineering, with experience developing and deploying Machine Learning models in real-world applications. The ideal candidate will have a passion for working with data, a strong understanding of Machine Learning algorithms, techniques, and cloud technology, and the ability to communicate complex technical concepts to stakeholders.
As a Machine Learning Engineer, you will work closely with technical teams on the customer side and with the Enterprise Support and Product Engineering teams on the H2O side.
What You Will Do
- Deliver technical professional services to the customer
- Work closely with H2O Data Scientists to advise on and develop end-to-end Machine Learning solutions (from a data engineering perspective) that meet customer requirements
- Integrate H2O products with customer data sources for model training
- Integrate Machine Learning models/pipelines (Python and MOJO scoring pipelines) with customer systems for real-time and batch scoring, as well as model monitoring and operations
- Implement end-to-end ML data flow pipelines that help streamline data science solutions to business problems
- Implement AI-driven applications using the open-source H2O Wave SDK
- Gather customer feedback and work with the H2O.ai Engineering team to enhance our products with needed features
- Ensure the scalability, reliability, and performance of deployed models by implementing appropriate monitoring, testing, and debugging processes
- Be the trusted solutions advisor for our customers and partners
- Communicate effectively with a diverse audience of internal and external stakeholders, including engineers, business people, partners, and executives
- Translate business cases and requirements into value-based technical solutions by architecting Machine Learning workflows and systems from data ingestion to model deployment
What We Are Looking For
- Bachelor’s or higher degree in Computer Science/Engineering, Data Science, Statistics, or a related field
- Data Engineering Skills:
- Experience building data pipelines and ETL datasets, preferably on big data platforms
- Excellent understanding of and experience with big data tools such as Hadoop and Spark
- Excellent knowledge of SQL and experience working with relational databases
- Understanding of various NoSQL database types and their application scenarios
- Experience with Spark, Kafka, and the Hadoop ecosystem
- Programming Languages/Frameworks:
- Proficiency in Python or R for data science; Java, Bash scripting, Scala, and Go are a plus
- Experience writing REST APIs using microservices frameworks in Python or Java
- Experience with dockerization of services (i.e., creating Docker images)
- Understanding of Kubernetes-based application development
- Data Science skills:
- Experience visualizing and presenting exploratory data analysis (EDA) to stakeholders using H2O Wave (a plus), other standard data visualization libraries in the Python and R stacks, or Tableau/Power BI
- Experience with post-production model monitoring tools such as H2O MLOps (a plus), MLflow, etc.
- Understanding of a variety of Machine Learning techniques (supervised/unsupervised learning, clustering, decision tree learning, neural networks, etc.), along with their real-world advantages, drawbacks, and tuning techniques
- Understanding of advanced statistical techniques and concepts (regression, properties of distributions, statistical tests and their proper usage, etc.) and their practical applications
- Experience working in a customer-facing environment, providing technical services
- Excellent communication skills (proficient in spoken and written English). Additional languages are a plus.
- Amicable attitude
- Aptitude for independently investigating and solving technical problems, and an urge to learn and master new technologies
Nice to Have
- Experience with big data technologies such as Hadoop, Spark, or NoSQL databases
- Familiarity with DevOps practices and tools such as Git, Jenkins, or ArgoCD
- Knowledge of data privacy and security principles
- Experience with natural language processing or computer vision tasks
- Experience with model interpretability techniques such as feature importance, partial dependence plots, or SHAP values
- Experience with IaC technologies such as Terraform or AWS CloudFormation
- Experience with containerization technologies such as Docker and Kubernetes
- Experience debugging and troubleshooting containerized workloads, with familiarity with tools such as kubectl, Helm, etc.
What We Offer
- Market Leader in Total Rewards
- Remote-Friendly Culture
- Flexible working environment
- Be part of a world-class team
- Career Growth
H2O.ai is an innovative AI cloud platform company, leading the mission to democratize AI for everyone. Thousands of organizations from all over the world have used our cutting-edge technology across a variety of industries. We’ve made it easy for people at all levels to generate breakthrough solutions to complex business problems and advance the discovery of new ideas and revenue streams. We push the boundaries of what is possible with artificial intelligence.
H2O.ai employs the world’s top Kaggle Grandmasters, a community of best-in-the-world Machine Learning practitioners and data scientists. A strong AI for Good ethos and a commitment to responsible AI drive the company’s purpose.
Please visit our website to learn more.