Company | Tech Mahindra |
Address | Macquarie Park, NSW |
Form of work | Full-time |
Salary | $100,000 per year |
Category | Computer |
Job description
- Minimum 10 years of experience in Enterprise Data Warehouse solutioning; exposure to Big Data technology stacks such as Cloudera, HBase, Hive, Impala, Kafka and Spark, and to analytics tools such as Python and R
- In-depth knowledge of Teradata Utilities and Macros
- Experience with data lakes and Azure
- Strong SQL analytical skills
- Knowledge of Unix systems
- Good to have: knowledge of Control-M scheduling
- Involvement in business requirement gathering, requirements analysis, design, solution walkthroughs and workshops; identify gaps between the solution and the business requirements together with business and IT teams
- Create detailed technical design documents based on the requirements and the high-level solution design
- Basic knowledge of Spark using Scala or Python, and of optimising the performance of Spark applications on a big data platform
- Strong analytical mindset and the ability to work independently in a fast-paced, rapidly changing environment
- Work on and continuously improve the DevOps pipeline and tooling to actively manage the continuous integration/continuous deployment processes
- Good to have: experience with any ETL tool
- Experience in Teradata, Informatica, Linux, Hadoop, Spark and Control-M
- Experience working in an Agile delivery model
- Prepare implementation plans, reports, manuals and other documentation on the status, operation and maintenance of data warehousing applications
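For candidates wondering what "strong SQL analytical skills" means in an EDW context, it usually includes windowed/aggregate queries of the kind below. This is a minimal sketch using Python's built-in sqlite3 purely for illustration; the `sales` table, its columns and data are hypothetical, and the same construct carries over to Teradata SQL:

```python
import sqlite3

# Hypothetical sales table, in-memory, for demonstration only.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE sales (region TEXT, product TEXT, amount INTEGER);
INSERT INTO sales VALUES
  ('NSW', 'A', 100), ('NSW', 'B', 250),
  ('VIC', 'A', 300), ('VIC', 'B', 150);
""")

# Rank products by revenue within each region using a window function.
rows = conn.execute("""
SELECT region, product, amount,
       RANK() OVER (PARTITION BY region ORDER BY amount DESC) AS rnk
FROM sales
ORDER BY region, rnk
""").fetchall()
print(rows)
```

Window functions require SQLite 3.25 or newer, which ships with current Python releases; on Teradata the equivalent query uses the same `RANK() OVER (PARTITION BY ... ORDER BY ...)` syntax.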
Refer code: 1432810. Tech Mahindra - 2024-02-05 05:11