Minimum 9 years of experience in Enterprise Data Warehouse solutioning; exposure to Big Data technology stacks such as Cloudera, HBase, Hive, Impala, Kafka and Spark, and analytics tools such as Python and R
In-depth knowledge of Teradata utilities and macros
Strong SQL analytical skills
Knowledge of Unix systems
Good to have knowledge of Control-M scheduling
Involved in business requirement gathering, requirements analysis, design, solution walkthroughs and workshops; identify gaps between the solution and the business requirements together with the business and IT teams.
Create detailed technical design document based on the requirements and High-Level Solution Design.
Basic knowledge of Spark using Scala or Python; optimize the performance of Spark applications built on the Big Data platform.
Strong analytical mindset and ability to work independently in a fast-paced, quickly changing environment
Work on and continuously improve the DevOps pipeline and tooling to actively manage the continuous integration/continuous deployment (CI/CD) processes
Experience in error handling, debugging coding issues and troubleshooting production problems.
Good to have experience with any ETL tool
Experience in Teradata, Informatica, Linux, Hadoop, Spark and Control-M
Experience working in an Agile delivery model
Preparing implementation plans, reports, manuals and other documentation on the status, operation and maintenance of Data Warehousing applications.
Refer code: 1487959. Tech Mahindra - 2024-02-11 16:51
Developer Programmer (261312) / Developer (other), recruited by Tech Mahindra in Macquarie Park, NSW. Salary range: ,001-100,000 per year. Full-time employment.