Role and responsibilities –
- Design, develop, test and support future-ready data solutions for customers across industry verticals
- Produce detailed solution designs and support data engineers and testers as a technical lead on data projects
- Develop, test and support end-to-end batch and near real-time data flows/pipelines
- Demonstrate understanding of data architectures, modern data platforms, big data, data modelling, ML/AI, analytics, cloud platforms, data governance, information management and associated technologies
- Develop and demonstrate proofs of concept and working demos
- Support and collaborate with other internal/external consultants in consulting, workshops and delivery engagements
- Mentor junior IBM consultants in the practice and delivery engagements
To ensure success in the role, you will possess the following skills –
- Must have leadership qualities applicable to the project team and beyond
- Must be able to understand and navigate a large organisation's governance and processes with ease
- Must have exceptional interpersonal skills and be able to present solution designs with confidence in large technical forums
- Minimum of 5 years' experience in a solution designer, tech lead, developer, tester or support role in IT consulting or other technology business units across the industry; Banking/FSS experience is preferable
- Minimum of 3 years' hands-on development, testing and administration experience with Big Data/Data Lake services in the cloud and on premises
- Experience in applying industry best practices, design patterns and first-of-a-kind technical solutions as developer or administrator on data and non-data specific platforms and applications
- Experience with NoSQL databases such as HBase (expert in design and development)
- Experience implementing near real-time data flows/pipelines and distributed streaming applications using technologies such as Apache Kafka (expert in design and development)
- Understanding of various data modelling concepts and techniques, such as Entity Relationship Modelling, Third Normal Form and Dimensional Modelling
- Understanding of Data Mesh Architecture and Domain Data Products
- Experience in implementing traditional ETL, data warehousing and BI solutions
- Hands-on development experience and working proficiency in Scala (expert), Python, SQL, shell scripting and other programming/scripting languages
- Hands-on experience implementing services with strict performance SLAs, high availability, fault tolerance, automatic fail-over and geographical redundancy
- Working knowledge and hands-on experience with cloud platforms such as Azure, AWS, GCP and IBM, and other modern data platforms
- Solid understanding of containerisation, virtualisation, infrastructure and networking
- Diverse experience with software development methodologies and project delivery frameworks such as Agile sprints, Kanban and Waterfall
- Degree in Computer Science, Information Technology or related Engineering courses
- One or more industry-recognised technology certifications or badges