Company

Cognizant Technology Solutions

Address: Melbourne, VIC
Category: IT

Job description

What makes Cognizant a unique place to work? The combination of rapid growth and an international and innovative environment! This is creating many opportunities for people like YOU - people with an entrepreneurial spirit who want to make a difference in this world. At Cognizant, together with your colleagues from all around the world, you will collaborate on creating solutions for the world's leading companies and help them become more flexible, more innovative, and successful. Moreover, this is your chance to be part of the success story.

Position Summary:

We are looking for a Big Data Developer who will collect, load, process, and analyse very large data sets. The primary focus will be on choosing the optimal solutions for these purposes, then implementing, maintaining, and monitoring them. You will also be responsible for integrating them with the architecture used across the company.

  • Exposure to, or an understanding of, a broad cross-section of on-premises, hybrid, and cloud data platforms, including Azure and AWS.
  • Design and engineer innovative cloud-based solutions in multi-cloud scenarios across Microsoft Azure and AWS.
  • 7+ years of demonstrated coding experience in batch frameworks and Hive/Impala (languages such as Scala, Java, Python, PL/SQL, and Unix shell).
  • Experience designing, building, and implementing the Lambda architecture in native cloud environments.
  • Expertise in creating Data Factory pipelines to ingest data from flat files of different formats.
  • Demonstrated experience analysing data stored in a data warehouse using SQL.
  • Expertise in writing stored procedures and functions to process or transform large datasets in a SQL database.
  • In-house proprietary tooling: 3+ years of working experience with JOF Job, Merlin, and the JOF Ingestion and Replication framework to develop data ingestion pipelines and deploy code artifacts.
  • Hands-on implementation of end-to-end data pipelines using Azure Data Lake, Azure Data Factory, and Azure Databricks.
  • Expertise with database and data lake services such as Azure Synapse, Azure SQL, Azure Blob Storage, and Azure Data Lake Storage Gen2.
  • Solid experience setting up advanced deployment techniques such as CI/CD using Azure DevOps, GitHub integration, Jenkins, Bamboo, etc.
  • Involvement in end-to-end project execution: requirements gathering, transforming legacy designs to the big data ecosystem, development, testing, UAT support, and go-live support.
  • Understanding and creating Tableau/Power BI reports would be an additional advantage.
  • Knowledge of Control-M to create, monitor, and schedule jobs.

Mandatory Skills:

  • Big Data PaaS services ecosystem: Hadoop, Flume, Kafka, Spark Streaming, Spark SQL, Impala, SQL DW, Kudu tables.
  • Languages: SQL, PL/SQL, Java, UNIX shell, Scala, Python, Ruby.
  • Azure stack: Azure Functions, Azure Event Hubs, Azure HDInsight, Azure Databricks, Azure VMs, Azure Data Factory, Azure Storage, Azure Data Lake, Azure Cosmos DB, Azure Key Vault, Azure SQL DW (Synapse).
  • AWS stack: Kinesis, Lambda, S3, EC2, DynamoDB, API Gateway, KMS, VPC, CloudWatch, IAM roles and policies.
  • DevOps tools: Azure DevOps, JIRA, Git, Bitbucket, Bamboo, Jenkins, Confluence.
  • Client-specific: JOF Job, Merlin, JOF Ingestion framework, Control-M.

Duties and Responsibilities:

  • Design, code, test, document, and maintain high-quality, scalable Big Data solutions on-premises or in the cloud.
  • Research, evaluate, and deploy new tools, frameworks, and design patterns to build a sustainable Big Data platform.
  • Proactively communicate and collaborate with stakeholders, product specialists, and data architects to capture functional and non-functional requirements.
  • Make accurate development effort estimates to assist management in project planning.
  • Migrate on-premises workloads to the cloud and debug cloud stacks.
  • Design and build CI/CD and/or DevOps pipelines, and improve existing workflows through deployment automation.
  • Respond to technical issues in a professional and timely manner.
  • Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.

Salary Range: >$100,000

Next Steps: If you feel this opportunity suits you, or Cognizant is the type of organization you would like to join, we want to have a conversation with you! Please apply directly with us. For a complete list of open opportunities with Cognizant, visit http://www.cognizant.com/careers. Cognizant is committed to providing Equal Employment Opportunities. Successful candidates will be required to undergo a background check.

#LI-CTSAPAC

Refer code: 1427706. Cognizant Technology Solutions - The previous day - 2024-02-04 12:11
