GCP Architect at Cognizant

Posted in Information Technology 7 days ago.

Type: Full-Time
Location: Richardson, Texas


Job Description:

We are Cognizant Artificial Intelligence

Digital technologies, including analytics and AI, give companies a once-in-a-generation opportunity to perform orders of magnitude better than before. But to seize it, clients need new business models built by analyzing customers and business operations from every angle.

With the power to apply artificial intelligence and data science to business decisions via enterprise data management solutions, we help leading companies prototype, refine, validate, and scale the most desirable products and delivery models to enterprise scale within weeks.

*You must be legally authorized to work in the United States without the need for employer sponsorship, now or at any time in the future.*

Job Title – GCP Architect

Location – Richardson, TX (remote until COVID restrictions lift)

Roles/Responsibilities:


  • Conduct design reviews and create architecture blueprints, choosing the right stack on GCP.

  • Experience in large-scale data platform design and implementation on GCP, preferably covering both batch and streaming workloads.

  • Experience architecting, developing, or maintaining secure cloud solutions (e.g., Google Cloud Platform).

  • Provide technical expertise in compliance and security best practices as they relate to data services such as Dataflow, BigQuery, Dataproc, Cloud Functions, and Composer.

  • Be a trusted technical security advisor and resolve technical challenges for customers.

  • End-to-end implementation experience with complete warehouse/mart migrations from on-premises to Google Cloud.

  • Hands-on experience with BigQuery, Composer, Cloud Storage, Compute Engine, Cloud Functions, Cloud SQL, Bigtable, Dataproc, Dataflow, and transfer services.

  • Implementation experience building data pipelines using Cloud Composer, Kubernetes, and Dataproc, and storing analytical data in BigQuery for downstream consumption.

  • Good exposure and experience migrating on-premises data warehouses/marts to GCP using cloud-native solutions (BigQuery, Composer, Cloud Storage, and BigQuery Data Transfer Service).

  • Data ingestion and ELT design and implementation.

  • Experience defining ABC processes and ETL frameworks.

  • Code review and unit test automation.

Required Qualifications:  


  • Bachelor’s degree in Computer Science or equivalent experience.

  • Must have design and architecture experience on at least two end-to-end GCP stack projects.

  • Must have been involved in at least one DW or Hadoop migration project to GCP, using Spark programming in PySpark or Scala.

  • Must have used at least one of Dataflow, Dataproc ephemeral clusters, and BigQuery.

  • Must have worked on at least one event-processing project using Kafka or Pub/Sub.

Technical Skills

SNo  Primary Skill          Proficiency Level*  Required/Desired
1    Snowflake              PL1                 Required
2    Google Cloud Platform  PL1                 Required

 

* Proficiency Legends

Proficiency Level  Generic Reference
PL1  The associate has basic awareness and comprehension of the skill and is in the process of acquiring it through various channels.
PL2  The associate possesses working knowledge of the skill and can actively and independently apply it in engagements and projects.
PL3  The associate has comprehensive, in-depth, and specialized knowledge of the skill, and has extensively demonstrated its successful application in engagements or projects.
PL4  The associate can function as a subject matter expert for this skill, and is capable of analyzing, evaluating, and synthesizing solutions using it.