GCP Solution Architect (Data Platform)

Company: Culinovo

Job Title: GCP Solution Architect (Data Platform)

Location: New York City, NY (Onsite)

Employment Type: W2 Only

Work Authorization: U.S. Citizens / Green Card Holders only

Experience Required: 12+ years in IT

LinkedIn Profile: Mandatory

Candidates must be engaged on our W2 only.

Job Summary

We are seeking an experienced GCP Solution Architect (Data Platform) to lead the design, development, and implementation of large-scale cloud data platforms. The ideal candidate will have deep expertise in Google Cloud Platform, strong architectural experience, and hands-on exposure to modern data engineering, analytics, AI/ML, and regulatory compliance within the utilities domain.

Key Responsibilities

  • Lead end-to-end architecture, design, and delivery of scalable GCP-based data platforms.
  • Define cloud migration strategies, including re-platforming and re-architecting data workloads.
  • Architect, optimize, and implement solutions using Vertex AI, BigQuery, Dataflow, Pub/Sub, and other GCP-native services.
  • Build and support batch and streaming data pipelines, data warehouses, and analytics frameworks.
  • Integrate AI/ML solutions into data platforms to enhance intelligence and automation.
  • Collaborate with cross-functional teams to ensure cloud architecture aligns with business and security requirements.
  • Evaluate and compare Azure & Databricks capabilities to guide solution tradeoffs and ensure compatibility with GCP solutions.
  • Ensure all solutions adhere to regulatory and security standards, including NERC CIP and other utilities governance frameworks.
  • Provide architectural assessments, recommendations, technical leadership, and best practices across cloud data engineering initiatives.

Required Skills & Experience

  • 12+ years of overall IT experience with extensive cloud architecture exposure.
  • Strong hands-on experience with GCP Architecture, GCP migrations, and data platform modernization.
  • Proficiency with BigQuery, Dataflow, Pub/Sub, Vertex AI, and cloud-native data analytics services.
  • Expertise with data engineering concepts including batch/streaming pipelines, data warehouses, and distributed processing.
  • Experience with AI/ML integration within cloud ecosystems.
  • Working knowledge of Azure and Databricks, with the ability to analyze tradeoffs and design hybrid solutions.
  • Familiarity with NERC CIP, security, compliance frameworks, and governance processes for utilities.
  • Excellent communication, leadership, and stakeholder management skills.
  • LinkedIn profile is mandatory for submission.

Preferred Qualifications

  • GCP Professional Certifications (Cloud Architect, Data Engineer, Machine Learning Engineer).
  • Experience in the utilities or energy domain.
  • Strong scripting or programming knowledge (Python, SQL).