Job Overview

We are looking for a Senior DataOps Architect to help define and lead the architectural direction of our large‑scale data platforms. In this role, you will design modern, scalable data infrastructure across hybrid on‑prem and cloud environments, driving best practices in automation, reliability, and DataOps across the organization. You will partner closely with engineering and leadership teams to deliver high‑performance, secure, and globally scalable data solutions.

Key Responsibilities

- Design and implement large‑scale, high‑availability data platform solutions across hybrid environments (on‑prem + GCP primary, Azure secondary)
- Architect DataOps pipelines and automation workflows using Terraform, Ansible, and other Infrastructure‑as‑Code frameworks
- Lead the end‑to‑end design and lifecycle management of data infrastructure running across on‑premises data centers and cloud platforms
- Establish and enforce DataOps best practices, including CI/CD for data pipelines, automated testing, and data quality frameworks
- Define and guide cloud migration strategies from on‑prem to GCP, including hybrid patterns and data residency considerations
- Build and optimize data infrastructure leveraging Kubernetes, container orchestration, and service mesh technologies
- Develop high‑quality technical and architectural documentation for data infrastructure and DataOps workflows
- Evaluate and introduce modern tooling, platforms, and methodologies to improve reliability, efficiency, and developer velocity
- Mentor engineering teams on DataOps principles, cloud‑native architectures, and infrastructure automation best practices
- Collaborate with platform, data, and security teams to ensure governance, compliance, and operational excellence
- Stay up to date on DataOps, cloud, and data infrastructure technologies, proactively recommending improvements

Basic Qualifications

- Bachelor's degree in Computer Science, Data Engineering, or a related technical field
- 10+ years of experience in Infrastructure, DevOps, or DataOps roles
- Proven experience architecting and operating on‑prem data infrastructure and hybrid cloud environments
- Deep expertise with GCP data‑related services (BigQuery, Dataflow, Composer, GKE, Cloud Storage, IAM) – 5+ years
- Strong hands‑on background with on‑prem technologies, including bare metal, virtualization (VMware, KVM), storage, and networking – 5+ years
- Expert‑level proficiency with Kubernetes, Helm, and service mesh frameworks (Istio, Linkerd) in production environments
- Experience with modern orchestration tools such as Apache Airflow, Prefect, or Dagster
- Familiarity with monitoring and observability stacks (Prometheus, Grafana, ELK, Datadog)
- Experience with GitOps and CI/CD platforms (GitLab CI, GitHub Actions, Jenkins)
- Understanding of data security, compliance, and disaster recovery planning
- Advanced proficiency with Terraform, Ansible, and CloudFormation
- Strong communication skills and the ability to explain complex technical concepts to diverse stakeholders
- Experience with Azure services and hybrid connectivity technologies (ExpressRoute, VPN) is a plus
Job Title
Senior DataOps Architect (GCP)