About Position:
We are seeking GCP Data Engineers with Python, BigQuery, SQL, Composer, and Cloud Run experience.

- Role: GCP Data Engineer
- Location: All Persistent Locations
- Experience: 7 to 15 Years
- Job Type: Full Time Employment

What You'll Do:
- Design, build, and maintain scalable, event-driven, and batch data pipelines using Python, SQL, BigQuery, and big data technologies (Spark, Dataflow, etc.).
- Partner with data scientists and ML engineers to build, deploy, and evolve services that encapsulate DS/ML models at scale.
- Design and implement data algorithm pipelines that process millions of products and requests efficiently and rapidly.
- Architect, build, and manage robust data models and infrastructure on Google Cloud Platform (GCP), with a heavy focus on BigQuery, Dataflow, Pub/Sub, and GKE.
- Build and manage the CI/CD, orchestration (Kubernetes, Airflow), and infrastructure for ML systems.
- Work cross-functionally with product management, data scientists, and analysts to understand problems and design solutions.
- Champion and implement best practices for data quality, reliability, and performance on petabyte-scale datasets.
- Implement and advocate for engineering best practices, helping level up other engineers on the team.

Expertise You'll Bring:
- 7 to 15 years of hands-on experience in data engineering or backend software engineering.
- Expertise in Python for data processing (e.g., Pandas, Spark) and/or backend development.
- Advanced SQL skills and extensive experience with cloud data warehouses such as BigQuery.
- Proven, deep hands-on experience designing and operating scalable data solutions on Google Cloud Platform (GCP), with expertise in services such as BigQuery, Dataflow, Pub/Sub, GCS, Dataproc, and Vertex AI.
- Experience with big data technologies (Spark, Hadoop, MapReduce) and messaging systems (Kafka, Pub/Sub).
- Production experience with containerized development (Docker) and orchestration (Kubernetes).
- Experience building or maintaining infrastructure for ML systems and integrating models into production services.
- Ability to understand and make trade-offs between different technologies and patterns.
- Excellent communication skills and the ability to work effectively with engineers, product managers, and business stakeholders.
- Experience leveraging modern IDEs and AI-assisted development tools (e.g., Cursor, GitHub Copilot) to accelerate development cycles.

Benefits:
- Competitive salary and benefits package
- Culture focused on talent development, with quarterly growth opportunities and company-sponsored higher education and certifications
- Opportunity to work with cutting-edge technologies
- Employee engagement initiatives such as project parties, flexible work hours, and Long Service awards
- Annual health check-ups
- Insurance coverage: group term life, personal accident, and Mediclaim hospitalization for self, spouse, two children, and parents

Values-Driven, People-Centric & Inclusive Work Environment:
Persistent is dedicated to fostering diversity and inclusion in the workplace. We invite applications from all qualified individuals, including those with disabilities, and regardless of gender or gender preference. We welcome diverse candidates from all backgrounds.

- We support hybrid work and flexible hours to fit diverse lifestyles.
- Our office is accessibility-friendly, with ergonomic setups and assistive technologies to support employees with physical disabilities.
- If you are a person with disabilities and have specific requirements, please inform us during the application process or at any time during your employment.

Let's unleash your full potential at Persistent - /careers

"Persistent is an Equal Opportunity Employer and prohibits discrimination and harassment of any kind."