Job Title: GCP Data Engineer
Company: TELUS Digital
Location: Mumbai, Maharashtra
Created: 2025-05-07
Job Type: Full Time

Job Description

Who We Are:

We are a Digital Customer Experience organization with comprehensive coverage of IT services, from traditional services to next-gen digital services. At TELUS International, we focus on lean, agile, human-centered design. We have been in the technology business since 2002, with headquarters in California, USA. TELUS International also invests in R&D, where innovators, researchers, and visionaries collaborate to explore emerging customer experience technology to disrupt the future. We are about 70,000 employees working across 35 delivery centers in Asia, Europe, and North America, with nearshore locations in Central America and Canada. We are focused on enabling digital transformation for our customers by driving innovation and automation through self-service options such as AI bots and robotic process automation, delivering hyper-personalized, secure, on-demand, and elastic solutions. Our workforce is connected to drive customer experience in the Media & Communications, Travel & Hospitality, eCommerce, Technology, Fintech & Financial Services, and Healthcare domains.

How We Help You Grow:

Our development programs are designed to promote technical growth and to enhance leadership and relationship skills across individuals. A vast array of in-house training programs supports your career growth; see the list at the end of this posting.

Job Title: Sr. GCP Data Engineer (SAS to GCP Migration)
Work Mode: Hybrid/Remote
Years of Experience: 7-10 years
Shift Time: 3 PM-12 AM IST
Notice Period: Immediate to 15 days preferred; immediate joiners preferred

As a Sr. Data Engineer with a focus on pipeline migration from SAS to Google Cloud Platform (GCP) technologies, you will tackle intricate problems and create value for our business by designing and deploying reliable, scalable solutions tailored to the company's data landscape.
You will be responsible for developing custom-built data pipelines on the GCP stack, ensuring seamless migration of existing SAS pipelines.

Responsibilities:

- Design, develop, and implement data pipelines on the GCP stack, with a focus on migrating existing pipelines from SAS to GCP technologies.
- Develop modular and reusable code to support complex ingestion frameworks, simplifying the process of loading data into data lakes or data warehouses from multiple sources (a minimal sketch follows this list).
- Collaborate with analysts and business process owners to translate business requirements into technical solutions.
- Use your coding expertise in scripting languages (Python, SQL, PySpark) to extract, manipulate, and process data effectively.
- Leverage your expertise in GCP technologies, including BigQuery, Dataproc, GCP Workflows, Dataflow, Cloud Scheduler, Secret Manager, Batch, Cloud Logging, Cloud SDK, Google Cloud Storage, IAM, and Vertex AI, to enhance data warehousing solutions.
- Maintain high standards of development practice, including technical design, solution development, systems configuration, testing, documentation, and issue identification and resolution, writing clean, modular, and sustainable code.
- Understand and implement CI/CD processes using tools such as Pulumi, GitHub, Cloud Build, Cloud SDK, and Docker.
- Participate in data quality and validation processes to ensure data integrity and reliability.
- Optimize the performance of data pipelines and storage solutions, addressing bottlenecks.
- Collaborate with security teams to ensure compliance with industry standards for data security and governance.
- Communicate technical solutions to engineering teams and business stakeholders.
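For illustration only, here is a minimal sketch of the kind of modular ingestion step described above, assuming a Dataproc cluster with the spark-bigquery connector available; every bucket, project, dataset, and column name below is a hypothetical placeholder, not an actual TELUS resource.

from pyspark.sql import SparkSession, functions as F

def load_source(spark, path):
    # Read raw CSV files landed in Cloud Storage (schema inferred for brevity).
    return spark.read.option("header", True).csv(path)

def transform(df):
    # Example cleanup step: trim a key column and stamp the load time.
    return (
        df.withColumn("customer_id", F.trim(F.col("customer_id")))
          .withColumn("load_ts", F.current_timestamp())
    )

def write_to_bigquery(df, table, staging_bucket):
    # Write through the spark-bigquery connector's indirect (GCS-staged) mode.
    (
        df.write.format("bigquery")
          .option("table", table)
          .option("temporaryGcsBucket", staging_bucket)
          .mode("append")
          .save()
    )

if __name__ == "__main__":
    spark = SparkSession.builder.appName("sas-migration-ingest").getOrCreate()
    raw = load_source(spark, "gs://example-landing-bucket/customers/*.csv")
    write_to_bigquery(transform(raw),
                      "example-project.analytics.customers",
                      "example-staging-bucket")

Keeping each step in a small, parameterized function is what makes such a framework reusable: the same read/transform/write skeleton can be configured per source when porting individual SAS jobs.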
Required Skills & Qualifications:

- 7-10 years of experience in software development, data engineering, business intelligence, or a related field, with a proven track record of manipulating, processing, and extracting value from large datasets.
- Extensive experience with GCP technologies in the data warehousing space, including BigQuery, Dataproc, GCP Workflows, Dataflow, Cloud Scheduler, Secret Manager, Batch, Cloud Logging, Cloud SDK, Google Cloud Storage, IAM, and Vertex AI.
- Proficiency in Python, SQL, and PySpark for data manipulation and pipeline creation.
- Experience with SAS, SQL Server, and SSIS is a significant advantage, particularly for transitioning legacy systems to modern GCP solutions.
- Ability to develop reusable, modular code for complex ingestion frameworks and multi-use pipelines.
- Understanding of CI/CD processes and tools such as Pulumi, GitHub, Cloud Build, Cloud SDK, and Docker.
- Proven experience migrating data pipelines from SAS to GCP technologies.
- Strong problem-solving abilities and a proactive approach to identifying and implementing solutions.
- Familiarity with industry best practices for data security, data governance, and compliance in cloud environments.
- Bachelor's degree in Computer Science, Information Technology, or a related technical field, or equivalent practical experience.
- GCP Certified Data Engineer certification (preferred).
- Excellent verbal and written communication skills, with the ability to advocate for technical solutions to a diverse audience of engineering teams and business stakeholders.
- Willingness to work the afternoon shift from 3 PM to 12 AM IST.

How will this opportunity be a catalyst in your career graph?

To stimulate your career growth, we offer a vast array of in-house training programs, including but not limited to:

- Trending technical skills
- Business domain & customer interaction
- Behavioral & effective communication

You will also find a transparent work culture that lifts your ideas and initiatives to the enterprise level, with investment to execute them successfully.

Equal Opportunity Employer:

At TELUS International, we are proud to be an equal opportunity employer and are committed to creating a diverse and inclusive workplace. All aspects of employment, including the decision to hire and promote, are based on applicants' qualifications, merits, competence, and performance, without regard to any characteristic related to diversity.