Job Title


Senior Data Engineer [T500-25114]


Company : Costco IT


Location : Hyderabad, Telangana


Created : 2026-04-12


Job Type : Full Time


Job Description

About Costco Wholesale:
Costco Wholesale is a multi-billion-dollar global retailer with warehouse club operations in eleven countries. They provide a wide selection of quality merchandise, plus the convenience of specialty departments and exclusive member services, all designed to make shopping a pleasurable experience for their members.

About Costco Wholesale India:
At Costco Wholesale India, we foster a collaborative space, working to support Costco Wholesale in developing innovative solutions that improve members’ experiences and make employees’ jobs easier. Our employees play a key role in driving and delivering innovation to establish IT as a core competitive advantage for Costco Wholesale.

Overview:
The Sr. Data Engineer is responsible for leading the development of data pipelines and/or data integrations for Costco’s enterprise certified data sets, which are used for business-critical data consumption use cases (e.g. Marketing & Ad Technology, Reporting, Data Science/Machine Learning, Data APIs). The Sr. Data Engineer will partner with product owners, data architects, and data platform teams to strategize, design, build, test, and automate data pipelines that are relied upon across the company as the single source of truth.
Provide mentoring and guidance to more junior engineers and assist in the career growth of the team.

Responsibilities:
- Own the design, development, and maintenance of ongoing data pipelines that drive key business decisions across Martech/AdTech solutions for web, mobile, and backend systems.
- Collaborate with product, data, and engineering teams to deliver scalable and reliable data-based solutions for performant ads & marketing experiences.
- Contribute to the data architecture and reporting roadmap of the Martech/AdTech stack.
- Recognize and adopt best data engineering practices in data integrity, test design, analysis, validation, and documentation.
- Ensure security, compliance, and data quality across integrations and data flows.
- Identify root causes of data issues and develop solutions; troubleshoot and resolve issues related to latency, data quality, and accuracy.
- Develop and operationalize data pipelines to bring data into Costco’s GCP landscape for the delivery of certified data sets.
- Work in leadership roles with data architects, data stewards, and data quality engineers to design data pipelines and recommend ongoing optimization of data storage, data ingestion, data quality, and orchestration.
- Set strategy for data pipelines and transformations and communicate with leadership on more complex subjects.
- Design, develop, and implement ETL/ELT/CDC processes using Data Build Tool (DBT) and other native GCP services (BigQuery Subscriptions, Dataproc, Dataflow, etc.).
- Utilize GCP services such as BigQuery, AlloyDB, Spanner, Dataplex, Pub/Sub, Cloud Storage, etc. to improve and speed delivery of our data products and services.
- Identify, design, and implement internal process improvements, automating manual processes and optimizing pipeline delivery and support.
- Identify ways to improve data reliability, efficiency, and quality of data management.
- Participate in off-hours 24/7 on-call support on a rotational basis.

Technical Skills:
- 8+ years’ experience engineering and operationalizing data pipelines with large and complex datasets.
- Hands-on experience with AdTech or Martech platforms (DSP, DMP, CDP, CRM, tracking tools, etc.).
- 2+ years’ hands-on experience with Data Build Tool (DBT), Dataflow, and Dataproc (Spark).
- 3+ years’ experience working with BigQuery, Google Cloud Storage, AlloyDB, and Spanner.
- 6+ years’ experience with data pipelines, ETL/ELT, and data warehousing.
- Effective use of AI and/or LLMs to increase efficiency in deliverables.
- Extensive experience working with various data sources (DB2, SQL, Oracle, flat files (CSV, delimited), APIs, XML, JSON).
- Experience implementing data integration techniques such as event/message-based integration (Kafka, Google Pub/Sub).
- Advanced SQL skills; solid understanding of relational databases and business data; ability to write complex SQL queries against a variety of data sources.
- Strong understanding of database storage concepts (data lake, relational databases, NoSQL, graph, data warehousing).
- Experience with Git / Azure DevOps / Jira.
- CDMP (Certified Data Management Professional) Certification.