
Job Title


Data Engineer [T500-25102]


Company : Costco IT


Location : Hyderabad, Telangana


Created : 2026-04-10


Job Type : Full Time


Job Description

About Costco Wholesale:
Costco Wholesale is a multi-billion-dollar global retailer with warehouse club operations in eleven countries. It provides a wide selection of quality merchandise, plus the convenience of specialty departments and exclusive member services, all designed to make shopping a pleasurable experience for its members.

About Costco Wholesale India:
At Costco Wholesale India, we foster a collaborative space, working to support Costco Wholesale in developing innovative solutions that improve members' experiences and make employees' jobs easier. Our employees play a key role in driving and delivering innovation to establish IT as a core competitive advantage for Costco Wholesale.

Position Summary:
Data Engineers are responsible for developing and operationalizing data pipelines and integrations to make data available for consumption (e.g., reporting, data science/machine learning, data APIs). This includes data ingestion, data transformation, data validation/quality, data pipeline optimization, and orchestration, as well as deploying code to production via CI/CD. The Data Engineer role requires knowledge of software development/programming methodologies, various data sources (relational databases, flat files (CSV, delimited), APIs, XML, JSON, etc.), and data access (SQL, Python, etc.), along with expertise in data modeling, cloud architectures/platforms, data warehousing, and data lakes. This role also partners closely with Product Owners, Data Architects, Platform/DevOps Engineers, and others to design, build, test, implement, and maintain data pipelines.

Roles and Responsibilities:
- Own the design, development, and maintenance of ongoing data pipelines to drive key business decisions across Martech/AdTech solutions for web, mobile, and backend systems.
- Collaborate with product, data, and engineering teams to deliver scalable and reliable data-based solutions for performant ads and marketing experiences.
- Contribute to the data architecture and reporting roadmap of the Martech/AdTech stack.
- Recognize and adopt best data engineering practices in data integrity, test design, analysis, validation, and documentation.
- Ensure security, compliance, and data quality across integrations and data flows.
- Identify root causes of data issues and develop solutions; troubleshoot and resolve issues related to latency, data quality, and accuracy.

This position will be filled onsite in Hyderabad, India.

Job Duties/Essential Functions:
- Develops complex SQL and Python against a variety of data sources.
- Implements streaming data pipelines using event/message-based architectures.
- Communicates technical concepts to non-technical audiences, in both written and verbal form.
- Works in tandem with Data Architects to align on data architecture requirements provided by the requestor.
- Defines and maintains optimal data pipeline architecture.
- Identifies, designs, and implements internal process improvements: automating manual processes and optimizing data delivery/orchestration.
- Demonstrates strong understanding of coding and programming concepts used to build data pipelines (e.g., data transformation, data quality, data integration).
- Analyzes data to spot anomalies and trends, and correlates data to ensure data quality.
- Develops data pipelines to store data in defined data models/structures.
- Demonstrates strong understanding of data integration techniques and tools, e.g., Extract, Transform, Load (ETL) and Extract, Load, Transform (ELT).
- Demonstrates strong understanding of database storage concepts (data lakes, relational databases, NoSQL, graph, data warehousing).
- Identifies ways to improve data reliability, efficiency, and quality of data management.
- Performs peer review of other Data Engineers' work.

Required Skills:
- 2+ years of experience in Data Engineering.
- Hands-on experience with AdTech or Martech platforms (DSP, DMP, CDP, CRM, tracking tools, etc.).
- Proficiency in programming (e.g., Python, Java, or similar).
- Experience with data integration, APIs, and cloud infrastructure (GCP/AWS/Azure).
- Strong collaboration, problem-solving, and communication skills.
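For candidates unfamiliar with the Extract, Transform, Load (ETL) pattern the duties reference, here is a minimal sketch in Python. It is an illustration only, not Costco's actual stack: all names (member_id, spend, the in-memory "warehouse" dict) are hypothetical, and a production pipeline would replace each stage with real connectors (databases, APIs, message streams) and an orchestrator.

```python
# Hypothetical ETL sketch: extract rows from delimited text, transform
# (validate + normalize, a basic data-quality step), load into an
# in-memory "warehouse" table keyed by member_id.
import csv
import io

def extract(source: str) -> list[dict]:
    """Extract: parse delimited text into raw records."""
    return list(csv.DictReader(io.StringIO(source)))

def transform(rows: list[dict]) -> list[dict]:
    """Transform: drop invalid rows and normalize types."""
    clean = []
    for row in rows:
        if not row.get("member_id"):  # basic validation rule
            continue
        clean.append({"member_id": row["member_id"],
                      "spend": round(float(row["spend"]), 2)})
    return clean

def load(rows: list[dict], warehouse: dict) -> None:
    """Load: upsert records into the target table."""
    for row in rows:
        warehouse[row["member_id"]] = row

raw = "member_id,spend\n1001,25.5\n,9.99\n1002,40.13\n"
warehouse: dict[str, dict] = {}
load(transform(extract(raw)), warehouse)
# The row with a missing member_id is filtered out in the transform step.
```

An ELT variant would simply load the raw records first and run the transform inside the target store (e.g., as SQL), which is why the posting names both patterns.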