
Job Title : Consultant - Technical - Data

Company : Value Creed

Location : Belgaum, Karnataka

Created : 2025-12-19

Job Type : Full Time

Job Description

We are looking for a Data Engineer with strong end-to-end experience in building and optimizing data solutions within cloud environments. The ideal candidate will have expertise in designing ETL pipelines using Python, SQL, and modern ETL tools such as Azure Data Factory and AWS Glue. You should have a solid background working with cloud platforms like Azure and AWS, and experience with data warehousing and data lakehouse technologies such as Databricks and Snowflake. A critical part of the role involves building scalable data products for consumption and creating actionable insights through data visualization tools like Power BI or Tableau. The successful candidate will work closely with business stakeholders to understand data needs, build reliable pipelines, and ensure data is readily available for analytics and reporting, all while focusing on scalability, performance, and security.

Location : Hyderabad

Experience : 2-4 years

On Job Responsibilities

- Design, implement, and optimize end-to-end data pipelines for ingesting, processing, and transforming large volumes of structured and unstructured data.
- Develop robust ETL processes to integrate data from diverse sources into the data ecosystem.
- Implement data validation and quality checks to ensure accuracy and consistency.
- Create and maintain interactive dashboards and reports using Power BI or Tableau to provide meaningful insights to stakeholders.
- Design and maintain data models, schemas, and database structures to support analytical and operational use cases.
- Optimize data storage and retrieval mechanisms for performance and scalability.
- Evaluate and implement data storage solutions, including relational databases, NoSQL databases, data lakes, and cloud storage services.
- Design, deploy, and manage scalable cloud infrastructure on Azure Cloud or AWS.
- Implement data engineering pipelines using Databricks for large-scale data processing and machine learning applications.
- Configure and manage data infrastructure components, including databases, data warehouses, and data lakes.
- Monitor system performance, troubleshoot issues, and implement optimizations to enhance reliability and efficiency.
- Develop scripts and queries using Python and SQL to automate data extraction, transformation, and loading (ETL) processes (see the sketch after this list).
- Work with Snowflake to build and manage data platforms for seamless data integration and querying across cloud services.
- Build and maintain integrations with internal and external data sources and APIs.
- Implement RESTful APIs and web services for data access and consumption.
- Ensure compatibility and interoperability between different systems and platforms.
- Collaborate with data scientists, analysts, and stakeholders to understand data requirements and deliver tailored solutions.
- Document technical designs, workflows, and best practices to facilitate knowledge sharing and maintain system documentation.
- Provide technical guidance and support to team members and stakeholders as needed.
- Optimize data solutions for performance, scalability, and efficiency.
- Implement data security controls and access management policies to protect sensitive information.
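To give candidates a concrete sense of the Python and SQL scripting mentioned above, here is a minimal ETL sketch. It is an illustration only, not Value Creed's actual tooling: the file path, table name, connection string, and column names are hypothetical, and pandas with SQLAlchemy stands in for whichever service (Azure Data Factory, AWS Glue, Databricks) a real pipeline would run on.

    # Minimal ETL sketch (illustrative only; all paths, tables, and columns are hypothetical).
    import pandas as pd
    from sqlalchemy import create_engine

    SOURCE_CSV = "exports/orders_2025_06.csv"   # hypothetical raw export
    TARGET_DB = "sqlite:///warehouse.db"        # hypothetical target; a real role would point at Snowflake/Databricks
    TARGET_TABLE = "stg_orders"

    def extract(path: str) -> pd.DataFrame:
        # Read the raw export into a DataFrame, parsing dates up front.
        return pd.read_csv(path, parse_dates=["order_date"])

    def transform(df: pd.DataFrame) -> pd.DataFrame:
        # Basic cleaning: de-duplicate, standardise text, derive a reporting column.
        df = df.drop_duplicates(subset="order_id")
        df["customer_name"] = df["customer_name"].str.strip().str.title()
        df["order_month"] = df["order_date"].dt.to_period("M").astype(str)
        # Fail fast if a required key is missing, rather than loading bad rows.
        if df["order_id"].isna().any():
            raise ValueError("order_id contains nulls; aborting load")
        return df

    def load(df: pd.DataFrame, conn_str: str, table: str) -> None:
        # Replace the staging table with the cleaned data.
        engine = create_engine(conn_str)
        df.to_sql(table, engine, if_exists="replace", index=False)

    if __name__ == "__main__":
        load(transform(extract(SOURCE_CSV)), TARGET_DB, TARGET_TABLE)

In a cloud deployment the same extract/transform/load split would typically map onto managed services, with orchestration handled by Azure Data Factory or AWS Glue and the heavier transform logic running in Databricks or Snowflake.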
Required Skills & Qualifications

- Bachelor's degree in computer science, engineering, information systems, or a related field; Master's degree preferred.
- 2+ years of experience in data engineering or data analysis roles.
- Proficiency in Power BI or Tableau for data visualization.
- Experience with data warehouse and database design concepts, including schema design and optimization.
- Hands-on experience with Azure Cloud, AWS, and Databricks for data processing and infrastructure management.
- Strong programming skills in Python and SQL for data manipulation and analysis (a brief illustration follows this list).
- Experience with Snowflake or other cloud-based data platforms.
- Excellent problem-solving and analytical skills with the ability to work in a fast-paced environment.
- Strong communication skills to work effectively with technical and non-technical stakeholders.
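In the same spirit, a minimal sketch of combining SQL and Python for the "data validation and quality checks" responsibility described earlier. The warehouse connection, table, columns, and checks are all assumptions for illustration, not a real schema.

    # Minimal data-quality check sketch (illustrative only; schema and checks are hypothetical).
    import pandas as pd
    from sqlalchemy import create_engine, text

    engine = create_engine("sqlite:///warehouse.db")  # hypothetical warehouse connection

    # Each check returns a count that should be 0 when the data is healthy.
    CHECKS = {
        "null_order_ids": "SELECT COUNT(*) FROM stg_orders WHERE order_id IS NULL",
        "duplicate_order_ids": (
            "SELECT COUNT(*) FROM ("
            "SELECT order_id FROM stg_orders GROUP BY order_id HAVING COUNT(*) > 1) AS dupes"
        ),
        "negative_amounts": "SELECT COUNT(*) FROM stg_orders WHERE amount < 0",
    }

    def run_checks() -> pd.DataFrame:
        # Execute each SQL check and collect the count of offending rows.
        results = []
        with engine.connect() as conn:
            for name, query in CHECKS.items():
                failures = conn.execute(text(query)).scalar()
                results.append({"check": name, "failures": failures, "passed": failures == 0})
        return pd.DataFrame(results)

    if __name__ == "__main__":
        report = run_checks()
        print(report.to_string(index=False))
        if not report["passed"].all():
            raise SystemExit("Data-quality checks failed")

Checks like these are typically scheduled alongside the load step so that bad data is caught before it reaches dashboards in Power BI or Tableau.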