Job Description & Summary:
A career within Data and Analytics services will provide you with the opportunity to help organisations uncover enterprise insights and drive business results using smarter data analytics. We focus on a collection of organisational technology capabilities, including business intelligence, data management, and data assurance, that help our clients drive innovation, growth, and change within their organisations to keep up with the changing nature of customers and technology. We make impactful decisions by mixing mind and machine to leverage data, understand and navigate risk, and help our clients gain a competitive edge.

Job Position: Manager_Data Architect_Data and Analytics_Advisory_Bangalore

About the Role:
We are hiring a sharp, hands-on Data Architect to lead the design and implementation of scalable, high-performance data solutions across both traditional and cloud-based data platforms. This role demands deep expertise in PySpark, SQL, Python, and data modelling, along with a strong understanding of cloud platforms and modern data engineering practices.

What you will do:
- Architect, design, and implement end-to-end data solutions, ensuring scalability, performance, and cost-efficiency.
- Build and deploy batch and near-real-time use cases in cloud environments.
- Develop PySpark and Python scripts for large-scale data processing and ETL workflows.
- Write optimized, complex SQL for data transformation and analysis.
- Optimize existing PySpark and SQL scripts over large-scale datasets (TBs) with a focus on performance and cost-efficiency.
- Create and maintain data models, ensuring data quality and consistency.
- Leverage AI/ML models in data transformations and analytics.
- Implement data governance and security best practices in cloud environments.
- Collaborate across teams to translate business requirements into robust technical solutions.

'Must have' primary skills and experience:
- 7+ years of hands-on experience in data engineering
- Strong command of SQL, Python, and PySpark for data manipulation and analysis
- Deep experience with data, analytics, and warehousing implementations in cloud environments (Azure/AWS)
- Proficiency in data modeling techniques for cloud-based systems (Databricks, Snowflake)
- Solid understanding of ETL/ELT processes and best practices in cloud architectures
- Experience with dimensional modeling, star schemas, and data mart design
- Performance optimization techniques for cloud-based data warehouses
- Strong analytical thinking and problem-solving skills

Secondary skills:
- Airflow (workflow design and orchestration)
- Apache Kafka (real-time streaming)
- CI/CD (automation, GitOps, DevOps for data)
- Understanding of warehousing tools such as Teradata, Netezza, etc.

'Good to have' knowledge, skills and experience:
- Familiarity with data lake architectures and Delta Lake concepts
- Data warehouse experience using Databricks/Snowflake
- Knowledge of data warehouse migration strategies to the cloud
- Experience with real-time data streaming technologies (e.g., Apache Kafka, Azure Event Hubs)
- Exposure to data quality and data governance tools and methodologies
- Certifications in Azure, AWS, or Databricks

Experience: 7-10 years

Certifications:
- Spark Certified
- Databricks DE Associate/Professional Certified

Good to have:
- Snowflake SnowPro Core Certified

Education qualification: BE, B.Tech, ME, M.Tech, MBA, MCA (60% and above)
Job Title: Data Architect