For one of our international clients, we are seeking an experienced Project Manager with exposure to Databricks and modern data platforms to lead the delivery of data engineering, analytics, and AI initiatives within a cloud-based environment.

Project Details:
- Start: ASAP
- Duration: 6 months + possible extension
- Location: Hyderabad, India (Remote)
- Language: English

Key Skills & Responsibilities:
- 5 to 8+ years of project management experience (Scrum/Kanban/Hybrid), preferably in data engineering, analytics, or ML programs.
- Strong understanding of the Databricks ecosystem, including Apache Spark, Delta Lake, MLflow, Unity Catalog, and Databricks SQL.
- Experience with cloud platforms (AWS, Azure, or GCP) and familiarity with modern data stack tools such as dbt, Airflow, Azure Data Factory, Kafka/Event Hubs, or Fivetran.
- Manage end-to-end delivery of data platform and analytics projects using Agile methodologies.
- Coordinate with data engineering, architecture, and security teams to deliver Databricks Lakehouse solutions.
- Oversee project planning, stakeholder management, risk management, reporting, and cost optimization (DBU/cloud usage).

Certifications (nice to have): PMP, CSM/PSM, SAFe, Databricks (Lakehouse Fundamentals, Data Engineering Associate/Professional), Azure/AWS/GCP cloud certifications.

If you are interested in the role, please send your updated CV to
Job Title: Project Manager - Databricks & Data Platforms