Job Title : Databricks Engineer


Company : DataE2E Technologies


Location : Kolhapur, Maharashtra


Created : 2025-07-23


Job Type : Full Time


Job Description

Job Title: Databricks Engineer (Remote)
Location: Remote
Job Type: Full-Time

About the Role:
We are looking for an experienced Databricks Engineer with a strong background in data engineering to help build and optimize scalable, high-performance data solutions. The ideal candidate has hands-on experience in Databricks production environments and a deep understanding of modern data architecture. You'll work with cross-functional teams to create robust data pipelines and ensure the reliability, quality, and observability of data across platforms.

Key Responsibilities:
- Design, build, and manage large-scale data pipelines and ETL/ELT workflows in Databricks using PySpark/Scala.
- Develop scalable data solutions using Python and SQL across diverse cloud environments (AWS, Azure, or GCP).
- Implement and optimize data lakes, data warehouses, and data mesh architectures using modern storage formats like Parquet, Avro, and Delta Lake.
- Ensure data governance, observability, and quality across all stages of the data lifecycle.
- Collaborate with data architects, analysts, and DevOps teams to drive architecture design, pipeline performance, and deployment practices.
- Optionally, utilize Docker and Kubernetes for containerized data engineering workflows.

Required Qualifications:
- 5+ years of experience in data engineering, including 3+ years in Databricks production environments.
- Strong hands-on expertise in Python, SQL, and Apache Spark (PySpark/Scala).
- Experience working with cloud platforms (AWS, Azure, or GCP) and building cloud-native data pipelines.
- Deep understanding of data lake, data warehouse, and data mesh principles.
- Proficiency with file formats like Parquet, Avro, and Delta Lake.
- Familiarity with containerization and orchestration tools (Docker, Kubernetes) is a plus.
- Solid grasp of data quality, observability, and governance best practices.