Role: Databricks Developer with Python
Location: Bangalore
Experience: 3-5 yrs
Notice Period: Immediate to 15 days

Mandatory Skills:
· Around 5 years of experience as a Databricks/PySpark developer
· Strong Python programming skills, a solid understanding of Apache Spark concepts (DataFrames, Spark SQL, and MLlib), and familiarity with cloud platforms
· 1 to 3 days per week working from the Bengaluru office

Job Description Summary:

Design and Development:
• Design, develop, and deploy ETL processes on Databricks for data integration and transformation.
• Implement and optimize Spark jobs, data transformations, and data processing workflows.
• Build and optimize data pipelines, architectures, and datasets.

Collaboration:
• Collaborate with data engineers, architects, and team members to ensure efficient data processing and storage.

Best Practices:
• Implement best practices for data pipelines, including monitoring, logging, and error handling.

Tools and Technologies:
• Proficient in using Databricks, Azure Data Factory, and other Azure services.
• Experience with PySpark and SQL for ETL logic (a short illustrative sketch appears at the end of this posting).
• Familiarity with Apache Spark and Delta Lake.

Other Skills:
· Experience with data warehousing concepts and data modeling.
· Hands-on experience creating stored procedures, functions, tables, and cursors.
· Experience in database testing, data comparison, and data transformation scripting.
· Capable of troubleshooting common database issues.
· Familiar with tools that can aid in profiling server resource usage and optimizing it.
· Hands-on experience with GitLab, with an understanding of CI/CD pipelines and DevOps tools.
Job Title: Software Engineer
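
To give candidates a sense of the kind of ETL work described above, here is a minimal PySpark sketch that reads raw files, applies basic cleansing, and writes a Delta table. It is only an illustration under assumptions: the paths, column names, and target table are hypothetical, and a Databricks environment with built-in Delta Lake support is assumed.

# Minimal illustrative ETL sketch (hypothetical paths, columns, and table names).
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-etl").getOrCreate()

# Extract: read raw CSV files from cloud storage (path is illustrative).
raw = (
    spark.read
    .option("header", "true")
    .option("inferSchema", "true")
    .csv("/mnt/raw/orders/")
)

# Transform: basic cleansing and typing with the DataFrame API.
cleaned = (
    raw.dropDuplicates(["order_id"])
    .filter(F.col("order_amount") > 0)
    .withColumn("order_date", F.to_date(F.col("order_date"), "yyyy-MM-dd"))
)

# Load: write the result as a Delta table for downstream consumption
# (assumes the "analytics" schema already exists).
(
    cleaned.write
    .format("delta")
    .mode("overwrite")
    .saveAsTable("analytics.orders_cleaned")
)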