Job Title - Scala Developer

Job Description Details

Role - Senior Developer
Required Technical Skill Set - Spark/Scala/Unix
Desired Experience Range - 5-8 years
Location of Requirement - Pune

Desired Competencies (Technical/Behavioral Competency)

Must-Have (ideally should not be more than 3-5):
- Minimum 4+ years of experience in Spark/Scala development
- Experience in designing and developing Big Data solutions using Hadoop ecosystem components such as HDFS, Spark, Hive, the Parquet file format, YARN, MapReduce, and Sqoop
- Good experience in writing and optimizing Spark jobs and Spark SQL; should have worked on both batch and streaming data processing
- Experience in writing and optimizing complex Hive and SQL queries to process large volumes of data; good with UDFs, tables, joins, views, etc.
- Experience in debugging Spark code
- Working knowledge of basic UNIX commands and shell scripting
- Experience with Autosys and Gradle

Good-to-Have:
- Good analytical and debugging skills
- Ability to coordinate with SMEs and stakeholders, manage timelines and escalations, and provide on-time status updates
- Write clear and precise documentation/specifications
- Work in an agile environment
- Create documentation and document all developed mappings

Responsibility of / Expectations from the Role:
1. Create Scala/Spark jobs for data transformation and aggregation (an illustrative sketch follows at the end of this description)
2. Produce unit tests for Spark transformations and helper methods (a test sketch also follows below)
3. Write Scaladoc-style documentation with all code
4. Design data processing pipelines
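As a rough illustration of responsibilities 1, 3, and 4 (a documented Scala/Spark job that transforms and aggregates data), the sketch below reads a hypothetical Parquet dataset of orders, sums order amounts per customer per day, and writes the result back to HDFS. The object name, paths, schema, and column names (customer_id, order_ts, amount) are illustrative assumptions, not part of the role description.

```scala
import org.apache.spark.sql.{DataFrame, SparkSession}
import org.apache.spark.sql.functions.{col, sum, to_date}

/** Illustrative Spark job: reads a Parquet dataset, filters and aggregates it,
  * and writes the result back as Parquet. Paths and column names are
  * hypothetical placeholders.
  */
object DailySalesAggregation {

  /** Aggregates order amounts per customer per day.
    *
    * @param orders input DataFrame assumed to contain `customer_id`,
    *               `order_ts` (timestamp), and `amount` columns
    * @return one row per (customer_id, order_date) with the summed amount
    */
  def aggregateDailySales(orders: DataFrame): DataFrame =
    orders
      .filter(col("amount") > 0)                          // drop refunds / bad rows
      .withColumn("order_date", to_date(col("order_ts"))) // truncate timestamp to date
      .groupBy(col("customer_id"), col("order_date"))
      .agg(sum(col("amount")).as("daily_amount"))

  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("daily-sales-aggregation")
      .getOrCreate()

    // Hypothetical HDFS locations; in practice these would come from job config.
    val orders = spark.read.parquet("hdfs:///data/raw/orders")
    val result = aggregateDailySales(orders)

    result.write.mode("overwrite").parquet("hdfs:///data/curated/daily_sales")
    spark.stop()
  }
}
```

Keeping the transformation in a helper method that takes and returns a DataFrame, rather than inlining it in main, is what makes the unit testing expected in responsibility 2 straightforward.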
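For responsibility 2 (unit tests for Spark transformations and helper methods), a minimal sketch is shown below. It assumes ScalaTest as the test framework and runs the aggregateDailySales helper from the previous sketch against a local in-memory SparkSession; the test data and expected values are illustrative.

```scala
import java.sql.Timestamp

import org.apache.spark.sql.SparkSession
import org.scalatest.funsuite.AnyFunSuite

/** Minimal unit test for the aggregation helper sketched above, executed
  * against a local SparkSession. ScalaTest is assumed; any test framework
  * with equivalent assertions would work the same way.
  */
class DailySalesAggregationTest extends AnyFunSuite {

  private val spark = SparkSession.builder()
    .appName("daily-sales-aggregation-test")
    .master("local[2]")
    .getOrCreate()

  import spark.implicits._

  test("sums positive amounts per customer per day and drops non-positive rows") {
    val orders = Seq(
      ("c1", Timestamp.valueOf("2024-01-01 10:00:00"), 10.0),
      ("c1", Timestamp.valueOf("2024-01-01 15:30:00"), 5.0),
      ("c1", Timestamp.valueOf("2024-01-01 16:00:00"), -3.0), // refund, filtered out
      ("c2", Timestamp.valueOf("2024-01-02 09:00:00"), 7.5)
    ).toDF("customer_id", "order_ts", "amount")

    val result = DailySalesAggregation.aggregateDailySales(orders)
      .collect()
      .map(r => (r.getString(0), r.getDate(1).toString, r.getDouble(2)))
      .toSet

    assert(result == Set(("c1", "2024-01-01", 15.0), ("c2", "2024-01-02", 7.5)))
  }
}
```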