Position Requirements:
- 4+ years of experience using data integration tools such as Pentaho or other ETL/ELT tools.
- 4+ years of experience using traditional databases such as Postgres, MSSQL, or Oracle.
- 1+ years of experience using columnar databases such as Vertica, Google BigQuery, or Amazon Redshift.
- 1+ years of experience with scheduler/orchestration tools such as Control-M, Autosys, Airflow, or JAMS.
- Good conceptual knowledge of ETL/ELT strategies.
- Good conceptual knowledge of code versioning tools.
- Good collaboration, communication, and documentation skills.
- Experience working in an Agile delivery model.
- Requires minimal or no direct supervision.

Desirable:
- Good knowledge of data visualization tools such as Tableau or Pentaho BA tools; digital marketing/web analytics or business intelligence experience is a plus.
- Knowledge of scripting languages such as Python.
- Experience in a Linux environment is preferred but not mandatory.

Roles & Responsibilities:
- Develop and support multiple data engineering projects with heterogeneous data sources; produce/consume data to/from messaging queues such as Kafka; push/pull data to/from REST APIs.
- Support the in-house-built Data Integration Framework, Data Replication Framework, and Data Profiling & Reconciliation Framework.
- Develop data pipelines with good coding standards and unit testing with detailed test cases.
- Willingness to learn new technologies.
Job Title: Data Engineer