
Job Title


Data Analytics Engineer


Company : techjays


Location : Coimbatore, Tamil Nadu


Created : 2025-06-13


Job Type : Full Time


Job Description

About the Job

At Techjays, we are driving the future of artificial intelligence with a bold mission: to empower businesses worldwide by helping them build AI solutions that transform industries. As an established leader in the AI space, we combine deep expertise with a collaborative, agile approach to deliver impactful technology that drives meaningful change.

Our global team consists of professionals who have honed their skills at leading companies such as Google, Akamai, NetApp, ADP, Cognizant Consulting, and Capgemini. With engineering teams across the globe, we deliver tailored AI software and services to clients ranging from startups to large-scale enterprises.

Be part of a company that’s pushing the boundaries of digital transformation. At Techjays, you’ll work on exciting projects that redefine industries, innovate with the latest technologies, and contribute to solutions that make a real-world impact. Join us on our journey to shape the future with AI.

We are seeking a Data Analytics Engineer to design, develop, and optimize data pipelines and analytical dashboards that drive key business decisions.
The ideal candidate will have hands-on experience with BI tools such as Power BI and Tableau, and a strong background in building scalable data pipelines in AWS, GCP, or Azure cloud environments.

Experience Range : 3 to 6 years

Primary Skills : Power BI, Tableau, SQL, Data Modeling, Data Warehousing, ETL/ELT Pipelines, AWS Glue, AWS Redshift, GCP BigQuery, Azure Data Factory, Cloud Data Pipelines, DAX, Data Visualization, Dashboard Development

Secondary Skills : Python, dbt, Apache Airflow, Git, CI/CD, DevOps for Data, Snowflake, Azure Synapse, Data Governance, Data Lineage, Apache Beam, Data Catalogs, Basic Machine Learning Concepts

Work Location : Remote

Key Responsibilities:
- Develop and maintain scalable, robust ETL/ELT data pipelines across structured and semi-structured data sources.
- Collaborate with data scientists, analysts, and business stakeholders to identify data requirements and transform them into efficient data models.
- Design and deliver interactive dashboards and reports using Power BI and Tableau.
- Implement data quality checks, lineage tracking, and monitoring solutions to ensure high reliability of data pipelines.
- Optimize SQL queries and BI reports for performance and scalability.
- Work with cloud-native tools in AWS (e.g., Glue, Redshift, S3), GCP (e.g., BigQuery, Dataflow), or Azure (e.g., Data Factory, Synapse).
- Automate data integration and visualization workflows.

Required Qualifications:
- Bachelor’s or Master’s degree in Computer Science, Information Systems, Data Science, or a related field.
- 3+ years of experience in data engineering or data analytics roles.
- Proven experience with Power BI and Tableau, including dashboard design, DAX, calculated fields, and data blending.
- Proficiency in SQL and experience in data modeling and relational database design.
- Hands-on experience with data pipelines and orchestration using tools like Airflow, dbt, Apache Beam, or native cloud tools.
- Experience working with one or more cloud platforms: AWS, GCP, or Azure.
- Strong understanding of data warehousing concepts and tools such as Snowflake, BigQuery, Redshift, or Synapse.

Preferred Skills:
- Experience with scripting in Python or Java for data processing.
- Familiarity with Git, CI/CD, and DevOps for data pipelines.
- Exposure to data governance, lineage, and catalog tools.
- Basic understanding of ML pipelines or advanced analytics is a plus.

What We Offer:
- Competitive salary and benefits.
- Opportunity to work with a modern cloud-native data stack.
- Collaborative, innovative, and data-driven work environment.
- Flexible working hours and remote work options.