Job Title


Backend Automation Engineer – Python (Web/Data Extraction) (2 to 5 yrs)


Company : AIMLEAP


Location : Kanpur, Uttar Pradesh


Created : 2026-04-23


Job Type : Full Time


Job Description

Backend Automation Engineer – Python (Web/Data Extraction) (WFH)

Experience : 2–5 Years
Location : Remote (Work from Home)
Mode of Engagement : Full-time
Number of Positions : 8
Educational Qualification : Bachelor's degree in Computer Science, IT, or a related field
Industry : IT / Software Services / Data & AI
Notice Period : Immediate joiners preferred

What We Are Looking For

- Strong hands-on experience handling dynamic, JavaScript-heavy websites using Selenium, Playwright, or Puppeteer.
- Expertise in managing cookies, sessions, and local storage to maintain state and bypass authentication/anti-bot systems.
- Ability to solve CAPTCHAs programmatically using third-party or AI-based solutions.
- Proven experience with proxy rotation, IP management, and fingerprinting techniques to avoid detection and rate limits.
- Ability to design scalable data pipeline architectures that automate extraction, validation, transformation, and storage.

Responsibilities

- Develop and maintain high-scale automated scraping workflows for dynamic and protected websites.
- Implement browser automation using Selenium, Playwright, Puppeteer, or similar frameworks for complex user flows and asynchronous rendering.
- Integrate CAPTCHA-solving services, proxy rotation systems, and advanced anti-detection mechanisms.
- Build robust ETL-style data pipelines with data-quality checks, monitoring, retries, and error handling.
- Collaborate with AI, data engineering, and product teams to deliver reliable scraping datasets.

Qualifications

- 2–5 years of experience in web scraping and automation using Python or JavaScript.
- Strong experience with Selenium, Playwright, Puppeteer, and browser automation.
- Proficiency in Python (Requests, BeautifulSoup, Scrapy, async frameworks) or JavaScript/Node.js for automation workflows.
- Hands-on experience with proxy networks, fingerprinting, session handling, and anti-bot strategies.
- Understanding of SQL/NoSQL databases for structured data storage.
- Experience with AWS/GCP/Azure is a plus.
- Strong debugging, analytical, and problem-solving skills.
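To illustrate the "managing cookies, sessions, and local storage to maintain state" requirement: browser frameworks offer this natively (Playwright, for instance, can serialize a context's storage state), but the underlying idea is simply persisting state between runs so a scraper can resume an authenticated session instead of logging in again. A minimal sketch with plain JSON, where the field names are illustrative rather than any framework's format:

```python
import json
import os

def save_session_state(path, cookies, local_storage):
    """Persist cookies and local-storage entries to disk so a later
    run can resume the session rather than re-authenticating."""
    with open(path, "w") as f:
        json.dump({"cookies": cookies, "local_storage": local_storage}, f)

def load_session_state(path):
    """Load a previously saved session, or return an empty state
    if no snapshot exists yet (first run)."""
    if not os.path.exists(path):
        return {"cookies": {}, "local_storage": {}}
    with open(path) as f:
        return json.load(f)
```

A real implementation would restore these values into the browser context (or a `requests.Session`) before navigating, and refresh the snapshot after each successful run.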
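The proxy-rotation requirement can be sketched as a small round-robin pool that retires endpoints after repeated failures. This is a minimal illustration, not a production design, and the proxy addresses are placeholders:

```python
import itertools

class ProxyPool:
    """Round-robin proxy rotation with simple failure tracking.

    Proxies that fail more than max_failures times are skipped,
    so requests stop being routed through dead exit IPs.
    """

    def __init__(self, proxies, max_failures=3):
        self._failures = {p: 0 for p in proxies}
        self._cycle = itertools.cycle(list(proxies))
        self._max_failures = max_failures

    def get(self):
        # One full pass over the cycle is enough to see every proxy.
        for _ in range(len(self._failures)):
            proxy = next(self._cycle)
            if self._failures[proxy] < self._max_failures:
                return proxy
        raise RuntimeError("no healthy proxies left")

    def report_failure(self, proxy):
        self._failures[proxy] += 1

# Placeholder endpoints; a real pool would come from a proxy provider.
pool = ProxyPool(["http://10.0.0.1:8080", "http://10.0.0.2:8080"])
```

Each outgoing request would call `pool.get()` for its proxy and `pool.report_failure(...)` on a connection error or ban response; fingerprint rotation and per-proxy rate limiting would layer on top of the same structure.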
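The "retries and error handling" expected of the ETL-style pipelines can be reduced to a small exponential-backoff helper. A sketch (the injectable `sleep` parameter is a testing convenience, not part of any named library):

```python
import time

def with_retries(step, attempts=3, base_delay=1.0, sleep=time.sleep):
    """Run a pipeline step, retrying with exponential backoff.

    Re-raises the last exception once all attempts are exhausted,
    so failures still surface to monitoring.
    """
    for attempt in range(attempts):
        try:
            return step()
        except Exception:
            if attempt == attempts - 1:
                raise
            # Back off base_delay, 2x, 4x, ... between attempts.
            sleep(base_delay * (2 ** attempt))
```

In a real pipeline each stage (extract, validate, transform, load) would be wrapped this way, typically with the exception types narrowed to transient errors (timeouts, HTTP 5xx) so that permanent failures fail fast.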