AIMLEAP

Data Crawling Analyst (2 to 5 yrs)

3-5 Years

Job Description

Data Crawling Analyst

Experience: 2-5 Years

Location: Remote (Work from Home)

Mode of Engagement: Full-time

No. of Positions: 8

Educational Qualification: Bachelor's degree in Computer Science, IT, or related field

Industry: IT / Software Services / Data & AI

Notice Period: Immediate Joiners Preferred

What We Are Looking For

  • Strong hands-on experience handling dynamic, JavaScript-heavy websites using Selenium, Playwright, or Puppeteer (illustrated in the sketch after this list).
  • Expertise in managing cookies, sessions, and local storage to maintain state and bypass authentication/anti-bot systems.
  • Ability to solve CAPTCHAs programmatically using third-party or AI-based solutions.
  • Proven experience in proxy rotation, IP management, and fingerprinting techniques to avoid detection and rate limits.
  • Capability to design scalable data pipeline architectures to automate extraction, validation, transformation, and storage.
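
For context, the first two points above might look like the minimal Playwright sketch below, which loads a JavaScript-rendered page, waits for content to appear, and persists cookies and local storage so a session can be reused across runs. The URL, selector, and state-file path are placeholders for illustration, not part of this role's actual stack.

```python
# Illustrative only: URL, selector, and state-file path are placeholders.
import os
from playwright.sync_api import sync_playwright

def scrape_dynamic_page(url: str, state_file: str = "auth_state.json") -> list[str]:
    with sync_playwright() as p:
        browser = p.chromium.launch(headless=True)
        # Reuse saved cookies/local storage if a previous run persisted them.
        context = browser.new_context(
            storage_state=state_file if os.path.exists(state_file) else None
        )
        page = context.new_page()
        page.goto(url, wait_until="networkidle")  # let JS-driven requests settle
        page.wait_for_selector(".listing-card")   # placeholder selector for rendered content
        titles = page.locator(".listing-card h2").all_inner_texts()
        context.storage_state(path=state_file)    # persist cookies + local storage
        browser.close()
        return titles

if __name__ == "__main__":
    print(scrape_dynamic_page("https://example.com/listings"))
```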

Responsibilities

  • Develop and maintain high-scale automated scraping workflows for dynamic and protected websites.
  • Implement browser automation solutions with Playwright/Selenium/Puppeteer for complex user flows and asynchronous rendering.
  • Integrate CAPTCHA-solving services, proxy rotation systems, and advanced anti-detection mechanisms.
  • Build robust ETL-style data pipelines ensuring data quality, monitoring, retries, and error handling (a skeleton is sketched after this list).
  • Collaborate with AI, data engineering, and product teams to deliver reliable scraping datasets.
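
As a rough illustration of the pipeline responsibility above, the skeleton below runs an extract, validate, transform, and load flow with retries and logging. The function names, retry policy, and sample row are assumptions for illustration, not AIMLEAP's actual pipeline.

```python
# Illustrative ETL-style skeleton; names, retry policy, and sample data are assumptions.
import logging
import time

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("pipeline")

def with_retries(fn, attempts: int = 3, backoff: float = 2.0):
    """Run fn(), retrying with exponential backoff on failure."""
    for attempt in range(1, attempts + 1):
        try:
            return fn()
        except Exception as exc:
            log.warning("attempt %d/%d failed: %s", attempt, attempts, exc)
            if attempt == attempts:
                raise
            time.sleep(backoff ** attempt)

def extract() -> list[dict]:
    # Placeholder: would call the scraping layer (Playwright/Scrapy/etc.).
    return [{"url": "https://example.com/item/1", "price": "19.99"}]

def validate(rows: list[dict]) -> list[dict]:
    # Drop rows missing required fields and log how many survived.
    valid = [r for r in rows if r.get("url") and r.get("price")]
    log.info("validated %d/%d rows", len(valid), len(rows))
    return valid

def transform(rows: list[dict]) -> list[dict]:
    return [{**r, "price": float(r["price"])} for r in rows]

def load(rows: list[dict]) -> None:
    # Placeholder: would write to SQL/NoSQL storage.
    log.info("loaded %d rows", len(rows))

def run() -> None:
    rows = with_retries(extract)
    load(transform(validate(rows)))

if __name__ == "__main__":
    run()
```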

Qualifications

  • 3-5 years of experience in Python-based web scraping and automation.
  • Strong experience in Selenium, Playwright, Puppeteer, and browser automation.
  • Fluent in Python with experience using Requests, BeautifulSoup, Async frameworks, or Scrapy.
  • Hands-on experience with proxy networks, fingerprinting, session handling, and anti-bot strategies (see the proxy-rotation sketch after this list).
  • Understanding of SQL/NoSQL databases for structured data storage.
  • Experience working with AWS/GCP/Azure is a plus.
  • Strong debugging, analytical, and problem-solving skills.
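
For reference on the proxy and session items above, a simple rotation pattern with the Requests library is sketched below; the proxy endpoints and User-Agent header are placeholders, and a production setup would normally sit behind a managed proxy pool.

```python
# Illustrative proxy-rotation sketch; proxy endpoints and headers are placeholders.
import itertools
import requests

PROXIES = [
    "http://user:pass@proxy1.example.com:8000",
    "http://user:pass@proxy2.example.com:8000",
]
proxy_cycle = itertools.cycle(PROXIES)

def fetch(url: str, retries: int = 3) -> requests.Response:
    """Fetch a URL, rotating to the next proxy after each failed attempt."""
    last_error = None
    for _ in range(retries):
        proxy = next(proxy_cycle)
        try:
            resp = requests.get(
                url,
                proxies={"http": proxy, "https": proxy},
                headers={"User-Agent": "Mozilla/5.0"},  # basic fingerprint hygiene
                timeout=10,
            )
            resp.raise_for_status()
            return resp
        except requests.RequestException as exc:
            last_error = exc
    raise RuntimeError(f"all proxies failed for {url}") from last_error
```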

Job ID: 137583787