
All European Careers

Senior Python Developer - Data Pipelines - Full Remote - Contractor (USD)

  • Posted 3 days ago

Job Description

For an international project in Chennai, we are urgently looking for a full remote Senior Python Developer to build and maintain reliable data pipelines and analytics workflows: ingesting data from multiple internal and external sources, transforming it into clean, analysis-ready datasets, and validating quality end to end. The work supports reporting, monitoring, and advanced analytics.

We are looking for a motivated contractor. Candidates need to be fluent in English.

Tasks and responsibilities:

  • Design, build, and maintain data ingestion pipelines to collect information from APIs, databases, cloud storage, web services, and other internal/external data sources;
  • Develop data transformation processes for cleaning, transforming, aggregating, and harmonizing data into analytics-ready formats;
  • Implement data validation and quality assurance checks to ensure data accuracy, completeness, and consistency;
  • Automate data extraction and loading (ETL/ELT) processes using Python and SQL-based frameworks (a minimal sketch follows this list);
  • Collaborate with analysts, report developers, the product owner, and business users to define data requirements and ensure smooth integration with analytics platforms;
  • Optimize data pipelines for scalability, reliability, and performance;
  • Maintain data documentation, metadata, and version control for all data assets.
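
As a concrete illustration of these responsibilities, the sketch below shows a minimal, hypothetical pipeline in this stack: extracting records from a REST API with requests, cleaning them with pandas, running basic quality checks, and loading the result into a SQL table via sqlalchemy. The URL, connection string, table name, and column names are placeholders, not project specifics.

    # Minimal illustrative pipeline: extract from a REST API, transform with
    # pandas, validate, and load into a SQL table. URL, connection string,
    # table and column names are hypothetical placeholders.
    import pandas as pd
    import requests
    from sqlalchemy import create_engine

    API_URL = "https://api.example.com/v1/records"      # placeholder source
    DB_URL = "postgresql://user:pass@dbhost/analytics"  # placeholder target

    def extract() -> pd.DataFrame:
        # Ingest raw records from an external REST API (assumes a JSON list).
        resp = requests.get(API_URL, timeout=30)
        resp.raise_for_status()
        return pd.DataFrame(resp.json())

    def transform(df: pd.DataFrame) -> pd.DataFrame:
        # Clean and harmonize into an analysis-ready format.
        df = df.drop_duplicates(subset="id")
        df["created_at"] = pd.to_datetime(df["created_at"], utc=True)
        df["amount"] = pd.to_numeric(df["amount"], errors="coerce")
        return df.dropna(subset=["id", "created_at"])

    def validate(df: pd.DataFrame) -> pd.DataFrame:
        # Basic quality checks: accuracy, completeness, consistency.
        assert df["id"].is_unique, "duplicate ids after transform"
        assert df["amount"].dropna().ge(0).all(), "negative amounts found"
        return df

    def load(df: pd.DataFrame) -> None:
        # Append the validated dataset to the warehouse table.
        engine = create_engine(DB_URL)
        df.to_sql("records_clean", engine, if_exists="append", index=False)

    if __name__ == "__main__":
        load(validate(transform(extract())))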

Profile:

  • Bachelor's or Master's degree;
  • 6+ years of relevant experience as a Python developer;
  • Strong proficiency in Python, including libraries such as pandas, numpy, sqlalchemy, and requests;
  • Expertise in data extraction and integration (REST APIs, SQL/NoSQL databases, file systems, cloud data sources);
  • Solid understanding of data transformation, cleaning, and validation techniques;
  • Strong hands-on SQL skills, including query optimization and performance tuning;
  • Experience with ETL orchestration tools (e.g., Airflow, ADF) and workflow automation (a minimal DAG sketch follows this list);
  • Familiarity with data warehouse and lakehouse technologies (Oracle, Azure Lakehouse, Apache Iceberg, PostgreSQL);
  • Experience implementing data validation and quality checks (e.g., dbt tests);
  • Experience with version control systems (Git) and CI/CD pipelines;
  • Knowledge of containerization (Docker) and cloud platforms (Azure);
  • Knowledge of Dremio and PyArrow;
  • Knowledge of ADF and Databricks using PySpark;
  • Fluent in English.
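
By way of illustration, the sketch below shows how a pipeline like the one above might be scheduled with Airflow, one of the orchestration tools named in this profile. It assumes Airflow 2.4+ and a hypothetical pipeline module exposing the functions from the earlier sketch; the DAG name and schedule are placeholders.

    # Minimal Airflow DAG sketch (Airflow 2.4+ syntax) scheduling the
    # hypothetical pipeline functions from the sketch above.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    from pipeline import extract, transform, validate, load  # hypothetical module

    def run_pipeline():
        load(validate(transform(extract())))

    with DAG(
        dag_id="records_daily_etl",  # placeholder name
        start_date=datetime(2024, 1, 1),
        schedule="@daily",
        catchup=False,
    ):
        PythonOperator(task_id="run_pipeline", python_callable=run_pipeline)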


Job ID: 140869383