
S&P Global Market Intelligence

Senior Data Engineer (GCP, Python)

5-10 Years
  • Posted 9 hours ago
  • Over 50 applicants

Job Description

Responsibilities

  • Design, develop, and maintain scalable ETL/ELT pipelines for high-volume data processing
  • Optimize and automate data ingestion, transformation, and storage workflows
  • Handle both structured and unstructured data sources, ensuring data quality and consistency
  • Develop and maintain data models, data warehouses, and databases
  • Collaborate with cross-functional teams to support and enable data-driven decision-making
  • Ensure data security, privacy, and compliance with industry and regulatory standards
  • Troubleshoot and resolve data-related issues promptly and efficiently
  • Monitor and enhance system performance, reliability, and scalability
  • Stay up-to-date with emerging data technologies and recommend improvements to data architecture and engineering practices
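The ETL/ELT pipeline work described above can be sketched in miniature with the Python standard library. This is an illustrative sketch only; the source data, table name, and column names are assumptions for the example, not details from the posting.

```python
import csv
import io
import sqlite3

# Illustrative source data; a real pipeline would ingest from GCS, BigQuery, an API, etc.
RAW_CSV = """id,name,amount
1,alice,10.5
2,bob,
3,carol,7.25
"""

def extract(raw: str) -> list[dict]:
    """Extract: parse the raw CSV into rows."""
    return list(csv.DictReader(io.StringIO(raw)))

def transform(rows: list[dict]) -> list[tuple]:
    """Transform: enforce a simple data-quality rule (drop rows missing an amount)."""
    return [
        (int(r["id"]), r["name"], float(r["amount"]))
        for r in rows
        if r["amount"]
    ]

def load(records: list[tuple], conn: sqlite3.Connection) -> None:
    """Load: write the clean records to a warehouse-style table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS sales (id INTEGER, name TEXT, amount REAL)"
    )
    conn.executemany("INSERT INTO sales VALUES (?, ?, ?)", records)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW_CSV)), conn)
count = conn.execute("SELECT COUNT(*) FROM sales").fetchone()[0]
```

The same extract/transform/load shape scales up when each stage is swapped for a managed service (e.g. Dataflow for transform, BigQuery for load) and the steps are orchestrated with a scheduler such as Cloud Composer or Airflow.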

What You Will Need

  • 5+ years of experience in data engineering, ETL development, or a related field
  • Strong programming skills in Python
  • Proficiency in SQL and experience with both relational databases (e.g., PostgreSQL, MySQL) and NoSQL databases (e.g., DynamoDB, MongoDB)
  • Proven experience building data pipelines on Google Cloud Platform (GCP) using services such as Dataflow, Cloud Batch, BigQuery, Bigtable, Cloud Functions, Cloud Workflows, and Cloud Composer
  • Solid understanding of data modeling, data warehousing, and data governance principles
  • Capability to mentor junior data engineers and assist with technical challenges
  • Familiarity with orchestration tools such as Apache Airflow
  • Experience with containerization and orchestration tools like Docker and Kubernetes
  • Proficiency with version control systems (e.g., Git) and CI/CD pipelines
  • Excellent problem-solving and communication skills
  • Ability to work effectively in a fast-paced, agile environment
  • Experience with Snowflake, big data technologies (e.g., Hadoop, Spark, Kafka), and AWS is a plus
  • Skilled at converting business requirements into technical documentation

Education and Experience

  • Bachelor's degree in Computer Science, Information Systems, Information Technology, or a related field
  • Completion of a certified development training program is a plus
  • 5+ years of hands-on experience building data pipelines using Python and GCP

More Info

Open to candidates from: Indian

Job ID: 116704093