
S&P Global Market Intelligence

Senior Data Engineer (Python, AWS)

5-10 Years
  • Posted 4 hours ago
  • Over 100 applicants

Job Description

Responsibilities

  • Design, develop, and maintain scalable ETL/ELT pipelines.
  • Optimize and automate data ingestion, transformation, and storage processes.
  • Work with structured and unstructured data sources, ensuring data quality and consistency.
  • Develop and maintain data models, warehouses, and databases.
  • Collaborate with cross-functional teams to support data-driven decision-making.
  • Ensure data security, privacy, and compliance with industry standards.
  • Troubleshoot and resolve data-related issues in a timely manner.
  • Monitor and improve system performance, reliability, and scalability.
  • Stay up-to-date with emerging data technologies and recommend improvements to our data architecture and engineering practices.

What you will need:

  • Strong programming skills in Python.
  • 5+ years of experience in data engineering, ETL development, or a related role.
  • Proficiency in SQL and experience with relational (PostgreSQL, MySQL, etc.) and NoSQL (DynamoDB, MongoDB, etc.) databases.
  • Proficiency in building data pipelines on AWS using services such as Glue, Batch, Step Functions, Lambda, SQS, SNS, and DynamoDB.
  • Strong understanding of data modeling, data warehousing, and data governance principles.
  • Ability to mentor junior data engineers and assist them with technical challenges.
  • Familiarity with orchestration tools like Apache Airflow.
  • Familiarity with containerization and orchestration (Docker, Kubernetes).
  • Experience with version control systems (Git) and CI/CD pipelines.
  • Excellent problem-solving skills and ability to work in a fast-paced environment.
  • Excellent communication skills.
  • Ability to translate business requirements into technical documentation.
  • Hands-on experience with Snowflake is a plus.
  • Experience with big data technologies (Hadoop, Spark, Kafka, etc.) is a plus.
  • Experience in GCP is a plus.

Education and Experience

  • Bachelor's degree in Computer Science, Information Systems, Information Technology, or a similar major, or a Certified Development Program.
  • 5+ years of experience building data pipelines using Python and AWS.

More Info

Job Type:
Function:
Employment Type:
Open to candidates from: India

Job ID: 115964853