Search by job, company or skills


AWS + Python Professional

  • Posted a day ago

Job Description

We are seeking a highly skilled AWS + Python Professional with a strong focus on data engineering. The ideal candidate will possess 6-7 years of experience with AWS data services (S3, Athena, CDK, Redshift) and data modeling, complemented by at least 4 years of hands-on Python development. You will be responsible for designing, developing, and maintaining robust data pipelines, integrating various data sources, and building scalable data APIs on the cloud.

Key Responsibilities:

  • AWS Data Engineering: Apply hands-on experience with AWS data services such as S3, Athena, and Redshift.
  • Leverage AWS CDK (Cloud Development Kit) for infrastructure as code.
  • Design and develop data pipelines using AWS Glue, Spark, and Python with Airflow.
  • Experience with Informatica Cloud is advantageous for ETL processes.
  • Python Development: Develop and maintain high-quality code, primarily in Python.
  • Design and develop data APIs (Python, Flask/FastAPI) to expose data on the platform, ensuring security and scalability.
  • Data Modeling & Warehousing: Apply advanced/intermediate data modeling skills (Master/Reference, ODS, Data Warehouse, Data Mart - dimension and fact tables) to enable analytics on the platform.
  • Utilize traditional data warehousing and ETL skills, including strong SQL and PL/SQL.
  • Data Integration: Handle inbound and outbound integrations on the cloud platform.
  • Load and query cloud-hosted databases such as Redshift, Snowflake, and BigQuery.
  • Collaboration & Quality Assurance: Partner with Solution Architects (SAs) to identify data inputs and related data sources, review sample data, identify gaps, and perform quality checks.
  • DevOps/DataOps (Preferred): Experience with Infrastructure as Code and setting up CI/CD pipelines.
  • Real-time Streaming (Preferred): Experience building real-time streaming data ingestion solutions.
  • System Integration (Preferred): Knowledge of system-to-system integration, messaging/queuing, and managed file transfer.
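The pipeline work described above follows the usual extract-transform-load pattern. A minimal sketch of one such step, using only the Python standard library: an in-memory SQLite database stands in for a warehouse target such as Redshift, and all names (`load_orders`, the `orders` table, the sample CSV) are illustrative, not taken from the posting.

```python
import csv
import io
import sqlite3

def load_orders(raw_csv: str, conn: sqlite3.Connection) -> int:
    """Extract rows from CSV text, normalize amounts, and load them."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS orders (order_id TEXT, amount REAL)"
    )
    rows = []
    for record in csv.DictReader(io.StringIO(raw_csv)):
        # Transform: coerce the amount to a float, skipping malformed rows.
        try:
            rows.append((record["order_id"], float(record["amount"])))
        except (KeyError, ValueError):
            continue
    conn.executemany("INSERT INTO orders VALUES (?, ?)", rows)
    return len(rows)

conn = sqlite3.connect(":memory:")
raw = "order_id,amount\nA1,10.5\nA2,not-a-number\nA3,4.0\n"
loaded = load_orders(raw, conn)
print(loaded)  # 2 — the malformed row is skipped
```

In a production pipeline the same extract/transform/load stages would typically be expressed as Glue jobs or Airflow tasks rather than a single function, but the shape of the step is the same.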

Required Skills & Experience:

  • Total Years of Experience: 6-7 years.
  • Relevant Experience: AWS (6-7 years): S3, Athena, CDK, Redshift.
  • Data Modeling: Strong understanding of dimension and fact tables.
  • Python (4 years): Hands-on development.
  • Mandatory Skills: AWS, Python, Flask (implying API development expertise).
  • Cloud Platforms: Thorough understanding of AWS from a data engineering and tools standpoint. Experience in another cloud (Azure, GCP) is beneficial.
  • ETL Tools: Experience in AWS Glue, Spark, and Python with Airflow.
  • Data Modeling: Advanced/Intermediate data modeling skills.
  • Database Skills: Strong SQL and PL/SQL skills.
  • Data Loading/Querying: Experience with Redshift, Snowflake, BigQuery.
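The dimension- and fact-table modeling called for above can be illustrated with a minimal star-schema sketch. SQLite again stands in for a warehouse such as Redshift, and every table and column name here is hypothetical.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# Dimension table: descriptive attributes, one row per product.
conn.execute(
    "CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT)"
)
# Fact table: numeric measures keyed to the dimension.
conn.execute(
    "CREATE TABLE fact_sales (product_key INTEGER, qty INTEGER, "
    "FOREIGN KEY (product_key) REFERENCES dim_product (product_key))"
)
conn.executemany("INSERT INTO dim_product VALUES (?, ?)",
                 [(1, "widget"), (2, "gadget")])
conn.executemany("INSERT INTO fact_sales VALUES (?, ?)",
                 [(1, 3), (1, 2), (2, 7)])
# Typical analytic query: aggregate the facts, label them via the dimension.
totals = conn.execute(
    "SELECT d.name, SUM(f.qty) FROM fact_sales f "
    "JOIN dim_product d ON d.product_key = f.product_key "
    "GROUP BY d.name ORDER BY d.name"
).fetchall()
print(totals)  # [('gadget', 7), ('widget', 5)]
```

The same join-and-aggregate shape is what dashboards and data marts run against fact tables at scale.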

Additional Information:

  • Vendor Billing Rate: INR 10,500 per day.
  • Background Check: Post onboarding.
  • Notice Period: 15 days maximum.

More Info

Open to candidates from: India

About Company

At Clifyx, aligning great talent with clients' needs is at the core of who we are. We are passionate about our Consultants, our Clients, and our MSP partners. Our rich experience, combined with our unyielding care for our employees, is the driving force behind all we do. And we deliver! Our 24x7 global service delivery drives time, cost, and risk out of any process or project, providing you with best-possible business outcomes and best-fit talent on demand - when, where, and how you need it.

Job ID: 118659083