Coforge

AWS Data Architect

  • Posted 5 months ago

Job Description

Job Title: AWS Data Architect

Skills: Python, Airflow, AWS services: S3, Spark (Glue, EMR), Kafka (SQS, EventBridge), Integration (AppFlow, APIs), DW concepts & data modeling experience

Experience Required: 12 - 15 years

Job Location: Hyderabad Only

We at Coforge are hiring AWS Data Architects with the following skill set:

  • Lead Data Architect with a strong background in AWS, Python, and data engineering.
  • Lead a team of data engineers and architects, providing technical guidance and mentorship.
  • Your expertise will shape our data strategy, ensuring efficient data processing, storage, and analytics.
  • Develop and execute a strategic roadmap for data processing, storage, and analytics in alignment with organizational goals.
  • Candidates should possess a deep understanding of AWS cloud services and data architecture, with a proven track record of leading data-driven projects to successful completion.
  • Design, implement, and maintain robust data pipelines using Python and Airflow, ensuring efficient data flow and transformation for analytical and operational purposes.
  • Utilize AWS services, including S3 for data storage, Glue and EMR for data processing, and orchestrate data workflows that are scalable, reliable, and secure.
  • Implement real-time data processing solutions using Kafka, SQS, and EventBridge, addressing high-volume data ingestion and streaming needs.
  • Oversee the integration of diverse systems and data sources through AppFlow, APIs, and other integration tools, ensuring seamless data exchange and connectivity.
  • Lead the development of data warehousing solutions, applying best practices in data modeling to support efficient data storage, retrieval, and analysis.
  • Continuously monitor, optimize, and troubleshoot data pipelines and infrastructure, ensuring optimal performance and scalability.
  • Ensure adherence to data governance, privacy, and security policies, implementing measures to protect sensitive data and comply with regulatory requirements.
  • Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
  • 10 to 15 years of experience in data engineering, with at least 3 years in a leadership role.
  • Proficient in Python programming and experience with Airflow for workflow management.
  • Strong expertise in AWS cloud services, particularly in data storage, processing, and analytics (S3, Glue, EMR, etc.).
  • Experience with real-time streaming technologies like Kafka, SQS, and EventBridge.
  • Solid understanding of API-based integrations and familiarity with integration tools such as AppFlow.
  • Deep knowledge of data warehousing concepts.
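By way of illustration of the pipeline responsibilities above, here is a minimal Python sketch of one extract-transform step of the kind such a role involves (all record names and fields are hypothetical; in production this logic would typically run as an Airflow task, consuming from Kafka and writing to S3):

```python
import json

# Hypothetical raw event records, as might be ingested from a Kafka topic.
RAW_EVENTS = [
    '{"user_id": "u1", "amount": "19.99", "currency": "usd"}',
    '{"user_id": "u2", "amount": "5.00", "currency": "EUR"}',
    'not valid json',  # malformed records are common in real streams
]

def transform(raw: str):
    """Parse one record, normalize its fields, and drop malformed input."""
    try:
        event = json.loads(raw)
    except json.JSONDecodeError:
        # In a real pipeline this might be routed to a dead-letter queue.
        return None
    return {
        "user_id": event["user_id"],
        "amount": float(event["amount"]),
        "currency": event["currency"].upper(),
    }

def run_pipeline(raw_events):
    """Transform a batch of raw records, keeping only the valid ones."""
    return [rec for rec in (transform(r) for r in raw_events) if rec is not None]

if __name__ == "__main__":
    print(run_pipeline(RAW_EVENTS))
```

The same validate-then-normalize pattern scales up in Glue or EMR jobs, where Spark applies it across partitions instead of a plain Python loop.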

Please send your CV to [Confidential Information] or WhatsApp 9667427662 for any queries.

Job ID: 127695095