
ProcDNA

AWS Data Engineer

  • Posted 3 days ago

Job Description

About ProcDNA

ProcDNA is a global consulting firm. We fuse design thinking with cutting-edge technology to create game-changing Commercial Analytics and Technology solutions for our clients. We're a passionate team of 400+ across 6 offices, all growing and learning together since our launch during the pandemic. Here, you won't be stuck in a cubicle - you'll be out in the open water, shaping the future with brilliant minds. At ProcDNA, innovation isn't just encouraged; it's ingrained in our DNA. Ready to join our epic growth journey?

What We Are Looking For

We are seeking an experienced AWS Data Engineer with 5-8 years of expertise in designing and building scalable cloud-based data solutions. Strong hands-on experience in Python, SQL, Spark, and AWS services such as S3, Glue, Redshift, and Athena is essential. Proven ability to develop robust ETL/ELT pipelines and implement efficient data warehousing solutions is required. A solid understanding of performance tuning, data modeling, and cloud best practices is expected. Excellent problem-solving skills, an ownership mindset, and effective collaboration with cross-functional teams are key to success in this role.

What You'll Do

  • Design, develop, and maintain scalable data pipelines, ETL processes, and data lake/data warehouse solutions on AWS.
  • Develop and optimize ETL/ELT workflows using Python, SQL, and Spark to process large-scale structured and semi-structured data.
  • Implement data modeling techniques (star/snowflake schema) by defining proper grain, keys, relationships, and partitioning, and by indexing and tuning queries for optimal performance.
  • Orchestrate and automate workflows using tools like Airflow or AWS-native services while ensuring reliability and monitoring.
  • Collaborate with data analysts, data scientists, and business stakeholders to translate requirements into robust data solutions.
  • Ensure data quality, security, governance, and cost optimization across cloud-based data platforms.
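The dimensional-modeling and query-tuning responsibilities above can be illustrated with a toy star schema. This is a minimal sketch using SQLite for portability; the table and column names (dim_product, fact_sales, and so on) are hypothetical, and on AWS the equivalent tables would typically live in Redshift or be queried through Athena.

```python
import sqlite3

# Toy star schema: one fact table keyed to two dimension tables.
# All names here are illustrative, not an actual ProcDNA schema.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

cur.executescript("""
CREATE TABLE dim_product (
    product_key  INTEGER PRIMARY KEY,
    product_name TEXT,
    category     TEXT
);
CREATE TABLE dim_date (
    date_key INTEGER PRIMARY KEY,  -- e.g. 20240115
    year     INTEGER,
    month    INTEGER
);
-- Grain: one row per product per day.
CREATE TABLE fact_sales (
    product_key INTEGER REFERENCES dim_product(product_key),
    date_key    INTEGER REFERENCES dim_date(date_key),
    units_sold  INTEGER,
    revenue     REAL
);
-- Index the foreign keys that drive the common joins.
CREATE INDEX ix_sales_product ON fact_sales(product_key);
CREATE INDEX ix_sales_date    ON fact_sales(date_key);
""")

cur.executemany("INSERT INTO dim_product VALUES (?, ?, ?)",
                [(1, "Widget", "Hardware"), (2, "Gadget", "Hardware")])
cur.executemany("INSERT INTO dim_date VALUES (?, ?, ?)",
                [(20240115, 2024, 1), (20240116, 2024, 1)])
cur.executemany("INSERT INTO fact_sales VALUES (?, ?, ?, ?)",
                [(1, 20240115, 10, 100.0), (2, 20240115, 5, 250.0),
                 (1, 20240116, 7, 70.0)])

# Typical rollup: revenue by category and month, joining through the dimensions.
rows = cur.execute("""
    SELECT p.category, d.year, d.month, SUM(f.revenue) AS revenue
    FROM fact_sales f
    JOIN dim_product p ON p.product_key = f.product_key
    JOIN dim_date    d ON d.date_key    = f.date_key
    GROUP BY p.category, d.year, d.month
""").fetchall()
print(rows)  # [('Hardware', 2024, 1, 420.0)]
```

The point of the star shape is exactly what the bullet describes: the fact table carries the measures at a declared grain, the dimensions carry descriptive attributes, and indexes (or, in Redshift, distribution and sort keys) sit on the join columns.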

What You'll Need

  • 5-8 years of hands-on experience in Data Engineering with strong exposure to AWS cloud environments.
  • Proven expertise in building scalable ETL/ELT pipelines using Python, SQL, and Spark.
  • Strong experience with AWS services including S3, Glue, EMR, Redshift, Athena, and Lambda.
  • Solid understanding of data warehousing concepts, dimensional modeling (star/snowflake), and CDC implementation.
  • Proficiency in query optimization, partitioning strategies, and performance tuning techniques.
  • Hands-on experience with workflow orchestration tools such as Airflow or AWS-native orchestration services.
  • Strong knowledge of IAM roles, encryption, and cloud security best practices.
  • Familiarity with CI/CD pipelines, version control (Git), and infrastructure-as-code tools like Terraform or CloudFormation.
  • Excellent communication skills and experience working in cross-functional and agile teams.
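As a concept sketch of the orchestration requirement: the dependency-ordered execution that Airflow or AWS-native services (Step Functions, Glue workflows) provide can be mimicked in a few lines with Python's standard-library graphlib. The task names below are hypothetical, and in a real pipeline each would be an Airflow operator with retries, scheduling, and monitoring attached.

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline DAG: each task maps to the set of tasks it depends on.
# In Airflow these would be operators; here we simply order and run callables.
dag = {
    "extract_raw_to_s3":   set(),
    "glue_transform":      {"extract_raw_to_s3"},
    "load_redshift":       {"glue_transform"},
    "data_quality_checks": {"load_redshift"},
}

executed = []

def run(task: str) -> None:
    # Placeholder for the real work (e.g. boto3 calls to Glue or Redshift).
    executed.append(task)

# static_order() yields tasks so that every dependency runs before its dependents.
for task in TopologicalSorter(dag).static_order():
    run(task)

print(executed)
# ['extract_raw_to_s3', 'glue_transform', 'load_redshift', 'data_quality_checks']
```

This captures only the ordering guarantee; what an orchestrator adds on top is exactly the "reliability and monitoring" the responsibilities section calls out: retries, alerting, backfills, and visibility into each run.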

Skills: Glue, Amazon Redshift, Python, Spark, SQL, data warehouse, Athena, cloud, ETL, AWS S3, AWS


Job ID: 144015697
