ProcDNA

Cloud Data Engineer

  • Posted 6 hours ago

Job Description

About ProcDNA

ProcDNA is a global consulting firm. We fuse design thinking with cutting-edge technology to create game-changing Commercial Analytics and Technology solutions for our clients. We're a passionate team of 400+ across 6 offices, all growing and learning together since our launch during the pandemic. Here, you won't be stuck in a cubicle - you'll be out in the open water, shaping the future with brilliant minds. At ProcDNA, innovation isn't just encouraged; it's ingrained in our DNA. Ready to join our epic growth journey?

What We Are Looking For

We are looking for a motivated Cloud Data Engineer with 2–4 years of experience to design, develop, and maintain scalable data pipelines and cloud-based platforms. Hands-on experience with AWS or Azure, strong programming skills in Python and SQL, and a solid understanding of ETL/ELT processes, data warehousing, and data modeling are essential. Experience with orchestration, exposure to batch or real-time processing, and familiarity with CI/CD, Git, and basic DevOps/DataOps practices are desirable. Strong communication skills, a collaborative mindset, and the ability to work effectively with cross-functional teams are key expectations.

What You'll Do

  • Design, develop, and maintain scalable ETL/ELT pipelines using Databricks and cloud platforms (AWS/Azure).
  • Implement data transformation logic, validation rules, and data quality checks to ensure accuracy, consistency, and reliability of datasets.
  • Develop and manage data models, schemas, and data warehouse structures to enable seamless data consumption for analytics and reporting.
  • Monitor, troubleshoot, and continuously optimize data pipelines for performance, scalability, reliability, and cost efficiency in cloud environments.
  • Ensure adherence to data governance, security, and compliance standards across all data processing and storage layers.
  • Collaborate closely with cross-functional teams to deliver scalable, high-quality, and business-aligned data solutions.

Must Have

  • 2–4 years of experience in a Data Engineering role with a B.Tech/BE or equivalent technical background.
  • Strong programming skills in Python and SQL with experience in building and optimizing data pipelines.
  • Hands-on experience building ETL/ELT pipelines and cloud data services (Azure or AWS).
  • Experience working with Databricks or similar distributed data processing frameworks (e.g., AWS Glue).
  • Familiarity with workflow orchestration tools like Airflow and version control systems such as Git.
  • Experience handling large datasets, implementing data quality checks, and understanding data ingestion, storage, and consumption architectures.
  • Good understanding of data quality, monitoring, and basic DevOps/DataOps practices, along with strong problem-solving and communication skills.

Skills: Cloud, AWS, ETL, Azure, Data Engineering

Job ID: 145566189