
mCaffeine

Data Engineer

  • Posted 13 hours ago
  • Be among the first 10 applicants

Job Description

Company overview:

PEP is a dynamic personal care company that proudly houses two innovative brands, mCaffeine & Hyphen. With a passion for creating high-performance, conscious, and consumer-loved products, we are redefining how personal care is experienced. While mCaffeine is India's first caffeinated personal care brand, loved for its energizing and playful approach, Hyphen is built on the philosophy of simplifying skincare through science-backed formulations. Together, our brands reflect PEP's mission to deliver quality, creativity, and care to millions of consumers. We believe in Confidence over all skin & body biases.

Come, join the pack!

About the role:

We are seeking a highly skilled Data Engineer with hands-on experience in AWS cloud services to join our team. The ideal candidate will be responsible for building and maintaining scalable data pipelines, integrating data systems, and ensuring data integrity and availability. You will work closely with data analysts and stakeholders to support the company's data-driven decision-making processes. Strong documentation skills are also essential to ensure that processes, pipelines, and systems are clearly defined and easily understood by both technical and non-technical stakeholders.

Key Responsibilities:

  • Data Pipeline Development: Design, build, and maintain efficient and scalable data pipelines to process structured and unstructured data.
  • AWS Cloud Services: Utilize AWS services such as S3, Lambda, CloudWatch, Kinesis, EMR, RDS, and EC2 to store, process, and analyze large datasets.
  • Data Integration: Collaborate with internal teams to integrate and consolidate data from various sources into a centralized data warehouse or data lake. This includes integrating third-party data sources via APIs and webhooks, and automating the process of pushing that data into internal systems.
  • ETL Processes: Develop ETL (Extract, Transform, Load) processes for data integration and transformation, leveraging AWS tools like AWS Glue or custom Python/Scala scripts.
  • Data Warehousing: Implement and manage data warehousing solutions, with experience in AWS Redshift or other cloud-based databases.
  • Automation & Optimization: Automate data processing tasks and optimize the performance of data workflows.
  • Collaboration: Work closely with data analysts and stakeholders to ensure that data solutions meet business requirements.
  • Monitoring & Maintenance: Monitor data pipelines and ensure data availability, quality, and performance. Troubleshoot issues as needed.
  • Documentation: Document data engineering processes, systems, pipelines, and workflows. Ensure all code, processes, and procedures are clearly explained and accessible for future reference or team members.
  • Best Practices: Maintain high standards for code quality and documentation to ensure ease of collaboration, support, and long-term maintainability of data systems.
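The pipeline and ETL responsibilities above centre on extract-transform-load work. As a rough illustration of the transform step such a role involves, the sketch below normalizes raw records and routes malformed ones aside; the record schema and field names are invented for the example and are not part of this posting:

```python
import json
from datetime import datetime, timezone

def transform_orders(raw_records):
    """Toy ETL transform step: normalize field names, coerce types,
    and drop records that fail basic validation."""
    clean = []
    for rec in raw_records:
        try:
            clean.append({
                "order_id": str(rec["orderId"]),
                "amount_inr": round(float(rec["amount"]), 2),
                "ordered_at": datetime.fromtimestamp(
                    int(rec["ts"]), tz=timezone.utc
                ).isoformat(),
            })
        except (KeyError, TypeError, ValueError):
            # In a real pipeline, bad records would go to a
            # dead-letter store for inspection rather than be dropped.
            continue
    return clean

if __name__ == "__main__":
    raw = [
        {"orderId": 101, "amount": "499.90", "ts": 1700000000},
        {"orderId": 102, "amount": "not-a-number", "ts": 1700000300},
    ]
    print(json.dumps(transform_orders(raw), indent=2))
```

In production the same shape of function would typically run inside an AWS Glue job or a Lambda, reading from S3 or Kinesis rather than an in-memory list.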

Qualifications and Skills:

  • Education: Bachelor's degree in Computer Science, Engineering, Mathematics, or a related field (or equivalent work experience).
  • Certifications: AWS certifications (e.g., AWS Certified Data Engineer).
  • Experience: 4+ years of experience in a data engineering role with a focus on AWS cloud services, infrastructure management, and data warehousing solutions.
  • Experience in integrating third-party systems and services via webhooks for data ingestion.
  • Experience in developing ETL processes and working with large-scale datasets.
  • Strong experience with SQL and database technologies, particularly in relational databases.
  • 2+ years of experience working with PostgreSQL databases.
  • Experience with Python, PySpark, Scala, or other programming languages for data engineering tasks.
  • Proven ability to write clear, comprehensive documentation for complex data engineering systems and workflows.
  • Version Control: Knowledge of Git or other version control systems.
  • Problem-Solving: Strong analytical and problem-solving skills.
  • Documentation Tools: Familiarity with documentation tools like Confluence, Markdown, or Jupyter Notebooks for writing clear, organized technical documentation.
  • Excellent communication skills to collaborate with cross-functional teams.
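The SQL and relational-database expectations above amount to warehouse-style aggregation queries. A minimal sketch follows, with sqlite3 standing in for PostgreSQL or Redshift; the table and data are invented for the example:

```python
import sqlite3

# sqlite3 stands in here for a relational warehouse such as
# PostgreSQL or Redshift; the schema and rows are hypothetical.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (order_id INTEGER, customer_id INTEGER, amount REAL);
    INSERT INTO orders VALUES (1, 10, 499.0), (2, 10, 251.0), (3, 11, 300.0);
""")

# Total spend per customer, highest first -- a typical
# reporting query against a centralized warehouse.
rows = conn.execute("""
    SELECT customer_id, SUM(amount) AS total_spend
    FROM orders
    GROUP BY customer_id
    ORDER BY total_spend DESC
""").fetchall()
print(rows)  # [(10, 750.0), (11, 300.0)]
conn.close()
```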

Job ID: 139213477
