
Sundew

Senior Data Engineer


Job Description

Company Overview:

Sundew is a leading digital transformation firm with an 18-year legacy of excellence. We specialize in digital strategy, application development, and engineering across the MEAN, MERN, and LAMP stacks, with PHP Laravel as our core strength.

About the Role:

We are looking for a Senior Data Engineer to take ownership of end-to-end data strategy, mentor and guide a team, and ensure seamless delivery of high-quality data pipelines, storage solutions, and analytics platforms. The role requires strong expertise in data engineering technologies, ETL development, data architecture design, Agile methodologies, and root cause analysis, along with exposure to emerging trends such as AI-driven data processing and advanced analytics.

Roles and Responsibilities:

  • Data Pipeline Development: Design, develop, and maintain scalable data pipelines to support ETL (Extract, Transform, Load) processes using tools like Apache Airflow, AWS Glue, or similar (an illustrative sketch follows this list).
  • Database Management: Design, optimize, and manage relational and NoSQL databases (such as MySQL, PostgreSQL, MongoDB, or Cassandra) to ensure high performance and scalability.
  • SQL Development: Write advanced SQL queries, stored procedures, and functions to extract, transform, and analyze large datasets efficiently.
  • Cloud Integration: Implement and manage data solutions on cloud platforms such as AWS, Azure, or Google Cloud, utilizing services like Redshift, BigQuery, or Snowflake.
  • Data Warehousing: Contribute to the design and maintenance of data warehouses and data lakes to support analytics and BI requirements.
  • Programming and Automation: Develop scripts and applications in Python or other programming languages to automate data processing tasks.
  • Data Governance: Implement data quality checks, monitoring, and governance policies to ensure data accuracy, consistency, and security.
  • Collaboration: Work closely with data scientists, analysts, and business stakeholders to understand data needs and translate them into technical solutions.
  • Performance Optimization: Identify and resolve performance bottlenecks in data systems and optimize data storage and retrieval.
  • Documentation: Maintain comprehensive documentation for data processes, pipelines, and infrastructure.
  • Stay Current: Keep up to date with the latest trends and advancements in data engineering, big data technologies, and cloud services.
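
For illustration only, and not a formal requirement of the role: the kind of pipeline work described under Data Pipeline Development might look like the minimal Apache Airflow sketch below. It assumes Airflow 2.4+ and uses a hypothetical DAG name, placeholder task logic, and in-memory sample data in place of a real source system and warehouse.

    from datetime import datetime, timedelta

    from airflow import DAG
    from airflow.operators.python import PythonOperator


    def extract(**context):
        # Placeholder: pull raw rows from a source system.
        return [{"order_id": 1, "amount": 42.0}, {"order_id": 2, "amount": -3.0}]


    def transform(**context):
        # Read the upstream task's output and apply a simple business rule.
        rows = context["ti"].xcom_pull(task_ids="extract")
        return [r for r in rows if r["amount"] > 0]


    def load(**context):
        # Placeholder: a real pipeline would write to a warehouse such as Redshift or BigQuery.
        rows = context["ti"].xcom_pull(task_ids="transform")
        print(f"Loaded {len(rows)} rows")


    with DAG(
        dag_id="daily_sales_etl",  # hypothetical pipeline name
        start_date=datetime(2024, 1, 1),
        schedule="@daily",
        catchup=False,
        default_args={"retries": 1, "retry_delay": timedelta(minutes=5)},
    ) as dag:
        extract_task = PythonOperator(task_id="extract", python_callable=extract)
        transform_task = PythonOperator(task_id="transform", python_callable=transform)
        load_task = PythonOperator(task_id="load", python_callable=load)

        extract_task >> transform_task >> load_task

The same extract-transform-load structure applies whether the orchestrator is Airflow, AWS Glue, or Luigi; only the operator and scheduling layer changes.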

Required Skills and Qualifications:

  • 4+ years of experience in Data Engineering (or equivalent).
  • Proficiency in SQL and relational databases (PostgreSQL, MySQL, etc.); an illustrative query follows this list.
  • Experience with NoSQL databases (MongoDB, Cassandra, etc.).
  • Strong programming skills in Python; familiarity with Java or Scala is a plus.
  • Experience with data pipeline tools (Apache Airflow, Luigi, or similar).
  • Expertise in cloud platforms (AWS, Azure, or Google Cloud) and data services (Redshift, BigQuery, Snowflake).
  • Knowledge of big data tools like Apache Spark, Hadoop, or Kafka is a plus.
  • Data Modeling: Experience designing and maintaining data models for relational and non-relational databases.
  • Analytical Skills: Strong analytical and problem-solving abilities with a focus on performance optimization and scalability.
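
Purely as an illustration of the SQL proficiency listed above, and not part of the qualifications themselves: a minimal Python sketch that runs a window-function query against PostgreSQL. It assumes psycopg2 is installed and a database is reachable; the orders table, its columns, and the connection string are hypothetical placeholders.

    import psycopg2  # assumes psycopg2 is installed and a PostgreSQL instance is reachable

    # Rank each customer's current-month orders by value within that customer.
    # The orders table and its columns are hypothetical.
    RANKED_ORDERS_SQL = """
        SELECT customer_id,
               order_id,
               order_total,
               RANK() OVER (PARTITION BY customer_id ORDER BY order_total DESC) AS value_rank
        FROM   orders
        WHERE  order_date >= date_trunc('month', CURRENT_DATE)
    """


    def top_orders_per_customer(dsn, top_n=3):
        """Return each customer's top-N current-month orders by value."""
        with psycopg2.connect(dsn) as conn:
            with conn.cursor() as cur:
                cur.execute(RANKED_ORDERS_SQL)
                return [row for row in cur.fetchall() if row[3] <= top_n]


    if __name__ == "__main__":
        # Placeholder connection string; replace with real credentials.
        rows = top_orders_per_customer("dbname=analytics user=etl host=localhost")
        print(f"Fetched {len(rows)} ranked orders")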

Soft Skills:

  • Excellent verbal and written communication skills to convey technical concepts to non-technical stakeholders.
  • Ability to work collaboratively in cross-functional teams.

Certifications (Preferred): AWS Certified Data Engineer, GCP Professional Data Engineer, or similar.

Mindset: Eagerness to learn new technologies and adapt quickly in a fast-paced environment.


Job ID: 139043441
