
Kfin Technologies Limited

Data Engineer

2-7 Years

This job is no longer accepting applications

  • Posted 4 months ago
  • Over 50 applicants

Job Description

We are seeking a highly skilled and experienced Senior Data Engineer to design, develop, and optimize scalable data pipelines, ensuring the availability, performance, and reliability of our data systems. The ideal candidate will have a deep understanding of data architecture, ETL processes, and cloud technologies, with the ability to lead complex projects and mentor junior team members.

Key Responsibilities

1. Data Infrastructure Design and Development

  • Design and implement robust, scalable, and efficient data pipelines to support business analytics and decision-making.
  • Develop, test, and maintain ETL/ELT workflows using modern tools and technologies.
  • Ensure data quality, consistency, and reliability across all pipelines and systems.
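
For illustration, a minimal sketch of the kind of ETL workflow described above, assuming a hypothetical orders.csv extract and a placeholder PostgreSQL connection string:

```python
# Minimal ETL sketch: extract a CSV, clean it, load it into a warehouse table.
# File path, table name, and connection string are placeholders.
import pandas as pd
from sqlalchemy import create_engine

def extract(path: str) -> pd.DataFrame:
    """Read the raw source extract."""
    return pd.read_csv(path, parse_dates=["order_date"])

def transform(df: pd.DataFrame) -> pd.DataFrame:
    """Drop duplicates, enforce types, and derive a reporting column."""
    df = df.drop_duplicates(subset=["order_id"])
    df["amount"] = df["amount"].astype(float)
    df["order_month"] = df["order_date"].dt.to_period("M").astype(str)
    return df

def load(df: pd.DataFrame, table: str, conn_uri: str) -> None:
    """Append the cleaned batch to the warehouse table."""
    engine = create_engine(conn_uri)
    df.to_sql(table, engine, if_exists="append", index=False)

if __name__ == "__main__":
    frame = transform(extract("orders.csv"))  # hypothetical source file
    load(frame, "fct_orders", "postgresql://user:pass@host:5432/warehouse")
```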

2. Data Architecture and Modelling

  • Build and optimize data architectures for large-scale, distributed systems.
  • Develop data models and schemas to support analytical and operational reporting.
  • Design strategies for data warehousing and data lake integration.
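
As a sketch of the dimensional modelling referenced above, a hypothetical star schema (one fact table and one date dimension) declared with SQLAlchemy Core; all table and column names are illustrative:

```python
# Illustrative star schema: a sales fact table keyed to a date dimension.
# All table and column names are hypothetical.
from sqlalchemy import (MetaData, Table, Column, Integer, Numeric,
                        Date, String, ForeignKey)

metadata = MetaData()

dim_date = Table(
    "dim_date", metadata,
    Column("date_key", Integer, primary_key=True),   # surrogate key, e.g. 20240131
    Column("calendar_date", Date, nullable=False),
    Column("month_name", String(12), nullable=False),
    Column("fiscal_quarter", String(6), nullable=False),
)

fct_sales = Table(
    "fct_sales", metadata,
    Column("sale_id", Integer, primary_key=True),
    Column("date_key", Integer, ForeignKey("dim_date.date_key"), nullable=False),
    Column("customer_key", Integer, nullable=False),  # FK to a customer dimension
    Column("amount", Numeric(12, 2), nullable=False),
)

# metadata.create_all(engine) would emit the DDL against a target database.
```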

3. Technology and Tools Management

  • Leverage cloud platforms (e.g., AWS, Azure, GCP) to build and manage scalable solutions.
  • Work with big data technologies such as Hadoop, Spark, Kafka, or similar tools.
  • Manage relational and NoSQL databases (e.g., PostgreSQL, MongoDB, Cassandra).
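
As a hedged sketch of the big data tooling above, a small PySpark job that aggregates daily event counts; the input path and column names are illustrative placeholders:

```python
# PySpark sketch: read a partitioned Parquet dataset and aggregate it.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (SparkSession.builder
         .appName("daily-event-counts")
         .getOrCreate())

events = spark.read.parquet("s3://example-bucket/events/")  # hypothetical path

daily_counts = (events
                .withColumn("event_date", F.to_date("event_ts"))
                .groupBy("user_id", "event_date")
                .agg(F.count("*").alias("event_count")))

daily_counts.write.mode("overwrite").parquet("s3://example-bucket/daily_counts/")
spark.stop()
```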

4. Collaboration and Mentorship

  • Collaborate with data scientists, analysts, and software engineers to meet organizational goals.
  • Mentor and guide junior engineers, promoting best practices in coding, architecture, and data management.
  • Partner with stakeholders to translate business requirements into technical solutions.

5. Performance Optimization and Security

  • Monitor and improve data pipeline performance, addressing bottlenecks and inefficiencies.
  • Implement security best practices to ensure the safety and privacy of data.
  • Automate routine processes to reduce manual intervention and operational overhead.
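
As one sketch of the pipeline monitoring described above, a plain-Python timing decorator that logs any step exceeding a configurable threshold; the step name and threshold are arbitrary examples:

```python
# Sketch: time each pipeline step and warn when it exceeds a threshold.
import functools
import logging
import time

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger("pipeline")

def timed_step(threshold_seconds: float = 60.0):
    """Log a warning when the wrapped step runs longer than the threshold."""
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            start = time.monotonic()
            result = func(*args, **kwargs)
            elapsed = time.monotonic() - start
            if elapsed > threshold_seconds:
                logger.warning("%s took %.1fs (threshold %.1fs)",
                               func.__name__, elapsed, threshold_seconds)
            else:
                logger.info("%s completed in %.1fs", func.__name__, elapsed)
            return result
        return wrapper
    return decorator

@timed_step(threshold_seconds=30.0)
def refresh_reporting_table():   # hypothetical step name
    time.sleep(0.1)              # stand-in for real work

refresh_reporting_table()
```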

Required Skills and Qualifications

  • Education: Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field.
  • Experience:
      • 3-8 years of experience in data engineering or related roles.
      • Proven expertise in designing and building large-scale data pipelines and architectures.
  • Technical Skills:
      • Proficiency in programming languages such as Python, Java, or Scala.
      • Advanced SQL skills for complex queries and performance tuning.
      • Hands-on experience with cloud platforms (AWS, Azure, GCP).
      • Experience with big data tools (e.g., Spark, Hadoop, Kafka).
      • Knowledge of containerization (e.g., Docker) and orchestration tools (e.g., Kubernetes, Airflow).
  • Soft Skills:
      • Strong problem-solving abilities and attention to detail.
      • Excellent communication and collaboration skills.
      • Ability to lead projects and work independently in a fast-paced environment.
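
To illustrate the orchestration tooling named in the skills list (Airflow), a minimal Airflow 2.x DAG sketch using the TaskFlow API; the DAG id, schedule, and task bodies are placeholders:

```python
# Minimal Airflow 2.x DAG sketch chaining extract -> transform -> load.
from datetime import datetime
from airflow.decorators import dag, task

@dag(
    dag_id="example_daily_pipeline",
    schedule="@daily",            # Airflow 2.4+ keyword; older versions use schedule_interval
    start_date=datetime(2024, 1, 1),
    catchup=False,
)
def example_daily_pipeline():

    @task
    def extract() -> list[dict]:
        return [{"order_id": 1, "amount": 42.0}]   # stand-in for a real source read

    @task
    def transform(rows: list[dict]) -> list[dict]:
        return [r for r in rows if r["amount"] > 0]

    @task
    def load(rows: list[dict]) -> None:
        print(f"loading {len(rows)} rows")          # stand-in for a warehouse write

    load(transform(extract()))

example_daily_pipeline()
```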

Preferred Qualifications

  • Cloud certification such as AWS Certified Data Analytics - Specialty, Google Professional Data Engineer, or Azure Data Engineer Associate.
  • Experience with streaming data pipelines and real-time processing.
  • Familiarity with machine learning workflows and tools.
  • Knowledge of data governance and compliance standards.

More Info

  • Job Type:
  • Function:
  • Employment Type:
  • Open to candidates from: Indian
Job ID: 107707683
