
GTS TechLabs

ETL & Data Engineer

Job Description

Company: GTS TechLabs

Job Title: ETL & Data Engineer

Location: Bangalore

Employment Type: Full-time

Education: B.Tech / M.Tech / MCA

Experience: 4–5 years (relevant experience required)

Job Summary

We are seeking a skilled and detail-oriented ETL & Data Engineer to design, develop, and optimize scalable data pipelines supporting both real-time and batch processing use cases. The role is critical to enabling advanced analytics in domains such as telecom traffic analysis and fraud detection, with a strong emphasis on performance, reliability, and data security.

Key Responsibilities

● Design, develop, and maintain robust ETL workflows for processing high-volume SMS and telecom data.

● Build and manage scalable real-time and batch data pipelines using technologies such as Apache Kafka, Spark, Flink, and Hadoop.

● Optimize SQL and NoSQL databases for high-performance data storage and retrieval.

● Implement data security controls including encryption, masking, and access governance to ensure compliance.

● Automate data ingestion, transformation, and orchestration using Python, Apache Airflow, or shell scripting.

● Collaborate with cross-functional teams to ensure data quality, consistency, and availability across systems.

Required Skills & Competencies

● Hands-on experience with ETL tools and orchestration frameworks such as Python, Apache NiFi, Airflow, Talend, AWS Glue, or equivalent.

● Strong expertise in SQL query optimization and database performance tuning.

● Solid understanding of OLTP and OLAP systems, including data warehousing and data lake architectures.

● Experience with big data and streaming technologies such as Kafka, Spark, Flink, Hadoop, or similar platforms.

● Proficiency in relational and NoSQL databases including MySQL, PostgreSQL, MongoDB, Cassandra, and Redis.

● Programming proficiency in Python, SQL, and at least one of Scala or Java.

Preferred Qualifications

● Experience with cloud platforms such as AWS, Google Cloud Platform (GCP), or Microsoft Azure.

● Exposure to AI/ML-driven data pipelines and real-time fraud detection systems.

● Relevant certifications in Data Engineering (e.g., AWS Certified Data Analytics, Google Professional Data Engineer).

Job ID: 146435121