
Letitbex AI

Data Engineer

  • Posted 3 hours ago

Job Description

About Letitbex AI

Letitbex AI is a fast-growing AI-driven technology company focused on building intelligent, scalable, and enterprise-grade solutions. We work at the intersection of AI, data engineering, cloud, and business transformation, helping organizations unlock real value from artificial intelligence.

Position: Data Engineer

Experience: 4+ Years

Notice Period: Up to 20 days

Location: Hyderabad (Hybrid)

Role Summary

We are looking for a skilled Data Engineer to design, build, and maintain scalable data pipelines and systems. The ideal candidate should have strong hands-on experience with Python, PySpark, AWS, and SQL, and be comfortable working with large-scale data in cloud environments.

Key Responsibilities

  • Design, develop, and optimize scalable data pipelines using Python and PySpark
  • Build and manage ETL/ELT processes for structured and unstructured data
  • Work extensively with AWS services (S3, Glue, EMR, Redshift, Lambda, etc.)
  • Develop and optimize complex SQL queries for data transformation and analytics
  • Ensure data quality, integrity, and performance across pipelines
  • Collaborate with data scientists, analysts, and stakeholders to understand data needs
  • Monitor, troubleshoot, and improve existing data workflows
  • Implement best practices for data security and governance
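The core of the responsibilities above is ETL work: extract raw records, transform them (often with SQL), and load the results for analytics. As a purely illustrative sketch of that pattern (not part of the posting; the table and column names are invented, and the standard-library sqlite3 module stands in for a warehouse such as Redshift):

```python
import sqlite3

# Extract: raw event rows land in a staging table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_events (user_id TEXT, amount REAL, status TEXT)")
conn.executemany(
    "INSERT INTO raw_events VALUES (?, ?, ?)",
    [
        ("u1", 10.0, "ok"),
        ("u1", 5.0, "ok"),
        ("u2", 7.5, "ok"),
        ("u2", 2.5, "failed"),
    ],
)

# Transform + load: keep successful events only and
# aggregate spend per user into a reporting table.
conn.execute(
    """
    CREATE TABLE user_spend AS
    SELECT user_id, SUM(amount) AS total_spend
    FROM raw_events
    WHERE status = 'ok'
    GROUP BY user_id
    """
)

rows = conn.execute(
    "SELECT user_id, total_spend FROM user_spend ORDER BY user_id"
).fetchall()
print(rows)  # [('u1', 15.0), ('u2', 7.5)]
```

In a production pipeline of the kind described here, the same transform would typically run as a PySpark job or a warehouse SQL step, scheduled by an orchestrator.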

Required Skills

  • Strong experience in Python for data engineering
  • Hands-on experience with PySpark / Apache Spark
  • Solid knowledge of AWS cloud services
  • Advanced SQL skills for querying and optimization
  • Experience with data warehousing and large-scale datasets
  • Understanding of ETL frameworks and data modeling concepts

Good To Have

  • Experience with Airflow or other orchestration tools
  • Knowledge of streaming tools (Kafka, Kinesis)
  • Exposure to DevOps, CI/CD, or Terraform
  • Experience in Agile/Scrum environments

Education

  • Bachelor's or Master's degree in Computer Science, Engineering, or a related field


Job ID: 147364941
