
Location - Bengaluru (Hybrid)
Experience - 7+ years
We are seeking a Lead Data Engineer with strong expertise in Databricks, Python, and modern data pipeline architectures to design, build, and optimize scalable data platforms. The role focuses on developing robust ELT/ETL pipelines, implementing data lake architectures, and ensuring efficient data transformations to support analytics and AI-driven use cases. The ideal candidate will have hands-on experience with distributed computing concepts, SQL-based transformations, and cloud-scale data engineering practices, along with strong problem-solving and mentoring abilities.
Key Responsibilities
Essential Qualifications
Nice to Have Skills
Soft Skills
Benefits
About Potentiam
Potentiam is a global provider of highly qualified professionals to European SMEs, operating from our offices in Romania, South Africa and India. Potentiam works with clients in the finance, energy, leisure, marketing, business services and technology industries, providing technical, professional, multilingual and highly motivated staff, most of whom have experience working for international companies. Staff cover a wide range of roles across accounting, marketing, data management, HR, sales/account management, engineering, technology and operations. Potentiam manages our staff's career development, personal development training, infrastructure, HR and payroll, while our clients directly manage day-to-day staff responsibilities, role training and development.
If interested, please apply here. If you have any questions regarding the role, please feel free to write to [Confidential Information]
Data Privacy Notice
The personal information you provide during the application and recruitment process will be used solely for recruitment purposes, in accordance with our data protection policies.
For any questions regarding data processing related to HR activities, please contact us at [HIDDEN TEXT]
All data shared with third parties complies with applicable confidentiality and retention requirements.
Job ID: 147368207
Skills:
Python, Databricks, AWS