
Skills Required: AWS + Snowflake SQL + Kafka + Python + PySpark
(strong in Python)
Location: Pune only
Notice Period: Immediate joiners, or candidates currently serving notice who can join immediately
Role Summary:
We are looking for a highly skilled and proactive Data Engineer with deep expertise in AWS, Snowflake, SQL, Kafka, Python, and PySpark. This role is ideal for someone who thrives in building scalable data pipelines, optimizing cloud infrastructure, and enabling real-time data processing across enterprise systems.
Key Responsibilities:
Required Qualifications:
Job ID: 147487941
Skills:
S3, Hadoop, PySpark, Scala, Big Data, HBase, Jenkins, Lambda, Git, Hive, Spark, Python, AWS, Athena
We do not charge any money for job offers.