  • Posted 21 days ago

Job Description

Azure Databricks JD - 87597

  • Experience with Azure Databricks and Azure Data Factory
  • Experience with Azure data components such as Azure SQL Database, Azure SQL Data Warehouse, and Synapse Analytics
  • Experience in Python/PySpark/Scala/Hive programming
  • Experience with Azure Databricks (ADB)
  • Experience building CI/CD pipelines in data environments
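The posting lists these tools by name only, but the day-to-day work they imply is aggregation-style transformation. As a rough illustration, here is a plain-Python sketch of the kind of group-and-sum logic a Databricks/ADF pipeline would express in PySpark; the records and field names are hypothetical:

```python
# Hypothetical records, standing in for rows read from Azure SQL or a data lake.
orders = [
    {"order_id": 1, "region": "EU", "amount": 120.0},
    {"order_id": 2, "region": "US", "amount": 75.5},
    {"order_id": 3, "region": "EU", "amount": 30.0},
]

def total_by_region(rows):
    """Group-and-sum aggregation: the plain-Python analogue of a
    PySpark df.groupBy("region").sum("amount")."""
    totals = {}
    for row in rows:
        totals[row["region"]] = totals.get(row["region"], 0.0) + row["amount"]
    return totals

print(total_by_region(orders))  # {'EU': 150.0, 'US': 75.5}
```

On Databricks itself the same aggregation would run distributed over a Spark DataFrame rather than an in-memory list.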

Primary Skills

  • ADF (Azure Data Factory) or
  • ADB (Azure Databricks)

Secondary Skills

  • Excellent verbal and written communication and interpersonal skills
  • Ability to work independently and within a team environment

AWS Data Engineer - 86638

2+ years of experience working on the AWS Cloud platform with strong experience in Python. Knowledge of AWS services such as:

  • AWS S3, Glue, Glue Crawler, API Gateway, Athena, Lambda, DynamoDB, and Redshift is an advantage
  • Experience/knowledge of streaming technologies is a must, preferably Kafka
  • Should have knowledge/experience with SQL
  • Good analytical skills
  • Familiarity with working on Linux platforms
  • Good understanding of the pros and cons, and the cost impact, of the AWS services being leveraged
  • Good communication skills
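The SQL requirement is the most portable skill in this list. As a minimal sketch, Python's built-in sqlite3 can stand in for an Athena-style query engine (the table and data here are hypothetical; Athena itself would query files stored in S3):

```python
import sqlite3

# In-memory SQLite as a stand-in for querying S3 data through Athena;
# the SQL itself is the transferable skill the posting asks for.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (user_id INTEGER, event TEXT)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?)",
    [(1, "click"), (1, "view"), (2, "click")],
)
rows = conn.execute(
    "SELECT event, COUNT(*) FROM events GROUP BY event ORDER BY event"
).fetchall()
print(rows)  # [('click', 2), ('view', 1)]
```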

GCP Data Engineer - 95823

  • Minimum 4 years of experience in GCP data engineering.
  • Strong data engineering experience using Java or Python programming languages or Spark on Google Cloud.
  • Should have worked on handling big data.
  • Strong communication skills.
  • Experience in Agile methodologies.
  • ETL, ELT, data movement, and data processing skills.
  • Certification as a Professional Google Cloud Data Engineer is an added advantage.
  • Proven analytical skills and a problem-solving attitude.
  • Ability to function effectively in a cross-team environment.

Primary Skills

  • GCP data engineering: Java / Python / Spark on GCP, with programming experience in any one language (Python, Java, or PySpark).
  • GCS (Cloud Storage), Composer (Airflow), and BigQuery experience.
  • Experience building data pipelines using the above skills.
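A Composer (Airflow) pipeline is, at bottom, a dependency graph of tasks. Here is a minimal plain-Python sketch of the GCS-to-BigQuery task graph this role describes; the task names are hypothetical, and Airflow would express the same graph with operators rather than a dict:

```python
# Hypothetical GCS -> transform -> BigQuery pipeline, expressed as
# task-name -> list-of-upstream-tasks (the shape an Airflow DAG encodes).
tasks = {
    "extract_gcs": [],
    "transform": ["extract_gcs"],
    "load_bigquery": ["transform"],
}

def run_order(deps):
    """Return a topological order of the tasks (assumes the graph is acyclic),
    which is the order a scheduler like Composer would execute them in."""
    order, done = [], set()
    while len(order) < len(deps):
        for task, upstream in deps.items():
            if task not in done and all(u in done for u in upstream):
                order.append(task)
                done.add(task)
    return order

print(run_order(tasks))  # ['extract_gcs', 'transform', 'load_bigquery']
```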

Job ID: 141081477