
Artmachine

Sr. AWS Data Engineer - Airflow + ETL (AWS, PySpark, Databricks)

  • Posted an hour ago

Job Description

Who We Are

Artmac Soft is a technology consulting and service-oriented IT company dedicated to providing innovative technology solutions and services to customers.

Job Title : Sr. AWS Data Engineer - Airflow + ETL (AWS, PySpark, Databricks)

Job Type : Contract

Experience : 8-15 Years

Location : Hyderabad, Telangana (Hybrid)

Requirements

  • Experience in data engineering with a strong focus on ETL processes and data pipeline development.
  • Proficiency in using Airflow to orchestrate complex data workflows.
  • Extensive experience with AWS services (e.g., S3, Lambda, Glue, Redshift).
  • Strong programming skills in PySpark, with a deep understanding of distributed data processing.
  • Hands-on experience with Databricks for data processing and analytics.
  • Experience with additional big data technologies such as Kafka, Hadoop, or Snowflake.
  • Familiarity with data visualization tools such as Tableau or Power BI.
  • Experience with CI/CD tools and practices for data pipelines.
  • Familiarity with SQL and relational databases for data extraction and manipulation.
  • Proven experience with data modeling, data warehousing, and building scalable data solutions.
  • Knowledge of best practices in data management, including data governance and data security.
  • Strong problem-solving skills and the ability to troubleshoot complex data issues.
  • Excellent communication and collaboration skills, with the ability to work effectively in a team environment.
  • Certifications in AWS or Databricks are a plus.

Responsibilities

  • Optimize data pipelines for performance, scalability, and reliability.
  • Implement best practices for data governance, data quality, and data security.
  • Monitor and troubleshoot ETL processes to ensure data accuracy and integrity.
  • Work in an agile environment, actively participating in sprint planning, daily stand-ups, and retrospectives.
  • Document data workflows, processes, and technical specifications to ensure clear communication and knowledge sharing within the team.
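For illustration only (not part of the posting): a minimal extract-transform-load step in plain Python, using only the standard library, sketching the kind of pipeline logic this role centers on. In practice this logic would typically be written as a PySpark job and scheduled by an Airflow DAG; the function and field names here are hypothetical.

```python
import csv
import io


def extract(raw_csv: str) -> list[dict]:
    """Extract: parse raw CSV text into a list of row dicts."""
    return list(csv.DictReader(io.StringIO(raw_csv)))


def transform(rows: list[dict]) -> list[dict]:
    """Transform: keep well-formed rows and normalize the amount to cents."""
    out = []
    for row in rows:
        try:
            cents = round(float(row["amount"]) * 100)
        except (KeyError, ValueError):
            continue  # drop malformed rows; a real pipeline would log or quarantine them
        out.append({"order_id": row["order_id"], "amount_cents": cents})
    return out


def load(rows: list[dict]) -> dict[str, int]:
    """Load: materialize the cleaned rows into an in-memory 'table' keyed by order id."""
    return {r["order_id"]: r["amount_cents"] for r in rows}


# Hypothetical input: one malformed row ("bad") is dropped during transform.
raw = "order_id,amount\nA1,19.99\nA2,bad\nA3,5.00\n"
table = load(transform(extract(raw)))
print(table)  # {'A1': 1999, 'A3': 500}
```

In an Airflow deployment, each of these three functions would map naturally onto its own task, with the orchestrator handling scheduling, retries, and dependency ordering between them.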

Qualification

  • Bachelor's degree in Computer Science, Information Technology, or a related field.

Job ID: 147315293
