
Softtek

Data Management Engineer

  • Posted a month ago

Job Description

  • The Data Operations Engineer is the guardian of the data lifecycle: they ensure that data flows from source to destination without interruption, maintaining the plumbing of the organization's data infrastructure.
  • 1. Orchestration & Workflow Engines
  • Apache Airflow: Deep understanding of Operators, Sensors, Hooks, and the Airflow UI to manage complex task dependencies.
  • Workflow Logic: Knowledge of how to handle retries, branching, and backfilling historical data.
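To make the workflow-logic bullet concrete, here is a minimal plain-Python sketch of retry and backfill behaviour. It deliberately avoids an Airflow dependency; in Airflow the same ideas map to an Operator's `retries`/`retry_delay` arguments and the `airflow dags backfill` command. The function names are illustrative, not part of any real API:

```python
from datetime import date, timedelta
import time


def run_with_retries(task, retries=3, retry_delay=0.0):
    """Re-run a failing task up to `retries` extra times,
    mirroring Airflow's `retries`/`retry_delay` Operator arguments."""
    for attempt in range(retries + 1):
        try:
            return task()
        except Exception:
            if attempt == retries:
                raise  # retries exhausted; surface the failure
            time.sleep(retry_delay)


def backfill_dates(start, end):
    """Yield each logical run date from start to end inclusive,
    the way a backfill walks historical execution dates."""
    current = start
    while current <= end:
        yield current
        current += timedelta(days=1)


# Backfilling three historical days produces three logical runs.
runs = list(backfill_dates(date(2024, 1, 1), date(2024, 1, 3)))
```

In a real deployment the scheduler owns both behaviours; the point of the sketch is only that retries and backfills are ordinary control flow over task invocations and date ranges.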
  • 2. ETL/ELT Methodologies
  • Data Ingestion: Familiarity with moving data from APIs, logs, and relational databases into data warehouses (like Snowflake, BigQuery, or Redshift).
  • Transformation: Understanding how data is cleaned and restructured during transit.
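A tiny illustrative ETL pass, under the assumption that `sqlite3` stands in for a warehouse like Snowflake, BigQuery, or Redshift, and that the input records (which are made up here) were pulled from an API or log file:

```python
import sqlite3

# Stand-in for records ingested from an API or log file (hypothetical data).
raw_events = [
    {"user": " Alice ", "amount": "10.50"},
    {"user": "bob", "amount": "3"},
    {"user": "", "amount": "oops"},  # malformed row that should be rejected
]


def transform(rec):
    """Clean and restructure one record in transit; return None to reject it."""
    user = rec["user"].strip().lower()
    try:
        amount = float(rec["amount"])
    except ValueError:
        return None
    if not user:
        return None
    return (user, amount)


# Load the cleaned rows into a warehouse table (sqlite3 as the stand-in).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (user TEXT, amount REAL)")
rows = [t for t in (transform(r) for r in raw_events) if t is not None]
conn.executemany("INSERT INTO events VALUES (?, ?)", rows)
conn.commit()
```

The shape is the same at scale: ingest raw records, transform (clean, normalize, reject) in transit, then load into the warehouse.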
  • 3. Scripting & Command Line
  • Python: Ability to read and perform hotfixes or minor adjustments to existing scripts.
  • SQL: Proficiency in writing queries to validate data quality and troubleshoot ingestion errors.
  • Linux/Shell: Comfort navigating servers and checking system logs.
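The SQL bullet is the kind of check this role runs daily. A small sketch, using an in-memory `sqlite3` table with made-up rows to show two common validation queries (missing required fields, duplicate keys from a double ingestion):

```python
import sqlite3

# Tiny in-memory table standing in for an ingested warehouse table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, customer TEXT)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [(1, "alice"), (2, None), (2, "bob")],  # one NULL, one duplicated id
)

# Validation 1: rows with a missing required field.
null_count = conn.execute(
    "SELECT COUNT(*) FROM orders WHERE customer IS NULL"
).fetchone()[0]

# Validation 2: duplicate primary keys, a typical ingestion error.
dupes = conn.execute(
    "SELECT id, COUNT(*) AS n FROM orders GROUP BY id HAVING n > 1"
).fetchall()
```

Against a real warehouse only the connection changes; the validation SQL itself is portable.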
  • 4. System Monitoring & Reliability
  • SLA/SLO Management: Understanding the business impact of data delays.
  • Observability: Using tools (like Grafana, Datadog, or Airflow logs) to monitor system health and data throughput.
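An SLA check of this kind reduces to comparing pipeline freshness against a threshold. A minimal sketch, assuming a hypothetical two-hour freshness SLO; the alerting condition is what a Grafana or Datadog monitor would evaluate:

```python
from datetime import datetime, timedelta, timezone

SLO_MAX_LAG = timedelta(hours=2)  # hypothetical freshness SLO


def freshness_lag(last_success, now=None):
    """How far behind the pipeline is, based on its last successful run."""
    now = now or datetime.now(timezone.utc)
    return now - last_success


def sla_breached(last_success, now=None):
    """True when the lag exceeds the SLO, i.e. the alert condition fires."""
    return freshness_lag(last_success, now) > SLO_MAX_LAG


now = datetime(2024, 1, 1, 12, 0, tzinfo=timezone.utc)
ok_run = datetime(2024, 1, 1, 11, 0, tzinfo=timezone.utc)   # 1h ago: within SLO
late_run = datetime(2024, 1, 1, 9, 0, tzinfo=timezone.utc)  # 3h ago: breach
```

Understanding business impact means knowing which downstream consumers break when `sla_breached` flips to true.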
  • GCS (Google Cloud Storage): Medium
  • BigQuery: Required
  • Dataproc: Medium
  • Spark (PySpark/Scala): Medium
  • Airflow: Required
  • Python: Medium


Job ID: 143308285