
Zigsaw

Data Engineer

Posted 10 hours ago

Job Description

L2 TechOps

Years of experience: 5-8 yrs

Budget: 12-18 LPA

Immediate joiners only

Purpose of the Job: We are looking for an L2 TechOps - Data Engineering Platform engineer who has worked extensively on the Cloudera Hadoop ecosystem (including Hive and Zookeeper), Spark, Airflow, Nifi, ELK, etc. The ideal candidate will have strong expertise in monitoring infrastructure and applications, debugging complex issues, and automating processes.

Deliverables:

  • Deliver KPIs/reports as per agreed SLOs
  • Keep the platform and applications up and running
  • Perform in-depth analysis of issues and provide resolutions
  • Service restoration
  • Set up alerting and trending of various metrics
  • Build dashboards/reports in ELK for L1 use
  • Build automations for mundane tasks
  • Implement self-heal mechanisms for recurring issues
  • Handle customer escalations
  • Configuration, customization, new development, and integration-level changes
  • Coordinate with L3/product teams to resolve issues
  • Perform periodic maintenance activities
  • Support and mentor the L1 team
  • Responsible for day-to-day technical operations and support of the Data Platform
  • Work in 24x7 rotational shifts
  • SR/incident tracking and resolution, and problem management for recurring incidents
  • Build and update detailed SOPs
  • Respond to user requests, queries, and issues within agreed SLOs; provide timely updates and perform regular follow-up until issue closure
  • Fetch and provide data as per user requests
  • Document and report daily issues; help analyze critical issues
  • Coordinate with various teams on a routine basis on issues, fixes, etc.



Work Experience & Skills

  • 5+ years of experience in TechOps on Spark, Cloudera, Airflow, and Nifi (preferably on RedHat OCP and in the telecom domain)
  • Strong understanding of monitoring processes for infrastructure and applications
  • Hands-on experience with mainstream databases (Oracle, SQL)
  • Good SQL/Exadata query-writing skills (DDL, DML, joins, subqueries, views, SELECT statements)
  • Excellent analytics, debugging, and problem-solving skills for infrastructure, tools, and API issues
  • In-depth understanding of Linux environments
  • Expertise in Linux commands and middleware such as Tomcat and Nginx
  • Hands-on experience with the ELK Stack and other monitoring tools
  • Proficiency in the Kubernetes, Docker, and Swarm ecosystems
  • Automation skills using Python or shell scripting
  • Strong, proven hands-on experience with big data technologies such as HDFS, Spark, and Ranger
  • Strong understanding of change/incident/problem management processes
  • Hands-on experience with Apache Airflow, Kafka, Nifi, and Yarn
  • Hands-on experience with data and analytics
  • Experience with CI/CD pipelines using Jenkins
  • Ability to quickly gain an understanding of existing processes, jobs, KPIs, and reports
  • Should be able to query/extract data from Hive/HDFS
  • Familiarity with DataHub tools and DataMesh concepts is a plus
  • Strong communication and collaboration skills
  • Experience with monitoring tools such as Zabbix, Prometheus, Loki, and the TIG stack
  • Addressing and closing various security vulnerabilities reported in the system

Educational Level: Bachelor's degree in Computer Science, Information Technology, or a related field

Work location: Gurgaon

Personal Characteristics & Behaviour:

  • Deliver on commitments
  • Team player
  • Quality of work
  • Ownership and initiative
  • Customer orientation

Job ID: 147229661
