Digile Ltd.

Data Platform Engineer

Fresher
  • Posted a month ago

Job Description

Key Responsibilities

  • Provide BAU support for the Enterprise Data Warehouse (EDW), including monitoring, troubleshooting, performance tuning, and issue resolution
  • Support Qlik Replicate processes and ensure reliable data ingestion across systems
  • Implement enhancements and modification requests to the EDW based on business needs
  • Perform data validation, quality checks, and root cause analysis for data issues
  • Ensure operational stability, security, and performance of live data pipelines and applications
  • Document processes, configurations, and support procedures to maintain platform resilience

Project & Modernization Exposure

While this is primarily a BAU-focused role, the Data Platform Engineer will also:

  • Contribute to the migration of the EDW to Microsoft Fabric
  • Gain exposure to modern cloud data technologies and evolving data architecture patterns
  • Support development work related to platform modernization initiatives
  • Work under the guidance and mentorship of senior data engineers on strategic transformation efforts
  • Follow coding best practices and ensure that development work complies with best-practice guidelines

Maintenance & Continuous Improvement

  • Apply ITIL Incident, Problem, and Change Management practices in accordance with SLAs and established processes
  • Identify key problem areas within the application, implement improvements, and evaluate and enhance existing data analytics systems

Data Expertise

  • Understand the IB's main business processes and how they relate to the data that is generated or captured
  • Understand associated data flows and dependencies between different enterprise systems
  • Independently research, test, and problem-solve technical issues or blockers, obtaining guidance from Leads or Architects when required
  • Drive the troubleshooting of key technical issues, or escalate and work with the appropriate teams
  • Identify key improvement areas and discuss them with the technical lead where needed
  • BSc/BA in Computer Science, Engineering or relevant field
  • Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources, applications, and platforms
  • Able to integrate multiple data sources and user-facing applications with databases into one system for data storage and retrieval
  • Experience in designing and implementing robust data pipelines and ETL/ELT frameworks
  • Proven experience as a data warehouse developer, including full implementation of a data warehousing solution
  • Experience in data engineering solutions built on modern data lake or Lakehouse architectures, including Delta Lake or equivalent frameworks, e.g. Microsoft Fabric
  • Good understanding of enterprise design concepts: re-usability, continuous integration, security, scheduling, monitoring, etc.
  • In-depth understanding of database management systems, online analytical processing (OLAP), and SQL queries (Azure SQL DB)
  • Expertise with Azure Resource Management and templates is an added advantage
  • Exposure to cloud technologies (MS Azure, AWS) and a desire to learn and deliver new things on a needs basis (big data, BI, data science, etc.)
  • Expertise in data warehouse design methodologies and technologies, data modelling (Data Vault modelling methodology experience is preferable), data quality, and metadata
  • Effective oral and written communication and presentation skills
  • Strong interpersonal skills; self-motivated, with keen attention to detail and quality of work



Job ID: 145744047
