
Ciklum India

Senior Data Engineer


Job Description

Ciklum is looking for a Senior Data Engineer to join our team full-time in India.

We are a custom product engineering company that supports both multinational organizations and scaling startups to solve their most complex business challenges. With a global team of over 4,000 highly skilled developers, consultants, analysts and product owners, we engineer technology that redefines industries and shapes the way people live.

About The Role

As a Senior Data Engineer, you will become part of a cross-functional development team engineering the experiences of tomorrow.

You will be a hands-on engineer responsible for building, maintaining, and optimizing data pipelines and lakehouse solutions in Microsoft Fabric, with a focus on reliable data delivery, performance, and scalability.

Responsibilities

  • Fabric Ecosystem & Data Engineering
    • OneLake Usage: Work with organizational OneLake structure, creating and managing shortcuts for efficient data access without duplication
    • Medallion Architecture Implementation: Build and maintain Bronze, Silver, and Gold layers using Delta Lake tables to support reporting and analytics
    • Data Pipelines: Develop and manage pipelines using Fabric Data Factory and Notebooks for data ingestion, transformation, and orchestration
    • Data Modeling: Implement basic dimensional models (Star Schema) to support BI and reporting use cases
  • Azure & Data Integration
    • Data Ingestion: Develop batch and incremental data pipelines from sources like ADLS Gen2, APIs, Azure SQL, and Blob Storage
    • Pipeline Development: Use Fabric Data Factory / Azure Data Factory to orchestrate ETL/ELT workflows
    • Basic Automation: Support automation using Azure Functions or Logic Apps for simple event-based triggers (optional)
    • Legacy Support: Assist in maintaining existing ADF and Synapse pipelines and support migration to Microsoft Fabric
  • Data Processing & Optimization
    • Spark Development: Write and optimize PySpark notebooks for data transformation and cleansing
    • SQL Development: Develop efficient queries, views, and stored procedures in Fabric Warehouse / Azure SQL
    • Performance Tuning: Apply basic optimization techniques like partitioning, caching, and query tuning
    • Monitoring: Track pipeline performance and troubleshoot failures
  • Governance & Security
    • Access Control: Implement RBAC roles and manage secure data access across Fabric workspaces
    • Data Quality: Apply validation checks and ensure consistency across data pipelines
    • Version Control: Work with Git integration (Fabric / Azure DevOps) for managing code and deployments
    • Metadata & Lineage: Support data cataloging and lineage using Microsoft Purview (basic level)
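To illustrate the kind of data-quality validation the role involves, here is a minimal, framework-agnostic sketch in plain Python. In a Fabric notebook these checks would typically run against a Spark DataFrame instead of a list of dicts; the column names (`order_id`, `amount`) and thresholds are illustrative assumptions, not part of the role description.

```python
# Minimal data-quality sketch (plain Python; in practice this logic would
# run inside a Fabric / PySpark notebook against a DataFrame).
# Column names "order_id" and "amount" are hypothetical examples.

def validate_rows(rows, required=("order_id", "amount")):
    """Return (valid_rows, errors) after basic consistency checks."""
    valid, errors = [], []
    seen_ids = set()
    for i, row in enumerate(rows):
        # Check 1: required fields present and non-null
        missing = [c for c in required if row.get(c) is None]
        if missing:
            errors.append((i, f"missing fields: {missing}"))
            continue
        # Check 2: no duplicate business keys
        if row["order_id"] in seen_ids:
            errors.append((i, "duplicate order_id"))
            continue
        # Check 3: numeric sanity
        if row["amount"] < 0:
            errors.append((i, "negative amount"))
            continue
        seen_ids.add(row["order_id"])
        valid.append(row)
    return valid, errors

rows = [
    {"order_id": 1, "amount": 10.0},
    {"order_id": 1, "amount": 5.0},   # duplicate key -> rejected
    {"order_id": 2, "amount": -3.0},  # negative amount -> rejected
    {"order_id": 3, "amount": None},  # null required field -> rejected
]
valid, errors = validate_rows(rows)
```

The same pattern (validate, quarantine failures, pass clean rows downstream) maps directly onto a Bronze-to-Silver promotion step in a medallion architecture.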
Requirements

  • 6 years of experience in Data Engineering
  • Hands-on exposure to Microsoft Fabric (preferred) or strong Azure Data Engineering background with willingness to learn Fabric
  • Microsoft Fabric (Data Factory, OneLake, Synapse Data Engineering – basics)
  • Azure Data Services: ADLS Gen2, Azure SQL, Blob Storage
  • Strong SQL (joins, aggregations, performance tuning basics)
  • Python (PySpark) for data transformation
  • Azure Data Factory / Fabric Pipelines

Desirable

  • Azure Functions / Logic Apps
  • Event Hubs / streaming concepts
  • Microsoft Purview (basic exposure)
  • Cosmos DB

What's in it for you

  • Strong community: Work alongside top professionals in a friendly, open-door environment
  • Growth focus: Take on large-scale projects with a global impact and expand your expertise
  • Tailored learning: Boost your skills with internal events (meetups, conferences, workshops), Udemy access, language courses, and company-paid certifications
  • Endless opportunities: Explore diverse domains through internal mobility, finding the best fit to gain hands-on experience with cutting-edge technologies
  • Care: We've got you covered with company-paid medical insurance, mental health support, and financial & legal consultations

About Us

At Ciklum, we are always exploring innovations, empowering each other to achieve more, and engineering solutions that matter. With us, you'll work with cutting-edge technologies, contribute to impactful projects, and be part of a One Team culture that values collaboration and progress.

India is a strategic innovation hub for Ciklum, with growing teams in Chennai and Pune leading advancements in EdgeTech, AR/VR, IoT, and beyond. Join us to collaborate on game-changing solutions and take your career to the next level.

Want to learn more about us? Follow us on Instagram, Facebook, and LinkedIn.

Explore, empower, engineer with Ciklum!

Interested already? We would love to get to know you! Submit your application. We can't wait to see you at Ciklum.

More Info

Job ID: 147539501
