
Cloud Platform Engineer

4-6 Years
  • Posted a day ago

Job Description

Project Role: Cloud Platform Engineer

Project Role Description: Designs, builds, tests, and deploys cloud application solutions that integrate cloud and non-cloud infrastructure. Deploys infrastructure and platform environments, and creates a proof of architecture to test architecture viability, security, and performance.

Must-have skills: Microsoft Fabric

Good-to-have skills: NA

Minimum 3 year(s) of experience is required.

Educational Qualification: 15 years of full-time education

We are seeking a skilled Microsoft Fabric Data Engineer to design, build, optimize, and maintain modern data solutions using Microsoft Fabric. The ideal candidate will have strong experience with data engineering, analytics workloads, cloud-based data platforms, and end-to-end data pipeline development.

Minimum 4 years of experience as a Microsoft Fabric Data Engineer.

Key Responsibilities

  • Data Architecture & Modeling

Design and implement scalable data architectures using Microsoft Fabric components such as Lakehouse, Data Warehouse, OneLake, and KQL Databases.

Create and optimize star schemas, data marts, semantic models, and medallion architectures.

Manage and enforce data governance, security, and access control within Fabric workspaces.

  • ETL/ELT Pipeline Development

Develop, orchestrate, and maintain data ingestion and transformation pipelines using Data Factory, Fabric Pipelines, and Dataflows Gen2.

Build automated workflows for batch, streaming, or event-driven ingestion.

Optimize pipeline performance and ensure reliability, scalability, and fault-tolerance.

  • Data Integration & Processing

Work with structured and unstructured data from various enterprise systems, APIs, and external sources.

Utilize Apache Spark within Fabric Notebooks for large-scale data processing.

Implement Delta Lake best practices (Z-ordering, OPTIMIZE, VACUUM, etc.).

  • Analytics & Reporting Enablement

Partner with BI analysts to create and optimize Power BI semantic models and Direct Lake mode datasets.

Publish high-quality, certified data assets for business consumption.

Ensure data quality, accuracy, and consistency across analytic layers.

  • Monitoring, Optimization & Operations

Monitor Fabric workloads, storage utilization, capacity models, and performance.

Implement logging, alerting, and automated testing for pipelines.

Perform cost optimization for compute workloads and OneLake storage.

  • Collaboration & Stakeholder Engagement

Work closely with data analysts, data scientists, and business stakeholders to understand data needs.

Translate business requirements into scalable data solutions.

Document workflows, architectures, and best practices.

Required Skills & Qualifications

Bachelor's degree in Computer Science, Information Systems, Engineering, or a related field.

Hands-on experience with Microsoft Fabric (Lakehouse, Data Factory, Pipelines, OneLake, Notebooks, Power BI).

Strong proficiency with SQL, Python, Spark, and Delta Lake.

Experience with Azure services (Azure Data Lake, Azure Synapse, Azure Data Factory, AAD).

Solid understanding of ETL/ELT methodologies, data modeling, and data warehousing concepts.

Knowledge of version control (Git) and CI/CD workflows.

Excellent analytical, problem-solving, and communication skills.

Preferred Qualifications

Fabric Analyst or Fabric Engineer Certification.

Experience with MLOps or DataOps practices.

Familiarity with DevOps tools (Azure DevOps, GitHub Actions).

Experience with streaming technologies (Event Hubs, Kafka, Fabric Real-Time Analytics).

Job ID: 147165387
