
Position: MS Fabric Data and SQL Engineer
Experience: 3+ years
Location: Bangalore (5 days WFO)
Mandatory Skill: Hands-on experience with MS Fabric (minimum 1 year)
Immediate Joiners Preferred
If you're interested, directly apply here: https://forms.gle/sMEPhFzuUhhUJcWq6
Key Responsibilities
• Design and implement Lakehouse architecture using OneLake, Lakehouse, and Fabric Data Warehouse.
• Build Bronze / Silver / Gold data layers using Delta tables and medallion architecture.
• Migrate and modernize legacy SQL Server / SSIS workloads to Microsoft Fabric Pipelines and Notebooks.
• Analyze existing stored procedures, views, functions, triggers, and SQL jobs within on-prem SQL Server
• Reverse-engineer complex business and calculation logic embedded in database objects
• Document logic in a clear, structured manner suitable for Workday re-implementation
• Produce functional descriptions, data mappings, and dependency documentation
• Collaborate with functional and HRIS teams
• Support clarification of legacy logic during development and migration activities
• Identify redundant or obsolete logic and highlight simplification opportunities
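The Bronze / Silver / Gold layering mentioned above can be sketched as follows. This is a toy illustration only, using plain Python records rather than the PySpark-over-Delta-tables approach the role actually calls for; the field names (`employee_id`, `dept`, `salary`) and cleaning rules are invented assumptions, not project requirements.

```python
# Toy sketch of the medallion (Bronze/Silver/Gold) pattern. In Fabric this
# would run as PySpark over Delta tables; plain dicts stand in here.

def to_silver(bronze_rows):
    """Clean raw (bronze) rows: drop incomplete records, normalise types."""
    silver = []
    for row in bronze_rows:
        if row.get("employee_id") is None or row.get("salary") is None:
            continue  # discard malformed source records
        silver.append({
            "employee_id": str(row["employee_id"]).strip(),
            "dept": (row.get("dept") or "UNKNOWN").upper(),
            "salary": float(row["salary"]),
        })
    return silver

def to_gold(silver_rows):
    """Aggregate cleaned rows into a reporting (gold) view: salary by dept."""
    totals = {}
    for row in silver_rows:
        totals[row["dept"]] = totals.get(row["dept"], 0.0) + row["salary"]
    return totals

bronze = [
    {"employee_id": 1, "dept": "hr", "salary": "50000"},
    {"employee_id": 2, "dept": "hr", "salary": "60000"},
    {"employee_id": None, "dept": "it", "salary": "70000"},  # bad record
    {"employee_id": 3, "dept": None, "salary": "40000"},
]

gold = to_gold(to_silver(bronze))
print(gold)  # {'HR': 110000.0, 'UNKNOWN': 40000.0}
```

Each layer only reads from the layer below it, which is the property that makes migrated SSIS logic easier to validate step by step.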
Primary Skills:
• Develop and orchestrate data pipelines using:
  • Fabric Data Pipelines (Azure Data Factory equivalent)
  • PySpark / Spark SQL notebooks
  • Dataflows Gen2
  • API integration
• MS SQL Server 2016 (On-Premises, Legacy Systems)
• Advanced T-SQL (Stored Procedures, Functions, Views, Triggers)
• Strong understanding of complex joins, CTEs, and query optimization
• Excellent analytical and documentation skills
• Ability to translate technical SQL logic into functional documentation
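As a hedged illustration of the CTE-based T-SQL logic this role would reverse-engineer and document: the snippet below uses SQLite (standing in for SQL Server, since the dialects overlap for this pattern), and the `payroll` table, its columns, and the above-average-salary rule are all invented for the example.

```python
import sqlite3

# Hypothetical example of the kind of CTE logic to analyze and document.
# SQLite stands in for SQL Server; table and column names are invented.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE payroll (employee_id INTEGER, dept TEXT, salary REAL);
    INSERT INTO payroll VALUES (1, 'HR', 50000), (2, 'HR', 60000),
                               (3, 'IT', 70000), (4, 'IT', 90000);
""")

# The CTE computes per-department averages; the outer query then flags
# employees paid above their department's average.
rows = conn.execute("""
    WITH dept_avg AS (
        SELECT dept, AVG(salary) AS avg_salary
        FROM payroll
        GROUP BY dept
    )
    SELECT p.employee_id, p.dept, p.salary
    FROM payroll p
    JOIN dept_avg d ON d.dept = p.dept
    WHERE p.salary > d.avg_salary
    ORDER BY p.employee_id
""").fetchall()

print(rows)  # [(2, 'HR', 60000.0), (4, 'IT', 90000.0)]
```

Documenting such a query for Workday re-implementation would mean stating the business rule in prose ("flag employees above their department's average salary") plus a data mapping for each referenced column.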
Secondary / Good-to-Have Skills:
• .NET
• SSIS / Power BI knowledge
• Agile delivery experience
If you're interested, you can also share your updated resume at: [Confidential Information]
Job ID: 147321165
We don’t charge any money for job offers