We are looking for a skilled Power BI + Snowflake Developer with 4-5 years of experience to join our team in Bangalore. The ideal candidate will have strong experience working across the data stack, building scalable data solutions, and delivering impactful dashboards and insights for business stakeholders.
This role requires strong expertise in data warehousing, data pipelines, backend data processing, and BI visualization using Power BI.
Experience: 4-5 Years
Location: Bangalore (Work From Office)
Shift: 2:00 PM - 11:00 PM IST
Employment Type: Full-Time
Key Responsibilities
- Design, develop, and maintain Power BI dashboards and reports to support business decision-making.
- Work with Snowflake data warehouse to manage, transform, and analyze large datasets.
- Develop and optimize data pipelines and transformations to ensure efficient data flow.
- Write and optimize complex SQL queries for data extraction, transformation, and analysis.
- Design and implement data models (fact and dimension tables, star schema) for reporting and analytics.
- Integrate data from multiple sources into Snowflake for centralized reporting.
- Ensure data accuracy, performance optimization, and security of BI solutions.
- Collaborate with data engineers, analysts, and business stakeholders to understand reporting requirements.
- Troubleshoot and resolve data pipeline, query performance, and reporting issues.
Required Skills & Qualifications
- 4-5 years of experience in Data Engineering / Business Intelligence / Data Analytics roles.
- Strong hands-on experience with Snowflake including data loading, stages, Snowpipe, Streams, Tasks, and query optimization.
- Strong experience with Power BI (DAX, Power Query, dashboard development).
- Experience working with Power BI Service, including dataset management, report publishing, and dashboard sharing.
- Advanced SQL skills for complex data querying and transformation.
- Strong experience in backend data processing and data pipeline development.
- Solid understanding of data modeling concepts such as star schema, fact and dimension tables.
- Experience working across the full data stack including data ingestion, transformation, warehousing, and reporting.
- Strong analytical and problem-solving skills.
Good to Have (Preferred Skills)
- Experience with Python or other scripting languages for data processing.
- Exposure to cloud platforms such as AWS, Azure, or GCP.
- Familiarity with modern data pipeline or orchestration tools (e.g., Airflow, dbt).
- Experience integrating data from APIs or multiple enterprise data sources.