We are seeking a Senior Data Engineer & BI Specialist with deep expertise in data warehousing, ETL processes, and dashboarding, along with the ability to ensure data quality and governance. The ideal candidate will design, build, and optimise high-quality data warehouses, write complex and efficient queries, and own end-to-end data pipelines that power both ETL processes and BI dashboards.
Responsibilities
- Design and Build Data Warehouses: Architect scalable and efficient data warehouses that support analytics and reporting needs, drawing on a strong understanding of data warehouse and data lake concepts.
- Develop and Optimise ETL Pipelines: Write complex SQL queries and use tools like Apache Airflow to automate data pipelines.
- Query Optimisation and Performance Tuning: Write efficient SQL queries for ETL jobs as well as for dashboards in BI tools.
- Database Management: Work with MySQL, PostgreSQL, and Spark to manage structured and semi-structured data.
- Data Quality and Governance: Ensure data accuracy, consistency, and completeness through validation, monitoring, and governance practices.
- Implement Data Governance Best Practices: Define and enforce data standards, access controls, and policies to maintain a well-governed data ecosystem.
- Data Modelling and ETL Best Practices: Ensure robust data modelling and apply best practices for ETL development.
- BI and Dashboarding: Work with BI tools such as Power BI, Tableau, and Apache Superset to create insightful dashboards and reports.
- Propose and Implement Solutions: Identify and propose improvements to existing systems and take ownership of designing and developing new data solutions.
- Collaboration and Problem Solving: Work independently, collaborate with cross-functional teams, and proactively troubleshoot data challenges.
Requirements
- 3-6 years of experience in the data domain (data engineering + BI).
- Strong SQL skills with expertise in writing efficient and complex queries.
- Hands-on experience with data warehouse concepts and ETL best practices.
- Proficiency in MySQL, PostgreSQL, and Spark.
- Experience with Python and building pipelines in Airflow or similar tools.
- Strong understanding of data modelling techniques for analytical workloads.
- Experience with Power BI, Tableau, or Apache Superset for reporting and dashboarding.
- Experience with data quality frameworks, data validation techniques, and governance policies.
- Ability to work independently, identify problems, and propose effective solutions.
- Experience building real-time pipelines is preferred.
- Experience handling multi-tenant data is a plus.
- Bonus: Experience with dbt for data transformations.
- Enthusiasm for learning new concepts and technologies, with the ability to implement them with minimal supervision.
- Strong adherence to best practices in coding and database/visualisation development.
- Excellent problem-solving skills, attention to detail, and communication skills.
This job was posted by Nexquare Hiring.