Project Role: Data Engineer
Project Role Description: Design, develop, and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform, and load) processes to migrate and deploy data across systems.
Must-have skills: Tableau
Good-to-have skills: Microsoft SQL Server, Google BigQuery
Minimum 7.5 years of experience is required.
Educational Qualification: 15 years of full-time education
Summary:
As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand data requirements and provide innovative solutions to enhance data accessibility and usability.
Roles & Responsibilities:
- Strong understanding of the Tableau tool and its supporting architecture.
- Hands-on experience working with Tableau.
- Strong understanding of, and hands-on experience with, SQL.
- Experience working directly with and consulting business clients to design solutions.
- Relevant experience in end-to-end BI development and solutioning.
- Design and implement innovative data visualization solutions that enhance user experience.
- Collaborate with cross-functional teams to gather requirements and translate them into effective visualizations.
- Continuously evaluate and improve existing dashboards and reports based on user feedback.
- Stay updated with the latest trends in data visualization and business intelligence to bring fresh ideas to the team.
- Mentor junior team members in best practices for data visualization and analysis.
Professional & Technical Skills:
- Must-Have Skills: Proficiency in Tableau.
- Good-to-Have Skills: Experience with Microsoft SQL Server and Google BigQuery.
- Universe Design & Development:
- Minimum 3-5 years of hands-on experience with the Universe Design Tool (UDT) for .unv universes and the Information Design Tool (IDT) for .unx universes
- Expertise in developing and maintaining universes based on Oracle relational databases, including the use of stored procedures and materialized views
- Ability to manage and optimize complex joins, including outer joins, shortcut joins, and custom SQL within IDT
- Strong command of contexts, aliases, and strategies to resolve loops, fan traps, and chasm traps
- Design and maintenance of object hierarchies, measure objects, and detail objects with proper aggregation and formatting
- Creation of prompt functions for dynamic user-input filtering and integration with row-level security
- UNV to UNX Conversion:
- In-depth experience in analyzing, converting, and validating legacy UNV universes to UNX using IDT
- Ensuring parity in functionality, object logic, and data results post-migration
- Documentation and version control of universe changes during conversion
- Web Intelligence (WebI) Report Repointing:
- Skilled in identifying reports bound to UNV universes and repointing them to the corresponding UNX universes
- Validating data consistency and report performance post-repointing
- Updating input controls, variables, and prompts in WebI documents to align with the new UNX structure
- Troubleshooting and resolving any mismatches or failures during the repointing process
- Performance Tuning & Optimization:
- Use of aggregate awareness, index awareness, and derived tables to improve performance
- Optimization of universe structure to support efficient query generation
- Use of Query Stripping and WebI data tracking features
- Experience working with large datasets and optimizing data fetch for reports
Additional Information:
- The candidate should have minimum 5 years of experience in Tableau.
- This position is based at our Mumbai office.
- 15 years of full-time education is required.
- The candidate should be ready for Shift B and work as an individual contributor.