At The Institute of Clever Stuff (ICS), we don't just solve problems; we revolutionise results. Our mission is to empower a new generation of Future Makers today, creating a better tomorrow. Our vision is to pioneer a better future together. We are a consulting firm with a difference: powered by AI, we drive world-leading results from data and change.
We partner with visionary organisations to solve their toughest challenges, drive transformation, and deliver high-impact results. Our diverse network of data professionals, designers, software developers, and rebel consultants works alongside our virtual AI consultant, fortu.ai, combining human ingenuity with AI-powered intelligence to deliver smarter, faster and more effective results.
Meet fortu.ai
- Used by some of the world's leading organisations as a business question pipeline generator, ROI tracker, and innovation engine, all in one.
- Trained on 400+ accelerators and 8 years of solving complex problems with global organisations.
- With fortu.ai, we're disrupting a $300+ billion industry, turning traditional consulting on its head.
Key Responsibilities:
Complete Data Modelling Tasks
- Initiate and manage Gap Analysis and Source-to-Target Mapping Exercises.
- Gain a comprehensive understanding of the EA extract.
- Map the SAP source used in EA extracts to the AWS Transform Zone, AWS Conform Zone, and AWS Enrich Zone.
- Develop a matrix view of all Excel/Tableau reports to identify any missing fields or tables from SAP in the Transform Zone.
- Engage with SMEs to finalise the Data Model (DM).
- Obtain email confirmation and approval for the finalised DM.
- Perform data modelling using ER Studio and STTM.
- Generate DDL scripts for data engineers to facilitate implementation (a minimal sketch follows this list).
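To illustrate the last item, here is a minimal sketch of rendering a CREATE TABLE DDL statement from an STTM-style mapping held as plain Python data. The table, columns, and SAP source fields are hypothetical placeholders, not the actual model; in practice ER Studio would generate the DDL, so this only shows the shape of the task.

```python
# Minimal sketch: render one CREATE TABLE statement from an STTM-style
# mapping. All table, column, and SAP field names here are hypothetical.

STTM = {
    "target_table": "conform_zone.delivery_header",
    "columns": [
        # (target column, data type, SAP source field) - illustrative only
        ("delivery_id",    "VARCHAR(10)", "LIKP-VBELN"),
        ("shipping_point", "VARCHAR(4)",  "LIKP-VSTEL"),
        ("created_date",   "DATE",        "LIKP-ERDAT"),
    ],
}

def build_ddl(mapping: dict) -> str:
    """Render the DDL one column per line, with its SAP source as a comment."""
    cols = mapping["columns"]
    lines = []
    for i, (name, dtype, src) in enumerate(cols):
        comma = "," if i < len(cols) - 1 else ""
        lines.append(f"    {name} {dtype}{comma}  -- source: {src}")
    return (
        f"CREATE TABLE IF NOT EXISTS {mapping['target_table']} (\n"
        + "\n".join(lines)
        + "\n);"
    )

if __name__ == "__main__":
    print(build_ddl(STTM))
```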
Complete Data Engineering Tasks
- Set up infrastructure for pipelines, including Glue jobs, crawlers, scheduling, Step Functions, etc. (see the boto3 sketch after this list).
- Build, deploy, test and run pipelines on demand in lower environments.
- Verify data integrity, e.g. no duplicates and all expected columns present in the final table.
- Write unit tests for pipeline methods using standard testing tools (an illustrative pytest sketch follows this list).
- Apply consistent code formatting and linting.
- Collaborate with other Modelling Engineers to align on correct approach.
- Update existing pipelines for CZ tables (SDLF and OF) where necessary, adding new columns if they are required for EZ tables.
- Raise DDP requests to register databases and tables, and to load data into the raw zone.
- Create comprehensive documentation; ensure each task is accompanied by detailed notes specific to its functional area for clear tracking and reference.
- Analyse and manage bugs and change requests raised by the business/SMEs.
- Collaborate with Data Analysts and Virtual Engineers (VEs) to refine and enhance semantic modelling in Power BI.
- Plan out work using Microsoft Azure DevOps (ADO), ensuring dependencies, status, and effort are correctly reflected.
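To make the infrastructure item above concrete, below is a minimal boto3 sketch that registers a Glue job, a raw-zone crawler, and a daily schedule. The job name, role ARN, bucket paths, and region are placeholders, and a real deployment would more likely use infrastructure-as-code than ad-hoc API calls.

```python
# Minimal sketch (names, ARN, region, and S3 paths are placeholders):
# register a Glue job, a crawler over the raw zone, and a daily trigger.
import boto3

glue = boto3.client("glue", region_name="ap-southeast-2")  # placeholder region

# ETL job that runs a pipeline script stored in S3.
glue.create_job(
    Name="cz_delivery_header_load",
    Role="arn:aws:iam::123456789012:role/glue-pipeline-role",  # placeholder
    Command={
        "Name": "glueetl",
        "ScriptLocation": "s3://example-bucket/scripts/cz_delivery_header.py",
        "PythonVersion": "3",
    },
    GlueVersion="4.0",
    WorkerType="G.1X",
    NumberOfWorkers=2,
)

# Crawler to keep the raw-zone catalog tables up to date.
glue.create_crawler(
    Name="raw_zone_delivery_crawler",
    Role="arn:aws:iam::123456789012:role/glue-pipeline-role",  # placeholder
    DatabaseName="raw_zone",
    Targets={"S3Targets": [{"Path": "s3://example-bucket/raw/delivery/"}]},
)

# Daily 02:00 UTC schedule for the job.
glue.create_trigger(
    Name="cz_delivery_header_daily",
    Type="SCHEDULED",
    Schedule="cron(0 2 * * ? *)",
    Actions=[{"JobName": "cz_delivery_header_load"}],
    StartOnCreation=True,
)
```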
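And for the integrity and unit-testing items, a small pytest sketch, assuming pipeline outputs can be checked as pandas DataFrames; the helper, key column, and required-column set are hypothetical stand-ins.

```python
# Minimal sketch (hypothetical helper and column names): a pipeline-side
# integrity check plus pytest unit tests covering the "no duplicates,
# all expected columns present" requirements.
import pandas as pd
import pytest

REQUIRED_COLUMNS = {"delivery_id", "shipping_point", "created_date"}

def check_integrity(df: pd.DataFrame, key: str = "delivery_id") -> None:
    """Raise ValueError if required columns are missing or the key repeats."""
    missing = REQUIRED_COLUMNS - set(df.columns)
    if missing:
        raise ValueError(f"missing columns: {sorted(missing)}")
    if df[key].duplicated().any():
        raise ValueError(f"duplicate values in key column {key!r}")

def test_clean_frame_passes():
    df = pd.DataFrame({
        "delivery_id": ["D1", "D2"],
        "shipping_point": ["SP01", "SP02"],
        "created_date": ["2024-01-01", "2024-01-02"],
    })
    check_integrity(df)  # should not raise

def test_duplicate_key_fails():
    df = pd.DataFrame({
        "delivery_id": ["D1", "D1"],
        "shipping_point": ["SP01", "SP02"],
        "created_date": ["2024-01-01", "2024-01-02"],
    })
    with pytest.raises(ValueError, match="duplicate"):
        check_integrity(df)
```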
Required Skills:
- Proven experience in data modelling and data pipeline development.
- Proficiency with tools such as ER Studio, STTM, AWS Glue, Redshift, Athena, and Power BI.
- Strong SQL and experience with generating DDL scripts.
- Experience working in SAP data environments.
- Experience in any of these domain areas is highly desirable: Logistics, Supply Planning, Exports and IFOT (In Full, On Time).
- Familiarity with cloud platforms, particularly AWS.
- Hands-on experience with DevOps and Agile methodologies (e.g., Azure DevOps (ADO)).
- Strong communication and documentation skills.
- Ability to work collaboratively with cross-functional teams.