Responsibilities:
- Conduct a comprehensive assessment of the current CMDB architecture, identify data gaps, and define a roadmap to a modern, AI-ready CMDB ecosystem.
- Design and implement a target-state architecture that integrates ServiceNow with Databricks and Microsoft Fabric, enabling gold-tier datasets to flow into the CMDB from diverse source systems.
- Modernize CI lifecycle management, reconciliation, and data quality processes through automated, AI-assisted, and data-driven workflows.
- Develop and maintain ServiceNow modules supporting ITSM and CMDB governance, including workflow automation, business rules, and module configuration.
- Reduce dependencies on IT teams by creating data-driven solutions to automate relationship mapping, dependency integrity, and reconciliation.
- Introduce AI/ML capabilities such as knowledge graphs and predictive intelligence to surface relationships between assets and proactively identify dependencies.
- Design and oversee CMDB audits, KPI dashboards, and continuous improvement processes to maintain data quality and integrity.
- Leverage Azure cloud services and Databricks for scalable data ingestion, transformation, and processing.
- Implement automated job scheduling and observability with self-healing AIOps capabilities.
- Serve as the technical lead and mentor for the CMDB team, establishing best practices, coding standards, and architectural principles.
- Hold ultimate accountability for the quality and integrity of the CMDB.
- Manage a team of passionate CMDB data engineers.
What We're Looking For:
Key qualifications:
- Experience: 5-10 years in data engineering and/or CMDB management and architecture.
- Education: Bachelor's degree in Computer Science, Engineering, Information Systems, or a related field.
- CMDB Expertise: Deep understanding of CMDB concepts including CI lifecycle management, data model governance, relationship mapping, reconciliation, audits, and KPI dashboards.
- ServiceNow Development: Hands-on experience with workflow automation, business rules, module configuration, discovery, and integration with enterprise systems.
- Data Engineering: Strong skills in ETL/ELT pipeline design, Python for data transformation, Databricks for scalable processing, and Microsoft Fabric for data lakehouse architectures and data distribution.