Advancing the Skills-Based Architecture Journey from Ontology to Execution
The arrival of LLMs has given the Skills-Based Architecture journey new hope and life. Before this, many companies saw it as a major investment project: platforms and solutions that promised skills-based architecture often came with significant upfront costs and little return on investment. This is why, at Draup, we have been focusing on small wins in the Skills journey. I thought I would begin this week’s email by highlighting the differences between a Skills Ontology and a Skills Architecture.
We often use Skills Architecture and Skills Ontology interchangeably, but there are differences between the two:
- A skills ontology is a structured, machine-readable map of how skills relate to each other and to roles; it is used mainly for AI, search, and matching.
- Skills architecture is a broader organizational framework that defines how skills are applied across roles, job families, and talent systems.
- Ontologies focus on semantic relationships (e.g., “Python” is a programming skill), while architectures focus on business applications (e.g., which skills are required for a Data Engineer).
- Ontologies are often industry- or platform-wide (like ESCO or O*NET), whereas architectures are usually customized for a specific enterprise.
- Skills architecture typically incorporates ontologies as a foundational layer but extends to include proficiency levels, learning paths, and workforce planning use cases (a simple sketch of these two layers follows this list).
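To make the distinction concrete, here is a minimal, purely illustrative sketch of the two layers in Python. The skill names, role names, proficiency labels, and structures are hypothetical examples, not Draup’s actual data model; a production ontology would typically live in a graph store or a dedicated skills platform rather than in plain dictionaries.

```python
# Illustrative only: a toy contrast between an ontology layer (semantic
# relationships between skills) and an architecture layer (how the business
# applies those skills to roles). All names and structures are hypothetical.

# Ontology layer: a machine-readable map of skills and their relationships
skills_ontology = {
    "Python": {"type": "programming_language", "related_to": ["dbt", "Snowflake"]},
    "dbt": {"type": "data_transformation_tool", "related_to": ["Snowflake"]},
    "Snowflake": {"type": "cloud_data_warehouse", "related_to": []},
}

# Architecture layer: built on top of the ontology, adding enterprise context
# such as job families and required proficiency levels per role
skills_architecture = {
    "Data Engineer": {
        "job_family": "Engineering",
        "required_skills": {"Python": "Expert", "dbt": "Intermediate", "Snowflake": "Intermediate"},
    },
    "Data Analyst": {
        "job_family": "Analytics",
        "required_skills": {"Python": "Intermediate", "Snowflake": "Beginner"},
    },
}

# Example query against the architecture layer: which roles need Expert-level Python?
expert_python_roles = [
    role for role, spec in skills_architecture.items()
    if spec["required_skills"].get("Python") == "Expert"
]
print(expert_python_roles)  # ['Data Engineer']
```

The point of the sketch is the separation of concerns: the ontology captures what skills are and how they relate, while the architecture captures how a specific enterprise applies them to roles.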
A strong skills architecture is not built overnight. It’s a layered journey, often starting with ontologies and expanding to include proficiency models, upskilling pathways, and succession planning. That’s why early pilots, targeted use cases, and iterative modeling are essential. These small, measurable successes compound over time, creating a resilient, enterprise-ready skills architecture that is both data-rich and business-aligned. This is how Draup has approached the problem over the years.
Examples of Small Wins in a Skills Architecture Journey:
- Combining two near-identical roles into one streamlined role (e.g., merging “Software Engineer II” and “Software Engineer – Cloud” where appropriate)
- Identifying the correct tech stack for a role (e.g., ensuring a Data Engineer role is mapped to tools like Snowflake, dbt, and Python, rather than legacy tech)
- Identifying the root skill of a job (e.g., recognizing that the core of a “Marketing Automation Specialist” role is CRM system expertise)
- Correcting outdated or ambiguous job titles (e.g., refreshing “Webmaster” to “Digital Experience Manager” to reflect modern responsibilities)
- Mapping 10–20 roles to an initial draft of skill families and task libraries (e.g., starting with Sales, HR, or Engineering rather than the whole organization)
- Clarifying proficiency levels needed per skill (e.g., Intermediate vs. Expert Python skills for a Data Analyst vs. a Data Scientist)
- Aligning one business unit’s hiring profiles to a future-state skills framework (e.g., updating hiring JD templates and skills-based interview guides to reflect AI-augmented skill needs)
- Building a small reskilling map for one emerging job family (e.g., showing how traditional network engineers could transition into cloud networking roles)
- Creating a basic crosswalk between job titles and industry-standard taxonomies (e.g., mapping internal role titles to O*NET, ESCO, or Nasscom references to ground skills analysis; see the sketch after this list)
- Clarifying different AI roles instead of having a single generic AI title (e.g., distinguishing an AI Research Scientist from an Applied Machine Learning Engineer or an AI Product Manager, each with distinct skill sets)
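As an illustration of the crosswalk idea above, here is a minimal sketch. The internal titles, O*NET-SOC codes, and mappings are illustrative assumptions only and should be validated against the current O*NET (or ESCO/Nasscom) release before being used to ground any skills analysis.

```python
# Illustrative only: a minimal crosswalk from internal role titles to an
# external taxonomy such as O*NET-SOC. The codes below are examples; verify
# them against the current O*NET release before relying on them.
title_crosswalk = {
    "Software Engineer II": {"onet_code": "15-1252.00", "onet_title": "Software Developers"},
    "Digital Experience Manager": {"onet_code": "15-1255.00", "onet_title": "Web and Digital Interface Designers"},
    "Data Engineer": {"onet_code": "15-1243.00", "onet_title": "Database Architects"},
}

def to_external(internal_title: str) -> str:
    """Return the external taxonomy reference for an internal title, if mapped."""
    entry = title_crosswalk.get(internal_title)
    return f'{entry["onet_code"]} ({entry["onet_title"]})' if entry else "unmapped"

print(to_external("Software Engineer II"))  # 15-1252.00 (Software Developers)
print(to_external("Webmaster"))             # unmapped
```

Even a small lookup table like this gives teams a shared reference point before investing in a full enterprise-wide mapping.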
I am attaching a short write-up on Draup Roles and Skills Evolution, which highlights how Draup can help with these small wins. This document may be useful for you.