The second most frequently asked question I receive when presenting my work is whether the BIM performance models I use contradict the UK’s BIM maturity levels. The short answer is ‘no’, but the longer answer is much more interesting and possibly a bit controversial. This episode compares the Bew-Richards BIM Maturity Model (UK) with the BIM performance models developed as part of the BIM Framework. The comparison highlights the benefits of separating country-specific strategy models from country-agnostic performance models, and explains why both are needed in every market.
This episode is available in other languages. For a list of all translated episodes, please refer to http://www.bimthinkspace.com/translations.html. The original English version continues below:
This post will first introduce the maturity/performance models and then compare their main attributes in a simple table, followed by a brief summary. To keep this episode as short as possible, I've deferred detailed discussion of some topics (e.g. compliance testing vs performance assessment) to future posts and moved much of the detail into footnotes.
The Wedge
The UK maturity model - also known as the iBIM model (the name of its highest level) or the BIM Wedge (due to its famous shape) - was developed by Mark Bew and Mervyn Richards in 2008 [1]. There are many versions of the base model with subtle but meaningful differences; the one shown below (Fig.1) [2] appeared in the UK Government Construction Client Group (GCCG) report in 2011:
Fig. 1. The UK BIM Maturity Model (GCCG, 2011) - online version
This now ubiquitous image is a clear embodiment of UK’s BIM strategy and its wide-ranging industry initiatives. In its basic form, the model includes four levels with stable (levels 0 and 1), stabilising (level 2), and yet to stabilise (level 3) definitions [3].
Since it was first developed, the BIM Maturity Model has established itself as a key component of a national (UK) BIM diffusion policy. It is now very difficult to isolate BIM levels from other UK-centric construction industry strategies (e.g. Soft Landings), workflows (e.g. RIBA Plan of Work), roles (e.g. Information Manager), and protocols (e.g. UK’s version of COBie [4]). Applied more widely without its UK-specific components, the BIM Wedge loses much of its strength, yet it can still be used to define the long-term objectives of varied stakeholders. It’s an open-ended, optimistic model that – even graphically – invites others to imagine subsequent levels [5] and to add layers of meaning [6] on top of those already defined.
The main shortcoming of the BIM Maturity Model is that it’s not a maturity model in the true sense of the word [7]. A more accurate description of it would be a strategy model, a policy model or an industry roadmap. This is due to the structure and evolving definitions of BIM levels which render them unsuitable for assessing BIM diffusion across markets or BIM performance within organizations:
- The BIM Levels (especially levels 2 and 3) act as containers for multiple guides, standards and other policy components. We can assess against each component within the container (e.g. assess awareness of, or compliance with PAS1192-2:2013 or Soft Landings) but we can’t assess against the container as a whole, especially as existing components are continually updated and new ones are still being added.
- The BIM Levels cannot be used – and probably were never intended – to assess the abilities of individuals, organizations or teams. Although it is referred-to as a ‘maturity index’ in the GCCG report (2011, p.16), the iBIM model does not include the necessary assessment metrics. That is, you cannot use BIM Levels to establish an organization’s ability to collaborate with others, conduct model-based thermal analysis, or deliver high-quality 4D construction sequencing.
- What BIM Levels can do very efficiently is identify policy/learning targets and test compliance [8] against pre-defined standards. That is, each container/level represents a well-defined ‘set of targets’ for stakeholders to learn about and comply with. The more detailed, and more standardised the components and sub-components within these containers, the easier it is to test stakeholders’ awareness of and experience in each and all elements.
So why are these distinctions significant: containers vs levels, strategy model vs maturity model, and performance assessment vs compliance testing? These distinctions wouldn’t be as critical if the BIM Maturity Model were referred to as a ‘UK BIM Roadmap’ or a ‘UK BIM Strategy Model’. However, the unfortunate naming of the Wedge/iBIM figure as a ‘Maturity Index’ has several potential consequences:
- It confuses strategic targets (with set future dates [9]) with compliance milestones (with set standards and protocols) and compiles both under a maturity label;
- It hampers, or at least delays, the development of a separate maturity assessment model; and
- It will necessarily favour ‘evidence-based’ compliance testing [10] over ‘outcome-based’ performance metrics.
These consequences - which we're starting to witness today - are avoidable provided we separate the necessarily rigid strategy targets from the necessarily flexible performance levels. This will hopefully make more sense after reading the next section.
The Framework
The BIM Framework [11] includes a large set of conceptual models which complement each other to explain the BIM landscape and deliver tools for capability assessment, learning and performance improvement. For example, the BIM Capability concept - one of the three main framework dimensions - is used in establishing/assessing BIM performance milestones within organizations and teams (but cannot measure countries). Also, the BIM Maturity concept is used more widely to assess the performance of both organizations (and their sub-units – e.g. departments) and whole markets (and their subdivisions – e.g. countries). The two concepts, their history, and interdependence are clarified below:
Stages and Levels
The ‘BIM Stages’ model was first introduced through BIM ThinkSpace (Episode 8 – Feb 2008) and then published in a high-impact journal [12] as BIM Maturity Stages [13]. The model identified three BIM stages: [1] Object-based Modelling, [2] Model-based Collaboration, and [3] Network-based Integration. These stages are preceded by a pre-BIM status (a null or zero stage) and followed by a post-BIM future [14]. In 2009, the maturity concept was separated from the capability concept and introduced as the BIM Maturity Index (BIMMI) [15] with five levels of maturity: [a] Ad-hoc or low maturity; [b] Defined or medium-low maturity; [c] Managed or medium maturity; [d] Integrated or medium-high maturity; and [e] Optimised or high maturity.
BIMMI has been covered in two previous BIM ThinkSpace posts: Episode 13 clarified how the five maturity levels (a, b, c, d and e) are used alongside BIM capability stages (1, 2 and 3) to measure organizational abilities; and Episode 21 clarified how the same five levels are used alongside eight macro maturity components to assess and compare market maturity. These different uses highlight two important attributes of BIMMI: (i) it is usable in assessing both organizational and country-wide maturity, and (ii) it cannot be used on its own and must be coupled with another model.
Point of Adoption or S-Curve model
While the macro BIM maturity assessment has been thoroughly covered in the previous episode, it is beneficial to highlight the unique strength of integrating capability stages with maturity levels to assess organizational performance. This is best represented by the Point of Adoption (PoA) model [16] (or S-Curve BIM Model) which is further explained below:
Fig. 2. The Point of Adoption model v1.0 (Succar, 2014) - (full size, current version)
As illustrated in Fig.2, BIM implementation starts at the Point of Adoption (PoA) when an organization, after a period of planning and preparation (readiness), successfully adopts object-based modelling tools and workflows. The PoA [17] thus marks the initial capability jump from no BIM abilities (pre-BIM status) to minimum BIM capability (Stage 1). As the adopter interacts with other adopters, a second capability jump (Stage 2) marks the organization’s ability to successfully engage in model-based collaboration. Then, as the organisation starts to engage with multiple stakeholders across the supply chain, a third capability jump (Stage 3) is necessary to benefit from integrated, network-based tools, processes and protocols. Each of these capability jumps is preceded by considerable investment in human and physical resources, and each stage signals new organizational abilities and deliverables not available before the jump.
However, the deliverables of different organizations at the same capability stage may vary in quality, repeatability and predictability. This variance in performance excellence occurs as organizations climb their own BIM maturity curve, experience their internal BIM diffusion (adoption within an organization), and gradually improve their performance over time. As shown in Fig. 2, there are multiple maturity S-curves reflecting the mixed nature of BIM adoption, even within the same organization. This is due to the phased nature of BIM with each revolutionary stage requiring its own readiness ramp, capability jump, maturity climb, and point of adoption. This is also due to varied abilities across organizational sub-units and project teams. For example, while the Melbourne branch of Organization A may have excellent model-based collaboration capabilities, the Athens branch may have basic modelling capabilities, and the Hyderabad branch may still be preparing to implement Revit, ArchiCAD or Tekla. This variance in ability makes it necessary to use both capability and maturity measurements to generate a compiled rating for organization A as it simultaneously prepares for BIM, implements BIM, and improves its BIM performance.
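The branch example above can be sketched as a small calculation. To be clear, the scoring scheme below (mapping maturity levels a–e to 1–5 and summarising capability by the weakest branch) is a hypothetical illustration of what a compiled rating might look like, not the BIM Framework’s actual assessment method.

```python
# Hypothetical sketch of a compiled organizational rating combining
# per-branch capability stages (0 = pre-BIM .. 3 = network-based
# integration) and maturity levels (a = ad-hoc .. e = optimised).
# The numeric mapping and summary rules are invented for illustration.

MATURITY_SCORE = {"a": 1, "b": 2, "c": 3, "d": 4, "e": 5}

branches = {
    "Melbourne": {"stage": 2, "maturity": "d"},  # model-based collaboration, medium-high
    "Athens":    {"stage": 1, "maturity": "b"},  # object-based modelling, medium-low
    "Hyderabad": {"stage": 0, "maturity": "a"},  # pre-BIM, still preparing to implement
}

def compiled_rating(branches):
    """Summarise an organization's mixed BIM abilities across sub-units.

    Capability is reported as a min/max range (organization-wide delivery
    is constrained by the least capable branch), while maturity is
    averaged to indicate overall performance consistency.
    """
    stages = [b["stage"] for b in branches.values()]
    scores = [MATURITY_SCORE[b["maturity"]] for b in branches.values()]
    return {
        "min_capability_stage": min(stages),
        "max_capability_stage": max(stages),
        "mean_maturity": sum(scores) / len(scores),
    }

print(compiled_rating(branches))
```

Even this toy version shows why a single number is insufficient: Organization A spans stages 0 to 2 at the same time, so any honest compiled rating has to carry both a capability measure and a maturity measure.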
The Point of Adoption model highlights how capability stages and maturity levels are used to assess/facilitate BIM implementation within organizations and – in combination with other Framework models – BIM diffusion across markets. However, these performance models are not intended to communicate a national BIM policy. Since they do not identify time-based targets, the varied performance models can be mapped to country-specific objectives, varied industry initiatives and any structured timescale.
Comparison Table
Based on the above, Table 1 highlights the main differences and similarities between the UK’s maturity model and the Framework’s performance models:
| Main Model/Framework | [1] BIM Maturity Model (UK) | [2] BIM Performance Models |
| --- | --- | --- |
| A. Developed by | Mark Bew and Mervyn Richards | Bilal Succar |
| B. First Published | 2008/2010 | 2008 |
| C. Geographic Application | UK-specific | Applies across countries |
| D. Main Sub-Models (only those related to performance) | No sub-models | BIM Capability Stages; BIM Maturity Index (BIMMI); Individual Competency Index (ICI) [18] |
| E. Main Sub-Components | A large set of well-defined UK-centric requirements and deliverables | Placeholders only for market-centric requirements and deliverables; can be connected to the sub-components of any country |
| F. Number of Levels/Stages | 4 maturity levels (additional levels may be added) | 5 capability stages + 5 maturity levels + 5 competency levels (no additional stages or levels allowed within each model) |
| H. Definitions | Fixed for Levels 0 and 1; Level 2 defined in 2015; Level 3 not yet fully defined | Definitions of capability stages and maturity levels fixed since 2010 (competency levels added in 2013 using the same 5-level formula) |
| I. Authority | Authoritative; adopted as part of a national strategy; increasingly imitated by other policy makers | Authority by research impact (e.g. citation count) and professional adoption (modified Jan 18, 2017) |
| J. Research Base | Not based on academic research; its conceptual bases are unknown | Based on academic research, with its models published in peer-reviewed papers [19]; conceptual bases are exposed to scrutiny |
| K. Ability to Measure | Enables compliance testing of organizations and teams against sub-components (e.g. PAS1192 parts 2-4); cannot measure organizational performance or market maturity; does not allow compiled ratings or continuous measurement | Can measure organizational performance, team compatibility, market maturity and individual competency; can test compliance when levels are mapped to the sub-components of different countries; enables compiled ratings and continuous measurement [20] at various levels of granularity [21] |
| L. Ability to Communicate | Simple for most stakeholders to understand; can collate a variety of targets and compliance areas within each level | Simple for specialised stakeholders to understand; can collate a variety of performance improvement steps in-between stages |
| M. Main Weakness | Mixes strategy targets with performance levels; loses its essence outside the UK | The models are not formally endorsed by an authority [22] |

Table 1. Comparison Table
In summary, and based on the above comparisons, the UK BIM Maturity Model and the BIM Framework’s performance models are not contradictory but quite complementary. This is especially true if the iBIM/Wedge figure is understood as a strategic roadmap rather than a maturity index as currently referenced and widely understood.
As illustrated in Episode 20, a truly mature BIM market requires a combination of strategic objectives (Component I) and performance metrics (Component IV). That is, in each market and within each organization, we need both types of models: (1) we need a strategy model to define long-term objectives and a timescale to reach these objectives; and (2) we need a performance model to measure and improve the abilities of stakeholders against these defined objectives. While a BIM strategy model and a BIM performance model are not contradictory, relying solely on either is as efficient as using a compass to measure the height of a building, or using a ruler to navigate to the North Pole.
One more thing: this episode is intended to clarify concepts and generate discussions around national BIM policies, BIM maturity models and performance improvement. These topics are especially important as new roadmaps [23] and maturity models [24] are starting to make their presence known. I’m therefore keen to share my thoughts with you and read your comments; especially if you wholly disagree with my approach, comparisons or analysis.
And, in case you’re still wondering, the most frequently asked question I receive is: “what software do you use to generate your [add adjective] graphics?” And the short answer is…[25]
[1] I could not find a publicly available copy preceding 2010. If one exists, please let me know and I’ll update this post.
[3] Refer to BIM Maturity Levels (Designing Buildings Wiki, 2014) and The 20 Key Terms you need to Know (NBS, 2014)
[4] Yes, discrepancies between US and UK use/abuse of COBie have recently emerged. Trick question: is COBie a ‘standard’, a ‘specification’ or a ‘protocol’? The answer to this will place you in either camp, or somewhere in between
[5] Additional levels beyond Level 3 BIM are imagined/expected by many (e.g. the BIM2050 Report)
[6] There are a number of varieties for each model developed by third parties for their own purposes. For example, please refer to buildingSMART’s technical roadmaps (webpage)
[7] The term maturity within organizational studies typically refers to acquired abilities. That is, an organization or market is said to mature as its performance level builds upon earlier performance levels. A maturity level is not a goal in itself but a “well-defined evolutionary plateau that institutionalizes new capabilities” (SEI, 2008)
[8] Performance is the ability to conduct an activity or to generate a desirable outcome. To measure performance, the question typically asked is “do you [or your organization] have the ability to….?” Compliance is more concerned with testing whether an individual/organization is following a set procedure, code or standard (compliance testing doesn’t measure outcomes). Note that performance assessment includes compliance testing, but not vice versa. For example, to generate great-tasting hospital food (if such a thing exists!), you need to be able to cook tasty food (performance) and comply with dietary requirements and health standards (compliance)
[9] There are set compliance dates attached to BIM Level 2 and BIM Level 3 with some variances in applicability by project type (new vs renovation). Beyond the well-known 2016 mandate for public projects, I couldn’t locate an online resource with consistently updated information. Please point one out to me and I’ll update this note
[10] Stakeholders will be asked to provide evidence that they've complied with PAS1192-X (and similar) rather than demonstrate an ability/excellence in delivering best-in-class products and services
[11] The BIM Framework is the main deliverable of my PhD research at the University of Newcastle (2005-2013). It is continuously evolving and now has its own blog: www.BIMframework.info
[12] The BIM Maturity Stages model appeared in “Succar, B. (2009). Building information modelling framework: A research and delivery foundation for industry stakeholders. Automation in Construction, 18(3), 357-375”. This journal article was first available online on December 6, 2008 (download paper: http://bit.ly/BIMPaperA2)
[13] A hybrid version of the BIM Stages model was published as the ‘Towards Integration’ graphic within the CRC-CI National Guidelines for Digital Modelling (pages 12-13, PDF 2.9Mb). Sub-stages were identified by Mr Andrew Gutteridge (AIA), the Chair of the Integrated Digital Modelling Taskforce partaking in the development of the Guidelines
[14] The last stage is referred to as virtual integrated Design, Construction and Operation (viDCO). According to this model, there are no real stages beyond Stage 3. After that, the BIM term is predicted to diffuse into larger systems, schemata and interconnected data repositories
[15] Refer to “Succar, B. (2010). Building Information Modelling Maturity Matrix. In J. Underwood & U. Isikdag (Eds.), Handbook of Research on Building Information Modelling and Construction Informatics: Concepts and Technologies (pp. 65-103): IGI Publishing.” This book chapter was first released in December, 2009 (download paper: http://bit.ly/BIMPaperA3)
[16] The Point of Adoption model will be published as part of a peer-reviewed journal article in the near future. The article is co-authored by Dr Mohamad Kassem of Teesside University (UK)
[17] The Point of Adoption (PoA) is not to be confused with the critical mass ‘inflection point’ on the S-curve (Rogers, 1995; Rogers, Medina, Rivera, & Wiley, 2005); or with the ‘tipping point’, the critical threshold introduced by Gladwell (2001) in his famous The Tipping Point: How Little Things Can Make a Big Difference
[18] The Individual Competency Index (ICI) is discussed in Episode 17 on BIMThinkSpace.com and Item 15 on BIMFramework.info
[19] The BIM Framework is published across a number of peer-reviewed papers. If interested in assessing the research impact of these papers, please refer to my Google Scholar page
[20] Continuous measurement is different to staged measurement (using stages and levels). Continuous measurement is used for capability/competency profiling and gap analysis. The concepts of staged/continuous measurement are an adaptation of the staged/continuous representations within the Capability Maturity Model Integration (CMMI-SE/SW/IPPD/SS, V1.1), developed initially by the Software Engineering Institute (SEI) at Carnegie Mellon. Continuous measurement is a principle underlying the BIM Excellence methodology/tool; more about this in a future Episode.
[21] There are four Granularity Levels (GLevels): Discovery, Evaluation, Certification and Auditing – refer to the BIM Maturity Matrix paper
[22] The BIM Performance Metrics were lately acknowledged by a key pan-industry initiative in Australia. Please refer to Section 3.3 of the APCC/ACIF “A Framework for the Adoption of Project Team Integration and Building Information Modelling” – Dec 2014 (PDF 1.6Mb)
[23] Example: A Roadmap to Lifecycle Building Information Modeling in the Canadian AECOO Community (https://www.buildingsmartcanada.ca/roadmap-to-lifecycle-bim/)