CMMI Performance Continues to Impress, Improve

Author: Ron Lear, Vice President, Frameworks and Models, CMMI Content Development & Services
Date Published: 25 January 2023

2022 was an encouraging year for the CMMI. All of our adoption numbers were strong in 2022: more appraisals, more courses, and more new companies adopting, and we received and addressed over 2,000 comments on the previous releases of the CMMI Model and appraisal Method Definition Document (MDD). Additionally, Security and Safety appraisals were finally launched, aligning appraisals, the CMMI Appraisal System (CAS), and the Model. We also collectively achieved a significant reduction in our appraisal quality review time and in CMMI systems technical debt.

On behalf of the entire ISACA team, I would like to express my sincere gratitude to our customers, the community, and our internal ISACA teams for the focus that made that happen. Don’t worry—we’re not done. We still have work and improvement to do, and we will do it together. I am especially excited about the upcoming release of CMMI 3.0 this March.

What else? The other most notable accomplishment was completing the analysis and reporting of all 2019-2021 CMMI Performance Report data and publishing those results in the 2019-2021 CMMI Performance Summary Report. That catches us up with the performance improvement results for the first three years of CMMI Performance Solutions (formerly CMMI V2.0) adoption, which are, frankly, amazing!

AND! Stand by: thanks to our CMMI Performance and Data Analyst, Jennifer Nuessen, who joined us in January 2022, we will have the 2022 update to the CMMI Performance Summary Report out in Q1 of this year. Great work, Jen, and thank you!

Also, thanks to the community (and Jen’s analysis work), we identified a key improvement to the Performance Report template: adding the most common categories and subcategories used for business and performance objectives. This makes the data much more consistent and the template easier for appraisal teams to complete.

So, we will soon have four years of statistically validated performance data, and we are VERY interested to see the comparative analysis of re-appraised organizations whose 2- or 3-year renewal appraisals come up in 2023. This independently verified data clearly shows that when an organization implements processes that correctly meet the intent and value of CMMI practice areas and practices, the investment quickly pays for itself and consistently achieves critical business capability and performance objectives—81 percent of the more than 16,000 business and performance objectives identified from 2019-2021 were achieved. In that sentence, “correctly” means that the organization is consistently achieving its performance objectives and improvement goals.

But let’s look at this another way, through the lens of logic. CMMI is a framework and ecosystem for reliable, consistent best practices and performance outcomes. Businesses want a competitive advantage wherever they can get it, and most state, in one manner or another, that they value or prioritize continual improvement. And yet, how many of you have had a Sponsor ask, “What’s the minimum I need to do to pass the appraisal?” I’ve always been baffled by this question. We have a model of proven best practices with a clear track record of meeting critical business objectives 81% of the time, along with clearly improved performance, and you want to do just the MINIMUM? Sure, there is an investment cost and ROI tradeoff, but why wouldn’t an organization want to know the best practices for those areas where capability and performance outcomes are critical to its business?

Full disclosure: a few (OK, since I’m talking about measurements and accuracy, it was actually a total of three) of our Lead Appraisers reached out to us in 2022 to state or ask, “The Performance Report data isn’t really telling you what you think it is,” or “How can you state CMMI adoption was the reason these measurement and improvement goals were met?” It’s a fair point, especially if you look only at the surface of the data and ignore the intention of the Performance Report and the related designed-in performance aspects that are now central to CMMI Performance Solutions. You have to look at the entire process, not just the template or the information in it.

The MDD requires that appraisal teams complete and verify the Performance Report, appraise the practice statements AND value statements, and review any additional required information. Each of these is there to add value to the organization being appraised. However, it’s clear that this is not happening consistently quite yet, and we still have a learning curve around the purpose of the Performance Report and how to use it properly. The Practical Guide to completing the Performance Report has helped a lot, but this is still sinking in with appraisal teams. Appraisal teams are also required to verify that the information in the Performance Report is consistent with and aligns to the overall appraisal results, such as the Final Findings, as well as any weaknesses, improvement opportunities, or strengths.

In other words, we are relying on experienced, trained, and certified Lead Appraisers and their appraisal teams to:

  1. Complete the Performance Report as part of the appraisal activities. If the site personnel have drafted it, that’s fine, but the appraisal team still needs to verify every entry as part of the appraisal activities, and to verify that the measurements and analyses being used are clear and are helping the organization build its capabilities and improve performance.
  2. Verify that the Final Findings and other related appraisal results, i.e., practice characterizations, are consistent with the Performance Report and vice versa.

So, that’s my answer to the statements and questions I’ve received—the results in the Performance Report are supposed to be accurate, and they are supposed to be verified by our trained and experienced appraisal teams. But we also recognize that this is a learning curve, and it has been since we launched V2.0 in 2018/2019. We will continue to improve the Report and the process as we gather community feedback.