PuMP is a practical methodology, but it’s also quite a different way of thinking about how to measure performance. And because it’s so different, it’s natural that we sometimes overthink parts of PuMP, and sometimes underthink parts of it, too.


The Right Amount of Thinking Matters

The risk of overthinking something is that we spin our wheels, making no forward progress despite a lot of motion. The remedy is to back off a bit, regain traction, and then move forward. PuMP makes this a safe choice, because each step of PuMP affords the opportunity to check how the previous step worked out, and to make any tweaks or corrections before we go much further. That’s the 80% rule: permission (and, in fact, insistence) to move on once something is good enough, because overthinking works against successful progress.

The risk of underthinking something is that we forge ahead before we have what we need to make the next step successful. The remedy is to shift down a gear and stop taking shortcuts that have us arriving at the next step underprepared. It’s what many PuMP users mean when they say “trust the process”: a reminder that there is no superfluous step in PuMP, and that more than 25 years of testing has proven every step earns its place.

From the deep and extensive experience of our global team of licensed PuMP Partners and PuMP Contractors, we know there are a few parts of PuMP that people most often overthink and underthink. We share them here for you to ponder, identify with, and, hopefully, use to tweak your PuMP implementation in the future!

10 Things We Often Overthink in PuMP

  1. Spending too much effort discussing and worrying about getting an accurate score in the PuMP Diagnostic. It’s more about reaching enough of a shared understanding of what comprises proper performance measurement practice, so that when you move on to Step 2 in PuMP, your team is with you, and open to the learning.
  2. Labouring over perfect de-weaseling of weasel words, and words that “might” be weasel words, and individual weasel words when it’s the phrase that matters. Rather, reach the point where you’re 80% confident you’ve found words whose meaning you all share, and move on.
  3. Trying to ‘perfect’ a Results Map, like the wording of results and whether it’s blue or orange. It’s faster to build the Results Map with good enough language, which will help you find the clear cause-effect links, which will help you decide which colour or level any result logically sits in.
  4. Going around in circles not knowing exactly where to start your Results Map. Start. Then allow yourself to edit. Iterative editing is faster and more productive than a perfect starting point.
  5. Including any and every goal (and business-as-usual) in the Results Map. The Results Map is a tool for prioritising; it’s a tool for strategy execution. Build a second Results Map for everything else, if you must, and call that second one a Business Model Map. But separate strategy from business-as-usual.
  6. Worrying too much about whether leaders will understand the Results Map. Give them some credit. Thinking is part of their job, so share it and invite them to explore it with you. They’ll see the value.
  7. Trying to respond to or action every piece of feedback in the Measure Gallery. What you owe your visitors is that you move forward with the strongest themes you draw from their feedback, not that you accept every individual piece of feedback.
  8. Overdoing the “where it fits” section of the Measure Definition. Your Results Map tells you where it fits – just read it, and copy and paste the measures of the related results. If you don’t have any yet, come back and edit your Measure Definition after your Results Map has progressed some more.
  9. Procrastinating out of fear that leaders will never understand XmR charts. They will. They are best served, though, by seeing a KPI they’re familiar with placed side by side with an XmR chart version of it. Let them ask questions, and let those questions guide how you explain it to them.
  10. Focusing too much on automating the XmR charts and signal recognition and recalculation. Data visualisation software hasn’t kept up with XmR charts, and what your organisation already uses might not either. Tableau and Power BI have XmR charts, but honestly, automating them is a lower priority than engaging people to use and value them, and getting a few runs on the board with how they help improve performance (close performance gaps).
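If automation is a lower priority, it helps to know that the XmR chart arithmetic is simple enough to do in a spreadsheet or a few lines of code while people are still learning to value the charts. A minimal sketch of the standard XmR calculation (the 2.66 and 3.268 constants are the usual XmR chart constants; the function name and the example KPI series are made up for illustration, not part of PuMP’s materials):

```python
def xmr_limits(values):
    """Compute the natural process limits for an XmR (individuals
    and moving range) chart, using the standard XmR constants."""
    centre = sum(values) / len(values)
    # Moving ranges: absolute difference between consecutive values.
    moving_ranges = [abs(b - a) for a, b in zip(values, values[1:])]
    avg_mr = sum(moving_ranges) / len(moving_ranges)
    return {
        "centre": centre,                  # central line of the X chart
        "lower": centre - 2.66 * avg_mr,   # lower natural process limit
        "upper": centre + 2.66 * avg_mr,   # upper natural process limit
        "mr_upper": 3.268 * avg_mr,        # upper limit for the mR chart
    }

# Hypothetical example: ten monthly values of a familiar KPI,
# e.g. on-time delivery percentage.
kpi = [92, 94, 91, 95, 93, 90, 96, 94, 92, 95]
limits = xmr_limits(kpi)
```

With the centre and limits computed, plotting the KPI series against them in whatever charting tool the organisation already has gives leaders the side-by-side view described above, without waiting for full automation of signal recognition and recalculation.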

10 Things We Often Underthink in PuMP

  1. How to bring people with you throughout the PuMP process. This means not relying solely on a Measures Gallery, but engaging people in the work throughout. This is especially true with senior leaders early on: helping them understand the impact of the way they phrase things, and the lack of clarity in the strategy, so it doesn’t come as a shock later and feel like it’s being pulled apart.
  2. Drawing out the most important intended results from the Measurability Tests. People often treat this as a rewording exercise rather than stopping to think about what we’re actually trying to cause or achieve. Particularly when the original goal is phrased as an action, it’s easy to focus on rewording it as a result statement, rather than on what that action is meant to cause.
  3. The should/can/will test in the Measurability Tests. This is quite important, yet teams sometimes say everything is important without putting much thought into the “will” part especially. “Will” means you have committed enough time and resources to pursue or improve the result. It’s a non-negotiable addition to “should” and “can”.
  4. Strength tests in Measure Design. Many people don’t put enough conscious thought into whether a measure would really provide the evidence they need. We see this when there are far too many 6s and 7s, and too little challenge, in what gets treated as a quick scoring exercise without deep enough thinking. It’s an opportunity to test everyone’s beliefs and priorities.
  5. Writing potential measures as true quantitative measures in Measure Design. It can feel like you’re just jotting ideas down to quickly shortlist from the potential measures. But the discipline of writing each one using the proper recipe for writing a quantitative measure makes sure everyone understands clearly what the options mean, and that you shortlist the best measures.
  6. Choosing a statistic for the potential measures in Measure Design. Many people default to making every measure a count. More thought is useful here, to consider how the statistic adds context and meaning when using the measure to manage and accomplish a result.
  7. Fleshing out the Measure Definitions fully enough to be actionable. Sometimes teams don’t think this through thoroughly enough, and realise later that not everything was documented in the necessary detail. The definition has to be clear enough that someone outside the Measures Team could accurately pull the measure together into a spreadsheet, or have the measure values ready for the dashboard.
  8. Setting the scope of each performance measure in the Measure Definition. Some people don’t know how to think about scope, or how it can tune the measure to the underlying behaviour we’re really seeking from using it.
  9. Identifying change initiatives for Reaching Performance Targets. It seems many people lack the necessary skills for process design and analysis, which are vital for truly choosing, planning and implementing change initiatives that will close performance gaps.
  10. Integrating PuMP into existing performance improvement systems or processes. Seeing PuMP as a stand-alone process is a major contributor to its slow uptake in any organisation. PuMP should be mapped into your strategic performance management process, along with your other frameworks for strategic planning, project and program management, performance improvement and change management.

Any others, from your experience?

Share in the comments your own experience of the overthinking and underthinking traps Measures Teams can easily fall into.