The challenge when building Performance Management solutions is not usually the collection or reporting of measures; rather, it is choosing which measures and targets are most appropriate, and building and implementing effective management processes that use those measures to good effect. The five tools discussed here each have the capacity to inform one part of the selection and usage processes. Two of the tools are ones that 2GC has developed specifically for Performance Management work, two are widely used frameworks that 2GC has over the years become expert at using and deploying, and one is an aid to performance reporting that many organisations find indispensable within a working Performance Management system.
The ACME change management framework is central to 2GC’s work on strategy implementation with senior management teams. It builds on solid academic research into how organisations implement strategy, adapted and tailored to be easy to understand and efficient to deploy. Helpfully it is also easy to integrate with modern strategic Balanced Scorecard design methods.
ACME - 2GC’s Strategy Execution Framework
Our ACME strategy execution framework was developed by 2GC building upon original research carried out by Lawrence Hrebiniak (a Professor from Wharton Business School in the USA, whose 1984 book “Strategy Implementation” was a major milestone in the development of thinking about these issues), and Rahman Muralidharan, an academic whose influential paper identified a set of common strategy implementation concepts from an analysis of many strategy implementation frameworks. The four elements - Articulate, Communicate, Monitor, and Engage - are reassuringly straightforward, yet undertaken as a set they provide a powerful framework for managing strategy implementation.
If you plan to implement a strategy, it is helpful to know what it is. Yet research has repeatedly shown that in many organisations people don’t know what the strategy is - and most importantly - don’t know how to relate the strategy to what they do. So good strategy implementation has to begin with getting a clear statement of what you are trying to achieve - and importantly - a statement of how you know you have achieved it. There are many ways that this articulation can be done, but partly mindful of the second element in the ACME framework, 2GC favours the production of a concise, quantified statement of what a strategy is intended to achieve - which turns out to also be something that can be efficiently captured in a Balanced Scorecard “Destination Statement”.
It is not enough to state where you hope to get to: if an organisation is going to implement a strategy, the elements within it also need to know what they are expected to contribute to the achievement of that strategy. Without this clarity about what contribution is required, it is highly unlikely that the contribution will be made. So the second element of the ACME framework is Communicate - the process of getting each unit within the organisation ‘on board’ with the strategy and clear about the contribution you expect them to make. In small organisations this contribution can be clarified through the creation of a strategy map / strategic linkage model - in larger organisations through the development of aligned / cascaded Balanced Scorecards across several units. In both cases, these are elements that naturally appear as part of a modern strategic Balanced Scorecard deployment process.
The third element of the framework is Monitor - collecting information about what a team or organisation has been doing to contribute to the implementation of a strategy, and evaluating whether that contribution is having the required impact on the wider strategy implementation work. Without this information strategy implementation becomes a “Fire and Forget” exercise. Collecting and reporting this kind of information is what the Balanced Scorecard was invented to do, and so it is perhaps unsurprising that we think the most effective way to deliver this element of the ACME framework is through the deployment of modern Balanced Scorecard designs.
The work of Hrebiniak and Muralidharan that led to the development of the ACME framework emphasised above everything else the need for strategy implementation to be actively managed. They observed that without regular management intervention to adjust a strategy in light of learning and experience (or external events), and without the organisational awareness that ‘someone is watching’, it was all too easy for a strategy to become a hollow activity: something that is undertaken but delivers little that is relevant or valuable. The Engage element of the ACME framework is all about creating and using simple, robust, effective management processes that encourage the organisation to regularly review the progress of a strategy implementation programme, and use these reviews to drive the necessary changes to keep the programme on track towards the Destination.
The Balanced Scorecard was first brought to public attention through an article by Robert Kaplan and David Norton that was published in the Harvard Business Review in 1992. In the article, the authors considered how to help organisations be more effective at implementing their strategic plans. They noted the need for managers to have regular and focused feedback on two kinds of data: how well the tasks required to implement the chosen strategy were being executed; and whether there was evidence that the strategy itself was going to deliver the long-term outcomes required.
A feature of the Balanced Scorecard has been how the tool has evolved and adapted over time - both with improvements that make it easier to use, and extensions to its design that help it operate across a broader range of situations. Back in the early 2000s 2GC pioneered research into understanding these changes, and has remained committed to being at the forefront of research into Balanced Scorecard as a tool.
At its core, the Balanced Scorecard is simply a table containing a few numbers that record specific features of the organisation. These ‘actual’ numbers are matched to ‘target’ values for each feature. By comparing actual performance against the target values, managers can spot, and intervene to correct, issues before they become problems. The challenge with Balanced Scorecard is not the concept but the content.
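The actual-versus-target table described above can be sketched in a few lines of code. This is a minimal illustration only - the measure names and figures are invented for the example, not drawn from any real scorecard:

```python
# Sketch of the Balanced Scorecard's core idea: a table of measures,
# each pairing an 'actual' value with a 'target' value.
# Measure names and figures below are purely illustrative.
scorecard = {
    "Revenue growth (%)":       {"actual": 4.2,  "target": 6.0},
    "Customer retention (%)":   {"actual": 91.0, "target": 90.0},
    "Staff trained on CRM (%)": {"actual": 55.0, "target": 75.0},
}

def flag_issues(table):
    """Return the measures where actual performance falls short of target."""
    return [name for name, v in table.items() if v["actual"] < v["target"]]

print(flag_issues(scorecard))
# The two shortfall measures are flagged for management attention.
```

The mechanics are trivial; as the text notes, the hard part is deciding which measures and targets belong in the table in the first place.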
Balanced Scorecard: It’s all about the design
The Balanced Scorecard is useful only when the measures and targets chosen are appropriate - and working out what these are is not easy. The method used to choose measures and targets is therefore critical to a Balanced Scorecard's success. Since 1992, huge effort has gone into developing and refining methods to make this design process easier to undertake and more reliable.
1st Generation: 20 Measures and Targets in a Table
The earliest Balanced Scorecards were simply tables containing a concise mix of Financial and Non-Financial measures, each measure having one or more targets associated with it. In the original articles written by Robert Kaplan and David Norton, designers were encouraged to choose a small number of measures and to link these directly to the organisation’s Mission, Vision and Strategy. Choosing which measures to use, though, was hard, because at that point no one had come up with a reliable ‘design process’ - and so the resulting Balanced Scorecards often contained strange and inappropriate measures; partly for this reason, the majority of these early Balanced Scorecards failed. 1st Generation designs are still being created today, but there is no good reason for this: most new work uses one of the later generations of design, which work better.
2nd Generation: Strategy Maps, Objectives and Linkage Models
Soon after the initial Kaplan and Norton papers were published it became clear that badly designed Balanced Scorecards were both common and not very useful. The issue was quickly identified to be poor measure selection, and a design process that was hard for managers to participate in. Rapidly (within a few years) an improved design approach emerged that added a second element to the Balanced Scorecard design - a graphic illustration of the organisation’s strategy, showing a small number of ‘strategic objectives’ connected by simple causal linkages that show the inter-dependencies between them. Building this diagram helped managers to validate that the strategy reflected in the Balanced Scorecard was the right one, and gave strong clues about which measures to include in the actual measure / target table. Hugely better than the 1st Generation, the introduction of this new design approach triggered a new wave of Balanced Scorecard development - and a cohort of Balanced Scorecards that (in the main) actually did some good!
3rd Generation: For Speed, Alignment, Quality
The huge success of 2nd Generation design methods, and the many Balanced Scorecard designs that were created using them, highlighted some deeper issues with the framework. Many 2nd Generation designs were not being used effectively because people found agreeing targets for the measures hard, and it was almost impossible to effectively deploy this kind of Balanced Scorecard in organisations with many units / divisions. The issue in both cases was found to be a lack of consensus within the management team about what the ‘end point’ of the strategy would look like. Without a good understanding of this end-point, units had difficulty working out what they could or should do to contribute to achieving the strategy, and knowing if they had ‘done enough’ of it. The solution was to introduce a third element to the framework - the “Destination Statement” - a concise, agreed, quantified description of the hoped-for effect of implementing the strategy. This extra device made Balanced Scorecard design faster, massively helped deployment in multi-unit organisations, and improved the quality of measure and target selection. Today it remains the gold standard design approach to use.
Results Based Management and Logical Framework Approach are two frameworks widely used in the UN (RBM) and NGO (LogFrame) sectors. Both tie the provision of funds to relatively formal descriptions of the outcomes being sought through the work being funded, along with descriptions of the actions to be carried out (and so funded) and the methods by which progress / results will be measured. This requirement for measurement of the progress / results of a funded project has encouraged the development of sophisticated performance measurement methods in many NGOs - but the role of these tools in obtaining funding has also pushed the frameworks towards being used primarily as mechanisms for “Monitoring and Evaluation” rather than for management.
Logical Framework Approach - History
Logical Framework Approach (LogFrame) was developed for USAID by a consultancy in the late 1960s, and its use became a requirement for the projects USAID funded from 1970. Since then the framework has been adopted as a funding pre-requisite by many other fund-giving organisations, and unsurprisingly the framework has become a familiar internal project management approach in many (fund-receiving) NGO-type organisations.
Logical Framework Approach - Description
At its simplest, LogFrame is a four-by-four matrix. Filling in the intersections within the matrix encourages project teams to consider wider aspects of a project: in particular, to consider how progress or achievement will be recorded. The first of the two dimensions of the matrix has four headings: Goals, Purpose, Outputs, and Activities. The second dimension lists four descriptive aspects: Summary, Indicators, Verifications, and Assumptions. Over time a rich infrastructure of common processes and activity lists has been developed to help project teams complete LogFrame grids, and in NGOs in particular, LogFrame documentation is a standard pre-requisite for any project.
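The grid structure described above can be sketched as a small data structure. This is a minimal illustration under the headings named in the text; the example cell content is invented, not taken from any real project:

```python
# Sketch of a LogFrame grid: four row headings crossed with four
# descriptive aspects, giving sixteen cells for a project team to fill in.
ROWS = ["Goals", "Purpose", "Outputs", "Activities"]
ASPECTS = ["Summary", "Indicators", "Verifications", "Assumptions"]

# Start with an empty grid - every intersection blank.
logframe = {row: {aspect: "" for aspect in ASPECTS} for row in ROWS}

# The team completes each intersection; this cell text is hypothetical.
logframe["Outputs"]["Indicators"] = "Number of training sessions delivered"

# A completed grid has no empty cells; list what remains to be filled in.
incomplete = [(r, a) for r in ROWS for a in ASPECTS if not logframe[r][a]]
print(len(incomplete))  # 15 of the 16 cells are still empty in this example
```

Walking the grid cell by cell is precisely what makes the approach useful: the team cannot declare the plan complete while any intersection - an indicator, a means of verification, an assumption - is still blank.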
Results Based Management - History
Results Based Management (RBM) is an evolution of a standardised budgeting system introduced within UN Agencies in the late 1990s. That system - called Results Based Budgeting - required Agencies to attach information about the outcomes being sought and the methods being used to budget requests, but had no formal requirement for how this information was to be communicated. This led to a multitude of incompatible formats for documenting these ancillary elements, and much confusion. So in the mid-2000s a revised version of RBB was introduced that specified a standard format for this ancillary information, along with a raft of standard management processes to be used alongside these statements. This revision to RBB was called Results Based Management, and it has since become the mandated approach to programme management within UN-linked organisations.
Results Based Management - Description
RBM is quite complex due to the many roles it fulfils within the UN, but at its core is a four-by-four matrix (called the Design and Monitoring Framework in RBM) similar to that found in LogFrame. The main difference is that the labelling of the row headings is slightly changed, with Impact and Outcomes replacing Goals and Purpose. RBM also includes a useful eight-step management process that encapsulates a cycle encompassing the definition, use and revision of the Design and Monitoring Framework. This management process echoes the design of the various strategic management cycles popular in the commercial sector.
Although not an essential requirement, about two-thirds of organisations that undertake formal performance management activities use some kind of software system to collate and report measurement information to managers and other stakeholders. Our annual survey of Balanced Scorecard Usage indicates that about half of these software-using organisations use basic office software, with the remainder using some kind of specialist software package. The specialist packages are expensive, but offer potentially significant advantages, especially in larger organisations with many measures spread across many operating units.
2GC keeps close tabs on many of the organisations that offer specific software solutions that can be used to report performance management data. We publish a list of the major packages that we know about elsewhere on this website. We also maintain close working relationships with the vendors of the more common packages - partly as a result of working with many of them on client projects.
ERIC is a workshop toolkit used by 2GC to support Destination Statement development. It has particular application in organisations that have multiple units contributing toward a common strategic goal - where the challenge facing a management team is to work out how they can best contribute to the overall strategic goals set. The toolkit helps a group consider four factors - Expectations, Resources, Interventions, and Constraints - that will bound any contribution, and so simplify the space within which the contribution itself needs to be defined.
ERIC - Expectations
The first element to be considered is the expectations that the organisational unit is accountable for delivering. This is a straightforward exercise for most teams - it comprises identifying a list of stakeholders, and for each stakeholder articulating what expectation(s) they have, typically described in terms of outcomes to be achieved. This list should include the management group's understanding of any previously articulated contribution required from the group to the wider strategic goal. Once this list has been captured, a second stage is to review each stakeholder expectation in turn and to discuss the necessity (and feasibility) of achieving each.
ERIC - Resources
In similar fashion to the Expectations discussion, this part of the toolkit focuses on identifying the resources available to the organisational unit that could be used to help the unit make its contribution to strategy implementation. Most groups find this list of resources shorter than it needs to be - and so part of this discussion is usually adding to the list the ‘missing’ resources - i.e. those that would be needed by the group for it to be able to make a sufficient contribution to the strategic goals of the wider organisation.
ERIC - Interventions
In similar fashion to the Expectations discussion, this part of the toolkit calls for the management group to discuss and agree upon the managerial interventions open to them: the managerial levers they could use to encourage their unit to achieve its strategic contribution. The management group’s ability to control what happens is closely linked to the range of interventions identified.
ERIC - Constraints
The final dimension considers the practical constraints that will limit the management group’s ability to direct the organisational unit they lead to make the required contribution to the wider strategy. These constraints may be material (e.g. lack of budget) or political (e.g. difficulty getting co-operation from another group in the organisation). The discussion should focus on identifying what the constraints are, whether any mitigation options are available for each, and if so how the mitigation could be obtained.
Once these four dimensions have been clarified, it is relatively straightforward to help the group develop a combined view that describes the expectations it aims to meet in pursuit of making a contribution to the wider strategy, along with a reflection upon the steps the group will need to take to secure the appropriate resources and / or remove binding constraints. With this clarity about what contribution the group can make, and what will need to be done to be able to make it, a more general discussion can be held to describe how making the contribution will affect the organisational unit over time - and so draft the core elements of a unit-based Destination Statement.