[section separator="true"]
[section-item 9]
[row]
[column 12]
[toc-this]
Definition
Process benchmarking involves carrying out a systematic analysis and comparison of organisations’ processes and procedures, with a view to understanding the reasons for variations in performance, and identifying and incorporating [link title="best%20practice" link="%2Faware%2FPA%2FPages%2FConcepts%2FBest-practice.aspx" /]
. Process benchmarking can be either external (comparing one organisation with another) or internal (comparing a number of operational units or local offices within a single organisation).
It can take two forms:
- functional benchmarking involves comparing the performance and processes of organisations carrying out similar activities in the same field, with the aim of identifying good practice.
[toggles]
[toggle title="Examples"]
- Comparing practices in different Commission delegations.
- Comparing how projects for a policy area are managed by member states, and against best practice recommended for project management.
- Comparing how the EU implements a given policy with the best practice recommended by professional and non-governmental associations in the EU, and with legal requirements in other developed countries.
- Comparing treasury management functions in EU institutions.
- Comparing processes between agencies and with the Commission central departments.
[/toggle]
[/toggles]
- generic benchmarking involves comparing the performance and processes of organisations in unrelated industries, with the aim of identifying good practice that can be applied to achieve a particular outcome.
[toggles]
[toggle title="Examples"]
- How EU Institutions can procure a certain service at the lowest price when compared with others.
- How member state authorities carry out controls, for example, their farm inspections in the domain of agriculture.
- How EU Institutions’ practices for recruiting civil servants compare with those of member state and other international organisations.
- How different institutions ensure that payments are made within 14 days of receipt of invoice.
[/toggle]
[/toggles]
Instructions
This page includes only instructions specific to process benchmarking. When carrying out process benchmarking the audit team should follow both the instructions on this page and the instructions applicable to benchmarking in general.
Added value
Process benchmarking:
- Can be very effective when quantitative performance data has already revealed a problem or unexplained variance in performance in an organisation.
- Can add value to audits by identifying scope for improving procedures and reducing costs.
- Can be time-consuming and costly. Cost will depend on the complexity and nature of the process under scrutiny. It may therefore be appropriate only for simple processes or for comparisons in a limited number of geographical locations.
- May be useful in the planning phase to identify good practices to serve as audit criteria.
- May be easier to use in some audit areas than others, for example in the EU institutions or agencies.
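The first point above, that quantitative performance data can reveal an unexplained variance worth investigating, can be illustrated with a minimal sketch. All unit names, figures and the 80 % threshold below are purely hypothetical, chosen only to show how simple payment-time data might flag candidates for process benchmarking:

```python
from statistics import mean

# Days from receipt of invoice to payment, per operational unit.
# Illustrative figures only - not real ECA or Commission data.
payment_days = {
    "Unit A": [5, 8, 12, 9, 7],
    "Unit B": [15, 21, 18, 25, 19],
    "Unit C": [6, 10, 11, 9, 13],
}

TARGET_DAYS = 14  # e.g. a "paid within 14 days" target


def on_time_share(days, target=TARGET_DAYS):
    """Share of invoices paid within the target number of days."""
    return sum(d <= target for d in days) / len(days)


for unit, days in payment_days.items():
    share = on_time_share(days)
    # Hypothetical rule: flag units falling below an 80 % on-time share.
    flag = "candidate for process benchmarking" if share < 0.8 else "ok"
    print(f"{unit}: mean {mean(days):.1f} days, {share:.0%} on time -> {flag}")
```

A screen like this does not explain *why* a unit underperforms; it only points the audit team at the processes whose comparison (the process benchmarking proper) may yield that explanation.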
Identifying what can be process benchmarked
A process may be worth benchmarking if it is:
- critical to the success of the organisation, to the achievement of its aims, to its stakeholders, and to the achievement of value for money.
- a potential problem area, not performing satisfactorily or incurring unnecessary or higher costs.
- costly in terms of resources consumed.
- one in which others may be doing better.
Typical processes which can be benchmarked are:
- project management and implementation
- asset management
- risk management
- procurement
- processes for paying grants and subsidies
- processes for managing key information systems
- human resource management (for example, control over sickness absences, training)
- policy or programme management (for example, in a given geographical area against processes in a different geographical location).
Many of these processes may be mapped in the course of normal financial / compliance / performance audits. Some of this information may also exist in the auditee’s own procedure manuals.
Equivalent comparator
In the ECA audit context, most comparisons are likely to be within different parts of the same organisation, within the EU Institutions or agencies, or between member state organisations and other bodies with similar aims.
It may be easier to compare offices or units within one organisation (for example, the Commission) than to compare with equivalent outside organisations. Outside organisations - both public and private - working in similar sectors or carrying out similar activities do exist, but opportunities for process benchmarking may be limited.
Benchmarking clubs may provide members with comparator data on a confidential basis in return for information about their results, or for a fee.
Resources
[icons-list icon-size="2" separator="line" icon-vertical-alignment="middle" vertical-alignment="middle"]
[icon-list-item title="Checklist%20for%20main%20steps%20in%20benchmarking" description="to%20be%20followed%20by%20the%20audit%20team%20from%20an%20early%20stage%20in%20the%20planning%2C%20through%20the%20audit%20until%20the%20reporting%20phase." link="%2Faware%2FDocuments%2FChecklist-benchmarking.docx" icon="file-word-o" /]
[/icons-list]
Examples
The Commission’s processes for delivery of aid through non-State Actors were compared with two member state organisations considered to have good practices ([link new-window title="SR%204%2F2009" link="https%3A%2F%2Fwww.eca.europa.eu%2FLists%2FECADocuments%2FSR09_04%2FSR09_04_EN.PDF" icon="external-link" /]
on the Commission’s management of non-State Actors’ involvement in EC Development Cooperation).
EU legislative standards for waste water treatment and processes were compared with legal requirements and best practices in non-EU countries with equivalent economic development ([link new-window title="SR%203%2F2009" link="https%3A%2F%2Fwww.eca.europa.eu%2FLists%2FECADocuments%2FSR09_03%2FSR09_03_EN.PDF" icon="external-link" /]
on the effectiveness of Structural Measures spending on waste water treatment).
The establishment of EU programmes was compared with similar programmes in member states, Israel and the USA, as well as with other EU programmes ([link new-window title="SR%202%2F2012" link="https%3A%2F%2Fwww.eca.europa.eu%2FLists%2FECADocuments%2FSR12_02%2FSR12_02_EN.PDF" icon="external-link" /]
on ERDF co-financed financial instruments for small and medium-sized enterprises).
OECD best practices for impact assessments, academic literature and member states practices were used to develop standards for comparison. The Commission’s impact assessment system was compared with four member states and with the USA; the IAs of five Commission departments were also assessed against the standards; external experts and a focus group helped shape both the audit work and the report; and instances of good practices found were cited in the report ([link new-window title="SR%203%2F2010" link="https%3A%2F%2Fwww.eca.europa.eu%2FLists%2FECADocuments%2FSR10_03%2FSR10_03_EN.PDF" icon="external-link" /]
on impact assessments’ support for decision-making).
A comparison of how member states implemented Regulations and two similar programmes where they had a margin of discretion was carried out. The report cites numerous examples of differing practices in member states ([link new-window title="SR%2010%2F2011" link="https%3A%2F%2Fwww.eca.europa.eu%2FLists%2FECADocuments%2FSR11_10%2FSR11_10_EN.PDF" icon="external-link" /]
on the effectiveness of school milk and school fruit programmes).
A comparison was made of member states’ supervisory and control systems for agriculture. The systems reviewed were classified as effective, partially effective or not effective. Until 2009, traffic-light colours were used to summarise the ECA’s assessments in Chapter 1 of the ECA annual report (e.g. [link new-window title="2009" link="https%3A%2F%2Feur-lex.europa.eu%2Flegal-content%2FEN%2FTXT%2FPDF%2F%3Furi%3DCELEX%3A52010TA1109(01)%23page%3D22" icon="external-link" /]
, [link new-window title="2010" link="https%3A%2F%2Feur-lex.europa.eu%2Flegal-content%2FEN%2FTXT%2FPDF%2F%3Furi%3DCELEX%3A52011TA1110(01)%23page%3D18" icon="external-link" /]
).
For research grants, the Commission rules and processes for evaluations, IT and financial controls were compared with those in equivalent agencies in three member states, the USA and Switzerland. The time taken to grant funds under Framework Programme 7 was compared with the performance of agencies in four member states, the USA and Switzerland. ([link new-window title="SR%202%2F2013" link="https%3A%2F%2Fwww.eca.europa.eu%2FLists%2FECADocuments%2FSR13_02%2FSR13_02_EN.PDF" icon="external-link" /]
).
Member state practices for the design, implementation, monitoring and evaluation of programmes were compared against audit criteria developed from legislation, Commission documents and publications, and scientific studies as well as other EU programmes with similar objectives. Best practices were identified in consultation with agri-environment experts, and three member states were identified where such practices existed. ([link new-window title="SR%207%2F2011" link="https%3A%2F%2Fwww.eca.europa.eu%2FLists%2FECADocuments%2FSR11_07%2FSR11_07_EN.PDF" icon="external-link" /]
on the design and management of agri-environment support).
[/toc-this]
[/column]
[/row]
[/section-item]
[section-item 3]
[row]
[column 12]
[toc fixed="true" selectors="h2%2Ch3" class="basic-toc" /]
[/column]
[/row]
[/section-item]
[/section]