FAQs Relating to the Administrative and Support Services Benchmarking Report for FY 2008/09 and FY 2009/10

Page updated: 13 Apr 2011

1.  What does the report provide?

This report provides a new level of management information quality and transparency regarding State sector overhead spending. For the first time, we have published administrative and support service performance information from across 33 State sector agencies with over 83,000 employees including Human Resources, Finance, Information and Communications Technology, Procurement, Property Management, and Corporate and Executive Services - John Whitehead, Statement by the Secretary to the Treasury.

This report shares the results of two benchmarking exercises, each with a different set of participating agencies and different reporting periods. In May 2010, 14 agencies voluntarily participated in a benchmarking exercise for the FY 2008/09 reporting period. And, in December 2010, 33 agencies participated in a benchmarking exercise for FY 2009/10, either voluntarily or based on Cabinet decision [The Treasury, Better Administrative and Support Services Programme: Report on Phase One findings and proposal for Phase Two, Wellington, CAB Min (10) 38/4B refers] - Scope of this report, Introduction.

Results cover six administrative and support service functions. In addition to a chapter summarising overall findings, this report features a chapter specific to each of the following functions: Human Resources, Finance, Information and Communications Technology, Procurement, Property Management, and Corporate and Executive Services. The latter includes but is not limited to Legal Services and Communications - Scope of this report, Introduction.

2.  Why was this report written?

This report responds to Government demands for better, smarter public services for less. The current economic climate is a key driver of the Government's focus on delivering public services more efficiently and effectively and redirecting resources from A&S services to higher priorities where possible - Background, Introduction.

In November 2010, Cabinet directed some larger departments and expected some larger Crown agents to report A&S service cost and quality to the Treasury each year and to release the results publicly to promote transparency and scrutiny - Background, Executive Summary.

3.  What data is the report based on?

The report is based on the results of two benchmarking exercises. The initial exercise measured Administrative and Support service performance in 14 agencies for FY 2008/09, and the second exercise measured Administrative and Support service performance in 33 agencies for FY 2009/10 - Background, Executive Summary.

4.  What is the purpose of the report?

The purpose of this report is to provide managers in agencies with management information that improves transparency and scrutiny and helps identify opportunities for improvement and savings - Background, Executive Summary.

This report provides information on the performance of Administrative and Support services across government. Based on the largest and most comprehensive benchmarking exercise of its kind in the New Zealand State sector, this is the first of a series of annual reports on the cost, efficiency, and effectiveness of Administrative and Support services across agencies. Administrative and Support services support the delivery of services to the public and include Human Resources, Finance, Information and Communications Technology, Procurement, Property Management, and Corporate and Executive Services - Purpose of this report, Introduction.

This report provides transparency across a significant area of expenditure. Agencies spend $1.849 billion, or just under 10 percent of organisational running costs, on Administrative and Support services each year. The publication of a consistent set of objective cross-agency performance results on this significant area of expenditure increases transparency, allowing Ministers, agencies, and the public to assess whether this spending is providing value for money and track performance over time - Purpose of this report, Introduction.

This report does not propose agency-specific solutions for optimising Administrative and Support service delivery. There is significant activity underway in agencies to lower the cost and strengthen the efficiency and effectiveness of Administrative and Support services. This report provides management information to support those initiatives and track performance changes each year. While this report identifies general opportunities, it does not suggest specific operational changes or comment on the progress of initiatives specific to individual agencies - Purpose of this report, Introduction.

5.  Why is benchmarking important?

What has been needed in the public sector for many years is robust and transparent management information so that informed decisions can be made. This report is an important step towards that transparency and puts new scrutiny on areas of public expenditure that were previously hidden - Hon Bill English, Foreword by the Minister of Finance.

Management information, such as that provided in this report, is key to transparency and driving sustainable performance improvements across the State sector. The frequently referenced Peter Drucker quote “If you can't measure it, you can't manage it,” summarises the importance of benchmarking: it provides an evidence base for assessing current performance, setting targets, identifying and quantifying opportunities for improvement, and tracking changes in performance over time - John Whitehead, Statement by the Secretary to the Treasury.

6.  How does management information support agencies' Administrative and Support service performance?

Management information supports a robust, evidence-based discussion regarding the way we currently do business and opportunities for improvement, but it does not allow us to make quick judgements about the relative quality of management in different agencies. Sometimes variations in expenditure and efficiency among agencies are based in their operational differences, and it will be important to use the information in this report constructively and understand each agency's performance within its operational context - John Whitehead, Statement by the Secretary to the Treasury.

7.  What work is already underway for Administrative and Support services in the State sector?

Over the past two years, chief executives and their agencies have contributed to a range of collective processes, such as joint procurement and greater shared services. Those processes are already bearing fruit. For example, the first four joint procurement projects are expected to save around $115 million over the next five years - Hon Bill English, Foreword by the Minister of Finance.

Other examples of work already underway for Administrative and Support services in the State sector include the ICT Common Capability Roadmap, the Government Procurement Reform Programme, the Communications Function Review, the Government Legal Services Review, and the Digital Continuity Action Plan. Individual agencies also have their own programmes of work.

8.  How much did the measured agencies spend on Administrative and Support services?

This exercise shows that measured agencies spend over $1.8 billion each year on Administrative and Support services. Significant variations among agencies in service efficiency and effectiveness indicate opportunities to deliver these services in new ways to save millions each year that could be redirected to other Government priorities - John Whitehead, Statement by the Secretary to the Treasury.

9.  What is Treasury's ongoing involvement with benchmarking of Administrative and Support services?

The Treasury will continue to work in partnership with measured agencies to publish Administrative and Support service performance information each year. The quality of management information will improve in future reports, mainly because successive benchmarking exercises will show valuable trend information and the impact of our improvement efforts over time - John Whitehead, Statement by the Secretary to the Treasury.

10.  Will there be similar reports in future years?

In addition to providing annual information on the cost and quality of Administrative and Support services, future reports will comment on the extent to which agency responses to fiscal constraint result in savings and service quality improvements - Future reports, Executive Summary.

11.  What was the quality of the data like?

Overall, the quality of data submitted by agencies was high. Where there are concerns with data quality, the underlying problems are based in the maturity of measurement methods and are common in the private and public sectors around the world. Procurement costs are difficult to measure, and the report makes no conclusions about Procurement function costs or efficiency due to data quality issues. CES function results should also be used with caution as organisations undertake a wide range of functions without standard definitions. This impairs data consistency and the ability to compare across New Zealand agencies and internationally - Data quality, Executive Summary.

12.  Will the quality of management information improve in future years?

The quality of management information will improve in future reports in part because metric sets and data collection methods will improve based on lessons learnt year to year, and in part because successive years of data will provide valuable trend information - Future reports, Executive Summary.

13.  Are the results comparable between the measured agencies?

While results are broadly comparable, they need to be understood within the context of each agency. While agencies have common features, some have unique functions and cost drivers. For example, large service delivery agencies are expected to have higher ICT costs than policy agencies, especially if they have more expensive ICT requirements such as specialised line-of-business applications or a distributed network. Therefore, agencies' benchmarking results are a guide to relative performance, and conclusions regarding efficiency and effectiveness should be made in light of each agency's operational context - Data quality, Executive Summary.

14.  What has the response from agencies been on the benchmarking exercise?

Agency response to management information to date has been positive, with some agencies participating voluntarily and indicating interest in working together to build more detailed data sets to support efficiency initiatives. The Treasury is sharing data and methods with other governments as Administrative and Support service efficiency is of broad interest, and management information is widely recognised as fundamental to identifying and tracking improvement opportunities - Next steps, Executive Summary.

15.  How was the benchmarking methodology developed?

The New Zealand measurement methodology was adapted from established international benchmarking methodologies. Rather than building a bespoke methodology, the New Zealand agency benchmarking exercise adopted metrics and methods from the UK Audit Agencies and two leading international benchmarking organisations: the American Productivity & Quality Center and The Hackett Group. The metrics and methods used in an initial benchmarking exercise for FY 2008/09 were refined for FY 2009/10 based on feedback and lessons learnt - Benchmarking methods, Introduction.

Benchmark values were selected to be comparable and practical. Median values are the primary values used for comparison as these reflect average performance and mitigate the effect of large outliers. Upper quartile values are used for more ambitious benchmarks. The upper quartile value is the lowest performer in the upper quartile, or the 75th percentile in the dataset - Benchmarking methods, Introduction.
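As a rough illustration of how the two benchmark values described above relate to a set of agency results, the sketch below computes the median and the 75th percentile from a small set of figures. The numbers are invented for illustration and are not taken from the report.

```python
# Sketch only: hypothetical per-agency results for one metric (not report data).
from statistics import median, quantiles

agency_results = [820, 905, 990, 1040, 1100, 1180, 1250, 1400]

# Median: the primary comparison value, reflecting average performance and
# mitigating the effect of large outliers.
median_benchmark = median(agency_results)

# Upper quartile: the more ambitious benchmark, i.e. the 75th percentile.
upper_quartile_benchmark = quantiles(agency_results, n=4)[2]

print(median_benchmark, upper_quartile_benchmark)
```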

16.  How is total office accommodation (m²) per FTE calculated?

Total office accommodation per FTE is the net leasable area of office buildings divided by the average number of FTEs accommodated in those buildings. It is not workstation size.
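A minimal sketch of this calculation, using purely illustrative figures rather than data from the report:

```python
# Sketch only: illustrative inputs, not figures from the report.
def office_space_per_fte(net_leasable_area_m2, average_ftes):
    """Net leasable office area (m2) divided by the average number of FTEs accommodated."""
    return net_leasable_area_m2 / average_ftes

# e.g. 12,000 m2 of office space accommodating an average of 600 FTEs -> 20.0 m2 per FTE
print(office_space_per_fte(12_000, 600))
```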

17.  How were the NZ cohorts identified?

Measured agencies are grouped into three NZ agency cohorts. To support comparisons of agencies with the greatest operational similarities, agencies are grouped using the following criteria: size of operating budget, number of organisational FTEs, agency type by primary function, and distribution of people/service. Using these criteria, measured agencies fell into three groups of equivalent size, each with a profile that shared at least three of the four criteria; the groups are described in Appendix 3 of the report - Benchmarking methods, Introduction.
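Appendix 3 of the report records the actual cohort assignments; the sketch below only illustrates the "shares at least three of the four criteria" rule, using hypothetical criteria bands and cohort profiles.

```python
# Sketch only: hypothetical criteria bands and cohort profiles, not the report's groupings.
CRITERIA = ("operating_budget_band", "fte_band", "agency_type", "distribution")

def shared_criteria(agency, cohort):
    """Count how many of the four grouping criteria the agency shares with a cohort profile."""
    return sum(agency[c] == cohort[c] for c in CRITERIA)

def assign_cohort(agency, cohort_profiles):
    """Return the first cohort whose profile the agency matches on at least three criteria."""
    for name, profile in cohort_profiles.items():
        if shared_criteria(agency, profile) >= 3:
            return name
    return None

cohorts = {
    "Cohort A": {"operating_budget_band": "large", "fte_band": "large",
                 "agency_type": "service delivery", "distribution": "distributed"},
    "Cohort B": {"operating_budget_band": "medium", "fte_band": "medium",
                 "agency_type": "policy", "distribution": "centralised"},
}
agency = {"operating_budget_band": "large", "fte_band": "large",
          "agency_type": "service delivery", "distribution": "centralised"}
print(assign_cohort(agency, cohorts))  # Cohort A (shares three of the four criteria)
```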

18.  How were the metrics decided?

Metrics were selected with measured agencies. Three principles guided metric selection:

  1. Metrics reflect performance - they provide meaningful management information;
  2. Results can be compared - they are comparable across NZ agencies and comparator groups; and
  3. Data is accessible within agencies - the measurement costs are reasonable.

The final selected metrics were those most relevant and measurable in the New Zealand State sector environment. Measured agencies used a consistent underlying taxonomy based on definitions from the UK Audit Agencies, the American Productivity & Quality Center, and The Hackett Group - Benchmarking methods, Introduction.

19.  How were the agencies selected for measurement?

For FY 2008/09 measurement, agencies volunteered to participate in measurement. For FY 2009/10 measurement, Cabinet agreed that departments be required and Crown Agents be expected to make an annual submission of A&S service performance data to the Treasury. This group excluded the following departments and Crown Agents:

  1. Those with fewer than 250 FTE staff - because in small agencies, a small team can be overwhelmed by the measurement effort for little resulting data;
  2. District Health Boards (DHBs) - because they are involved in a specific initiative for A&S service optimisation in DHBs;
  3. Those agencies planned for merger into larger Departments (e.g. Archives New Zealand, National Library of New Zealand, Legal Services Agency, MORST and FORST) - because the data will be of limited use post-merger; and
  4. The two intelligence and security departments - because they have limited external reporting requirements to protect national security.

The table below lists the agencies that provided data for the two A&S service benchmarking exercises (FY 2008/09 and FY 2009/10).

Note that the Ministry for Culture & Heritage, the Ministry of Transport and the State Services Commission all have fewer than 250 FTEs and participated in the FY 2009/10 measurement exercise on a voluntary basis.

Agency Name FY 2008/09 FY 2009/10
Accident Compensation Corporation No Yes
Counties Manukau District Health Board Yes No
Department of Building and Housing No Yes
Department of Conservation No Yes
Department of Corrections No Yes
Department of Internal Affairs Yes Yes
Department of Labour No Yes
Housing New Zealand Corporation Yes Yes
Inland Revenue Department Yes Yes
Land Information New Zealand No Yes
Ministry for the Environment No Yes
Ministry of Agriculture and Forestry Yes Yes
Ministry for Culture & Heritage No Yes
Ministry of Economic Development Yes Yes
Ministry of Education Yes Yes
Ministry of Fisheries No Yes
Ministry of Foreign Affairs and Trade No Yes
Ministry of Health No Yes
Ministry of Justice No Yes
Ministry of Social Development Yes Yes
Ministry of Transport No Yes
Nelson-Marlborough District Health Board Yes No
New Zealand Customs Service No Yes
New Zealand Defence Force Yes Yes
New Zealand Fire Service Commission No Yes
New Zealand Police Yes Yes
New Zealand Qualifications Authority No Yes
New Zealand Tourism Board No Yes
New Zealand Trade and Enterprise No Yes
New Zealand Transport Agency Yes Yes
State Services Commission No Yes
Statistics New Zealand No Yes
Te Puni Kokiri (Ministry of Maori Development) No Yes
Tertiary Education Commission No Yes
The Treasury Yes Yes
Whanganui District Health Board Yes No