For the assignment outlined below, submit your responses to the numbered questions as a Word file in Brightspace.
Assignment 7: Performance Measures
This week’s Brightspace module includes the following report:
Rivenbark, William C., David N. Ammons, and Dale J. Roenigk. 2005. North Carolina Local Government Performance Measurement Project: Benchmarking for Results. Chapel Hill, NC: University of North Carolina School of Government.
Below are questions about six cases that are summarized in an addendum at the end of this report. You reviewed some of these cases in assessing cost analysis in Assignment 6. Our focus here is on using performance measures, including those linked to cost concepts and tools. The questions are intended to help review the four types of performance measures outlined in the reading by David N. Ammons on the basics of performance measurement: (1) output (workload) measures, (2) efficiency measures, (3) outcome (effectiveness) measures, and (4) productivity measures.
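To make the four types concrete, here is a small illustrative sketch. All of the figures below are invented for this example; they are not taken from any of the cases in the report.

```python
# Hypothetical refuse-collection figures (invented for illustration only;
# they do not come from any case in the report).
tons_collected = 12_000     # output (workload) measure: how much work was done
total_cost = 900_000        # total program cost in dollars (an input)
on_schedule_rate = 0.97     # outcome (effectiveness) measure: share of routes
                            # collected on schedule (service quality)
labor_hours = 20_000        # another input

# Efficiency measure: inputs consumed per unit of output.
cost_per_ton = total_cost / tons_collected            # $75.00 per ton

# Productivity measure: relates output adjusted for quality to inputs,
# which is one reason such measures are relatively rare in practice.
effective_tons_per_hour = (tons_collected * on_schedule_rate) / labor_hours

print(f"Output: {tons_collected} tons collected")
print(f"Efficiency: ${cost_per_ton:.2f} per ton")
print(f"Outcome: {on_schedule_rate:.0%} of routes collected on schedule")
print(f"Productivity: {effective_tons_per_hour:.3f} effective tons per labor-hour")
```

The distinction to notice is that the output measure alone says nothing about cost or quality; the efficiency and productivity measures only become possible once inputs are measured alongside outputs.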
For each case below, read the case and then answer the case question(s). Report only what is used in the case; not every case will use all four types of measures. The table for each question provides space for four measures, but you may have fewer or more than four, depending on the case. Remember that measures of productivity are relatively rare. In your response, identify the case name and include the question number. Submit your assignment to Brightspace. Format note: the tables below do not copy well to a Word document. You may create your own, but please report your responses in table format for each case in the Word document.
Residential Refuse Collection: City of Winston-Salem, pages 1-2
In this case, the City decided to end an arrangement to contract out a portion of its refuse collection. Several measures were used in making this decision. List each measure and identify which of the four types of performance measures it represents: (1) output (workload) measure, (2) efficiency measure, (3) outcome (effectiveness) measure, or (4) productivity measure.
Measure:
Type:
Residential Refuse Collection: City of Concord, pages 3-4
In this case, both an efficiency measure and an outcome measure led to a re-examination of contracting out for refuse collection.
What was the efficiency measure:
What was the outcome measure:
Household Recycling: City of Wilmington, pages 7-8
In this case, the City decided to contract its recycling service to a private contractor. Identify what performance measures were used in this analysis and identify which of the four general categories of performance measure they represent.
Measure:
Type:
Police Services: City of Greensboro, pages 9-10
In this case, the City wanted to assess the adequacy of staffing levels in the Greensboro police department to address crime levels in city neighborhoods. Identify what performance measures were used in this analysis and identify which of the four general categories of performance measure they represent.
Measure:
Type:
Emergency Communications: City of Asheville, pages 11-12
In this case, the City of Asheville wanted to assess the productivity of staff in its emergency communications function. Identify what performance measures were used in this analysis and identify what type of the four general categories of performance measure they represent. Based on these performance measures, what important factors affecting performance did the audit of operations identify?
Measure:
Type:
Factors affecting performance:
Asphalt Maintenance & Repair: City of Hickory, pages 13-14
What performance measure(s) did the City of Hickory use in assessing pothole repair? Also, identify what type of the four categories of performance measure they represent.
Measure:
Type:
North Carolina Local Government Performance Measurement Project
Benchmarking for Results
A. John Vogt
December 2005
Cosponsored by the cities of Asheville, Carrboro, Cary, Charlotte, Concord, Durham, Gastonia, Greensboro, Hickory, High Point, Matthews, Raleigh, Salisbury, Wilmington, Wilson, and Winston-Salem; the Institute of Government; and the North Carolina Local Government Budget Association
Copyright 2005 School of Government
School of Government CB# 3330 Knapp Building, The University of North Carolina at Chapel Hill, Chapel Hill, NC 27599-3330
Preparation and printing of this report were made possible by funding from the participating cities.
This report is published by the School of Government. Public agencies and officials may photocopy portions of the report if it is copied solely for distribution within a public agency or to officials or employees thereof and if copies are not sold or used for commercial purposes.
Printed in the United States of America
Preface
The North Carolina Benchmarking Project is a regional benchmarking initiative that compares performance statistics and cost data across the following ten service areas: residential refuse collection, household recycling, yard waste/leaf collection, police services, emergency communications, asphalt maintenance and repair, fire services, building inspections, fleet maintenance, and human resources. Participating municipalities endure the challenges of data collection and data cleaning because they believe that performance measurement and benchmarking are catalysts to service improvement.
The steering committee of the benchmarking project decided to forgo the addition of a new service area for study in 2005, opting instead to have project staff members at the School of Government gather information from each participating municipality on how the benchmarking data are actually being used for improving service efficiency and effectiveness. This report, along with the supporting case studies on data use, contains the program and organizational findings that were obtained from the review of the benchmarking experiences from the fifteen municipalities that participated in the Final Report on City Services for FY 2003–2004. They included Asheville, Cary, Charlotte, Concord, Durham, Gastonia, Greensboro, Hickory, High Point, Matthews, Raleigh, Salisbury, Wilmington, Wilson, and Winston-Salem.
The benchmarking project is a collaborative effort between the School of Government and the participating municipalities. Special thanks are owed to the members of the steering committee who were instrumental in collecting the information needed for this report and to the following individuals who wrote the supporting case studies for their respective municipalities: Ben Rowe, deputy budget and evaluation director of Winston-Salem; Kathy Mann, budget analyst of Wilmington; Randy J. Harrington, budget and performance manager of Concord; Tony McDowell, interim budget manager of Asheville; Stephen Carter, budget and management analyst of Greensboro; and Karen Hurley, budget analyst of Hickory.
This document contains evidence that the participating municipalities in the benchmarking project have used the comparative statistics at the program level to support a variety of service delivery decisions. It has been designed so that additional case studies on data use can be added as municipalities continue to use the benchmarking information for service improvement. The steering committee members understand that the long-term success of the benchmarking project hinges on documenting the success stories of using comparative information.
William C. Rivenbark, David N. Ammons, and Dale J. Roenigk
Introduction
At one time performance measurement was thought to be innovative, but today it is accepted as a professional norm in local government for demonstrating operational accountability of service delivery and for creating an environment for productivity improvement. Although adoption of performance measurement systems is common, full implementation of these systems is rare.1 Adoption refers to the creation of measures for tracking service performance. Implementation, on the other hand, represents the actual use of these measures for converting information into action.2 This distinction is critical. Given the expense of adoption, an adequate return on investment hinges on effective implementation.
When an organization engages in benchmarking (the comparison of its performance against relevant performance standards or the performance of other organizations), the investment is greater and so is the desire for an adequate return.3 Benchmarking consumes more organizational resources than internal performance measurement, given the difficulty of ensuring data accuracy, reliability, and comparability across multiple organizations. Operational improvement based on lessons learned from benchmarking is where an organization hopes to gain its return on investment.
The North Carolina Benchmarking Project is a regional benchmarking initiative that compares performance statistics and cost data for participating municipalities across the following ten service areas.4
• Residential refuse collection
• Household recycling
• Yard waste/leaf collection
• Police services
• Emergency communications
• Asphalt maintenance and repair
• Fire services
• Building inspections
• Fleet maintenance
• Human resources
The benchmarking project is managed by the School of Government under the guidance of a steering committee composed of representatives from each participating municipality.
This report contains the results from research conducted during spring and summer 2005 on how municipalities are using the comparative performance statistics for converting information into action. It begins with a brief overview of the benchmarking project and the methodology used to gather the information from the fifteen municipalities that participated in the benchmarking project during FY 2004–2005. The findings are then presented on how the comparative statistics are being used at the program and organizational levels.
Overview of the North Carolina Benchmarking Project
The impetus for the benchmarking project came from two groups: city managers and budget officials. The North Carolina League of Municipalities held a meeting in 1994 with city managers from the larger municipalities in the state to discuss the topics of privatization, competition, performance measurement, and benchmarking.5 Subsequently, local budget officials who were affiliated with the North Carolina Local Government Budget Association held a meeting in 1995 to discuss the possibility of creating a benchmarking project. They had a desire to move beyond internal performance comparisons over time. They wanted to place service performance from their own organizations within the context of other organizations, with the belief that even outstanding performers can learn from the practices of others. The pilot phase of the
benchmarking project started in fall 1995 after the Institute of Government hired a project coordinator.
The following three goals guide the benchmarking project: (1) develop and expand the use of performance measurement in local government, (2) produce reliable performance and cost data for comparison, and (3) facilitate the use of performance and cost data for service improvement. Nine municipal performance and cost data reports had been produced by 2005 in response to the second goal. However, the participating municipalities do not endure the challenges of data collection and data cleaning simply to produce a report. They participate in the benchmarking project to enhance their own internal performance measurement systems and to use the comparative performance and cost data for service improvement.
Methodology
The findings contained in this report were derived from a review of the benchmarking experiences of the fifteen municipalities that participated in the benchmarking project in 2005, which included Asheville, Cary, Charlotte, Concord, Durham, Gastonia, Greensboro, Hickory, High Point, Matthews, Raleigh, Salisbury, Wilmington, Wilson, and Winston-Salem. Municipal representatives were queried by an e-mail survey, followed by in-person interviews and subsequent telephone and e-mail contact in 2005.
Program Findings
This section focuses on how the benchmarking data are being used to improve the efficiency and effectiveness of service delivery. Survey questions asked for specific examples of how the benchmarking data have supported operational change within the service areas under study. While some of the following examples contain specific outcomes, others are more recent initiatives with promising but unconfirmed results.
Residential Refuse Collection
Benchmarking data have been used most frequently by participating municipalities in the service area of residential refuse collection. The comparative statistics were used in Hickory, for example, to justify automated collection with one-person crews. The city lowered its cost per ton collected from $98 in FY 1995–1996 to $69 in FY 2003–2004, a savings of $29 per ton collected.
Concord used the benchmarking data for negotiating more favorable terms with its private hauler. The city was paying $7.07 per collection point when its refuse collection contract expired. The private hauler’s proposal for a new contract called for payment of $7.76 per collection point. The city countered using data from the benchmarking project that showed Concord’s service costs were relatively high and the contractor’s service quality was relatively low when compared to other municipalities. The parties agreed to continue the service at a rate of $7.07 per collection point, subject to consumer price index and fuel price adjustments.
One of the major success stories of the benchmarking project is found in this service area. Winston-Salem used a private hauler to provide residential refuse service to approximately 6,500 households. After the benchmarking data revealed underutilized capacity within its own operations, the contract with the private hauler was discontinued and service by city crews was extended into the affected neighborhoods without adding staff or equipment. This move improved efficiency and produced annual savings of approximately $395,000.6
Household Recycling
Comparative statistics for household recycling helped municipal officials monitor the effects of service expansion in Asheville. Program changes yielded an increase in the waste diversion rate from 14 percent in FY 1998–1999 to 24 percent in FY 2003–2004. The principal impact of program success is the extended life of the Buncombe County landfill.
Benchmarking data assisted Wilmington officials with a decision to privatize the household recycling program, producing an annual savings of approximately $75,000.7 This change in service delivery decreased the cost per ton collected from $308 in FY 1994–1995 to $234 in FY 2000–2001. Further expansion of the program since 2001 decreased the cost per ton collected to $128 by FY 2003–2004.
Benchmarking data also have been used to assess the possibility of altering truck and crew configurations in Concord and to evaluate the cost per collection point for contract negotiations in Hickory.
Yard Waste/Leaf Collection
Comparative statistics for yard waste/leaf collection supported the use of seasonal labor in Hickory and justified a recommendation for a leaf machine in High Point. The program change in Hickory helped reduce the cost per collection point from $51 in FY 2001–2002 to $30 in FY 2003–2004. Analysis in High Point showed that the new equipment would reduce the cost per ton collected.
Police Services
Greensboro used the benchmarking results in a management study of police patrol staffing.8 The study found that Greensboro was below average in the number of sworn officers per 1,000 residents and had a slower than average response time for high priority calls when compared to the cities of Durham, Raleigh, and Winston-Salem. A workload analysis indicated a patrol availability factor of only 6.6 percent, signaling little ability to engage in proactive patrol. In response to the management study, Greensboro approved an additional thirty-two sworn officers for its police department.
Other examples of data use in police services included the analysis of a proposal to add a patrol beat in Cary, gauging the efforts of community policing in Concord, and investing in a telephone response unit to reduce calls per officer in Wilmington.
Emergency Communications
Asheville eliminated three dispatcher positions in emergency communications based on an analysis of the benchmarking results, which allowed the reallocation of approximately $105,000 to other programs. The benchmarking project’s comparative statistics also have been used to identify the need for an additional supervisory position in emergency communications in Cary and to make changes for an ISO rating improvement in Concord.
Asphalt Maintenance and Repair
Among the high-profile budgetary decisions confronting most municipal officials annually is the amount of resources that should be appropriated to asphalt maintenance and repair. Typically, administrators urge adherence to the adopted resurfacing cycle policy, which usually calls for the municipality to resurface a specified number of lane miles on an annual basis. Depending on revenue projections, however, this capital investment is sometimes deferred in favor of other programs. Several jurisdictions have solidified their ongoing commitment to a systematic street resurfacing program with the support of the benchmarking results.
Two municipalities have used the comparative statistics to analyze the cost-effectiveness of using in-house crews versus contract crews for resurfacing projects. Asheville made the decision to use contract crews for additional projects, while Concord increased in-house capacity.
Hickory used the comparative statistics to justify a new automated patch truck for pothole repair. The city reported 85 percent of potholes repaired within twenty-four hours in FY 1997–1998, which was well below the average of 96 percent. After the capital investment, the city reported 97 percent of potholes repaired within twenty-four hours in FY 2001–2002, which was slightly above the average of 95 percent.
Fire Services
Cary, Charlotte, and Concord have used the comparative statistics to analyze the workload of fire inspections. Cary established a staffing plan for determining when to add new fire inspectors in response to its workload analysis. High Point, on the other hand, used the comparative statistics to analyze and approve the request for twelve new firefighters in response to a merger with two volunteer stations.
The most notable use of comparative fire service statistics occurred in Hickory. The city’s high cost per response suggested an underutilization of personnel and equipment and prompted the decision to begin responding to emergency medical calls. This increase in workload allowed the fire department to spread its fixed costs across more calls for service, which substantially lowered its cost per response from $3,246 in FY 1998–1999 to $1,832 in FY 2003–2004. The workload change has resulted in minimal impact on the effectiveness measure of average response time to high priority calls, which increased slightly from 4.0 minutes to 4.4 minutes during this same time period.
Fleet Maintenance
The benchmarking data prompted a staffing analysis in Concord, which resulted in the reorganization of fleet maintenance, the reduction of a full-time equivalent position, and the establishment of productivity goals. Asheville and Hickory also have used the benchmarking results to establish productivity goals regarding billable hours, parts turnover, and percentage of rolling stock available per day.
Organizational Findings
This section focuses on how the participating municipalities are using benchmarking as a management tool. These findings represent patterns of use that extend across programs. They include the importance of management integration, the impact of higher order measures, the use of efficiency measures, the selection of benchmarking partners, the refinement of measures, and the overall organizational impact from participating in the benchmarking project.
Management Integration
Only a few of the participating municipalities in 2005 had developed a systematic process for integrating the comparative statistics into the management functions of planning, budgeting, and evaluation. Those municipalities that had advanced the furthest in integrating benchmark statistics into fundamental management systems tended also to be the ones that had actually used the benchmarking information to achieve the most significant operational improvements at the program level.
• Wilmington has integrated the benchmarking data into its strategic planning process. Several of its organizational objectives are tied to statistics from the benchmarking project, including Part I crimes and response time to high priority calls.
• Concord uses the benchmarking results in support of performance budgeting. Program managers use the comparative statistics to help establish their service objectives for the coming fiscal year. Work plans are then formulated that include strategies for closing performance gaps.
• Winston-Salem uses the benchmarking results in support of zero-based budgeting. The budget department selects several programs each year to analyze from a zero-based perspective, in which all resources are justified as part of the annual budget process. The performance and cost data from the benchmarking project are used for analyzing the program’s resources from a comparative perspective. When a program is not part of the benchmarking project, budget staff members must independently collect comparative data.
• Hickory uses the benchmarking results in conjunction with its total quality management program. Once a process is selected for study, an improvement team is
formed to analyze the components of the process, to identify strategies for process improvement, and to monitor the results. The comparative statistics are used with the analysis when the selected process is part of a service area in the benchmarking project.
Higher Order Measures
The benchmarking project reports on three types of performance measures for each service area under study: workload measures, efficiency measures, and effectiveness measures. While workload measures are important for providing information on service demand, they simply report on how much. Efficiency and effectiveness measures are considered higher order measures as they report on the relationship between inputs and outputs and on the quality or impact of service, respectively. Municipalities that relied less on workload measures and more on the higher order measures of efficiency and effectiveness were more likely to use the comparative statistics for making management decisions at the program level.
There is a broader implication from this finding. Research has shown that workload measures are more commonly tracked in local government than efficiency and effectiveness measures.9 Research also has shown that public organizations have struggled with moving from adoption to implementation of performance measurement.10 This review suggests that over-reliance on workload measures may restrict the use of higher order measures and limit the use of performance data for making management decisions.
Service Efficiency
A majority of the participating municipalities reported heavy reliance on efficiency measures. One reason for this is the emphasis placed on cost accounting by the project. The total cost of each service area is calculated as a step toward determining resources consumed per service output.11 Several of the respondents also reported that the benchmarking project afforded them the ability to calculate accurate and reliable efficiency measures for the first time and that elected officials tend to focus on service efficiency because of their concern about increasing tax rates.
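The cost-accounting step described above can be sketched in a few lines. The cost categories follow the report's note on total cost (direct costs as personal services plus operating expenditures, and indirect costs as overhead); the function name and all dollar figures are invented for illustration.

```python
# Sketch of the project's cost-accounting approach: compute a service area's
# total cost, then divide by service output to get an efficiency measure.
# Cost categories follow the report's note 11; the figures are hypothetical.

def cost_per_unit(personal_services, operating, overhead, output_units):
    """Total cost divided by output: resources consumed per service output."""
    direct = personal_services + operating   # direct costs
    total = direct + overhead                # add indirect costs (overhead)
    return total / output_units

# Hypothetical refuse-collection program: $600k personnel, $250k operating,
# $50k overhead, 12,000 tons collected.
print(cost_per_unit(600_000, 250_000, 50_000, 12_000))  # 75.0 dollars per ton
```

The point of computing a full total cost first is comparability: a cost-per-unit figure that omits overhead would understate a program's true resource consumption and make cross-jurisdiction comparisons misleading.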
Benchmarking Partners
Most of the participating municipalities reported that they primarily compare their performance statistics with the statistics of specific benchmarking partners. The primary reason for this practice was the perception that municipalities of similar size are more comparable. Several respondents also reported that elected officials were only interested in municipalities of similar size.
There are two questions that arise from the practice of selecting benchmarking partners based on population. First, is this a sound practice from the perspective of seeking strategies for service improvement? Concord, which reported systematic comparison across all jurisdictions, provided the most examples of data use at the program level. Second, do economies of scale drive performance in local government? One source suggested that economies of scale are more applicable to capital-intensive services like water and sewer as opposed to labor-intensive services like police and fire protection.12 Therefore, service efficiency may not necessarily improve as the size of a labor-intensive program increases.
Refinement of Measures
The benchmarking project has been instrumental in helping participating municipalities improve the quality of their performance measures. A service review, for example, revealed that municipalities were having trouble with calculating accurate and reliable household participation rates. As a result, a new methodology was adopted to establish the effectiveness measure of set-out rate. Several municipalities reported that the new measure has improved their ability to accurately track the success of their recycling programs.
Organizational Impact
The municipalities were asked to identify the overall organizational impact from participating in the benchmarking project. Listed below are the most notable responses:
• Reporting on the performance of service delivery within the context of comparable performance statistics enhances program accountability.
• Benchmarking has helped change the organizational culture by increasing the importance of performance measurement. Program managers are more concerned with data accuracy and reliability and are more open to data analysis.
• Benchmarking has given program managers a broader perspective on how services are provided. They have become more open to the idea that reviewing processes in other organizations can help them improve their own service performance.
• Budget staff members have become more knowledgeable about programs under study, decreasing the communication barriers between staff members and program managers.
• Reporting on comparative statistics has resulted in other management initiatives. For example, citizen surveys have been conducted to supplement the performance and cost data, which has resulted in allocating more resources to priority service areas.
• Benchmarking has assisted organizations with making progress toward performance budgeting, where the performance and cost data have been used in contract negotiations with external vendors, in the reorganization of selected programs, and in the allocation of additional or fewer resources based on need assessments.
One of the best anecdotal observations regarding the value of project participation came from a seasoned budget director who noted that her fingers were crossed every time she received an information request from the manager regarding a program. Her constant hope was that the program would be one of the ten programs currently under study in the benchmarking project, making it possible to give a timely and informative response.
Conclusion
A distinction has been made between the adoption and implementation of performance measurement and benchmarking in local government. This review of the benchmarking experiences of the fifteen municipalities that participated in the benchmarking project in 2005 revealed that the comparative statistics have been used at the program level to support a variety of service delivery decisions.
Prior research has suggested that time is a factor in moving from the collection of measures to actually using them in management decisions.13 Indeed, some of the municipalities that have the most experience in performance measurement and longest participation in the benchmarking project were among the leaders in the use of performance data; but time is no guarantee. Evidence from this review suggests that the systematic integration of comparative statistics into management processes, the use of the higher order measures of efficiency and effectiveness, and the management practice of performance comparison across all jurisdictions also were important factors to the implementation of measures for converting information into action.
References
1. Patria de Lancer Julnes and Marc Holzer. 2001. Promoting the Utilization of Performance Measures in Public Organizations: An Empirical Study of Factors Affecting Adoption and Implementation. Public Administration Review, 61 (6): 693–708.
2. Nico Stehr. 1992. Practical Knowledge: Applying the Social Sciences. Thousand Oaks, CA: Sage Publications.
3. The comparison of performance statistics is one of three approaches to benchmarking in the public sector. See David N. Ammons. 2000. Benchmarking as a Performance Management Tool: Experiences among Municipalities in North Carolina. Journal of Public Budgeting, Accounting & Financial Management, 12 (1): 106–124.
4. For information on the specific definition of each service area, see William C. Rivenbark. 2005. Final Report on City Services for Fiscal Year 2003–2004. Chapel Hill, NC: School of Government.
5. Paula K. Few and A. John Vogt. 1997. Measuring the Performance of Local Governments. Popular Government, 62 (2): 41–54.
6. Ann Jones. 1997. Winston-Salem’s Participation in the North Carolina Performance Measurement Project. Government Finance Review, 13 (4): 35–36.
7. David N. Ammons. 2000. Benchmarking as a Performance Management Tool: Experiences among Municipalities in North Carolina. Journal of Public Budgeting, Accounting & Financial Management, 12 (1): 106–124.
8. Greensboro Budget and Evaluation Department and Greensboro Police Department. 2004. Patrol Staffing Study. Greensboro, NC: City of Greensboro.
9. William C. Rivenbark and Janet M. Kelly. 2003. Management Innovation in Smaller Municipal Government. State and Local Government Review, 35 (3): 196–205.
10. Patria de Lancer Julnes and Marc Holzer. 2001. Promoting the Utilization of Performance Measures in Public Organizations: An Empirical Study of Factors Affecting Adoption and Implementation. Public Administration Review, 61 (6): 693–708.
11. Total cost includes direct costs (personal services and operating expenditures), indirect costs (overhead