1997 April Report of the Auditor General of Canada
Chapter 5—Reporting Performance in the Expenditure Management System
Responsible Auditor: John Mayne
5.2 The federal government is in the process of revamping its Expenditure Management System, the process by which it plans, budgets and seeks parliamentary approval of future expenditures. The changes create an incentive for departments to develop better information that helps managers provide more effective programs to Canadians and inform Parliament in a more timely way about the results of their activities. Some Canadian provinces and governments abroad have already made important strides toward providing their legislatures and citizens with better information about the performance of government programs.
5.3 Progress has been made by several departments, and we found instances in which performance information reported to Parliament addressed many key aspects of performance reporting. In particular, we found improvement in the information's orientation to results. But to fully realize the potential, departments need to describe expected performance more clearly and concretely - that is, in more measurable terms. The account they give of performance needs to focus more on the benefits for Canadians. This will take time.
5.4 Further progress requires strong leadership by the Treasury Board Secretariat and senior departmental managers. The Treasury Board Secretariat has implemented positive innovations in a short period of time, but needs to improve the consistency of the assistance it provides to departments as well as its efforts to document and communicate good practices in measuring and reporting performance.
5.5 The role of parliamentary standing committees is vital to continued progress. If committees ask for information on the results of government activities and visibly use the information in their deliberations, departments will have a powerful incentive to collect and report the information.
A Need for Information on How Well Programs Are Doing
Information on performance is required for good management and effective governance

5.6 Knowing how well programs and services are doing is increasingly essential to managing today's public sector, as governments here and abroad face resource reductions and a citizenry that continues to want good value from its government. Credible information on the results being achieved is needed for three reasons:
- To manage tax dollars better. Government expenditures are being reduced, but many Canadians still feel they are overtaxed. It is therefore more important than ever that tax dollars be spent efficiently and effectively. Ministers receive a range of information to consider in making decisions on future spending and priorities. Information on what current spending is accomplishing needs to be a key element of advice, particularly when resources for new initiatives must be reallocated from existing expenditures. Managers need timely information on what is working and what is not so they can direct resources to where they are most effective.
- To make better decisions. Officials need this kind of performance information for the many day-to-day decisions they must make to manage effectively the programs and services provided to Canadians.
- To provide for better accountability. Canadians are demanding greater accountability from their governments. Trends in public opinion show a reduced confidence in government and, in a recent survey, the item most often cited as necessary for improved governance was accountability for measured results and effectiveness. Increasingly, Canadians are asking for better information on what they and their country are getting for their tax dollars; members of Parliament have expressed frustration at the lack of this kind of information. Moreover, with governments seeking alternative ways of delivering programs and services that involve other levels of government and voluntary and private organizations, credible information on performance is needed even more to ensure that tax dollars are being spent well.
Other jurisdictions report on accomplishments

5.8 Many jurisdictions are making visible efforts to provide their legislatures and citizens with better information about the performance of government programs. In Canada, in addition to the federal government, a number of provinces have moved in this direction. Appendix A outlines the reporting regimes in Alberta, Nova Scotia and New Brunswick as examples. Abroad, quite a few countries have moved to improve reporting of the performance of their governments as a whole and of individual departments and agencies. Appendix A briefly describes the cases of the United Kingdom and Australia, as well as the United States federal government and the states of Oregon, Texas and Minnesota.
5.9 We did not examine the success of initiatives undertaken in other jurisdictions or assess the extent to which better performance information is being provided than is the case in Canada. However, the experience reported by these jurisdictions shows that although providing performance information is not easy, it can be done. It is also clear that several years will be needed to fully implement initiatives to improve performance reporting.
What is "performance"?

5.10 The concept of performance deals with how well things are done:
- Were the expected results accomplished?
- Were they accomplished within budget and in the most efficient manner?
- Were there undue, unintended consequences?
- Is the organization learning and adapting?
5.12 Outcomes can range from intermediate outcomes, such as changes in the actions of program clients and their satisfaction with a service, to more long-term or ultimate outcomes, such as general improvements in the well-being of Canadians, the economy or the environment. Intermediate outcomes are more easily linked to the activities of a program than are ultimate outcomes.
5.13 Exhibit 5.2 illustrates how the various results of an anti-smoking program could be characterized. Another example can be found in Chapter 10 of this Report, which presents a diagram of results for an activity providing information on energy efficiency (Exhibit 10.5).
Knowing how well programs are working is a challenge

5.14 Complex programs mean that expectations must be clear. Federal programs operate in a complex environment. They often have to contend with diverse and unpredictable external factors and with other players that have similar or competing objectives. Moreover, they do not operate in a stable environment; change is now the norm. Furthermore, programs are trying to accomplish broad public objectives reflecting the public interest, which can sometimes be a challenge to measure.
5.15 Different individuals with different expectations about what a program is supposed to accomplish can reach different conclusions about how well the program is doing. Assessment of actual performance requires the establishment of a standard for comparison, such as a specified level of expected performance. These standards ought to be clearly stated as meaningful performance expectations.
5.16 Information on a program's performance must be gathered. In some cases, individuals can assess the direct service that each has received - such as assistance in submitting income tax returns, employment guidance or an import permit for a commercial shipment. But more information is needed to assess the overall success of a program, such as whether it ensures that Canadians pay their fair share of taxes, or adequately reduces the risk to specific endangered species. The managers of the programs need the same information if they are to manage for results. Gathering and assessing the relevant data and information requires systematic effort.
5.17 Information on program performance is an essential part of the process by which the government plans, budgets and seeks parliamentary approval for future expenditures. This process is the expenditure management system.
The Expenditure Management System Has Been Revised
A focus on results of programs in a world of no new resources5.18 Expenditure management in the federal government during much of the 1980s and into the early 1990s was characterized by numerous, relatively small, across-the-board expenditure cuts and occasional freezes on expenditures. At the same time, managers were given access to policy and operational reserve funds, albeit in diminishing amounts. Although cumulatively the incremental reductions were significant, expenditure growth was not contained.
5.19 Moreover, the expenditure management system did not meet basic information needs. Our 1990 study of efficiency in government found that the planning and accountability documents required at that time often did not contain information pertinent to managing for efficiency. A year later, we noted that it was hard to manage by results when few meaningful ways to measure them had been put into use.
5.20 Revisions introduced in 1995. The revised Expenditure Management System (EMS) was announced in 1995 in the context of the government's aim to reduce the deficit to three percent of GDP. In the 1995 publication "The Expenditure Management System of the Government of Canada", the President of the Treasury Board noted that "the old ways [of managing expenditures] clearly [did] not work in a time of severe fiscal restraint." A key feature was to "improv[e] information on program performance to aid decision-making and facilitate accountability." Another key feature was the elimination of central policy reserves to fund new initiatives, since "central policy reserves did not encourage the ongoing review of existing programs and spending." New initiatives were to be funded through reallocation of existing resources. The Program Review the government undertook in 1994 identified specific programs and services to be reduced or eliminated.
5.21 As we have reported in the past, previous attempts at gathering and reporting performance information have met with limited success. None, however, was driven by the current need to make difficult spending choices and to gather support for reducing expenditures.
A framework for the revised Expenditure Management System5.22 The main features of the revised EMS were contained in a document tabled in the House of Commons in February 1995. Subsequently, additional refinements have been developed and are being implemented, sometimes on a pilot basis. Exhibit 5.3 identifies the various documents and processes that make up the Expenditure Management System.
5.23 From Parliament's perspective. Phase II of EMS reform (Phase I was the introduction of Business Plans described in paragraph 5.29), known as the Improved Reporting to Parliament Project, is intended to improve the expenditure management, planning and performance information that goes to Parliament. The major changes are in the Estimates documents and the way they are dealt with by Parliament.
5.24 The Estimates are currently in three parts: Part I, the overview of government spending; Part II, the proposed appropriations; and Part III, detailed expenditure plans for each department and agency. The intention now is to split Part III into two new documents:
- a Report on Plans and Priorities, tabled in the spring in conjunction with Parts I and II of the Estimates, and intended to establish performance expectations and outline the general direction the minister wants the department to take during the estimates year and the next two fiscal years; and
- a Performance Report, tabled in the fall for the period at least up to the previous 31 March, and intended to provide information on results actually achieved in serving Canadians both as clients of specific services and as taxpayers.
5.26 In addition, the government is testing an In-Year Update to Parliament in the fall as a complementary document to Supplementary Estimates, noting any significant changes in either the financial or non-financial figures tabled the previous March in the Main Estimates for the current year.
5.27 The timing of the spring plans and fall Performance Reports is intended to enhance the role of standing committees. Fall Performance Reports enable committees, in a timely way, to report their conclusions to the House and to make their views known to the government as it carries out its fall budget consultations. Standing committees may thereby influence spending plans and priorities of the next or subsequent years. Similarly, spring plans may provide a focus for committee reporting to the House on future years' plans and priorities, facilitated by 1994 changes in House rules.
5.28 The key milestones in the Improved Reporting to Parliament Project are outlined in Exhibit 5.4. The Outlook documents mentioned in the exhibit were introduced in response to the February 1994 changes in House rules and will be subsumed by the Reports on Plans and Priorities. Sixteen pilot Performance Reports were tabled in the fall of 1996 (see Exhibit 5.5 for a listing of these departments). Pilot Reports on Plans and Priorities were tabled in the spring of 1997 by these departments, along with the 1997-98 Main Estimates.
5.29 Within government, business plans are being prepared. An integral element of the EMS is the Treasury Board requirement for departmental Business Plans to be submitted to the Board for review, in place of multi-year operational plans. Business Plans are concise, strategic, three-year plans that set out goals, targets and performance measures. Several provinces, including Alberta, Ontario, and New Brunswick, also have adopted this approach.
5.30 The intention of the government is that Business Plans are to be the main planning document of departments and submitted to Treasury Board ministers. Although these are not public documents, the public Reports on Plans and Priorities will integrate the business planning information with other reporting requirements established by Parliament.
5.31 An important element of the revised EMS entails departments moving to a comprehensive, results-focussed accountability structure for reporting financial and non-financial information, known as the Planning, Reporting and Accountability Structure (PRAS). The PRAS relates the internal management and accountability regime of the department to its objectives, business lines, resource requirements and performance targets. It is intended to provide the basis for reporting in Business Plans and Estimates documents.
Focus of the Audit

5.32 Because of the length of time needed to make major changes and the need to adapt to changing circumstances, implementation of the revisions to the Expenditure Management System is ongoing. We could have waited until the system was fully mature before auditing in this area. However, we decided that it would be more useful to communicate clearly our expectations and assessment of progress at this time, so that they can contribute to the ongoing implementation of the system.
5.33 We undertook the audit to assess the progress departments are making in their reporting of expected and actual performance in the Expenditure Management System. We also sought to identify examples of good reporting practices that exemplify our expectations (set out in detail in Appendix B). These expectations were developed in consultation with the Treasury Board Secretariat. They are based on our assessment of experience in Canada and other jurisdictions. We identified five characteristics needed to give a credible account of performance:
- clear context and strategies;
- meaningful performance expectations;
- accomplishments reported against expectations;
- demonstrated capacity to learn and adapt; and
- fair and reliable performance information.
5.35 We also examined the role played by the Treasury Board Secretariat in implementing innovations to encourage better reporting of information on performance, and the use made of this information by parliamentary standing committees. Full details on the conduct of the audit can be found at the end of the chapter in About the Audit.
Observations and Recommendations
Revised Estimates Framework Shows Potential for a Greater Focus on Results
More opportunities for standing committees to comment on results

5.36 An emphasis on results. Communications about the revisions to the Expenditure Management System and related Estimates documents emphasize an important message - the need for departments to make available information on what programs are intended to achieve and what they are accomplishing.
5.37 Separate reports fit the budget cycle better. Information contained in the Main Estimates submitted to Parliament each spring by the government is the basis for parliamentary approval of resources the government requires for the coming fiscal year. The Estimates contain a mix of financial and non-financial information that covers past expenditures and accomplishments, year-to-year financial reconciliation, and planned expenditures and activities. By separating information on future plans from information on past performance, especially in the fall Performance Reports, the new framework draws more attention to the performance achieved by programs. It also provides performance information on previous fiscal years five to six months earlier than before, serving as timely input to budget consultations.
5.38 In-Year Update potentially provides better information on changes to expectations. Supplementary Estimates are requests for changes to program funding in the Main Estimates. In November 1996, an In-Year Update report covering the 16 pilot departments was tabled, to test a way of complementing Supplementary Estimates by providing better information to Parliament on significant adjustments to program priorities, expectations, and changes in planned spending. The In-Year Update is intended to make Parliament more aware of program changes that have significant impacts on Canadians. We assessed the nature of the departmental input to the In-Year Update and found that none reported changes to expected performance, although some had proposed significant resource changes. The Report did provide some helpful information on resource reallocation that would not be evident in the Supplementary Estimates.
Promising improvements in accessibility of information

5.39 Improved timing of reports is, in itself, not sufficient to make information useful. We also expected the information to be organized and presented so that it is easily understood and the reader can readily access other desired information.
5.40 Promising improvements in accessibility through information technology. Treasury Board Secretariat staff are working with departments and the House of Commons to improve the electronic distribution of pilot documents. The intention is to use the Internet as the main tool for electronic communication of these documents to all MPs, their staffs, and the public. To date, the pilot Performance Reports, the In-Year Update report, the 1997-98 Reports on Plans and Priorities and all Estimates Part III are available through the Internet at http://www.tbs-sct.gc.ca/tb/rpem/homee.html. The intention is to continue this practice. In addition, the Review Information Network of the Secretariat Internet web site (http://www.info.tbs-sct.gc.ca/rin_hm_e.html) contains information on hundreds of recent reviews of various aspects of performance of selected programs.
5.41 A Parliamentary Working Group on the Improved Reporting to Parliament Project was convened to give feedback on proposed improvements to the expenditure management information provided to Parliament. The Working Group reports that members of Parliament support the move to electronic reporting. The offices of all MPs and Senators are currently connected to the Internet. Although Treasury Board Secretariat officials have consulted with the House of Commons and with departments, parliamentary staff believe that better co-ordination of detailed implementation is needed in the future. For example, the format used by the Secretariat to make Performance Reports available on the Internet is not suitable for some systems. A continued partnership between the House of Commons and the Treasury Board Secretariat will help to ensure that the Members' needs are met and that information is distributed in a timely and effective manner.
5.42 Shorter, more readable documents. Parliamentarians have complained about the excessive amount of information in the Estimates Part III. The proposed framework for reporting to Parliament would address this by providing briefer documents that focus on key strategies for at least the current and next two years.
5.43 We assessed the structure and readability of the pilot Estimates Part III and the fall Performance Reports and compared our assessments with those contained in Treasury Board Secretariat evaluations of the documents. In most cases, our Office and the Secretariat found that the documents were shorter and more readable and used better communications techniques than previous Estimates Part III. Parliamentarians consulted during the Secretariat's evaluations of the pilot Estimates Part III and the fall Performance Reports found the separation of planning and performance information helpful and the documents more focussed and clearly organized than the previous Estimates Part III.
5.44 Need more links to detailed information. Users of information on departmental performance may sometimes need more detailed information than is provided in shorter documents. Although parliamentarians have complained about excessive information in the Estimates Part III, they and their staffs have also observed that there may not be enough detail on specific topics of interest. Balancing these concerns requires that documents provide links to more detailed information. The pilot Estimates Part III and the fall Performance Reports of some departments contained some references to sources of more detailed information. However, in many cases there was no consistent and systematic use of such references.
5.45 The Treasury Board Secretariat intends that departments will electronically link Expenditure Management System documents on the Internet to more detailed sources of information. Consistent with recommendations in our 1992 study of information to Parliament, this would allow for shorter printed documents and lower costs than with printed distribution, while providing easy access to additional information where needed. Electronic linkage is not expected to be completed for two years.
5.46 Departments should ensure that their reports contain appropriate references to more detailed sources of information.
Need for a formal means to ensure that improvements continue

5.47 The current reporting framework has been brought about through administrative changes and motions in the House, and implemented only on a pilot basis. Although these can be successful approaches, neither has the force and permanence of legislation.
5.48 Other jurisdictions report some success with accountability legislation. A number of other jurisdictions with systems of government similar to the Canadian federal government, including Alberta, Western Australia and New Zealand, as well as the U.S. federal government and a number of state governments, have made use of legislation that requires departments and agencies to report on their accomplishments. (Appendix A provides some information on several of these jurisdictions.) These planning and reporting regimes, while established through legislation, are similar to the one being implemented in the Canadian government.
5.49 Some Canadian federal entities have legal reporting requirements. Since 1984 the accountability framework for federal Crown corporations has been contained in Part X of the Financial Administration Act. The Act requires the tabling in Parliament of summaries of corporate plans that set out corporate objectives and expected performance, and annual reports that indicate actual performance against those objectives. Since then, our audits have found improvements in information presented to Parliament on the performance of Crown corporations, although further improvements are still required. Some other organizations have specific legislated reporting requirements. For example, enabling legislation for the Atlantic Canada Opportunities Agency requires it to evaluate its activities and "report on the impact [its] activities have had on regional disparity." Since the 1987 legislation, the Agency has reported considerable information on the impact of its two main programs. Yet, as we reported in 1995, improvement still needs to be made in the reported information. We also reported that the Atlantic Canada Opportunities Agency is taking steps to build on what has been learned. We have not audited the Agency's performance information since that time.
5.50 Legislating government reporting requirements is not enough on its own to ensure the quality of performance reporting or the appropriate use of the information by legislatures. The United States General Accounting Office identified five other key challenges:
- developing and sustaining commitment by top management;
- building the capacity of agencies to implement the legislative framework and use the resulting performance information;
- creating incentives to implement the framework and to change the focus of management and accountability;
- integrating the framework into daily operations; and
- building a more effective legislative oversight approach.
5.52 The progress observed in the initiatives to date needs to be continued if their potential for better reporting and use of information on performance is to be realized.
5.53 The government, with Parliament's agreement, should ensure that the improvements in the reporting regime implemented on a pilot basis are made a permanent and formal feature of the Expenditure Management System covering all departments and agencies, and should seek parliamentary approval for incorporating these improvements in the business of supply.
Progress in Describing the Department's Business

5.54 We expected the performance plans and reports to provide an overview of what departments are trying to accomplish under what circumstances, and the relationship of individual programs to broader organizational goals and commitments to the public. This information would allow the reader to interpret the program performance that is reported, and it is needed to help manage programs and make decisions about the allocation of resources.
In many cases, descriptions of departments have improved

5.55 Treasury Board Secretariat evaluations of the pilot Estimates Part III and fall Performance Reports observed that, in most cases, members of Parliament and their staffs found the overviews of departmental functions, structures and main activities better than in the past.
5.56 Improved descriptions of mandate and mission. We found that most departments provided clear descriptions of mandate and mission. The best of the mission statements were related to departmental business lines and contained a clear vision of an ideal outcome toward which the department was striving.
5.57 Limited improvement in breadth of stewardship information. We noted in our 1992 study on departmental reporting that the existing Estimates documents focussed on annual spending and consequently did not often report on the other instruments departments use to meet the objectives of the programs they deliver - in other words, global stewardship. Regulations, loan guarantees and tax expenditures are some common examples of these other instruments. We recommended that the government report on the broader stewardship responsibilities and not just on annual spending. In 1994, we expanded on this concept to include the need for a lead department for overall reporting of activities that are delivered by more than one department. We, as well as parliamentary staff, expect that departments would at least refer to other government activities that bear on the outcomes they are trying to achieve. For example, Indian and Northern Affairs Canada provided in its fall Performance Report some financial information on other federal departments dealing with First Nations. We note that Treasury Board Secretariat guidelines point to the need to include material on global stewardship.
5.58 However, Performance Reports showed no clear overall improvement in the provision of such information on global stewardship. The references to non-spending instruments, when made, are generally included as supplementary financial information and are not incorporated into the general discussion of the department's context, business lines and strategies. We recognize that some of the information needed to report on global stewardship is beyond a department's direct responsibility - for example, many tax expenditures. However, where this is the case, we would expect, as a minimum, a short discussion of the extent of the use of such instruments and their impact on the outcomes the department is seeking.
5.59 Departments should include in their performance reporting reference to related activities elsewhere in government, as well as a discussion of all the key instruments that they use to achieve their objectives, including those instruments that are not reflected in estimates of spending for the coming year.
Progress in Orientation of Results Expectations, but Need for Greater Clarity and Concreteness

5.60 We expected performance plans to provide an understanding of what the department intends to achieve, by when, and at what cost. We looked for a focus on major intended results of key programs, especially on outcomes that represent the value received by program clients and other Canadians from government activities. This information would assist members of Parliament in their scrutiny of future departmental spending and is needed by managers to control their programs. It is essential for reporting performance.
Expectations focus more on outcomes

5.61 Good progress in establishing outcome-oriented expectations. We compared performance expectations set out in the spring in the Estimates Part III and the Business Plans of 11 departments with those subsequently set out in the fall in departmental Performance Reports and in Annex B of the Annual Report to Parliament by the President of the Treasury Board, "Getting Government Right: Improving Results Measurement and Accountability". In the majority of the cases, we found the focus on outcomes was greater in the fall than in the spring. Overall, of the more than 600 expectations set out in Annex B of the President's Report in the fall, more than half were for outcomes. We consider this to be good progress toward a focus on outcomes.
Expectations need to be clearer and more concrete

5.62 Statements of expectations need to indicate clearly what would have to occur for a program to be judged successful, given the department's mission and the program's objectives and activities. They need to be expressed concretely in terms that permit measurement. The measurement can involve numeric estimations of magnitude or "softer" forms of measurement, such as the solicitation of users' views. However, if the expectations are not clear and concrete, they are not useful for assessing how well a program is doing.
5.63 Few statements of clear and concrete expectations. Our assessment of the performance expectations stated in the President's Report and fall Performance Reports found some that were specific enough to give a clear picture of desired results (Exhibit 5.6 gives a few examples).
5.64 While they are clear, however, these expectations are not concrete: one does not know when things are to occur or exactly what level of performance is needed to fulfill the commitment. A better approach to establishing clear expectations is to set specific targets, as illustrated by the Atlantic Canada Opportunities Agency and Revenue Canada (see Exhibit 5.7). The Atlantic Canada Opportunities Agency, in particular, stood out as having set a large number of targeted outcomes, and while it did not do so in all cases, in many instances it indicated its contribution to changes in the outcomes.
5.65 Where targets have not yet been established, expectations need at least to set out the expected direction of improvement in performance, as demonstrated in Exhibit 5.8. About one fifth of the expectations reported in Annex B of the President's Report did so. However, indicating the direction of improvement does not tell the degree of change that would have to occur, and by when, for the performance to be judged successful.
5.66 We compared the clarity and concreteness of expectations set out in the fall with those in the spring, for 11 departments. We found some improvement in clarity, but little improvement in concreteness.
5.67 Overall, we found that the nature of performance expectations varied considerably. Some were general statements of goals to be achieved, some were descriptions of the activities that programs were going to carry out, others were lists of planned performance indicators, and still others were actual targets against which performance could be assessed. This was especially true of the results commitments set out in Annex B to the President's Report. The wide range of the commitments stated in the Report makes it difficult to compare departments and suggests that there was a need for clearer guidance on what these statements should have been.
5.68 Clear and concrete statements of expected results would allow actual performance to be assessed. Establishing expectations that are reasonable and measurable and that clearly indicate what results can be expected of a program can be a challenge. Any given department may be successful at meeting this challenge for some areas of activity but not for other areas. Overall, with a few notable exceptions, the performance expectations set out in the fall did not include targets and many were not specific. Many were too ambiguous to allow us to determine the anticipated result.
5.69 Exhibit 5.9 provides a few examples of imprecise performance expectations. These examples are intended to be illustrative; they should not be taken as an indication of these departments' overall progress toward setting clear expectations, or of whether they have made more or less progress than others.
5.70 In these examples, only a very general expectation is stated. It is difficult for readers to understand in advance how much of exactly what is expected to occur, unless additional detail is provided. For example, although Statistics Canada has taken an important step in identifying a final outcome that may result from its statistics, the "redesign of health care policies and programs" could be satisfied by quite minor changes in a single aspect of a health service being delivered, or could require substantive changes to health care in Canada, depending on the perspective of the reader. In its Performance Report, Statistics Canada discusses processes in place to monitor the utility and relevance of its products, and it may be able in the future to provide more specific information about accomplishments in this area. However, general expectations of the kind shown in Exhibit 5.9 inherently create a potential for disagreement about whether or not they have been met.
5.71 Performance indicators often not clearly explained. A performance indicator is the kind of information actually used to assess a specific aspect of performance. In some instances, Performance Reports clearly identified and explained the kind of information that would be used. For example, Exhibit 5.10 shows how the RCMP explained the meaning and pertinence of the "clearance rate".
5.72 However, we found instances where it was not evident what the performance indicator was measuring or why it was being reported. Revenue Canada, for example, in discussing the performance of its Customs Border and Trade Administration Services, presents several indicators, including "Compliance Rate", expressed as a percentage, under the heading "Travellers". Although the compliance rate may be a good outcome-oriented indicator, the Performance Report provides no explanation of what the rate includes. The other two indicators are not defined either. However, the 1996-1997 Estimates Part III for Revenue Canada did provide brief definitions of these indicators. As another example, many departments refer to "client satisfaction" without explaining satisfaction with what, and how it will be determined.
5.73 In some cases, where departments did not report clear and concrete indicators of performance, they identified indicators they plan to use or areas for which they are developing indicators. This is a good practice. For example, Environment Canada announced that it was developing measures of its impact on the consideration of sustainability in energy decisions and on environmental stress caused by transportation. Statistics Canada proposes to develop an indicator to measure the average burden that responding to its surveys places on individual Canadians. It currently reports the average "response burden" on all businesses.
5.74 Departments should establish clear and concrete statements of the performance expected from their lines of business. These should be included in their Business Plans, Reports on Plans and Priorities, and Performance Reports.
Some Progress in Reporting Accomplishments, but Still Cannot Judge Success of Programs
Some performance reporting meets many of our criteria
5.75 We expected to see progress in the reporting of accomplishments in relation to expectations. At its best, performance information provides a credible account of how well a department has done. This account needs to focus on outcomes related to key expectations. It ought to describe what the department is trying to achieve and what it contributes to that end, given the influence of other actors and outside factors. We also would expect the account to show that the department has accepted responsibility for results by taking action to address weaknesses in performance. Finally, it ought to be made credible by indicating the extent to which the reader can rely on the information presented. Appendix B outlines our expectations in more detail.
5.76 We examined the reporting of accomplishments in the fall Performance Reports. The best examples provided many of the key elements for reporting performance, but not all of them.
5.77 For example, Transport Canada's reporting on its aviation safety activities did not provide concrete performance expectations, nor did it comment on the reliability and validity of the performance indicators used. However, it did provide the general objectives of the Department's aviation activities, a clearly explained outcome-oriented indicator (accident rates), a limited description of the activities the Department undertakes to influence the outcome and a discussion of the limitations on its ability to link its activities to the safety of the aviation system (Exhibit 5.11).
Need more progress in reporting key accomplishments
5.78 Reporting of performance accomplishments needs to be selective to be usable. We expected that the fall Performance Reports would focus on the key intended results, especially outcomes, and key or significant programs or program elements. We also expected some reporting against previously stated expectations.
5.79 In 11 departments, we compared the reporting of outcomes in the fall Performance Reports with reporting in the spring 1996-97 Estimates Part III. We found some improved reporting of outcomes in the majority of these 11 departments between the spring and the fall.
5.80 Nevertheless, our examination of the 16 fall Performance Reports found that much progress was still needed in the reporting of outcomes. Although departments reported some intermediate and few ultimate outcomes, overall there were many instances where outcomes were not identified. However, expectations established in the past tended to focus on outputs rather than intermediate or ultimate outcomes. We observed a greater emphasis on outcomes in the performance commitments reported in the President's Report of fall 1996. This may lead to greater reporting of intermediate and ultimate outcomes in the future.
5.81 Even when outcomes were reported, the general lack of clear and concrete expectations made it impossible to judge how well most programs were doing.
5.82 Emphasis on activities and outputs. In interviews conducted by the Treasury Board Secretariat, officials in most departments said that although they are committed to better measurement of results, actual reporting is still oriented toward activities and outputs. Of the departments that said they are moving toward reporting results, all but one acknowledged that their reporting focusses on outputs and that much work remains to be done to move toward reporting outcomes. When we examined the fall Performance Reports, we found that the emphasis was on describing activities undertaken and their outputs. Although some description of activities and outputs is essential to a credible account of performance, the picture is not complete if outcomes are not addressed. In the future, we will report on the obstacles departments encounter in moving toward increased measurement and reporting of outcomes.
5.83 Reporting is often on details. Some staff responsible for departmental planning and reporting have said that strategic reporting - that is, reporting that focusses on key activities and results - is difficult because all managers want their own programs and activities included. But if reporting is to be strategic and focussed, it must be selective and not cover all aspects of a department in detail. For some departments, performance reporting was not always strategic, focussing on detailed lists of activities undertaken within business lines or on details of administrative and management matters. This reduces the readability and increases the length of reports. Decreasing the discussion of these details would allow departments to address elements of performance reporting that need more attention.
5.84 Departmental performance reports should be more strategic, providing a better account of the key aspects of performance.
The department's contribution not frequently made clear
5.85 The vast majority of departments did not show how their reported achievements related to their activities. Nor did they clearly describe or discuss their contribution to outcomes in comparison with the contributions of other parties. Although a few departments provided information that gave some understanding of their own contribution to their intended outcomes, none did this consistently well.
5.86 We realize that determining the contribution made to an outcome can be a challenging task. Nevertheless, reporting on the performance of key outcomes ought to recognize the other factors that may be influencing an outcome, describe how the program's activities are intended to influence the outcome sought and highlight any available evidence supporting the contention that the program is having an impact. This need not be a lengthy text; its purpose would be to explain to the reader why the program is thought to make a difference. Where little is known about the extent of the contribution, this ought to be acknowledged. Performance in controversial areas might, at some point, need more formal program evaluation studies to assess the program's influence.
5.87 Reluctance to report results that cannot be fully controlled. In our discussions with program managers, they sometimes indicated an unwillingness to report on results that they could not fully control. Managers are less likely to have full control over outcomes than over outputs. The Treasury Board Secretariat's evaluation of the fall Performance Reports observed that in 10 departments, managers stated that the inability to attribute outcomes directly to departmental efforts was a major factor affecting their reporting of results.
5.88 This problem needs to be addressed. Departments ought to be giving a credible account of their performance, discussing - with the evidence available - the contributions they are making to all key outcomes, even those over which they do not have complete control. Few departments have attempted to do so.
5.89 Departments should describe how the activities of their programs contribute to the reported outcomes.
More attention needed to the fairness and reliability of performance information
5.90 To be considered and used, performance information needs to be understandable and credible. Credibility requires information that is understandable, reliable and valid - that is, information that is verifiable and measures what it purports to measure.
5.91 Balanced reporting enhances credibility. Some managers may want to avoid criticism by reporting only performance that has met expectations. Doing so, however, can lead to Performance Reports that lack credibility because they appear to be just "good news stories". Indeed, this was the view expressed by some of the parliamentarians interviewed as part of the Treasury Board Secretariat's evaluation of the Performance Reports. More balanced reporting is essential if the information is to be credible and used outside the department.
5.92 For the most part, the pilot Performance Reports focussed on positive accomplishments, with limited or no attention paid to areas where expected results were not being achieved. Where expectations are not clear (as was common), it is difficult to judge whether programs are accomplishing enough.
5.93 Nevertheless, we found several instances where departments did enhance the credibility of their Performance Reports by describing performance that had not met expectations. Exhibit 5.12 shows Environment Canada's description of shortfalls in bringing back threatened species.
5.94 Departments can adopt one of two constructive approaches to reporting performance that is lower than expected. They can report actions planned to improve performance or, where expectations are no longer feasible and appropriate, indicate changes in those expectations.
5.95 Reporting corrective actions that are being taken. Departments can acknowledge areas where performance goals were not met, and indicate what they have learned about the reasons for the performance gap and what corrective action they plan. This demonstrates a balanced approach to performance reporting and proactive management. Criticism of performance that has not met expectations is perhaps most warranted where there is no evidence of having learned from experience.
5.96 We found a few examples of reporting that both identified problems and discussed what was being done to address them. For example, Transport Canada's Performance Report recognized that the system of public harbours and ports suffers from overcapacity and too much bureaucracy, and it promised that a policy would be implemented to correct the situation. The Atlantic Canada Opportunities Agency reported that it had dropped a loan program after evidence had shown it to be not very effective. Another example was Statistics Canada, which reported that it was restructuring surveys in response to studies of the strengths and weaknesses of its energy, retail, wholesale and manufacturing statistics.
5.97 Revising expectations. In some cases, performance may not meet expectations because the expectations are no longer feasible or appropriate in light of current circumstances. In these cases, departments could update their stated expectations and provide the rationale for doing so. We found no instances where departments clearly reported changes in expectations.
5.98 Where performance does not meet expectations, departments should report the performance gap, the reasons for it and the changes they are making to close the gap.
5.99 Credibility requires accurate information. To be credible, information presented in Performance Reports must be reliable and valid. Furthermore, we expected that reports would indicate the information's reliability and validity - where the reader may question it - by describing, for example, the method of data collection and verification, the accuracy of the data, their relevance and significance, and related factors also influencing the performance reported. We would also expect any limitations in the reliability of the data to be reported.
5.100 Few departments presented information of this type to indicate the extent to which their reporting of performance can be relied upon. For example, the source of outside information needs to be presented, allowing for a determination of its reputation. Environment Canada took this approach by indicating whether the Department itself or a reputable professional journal was the source of information used to report progress in controlling emission of ozone-depleting substances.
5.101 Unintentional and relatively minor errors in reporting can nevertheless be damaging to the credibility of a department's Performance Report. We verified the accuracy of reported information in only 18 specific instances, where we expected to find good performance information that could be used as examples in this chapter. This was not a representative sample of either the reports or the information in them. In verifying the information, for the most part we did not audit the sources of the information in detail.
5.102 We assessed the methodology used to develop the reported information and assessed the consistency of the information with that contained in the original sources. In almost half of these instances we found problems, including errors in reporting or calculating numbers and inadequately qualified reporting. The errors are mostly minor and did not result in misleading statements. However, they do point to the need for better procedures to check the performance information being reported.
5.103 Departments should provide, where appropriate, an indication of the strengths and weaknesses of reported information and their implications for the reported performance. They should develop and implement procedures to ensure credibility and accuracy of the information reported.
Little information on interdepartmental and horizontal issues
5.104 In her Fourth Annual Report to the Prime Minister on the Public Service of Canada, the Clerk of the Privy Council and Secretary to the Cabinet observes that horizontal policy issues - ones that cut across a number of departments - are growing in number. The Report also notes that tackling these issues requires an expanded knowledge base and increased interorganizational collaboration. We have commented in the past on weaknesses in the availability of information on the results of interdepartmental and horizontal activities. There is still a need for a mechanism to ensure that requirements for information on interdepartmental and horizontal performance are identified and addressed and that the performance is reported. The reports that make up the information base of the Expenditure Management System are mostly provided by individual departments and therefore do not contain information on interdepartmental and cross-government aspects of performance.
5.105 In the President's Report to Parliament for 1996, the Treasury Board Secretariat announced that it would be working with Statistics Canada and other departments and agencies to develop a set of core government-wide performance indicators that could be used for future reporting. An interdepartmental committee has been established, and work is under way. Progress is to be reported in this year's President's Report. Establishing a set of government-wide indicators would be one way to report on horizontal and interdepartmental performance. As illustrated in Appendix A, a number of other jurisdictions have moved significantly in this direction and now regularly report against a set of global performance expectations.
A Vital Role for Treasury Board and Its Secretariat
5.106 The Treasury Board Secretariat plays a key role in the Expenditure Management System. We expected that it would ensure a reasonable rate of progress in improving performance reporting within the EMS. Ensuring reasonable progress requires that the Secretariat provide adequate guidance and direction to departments and agencies and follow through with supportive action. It also requires that it foster a climate for credible performance reporting by showing leadership, demonstrating commitment, providing incentives, facilitating the building of expertise and developing a capacity to learn and adapt. It is important that the Secretariat recognize the challenges faced in implementing system-wide changes of this size, and capture and act on the lessons learned in order to better manage change in the future.
Treasury Board Secretariat has implemented important innovations in a short period of time
5.107 Change has been rapid, making the job difficult. The Treasury Board Secretariat has moved quickly to implement revisions to the Expenditure Management System. Rather than implementing all the revisions at once, it has taken a phased approach that includes trial implementation of innovations in departments at each phase of the budget cycle, building on the experience gained in previous phases. The Secretariat consulted extensively with parliamentarians, this Office, departments and other stakeholders.
5.108 Changes during implementation have been rapid and frequent, and sometimes confusing for departments, which told us they would have liked more information earlier on how the components of the EMS fit together. Secretariat analysts also found the changes confusing. The initiatives were being implemented at a time of considerable internal change in the Secretariat.
5.109 Steps have been taken to provide guidance and direction. In order to ensure that expenditure management documents meet the needs of Parliament and government, expectations and standards for these documents have to be communicated to the departments that prepare them. The Treasury Board Secretariat has issued guidelines for EMS documents, conducted extensive consultations with departments and provided feedback on draft documents. As part of this audit, we examined the EMS guidelines. We found that they encourage departments to establish and report against expectations for performance. They also encourage a strategic focus on results and outcomes. The guidelines for the fall Performance Reports included principles for performance reporting prepared by this Office in consultation with the Secretariat.
5.110 To develop the performance expectations in Annex B to the President's Report, Secretariat analysts reviewed departmental performance and planning documents and identified an initial set of expectations. These were then presented in the form of "mock-ups" and were discussed with departments. Most departments were able to further improve their performance expectations based on this guidance. We found this approach to be particularly effective in encouraging a greater focus on outcomes.
5.111 We noted that guidance to departments and agencies on the preparation and presentation of financial information in performance plans and reports was limited to the structure of the tables to be included. We believe that better guidelines would help ensure that financial information is more complete, reliable, clearly and fairly presented, and prepared on a basis consistent with the government's stated accounting policies.
5.112 Consistency in the provision of guidance and direction is important. When trying to implement considerable change in a short period, it is particularly critical to convey consistent messages. Although the written guidelines are reasonably consistent, departments told us and the Secretariat that they had received inconsistent advice and feedback as they prepared the new reporting documents. Secretariat analysts acknowledged some inconsistencies in the responses they provided to departments. The guidelines were revised several times in response to feedback and as experience was gained. Requirements for the content of EMS documents were added or changed up to a few weeks before deadlines. While this might have allowed for the most up-to-date guidance to be provided, departments sometimes perceived the changes to be confusing.
5.113 The Secretariat advises us that it has taken measures to resolve this problem by co-ordinating all advice on performance reporting through teams previously established to co-ordinate the development of Business Plans. However, experience in the development of the 1996 Business Plans has shown that the approaches taken by those teams vary considerably. It remains to be seen whether this approach will address the concerns of departmental officials.
5.114 The Treasury Board Secretariat should ensure the consistency of advice and feedback provided to departments on performance reporting.
5.115 Training and development initiatives have been implemented. Moving to a more results-based approach to management is a major change for most programs. There is a considerable need for education and training in how to develop useful measures of performance and how to use them in managing and reporting. This is an important departmental responsibility, but the Treasury Board Secretariat has a role to play as well. The Secretariat undertook a number of initiatives to provide additional guidance and to develop the required expertise in departments. These included sponsoring many workshops over the past two years at the Canadian Centre for Management Development; holding workshops with groups of departments; briefing departments and their executives; and holding or co-sponsoring sessions to review experience with such innovations as departmental Outlook documents, pilot Estimates Part III and Business Plans. The sessions included departments and the users of performance information, including this Office. Some included members of Parliament. In general, we found that the training and review sessions encouraged a focus on results, including outcomes.
5.116 Measures have been taken to learn and adapt. On behalf of the Parliamentary Working Group, the Secretariat has conducted evaluations of pilot Estimates Part III and fall Performance Reports. We agreed with the evaluation conclusions except for the assertion, reported in the evaluation of the pilot Estimates Part III, that performance expectations were set out well in the documents. Treasury Board Secretariat management has agreed with the recommendations flowing from the evaluations of the pilot Estimates Part III and the fall Performance Reports. Appendix C summarizes the key recommendations of these evaluations; we agree with those recommendations.
5.117 We interviewed Secretariat officials and reviewed available documentation to assess the progress made in implementing the recommendations and related commitments to action. Some of the recommendations address matters that fall outside the time frame of this audit; in almost all remaining areas, the Secretariat has made progress. We found that Secretariat officials have used the results of the evaluations to modify their approaches, develop training and guidance, make recommendations to management and support the development of motions for consideration by the House of Commons.
5.118 The Secretariat has also evaluated both the 1995 and 1996 business planning processes. We did not examine TBS implementation of recommendations from these evaluations, many of which fall outside the scope of this audit.
5.119 Better identification of good practices is needed. In the past, departments have expressed concern that the rigidity of requirements for reporting in the Expenditure Management System has not allowed them to communicate clearly the department's business and context. The current approach adopted by the Treasury Board Secretariat responds to this concern by allowing considerable flexibility in the format and content of expenditure management documents. Instruction is in the form of general guidelines rather than directives, and requirements are few. Some departments, however, indicated that they need more help in preparing their documents than the guidelines provide.
5.120 We believe that identifying and communicating examples of good practice - not only for reporting performance but also for identifying expectations, measuring results and using the results to improve programs - is an excellent way to provide guidance. The evaluation recommendations reflect this principle. At the federal level in the United States, documenting cases of good practice is an important part of implementing the legislated performance reporting regime. Our audit found that there are instances of good practice that can be identified and communicated.
5.121 Treasury Board Secretariat has encouraged departments to share good practices as a means to develop the needed expertise. The Secretariat played a more active role in the development of Annex B to the President's Report by providing individual mock-ups, which identified expectations that analysts thought suitable. The most recent Canadian Centre for Management Development workshop on reporting of results also adopts a proactive approach to communicating good practices. Nonetheless, considerable improvement in performance reporting is still required. We believe that more documentation and communication of good practices is needed to help departments produce better information on what they have accomplished.
5.122 The Treasury Board Secretariat should strengthen its effort to document and communicate good practices by departments in articulating performance expectations, measuring results, and using the information to improve programs and report accomplishments.
Departments need to see that the Treasury Board and its Secretariat use performance information
5.123 Measuring and reporting on results will take hold in government only if the information is found to be useful to program managers and seen to be useful to senior government decision makers. Use by others outside the departments can play a key role in encouraging a focus on results. When the Treasury Board and its Secretariat use or demand information on results, they demonstrate commitment and leadership and provide key incentives to continue improving the reporting of results. Departmental officials have told us that if they do not see performance information used in making decisions, it will be hard to justify efforts to improve reporting. At the time of this audit, it was too early to assess central agencies' use of the information in the fall Performance Reports.
5.124 The Treasury Board Secretariat's evaluation of the business planning process highlighted other problems. Secretariat analysts recognized that although Business Plans were used to provide a general context for resource and program design decisions, they had not used information in Business Plans as they had intended, especially to comment on interdepartmental and horizontal issues. Review of Business Plans by the Treasury Board came too late for some departments to make adjustments for the current year. Review was still ongoing for some departments in early 1997. Also, some departments did not get the feedback they desired from the Treasury Board on specific issues.
5.125 The Treasury Board Secretariat should ensure that individual departments are aware of Treasury Board and Secretariat use of performance information provided in the departments' Business Plans and Performance Reports.
A Leadership Role for Standing Committees
5.126 Parliamentary standing committees could play a leadership role in encouraging departments to manage for results, by asking for information on results and by visibly using it in their deliberations.
Standing committee use of performance information is important
5.127 Visible interest in, and use of, performance information by parliamentarians, especially standing committees, can have two benefits: first, if departments believe that committees want the information, they will likely devote more effort to providing better information; and second, parliamentarians will be better served by that information.
5.128 Increased, but still limited, use by some standing committees of the performance information provided by departments. Committee hearings were held in 1995 and 1996 on departmental Outlook documents, Estimates Part III, the pilot Estimates Part III and five of the fall Performance Reports. The Performance Reports were tabled on 31 October 1996, giving limited time for review before the House adjourned in December.
5.129 Departments have said that standing committees pay only limited attention to the accomplishments of programs. Our review of the documents from these hearings showed a small increase in references to and discussion of the performance information provided by departments. However, discussion of performance information provided to standing committees was still minimal.
Treasury Board Secretariat has actively involved Parliament
5.130 Early in the initiative, the Secretariat turned its attention to developing an understanding of the needs of parliamentarians. A Parliamentary Working Group met several times with Treasury Board Secretariat officials. Consultations also took place with other parliamentarians, including standing committee chairs and the chair of the Subcommittee on the Business of Supply of the Standing Committee on Procedure and House Affairs. The officials supported the process leading to the motions passed in the House in December 1995 and June 1996, endorsing the key stages of the initiative. As the project advanced, Secretariat staff explained the intent of the changes through briefings of standing committees and appearances as witnesses at committee hearings. In addition, information sessions and focus groups were conducted with parliamentary staff, including the research staff of the Library of Parliament and procedural clerks of the House of Commons.
Parliamentary use of performance information could be facilitated
5.131 As part of our audit work, we obtained the views and thoughts of a number of standing committee chairs about government efforts to improve information for Parliament and about the role their committees could play in fostering better management in government. They encouraged us to suggest ways that standing committees could be more effective in considering departmental performance information.
5.132 Current role for standing committees and MPs. The need to improve the parliamentary process for reviewing government expenditures has been widely recognized for at least 30 years, including in successive parliamentary committee reports. Members of Parliament have indicated three sources of frustration: the available information does not provide a reasonable perspective on performance; the volume of government activity makes in-depth review impractical; and, as noted in the 1993 Report of the Liaison Committee on Committee Effectiveness, MPs in standing committees see "little point in committing precious time to reviewing estimates that will not be modified as long as the government has a majority."
5.133 Recently, action was taken to re-examine the parliamentary process for expenditure scrutiny. In June 1995 the Standing Committee on Procedure and House Affairs established a Subcommittee on the Business of Supply. This subcommittee was given a mandate to undertake a comprehensive review of the business of supply, with particular attention to the reform of the Estimates and to the related processes and mechanisms of the House and its committees. The subcommittee has issued several reports, supporting and providing guidance to the pilot projects associated with the Improved Reporting to Parliament Project. The subcommittee expects to issue its final report in early 1997.
5.134 Mechanisms for reporting to the House are important. Reforms in the mid-1980s allowed standing committees to report on all matters relating to the mandate, management and operation of the departments assigned to them, including program and policy objectives, expenditure plans and the departments' effectiveness in meeting their objectives. Although this broad mandate applies to reports on priorities and performance, more specific measures can help, at times, to clarify the mandate. For example, in February 1994, amendments to the House Standing Orders facilitated committee review and reporting in the spring on expenditure priorities and plans for future fiscal years. That committees have not yet done so may be linked to parliamentarians' lack of familiarity with the new documents and to limits on the time available to committees for review.
5.135 Review and reporting on the fall Performance Reports would provide another opportunity for committees to question and challenge spending plans and priorities of departments, based on what was accomplished in the preceding fiscal years. The 1994 rule changes clarified the power of committees to report on plans and priorities in the supply process (now to be supported by the spring Reports on Plans and Priorities). New arrangements may be needed to facilitate committee reporting to the House on fall Performance Reports.
5.136 Performance plans and reports, unlike the now discontinued Outlook documents, are Estimates documents. Their formal tabling in Parliament and referral to committees will assist parliamentarians in two ways. First, members and committees will be made aware of these documents at an opportune time; and second, they will receive them in accordance with established parliamentary procedures. The fall 1996 Performance Reports were tabled with the President of the Treasury Board's Annual Report. This practice is expected to continue.
5.137 Timing of reports to Parliament is also important. Parliamentarians would have liked to receive Outlook documents (now part of the spring Reports on Plans and Priorities) and the Performance Reports earlier, to allow more time for review. In order to influence early budget discussions and departmental implementation of previous budget decisions, the spring Reports on Plans and Priorities need to be provided in time for review and reporting before the summer recess. For committee review to be part of the budget consultation process, Performance Reports would have to be provided as soon as possible after the fall session of Parliament begins. Exhibit 5.3 illustrates the timing of EMS documents and their potential review by committees.
5.138 Other jurisdictions are also trying to increase the involvement of legislators in reviewing performance information. We observed earlier that many jurisdictions have made progress in implementing initiatives to provide better performance information to their legislatures. They are also trying to find ways to make the review of this information more meaningful to legislators. New Zealand, the Australian states of Western Australia and New South Wales and the province of Alberta are four that have recognized the need to develop the capacity for legislators to use the information more effectively (see Exhibit 5.13).
Standing committees could ask a number of questions about departmental Performance Reports
5.139 Members of Parliament can expect that, over time, performance plans and reports will:
- explain the context and business of the department and its programs;
- discuss the strategies being used to meet objectives;
- explain what exactly is to be accomplished in the next year or two, and at what cost; and
- describe in a credible manner what has been accomplished in relation to previously stated expectations and at what cost, and what has been learned.
Where reports provide information only on departmental activities (numbers of inspections, reports produced, etc.):
- What are the results of those activities? Why aren't those results reported?
Where performance expectations are vague or unclear:
- How will we know if and when those expected results actually occur?
Where accomplishments have been reported with no reference to a standard or some expected level of performance:
- Is the reported level of performance good enough? How can we tell? How much is left to accomplish and how long will it take? What is the department doing to improve its performance?
- What has been the department's contribution? What other factors influence or limit the department's efforts? What has the department done to develop a clear understanding of its influence?
- How reliable is the information? Why should we believe the information presented?
5.142 Standing committees may wish to consider strengthening their review, challenge and use of performance plans and reports from departments. Government should work with Parliament to develop means for enhanced parliamentary use of departmental performance information as input to the budget consultations.
Conclusion
5.144 Nevertheless, progress has been sufficient to allow us to find examples of good practice that, collectively, demonstrate that the key elements of adequate reporting to Parliament can be provided.
5.145 Although considerable improvement has been made in the reporting regime and in the clarity of communication, the content of performance expectations and reporting of actual performance still need to be improved. Reports need to focus more on key areas of performance so that the information required can be provided in documents that are short and readable.
5.146 It is important to maintain the progress we observed and to make more progress. To accomplish this, leadership by senior departmental management and the Treasury Board Secretariat is required. The Secretariat needs to strengthen the guidance and feedback it provides departments and its documentation and communication of good practices. As the Expenditure Management System continues to evolve, it is important that departments and the Treasury Board Secretariat ensure that the required capacities are present, including the co-ordination needed within these organizations and among organizations.
5.147 Progress will be significantly enhanced if departments clearly see use of performance information by the Treasury Board, its Secretariat, and parliamentary standing committees. More visible use by standing committees could be accomplished through stronger review by committees and a link to the budget consultation process.
Treasury Board Secretariat's comments: In 1995, in Chapter 1 of his report to Parliament, Strengthening Government Review, the President stated the government's commitment to "a management culture that is fact-based, results-oriented, open and accountable." Reporting on performance is a key element in the strategy to make these fundamental changes. It is also a key component of the Improved Reporting to Parliament Project. Accordingly, the government welcomes the Auditor General's support for this initiative.
Our strategy is to focus attention on clear commitments to results, to experiment with pilot reporting, and to involve program delivery and policy managers in measuring performance. The Auditor General's recommendations to departments and to the Treasury Board Secretariat are consistent with current actions and plans. As we move forward to extend fall performance reporting broadly to departments and agencies, we will be considering the Auditor General's recommendations, as well as parliamentary and public response to the reports. We will be tracking how these reports are used, in particular by parliamentarians.
In addition, the government has established an Independent Panel to advise on modernizing comptrollership in the federal government. This Panel will be looking at all aspects of comptrollership, including performance measurement and reporting.
We welcome the Auditor General's continuing attention to these matters. Updates on our progress will be provided annually in the President's Report to Parliament.
About the Audit
Objectives
The objectives of our audit were:
- to assess the government's progress in reporting departmental performance expectations and accomplishments to Parliament and central agencies through the Expenditure Management System; and
- to identify examples of good reporting of expectations and accomplishments to Parliament.
Scope
The audit focussed on performance information in the Expenditure Management System. We concentrated on performance expectations and actual performance in Performance Reports, Estimates Part III, Business Plans, the In-Year Update and Outlook documents of 16 pilot departments for the period 1995-96 and 1996-97. We also assessed performance expectations set out for 32 federal departments and agencies in the 1996 Annual Report to Parliament by the President of the Treasury Board. In particular, we assessed improvements made in the availability and nature of the information reported. We focussed on information about results, rather than financial performance per se.
We examined the role played by Treasury Board Secretariat in implementing related innovations to the Expenditure Management System, encouraging departments to report the expected and actual results of their activities and obtaining feedback and participation of members of Parliament. We also examined the use of performance information in the hearings of parliamentary standing committees.
Finally, we looked at the experience of other jurisdictions in reporting performance and in using accountability legislation to ensure the implementation and continuation of efforts to improve reporting for results.
Criteria
We assessed progress against two criteria. First, departments should report performance information that allows a judgment to be made on how well their programs are doing. More detailed expectations for good performance reporting are presented in Appendix B. We did not expect, at this time, to see all of these expectations reflected in all of the performance information being provided as part of the revised Expenditure Management System.
Second, Treasury Board Secretariat should have procedures in place to ensure a reasonable rate of progress in improving performance reporting within the Expenditure Management System. We expected to see evidence of adequate guidance and direction, leadership, demonstrated commitment, supporting incentives, building of expertise in departments and support for learning and adapting.
Approach
We examined the relevant documents reporting on performance that form part of the Expenditure Management System. For all hearings on Outlook documents and Main Estimates documents for 1995 and 1996, we reviewed standing committee testimony and reports for references to departmental performance and looked for references to performance information in information requests made by standing committee members to parliamentary research staff.
Where appropriate, we relied on evaluations conducted by Treasury Board Secretariat staff of the 1996 business planning process and of the experience with the pilot Estimates Part III and fall Performance Reports.
We reviewed the literature on accountability legislation in other jurisdictions and interviewed officials from selected jurisdictions.
We conducted interviews with departmental and Treasury Board Secretariat officials and discussed improved reporting with a number of standing committee chairs. We also interviewed departmental officials and House of Commons staff with respect to the use of performance information by parliamentary standing committees.
We attended selected meetings of Treasury Board staff and departmental officials, participated in training sessions and observed several standing committee hearings.
Audit Team
Stan Divorski
For information, please contact John Mayne, the responsible auditor.