
1998 December Report of the Auditor General of Canada

Main Points

21.1 In our 1993 Report, we raised a number of concerns about the performance of CIDA's Bilateral Programs. Our 1995 and 1996 Reports contained a description of CIDA's actions to implement an ambitious program of change and renewal. In this chapter we report on the extent to which CIDA has resolved the main concerns we raised in 1993.

21.2 CIDA has adopted the concept of results-based management to change the Agency into a more results-oriented and learning organization. Progress is evident. CIDA has introduced a Results-Based Management Policy Statement, and supporting policies and guidelines have been developed and communicated to staff. It has established a Framework of Results and Key Success Factors as a guide for its managers and partners to focus more on results as they manage projects. However, progress in applying the principles of results-based management to CIDA contracts has been slow.

21.3 CIDA is in the process of developing, reviewing and updating Regional/Country Development Policy Frameworks for all major countries and regions. Some of the more recently developed Frameworks state objectives and expected results. However, the expected results are not compared with actual results, one of the key means of assessing performance.

21.4 At the project level, where official development assistance (ODA) is actually delivered, we found a greater emphasis on results, but uneven progress toward actually managing for results. Critical assumptions for project success and expected results have been articulated and a system of annual progress reporting implemented, but the statements of expected results are not always realistic. Further, once critical risks have been identified and assessed, they are not consistently monitored and addressed. This calls into question whether the expected benefits of projects will be realized. Projects are not systematically monitored after funding ends to determine whether the results expected have been achieved.

21.5 CIDA prepares internal reports on performance, reflecting its increased ability to assess and report systematically on its performance. It also submits an annual Performance Report to Parliament; this would be improved if CIDA showed progress toward results for individual projects in the context of the results expected. In reviewing the Performance Report, we concluded that more rigour is needed to ensure that information is clear, meaningful, and accurate. The delivery of ODA in a great many countries and under uncertain political and economic conditions is a very complex and challenging undertaking, one that results in varying degrees of success. We believe it is important that CIDA bring out this dimension in its reporting; a more balanced picture would enhance the credibility of its reports.

21.6 We concluded that CIDA's actions have addressed the main concerns raised in 1993. The Agency now needs to keep up its momentum in implementing management for results. It has much of the supporting framework in place. To close the accountability loop, it needs to continue working on the measurement and reporting of development results.

Introduction

21.7 CIDA is responsible for administering about 78 percent of Canada's International Assistance Envelope (IAE). The remainder is delivered by, among others, the Department of Finance, the Department of Foreign Affairs and International Trade and the International Development Research Centre.

21.8 CIDA delivers its international assistance through a variety of channels. Geographic programs, also referred to as bilateral or country-to-country programs, account for nearly half of its program expenditures. The rest is delivered through funding to multilateral institutions such as United Nations programs, to international financial institutions such as the Regional Development Banks, and to a multitude of non-governmental organizations including universities, businesses and not-for-profit organizations.

21.9 As a result of Program Review, Canada's budget for international assistance was cut from $2.6 billion in 1994-95 to $2.1 billion in 1997-98; CIDA expenditures for Official Development Assistance (ODA) within this envelope were reduced from $2.2 billion in 1994-95 to $1.7 billion in 1995-96 and have remained relatively constant since then (see Exhibit 21.1). In April 1995, the government transferred the management of the program for Central and Eastern Europe and the Newly Independent States from the Department of Foreign Affairs and International Trade to CIDA. Although the program does not fall within the ambit of Canada's ODA, it is intended to be consistent with the six priorities for official development assistance set out in Canada's foreign policy.

21.10 For 1998-99, the IAE budget is about $2 billion; $1.6 billion of this will be spent on CIDA programs. Bilateral programs (Geographic Programs and Bilateral Food Aid) account for approximately $790 million, or about 50 percent of the Agency's total expenditures (see Exhibit 21.2).

Focus of the audit
21.11 This audit was the third and final phase of a planned follow-up. Phase I consisted of two parts: first, the Agency's self-assessment of the actions taken as part of its renewal to address the principal concerns raised in our 1993 Report; and second, its development of a Performance Measurement Model (now called the Bilateral Project Performance Review System). Phase II covered CIDA's progress in implementing the model to measure and report on the performance of its bilateral programs and projects. This audit was Phase III and followed through on our 1993 work. Details on how we approached this work are presented at the end of the chapter in About the Audit.

21.12 Because ODA is delivered through projects, our audit was focussed primarily at the project level. We reviewed the Agency's programming framework in the countries selected for audit and its linkage with projects in those countries. We examined 28 projects in five countries, paying particular attention to planning, managing for, measuring and reporting results. In the course of our work, we heard many favourable comments by CIDA's country partners about the dedication and co-operation shown by CIDA staff. Our country-specific findings are presented in case studies.

Observations and Recommendations

Resolving Conflicts Among Objectives

Aid is tied more closely to foreign policy objectives
21.13 In 1993, we were concerned that it was difficult for CIDA to concentrate on its objective of putting poverty first and encouraging self-reliance while it also pursued commercial and political objectives that did not always address poverty directly and that encouraged external dependency.

21.14 We felt that competing or conflicting objectives should be pursued only consciously and in full realization of the impact on the primary objectives. At stake was CIDA's ability to pursue its development mandate effectively.

21.15 In 1995, the government released its foreign policy statement Canada in the World, which set out a new mandate for Canada's Official Development Assistance (ODA), and defined international assistance as an integral instrument of foreign policy:

The purpose of Canada's ODA is to support sustainable development in developing countries in order to reduce poverty and to contribute to a more secure, equitable and prosperous world.
21.16 Canada in the World mandates Canada's ODA to concentrate available resources on six program priorities: Basic Human Needs; Women in Development; Infrastructure Services; Human Rights, Democracy and Good Governance; Private Sector Development; and Environment.

21.17 Canada in the World helped CIDA to address problems of conflicting objectives. It led to a policy framework that provides a structure for linking CIDA's development efforts to Canada's foreign policy more rationally than had been the case under the previous strategy. Further, it has helped CIDA to communicate and explain its work to its partners and stakeholders.

21.18 CIDA budgets its bilateral ODA on a country basis. The need to concentrate assistance efforts on a limited number of countries has been a long-standing issue. In Canada in the World, the government committed itself to doing so while maintaining programs in other countries through low-cost, administratively simple delivery mechanisms. Over 120 countries received Official Development Assistance from Canada in 1996-97; the 20 largest beneficiaries accounted for just under 63 percent of the total amount. This represents an increase from 1993-94, when the 20 largest beneficiaries received 58 percent. As shown in Exhibit 21.3, the amount of assistance provided to countries rated by the Organization for Economic Co-operation and Development (OECD) as "least developed" and "other low income" has remained constant at about $458 million, or 55 percent of CIDA's ODA expenditures.

21.19 Canada in the World calls for 25 percent of ODA to be committed to the program priority of Basic Human Needs, which includes emergency humanitarian assistance. In 1996-97, 38.4 percent of CIDA's ODA program expenditures went to Basic Human Needs, including food aid and humanitarian assistance (33.9 percent when emergency humanitarian aid is excluded).

Implementing Results-Based Management

Progress toward results-based management is evident
21.20 As a result of its 1992 Strategic Management Review and our 1993 report on its bilateral programs, CIDA launched an ambitious renewal initiative in 1994. It made a commitment to Parliament to transform the Agency into a more results-oriented, accountable organization. CIDA adopted a policy of results-based management as an Agency-wide initiative to enable it to systematically address these commitments (see Exhibit 21.4).

21.21 To make the policy operational and to integrate the concept of results-based management into project delivery, the Agency undertook several key steps:

  • developed an Agency Accountability Framework;
  • assembled practices, policies and guidelines into a Roadmap for practitioners;
  • articulated its Framework of Results and Key Success Factors;
  • developed Regional/Country Development Policy Frameworks; and
  • provided extensive training.
These and other actions represent significant advances by the Agency in putting into place the supporting framework for implementing results-based management.

An accountability framework has been developed
21.22 In 1993 we highlighted a need to clarify the extent to which CIDA is accountable to Parliament for managing for results, and the extent to which CIDA's staff and its partners are respectively accountable for obtaining results. In July 1998, CIDA's executive committee approved an accountability framework. It identifies accountabilities of the Agency overall; it further articulates the accountabilities of the President, branch heads, and key managers in each branch.

21.23 The basic definition of accountability in the framework requires that CIDA identify its objectives and demonstrate that the resources allocated to it for official development purposes are managed to achieve the intended results. It also requires that the Agency report the results achieved in its development programs to Parliament and the Canadian public. The framework indicates that, through its partnerships with developing countries and Canadian and international partners, CIDA shares accountability for development results and its own accountability must be viewed in that light. CIDA accepts responsibility and accountability for monitoring actions by its recipient partners as well as other events that may affect development goals, and for acting to ensure that momentum toward these goals is maintained.

21.24 This shared accountability for ODA delivery makes it especially important for CIDA to follow through on its programs and projects so it can be in a position to report on whether Canada's ODA is producing the expected results. This type of information would be useful to include in CIDA's Performance Report to Parliament.

21.25 CIDA accepts that it is fully accountable for achieving operational results. This means that it is fully accountable for the setting of objectives, formulation of policies, selection of development initiatives, allocation of resources, monitoring and performance reporting. This also involves identifying expected outputs and outcomes, assessing related risks and monitoring them, and taking appropriate corrective action.

Work on results-based contracting has slowed considerably
21.26 The Agency has not been as successful in its attempt to develop results-based contracting. CIDA delivers nearly all bilateral ODA through contracts with Canadian executing agents (CEAs). In 1995, it recognized the need for a contracting approach that would define the respective accountabilities and risks of both the executing agents and CIDA. In 1996, CIDA consulted with its CEA community. A joint task force, representing CIDA and the community, defined results-based contracting as "a contracting method which employs an iterative and participatory approach to results definition and seeks to promote the achievement of development results through appropriate contractual terms such as the statement of work, the basis of payment (including incentives) and mutually accepted performance indicators."

21.27 The report by the task force in June 1996 expressed some concerns about this approach; to address them it articulated a number of guiding principles. It called on the Agency and the supplier community to use the ideas and suggestions in the report and further develop the necessary policies, programs and operational practices needed to implement results-based contracting. However, CIDA deemed these suggestions impractical to implement. Since then, progress has slowed considerably. The Agency told us that the complexity of trying to develop a results-based contracting approach was much greater than expected. In July 1998, a committee chaired by the President was established to review and simplify CIDA's contracting regime. Part of its work will be to recommend changes needed to implement an approach to contracting for results.

21.28 CIDA should make every effort to identify practical steps that it can take to incorporate the principles of results-based management in its contracts, and experiment with these steps in selected projects.

CIDA's response: CIDA will continue to follow the course of action developed by the Committee for Improving Contracting. Based on the results of pilot projects, it will pursue a practical and iterative approach to contracting for results and apply the lessons learned, where feasible, to its contracting process, recognizing that this is a long-term endeavour.

Risk management and monitoring need to be more consistently integrated into projects
21.29 Our 1993 audit of CIDA's bilateral programs noted the need for improvements in the way CIDA manages its operating risks, and CIDA then took steps to improve its risk management. Project managers were expected to monitor anticipated risks and use the information thus gathered to adjust project activities as necessary. CIDA's Results-Based Management Policy Statement and its Framework of Results and Key Success Factors called risk management integral to results-based management at the level of both country and project. We found that CIDA has improved its guidance and training on the identification and assessment of risk. Managers are trained to include in approval documents critical assumptions about risks to the outputs, outcomes and ultimate impacts of projects. However, there is little practical guidance on how to collect and use information on project risks, except for financial risk, as these evolve over time. CIDA would benefit from developing more specific guidance and tools in this area.

21.30 Our 1993 audit underlined the need to be able to obtain early warnings of project difficulties. In most of the projects we examined, we found limited ongoing monitoring of risks that had been identified in the planning phase, and no deliberate strategy for managing them. In their project reports to CIDA headquarters, Canadian executing agents normally discussed activities, not risks. In a number of projects, problems arose that had been at least partially predicted through risk analysis in the planning phase.

21.31 CIDA guidance and training have so far dealt with external risks - those outside the direct control of project managers, and often related to the level of commitment by the recipient country. The guidance does not address the equally important job of managing risks that are more internal to the project - for instance, those related to critical tasks, key personnel, or project complexity.

21.32 CIDA should broaden its guidance on risk assessment and risk management to include internal risks as well as external risks to a project.

CIDA's response: CIDA will strengthen its identification, assessment and management of internal risks of projects by enhancing the guidelines in the bilateral Roadmap and by reinforcing risk management in its training on results-based management.

Determining whether benefits have endured is essential
21.33 In order for development to occur, the benefits from a project should last beyond the termination of aid funding. The sustainability of benefits needs to be considered throughout the duration of CIDA's involvement, and monitored afterward to determine whether expected benefits have endured. Exhibit 21.5 lists four key indicators of project self-sustainability. We noted that when events put these conditions for sustainability at risk, project managers' reactions were inconsistent. Some took action to deal directly with the risk. Others tended to continue the project with minor adjustments or consider applying additional resources, rather than stand back and analyze more fundamentally whether the project should be substantially changed or discontinued.

21.34 Generally, sustainable programming requires commitment to projects over a long time frame - possibly 10 to 15 years. The commitment must be supported by a strategy addressing the level of programming that is appropriate, the participation of other donors, and the recipient country's capacity to assume the project activities fully once CIDA's funding ends. We found that CIDA's five-year planning horizon does not cover the period of time that can be expected to elapse before the outcomes of these longer-term projects begin to take effect. Consequently, some of the projects cannot be sustainable unless their initial phases, usually five years, are extended by another one or two phases; yet planning documents do not always reflect this consideration.

21.35 CIDA and other donors are leaning increasingly toward "softer" projects, such as the institution-strengthening and capacity-building projects we examined. Because managing and measuring results achieved from these projects present a new set of challenges, and the projects are related indirectly to poverty reduction, CIDA needs to closely monitor its own experience and that of others to ensure that this type of program tool is producing the expected results.

21.36 Applying results-based management to this softer type of project is particularly challenging. For example, expected results of development are hard to define and perhaps even harder to measure in projects that involve institution strengthening and capacity building. In most cases, longer-term outcomes and impacts of development do not begin to show until years after CIDA's direct involvement in funding ends.

21.37 To focus on results, however, CIDA needs to monitor whether the expected shorter-term effects of a project have been achieved and are likely to lead to the desired longer-term development impacts. Even though a project may be finished and CIDA funding ended, it is still important to monitor in a cost-effective way whether the expected results have materialized. We found that CIDA does not systematically continue to monitor projects after their completion to determine whether the results expected in the short and long terms have been achieved. Lack of this information precludes both possible action to increase the prospects of long-term results and an opportunity to learn lessons for future projects. Currently, CIDA obtains this kind of information in a limited way only through periodic performance reviews of its development priorities.

Reporting on Project Performance

The quality of performance reporting is inconsistent
21.38 Although the elements of results-based management are becoming embedded in project delivery, the first key step - defining results that are realistic - has not been easy. In most of the projects we examined, the initial statements of expected results - particularly long-term impacts - were unrealistic, resulting in a subsequent need to modify the project or its goals, and presenting problems for reporting against expected results.

21.39 Results-based management guidelines and policies issued in 1996 refer frequently to expected results, and to comparing them with actual results. In practice, however, the focus has been on the immediate or short-term results of projects. Although statements of expected impacts or longer-term effects are developed along with quantitative or qualitative indicators of them, they are not integrated into the project management system for aid delivery. For example, there is no provision for longer-term monitoring to assess these indicators and report on whether development goals have been achieved.

21.40 The Policy for Performance Review, the Results-Based Management Policy Statement, and the accountability framework provide the structure for CIDA's performance measurement and reporting systems. The Annual Project Progress Report system (APPR), begun in 1996, provides the basic tool for tracking active projects with a value over $100,000.

21.41 Once project managers complete these progress reports, branch management reviews them, and uses them to track the results of the projects. They are also used at the country and branch levels to address the difficulties identified in them. The information in the APPR includes project expenditures, expected and actual results (outputs, outcomes and impacts), lessons learned and progress ratings. The individual reports are compiled and a Bilateral Branch Achievement Report is produced. These reports along with audit and evaluation reports are used as inputs to prepare the Agency's Performance Report to Parliament.

21.42 We reviewed the 1997-98 APPRs on the projects we had selected for our audit to see whether, based on our understanding of the projects, the information provided was accurate and meaningful. We found that most of the reports reflected the status of the projects fairly. However, in some APPRs we noted that the information was stated in terms that were more positive than the situation warranted. Also, actual results could not always be readily compared with the statements of intended immediate and short-term results. The Annual Project Progress Report system is still evolving. More management attention is needed, in our view, to ensure that the information provided in the reports is accurate and meaningful and that it is used in deciding on the future direction of projects.

CIDA's Performance Report to Parliament

More meaningful, accurate and balanced reporting is needed
21.43 Along with other government departments, CIDA submitted its 1996-97 Performance Report to Parliament in the fall of 1997. To determine how its performance reporting system works in practice, we reviewed the way CIDA produced the Geographic Programs section of its Performance Report for the fiscal year ended 31 March 1997. The Annual Project Progress Reports (APPRs) are the main source of the project information presented in the Performance Report; the latter cites many examples of the results produced by CIDA projects. We compared the results of 33 illustrations shown in the section on Geographic Programs with the results shown in the APPRs for the same projects.

21.44 The Performance Report describes a number of projects to illustrate performance. The Agency points out that international development activity does not yield meaningful results in neat financial year intervals. Consequently, most of the illustrations present the results of activities that span a number of years, but these are rarely distinguished from results achieved in a single year's activity. Where the results shown cover several years of activity, the number of years is not shown. No information is given on the resources used to achieve the results shown for specific projects. These resources may include not only CIDA's expenditures but also the human and financial resources provided by other donors and recipient governments. Also, while the illustrations describe achievements, they do not describe them in the context of the results that were expected. Nor do they describe the significance of these specific projects in the context of the overall development program for the country. This information would provide a more meaningful picture of the Geographic Program's performance.

21.45 We also noted instances where the stated results were not consistent with those in the Annual Project Progress Reports (APPRs). These included, for example, showing expected rather than actual results, or results beyond those shown in the progress reports. In our view, more rigour is needed in reviewing and compiling information for the Performance Report to ensure that it is clear and accurate.

21.46 CIDA delivers ODA in many countries under uncertain political and economic conditions and enormous constraints of infrastructure and geography. This makes project success and assessment of development results very complex and challenging. We noted that the branch achievement reports include numbers and percentages of projects that are progressing satisfactorily, those that have manageable problems and those with serious problems. However, the 1996-97 Performance Report excludes this information, although the Agency has reported it in the past. We believe it is important that CIDA bring out this dimension in its reporting; the credibility of its reports would be enhanced by a more accurate balance between positive accomplishments and areas where expected results could not be achieved. In the latter cases, CIDA could describe the types of actions it has taken to deal with problems or the way it will apply lessons learned to future projects.

21.47 CIDA should more rigorously review the quality of the information in its Performance Report, and present a more meaningful and balanced picture of its performance.

CIDA's response: CIDA will continue to improve the consistency, accuracy and reliability of its performance information so that a more balanced and meaningful picture can be reflected in its Performance Reports.

Core indicators to measure development progress are being developed
21.48 In its Performance Report, CIDA acknowledged that it is possible to measure outputs (the immediate, visible, concrete and tangible results of a project) and outcomes (the achievement of the purpose identified for the particular project). However, it questioned the feasibility of using aggregated performance information from the project and institutional levels to comment definitively on the Agency's overall performance or on its performance at the country or program level.

21.49 The Agency points to the difficulty of measuring impacts - that is, broader, higher-level, long-term benefits to the community, country or group. There are several factors that make the task highly problematic. A key one is the difficulty of attributing a particular development impact to a specific CIDA contribution. A second is delay: generally, outcomes and impacts can be expected only several years after projects have been completed. It is difficult to track them after the beneficiaries have taken ownership of a project. CIDA points out that other donors also experience these difficulties. The Agency is sharing with them its approaches and its performance data to find ways of addressing these concerns.

21.50 We agree that it may be difficult to measure outcomes and impacts and to attribute them to a single project or a single donor such as CIDA. Nonetheless, the Agency allocates bilateral program funds on a country or regional basis. The strategic planning process culminates in a multi-year plan at the country or regional level, based on an analysis of the various factors and risks underlying CIDA's interventions. We noted that some of the more recent Regional/Country Development Program Frameworks state the development results that are expected at the country level. CIDA could attempt to periodically assess whether these expected results have been achieved at the regional/country level.

21.51 Reporting on Canada's ODA program is often done in terms of amount spent - usually as a percentage of Gross National Product (GNP), which is then compared with the standard international spending target for ODA, 0.7 percent of GNP. In 1996-97, Canada's spending on ODA was about 0.34 percent of GNP. This indicator is widely used internationally for intercountry comparisons of spending on official development assistance.
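The spending indicator described above is a simple ratio of ODA outlays to GNP. A minimal sketch of the calculation follows; the dollar figures are hypothetical, chosen only because they yield roughly the 0.34 percent reported for 1996-97 (the chapter does not state the actual ODA and GNP amounts).

```python
# Sketch of the ODA-to-GNP spending indicator discussed above.
# The dollar figures are illustrative assumptions, not the actual
# 1996-97 amounts, which are not stated in this chapter.

def oda_share_of_gnp(oda_spending: float, gnp: float) -> float:
    """Return ODA spending as a percentage of Gross National Product."""
    return oda_spending / gnp * 100

# Hypothetical example: $2.9 billion of ODA against an $850 billion GNP.
share = oda_share_of_gnp(2.9e9, 850e9)
print(f"ODA as a share of GNP: {share:.2f}%")

# Shortfall against the standard international target of 0.7 percent.
target = 0.7
print(f"Shortfall against the 0.7% target: {target - share:.2f} percentage points")
```

As the chapter notes, this is an input measure: it tracks the scale of the aid effort, not whether the spending produced development results.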

21.52 While this may be a useful indicator of spending on aid effort or commitment to it, it is an indicator of input rather than output: it does not measure the results or the effectiveness of aid. Developing indicators of results is a logical next step in CIDA's results-based management process. CIDA has been supporting and working with the Development Assistance Committee of the OECD to develop a working set of core indicators for measuring the progress of development efforts. An illustration of these possible indicators is shown in Exhibit 21.6. Developing and reaching international consensus on core indicators of results is a complex undertaking, and the current effort is still very much a work in progress. We encourage CIDA to keep up its efforts with other donors, multilateral institutions and developing countries to refine and reach consensus on a set of indicators.

21.53 Once this is done, CIDA could use the indicators in its reports to describe a country's development results. Because donors are acting more in concert, the results will likely not be attributable directly to any one donor, but CIDA could show how its projects have contributed to the overall results. What is important is that lasting development results be achieved, not that they be attributed directly to the intervention of any particular donor. This type of reporting would reinforce CIDA's move to focus more on development results, and would improve accountability for the effective use of ODA funds.

Counterpart Funds

CIDA has responded to 1993 concerns but better control is needed over payments into counterpart funds
21.54 In June 1994, CIDA published a set of new directives on counterpart funds. The new directives defined the respective accountabilities and responsibilities of CIDA and the recipient country for managing counterpart funds. Counterpart funds are a development instrument whereby Canadian commodities or goods are converted to aid funding. CIDA approves a project amount, which it refers to internally as a line of credit to the recipient country, to finance the purchase of the commodities or goods in accordance with its Canadian content policy. A Canadian supplier sells the product in the recipient country and CIDA pays the supplier in hard currency. The policy requires that normally the same amount in local currency then be paid into the counterpart fund by the local buyer as described in the agreement with the recipient country. The counterpart fund legally belongs to the recipient country; it is usually controlled by a joint board and used to fund development activities. In 1993, we were concerned about CIDA's lack of assurance that all moneys in counterpart funds had been spent for the intended purposes and had been accounted for fully.

21.55 In 1997, CIDA undertook an audit of counterpart funds. An interim audit report in March 1998, based on the audit of three funds, noted weaknesses in the flow of money into all three that had resulted in the appropriate amounts not being credited to the funds. In one case, the internal audit could not provide assurance that the recipient government had deposited into the fund the full amount that was required, and that funds had been used for the purposes intended. The audit also concluded that CIDA should re-examine the issue of the ownership of the funds as well as reporting on results achieved from their use. As a result, the counterpart fund policy is being revised to address these concerns.

21.56 We examined three counterpart funds, in Bolivia, Bangladesh and Senegal. Given that the structure of each counterpart fund is unique to the country in which it operates, we cannot form an opinion on the management of counterpart funds in general. We found that for the funds we examined, CIDA had adequate assurance that all moneys had been spent for development purposes and had been accounted for fully. However, in two of the funds we found problems in control over the flow of moneys that resulted in less than the full amounts being placed in the funds. In one case, CIDA allowed payment into the fund of less than it had paid to the Canadian supplier (see Exhibit 21.7). In the other fund, the memorandum of understanding with the recipient country allowed it to withhold up to 7.5 percent of the amount to be deposited, for incidental charges. We noted that there had been no study, as called for by CIDA policy, to establish the amount of the net proceeds to be placed in the fund; the recipient country consistently withheld the full 7.5 percent. In the absence of such a study, we would have expected an accounting of how these charges had been spent; there was none. Over the last three years, these charges averaged $930,000 per year. Given the potential for problems of this nature, CIDA will need to ensure that it applies its policy on payments to counterpart funds more consistently.
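The withholding arithmetic above can be sketched in a short calculation. This is purely illustrative: the 7.5 percent ceiling and the $930,000 average are from the audit, but the report does not disclose the underlying annual payment amounts, so the yearly figures below are hypothetical, chosen so the average withholding matches the reported figure.

```python
# Illustrative sketch only: how a fixed percentage withheld from counterpart
# fund deposits accumulates into an annual shortfall. The 7.5% ceiling comes
# from the memorandum of understanding described in the audit; the yearly
# payment amounts are hypothetical.

WITHHOLDING_RATE = 0.075  # maximum incidental charges under the memorandum

def net_deposit(amount_paid_to_supplier: float) -> float:
    """Local-currency equivalent actually deposited when the full 7.5% is withheld."""
    return amount_paid_to_supplier * (1 - WITHHOLDING_RATE)

# Hypothetical line-of-credit disbursements over three years
yearly_payments = [12_400_000, 12_400_000, 12_400_000]

withheld = [p - net_deposit(p) for p in yearly_payments]
average_withheld = sum(withheld) / len(withheld)
print(f"Average annual amount withheld: ${average_withheld:,.0f}")  # → $930,000
```

A study of net proceeds, as CIDA policy requires, would replace the flat 7.5 percent assumption with the actual incidental charges incurred, which is precisely the accounting the audit found missing.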

21.57 Where a transaction results in a payment into a counterpart fund of less than the full amount paid by CIDA, CIDA project managers should document the rationale for the transaction.

CIDA's response: CIDA will strengthen its procedures to include the need to document adequately the rationale for this kind of transaction in the management of counterpart fund projects.

Institutionalizing Lessons Learned

CIDA has made considerable progress
21.58 In 1993 we were concerned that CIDA did not have an organizational "learning culture". We urged the Agency to institutionalize lessons learned at the project and country levels. In response to our comments, CIDA agreed to initiate a simple and clearly understood mechanism that would allow lessons learned to be captured and accessed.

21.59 CIDA undertook formal and informal steps to strengthen its learning culture. Lessons learned are an integral part of the Policy for Performance Review and the Results-Based Management Policy Statement. They are one of the key elements of the Annual Project Progress Reports and the end-of-project reports, and are incorporated into branch-level achievement reports. Lessons learned are being synthesized as part of each corporate review carried out by the Performance Review Branch. The Agency has also set up various vehicles to gather and disseminate lessons learned, such as the President's Forum on best practices and informal networks, including the results-based management practitioners' network and others that discuss areas such as Women in Development and Environment.

21.60 CIDA has made considerable progress in incorporating the concept of lessons learned into the results-based management approach. The system for capturing lessons learned relies mainly on the Annual Project Progress Report (APPR), but the quality of information on lessons learned varies considerably among APPRs. CIDA needs to do more to consolidate, analyze and disseminate this information to make it useful to staff. Otherwise, the risk is that project staff will not see the exercise as useful, and its value to the organization and to performance reporting will be lost.

Audit and Evaluation

Further effort is needed on performance review
21.61 In July 1994, CIDA introduced its Policy for Performance Review. This policy established a framework for the audit and evaluation functions and has created a closer connection between the Performance Review Division and CIDA's operational branches. The Policy also called for a Performance Review Committee with membership drawn from inside and outside CIDA, in order to demonstrate an openness to renewing performance review and to benefit from the experience of other departments. However, this committee met only twice, in 1995; we were informed that the Executive Committee now performs this role. In 1997, the Performance Review Division obtained Branch status and began to report directly to the President of CIDA, and its Director-General became a member of the Executive Committee.

21.62 In 1995, CIDA reported to Parliament that it planned to conduct performance reviews of the six ODA programming priorities over the next three years. However, significant slippage has occurred. To date, the Performance Review Branch has undertaken three reviews covering two of the priorities. The Branch indicated that these reviews had proved more complex and costly than planned. The reviews also incorporated a more comprehensive approach, designed to identify implications for policy development in the ODA priority areas as well as lessons learned from individual projects. In addition, the Branch was assigned the lead role in the development and implementation of results-based management across the Agency. The remaining reviews of the ODA priorities have now been rescheduled over the next few years.

21.63 We noted that the Policy for Performance Review did not address concerns we expressed in 1993 about project evaluations (now called management-led operational reviews). We felt then that there was a need to guard against a perceived lack of objectivity, mainly because project evaluations were commissioned by the Geographic Program staff. There was no ongoing quality assurance function in CIDA that examined whether minimum evaluation standards had been met.

21.64 The Branch has since undertaken several initiatives to help ensure more consistent quality of operational reviews. It has encouraged use of the Framework of Results and Key Success Factors as the basis for these reviews, and has established a standing offer list that Geographic Program staff can use to contract for the services of performance reviewers. In addition, on request the Branch provides guidance on the conduct of operational reviews.

21.65 These initiatives represent progress, but they still do not address our concerns. We believe that the Agency needs to monitor at the corporate level whether expected quality standards are being met.

21.66 The Performance Review Branch should selectively review the quality of operational reviews conducted by the geographic branches as part of its regular audits and evaluations.

CIDA's response: CIDA will, on a selective basis, review the quality of the operational reviews conducted by the geographic branches as one of the lines of inquiry in its audits and evaluations.

Conclusion

21.67 The Agency is now well into the renewal program it began in 1994. The framework for results-based management, along with supporting policies and guidelines, has been developed and communicated to CIDA's staff and development partners. We concluded that CIDA's actions have addressed the main concerns we raised in our 1993 Report. However, while a more determined focus on results is evident at the levels of country program and project delivery, progress in implementing results-based management has been uneven.

21.68 The Agency needs to maintain its momentum toward implementing management for results. In focussing on results, it needs to take a harder look at whether country programs or projects should be continued, scaled back or rethought when expected development results are at risk because of changes in critical underlying assumptions, absence of essential conditions for sustainability, or other factors that could have an adverse impact. The Agency also needs to monitor projects in a cost-effective way after their completion to determine whether expected results have been achieved.

21.69 The Agency has developed a performance reporting system that starts at the project level and ends with its Performance Report to Parliament. Both internal and external reporting can be improved by including information on expected as well as actual results at the project and country levels, and by presenting a more balanced picture of accomplishments, difficulties and challenges in delivering development assistance.

21.70 All donors recognize that it is difficult to measure development results at the country or program level. However, the Development Assistance Committee of the OECD is developing a working set of core indicators to measure development progress at a level higher than that of the individual projects. CIDA has contributed to the development of these indicators, which are still being refined and tested. As CIDA gains more experience in results-based management, indicators such as these or others that may emerge could provide it with a means of reporting program results to Parliament.


About the Audit

In 1994, CIDA and the Office of the Auditor General agreed that we would conduct a phased follow-up of our 1993 audit of CIDA's bilateral programs for official development assistance. The work would focus on actions taken by CIDA to address the main concerns raised in our 1993 Report. This audit represents the third and final phase of the "phased follow-up" approach.

Objectives

Our objectives were:

  • To assess the extent to which CIDA's actions in implementing a results-based management approach have satisfactorily resolved the main concerns raised in our 1993 Report.
  • To review the quality of performance reporting on development results.

Scope and Approach

Our audit continued to focus on CIDA's accountability to Parliament for managing for results. We reviewed the way CIDA plans and manages its bilateral programs and projects, and measures and reports their results.

We used a combined country- and project-based approach to conduct this audit. CIDA identified four country programs (Bolivia, Estonia, Senegal and Vietnam) that it believes best demonstrate the improvements gained by implementing results-based management in the bilateral development aid program. We added Bangladesh to provide a basis for comparison with our 1993 audit. In each of the identified countries, we selected a sample of projects for audit.

We conducted field work at Canadian missions and project sites in the selected countries. This included reviewing on-site documentation and consulting stakeholders such as Canadian and local executing agents, host country officials and representatives of other international donor organizations.

At CIDA's headquarters, we examined the process and the underlying reasons for selecting country programs and projects and the process for measuring and reporting their results. We also assessed the extent to which the key concerns raised in our 1993 Report had been satisfactorily addressed.

Our work did not extend to comparing CIDA's performance with that of other organizations providing bilateral aid.

Criteria

The criteria used at the country program level and at the project level were based on criteria proposed in our 1996 Report:

Country level

Results Achievement

  • Well-defined objectives that can be reasonably achieved are actively pursued.

Risk Management

  • Risks associated with the program and related activities are assessed and managed.

Performance Measurement

  • Relevant information on performance is obtained and used.
  • The extent to which programs and activities are meeting the Agency's performance expectations is evaluated and understood.

Project level

Results Achievement

  • Well-defined objectives that can be reasonably achieved are actively pursued.
  • Projects are followed up to determine whether project partners are achieving (or are likely to achieve) planned results.

Risk Management

  • Risks associated with the projects and related activities are assessed and managed.

Sustainability

  • There is follow-up to assess whether project results are likely to provide sustainable benefits after CIDA's direct funding comes to an end.

Performance Management

  • Relevant information on performance is obtained and used.
  • The extent to which projects and activities are meeting performance expectations is evaluated and understood.
  • Timely action is taken to improve project performance.
  • Periodic reports on the potential for results, and on corrective action taken where appropriate, are presented to CIDA's senior management.
  • CIDA provides assurance that money has not been expended for purposes other than those for which it was appropriated.

Institutional level

Learning Organization

  • CIDA has created an action-oriented organizational environment that promotes learning by doing, an innovative style of management and an openness to differing viewpoints.

Audit Team

Assistant Auditor General: David Rattray
Principal: John Hitchinson
Director: Manfred Kuhnapfel

Johane Garneau
Tina Guthrie
Jacques Marquis
Paul Morse
Barry Neilson
Jaak Vanker

For information, please contact John Hitchinson.