Systems Under Development Best Practices Symposium

Now for Something Completely Different

Gordon Brackstone

Gordon J. Brackstone is Assistant Chief Statistician, Informatics and Methodology, at Statistics Canada. He is responsible for the development and application of statistical methodology at Statistics Canada, as well as for the Agency's computing services. Former positions include Director-General of the Methodology Branch, Statistics Canada; Chief, Surveys and Data Acquisitions, in the Central Statistics Bureau of British Columbia; and various positions at Statistics Canada.

A graduate in statistics from the London School of Economics, he has been closely involved with Census methodology and with the measurement of the quality of data from censuses and surveys.


I should probably start by explaining how I come to be here talking about the works of Monty Python. Those of you expecting dead parrot sketches or marriage counselling sessions, I'm afraid, are going to be disappointed.

What happened was that, after the representatives of the Auditor General's Office met with us a couple of times in the early phases of their study, they concluded that some of the practices we were using at Statistics Canada were different or uncommon enough that they deserved further study and exposure. Hence, their invitation to me to give a presentation in this symposium about the way we do business in Statistics Canada.

As for the title, I will leave you to judge whether the approaches we are using are significantly different from practices elsewhere, and if they are different, whether they are portable to other organizations in other lines of business.

We have already heard today that there are no magic pills for all situations and I'm certainly not pretending that we have found any. But I can assert that the practices we use work for us in our business. In my opinion, they have been among the factors that have helped Statistics Canada gain a reputation of being among the best statistical agencies in the world.

I must say that, having attended some of the Technology in Government sessions last week, as well as our sessions this morning, I have heard much that resonates with what I had intended to say. So, maybe this won't be so completely different after all. Perhaps it will be an example of one experience in government with some of these approaches.

I started by asking myself: what are the features that correlate with success in systems work, in our experience? Why do some projects work out well and some not so well? It is tempting to ascribe success to specific characteristics of individual projects—to a particularly effective project manager, or a particularly favourable match of technology and needs, or a particularly flexible client, and so on. These are important features, but they are not guarantees of success, especially if the environment in which projects take place is not conducive to success.

It's this environment of which I want to speak mainly today. I want to describe the approach that we have taken at Statistics Canada to create frameworks and a culture in which, we hope, sensible business decisions can be made and implemented.

While I will be emphasizing the informatics dimension of this environment, it must, of course, embrace all projects, whatever their information technology content. If there is one general message that I would like to leave from this talk, it is one that we have heard already today. It is that the information technology dimension of projects must not become a separate end in itself, that one key to success in systems under development is to keep them focused within their larger business context. It's when they become separated from that context and lose this focus that they can take on a life of their own and, in the worst case, become pure information technology projects.

I identify three frameworks that, we feel, are needed to create the right environment within which projects can flourish. First, a planning framework decides which projects get started and with what expectations. Choosing the right projects to start is clearly crucial. To be facetious, the easiest way to achieve 100 per cent success is not to start those projects that are going to fail. This is an important part of the process.

Second, when we have decided to start a project, the project management framework should then provide an environment that helps to ensure that things happen as planned.

Third—and here I'm focusing on the information technology dimension—an information technology framework defines the corporate strategies and the constraints on IT choices within projects, to ensure that the resulting systems fit efficiently within the corporation's overall computing environment.

I'm going to talk mostly about the first of these frameworks, a little about the second, and hardly at all about the third. This allocation of time reflects the degree of portability of our procedures to other institutions, the information technology framework being the most specific to Statistics Canada.

I think you need to know a few features or characteristics of Statistics Canada and its business to appreciate what I'm going to describe. Here are some features that may distinguish Statistics Canada's business from many other organizations'. First, we are in the information business. Our product is information, by which I mean both data and interpretation. These days, it seems to me, the word "information" has been commandeered by the computing world, as in the phrase "information technology", but with or without technology, Statistics Canada is in the information business. For us, data are an end and a product in themselves, not just something to be processed in support of other goals.

Our programs are characterized by their number and variety. While we have a few large ones, or at least large by our standards, such as the population Census, most are small. We operate more than 400 surveys of various sizes and frequencies and, in addition to survey programs, we have analytic programs, dissemination and marketing programs, and the usual administrative programs, each with its own requirements. The point is that we are not running one monolithic program, and yet coherence between the products and outputs of these varied programs is an important concern.

Confidentiality is a particular concern for us because, on the one hand, we absolutely have to safeguard the confidential micro records that underlie statistical aggregates and prevent unauthorized access to them. On the other hand, we want increasingly to offer our clients electronic access to the aggregated data that we publish. Reconciling these two requirements puts a significant constraint on our computing systems.

It goes without saying that Statistics Canada has to maintain a very strong client orientation, to remain relevant and to ensure that its prime users, particularly within the federal government, continue to support the Agency's activities and funding, and also so that we can maximize the revenues we obtain from the sale of our products and services.

Over the past decade, we have put in place a planning system that has served us well in selecting new initiatives to launch, in implementing efficiencies and in handling budget reductions. I'm not going to try to describe it in full detail, but only to extract some of the features that, I think, are pertinent to today's agenda. They reflect three particular elements of Statistics Canada's management approach.

Our planning system requires documented proposals and produces decision records that are available to everyone. The process is visible. All can see what is being proposed and what is being decided.

It also supports the notion that program managers should have the authority to decide what is best for their programs within general corporate frameworks and within budgetary constraints.

On budgets, it pursues the notion of full costing of all resources used so as to give program managers freedom to choose the combinations of resources that are best for their programs within their overall budget envelopes. We use the phrase "a dollar is a dollar" to describe this. It's really consistent with the notion of operating budgets, which has recently been introduced.

It's this last feature that is crucial in the context of computing resources. To implement it for computing, we have introduced and evolved an internal EDP (information technology) cost recovery scheme within the Agency. Those parts of our informatics organization that are responsible for providing computing services—our mainframe services, mid-range, micros, communications—are entirely revenue-dependent. Their former budgets are in the hands of users. The prices for their services are developed to reflect the real cost to the Agency of providing those services. All the operational costs, salary and non-salary, are recovered.

For those elements of capital equipment that are shared—and those are primarily the mainframe and its ancillary equipment, and the communications infrastructure—capital replacement costs are built into the rates. Our informatics branch is then responsible for funding the necessary capital replacement of these items from accumulated revenues, with the corporation acting as a banker across fiscal year boundaries, if necessary.
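To make the arithmetic concrete, here is a minimal sketch of how a full-cost recovery rate of this kind might be computed. The figures, the straight-line amortization and the function itself are illustrative assumptions, not the Agency's actual rate formula.

    # Hypothetical full-cost recovery rate for a shared computing service.
    # All figures are illustrative; in practice the rates are recommended
    # annually by the informatics committee.

    def full_cost_rate(salary_costs, non_salary_costs,
                       capital_replacement_cost, replacement_cycle_years,
                       forecast_usage_units):
        """Price per usage unit that recovers all operational costs plus
        an amortized capital replacement charge, so that accumulated
        revenues can fund the eventual replacement purchase."""
        annual_capital_charge = capital_replacement_cost / replacement_cycle_years
        total_annual_cost = salary_costs + non_salary_costs + annual_capital_charge
        return total_annual_cost / forecast_usage_units

    # Example: a mainframe service on a five-year replacement cycle.
    rate = full_cost_rate(
        salary_costs=2_000_000,              # operational salaries
        non_salary_costs=1_500_000,          # maintenance, power, supplies
        capital_replacement_cost=5_000_000,  # cost of the replacement machine
        replacement_cycle_years=5,
        forecast_usage_units=400_000,        # e.g. CPU hours billed to programs
    )
    print(f"Charge-out rate: ${rate:.2f} per unit")  # $11.25 per unit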

Funding of the initial acquisition and replacement of capital equipment that is not shared is the responsibility of program managers. The rates I referred to are recommended each year by our informatics committee, which is one of our management committees, and it has on it representatives from user areas, as well as from the informatics area. (I should say that the competition amongst our directors to serve on this committee is fierce.) In this way, the full costs of computing services are in the hands of program managers. Decisions about what combination of computing is most efficient for their programs can be made by them, with the advice of the informatics branch usually being sought. If they want to put more or less of their total budget into computing, they can. And equally, of course, they can change their combination of non-EDP resources as they see fit.

For their part, the service providers in the informatics branch have to monitor their costs and revenues very closely. They have to adjust to changes in usage patterns among the various services and they have to be on the look-out for new service opportunities and new service needs, as well as for services that are no longer justified.

The funding of the major infrastructure investments, which I mentioned, has to be planned on a multi-year basis and built into the rates. As a result of this, we do not seek or expect external or Treasury Board funding for computing infrastructure. We expect our computing costs to be included in the budgets of all new programs and we use those moneys, in part, to augment our computing infrastructure.

Let me go back to our planning process, to try to explain how this relates to the development and consideration of proposals and to the initiation of projects. We have an annual cycle, which we usually consider to start in the fall, with the issuing of strategic objectives for the planning period. (We use that word, "strategic".) These may identify specific subject areas in which we seek improvements, or cross-cutting issues to which we wish to see attention given. For example, the issue of marketing will certainly be part of this year's strategic guidelines. These guidelines encourage proposals in certain areas, but they don't stop other good ideas from being put forward.

Guided by these directions, proposals are developed. These proposals are then reviewed at various management levels, leading to decisions by the corporate planning committee. This review process includes what we call a senior management review, which is a review by the whole senior management group in Statistics Canada. It's usually a two or three-day meeting where all the proposals are put on the table and discussed, and everybody has a chance to give an opinion on proposals from every area.

The decisions are made, in the end, by the corporate planning committee, which is essentially our executive or policy committee. They are usually made by the end of February, for the coming fiscal year; then they are reflected in reference levels and operational plans for the coming years.

After the end of the fiscal year, each program produces a report on, among other things, the progress of past approved proposals and issues identified for the coming planning cycle. This feeds into the following year's planning round, along with various results of external and client liaisons.

Coming back to the proposals, there are four main types. Two contribute to our available resource pool, while the other two diminish it. Two we welcome; the other two we have to live with and deal with.

"Efficiency" is, first of all, a proposal that will save the Agency money, usually after some initial investment and without negative impacts on program output. Payback is normally expected in a maximum of three years and these proposals, of course, often involve information technology investments. If the proposals are accepted—and they usually are if they hold up to scrutiny—the requisite funds are put into the appropriate budgets and the anticipated savings are removed from future budgets. So, new reference levels, in a sense, are defined for a program, which then proceeds with implementation. Those new reference levels then become the baseline for any further consideration.
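The three-year payback test lends itself to a simple screening calculation. The sketch below uses an undiscounted payback rule and invented figures; both are assumptions for illustration, not the Agency's formal appraisal method.

    # Illustrative screening of an efficiency proposal against the
    # three-year payback ceiling. Undiscounted payback is assumed
    # for simplicity.

    def payback_years(initial_investment, annual_savings):
        """Years of savings needed to recover the up-front investment."""
        return initial_investment / annual_savings

    def passes_payback_test(initial_investment, annual_savings, ceiling_years=3):
        return payback_years(initial_investment, annual_savings) <= ceiling_years

    # Example: invest $300,000 in new software to save $120,000 a year.
    investment, savings = 300_000, 120_000
    print(f"Payback: {payback_years(investment, savings):.1f} years")  # 2.5 years
    print("Accepted" if passes_payback_test(investment, savings) else "Rejected")
    # If accepted, the investment goes into the budget and the anticipated
    # savings are removed from future reference levels.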

There is in our system a provision for what we call adjustment proposals, which are not one of the four main types and exist primarily to deal with delays in achieving efficiencies. I think this relates to an issue that has been mentioned already: risk taking. Admitting that adjustments may be necessary, but without making them too easy, is designed to encourage the right level of risk taking in proposing efficiencies. We want people to propose efficiencies. If they do, we adjust the budgets so they have to achieve them, but we want to recognize that, on occasion, there will be circumstances that lead to a need to adjust what was originally forecast.

"Contingencies" are possible program or service cuts. They are contingency cuts, in effect. We always ask for a fixed percentage to be put on the table from each of our major program areas. These days, unfortunately, we're having to accept more and more of these proposals for program reductions.

"On-going program" proposals are resource demands just to keep a program at its current level: workload increases, for example. But here, proposals are put forward only when programs cannot cope with such pressures from within their existing budget envelopes.

"New initiatives" include all proposals to start a new program or to enhance an existing one. Rarely can we afford to start a large new program from within our existing funds, so often, new initiatives are proposals for seed money for preliminary work to demonstrate to a possible outside funder the potential of a new program initiative, or to do the groundwork that will pave the way toward a subsequent efficiency proposal.

Most proposals of all types include a computing component as one element of their cost. The informatics branch assists and advises program managers on these components, and helps them prepare the proposals and the cost estimates for the informatics portions.

Since the EDP or informatics infrastructure investments are to be paid out of our internal EDP revenues, proposals to make such investments are not treated in quite the same way. Instead, these investment plans are put forward as part of the price setting process, which takes place each October, and are considered and blessed, or otherwise, by the corporate planning committee at that stage.

When all these proposals have been finalized for corporate planning committee consideration, which means they have run the gauntlet of review at various levels of management to reach the corporate planning committee, the informatics branch assesses the potential collective impact of their approval on the Agency's informatics infrastructure: specifically, whether they would seriously change the financial outlook of the EDP cost recovery regime, whether they would require any further investments beyond what was predicted during the rate setting exercise, or whether they would significantly distort any directions laid out in our information technology framework. This assessment is one of the inputs to the final decision.
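In outline, that collective assessment is an aggregation exercise: sum the informatics demands of all the finalized proposals and compare them with what the rate-setting exercise assumed. The toy sketch below shows the idea; the proposal names, figures and threshold are hypothetical.

    # Toy aggregation of the informatics components of finalized proposals,
    # checked against the investment predicted during rate setting.
    proposals = [
        {"name": "Survey redesign",      "edp_cost": 400_000},
        {"name": "New marketing system", "edp_cost": 250_000},
        {"name": "Census preparation",   "edp_cost": 900_000},
    ]

    predicted_capacity = 1_400_000  # assumed during the rate-setting exercise

    collective_demand = sum(p["edp_cost"] for p in proposals)
    shortfall = collective_demand - predicted_capacity

    print(f"Collective EDP demand: ${collective_demand:,}")
    if shortfall > 0:
        # Flag for the corporate planning committee's final decision.
        print(f"Further investment of ${shortfall:,} beyond the forecast needed")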

As far as our planning framework is concerned, I'll stop there. I really only sketched out the main aspects of it. The conclusion about the planning process, which I want to emphasize, is that projects get started because of their contribution to the priority goals of the Agency, or because of savings they would generate for the Agency. I think this ties in well with earlier comments about aligning the information technology with business plans.

Business plans determine the projects we start and, therefore, drive the information technology components of those projects. For those projects that are started, there is a common understanding of their objectives, of their relationship to corporate goals and of their expectations, both substantive and financial—at least there should be, if our planning processes work properly, because those are exactly the kinds of questions that are asked about proposals as they are being reviewed. Clear identification of these objectives and expectations is, as we have already heard, an important condition for success.

The informatics components of projects may be crucial, and significant in size, but they are not the raison d'être for the projects. However, the collective impact of all these informatics components on the Agency's informatics infrastructure has to be considered.

Let me turn now to the second framework, the project management framework, which tries to ensure that projects unfold as they should, once they are approved. Here I'm getting closer to what people have been talking about under the subject of best practices for systems under development.

We operate under a matrix management approach for developmental projects. A project team is usually formed with representation from the relevant line areas: the substantive subject matter area for which the data are being collected, which is the client or user; methodology, which in our terminology refers to statistical methodology; informatics; and operations. Others may be drawn in, depending on the nature of the project.

The project manager is often drawn from the sponsoring or subject matter area but should be someone chosen for project management skills as much as for line area knowledge. Richard described some of the characteristics of project managers that we, too, would like to see in more of ours.

A steering committee with management representation from the involved line areas provides guidance, advice, and arbitration if necessary. The prime responsibilities for implementation and monitoring lie with the project team and the steering committee. The steering committee also represents the reporting line back to the corporate planning committee, on progress and problems.

For major developmental projects, reporting requirements may be specified by the corporate planning committee when approval is given. In general, the larger the project, the more frequently the corporate planning committee wants to hear reports about its progress.

Those are the first two frameworks. The third framework is the information technology framework. I said I wouldn't say too much about this and I won't.

I already indicated that it attempts to define the technological environment within which projects have to operate. In our information technology framework, which is available in a document, we cover three main areas. We cover computing architecture, communications architecture and information architecture. In each of these areas we describe how we see our configurations evolving, the standards we follow, and the products and services available. If anyone is interested in knowing more about our particular framework, we can certainly make copies of it available.

Let me end with some success factors that we have gleaned from our successes and failures at Statistics Canada. The first deals with infrastructure: items that, we feel, the organization must invest in because, if they are not there, individual projects will each have to invest in them, and that would be inefficient, confusing and inconsistent.

I have already described the importance of the three frameworks, but there are two more points to mention. Generalized Systems is the name we have given to an initiative to create generic, reusable software modules for the common survey functions that we need in many of our surveys. The notion, of course, is one that we have heard already: to avoid creating, recreating and maintaining many software products with only minor variations in their functionality.

The notion clearly extends outside the survey context and is an important method of reducing development costs and time, even if the resulting system may not be quite optimal for each application. In a total dollar regime, the trade-off between generalized and customized systems can be made quite explicit, but, of course, it must consider all components of cost: the capital costs, the system development costs, the operational running costs and the maintenance costs. In this area, I think, as in other organizations, we have a challenge: to curb the enthusiasm of the systems developers who want to build everything the client says is needed and more, rather than take off the shelf a system module that would suffice but perhaps not be quite optimal.
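Under a "dollar is a dollar" regime, that trade-off can be written down explicitly. The sketch below compares total multi-year costs for reusing a generalized module against writing a customized one; the four cost components come from the text above, but the figures and the simple additive model are illustrative assumptions.

    # Illustrative total-cost comparison: generalized module versus
    # customized system, over a fixed planning horizon.

    def total_cost(capital, development, annual_operations,
                   annual_maintenance, horizon_years):
        """Sum all four cost components over the planning horizon."""
        return (capital + development
                + horizon_years * (annual_operations + annual_maintenance))

    HORIZON = 5  # years

    generalized = total_cost(capital=0, development=50_000,  # adaptation only
                             annual_operations=80_000,       # not quite optimal
                             annual_maintenance=10_000,      # maintenance shared
                             horizon_years=HORIZON)

    custom = total_cost(capital=20_000, development=250_000,  # from scratch
                        annual_operations=70_000,             # tuned to the job
                        annual_maintenance=40_000,            # borne alone
                        horizon_years=HORIZON)

    print(f"Generalized: ${generalized:,}")  # $500,000
    print(f"Custom:      ${custom:,}")       # $820,000
    print("Reuse wins" if generalized < custom else "Custom wins")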

This leads to the third point: training. We have already heard about the importance of people in the evolution of information technology. There are several parts to training. We need to ensure, of course, that both our developers and our users are aware of our information technology framework, aware of our strategies and approaches, aware of the generalized products that are available. By narrowing the range of software in use, we reduce training needs for both developers and operators.

Among our developers, we think, we need more system architects and system assemblers and integrators, and fewer code writers. We want to encourage assembly of systems from existing components, as opposed to the writing of new components. Of course, we have to find ways of keeping all staff technologically up to date. Those are some infrastructure things.

The next point relates to individual projects. I have called it "lessons learned", and some of these, I think, we have heard already today. Keep things small: small is both beautiful and manageable. Give developers pieces that they can comprehend, whose fit and interfaces they can see. That increases the chance of success, isolates the impact of failure, and gives people something they can work on with a result in a finite period of time: within a year, not within five years.

Separate and don't confuse project management and system development methodology. Project management must control and monitor all aspects of the projects and ensure good communication between participants. The introduction of a particular systems development methodology should not compromise the project management role.

We say "employ an architect", not in the sense of developing something "airy-fairy", but in the sense of looking at the whole project at the outset. Don't let the carpenter design the house by building a perfect shed and then adding extensions. Make use of outside expertise, but manage the interface. Don't leave management decisions to technical consultants or vendors.

Internal marketing is necessary outside the project team. The eventual users and other stakeholders who are not part of the project team have to be brought along throughout the development phase. Some very good systems have suffered bad reputations, through lack of attention to internal marketing of the product.

Let me summarize. The first point we have heard several times: business decisions, business planning, business objectives are paramount. They must be understood and must drive project initiation. They must drive not only project initiation, but also informatics planning and systems development—not the other way around. We feel that one should take enough time to identify the right projects through a planning framework and then do them properly, through a project management framework.

I very much subscribe to what the previous speaker said with regard to finding an incentive to get things finished on time. We find that, in projects such as our Census, where the date is fixed, everybody knows that things have to be ready on time. Where there isn't that same incentive, things tend to lag if you are not careful. So, putting in place checkpoints, benchmarks, and schedules that have to be met is very important.

Finally, you need the right balance between local and global optimization in the organization. How stringent should an information technology framework be in defining standards? How forceful should the rules be on using generalized or existing system modules, as opposed to rewriting them? The answers to those questions may vary according to the organization, but the answers need to be there and to be widely known.

Ensure that the management decisions are made by managers and not by technicians. Management accountability is not a technical issue.

Keep both the planning and the development processes transparent, visible. Communicate within and outside the project, and market the product.

So, in addition to describing, or at least outlining, some of our processes, I must say we feel that information technology planning is not a separate activity. It is an important dimension of business planning and has to be integrated with it. Real information technology costs have to be reflected in program budgets, and projects—information technology projects and others—must be launched in the context of business plans. Their implementation must keep the business objectives always in the forefront. We must not let information technology become a separate planning or activity stream.

Well, that concludes what I wanted to say. I hope you found something different, or at least of interest, and if it wasn't something completely different, perhaps, in the sense of Monty Python, we can think of it as part of the search for the Holy Grail.

Q: In the context of program managers having their EDP dollars and choosing to invest or not invest those dollars, I'm wondering if that does imply, in your case, that those program managers are free to take those dollars and invest them outside the organization.

Gordon Brackstone

In principle, yes. There also are constraints, and one of the important constraints in our case, of course, is the confidentiality constraint that I mentioned. If they are involved in a survey that is collecting confidential data, it has to be in an environment in which that confidentiality is protected. But in principle, they are free to buy computing services from outside.