
ELECTRONIC INFORMATION PARTNERSHIPS

PUBLISHED QUARTERLY BY A GROUP OF PROFESSIONALS WORKING FOR INFORMED DECISIONS IN THE ELECTRONIC AGE

Volume 7, Number 2, October-December 1998

Editor and Publisher: Peter Brandon Sysnovators Ltd. 17 Taunton Place Gloucester, Ontario K1J 7J7 Tel 613-746-5150 Fax 613-746-9757 Internet: [email protected]

Subscription: $149/annum

Contributors to this issue:

Richard Manicom, John Riddle, Don Gray, Rainer Mantz, Chris Hughes, Ian Wilson, Thomas B. Riley, Karen Mackey, Martin Podehl, Roy Davies, Frank White

Happy Holidays and a Prosperous New Year!

=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-

FROM THE EDITOR

A Self-effacing Introduction

I am constantly reminded that these are attention-deficient times. More kids seem to have some attention deficit affliction, dismal scientists are increasingly preoccupied with the “economics of attention,” and they no longer make Ginkoba the way the characters in the TV commercial would have us believe. We are hopelessly attention-challenged. I don’t know how you plan to deal with this reality, but I am doing something about it right in this issue of Partnerships: I will maintain silence, while letting others speak!

That being said, I salute the many contributors listed above and thank them for their thoughtful contributions. To them and to you, the readers, welcome to this issue, joyous holidays and a very happy and prosperous New Year!

=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-

Squaring the "Virtuous Circle"

By Richard N. Manicom

At the Technology in Government Week conference in 1997, Mr. Ian Glen, then Deputy Minister of Environment Canada and Chair of TIMS[1], spoke about the relationship between the Information Technology (IT), Program Management and Policy functions within government. Mr. Glen was aware of the cross-departmental projects that had been completed under the sponsorship of the TIMS "Enterprises" initiatives, in which technology was used to integrate program delivery across departments in a client-focused way. He noted that the IT executives who were involved in the work were imaginative and open-minded towards cross-departmental program integration. His remarks left the impression that he was challenging the program and policy communities to follow suit. His message was that all three groups needed to be agents of change, be open to new ideas, and think in a "client-focused" way.

Mr. Glen and I had spoken about this issue in the less public arena of his boardroom, on a previous occasion. To make his point, he drew a circular model, a "virtuous circle" (see Figure 1), of the interaction between the three areas of expertise: Policy, Program, and IM/IT. Also present were Grant Westcott, at the time CIO of Justice Canada, and Nancy Dorigo of the Treasury Board Secretariat, and we mused, with the drawing on the table between us, about this interesting relationship. We wondered whether all the arrows between the three elements were in fact connected, and we discussed whether there were information flows in both directions. We also discussed the relative esteem in which the practitioners of these three areas of endeavour are held by the government community as a whole.

[Figure 1: The "virtuous circle" - Policy, Program, IM/IT]

Peter Brandon, indefatigable student of IT management issues in the public sector, draws our attention to this subject again. He asks me, from my perspective as an IT executive in a large government department, what the relationship is between the IT, program management, and policy areas. Is IT the "poor sister" in the piece, forever excluded from important deliberations but left to implement the decisions made? Does the rest of the executive community take seriously issues brought forward by the IT group, or do the "propellerheads", at the bottom of the food chain, simply do what they are told? Corporate plankton, if you will?

What Determines the Role of IT in an Enterprise?

Clearly, the role of IT will depend on the nature of the business. In some enterprises, where the IT infrastructure is not viewed as a corporate asset, there will be no business models to guide infrastructure investment; whatever IT activity does happen will be local to individual business units. Expanding on this notion, if the corporate view of IT infrastructure is that of a utility, it will be justified on a cost-management basis. If the infrastructure is considered an asset on which the corporation depends, the justification will seek to balance cost and flexibility. And finally, at the most advanced level, infrastructure will be considered a strategic enabler and justified on a "cost of flexibility" basis.[2]

The role that IT plays is going to depend on the present and future criticality of IT to the business. The greatest risk arises in situations where the potential role that IT might play is incorrectly assessed, resulting in incorrect positioning, and missed opportunities for the enterprise. A group of public sector heads of IT once developed a pro forma Deputy Minister's Guide to Assessing the Role and Organizational Positioning of the IT Function in your Department. It remains unpublished, but I kept the bar napkin we wrote it on.

The level of spending on information technology is another key indicator. In primary industries like mining, the information content in the product is low, and spending on IT is likely to be under 1% of revenues. Moving to manufacturing, the business is more logistically intense, and the level of IT spending will be higher. Distribution and retailing are rapidly becoming as expert in using information for supply chain management as they are at their traditional core competencies, and again, IT spending is higher. In the financial services industry, the products are all information and service-based, and much of the service is electronic. IT spending can be as high as 20% of revenues. In these IT-intensive industries, technology determines the corporate agenda as often as it responds to it.

IT in Government: Utility, Dependency, or Enabler?

Government work is intense in both information and human resources. Both suggest a critical role for IT. For activities intense in human resources, IT has the potential to improve productivity and to free people from routine tasks that can be automated, allowing more time to be spent on higher-value work. Alternatively, if technology does not automate the task itself, it can at least manage the workflow among the humans, resulting in better productivity, predictability of business processes, and built-in process measures. In any field of government activity involving compliance, IT is a key strategy in ensuring that the scarce human resource is allocated to the cases that, on a risk management basis, are likely to be the appropriate ones. The role of technology in government activities that have a compliance component is indeed a fascinating one.[3]
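As a rough sketch of this kind of risk-based case selection, the snippet below ranks a workload so that scarce reviewers go to the highest-risk files first. The factors, weights, and capacity figure are invented for illustration only; they do not describe any actual compliance program.

```python
# Illustrative sketch only: risk-scored allocation of scarce review capacity.
# The factors, weights, and capacity below are invented for illustration.

from dataclasses import dataclass

@dataclass
class CaseFile:
    case_id: str
    amount_at_risk: float       # dollars potentially at stake
    prior_noncompliance: bool   # history of problems on file
    anomaly_score: float        # 0..1, e.g. deviation from peer norms

def risk_score(case: CaseFile) -> float:
    """Combine a few illustrative factors into a single ranking score."""
    score = case.anomaly_score * 10.0
    score += min(case.amount_at_risk / 10000.0, 5.0)
    if case.prior_noncompliance:
        score += 3.0
    return score

def select_for_review(cases, capacity):
    """Assign the limited number of reviewers to the highest-risk cases."""
    return sorted(cases, key=risk_score, reverse=True)[:capacity]

if __name__ == "__main__":
    workload = [
        CaseFile("A-001", 2500.0, False, 0.10),
        CaseFile("A-002", 84000.0, True, 0.65),
        CaseFile("A-003", 12000.0, False, 0.90),
    ]
    for c in select_for_review(workload, capacity=2):
        print(c.case_id, round(risk_score(c), 2))
```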

Government work, then, is relatively IT-intense, either presently, or potentially. Governments, generally, manufacture no products and ship no goods. The products are financial, informational, services, or public policy initiatives. Delivery infrastructure has traditionally been bricks and mortar, but increasingly is electronic. Savings potential exists in many areas, but particularly where activity is people-based in intense multi-step processes. If information technology is not being brought to bear, there is a good chance savings are left on the table. IT should play an enabling role in government activities that are human resources-intense, and that involve lots of client contact and multiple transaction types. Management will need to leverage IT strategically in these cases. The difference in overall effectiveness between those who are good at this and those who are not will be dramatic.

How Information Technology Drives Policy

Where IT is being used as an enabler for new opportunities, the resulting projects will drive policy work to deal with the issues inherent in the opportunity. For example, most forms of electronic service delivery that involve transaction processing raise tricky policy issues surrounding the rigour of identifying the client. An interesting slate of risk management, legal, privacy, and societal acceptance issues must be considered to make decisions in this area. As soon as the same service is available on electronic and traditional delivery channels, issues surrounding the cost infrastructure of each emerge, and the policy issues related to cost-recovery and modifying client behaviour with service levels come into consideration. The financial services industry has dealt with these issues in the past, but was aided by a clear focus on the bottom line and unencumbered with the public policy considerations that a government must consider.

In areas like compliance research, verification, enforcement, and intelligence, the potential of computers to build data warehouses for business intelligence is great. However, with these opportunities come policy issues related to the sources of the data, the matching of data from multiple sources, and, of course, privacy. Projects in these areas are likely to find the policy considerations as challenging as the technology ones.

Data capture is an area in which significant technical change has happened recently. Systems that optically scan incoming forms can use intelligent character recognition to capture the data and store it, or scan and store an image of the entire form, or both. Challenging information management issues follow directly: do we have to keep the form once it has been scanned? If we have a parallel electronic input channel for the same program, in which the data arrives electronically with no accompanying form, how can we defend keeping the form in the paper channel? When something arrives on paper, why do we treat it like a national treasure when, if it comes over a network, we are happy with a few bits on a disk drive somewhere? These issues may sound simple, but the intersection of computer technology, business program design, information management, legal considerations, and the supporting policy framework for the entire undertaking is quite challenging.

Policy and Information Technology: Getting Synchronized

IT practitioners have thirty years of experience in working with program people ("line-of-business" people in private sector jargon). In fact, in some enterprises, the two groups cohabited in the same organizational unit. A review of old organization charts will reveal entities with names like "Procedures and Systems". Deep in the past of Revenue Canada, Policy and Systems were in the same organizational unit as well.

We are now at a stage where it is imperative that the policy and technology people work closely together. In most of the visioning work done in the TIMS "Enterprises" initiative mentioned earlier, we found that there were relatively few technology barriers to implementing a cross-program integration of service delivery to improve client focus. On the other hand, many of the initiatives were laced with policy inhibitors. These included obvious limitations, such as privacy constraints on sharing data, and, more often, provisions in department-specific legislation that were more restrictive than the Privacy Act. Other policy issues included differences in departmental policy concerning charging for information, which made it impossible to achieve the electronic assembly of a free information product drawing on the information reserves of multiple departments. Partnership between the policy and technology communities is improving rapidly as the program people demand resolution of the policy inhibitors to the technology solutions needed to improve program delivery.

Where is there Friction?

IT practitioners tend to be change agents by nature. When looking at a process, they quickly envisage how it can be improved. Program managers, normally closer to the business delivery reality, usually have their own ideas as to what can be improved, and, much to the annoyance of the IT people, thoughts on how to automate it. The program managers are more aware of the actual constraints of program delivery, but may also have more invested in the status quo. The IT people are normally familiar with the business processes of multiple client areas, and hence are well positioned to suggest best practices or automation approaches from one area that can be used by another.

Probably the most annoying habit of IT practitioners, from the program manager's viewpoint, is their tendency to suggest that some of the steps in program area "A" are functionally equivalent to those in program area "B", and that perhaps a common system could be built to serve both needs. Program managers are likely to find this notion threatening, since they will not have exclusive ownership of the resulting common system. Further, with traditional systems, a data base or file was normally the clear functional domain of a single client area. Common systems usually lead to common, and therefore shared, data bases, a disturbing notion for some. Hence the question of who is the functional owner of a particular data element becomes a critical one. On Revenue Canada's Standardized Accounting Project, and the family of related project initiatives, this issue proved to be very challenging.

Maturity: Three Disciplines in Harmony

In the mature model, the business lines expect the IT function to be the provider of business solutions, and it is. However, the mature IT function also plays a leadership role, identifying opportunities for improvement in processes, delivery channels, workflow, business intelligence, and so on. Hence the job of deciding what will be done becomes a blend of hard current business needs with ideas, opportunities, and sometimes pushback from the IT side. Policy considerations are in stride with program and IT: on the one hand, policy resolutions to the inhibitors to technology initiatives keep pace, and on the other, policy initiatives are thought through in synchronization with the program delivery and information technology issues involved. In the mature organization, this works. No group is perceived to have exclusive rights over good ideas. In our excitement to get on with them, we quickly forget whose idea it was, and focus on achieving the result.

Some Big Successes

At Revenue Canada we have had some significant successes with common systems that serve multiple programs. Systems of this type simply are not possible unless the policy and program areas are willing to co-operate to achieve the alignment necessary to permit a common system approach. For example, the systems that delivered the Child Tax Benefit and the Goods and Services Tax Credit were replaced in 1996 with a generic system for means-tested benefits called Individual Credit Determination. This common system allowed the timely implementation of the British Columbia Family Bonus and several other similar provincial mandates in 1997 and 1998 that could not have been quickly implemented otherwise. These provincial programs, structurally and legislatively similar to the federal ones in that they are means-tested on a family basis, were implemented as customized add-ons to the federal system at very low cost to the provinces. Careful policy work to determine that the federal business rules were applicable to the provincial programs was key to these successes.
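To illustrate the "common engine, program-specific configuration" idea in the abstract, the sketch below applies one generic family-based means test to two differently configured programs. The formula and all dollar figures are invented for illustration; they are not the actual federal or provincial business rules.

```python
# Hypothetical sketch of a "common engine, program-specific parameters" design
# for means-tested benefits. All figures and the formula are invented.

from dataclasses import dataclass

@dataclass
class BenefitProgram:
    name: str
    base_per_child: float      # annual benefit per child before phase-out
    income_threshold: float    # family net income where phase-out begins
    reduction_rate: float      # benefit reduced by this share of excess income

def annual_benefit(program: BenefitProgram, family_net_income: float, children: int) -> float:
    """Generic family-based means test shared by every configured program."""
    gross = program.base_per_child * children
    excess = max(0.0, family_net_income - program.income_threshold)
    return max(0.0, gross - program.reduction_rate * excess)

# A federal program and a provincial add-on differ only in configuration,
# not in code -- the point of a common system.
federal_benefit = BenefitProgram("Federal child benefit (illustrative)", 1000.0, 25000.0, 0.05)
provincial_addon = BenefitProgram("Provincial family bonus (illustrative)", 600.0, 18000.0, 0.08)

for program in (federal_benefit, provincial_addon):
    amount = annual_benefit(program, family_net_income=30000.0, children=2)
    print(f"{program.name}: ${amount:,.2f} per year")
```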

In 1995, observing that almost every business line was asking for rich case management systems (a client business matter is a "case" that must be opened, logged, referred, post-dated, assigned, tracked, and so on), the IT people concluded that an opportunity existed to build a common one for all business lines. Considering the breadth of business activity of the Department, this significant initiative was met with some understandable disbelief, but finally the common core of case management, plus the first specific implementation for the new Standardized Accounting system, went into production this month. Further modules for Appeals, Corporate Income Tax, and CPP/EI Rulings will follow in the months ahead. The scope of this system will eventually cross most business lines of the Department, and will affect almost all of our many thousand caseworkers. Its implementation shows that policy, program, and IT people can work together as equal partners, and that each line of business is capable of letting its own urgency for a specific system take a back seat to a superior corporate approach.
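The design idea, sketched in the abstract below, is a shared case lifecycle with thin business-line modules layered on top. Class, status, and field names are assumptions for illustration only and do not reflect the actual departmental system.

```python
# A minimal sketch of a shared case management core with business-line modules.
# Names and statuses are assumptions for illustration only.

from datetime import date

class Case:
    """Lifecycle common to every business line: open, log, assign, refer,
    post-date, track, close."""

    def __init__(self, case_id, client_id):
        self.case_id = case_id
        self.client_id = client_id
        self.status = "open"
        self.assignee = None
        self.follow_up = None
        self.history = []        # audit trail of actions taken

    def log(self, note):
        self.history.append(note)

    def assign(self, officer):
        self.assignee = officer
        self.log(f"assigned to {officer}")

    def post_date(self, when: date):
        self.follow_up = when
        self.log(f"post-dated to {when.isoformat()}")

    def close(self, outcome):
        self.status = "closed"
        self.log(f"closed: {outcome}")

class AppealsCase(Case):
    """Business-line module: adds only what an appeal needs on top of the core."""

    def __init__(self, case_id, client_id, disputed_amount):
        super().__init__(case_id, client_id)
        self.disputed_amount = disputed_amount

if __name__ == "__main__":
    appeal = AppealsCase("AP-1998-042", "CL-7731", disputed_amount=1250.00)
    appeal.assign("officer_17")
    appeal.post_date(date(1999, 1, 15))
    appeal.close("allowed in part")
    print(appeal.status, appeal.history)
```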

The Year 2000 issue has once again demonstrated that policy, program and informatics people can work together well. In some organizations the Year 2000 workload is such that most, if not all, IT development work has been placed on hold; for some of them, this is the first time that IT has simply not responded to business needs. At Revenue Canada, our early start on Year 2000 work means that most of our developers are available for development projects through this period, but the requirement for a moratorium on new implementations around the Year 2000 is requiring careful co-ordination among policy, program and IT people. Our general strategy is to minimize the impact of the Year 2000 on our business operations, yet prudent risk management dictates minimizing implementations in periods when we will want our resources available to deal with any Year 2000 glitch that managed to escape our certification testing process.

Human Resources

Our discussion thus far has focused on the three competence groups Ian Glen identified in his interaction model. At Revenue Canada, our processes for project approval and project management include a fourth and vital component, namely human resource management. Earlier this decade, Revenue Canada tracked the multi-year impact of technology-based projects on the HR profile of the Department, using data stretching back to the early 1980s from its two ancestor departments, the Customs & Excise and Taxation components of the National Revenue portfolio. The investigation, which was presented to TIMS and sparked considerable interest, showed the sustained effect of continued investment in technology in a people-intense enterprise. The data revealed a gradual decrease in clerical, support, and administrative types of jobs, and the creation of a smaller number of higher-level professional and program management jobs.

The project approach we now have in place couples an analysis of the HR impacts of a project with the decision-making process. Further, we are attempting the more difficult task of summing the HR impacts of multiple projects, across multiple lines of business, with varying job group and geographic profiles, to ensure multi-year, corporate management of the HR impacts of technical change. A formal process with our unions requires that a joint briefing on all projects occur at least six months before implementation. These sessions focus on the job impacts of the project, and issues such as training, changes in classification, and job losses and how they will be dealt with are discussed.

How "IT Dependent" can we be? At a recent review of the full range of policy and program initiatives being considered for the next several years in one of our business areas, over 80% were IT dependent. Of that 80%, the vast majority were totally, not somewhat, IT dependent. While not all of our business areas have such an IT dependency, most of the larger ones accounting for the bulk of the workforce, and correspondingly the budget, do. An effect of this reality is that the health of the IT function in the Department is a real concern of most business areas. An IT project difficulty of any kind normally has a direct program impact. A delay can impact attainment of a publicly committed rollout, cause a failure to achieve a legislative requirement, or cause a business line to not achieve a cost saving that has already been captured from their budget when the project was approved. In this environment, there is little time for "we / they" posturing between the groups involved. Policy IM/IT The "Virtuous Circle" Squared Figure 2: At Revenue Canada, all of the executive team is The involved in management processes for proposing, “virtuous funding, launching, monitoring and ultimately circle” auditing, all major projects. As a result, the squared perspectives of policy, human resources, program delivery, and information technology are all considered when these decisions are made. The HR working relationship between program and IT Program areas is particularly effective. The image of the program manager wildly conceiving new services and committing to new offerings without consulting the IT people simply is not valid. Modern program managers are expected to work effectively with their IT counterparts, and any that forget the interdependency for a moment experience such severe business impacts that they rarely do so a second time.

Accordingly, our circular model needs to be amended (See Figure 2) to include human resource management as an equal partner, and the entire model needs to be cast against a backdrop of financial and business planning and management that ensures that overall objectives are met.

Extending the Contribution of the IM/IT Professional

Considering the four disciplines that contribute to our square of competencies, it is reasonable that each has some core areas where it is incumbent on its practitioners to display leadership. People from each of the disciplines might well ask themselves what they should learn, and how they should behave, for their contribution to extend beyond the core area of their expertise and strengthen the bonds between the nodes of expertise in the newly squared circle. This would result in the strongest possible whole: synergy. I will address this issue for the IM/IT community, and leave the other areas to those in the respective fields.

In the environment we have described, IT leaders simply cannot stick exclusively to their IT knitting. They must understand the issues in all the other disciplines, and the effect of their interaction with the IT issues. The IT business still needs pure IT professionals, but it also needs those who have a strong overall business sense, are active students of service delivery models, and are interested in public policy issues. For my own part, since coming to the government sector six years ago, I have found the issues where policy and technology overlap to be among the most interesting and challenging. Further, these issues, properly managed, can make a significant difference to the overall effectiveness of projects.

In a project situation, with true teamwork, the professionals from all four of these discipline areas need to move on the issues together, combining their joint brainpower. Any one area that fails to address its fundamental parts of the initiative will compromise the project. Simply throwing up one's hands while waiting for the other disciplines to sort out their issues is hardly teamwork. Similarly, allowing the entire initiative to underachieve because one area has not put its best work forward is not a winning strategy either.

Each of these expertise areas needs to watch carefully to avoid circumstances where its own orderly approach to its own agenda unduly holds up cross-discipline projects on which the others are ready to proceed. Examples come to mind immediately: the IT group would like everyone to hold back on projects involving workflow until a corporate standard approach is selected; or, the policy domain would like a hold on all electronic service initiatives until the ability to defend non-repudiation of digitally stored documents in court is resolved. If no projects were to proceed until each of the four discipline areas resolved all their issues to the satisfaction of their most pure theorists, we would never advance our cause.

A Look Forward

In just the short time since Ian Glen drew the circle of three in his boardroom, best practice has already moved to a four-cornered working model on the issues in moving the government's electronic service delivery agenda forward. In successful organizations, the partnership between these disciplines is tight and complementary. There is a shared agenda moving towards common goals and the respective professionals are moving forward together, in partnership, to address these issues.

In the future, the degree of literacy each of these areas has in the other's subject matter will increase, as will cross-dependencies. Fewer initiatives will be undertaken in isolation, and project teams will involve experts from all four areas.

For the IM/IT community, expect our colleagues in the other disciplines to understand our issues better. In return, they will demand an understanding on our part of their issues. Overall, this professional growth enriches our jobs, our minds, and most importantly, the business results.

Richard N. Manicom is Assistant Deputy Minister, Information Technology Branch at Revenue Canada. Prior to this he was an executive with IBM Canada. Mr. Manicom can be reached at [email protected]

Notes

[1] The Information Management Subcommittee of the Treasury Board Strategic Advisory Committee.
[2] Weill, Peter and Broadbent, Marianne. Leveraging the New Infrastructure. Boston, Mass.: Harvard Business School Press, 1998.
[3] Sparrow, Malcolm K. Imposing Duties: Government's Changing Approach to Compliance. Westport, Ct.: Praeger Publishers, 1994.

=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-

Announcement

Riley Information Services Inc. presents a one-day seminar and training session: ELECTRONIC COMMERCE and PRIVACY LEGISLATION -- BUILDING TRUST AND CONFIDENCE. February 23, 1999, Westin Hotel, 11 Colonel By Drive, Ottawa. Co-sponsored by the Federal Privacy Commissioner, Industry Canada, Treasury Board Secretariat, the National Archives, SCOAP and Sysnovators Ltd.

WHAT DOES THIS LEGISLATION MEAN TO YOUR ORGANIZATION?

The Electronic Commerce Bill currently making its way through Parliament is destined soon to become law. It is going to change the way private sector organizations and government departments do business on the Internet and in the office. The law mandates the protection of personal information kept by private sector organizations, no matter what form it is kept in. It will cover all aspects of the gathering and handling of personal information, not just that of your clients and customers, but also of personnel within your organization.

Whatever the activity of your organization in dealing with personal information, you need to know what this law means to you. Federal organizations and Crown corporations not previously covered by the public sector Privacy Act will now be subject to the protection of personal information rules.

This seminar will bring together over twenty expert speakers from the public and private sectors to discuss the importance of this new legislation. They will shed light on its meaning and on what organizations need to know to comply. Bruce Phillips, the Federal Privacy Commissioner, responsible for complaints made under the new Act, will address the importance of the legislation to all Canadians. Andrew Siman, Director-General, Office of Health and the Information Highway, Health Canada, will speak on the subject of Health Information Privacy. Francis Aldhouse, Deputy Commissioner, Office of the United Kingdom Data Protection Commissioner, will describe the European Union's Data Protection Directive and how it will affect North American companies. The Directive requires that, for personal information to be transferred outside a European country, there must be an adequate level of protection of personal information in the foreign country to which the information is being sent. This has deep implications for trade. Find out what your organization needs to do to comply.

Experts, such as Bill Munson of the Information Technology Association of Canada and Helen McDonald, Electronic Commerce Task Force, Industry Canada, and other professionals, will discuss the meaning and mechanics of the legislation.

REGISTER NOW TO LEARN HOW THIS IMPACTS ON YOUR ORGANIZATION AND ELECTRONIC COMMERCE.

Cost: $450 (plus GST) if registered by January 10, 1999; $495 (plus GST) if registered after January 10. Visa accepted. For the full program and registration details: http://www.rileyis.com/seminars/Feb99/

Group rate: for every four registered, the fifth one is free. For more information or to register by phone: (613) 236-7844.

Note: The Westin Hotel is offering a special rate of $154 per night for delegates.

=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-

THE RILEY REPORT

Privacy Bill Introduced

By Tom Riley

The Electronic Commerce Bill, C-54, The Protection of Personal Information and the Electronic Authentication Act, as it is generically known, was introduced into the House of Commons on October 1st. This Bill will extend the coverage of the Privacy Act to include the federally regulated private sector. It will also apply to Crown corporations in the areas of telecommunications, broadcasting, banking and interprovincial transportation. This will include such agencies as Atomic Energy of Canada, the Canadian Broadcasting Corporation, various port corporations, as well as Canada Lands Co. and some parts of Canada Post. The law will also apply to trade in personal information that occurs inter-provincially or internationally. Three years after the law has been in effect, its provisions will apply to personal information collected, used, or disclosed in the course of commercial activities. If, and when, provinces pass similar laws, they will take precedence over the federal law. Since Quebec already has a similar law, Quebec will be exempted from the provisions of the law. Attached as a Schedule to the Bill is the Canadian Standards Association Model Code on the Protection of Personal Information. This will act as the standard for agencies to enact privacy policies.

Enactment of the proposed Bill into law is expected sometime early in 1999. The Bill has sailed into some rough waters as it moves through the legislative agenda. There are disagreements about the thrust of the bill. Some of these include the Federal Government’s move to enact privacy rights if the provinces fail to enact their own legislation. This represents the continuing debate between the federal and provincial governments over jurisdiction. The clause giving precedence to the Federal government Bill hinges on federal trade and commerce powers based on the theory that if the provinces fail to act it will cause a disruption of national trade. Another problem is that the bill is perceived as being solely an “electronic commerce” initiative to further business rather than as an instrument of human rights that will endow universal privacy rights to Canadians. This, and other criticisms, including those from the Opposition Parties in Parliament, has put the bill in some temporary jeopardy.

There have even been fears that Parliament might dissolve before the bill is passed. This raises the spectre that the enactment of privacy rights could be delayed again for years. This would be unfortunate. The fact is this is a good step forward for Canadians. While there might be flaws in the bill, there is a pressing need to get it enacted. It is true that privacy is a human rights issue and that the bill is driven by electronic commerce initiatives. Yet it is the fact that electronic commerce is high on the agenda of governments that has allowed the privacy issue to go forward. It has been fortuitous for those advocating privacy. The initial impulse might be to try to have the bill withdrawn in order to bring in a less flawed bill. This would be a dangerous strategy. The moment for privacy rights in the private sector is now and it must be seized. The Industry Committee in Parliament started hearing witnesses on the Bill on December 1st. The first witness to appear was the Minister of Industry, John Manley, and his staff from the Electronic Commerce Task Force at Industry Canada. The second major witness to testify before the Industry Committee was Federal Privacy Commissioner Bruce Phillips. As to the Industry Committee, members need to be aware of the technical flaws and philosophical differences within the parameters of the existing legislation.

The law is long overdue. The bill, if kept on the fast track on which it is now moving, is destined to become law within a few short months. It is going to change the way private sector organizations and government departments do business on the Internet and in the office. The law mandates the protection of personal information kept by private sector organizations, no matter what form it is kept in. It will cover all aspects of the gathering and handling of personal information, not just that of clients and customers but also of personnel within organizations. The bill's title might say electronic commerce, but the language makes it clear that it covers all forms of personal information, well beyond the ambit of "commercial and business" transactions. The approach to this bill should be "let's clean up some of its inadequacies". For example, the language of the bill's definitions contradicts the title, as it is clear that the bill would apply to all "records" in whatever form or medium. Clarification of the fact that the bill goes well beyond the confines of "electronic commerce" would help to create a wider understanding among the citizenry of the true meaning of the legislation.

Also, it appears that the government agency responsible for this bill is going to be Industry Canada and not Treasury Board. This poses some administrative problems, as the new Act will affect all agencies now within the government purview, many of which are not covered under the current Privacy Act. However, this is a problem to be solved within the Government itself. Some privacy advocates argue there is a perception problem with this privacy bill in that it is equated with business. This furthers the idea that it is all about creating climates for business to operate in and encouraging the individual to do business online. The language may cast the individual as consumer rather than as citizen, but the fact is that all citizens will benefit because of the wide range of records that will be covered. The real essence of the problem is the flawed language in the bill relating to central issues addressing the fair information principles. As many groups, such as the Public Interest Advocacy Centre, the Canadian Bar Association and the Canadian Medical Association, have appeared before the Committee, many of these issues will be addressed. The Committee will then have the opportunity to make any required changes.

One major issue is the role of the Federal Privacy Commissioner. The question has been raised as to whether the Commissioner should only be able to recommend resolutions of complaints, or should be able to issue binding orders. As the bill is written, any final appeal would be to the Federal Court. This is similar to the language in the current federal Privacy Act and is a fundamental flaw in that law. The argument for giving the Commissioner the right to make recommendations only is that the position is like that of an ombudsperson. That role requires a mediation process in which a Commissioner would use persuasive powers to convince the organization to abide by the findings of a complaint. This would be a non-confrontational approach and might work with a private sector faced with a law it may not necessarily like.

However, a counter-argument is that the Commissioner should be able to make binding orders. This already happens in many of the provinces, such as Quebec, Ontario, British Columbia, and Alberta, where the Commissioner has binding powers to order release. There are already enough precedents in the provinces to show that the right to make binding orders works. Also, this has the advantage of making the findings public and allows certain precedents to be set. As the Commissioner makes rulings, some fundamental privacy issues will arise from the very start. This creates an environment in which organizations get a sense of where a Commissioner is going, and of his or her thinking on issues, causing them to pause before denying requests.

The Privacy Commissioner could remedy many complaints in the first instance by having the power to order implementation of the findings. Final appeals to the Court can be daunting to an individual. Even though, in the current bill, the Privacy Commissioner will be able to bring a case to the Federal Court on behalf of the complainant, this may not always be the case. Many individuals would not have the wherewithal to go the Court route and could give up if the Commissioner did not find in his/her favour (or only partially in favour). If the Commissioner were given the power to issue binding orders, then any appeal to the Federal Court would simply be on a point of law. Far fewer cases would end up in the Courts. This would be good for everyone, the citizen and organization alike. This is essential, as experience in many jurisdictions (Canada and elsewhere) has shown that organizations will often take the route to court in order to delay recommendations made by a Commission or Commissioner.

These are just a few of the important matters that the Parliamentary Committee will have to deal with as the bill goes through its Hearings.

On a further note, the passage of Bill C-54 will help Canadian organizations that exchange personal information with the European Union (EU). The EU's Directive on the Protection of Personal Information came into effect on October 31st. The Directive requires all member countries to harmonize their laws to include the principles set out in the Directive. One of its clauses requires that when personal information on European citizens is sent to non-European countries, those jurisdictions must have an "adequate" level of protection. Passage of the Canadian Bill will meet that requirement.

The central issue in the privacy debate is that there is a statute on the books that is going to create strong privacy rights for all Canadians. Any political issue can be expected to hit rough waters as it goes through the political process. There will be dissent and disagreement. That is the essence of democracy. It doesn’t mean the legislation should sink. It has taken decades for such a bill to get to Parliament to enact privacy rights for Canadians in the private sector. It is hoped this will soon be a reality.

This article originally appeared in Access Reports: Canada and Abroad, Volume 6, Number 10, October 15, 1998. Thomas B. Riley is the President of Riley Information Services Inc., Ottawa, international specialists in the creation of public policy on information issues and information management. He can be reached by phone at: (613) 236-7844 or email: [email protected] web: http://www.rileyis.com

Keeping the Internet Free

By Thomas B. Riley

"Governments are ill-equipped to handle the Internet, because it changes too rapidly, it's too decentralized and too international. The digital age moves too quickly for government action." Ira Magaziner, U.S. Presidential Advisor on Internet Policy.

Any attempts to regulate the Internet that would impede freedom of expression and freedom of speech could only be interpreted as censorship. The CRTC has announced that it wants to hold hearings on whether or not it should be regulating the Internet. First of all, the Internet is not a broadcast medium. The CRTC is responsible for the Broadcasting Act, so it is in the wrong domain when it starts talking about possible regulation of the Internet. The CRTC also worries about the necessity of maintaining our unique Canadian identity and culture on the Internet. As the Internet is a borderless medium that does not emanate from any one place, trying to mandate cultural norms becomes an impossibility.

The Internet is currently the bastion of free speech. It is the first medium in history that allows people in any walk of life to communicate their ideas to any or all parts of the globe. It is an interactive medium that does not follow the normal rules of time and space. People can engage in chat rooms 24 hours a day or play Chess online with someone halfway around the world. An individual can use this medium for whatever purpose suits the individual. And that is the inherent flaw that leads government to want to regulate. Lurid details in the press about pornographic sites, pedophiles engaging in their sick fantasies and exchanging the most reprehensible pictures of children over the Net, all lead to feelings of revulsion and mixed emotions. Concerned citizens cry out for controls.

However, statistically these sites are very much in the minority. Practically speaking, there are already existing laws that allow authorities to track, investigate, arrest and convict parties engaging in illegal activities on the Net. This includes credit fraud and a host of other crimes, including hacking that causes damage. There have already been numerous articles in the newspapers about young hackers who have been convicted of illegally entering Pentagon databases or other government and private facilities to do harm. The laws exist to handle these problems. Witness the recent arrest of 40 pedophiles from around the world who were using the Net to ply their sordid trade.

The nature of the medium resists regulation. However, as with all technological developments in history, once it enters the public consciousness and society's mainstream, governments cannot resist the urge to regulate. Many Net activists and libertarians want the Internet to remain regulation-free, claiming that as a new medium it is much too early to regulate it. This argument holds that, as the medium itself is so new, to regulate it unnecessarily at this stage would be to impede its natural growth and progress. Censorship of the Internet is the greatest fear of the Net's strongest advocates. Any restrictions on free speech could retard its growth and, ultimately, the way it will change society.

This is a medium that defies any quick definition. It is a technology that takes its personal meaning from the person using it. The instant an individual goes online, it is used for one's individual purposes; the experience is defined by one's personal desires, needs and wants. The Internet becomes an extension of the individual. This is why it can have such an infinite range of uses. It can be used simply for sending an email, as a research tool, as a playground for electronic games in cyberspace, as a place of invention and discovery, or as a means to read the latest news online. It can be what the user wants. It is more than just the world's largest library. It is the biggest squawk box that humanity has ever witnessed. It is all things to all people. No other technology invented in history can make this claim.

The Internet is a multimedia experience. Governments can offer electronic services in a multitude of forms. It could be just putting up some information on a web site, or it could be allowing a multitude of transactions through a kiosk or through a computer in the home, office or a public place. But the point is that the individual can come to use it for a multitude of purposes. Someone can go online and do some banking through a web site, click over to a chat room, move on to post a message in a newsgroup and then do some surfing on the web. As the medium itself is non-linear, so are the ways in which the individual participates. The phone cannot be described this way, and neither can the TV. If the Internet becomes popular through the TV, as some firms are trying to make happen, then the nature of TV as we know it will change.

The Internet is all about change - social, political, cultural, and beyond. The change is rapid. As with momentous events, it is often difficult to predict what the consequences will be or the results that will occur. This is certainly true of our new technologies. We have a glimmer of the changes to come. One of these is that technologies mean you never really leave the office. If you are in a high-powered job you are expected to be on call almost 24 hours a day. After all, your email is there, you have a cell phone and all the technological gizmos needed to be in touch and get the job done. In the sixties it was said that computers would free us to have more leisure time. Countless pop philosophy books of the time pronounced that we would have to learn how to creatively use our free time as we were going to have so much of it. The hard realities of the Digital Age have put paid to those long-ago, naïve theories. The more information technologies thrust at us, the longer our days become. This is but a scintilla of the change to come. A medium that allows us to communicate so freely and so widely cannot but bring fundamental change to the way we operate as a society.

This itself is a persuasive argument for minimal action by government in regulating the Internet at this time. It is apparent there will be some regulation in relation to privacy, security, cryptography and digital signatures, to ensure secure communications over the Net and to have some vestiges of the protection of our privacy. We will also need to find ways to ensure there is universal access and that the Net is affordable for all. This is in the quest for a level playing field and even that will have to come through means other than government interventions. But it should stop there.

It is far too early to be talking about further regulation of the Internet. It has been shown that current laws allow law enforcement to pursue and charge individuals on the Internet who engage in illegal acts. The eighties and nineties have been about the free forces of the Market Place. The next decade should be about the free forces of cyberspace.

Thomas B. Riley is the President of Riley Information Services Inc. Ottawa, international specialists in the creation of public policy on information issues and information management. He can be reached by phone at: (613) 236-7844 or email: [email protected] web: http://www.rileyis.com

=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-

THE WHITE PAPERS

Privacy and Information Access Issues in the Near Future: An Interview with Ontario's Information and Privacy Commissioner

By Frank White

As the millennium approaches, Canada is experiencing increased attention to access to information and privacy issues. In many cases, the issues are identified as part of a wider concern on the direction and impact of information technology on Canadian society. Consumers are increasingly likely to question the effect on privacy and access to information of government and private sector initiatives to incorporate advances in technology into customer service delivery proposals.

The public has a high level of acceptance for improvements to service delivery when the benefits are obvious to the consumer. An example of consumer acceptance is the quick market penetration of on-line banking services. On the other hand, e-commerce on the Internet is not meeting expectations and forecasts. As an example, Computer World in March, 1998 compared on-line sales vs. catalogue vs. retail sales for 1997. The sales generated in each category were:

- $2.0 billion in on-line sales;
- $78.6 billion in catalogue sales;
- $2.5 trillion in retail sales.

The Internet is a good example of the many competing interests that have a bearing on the public's acceptance and use of new services available with advances in technology and communications. The use of the Internet for service delivery also raises many access to information and privacy issues that need to be resolved by policymakers and businesses. As a tool for the dissemination of information, the Internet is accessible, fast, cheap and available at all hours. The private sector or government can quickly disseminate information to the public. But is it the information the public either wants or finds useful?

Electronic commerce over the Internet offers the consumer a potentially innovative two- way channel of service delivery that can be customized to the consumer's needs and expectations. But how secure is the personal information that the consumer submits via the Internet? How is the consumer's privacy protected when business is conducted over the Internet?

There are many drivers of change that may affect the existing expectations of Canadians related to privacy and access to information. These drivers of change - such as the Internet, new or proposed laws and regulatory proposals and increased surveillance in society - will affect the decisions taken by policymakers in both the public and private sectors over the next few years.

Laws in the European Union dealing with data protection and the proposed Canadian Federal law to regulate privacy in the federally regulated private sector will certainly have an impact on the public's expectations of privacy. Two questions arise: What exactly are Canadian expectations? And can the privacy expectations be met, or is there a gap in expectations and the proposed law or regulatory proposal?

A third question also arises. Does the public want an open and free-for-all environment of information exchanges or is there an expectation of some form of control?

Another driver of change over the next few years is the trend of increased surveillance in our society. The trend is partially driven by the need for increased physical protection. But what is or what will be the effect on the individual's privacy?

Many of the drivers of change that will affect access to information and privacy have both positive and negative consequences for the public. The Internet is an example of how privacy and access to information expectations, new technology directions, business and profit motivation and government policy-making all intersect but do not yet integrate into a coherent public policy. There is a need to make progress on the public policy front. In the March 16, 1998 edition of Business Week, it was stated that " unless privacy is protected soon, the revolutionary potential of the Internet may never be realized".

Ann Cavoukian was recently appointed Information and Privacy Commissioner for Ontario. Prior to her appointment, she was Assistant Commissioner - Privacy. Dr. Cavoukian is eminently qualified to discuss change, technology, and information access and privacy issues. She is considered an expert in her field, with an extensive record of public speaking engagements and publications.

In this interview with Dr. Cavoukian, she discusses her views on how the drivers of change will affect access to information and privacy over the next few years.

Dr. Cavoukian, can you share your views on the impact of the Internet as a driver of change over the next few years?

Dr. Cavoukian: The Internet has immense implications for information access and dissemination although the potential for e-commerce is just being realized. Our traditional methods of providing information to the public are undergoing significant change. It is an enormous change in culture to adapt to the new technologies for information dissemination. The Internet presents an opportunity for government to be open and transparent in dealing with decisions and other matters that affect the public. The Internet provides an easy and inexpensive way of providing information that can be both useful to the public and support government accountability. On the other hand, not a lot of thought has gone into privacy and what happens to personal information passed over the Internet. Where do you put the laws or rules for privacy? And, how do you incorporate privacy protection as part of the Internet's operations?

In the United States, personal information on the net is viewed as a commodity, therefore secondary uses of personal information are seen as appropriate. Our culture in Canada leads us to focus on the needs of the data subject, the individual. Therefore, Canadians expect strict limits on the secondary uses of personal information.

In the United States, the Federal Trade Commission has asked Internet providers to find a way of ensuring privacy on-line in a governmentally unregulated environment. To date, the results have been disappointing. In my view, this result points to the need to develop laws and other mechanisms to deal with privacy. I believe that we need to enlist the support of technology to advance both access to information and privacy.

Can you describe an example of the use of technology to support privacy?

Dr. Cavoukian: There is an interesting tool recently developed by the World Wide Web Consortium (W3C) to offer more privacy than presently exists on the Internet. I am referring to a new protocol called P3P, or the Platform for Privacy Preferences. P3P was developed to promote privacy and trust on the Web -- to give consumers some concrete choices regarding what, if any, personal information they wished to give to a Web site. And it was designed to do this sooner rather than later -- to give the consumer some protection, right now, regarding the uses of his or her personal information.

How does P3P operate?

Dr. Cavoukian: P3P operates through an exchange of information between a Web site and an individual. The Web site discloses its privacy practices, enabling the user to exercise preferences over those practices. The user's preferences are communicated through his or her browser. Web sites with practices that fall within the range of a user's preference could be accessed "seamlessly," without any further need to take action. This level of exchange would all take place behind the scenes.

If the site's practices are not within the user's range of preferences, the user would be notified of a site's practices and have the opportunity to either accept the terms or attempt to negotiate other terms. This means the user can opt into the terms available on the site without any change or be given the ability to negotiate or just leave the site. As examples, site practices would include whether information is collected in personally identifiable form, the purpose of collection, who the recipients of the information will be, the ability to access one's personal information, to have it corrected, how long it will be retained, and so on.
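As a rough illustration of the exchange Dr. Cavoukian describes, the sketch below compares a site's declared practices against a user's stated preferences and decides whether access can proceed "seamlessly" or the user should be prompted. Real P3P policies are machine-readable documents retrieved by the browser; the field names and structure here are simplified assumptions, not the actual protocol format.

```python
# Toy illustration of a policy/preference comparison in the spirit of P3P.
# Field names and structure are simplified assumptions for illustration.

SITE_POLICY = {
    "identifiable": True,                          # collected in identifiable form?
    "purposes": {"order_fulfilment", "marketing"},
    "recipients": {"site", "affiliates"},
    "retention_days": 365,
    "access_and_correction": True,                 # can users see and correct data?
}

USER_PREFERENCES = {
    "allow_identifiable": True,
    "allowed_purposes": {"order_fulfilment"},
    "allowed_recipients": {"site"},
    "max_retention_days": 90,
    "require_access_and_correction": True,
}

def practices_within_preferences(policy, prefs):
    """True if the site's declared practices fall inside the user's preferences,
    so the site could be accessed 'seamlessly' with no further prompting."""
    if policy["identifiable"] and not prefs["allow_identifiable"]:
        return False
    if not policy["purposes"] <= prefs["allowed_purposes"]:
        return False
    if not policy["recipients"] <= prefs["allowed_recipients"]:
        return False
    if policy["retention_days"] > prefs["max_retention_days"]:
        return False
    if prefs["require_access_and_correction"] and not policy["access_and_correction"]:
        return False
    return True

if __name__ == "__main__":
    if practices_within_preferences(SITE_POLICY, USER_PREFERENCES):
        print("Within preferences: proceed without prompting the user.")
    else:
        print("Outside preferences: notify the user to accept, negotiate, or leave.")
```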

How can the public's expectations or concerns about Internet usage be measured?

Dr. Cavoukian: One example that comes to mind is a recent survey on consumers' concerns about the Internet done by Georgia Tech. For the first time, privacy leads the list of consumer concerns, surpassing freedom of speech as the number one issue.

In a recent article in Wired magazine in May, 1998, the point was made that Web consumers seem to be more than willing to upset the marketing apple cart. They refuse to cooperate in marketing conducted at Web sites: 94% of respondents have declined to provide personal information when asked. And when consumers do reply or have to reply, inaccurate personal information is often submitted.

In large databases, there is usually a 20% to 30% error rate. If consumers falsify data, the effect on market planning and data mining will be substantial.

What about the notion of regulating the Internet?

Dr. Cavoukian: The CRTC recently asked for comments about that very question. My view is that the Internet is operating in an environment that knows no political boundaries and follows no enforceable laws. If you accept that as an accurate description of the environment, we should look at how we can communicate an appropriate message on Internet practices. The message to communicate is that a set of fair information practices should be adhered to by Internet users and providers. Underlying the practices are the principles that personal information should be treated with respect and that certain rules should be followed when dealing with personal information on the Internet.

In my view, it is unlikely that the Internet itself can be regulated. But you can regulate companies resident in Canada and their practices. If we accept that Canadians are generally respectful of our laws, then a set of fair information practices can provide guidance on how to interact on the Web.

In your view, has the attitude of the e-commerce community toward privacy changed?

Dr. Cavoukian: Companies conducting business over the Internet are learning that it is in their best interests to operate within a set of rules. The public is hesitant about conducting business over the Internet, partially due to privacy concerns. While companies have to start with security for Internet transactions, it is only the first step. The privacy concerns and issues revolve around the use of personal information, once security is in place.

Another driver of change is the EU Directive on the protection of personal information. What effect will that have on Canada over the next few years?

Dr. Cavoukian: There has been a lot of speculation both in Canada and the United States on the effect of the EU Directive. What we do know is that it will apply to transfers of personal information in both the public and private sectors. Also some of the European States, when developing their national laws to comply with the EU Directive, are raising the bar with respect to the data protection requirement for transferring personal information to a non-EU State. For example, the EU Directive requires adequate data protection measures for personal information transferred to a non-member State. Some EU members have legislated "equivalent" data protection before personal information can be transferred.

I feel very strongly that legislation alone will not be enough to protect privacy in the next century in the digital age. A variety of tools will be needed to complement legislation, most notably, technological tools -- privacy-enhancing tools, or as they are now termed, PETs.

On the access to information side, increased surveillance seems to be a trend that will continue over the next few years.

Dr. Cavoukian: It is interesting because we see the two sides of the equation again. On the one hand, there is a public expectation that more personal information should be provided or captured to support increased security in a variety of situations. This has to be balanced with the impact on the privacy of individuals.

In some cases, an individual may have some choice about surveillance. There may be an ability for the individual not to participate when surveillance is taking place. In other cases, the individual is left with little or no choice. I believe the critical issues for the public to consider are:

• How does the individual know beforehand that surveillance is going to take place?
• What is the personal information used for after the particular event, that is, are there secondary uses?
• What choice, if any, does the individual have concerning participation in the surveillance?

My own view is that fair information practices must apply whenever surveillance takes place.

Your term as Information and Privacy Commissioner for Ontario started recently. What goals do you want to accomplish during your five-year term in office?

I have three primary goals for my term in office.

The first is to maintain the excellence of the Commission's Tribunal Services. My approach will be to continuously improve the process by simplification and streamlining. This improvement can be accomplished by ensuring accountability, focusing on the needs of the customer, building flexible systems that can accommodate change and creating a positive work environment for staff of the organization.

The second goal is to remain at the forefront of "relevance" to our provincial and municipal government clients by "finding a way". Our approach will be to provide solution-oriented advice to our clients. Governments have goals that need to be met. By encouraging early consultations to find common ground, I believe we can find solutions that both benefit access to information and privacy and allow the broader objectives of government to be met quickly.

And as a third objective, I want to sustain a culture of open government. I believe I have an important role to play in emphasizing the value of open government to the public.

The Information and Privacy Commissioner for Ontario is also charged with the responsibility of keeping the public informed of access and privacy issues. I take this charge very seriously and I intend to use my term to focus on my legislated mandate for public education.

Frank White is Principal, Frank White & Associates Inc., consultants in information management, information access and privacy. He can be reached at 416-535-8205 or by e-mail at [email protected].

=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=- EYEWITNESS OPINIONS Should Information Be Free? Why Money Shouldn't Matter ... Too Much

By Roy Davies

Summary: Commercial interests see the Internet as a hi-tech market-place but the agora or market-place of Ancient Greece is a better model as it was an arena not just for financial transactions but also for the free exchange of information. The gift-exchange or Potlatch model of scientific and academic communication is still more appropriate for much of the activity on the Internet than the free market model. This essay was written in January 1995. A shorter version was published in the inaugural issue of the British edition of Wired vol. 1 no.1 1995, p. 61, under the title “Money doesn't matter.”

In August 1990 my life was saved by a blood transfusion in the Foothills Hospital, Calgary, after my left leg was shattered by a fall while hiking alone in Paradise Valley in Banff National Park; two days before finishing the original version of this paper I ordered a copy of a journal article over the Internet and paid by credit card. Both procedures carried risks. I hoped that nobody intercepted my unencrypted message. More importantly I trusted that the blood donors were healthy and the testing procedures thorough. My trust would have been less if the operation had been performed in a country where blood donors were attracted mainly by money - a tempting prospect for penniless junkies - but in countries like Canada and Britain donors are unpaid and motivated by altruism, possibly tinged with self interest. They know that should they themselves ever need blood, supplies will be available thanks to the actions of other, like-minded individuals. [1]

Similarly much of the information interchange on the Internet takes place on a voluntary basis. Contributors get personal satisfaction and possibly prestige. They also know that they will be able to obtain information in the same way when they need it. The golden rule works and what is good for others turns out to be good for the individual.

The reverse can also be true. As Adam Smith, the father of economics observed, "it is not from the benevolence of the butcher, the brewer, or the baker that we expect our dinner, but from their regard to their own interest." [2]

For keeping our noses to the grindstone day after day, money is a powerful motivator and the collapse of communism has demonstrated the superiority of the (reasonably) free market in allocating resources. However markets do not always work to everyone's benefit. Booms followed by slumps have been caused by speculation in everything from tulips to shares. A market-oriented Internet would require accounting, banking and legal services all of which would have to be paid for. And whose laws would apply in cyberspace? Markets cannot operate efficiently unless prospective buyers have information about the goods on offer, but what if the product itself is information? You can inspect a printed book before purchase. You may buy a single issue of Wired before deciding to place a subscription. But how can you inspect one-off packages of information on the Internet when to inspect is to copy and thus to acquire?

The implications of problems in judging goods attracted the attention of Charles Babbage who, when not preoccupied with trying to complete his analytical engine or campaigning against street organ-grinders, devoted much of his time to studying economics. He pointed out that there is often a cost, in time or money, associated with verification of quality. [3]

That is why people are often prepared to pay extra for reputable brands. Flour was cheap but could easily be adulterated and therefore the government of Babbage's day went to the expense of building its own flour mills, e.g. for supplying the armed forces, to avoid having to inspect all supplies. Similarly if there is a charge for information which cannot be inspected before purchase, then instead of availing themselves of existing solutions to problems people will spend their time needlessly reinventing the wheel.

Is money really necessary to motivate information providers? Gift exchange systems, of which the voluntary, unpaid blood donation services of countries such as Britain and Canada are examples, have a long history going back to forms of barter used before the development of money. Often there was a competitive element present as in the encounter between Solomon and the Queen of Sheba when each monarch tried to outdo the other in generosity. In the equally competitive potlatch ceremonies of certain North American native peoples whole communities participated in exchanges spread over several days involving everything from blankets to twentieth century luxuries such as motor boats. [4] Cultural activities such as dancing, public speaking and initiation into secret societies also took place during the potlatch and the festivities were accompanied by a certain amount of drunkenness and by what appeared to the authorities to be wanton acts of destruction. Social standing depended on the munificence of the individual's gifts and chiefs would sometimes destroy some of their possessions to demonstrate that they already had more than they needed and could easily afford to be generous. When the disapproving Canadian government outlawed the potlatch in 1927 the loss of this traditional incentive to work led to severe and unanticipated social problems. The Act was repealed in 1951 but by then the influence of modern money and European culture had become firmly established and by the late 1960s, the potlatch had practically ceased to exist. Gift exchange systems have their roots in the culture from which they spring and their survival depends on the health of those roots.

Scientific research is a fiercely competitive and public process of information exchange but scientists do not normally get paid for academic papers nor do they pay directly for using information provided by others. The only direct rewards are citations and recognition by one's peers. Where commercial applications are foreseeable there is a separate system of publishing and licensing – the patent system. Recent attempts by drug companies involved in the human genome project to patent genes - yours and mine - have imperiled the free exchange of information and threaten to retard research. [5]

To extend the sphere of the market beyond its legitimate limits at the expense of the gift exchange system is to undermine science itself.

Traditionally markets had social as well as economic functions. The Greek word for market-place, agora, originally meant a meeting place. The Athenian agora was where Socrates felt most at home. It was an arena for gossip, political haranguing, philosophical inquiry and hard bargaining. Everything from apples to water-clocks was for sale, but talk, or information, was always free and when Socrates was the speaker, sometimes priceless. [6] No citizens were excluded from the agora except those awaiting trial on serious charges such as murder.

In 1886 W.T. Stead writing in the Contemporary Review claimed "the telegraph and the printing press have converted Britain into a vast agora or assembly of the whole community...". [7]

Even in that hey-day of laissez faire, action was taken to ensure that access to information was not limited to those with the ability to pay, and Samuel Smiles, the apostle of self-help, criticised those who thought it right to spend taxpayers' money on prisons but not on libraries. [8]

The Internet surely has the potential to transform the whole world into an agora in which, as in ancient Athens, commerce and all sorts of free social interaction would flourish side by side. If that is to happen we must heed the warning given by the history of the potlatch, which demonstrates that a system of exchange that depends partly on altruism rather than naked self-interest is very difficult to revive once the culture that gave birth to it has been seriously weakened. Will we prove less enlightened than the ancient Greeks and the Victorians? Will the Net culture go the way of the potlatch? It is part of the environment of cyberspace which, like our physical environment, deserves protection. This culture insists that "free" is not synonymous with "worthless" nor "value" with "price". Let us learn this lesson before it is too late.

Notes and References

[1] Blood Donation. For detailed arguments about the superiority of unpaid voluntary blood donation systems see: Titmuss, R.M. The gift relationship: from human blood to social policy. London: Allen & Unwin, 1970. A more recent publication which comes to the same conclusion is: Beal, R.W. and van Aken, W.G. Gift or good? A contemporary examination of the voluntary and commercial aspects of blood donation. Vox Sang 63 (1), 1992 p. 1-5. (Vox Sang is the official journal of the International Society of Blood Transfusion).

[2] Adam Smith. The wealth of nations (5th ed. Published 1789.) The quotation comes from vol. 1, book 1, chapter 2, page 16 of the Cannan edition (Oxford 1896) or pages 26-27 of the edition edited by R.H. Campbell, A.S. Skinner and W.B. Todd. Oxford: Clarendon Press, 1976.

[3] Charles Babbage. Babbage's most detailed work on economics was: Babbage, Charles. On the economy of machinery and manufactures 4th ed. New York: Kelley, 1963. (Facsimile reprint of the edition published in London by Knight, 1835). The whole of chapter 15 "On the influence of Verification of Price" pages 134-146, discusses the implications of problems in ascertaining the quality of goods. See also: Hyman, Anthony. Charles Babbage: pioneer of the computer. Oxford: Oxford University Press, 1982. Chapter 8 discusses Babbage's work on economics and pages 114 and 115 deal with the cost of determining the correct price.

[4] Gift Exchange. For general information on this topic, including the potlatch, see: Mauss, Marcel. The gift: forms and functions of exchange in archaic societies. London: Routledge, 1990. For information about gift exchange in relation to barter, the potlatch in particular, and a detailed account of the history of money from primitive forms such as cowrie shells up to electronic funds transfer, stressing the wider, social aspects rather than the purely economic aspects, see: Davies, Glyn. A history of money from ancient times to the present day. Cardiff: University of Wales Press, 1994. The potlatch is discussed in the section on Money in North American History.

[5] The Human Genome. For information on the controversy over patenting human genes see: Marshall, Eliot. The company that genome researchers love to hate. Science vol. 266, no. 5192, 16 December 1994, pages 1800-1802. Kleiner, Kurt. Squabbling all the way to the genebank. New Scientist, 26 November 1994, pages 14-15.

[6] The Agora. For information on the Greek agora see: The Athenian agora: an ancient shopping center. Princeton: American School of Classical Studies at Athens, 1971. Socrates in the agora. Princeton: American School of Classical Studies at Athens, 1978. Life, death and litigation in the Athenian agora. Princeton: American School of Classical Studies at Athens, 1994.

[7] The quotation about the telegraph and printing press turning Britain into an agora comes from: Stead, W.T. Government by journalism. Contemporary Review, May 1886 pages 653-657. (The actual quotation is on page 654).

[8] Samuel Smiles was the author of "Self-Help", a best seller in Victorian times. Its message would strike a sympathetic chord with right-wingers who believe in individual initiative rather than collective action. However, for evidence of Smiles' views on the importance of public libraries see: Mackay, Thomas (editor). The autobiography of Samuel Smiles. London: John Murray, 1905, pages 155-157.

Roy Davies is a science librarian at the University of Exeter in the U.K. He is keenly interested in information retrieval. As he puts it, “it is not only the technical aspect, which is being transformed by the use of computers, that I find interesting but also the wider impact of information on society.” Mr. Davies is the book review editor for Interactive Learning Environments. His list of publications includes articles on expert systems in cataloguing and in reference work, and techniques for discovering previously unnoticed logical connections in scientific literature. Mr. Davies' Home Page is at: http://www.ex.ac.uk/~RDavies/homepage.html. He can be reached via e-mail at [email protected]

=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-

Information Technology Partnerships for Social and Economic Progress by Christopher Hughes

Can IT Partnerships Work?

Partnering is an often-used word that frequently means different things to each of the participants. Ideally, a partnership is an operational or business arrangement that enables each participant to achieve its own objectives while together achieving results that neither party could achieve on its own for lack of skills or resources. Technology partnerships are often fraught with risk, but increasingly, alliances are necessary to achieve significant results.

When partnerships involve the application of new, untried, or complex information technology, risk increases, creating the potential for further tension among the parties, since new technology implementations very seldom turn out the way they were originally envisioned. However, if all parties go into these partnerships with a flexible attitude and are prepared to accept the inevitable risks, then they have a much better chance of perceived success, since success will be measured in the proper context.

This article describes a new partnership framework involving an advanced information technology to which the majority of readers of this newsletter probably have limited exposure. That fact, in itself, is indicative of one of the difficulties that Canada currently faces in making progress in this burgeoning area. However, increasing awareness of the issues, a changing political and economic climate and new synergistic opportunities in this area hold a promise of near term advancement.

The framework described here will bring high performance computing, high speed networks, and specialized knowledge together to solve complex problems facing industry and government. By providing a focal point for these resources and expertise, the various consortia developing under this framework aim to bring a critical mass to bear on important scientific and technology problems facing Canadian industry and government, so that new and innovative solutions result. Through the application of these resources, it is hoped that new strides can be made to improve Canada’s international competitiveness and, as a consequence, deliver social and economic progress to the country.

What is high performance computing?

In this day and age where computing resources are readily available to the general population, each user may consider that they have a “high performance computer” sitting in their office or home that adequately performs the tasks given to it. It is certainly true that the power and capacity available in the average desktop system today generally outstrips the mainframe computers of 15 or 20 years ago. Both personal computers and workstations have also been enhanced significantly over the past few years, and this trend will undoubtedly continue, allowing increasingly complex research and engineering activities to take place with this technology.

However, over this same period, the scientific and engineering problems whose solution leads to the development of new industrial products have become increasingly complex, requiring the processing of enormous quantities of data using very complex algorithms. The average desktop system or workstation cannot currently handle such enormous problems. Consequently, what is ultimately needed is an integrated approach among various information technologies, culminating in access to high performance computing.

So, what is a high performance computer? The first computer to be termed a “supercomputer” is generally believed to be the CDC 6600, introduced in 1964. Later model CDC 6600s had a peak performance rate of 3 million floating point operations per second, or 3 Megaflops (Mflops). Computers of the 1990s, however, are capable of peak performance rates of many Gigaflops (one thousand Megaflops). Teraflop (one million Megaflops) performance rates are now being experienced by recently introduced experimental systems and these will become more widely available by the turn of the century.
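
To give these rates some scale, the following back-of-the-envelope illustration (my own; the size of the workload is an arbitrary assumption) compares how long a fixed job would take at the two ends of that range:

    // Illustrative arithmetic only: the 1.0e15-operation workload is an
    // arbitrary assumption chosen to show the scale of these units.
    #include <iostream>

    int main() {
        const double workload = 1.0e15;   // floating point operations in a hypothetical simulation
        const double mflops3  = 3.0e6;    // ~3 Megaflops, peak of a late-model CDC 6600
        const double tflops1  = 1.0e12;   // 1 Teraflop, the leading edge of the late 1990s

        std::cout << "At 3 Megaflops: " << workload / mflops3 / 86400.0
                  << " days (roughly a decade)\n";
        std::cout << "At 1 Teraflop:  " << workload / tflops1 / 60.0
                  << " minutes\n";
    }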

With this rapid increase in performance, it has long been recognized that the definition of the term supercomputer must be dynamic. More than just peak performance rates must be considered when designating a computer as a “supercomputer”. Consequently, this term is increasingly being displaced by “high performance computer” or “high performance computing environment” (HPC). This shift in terminology reflects the recognition that when real problems are being tackled (rather than just CPU benchmarks), it is the entire computing environment that must offer high performance, not just the CPU. Coupled with similar rapid advances in memory and input/output devices, the throughput rates of these systems have grown enormously since that first CDC 6600. In addition to a computer with a high computational rate and large, fast memory and input/output devices, a high performance computing environment must include high speed network access, reliable and robust software, as well as documentation, training, an appreciation of the potential of the HPC environment, and access to highly qualified personnel to ensure the most effective use possible.

Why do we need HPC?

In comparison to our major trading partners, Canada is slipping in a number of critical capabilities relevant to effective deployment, use and management of our science and technology base. In particular, the use of HPC is required to advance the development of new science and engineering that inevitably leads to the development of new products to ensure Canada’s international competitiveness. Today, no HPC capability of the scope required to handle complex scientific and engineering problems is readily available to researchers in Canada.

For example, in Eastern Ontario, Queen’s University currently has a small IBM SP computer which is fully utilized for small research projects, while the University of Ottawa relies primarily on computing resources in the U.S. or in other parts of Canada on an as available basis. In the case of Carleton University, the currently available machines were designed about 10 years ago and no experimental algorithm work can be carried out on these computers since the communication speed as well as the processor speed is extremely slow and the operating system is no longer supported.

The University of Mannheim in Germany makes a regular survey of the most powerful computers in the world and publishes the results in its “Top 500” list. In the November 1998 list, none of Canada's universities is listed as having computing facilities that rank among the world's Top 500 (compared with 45 US universities). The 500th entry exhibits 17 Gigaflops of computational speed, up from 13 Gigaflops just 6 months prior. While Canada’s Atmospheric Environment Service is the highest-ranked Canadian organization, it has slipped from 12th place to 18th place in just the past 6 months. The U.S. continues to occupy a dominant position on the list while other industrialized countries such as the U.K., Germany, Switzerland, Sweden, the Netherlands, and Japan consistently appear. While this situation is bad enough, the U.S. government has recently embarked on a major HPC activity known as the Accelerated Strategic Computing Initiative (ASCI) as part of its support for its nuclear stockpile stewardship program. The first system under this initiative already heads the Top 500 list, achieving a computational speed of 1.3 Teraflops, and ASCI intends to develop a 100 Teraflops computer system by 2003.

The companion Academic Strategic Alliances Program (ASAP) of ASCI enables U.S. academic institutions and researchers to gain access to ASCI computing facilities for focused research in specific targeted areas. Canadian institutions will not be eligible for access to ASCI facilities through ASAP. As a result, Canadian universities that have previously had access to some HPC resources in the U.S. will now run the risk of falling even further behind in the attainment and use of this essential scientific infrastructure.

As an example of this, Canadian university researchers are not currently capable of treating the very large numerical problems that are at the leading edge of today's science and technology research programs. Instead, they have to study smaller and less relevant problems, and restrict themselves to niches that their current computing resources allow them to exploit.

“So what,” you might ask. Increasingly in our rapidly growing information economy, the advancement of scientific and engineering research in the nation and the resulting competitiveness of Canadian industry will be critically impaired without this capability. Consequently, it is essential that our researchers get access to our own HPC infrastructure. Universities cannot do this on their own. A concerted effort involving a government-university-industry (another possible use of the term GUI) partnership is needed to make this happen.

What are the elements of an HPC partnership?

While all parts of society are struggling through a transition from a resource-based economy to an information economy, with resulting difficult economic realities, this very situation can create opportunities that would not have been possible even a few short years ago. For too long, it was thought that simply making HPC available would stimulate usage and that, miraculously, full and effective use of the facilities would occur. However, it is now recognized that for HPC to succeed in Canada, a GUI must be created in which all three sectors work together in a partnership for progress that involves long-term commitment from all parties.

Fortunately, each sector now recognizes that science and technology efforts must be increased in order to maintain Canada’s competitiveness in the world. At the same time, each sector recognizes that it relies on the others to play their part in achieving this overall goal, including the creation of the appropriate HPC environment. As a result, promising activities are now taking place in each sector. For example:

• Universities: Universities tend to be rich in knowledge resources but poor financially. They have a responsibility both to produce highly qualified personnel from their student resources and to advance individual areas of expertise by carrying out basic and applied research which adds to the overall body of knowledge. Recent cutbacks in government funding are now forcing universities to look elsewhere and be innovative in seeking out alternate funding sources while being responsive to their clients’ needs. One example is that university-led consortia are beginning to be formed to advance HPC in Canada through self-sustaining partnerships with industry and government.

• Business: The pace of change in science and technology requires that businesses maintain a frenetic pace in technology-based product roll-outs to maintain or grow their market share. Their ability to develop these new products often relies heavily on contract research that requires cost-effective access to both HPC facilities and knowledge. This is especially true for small and medium enterprises (SMEs), which do not have the resources necessary to carry on this HPC-based research. However, without the availability of the facilities, highly qualified HPC personnel cannot be trained to meet the demand of these organizations. With over 800 technology companies in Ottawa-Carleton alone, the majority of which are SMEs, the demand for such skills is enormous. Any HPC partnership must clearly ensure that this pent-up demand is met.

• Government: In the evolving economy of Canada, increased emphasis is being placed on science and technology based business rather than the traditional resource-based industry. Government has recently recognized that it has a role to play in the support of innovative practices in these areas by providing seed funding for the necessary infrastructure and research to enable this innovation to take place. The introduction of the Canada Foundation for Innovation (CFI) program, the ongoing operation of NRC’s Industrial Research Assistance Program (IRAP), and the research grants through the Natural Sciences and Engineering Research Council (NSERC), Medical Research Council (MRC) and the Social Sciences and Humanities Research Council (SSHRC) are all examples of federal programs in this area. Similar activities within provincial governments are also taking place. HPC partnerships must seriously consider use of these programs to better ensure a successful operation.

• Non-governmental organizations: Associations of like-minded individuals and organizations exist to provide a focus on particular technology subjects and help coordinate activities related to those subjects. Two examples of organizations with particular interest in the development of infrastructure to advance HPC in Canada are:

• With the ongoing financial support of both the federal government and industry, CANARIE Inc., the Canadian Network for the Advancement of Research, Industry and Education, provides an opportunity for collaboration in stimulating the development of the Information Highway in Canada through an innovative research program and deployment of advanced networks. This, in turn, leads to an HPC sharing opportunity as an application on the Information Highway.

• C3.ca is an organization with participation from the academic, industry and government communities. Its mission is to significantly enhance the Canadian information technology and knowledge infrastructure by creating a national network of shared computational research facilities located in different regions of the country that would be interconnected by the type of advanced communications being developed by CANARIE.

These NGOs play a vital part in focusing and coordinating activity related to HPC in Canada.

One emerging HPC partnering solution

Inspired by the C3.ca vision of interconnected HPC nodes and recognizing that a critical mass of academic expertise must be harnessed to meet the industrial and governmental needs for scientific research, four universities in Eastern Ontario (Queen’s, the Royal Military College, the University of Ottawa, and Carleton) have joined together in a consortium known as the HPC Virtual Laboratory (HPCVL). This consortium has reached out to both government and industry to develop an effective technology-based partnership to meet research needs in various scientific and technology areas. By delivering an HPC service to both industry and government, HPCVL intends to ensure a sustainable operation that continues to advance Canadian HPC knowledge and facilities.

In a very short space of time, HPCVL has achieved significant progress in making the vision of a world class Canadian HPC facility an operational reality. Tasks accomplished to date include:

1. The development of partnerships with information technology vendors to not only provide the requisite hardware and software, but also work with HPCVL to:

• help build research collaborations with industry and government by providing assistance in establishing linkages to ensure sustainable, focused research
• continually enhance the HPCVL information technology environment
• provide technical support to enable effective use of the facilities
• help train and retain highly qualified personnel in HPC
• help create technological and economic benefits to Canada

2. The assembly of approximately 100 university researchers from the four partner organizations who will actively participate in HPCVL’s activities. Using the catalyst of the HPC infrastructure, the presence of this large body of knowledge and the strong interdisciplinary aspect of much of the research will enhance the value of HPCVL. Drawing on this wide range of disciplines, research activities in areas such as the following will be targeted:

• drug development in areas such as molecular design, x-ray data analysis and computational chemistry
• telecommunications and tele-simulation
• cryptography and security
• information access, analysis and dissemination using very large data bases
• health applications using tele-imaging and virtual reality technology
• aerospace and astrophysics modeling and simulation
• development of advanced materials
• environmental and energy applications in areas such as pollution control and more efficient fuel development
• financial modeling and simulation

The sharing of resources among 4 universities will lead to synergy and new collaborations in the research projects. Through this diverse membership, a variety of research and support resources will be available, enabling a cost effective pooling of talent. By contributing complementary resources, the consortium will minimize duplication of research effort and each of the partner organizations and individual members will gain access to computational resources, support expertise and research knowledge that would not normally be available to each individual organization.

3. An application for a grant under the CFI’s program for major regional or national facilities has been prepared and submitted. This application received wide support from industry, government agencies, and the individual universities. Recently, HPCVL has been accepted in the first round of review by the CFI and is now engaged in providing more detailed information to the Foundation for its second round of deliberations.

4. The establishment of a preliminary management framework for HPCVL. In addition to describing an operational structure, this framework places emphasis on full participation by all members of the consortium, the need to be operationally self-sufficient, and the need to continually advance the technical facilities and knowledge through vendor commitments that forge a linkage to progress made by the ASCI program in the U.S.

5. To expand this regional initiative, HPCVL is also actively encouraging other Ontario educational institutions to join the consortium. Coupled with similar initiatives in other regions of Canada, such as the Multimedia Advanced Computational Infrastructure Project (MACI) in Western Canada, the goal is to create a network of HPC centres of excellence across the country, coordinated by C3.ca. Through the leverage of facilities and expertise, the goal of this effort is the creation of a much-improved HPC-based research infrastructure for Canada.

Are we on the verge of a new era for HPC?

The political and economic climate in Canada now appears to be ready for progress in HPC. As the result of hard work by a committed group of people, decision makers are now recognizing that scientific and technological innovation is the key to continued economic and social progress for this country. Over the past five years, political leadership has begun to be demonstrated in this vital area.

As a consequence, business, government and universities now seem prepared to work together to bring effective use of HPC to bear on the technological, economic and social challenges of the future. With the end of an era in which significant funding for this important area was not available, Canadian HPC may now begin to rival that of our industrial partners. Through this investment, C3.ca and educational institutions can work together in the training and development of Canadians with academic qualifications in science, engineering and the arts, enabling the application of HPC to society’s important issues and problems.

With continued determination, open-eyed recognition of the risks, and a spirit of co-operation, Canada has all the elements to move the HPC agenda forward.

Want to learn more about HPC from Internet resources?

For more information on the vision and mission of C3.ca visit http://www.c3.ca

For more information about Canarie Inc. visit http://www.canarie.ca

For more information about MACI visit http://www.wnet.ca/maci

Want to see the Top 500 HPC sites? Visit http://www.top500.org

For more info about the U.S. ASCI and ASAP programs, see: http://www.llnl.gov/asci

Want to learn more about CFI? Visit http://www.innovation.ca

Want to learn more about HPC? Consider attending the HPCS’99 conference. See http://www.queensu.ca/hpcs99

Post Script: Since this article was written, HPCVL has been notified that its submission to CFI has passed the first stage of the review process. It is now undergoing a second stage review and has been asked for additional information by CFI, along with a number of other groups seeking support for HPC. In its announcement, CFI has recognized the coordinating role of C3.ca in ensuring that the most effective set of HPC capabilities are deployed. In the CFI's words, "The CFI will consider eleven projects for the provision of HPC resources, either stand alone or through regional and national networks. As part of the next stage of review, the CFI will promote further rationalization and integration linked to the C3.ca consortium in order to place a number of regional nodes across the country that are interconnected, and can service a broad segment of the research community."

In addition HPCVL has received a $1 million grant from IBM to help establish its facility. In announcing the grant, Dr. Paul Horn, senior vice president and director, IBM Research said: "Programs like the HPCVL will help unlock the full potential of high performance computing. This new era of 'deep computing' will help extract valuable information and relationships from unprecedented volumes of data, affecting everything from the design of life-saving drugs to optimizing complex business operations." In accepting the grant on behalf of HPCVL, Queen's Principal, Dr. William Leggett said:

"We are extremely pleased. Clearly Queen's and our university partners are conducting research of the highest calibre, and this award further validates the HPCVL consortium's proposal to build a high performance computing facility"

Further information about these announcements can be obtained from:

• The CFI at: http://www.innovation.ca/english/publications/indexnov10.html
• Queen's University at: http://www.queensu.ca/alumni/com/ibm.htm

About the author: Christopher Hughes is Managing Director of The Consociates Group Inc. He has had over 30 years of experience in the information technology field including the areas of high performance computing and the Internet. The Consociates Group is an IT consulting firm with a particular interest in developing partnerships between information technology and other fields for the mutual benefit of both. He can be reached at [email protected]

=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=- The meaning of “Professional”

By Don Gray

Quick! Think of three oxymorons! How about military intelligence (I was in the military), thundering silence (I’m not very silent), and software computer professional (I wish it wasn’t an oxymoron).

Why do you suppose someone would like to be a "professional"? My answer for people wanting to be professional involves reward and recognition. In the common view being a professional means being better than average. This often involves extra effort (beyond normal [whatever normal may be]). Once we achieve the "professional" status, we are rewarded (usually more money or some suitable substitute) and we get recognition from our peers, clients, and the industry in general.

The flip side of the coin is "Why do we want to do business with ‘professionals’"? Dealing with professionals gives us that added assurance that things will be done correctly. Our comfort level is "upped". If you don’t believe me, look at the alphabet soup of certifications desired in the computer want ads in this Sunday’s paper. As I think about being a software computer professional, I get hung-up on the professional part. I’m assuming that we share a common enough definition for a software computer (yes, I know this can be dangerous), but how do you define professional? Common definitions would include:

1. A person who is engaged in a profession. Some generally recognized professions are medicine, law, law enforcement, teaching, civil and professional engineering, and the military. The characteristics of these types of professions are:

• Society has given the members of the profession the exclusive right to engage in the profession. Non-members are barred from practicing the profession.
• Professions require specialized education and training (which is determined by the profession).
• Professions are self-policing. They have a code of ethics and behavior and the power and will to enforce that code.

While I wish it were, it is obvious that creating software is not this type of profession.

2. A person who does their job with great skill.

The convenient portion of this definition is that it allows us to define professional within the context of a specific instance. The problem is how do we convince those we’ve not worked with that we’re "professional"? I think this definition holds some hope for us. An Alta Vista search on software AND professionalism returned a mere 56,970 hits. There definitely seems to be some smoke, but can we find a fire?

3. A person who engages in an art or sport etc., for money, especially as a means of livelihood.

Here I believe we have the lowest common denominator for creating software. This doesn’t require a profession (which we don’t have) or great skill (and some of us seem to be lacking this also). The good news is that we still get paid. The bad news is that we still get paid.

Cynically, I'd say that this is where most "software professionals" are functioning. It's easier to excuse our lapses in definitions, our lack of care to others, and our general attitude of wanting to show that we're "hot stuff". This is where you'd place the consultant who doesn't deliver on time, the employee who writes only the code he's told to write, and the firms which fight over clients and turf like raw steak in a dog food bowl.

So what to do? Definition 1 is out and definition 3 is too inclusive. So what can we do with definition 2? In Tom DeMarco’s foreword to "The Responsible Software Engineer", three models of professionalism are described: the "Model Zero," the "3-P Model" and the "4-P Model." I would like to add a fourth, at a lower level than Mr. DeMarco's lowest level. My contribution to the list is probably the most bothersome. I call it the "Clueless Category." Can you identify yourself and other software creators, programmers and developers in the following models?

Clueless Category: The members of the Clueless Category are, well, clueless. They don’t have a clue that there are such things as standards, conventions and best practices. I’ve been around long enough to see that a number of software organizations fall into the Clueless Category.

Model Zero: Tom DeMarco says: "I intend a pejorative sense to this name, since the attitude represented by Model Zero is retrograde and offensive,… but nonetheless common. In this model, the word ‘professionalism’ is a simple surrogate for compliant uniformity." 3

This loosely translates to you doing what your boss told you to do since he’s doing what his boss told him to do since he’s doing … and so on. Should you make the poor choice to engage in non-conformity, you’ll certainly be deemed "unprofessional". I doubt that wearing a shirt and tie for 5 years made me a more professional programmer.

3-P Model: The 3-P Model uses three characteristics to describe professionalism. "The three characteristics are:

Proficient: Whatever it is that a professional does, he/she must do it with deftness and agility, and with the skill born of long practice.

Permanent: The long practice comes from the permanence of the professional’s calling…

Professing: Finally there must be some act of involvement by which the professional declares his/her intention to be, now and forever, a part of one chosen calling. The act may be a public ceremony or it may be a simple, private resolution of the form: [chosen calling] = me." 3

Tom DeMarco goes on to say: "The 3-P Model still lacks something, an ethical dimension. People we think of as professionals are governed by some kind of code. They know their profession gives them opportunities for wrongdoing, and they know what they will and will not do for ethical reasons. …a fourth P-term:

Promise-keeping: Professionals make certain promises to themselves (sometimes to the public at large) about what they will and won’t do. Professionals keep those promises". 3

Years ago I was told that to be successful all I had to do was find out what I liked to do, and then find someone to pay me to do it. Even after 20 years, I still can’t think of anything else I’d rather do than software development. I like the 3-P Model since it calls from a higher plane than "show me the money". But 3-P still won’t get us to definition 1. For that we need Tom's fourth P. If you happen to need a handy set of promises, you might want to try looking at the Draft Software Engineering Code of Ethics.4

You may also want to take a look at the code of ethics of the Independent Computer Consultants Association. While there isn't a total match, you'll find that they’re after the same thing: behavior that is responsible to society, our clients, our employers and our associates.

This then would be a start. We can be professional, and work toward creating a profession. What would it take to create a profession? There are three fundamental issues we must deal with, on which people disagree, based on the three prongs of the definition:

1. Body of knowledge: Specialized training requires us to specify what it takes to be a computer professional, for every person, not on a case-by-case basis. Some feel this is a problem for those who believe they've learned some unique way of delivering especially good software. We don't agree on what the body of knowledge is. We don't even share languages, operating systems or basic methodologies.

The body of knowledge required to be a computer professional isn’t defined in languages, operating systems or basic methods. The body of knowledge is the foundation on which the languages, systems, and methods exist. Think of the "what we need to know to do the job", not the "which particular tool do we use to do it". Consider the medical profession. Obviously a brain surgeon and a podiatrist have very different specialized training. Yet, they both belong to the same professional organization. It seems to me, then, that a network administrator and a database administrator (who granted have different specialized training) should be able to be part of the same professional organization.

2. Testing: Now we delve into who we wish to endow with the power to decide who knows. Any good ideas for this one are obliterated by the fact that every current attempt at defining testing for the software profession is only supported by a small group, which mostly does not include employees of major companies. So what is the "required core body of knowledge"? Perhaps since I don't program very much, I can't even fulfill such a requirement. Maybe I can manage it, but not do it. Or maybe everyone needs to be able to explain Codd and Date's principles of databases, whether they work with them or not. This is a topic to irritate the complacent.

Yes, this is an irritating topic. Once again the critical point is the "required core body of knowledge". This is going to take some work. Once the body of knowledge is defined, then a way of testing can be determined. Consider the Project Management Institute (yes, I know it’s not a ‘profession’ … yet). They have managed to distill and publish (available for free download from their website) a body of knowledge necessary to properly manage any project. The "certification" involves a test, and documenting when and where you’ve had experience in the field.

3. Policing: Everyone seems to have his or her own definition of right and wrong. In a world of fairly stable values, the medical, legal and accounting professions could substantiate basic requirements. But even those are carefully matched to legal protections. The existing professions primarily seem to throw out members based on felony convictions. Whoa! Isn't this taking the whole process a little far down the road? You mean all "acts discreditable" occur in the process of committing the felony? No previous warnings, behaviors or acts would have given us a clue that this person was "on the take"?

As computers (in every sense of the word) become even more prevalent and are used in more critical applications, society will demand that those involved in the computing field be policed. The arguments of "different definitions of right and wrong" and "nobody else does a real good job of policing" are diversionary tactics. Policing will happen. We have a choice: will we lead the discussion and efforts, or will a disaster and the resulting government legislation/mandates lead us?

When we have done these things, then "software computer professional" will no longer be an oxymoron.

I would like to thank Sharon Marsh Roberts for her comments on the original thinking. In fact, some of her comments were so good that they’ve become part of the thinking.

1 definitions for professional (noun) from Webster’s New World Dictionary, Third College Edition. - © 1991

2 The discussion on professions is one that Rich Cohen routinely drills in the DCI CASE Forum on CompuServe.

3 Tom DeMarco’s professional models are discussed in "PROFESSIONAL AWARENESS IN SOFTWARE ENGINEERING". I found the article at: http://www.atlsysguild.com/Site/Tom/Professionalism.html

4 Draft Software Engineering Code of Ethics by the IEEE Computer Society and ACM. On line at: http://www.computer.org/tab/seprof/code.htm

PRINCIPLES v 2.1 (I've only included the preambles for each Principle Section)

Principle 1: PRODUCT

Software engineers shall, insofar as possible, assure that the software on which they work is useful and of acceptable quality to the public, the employer, the client, and the user, completed on time and at reasonable cost, and free of error.

Principle 2: PUBLIC

Software engineers shall, in their professional role, act only in ways consistent with the public safety, health and welfare.

Principle 3: JUDGMENT

Software engineers shall, insofar as possible and consistent with Principle 2, protect both the independence of their professional judgment and their reputation for such judgment.

Principle 4: CLIENT AND EMPLOYER

Software engineers shall, consistent with the public health, safety, and welfare, always act in professional matters as faithful agents and trustees of their client or employer.

Principle 5: MANAGEMENT

A software engineer in a management or leadership capacity shall act fairly and shall enable and encourage those who they lead to meet their own and collective obligations, including those under this code.

Principle 6: PROFESSION

Software engineers shall, in all professional matters, advance both the integrity and reputation of their profession as is consistent with public health, safety, and welfare.

Principle 7: COLLEAGUES

Software engineers shall treat all those with whom they work fairly and take positive steps to support these collegial activities.

Principle 8: SELF

Software engineers shall, throughout their career, strive to enhance their own ability to practice their profession as it should be practiced.

5. ICCA Complete Code of Ethics on line at http://www.icca.org/ethics.htm

Consultants will ensure that to the best of their knowledge they can complete the project in a professional manner both in terms of skills and time.

Consultants who are unable to professionally complete part or all of the contract, will be forthright and will offer to aid the client in finding resources to complete the contract satisfactorily.

Consultants will be honest and not knowingly misrepresent facts. Consultants will not engage in contracts that are in violation of the law or that might reasonably be used by the client to violate the law.

ICCA member firms, their principals and employees will uphold the principles of the ICCA and not commit acts discreditable to the ICCA.

Consultants will divulge any potential conflicts of interest prior to accepting the contract or as soon as possible after the conflict is discovered.

Consultants will only represent opinions as independent if they are free from subordinated judgment and there is no undisclosed interest in the outcome of the client's decision.

Consultants will not take advantage of proprietary information obtained from the client.

Consultants will safeguard any confidential information or documents entrusted to them and not divulge any confidential information without the consent of the client.

Consultants will keep the client informed of any matters relating to the contract even if the information is unfavorable, or may jeopardize the contract.

Consultants will install and use only properly licensed software on their systems as well as the client systems.

Consulting firms who use subcontractors can use a non-compete clause to restrict the subcontractors from working directly with their clients for a specified period of time. If the prime contractor uses a non-compete clause in their contracts, the term of the non-compete should be one year or less.

Consulting firms will not compel independent computer consultants to work as employees when they prefer to work as independent contractors.

Consultants will devote a significant portion of time to continuing education.

This article is © 1998 Don Gray, Deltek Systems, Inc Pilot Mountain, NC. It appears on Don Gray’s web page at http://members.ols.net/~grayd and follow the "articles" links. The author receives e-mail at [email protected]

=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=- Autonomy, Visibility, Responsibility: What object-oriented technology is telling us about ourselves By Karen Mackey

We've frequently heard that art imitates life and life imitates art. More recently, the late mythologist and author Joseph Campbell suggested that, like art, our science, as a form of human expression, also imitates life and vice versa. Thus we might look to our science as a meaningful metaphor for our contemporary human existence. Naturally that brings us to the question, "what is object-oriented technology telling us about ourselves?"

There are a variety of things in object-oriented technology that do seem to reflect our human existence as individuals, as corporate entities, and as a society. For example, the transition from a master/slave program structure to a collection of collaborating individual, "autonomous" objects seems to mirror our human social transition from a feudal state to a federation of individuals. It's probably not surprising that OO originated and has been embraced by Western cultures where individualism is highly prized.

Even the notion of privacy, which is so highly valued and instituted in our US laws, is reflected in the concept of encapsulation, where no object is allowed to diddle with another object's data. The range of visibility of an object's operations (public, private, protected, and friend) parallels the levels of human behaviors that span a variety of social spheres, from intimate partnerships to family, friends, acquaintances, and strangers. For a well-socialized individual, as for a well-formed object, the "visibility" of behavior is appropriately defined - e.g. you don't discuss your most intimate feelings with new acquaintances...usually.
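
As a minimal C++ sketch of the parallel (the class and member names are invented for illustration, not taken from the article), encapsulation and the four levels of visibility might look like this:

    // Illustrative classes only: the names are invented to mirror the social analogy.
    #include <iostream>

    class Person {
    public:                              // behaviour shown to strangers and acquaintances
        Person() : innermostFeelings(42) {}
        void greet() { std::cout << "Hello.\n"; }
    protected:                           // behaviour shared with "family" (derived classes)
        void shareFamilyStory() { std::cout << "Remember that summer...\n"; }
    private:                             // intimate data no other object is allowed to diddle with
        int innermostFeelings;
        friend class Confidant;          // one explicitly trusted partner is granted access
    };

    class Confidant {
    public:
        void listen(const Person& p) {   // only a friend may read the private data
            std::cout << "I hear you: " << p.innermostFeelings << "\n";
        }
    };

    int main() {
        Person someone;
        someone.greet();                 // public: anyone may call this
        Confidant confidant;
        confidant.listen(someone);       // friend access to a private member
        // someone.innermostFeelings = 0;   // would not compile: private data is encapsulated
    }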

That an OO-designed and implemented system is made up of collaborating objects also mirrors our "new-found" corporate understanding in the US that business is accomplished by effective collaboration of individuals. Of course, we're still trying to unravel the mysteries of what "effective" means. In addition, we recognize that business opportunities sometimes are most sensibly pursued through strategic alliances with other companies. Nonetheless, within companies it is still considered proper protocol for person A who wants something from person B to raise it up the chain of command to get official agreement - i.e. A talks to his/her manager to make the request of B's manager. This is similar to what is considered good design for members of an aggregate, as sketched below. Specifically, aggregate members should work through their controllers first rather than just talk to each other.
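
A minimal Java sketch of that design protocol, with hypothetical names: members of an aggregate never hold references to one another and instead route requests through their controller, just as A goes through A's manager to reach B.

    import java.util.HashMap;
    import java.util.Map;

    class Member {
        private final String name;
        private Controller controller;             // set when the member joins the aggregate

        Member(String name) { this.name = name; }
        String getName() { return name; }
        void join(Controller c) { this.controller = c; }

        // A member asks its controller; it never talks to another member directly.
        String request(String targetName, String question) {
            return controller.mediate(targetName, question);
        }
        String answer(String question) { return name + " answers: " + question; }
    }

    class Controller {
        private final Map<String, Member> members = new HashMap<>();

        void register(Member m) { members.put(m.getName(), m); m.join(this); }

        // The controller decides whether and how a request is forwarded.
        String mediate(String targetName, String question) {
            Member target = members.get(targetName);
            return (target == null) ? "no such member" : target.answer(question);
        }
    }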

One of the tenets of good object design is to make objects coherent - focus on one purpose and do it well. Likewise, we see a continuing thrust for people to specialize in a technology or a skill. On the other hand, one of the problems we see frequently in our corporate organizations (especially in development projects) is the absence of generalists to help solve problems that have a larger scope than just the specialties of the individuals - for example, during integration of subsystems or during optimization of system performance [K. E. Mackey, "Why Bad Things Happen to Good Projects," IEEE Software, May 1996.] We might look to our successful OO systems and the role played by the system and/or subsystem controllers to get better ideas for our human organizations.

Taking this one step further, one would expect our human organizations to be in some sense isomorphic to the organization of a system under development [M. E. Conway, How do committees invent?, Datamation, 14(4), 1968.] However, how many times have you engaged the help of the "responsible" person only to discover that they don't have the power or authority to get their job done? We recognize that system architecting has a much larger element of art than of engineering. So, it's not surprising that our human organizations suffer from the same lack of architectural clarity as our systems.

Regarding human behavior, Eleanor Roosevelt said that no one can make you feel bad about yourself unless you let them. Similarly, an object cannot respond to an event if it has no corresponding operation. I'm sure that therapists around the world are counseling people to become better objects by taking ownership of their own operations and eradicating the undesirable ones. On the object side of the analogy, what if we built "learning" objects so that they could be vulnerable to importing undesirable operations (dysfunctional behavior)? Of course, our networked computers already are experiencing problems of downloading undesirable stuff. We must teach our objects not to talk to strangers.
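
In that spirit, a minimal Java sketch (invented names) of the Roosevelt analogy: the object responds only to events for which it actually defines an operation, and everything else simply has no effect on it.

    import java.util.Map;

    public class StoicObject {
        // The object's repertoire of operations, keyed by event name.
        private final Map<String, Runnable> operations = Map.of(
                "greet", () -> System.out.println("Hello."),
                "work",  () -> System.out.println("Working..."));

        public void receive(String event) {
            // No corresponding operation means no response; the event cannot "make" it react.
            operations.getOrDefault(event, () -> { /* deliberately do nothing */ }).run();
        }

        public static void main(String[] args) {
            StoicObject lee = new StoicObject();
            lee.receive("greet");    // responds
            lee.receive("insult");   // silently ignored: no such operation
        }
    }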

Another interesting mirroring of human existence in our object technology is in ternary relationships. They're a real pain to draw, the object-oriented notations don't easily accommodate them, and we usually aim to reduce a ternary relationship to a set of binary relationships. Interestingly, from a human development perspective, as noted by Jean McLendon of the Satir Institute of the Southeast, Inc., the relationships that give us the biggest problems are the triangles, both within the family and outside the family. Perhaps when we figure out how to deal with triangles more effectively, we'll be able to develop improved modeling techniques for ternary and even n-ary relationships.

Another analogous relationship to consider is that between an unreferenced object and the isolated human. Some languages/runtime environments (Java, Smalltalk) handle them and some don't. Likewise, some cultures attempt to handle the disenfranchised and some don't. And how about objects that poll (obsessive-compulsive) versus those that are interrupt-driven (not assertive enough)? How about an object instantiated from a class with multiple inheritance (children from a blended family)?
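
A minimal Java sketch of the "isolated human" analogy: once the last reference to an object is dropped it becomes unreachable, and the Java runtime (unlike some environments) will eventually reclaim it.

    public class UnreferencedDemo {
        public static void main(String[] args) {
            Object isolated = new Object();   // one reference keeps the object "in society"
            System.out.println("Reachable: " + isolated);

            isolated = null;                  // now unreferenced and eligible for collection
            System.gc();                      // merely a hint; reclamation is at the runtime's discretion
        }
    }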

At any rate, pondering the question "what does OO technology tell us about ourselves?" may give us some additional insights into ourselves. They may be profound and possibly allow us to use an OO metaphor to characterize and analyze our contemporary social problems. Or they may just be fun to explore and give us good grist for object-oriented limericks:

There once was an object named Lee,
Who did not have a good O-I-D,
He got to a state
That had no escape.
Reboot was the only known key!

Note: This article was originally published in Distributed Object Computing in March 1997. Karen Mackey is a Senior Process Engineer at Lockheed Martin Missiles & Space and the editor of the “Culture at Work” column in the IEEE Software Magazine. Previously, she was a software developer and manager at Lotus, TRW, and AT&T Bell Labs. Mackey received a PhD in computer science from Pennsylvania State University.

=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=- You Must Remember This

Ian E. Wilson, Archivist of Ontario

Note: This article is based on an address to the Professional Development Forum at GTEC’s Technology in Government Week, Ottawa, October 27, 1998. Ian Wilson, Archivist of Ontario, spoke as part of a strong Ontario Government presence this year. While GTEC has traditionally focused on the development and uses of IT in the public sector, Wilson emphasized the many information management and preservation issues that are part of our not-quite-digital age. With the help of Humphrey Bogart and Casablanca, he was both thoughtful and humorous as he explored the rapidly changing information environment. For related information you might also want to look at the Ontario archives’ WEB site on information management (http://www.gov.on.ca/mczcr/archives/english/rimdocs).

This day and age we’re living in
Is cause for apprehension,
With speed and new invention
And things like third dimension....
No matter what the progress
Or what may yet be proved,
The simple facts of life are such
They cannot be removed.
You must remember this...
A kiss is still a kiss, a sigh is just a sigh...
No matter what the future brings,
As time goes by....

“As Time Goes By”, 1931, Words and Music by H. Hupfeld

You Must Remember This

At least some of you will recall that in the movie, Casablanca, Humphrey Bogart told Sam, the piano player, not to play “As Time Goes By”. The refrain begins “You must remember this...”. But the proprietor of Rick’s Café was trying very hard to forget the past. He was unhappy that “of all the gin joints in all the towns in all the world,” Ingrid Bergman had to walk into his. If he had pictures of her and old love letters, he would have long since burned them. Rick didn’t need a paper trail to document his painful memories of their affair. He wasn’t subject to audit nor did he have to contend with freedom of information laws. He wasn’t accountable to the public. He could thumb his nose at politicians. Clearly, Rick was not a civil servant.

As time goes by for most of us, memory is not enough. In the conduct of our business within government and other organizations, we create, collect, maintain and use a vast array of documents and data in paper, electronic and other forms. These records support all operational activities and are the evidence of our individual and joint actions, decisions and transactions. In its quest for peace, order and good government, the Ontario Government and its more than 60,000 employees maintain about one and three-quarter million feet of paper records in offices, government records centres and the Archives of Ontario. Large volumes of records in other media exist as well, such as film, photographs, maps and video. More than 150 terabytes of electronic data are stored in mainframes, workstation computers and network servers.

Most of these records have a relatively short life-span. The value of the majority of government records diminishes rapidly over time and as part of its corporate information management role, the Archives authorizes their destruction. The paper records are shredded and find a new life in recycled paper products. This may be the purest example of the privatization of a basic government activity -- that some of the records of government find their greatest use as ceiling tiles.

About two or three percent of government records have continuing administrative, legal, fiscal or historical value and are transferred to the Archives. As evidence of actions and decisions, the permanent record enables governments to function effectively, to evaluate their activities and to be accountable to the public. They have many uses. They record birth, confirm death, verify divorce, document ownership, detail Cabinet decisions, document court proceedings and serve a myriad other operational, legal and research purposes. Archives provide the basis for all historical research and heritage preservation activity. They record the development of our society in all its diversity and complexity.

The public record supports the citizen’s “right to know” what their government has done and to hold it accountable. This is one of the defining characteristics of a democracy. Unless information is preserved for as long as it has value, public access rights are meaningless. As time goes by, the archival record constitutes the government’s corporate memory -- a memory we inherited, we add to and pass along to future generations.

The documents we create, collect and keep record the changing circumstances and events in our lives and the world around us. Because of the particularly rapid pace of technological development today, we sometimes like to think of change as a recent phenomenon. But throughout history, change in the economic, social, technological and other spheres has been continuous, often tumultuous in the context of the times and disorienting to those who experienced it. The evolution of the digital computer is a case in point. In the early 1800s, highly skilled weavers in France were terrified of losing their jobs when new automated Jacquard looms were introduced. These looms produced intricate patterns controlled by punch cards rather than by the weaver. In one of the earliest worker revolts related to the introduction of new technology, they threw their wooden shoes, called “sabots”, into the machinery to jam the gears. This act of defiance did little good, although it did give rise to the modern term “sabotage”. Within a few years, there were 11,000 of these looms in use in France. Soon after, the Englishman Charles Babbage, considered the father of the automatic digital computer, was inspired by the Jacquard loom to invent one of his first “analytical engines”. (And if this were not enough, Babbage also invented the locomotive cowcatcher.)[1]

Other technologies took longer to develop. In Babbage’s day, documents were created using “low tech” tools. Natural, carbon-based substances -- ink, paper and an early digitally controlled input device (a quill pen) -- were all that was needed to create the records of government, of commerce, and of personal use. Even so, these simple communications technologies linked to transportation systems based on horse power and sailing ships enabled London, Paris, Madrid and Lisbon to create and manage empires circling the globe and spanning centuries.

Whatever the attraction of the past, we live in the present -- a period we like to call the Information Age. Computer technologies have become the predominant tools for conducting our business, governmental and personal affairs. The combination of increasingly low-cost information storage, more complex information processing systems and cheaper telecommunications is transforming society, government and its institutions in ways we can but dimly perceive as yet. Several years ago a think tank of senior federal officials observed the changes in which they were participating:

“As society becomes more interconnected, complex and turbulent, more traditional ways of organizing and governing are being overwhelmed. In a more educated, interconnected, information-rich environment, governing systems predicated on a limited flow of information, including both bureaucracy and representative democracy itself, lose their credibility and authority.”[2]

The point they were making is that unless government adjusts to the realities and expectations of the information society, it will become increasingly irrelevant.

Although information technologies provide huge potential benefits, they do not, by themselves, increase our knowledge, improve the way we manage our business affairs or guarantee good government. They certainly do not ensure that important information is preserved for as long as it is needed. All of these outcomes depend on whether and how we as public sector administrators deal with important information management issues.

I would like to talk about the current information environment and the barriers and challenges we face in accessing, managing and preserving information created in a variety of traditional and new media. The concept of the Information Age suggests an abundance, indeed, even an over-abundance of information and media. A recent study confirms that we are not only dependent on information, but frequently overwhelmed by it and sometimes addicted to it.

A Reuters survey found that workers feel driven to gather as much information as possible, but that one half feel unable to cope with the information they have accumulated. Fifty-five per cent worry about making poor decisions in spite of all the information at their disposal. The study indicates that we are having a difficult time knowing whether we have the right information and whether we have enough, too little or too much. Now here’s the important part: 84% of respondents felt that information overload could be reduced if companies offered training courses specifically designed to help staff gather, manage and use information. Information management training would enable more informed decision making, better productivity, higher levels of job satisfaction and reduced stress levels among staff.[3]

Information alone is not enough. We need to know what is relevant and what we should do with it. What we really need is knowledge, that is, the ability to transform information into something useful. The information marketplace is responding to this need. Even though much of this is old wine in new and trendy bottles, consultants are beginning to promote “Knowledge Management” workshops and tools. (Knowledge management, I should tell you, is something that archivists have been engaged in for centuries.) More than 40% of Fortune 1,000 companies have named “Knowledge Managers” or put knowledge management programs in place.[4] My view is that knowledge management is fine, but that we should go for the “full monty”: Wisdom Management.

The road to knowledge, if not wisdom, starts with the ability to access the information we need. But before records can be accessed, they must be created. As a matter of course, governments, businesses and other bodies document their activities, functions, policies and decisions as a fundamental basis for conducting business, measuring outcomes, ensuring accountability and protecting their own and the public’s legal rights.

My experience in several governments, however, suggests that the many valid public policy reasons for keeping records are sometimes ignored or submerged in the pressures of day-to-day business. There is evidence that government internal communications are becoming increasingly casual, aided by the growing ease and convenience of electronic mail, voice mail, fax and similar tools. Some key decisions and directions are conveyed orally with no record of the transaction. Minutes of meetings sometimes become cryptic notes designed to obscure as much as reveal. In a legislated information access environment, the risk of not creating or keeping a record is weighed against having to produce the record and being held accountable for what it contains. Some officials fear retaining too many records rather than keeping too few.

Traditionally, there has been no penalty in law for destroying, losing or removing a record (although from time to time both civil servants and ministers have been called to task for doing these things). Just last week, however, a private member’s bill was considered in the Commons, which would impose penalties for directing, counselling or otherwise causing the destruction of, or tampering with, official records. Bill C-208 would amend the Access to Information Act and would make these actions criminal offences subject to fines of up to $10,000 and two years in jail. The bill was approved unanimously by the Commons Justice Committee and appears to have Government support.[5]

As oral and other undocumented transactions increase, it is more and more difficult to verify actions and ensure that responsibility for them is identified. The key issue is not what records do exist but what should exist to support open and accountable government. Recently, the Australian Law Reform Commission proposed that its National Archives develop documentation standards -- guidelines for the government on what documents should be created to ensure that the public record is complete.

To be useful as evidence, information in any medium must have three essential attributes -- content, structure and context. Beyond these essential requirements, the qualities that make a record useful as evidence are relevance, authenticity, accuracy and reliability.

The presence of these traits depends on whether the information is maintained in effective and reliable record keeping systems. There is much evidence, however, that government’s capacity to manage its information resources is being overwhelmed. The primary reasons for this are the growing volume and complexity of documents (originals and copies) in paper, electronic and other formats; inadequate information management and information technology skills among government staff; and outdated document management standards and tools. Other factors include frequent program restructuring and increased outsourcing, both of which have led to the loss of important documents or confusion as to who should create and keep records.

The use of computers has not reduced the need to organize and manage records and data. Within ministries, the search for needed files among cluttered local hard drives and network servers, using cryptic and uncontrolled file names, tests the skills and patience of civil servants. Other issues include the proliferation of complex and often incompatible computer systems and software; threats to the security of electronic files; the easy and often unneeded duplication of files; unnecessary printing, photocopying and paper filing; and the ease of accidental and arbitrary erasure. Huge volumes of legacy data are stored unnecessarily in active systems, slowing the search and retrieval of current information.

In many jurisdictions, including Ontario, the approach to the management of information and information technology has traditionally been decentralized, fragmented and uncoordinated. Most information and data in government are maintained in ministry and program silos. There is still much hesitation in sharing information among (and even within) ministries, a hesitation reinforced by Ontario’s privacy law. As well, there are numerous gaps and frequent duplication of information collection. There has been a preoccupation with the technology bells and whistles rather than with managing the real asset: the information itself in whatever form it may take. As a result, many information technology development projects have failed or produced disappointing results or only incremental value. Too little attention has been given to government business goals and priorities and how information may be managed and used to support program objectives.

There are also problems of program organization and professional perspective. In many ministries, related information management functions such as systems development, records management, access and privacy administration and library and research services are unconnected and uncoordinated. Too often we have separate “information technology” and “information management” islands. Information technology specialists, records managers and archivists speak in their own mysterious languages and collaboration among these groups has been sporadic. A “record” for one may simply be more “data” for another. For some, long-term data storage means months or a few years; for others, it is measured in decades and centuries. For some professionals, paper is passé; for others it is the “real thing”. Clearly, traditional records management approaches have been slow to address I.T. systems and their imperatives. On the other hand, the more technology-obsessed don’t seem to understand that we are likely to have a paperless bathroom before we have a paperless office.

In the day-to-day life of the public service, the results of not properly managing an organization’s records can range from inconvenience and inefficiency to legal and political liability, and sometimes scandal.

Several years ago, allegations of physical and sexual abuse were made by former students of Ontario training schools. Based in part on access to information in files preserved by the Archives, more than two hundred charges were laid against training school employees. In the case of Grandview Training School for girls, however, the paper trail was harder to follow. Police knew that an investigation of allegations of sexual abuse had been undertaken in the 1970s. The Archives had investigation case files from this period, but a search failed to turn up a report. Archivists searched through more than 200 feet of records. Nothing. On a hunch, they then looked in other, more unlikely locations. A copy of the report was finally found within a file about a fire at the Stratford Jail. The police copied the report and related documents and an investigation is still underway. A key question is whether the report had been inadvertently misfiled...or whether it had been deliberately buried.

The federal government has experienced better known instances of the loss or destruction of critical records. These include allegations of the cover-up and unauthorized alteration and destruction of records related to the actions of Canadian soldiers in Somalia, and the destruction of records related to management of the blood supply. More recently, the public inquiry into the actions of the RCMP and the Prime Minister’s Office around last year’s APEC conference is dealing with evidence contained in numerous electronic mails, minutes and other documents. Issues regarding the integrity of official records are in the news daily now. Other controversies revolving around records relate to the Dionne Quintuplets, the Westray Mine disaster, Native residential schools, and First Nation land claims (which are based on records dating as far back as 1763).

The unique nature of electronic records poses special challenges for information managers concerned about the long-term integrity of recorded information. In the digital environment, the definition of a record is itself elusive and evolving. Although the majority of electronic records in government and business use today are straightforward word-processed documents, spreadsheets and databases, new and more complex document forms are increasing, such as compound, multimedia and virtual documents. These new document types are raising difficult questions for information specialists, lawyers and others. How can we identify and capture discrete documents which have numerous and changing informational components? How can we know who authored them in a shared work environment? Will there be an electronic “paper trail” to allow us to audit transactions and verify transmission, receipt and subsequent action? How can we distinguish between an original record and its copy and will we need to? How will we know the authentic from the altered? In the movie, a fictional Forrest Gump was digitally inserted into historical film footage, blurring fact and fiction.

Electronic mail provides a good example of a record type which continues to confound users, information specialists and the legal profession. Email has arguably become the most pervasive medium of information exchange in government and other large organizations. Although many people treat email very casually, it is an official record when used for business purposes and it is subject to freedom of information and protection of privacy laws. Unfortunately, important messages are often arbitrarily or purposely deleted. But do we need to keep all emails and how do we decide which are important and which can be erased? How long should we keep them? How and where should they be filed? Should we print out important messages or is this defeating the purpose of the digital medium? There are answers to these questions and many organizations have developed policies to guide staff. Similar questions, however, need to be posed for other media -- including voice mail.

Electronic media and the computers and software that store and process information have shown themselves to be fragile things. The most challenging question that faces archivists, historians, the legal profession and others is this: can we hope to preserve indefinitely and be able to access the information which is created and stored in systems which have an intrinsically short life span?

Electronic media emphasize rapid communication across space. We need to ask if they will also be able to communicate across time. Unless we take steps to deal with this issue, the Information Highway that beckons so promisingly before us may disappear when we try to look in the rear-view mirror. Although this is a crucial matter for the archival profession, it should be of concern to anyone who hopes to use or provide access to information and data stored in systems more than a few years old. It raises an issue that lies at the very heart of democratic government: accountability.

When Sam played “As Time Goes By” in Casablanca, he was confident that some things never change. “A kiss is still a kiss.....a sigh is just a sigh.....fundamental things” that can be relied on “no matter what the future brings”. The same cannot be said about digital documents.

“Machine readable” means machine dependent. Floppy disks, hard disks and magnetic tape are made by gluing thin layers of magnetic oxides to plastic or metal surfaces. These layers deteriorate when the plastic shrinks or expands, the adhesive degenerates or when the magnetic particles are disrupted or lose their precise orientation. These media also deteriorate with use. The industry estimate of the useful life span of a floppy disk is about 700 “spin hours” or 3-5 years of normal use. At the U.S. National Archives, a 15-year-old magnetic tape containing White House electronic messages began to melt as it ran on a machine that spins 10 times faster than earlier models. When damage is done, the result may be one corrupted letter among thousands of words or it may be a totally unreadable program or file.

Optical disks are a little more stable. With improper manufacturing, mishandling, moisture or heat, however, the disk can be scratched, the layers can separate, and the surface can corrode.

Aside from media durability, hardware obsolescence is a major problem. As a rule of thumb, if a digital medium is well positioned in the market, it will likely be available for about 10 years. Ken Thibodeau, Director of the U.S. National Archives Center for Electronic Records, described what he called the “moving threshold” [6]:

“You can get an optical disk that may last for a hundred years, but in ten years you won’t be able to read the thing. . . .We figure we’re safe for a decade. That means that in 10 years we expect to have to copy everything onto something else, but we don't know what that will be.”[7]

Jeff Rothenberg, a senior computer scientist with the Rand Corporation, was less optimistic: “Digital documents last forever -- or five years, whichever comes first”.[8]

Software also exhibits the same “moving threshold”. Application software and operating systems come and go. The more rapidly this happens, the happier Bill Gates is. While most software companies claim to support earlier versions of their products, it is wise to assume that the latest version will only offer one or possibly two generations of backward compatibility. This contributes to the growing volume of records stored in obsolete and potentially unretrievable formats. Public key infrastructure and other encryption systems add further complications for the long term.

This rather quick tour of media longevity and systems obsolescence prompts the question: how will we ensure, as time goes by, that we can “read” those ancient CDs and other electronic formats?

The traditional approach to preserving electronic records seems, at first glance, straightforward. We maintain as long as possible the necessary software to read the record and related documentation. We periodically migrate the information from one hardware and software generation to another. We copy records to new media before the old media deteriorate.

The migration of computer files from one system or platform to another is complicated, costly, time consuming and labour intensive. This process requires a firm commitment and continuing funding to be effective. Breaking the cycle may mean the documents become irretrievable. It is typically reserved for only the most important records and data.

Sometimes it is technologically unfeasible to migrate or something may be lost in the translation, whether information content or some data processing capability. Formatting changes alone can alter the document’s content, context and structure. As well, new operating systems and application software will likely evolve very differently than those we know, posing challenges we cannot yet foresee. Over successive electronic generations, more and more is likely to be lost in conversion, calling into question the record’s reliability as evidence.

Maintaining the hardware and software used in creating the original record offers no solution. This is often impractical and doesn’t solve the media deterioration and obsolescence problem. There are few Univac repairmen still around and the same will be true for today’s PCs. On occasion, however, pulling old equipment off the shelf has saved the day. The U.S. Government used old mainframe computers from the Smithsonian museum to read census punch-card records from the 1960’s.

The option of converting electronic text documents to a universal format like ASCII should be mentioned. This is useful for straightforward text documents, although potentially important formatting features are lost. As well, there is no certainty that ASCII, Rich Text Format, HTML or other generic formats will be around for long.
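
For illustration only (this is not an Archives procedure, and the file names are hypothetical), a minimal Java sketch of such a conversion shows the trade-off directly: everything outside plain ASCII, including accents and any embedded formatting, is simply dropped.

    import java.io.IOException;
    import java.nio.charset.StandardCharsets;
    import java.nio.file.Files;
    import java.nio.file.Path;

    public class AsciiExport {
        public static void main(String[] args) throws IOException {
            String text = Files.readString(Path.of("memo.txt"), StandardCharsets.UTF_8);

            // Keep printable ASCII plus line breaks; drop everything else --
            // exactly the loss of formatting and special characters noted above.
            String ascii = text.replaceAll("[^\\x20-\\x7E\\r\\n]", "");

            Files.writeString(Path.of("memo-ascii.txt"), ascii, StandardCharsets.US_ASCII);
        }
    }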

Another option is converting text and graphic image files into computer output microfilm and microfiche. The result is a very stable and relatively compact medium, accessible on widely available analogue equipment. This format has the advantage that it can be computer-scanned in the future if and when a more stable electronic format appears. This option, however, is labour and processing intensive.

A more mundane option needs to be considered. The most common form of preservation is printing out the electronic document and storing the paper copy. This is feasible for most text and graphic files. It is estimated that 60% of all emails are printed out and filed.[9] Printing is difficult or impossible for complex document types, however. Printing a digital document often destroys many of the advantages of using the technology in the first place. It can no longer be easily accessed and distributed, processed and the data manipulated, or easily integrated with other types of information. Printing means additional storage space requirements, negating the savings made possible by the electronic format. Still, I am confident that in the days and weeks preceding January 1, 2000, printers in offices across the computerised world will be red hot.

In the wake of the Iran-Contra scandal, the U.S. courts ordered the National Archives in Washington to develop methods of preserving government email electronically. As an interim measure, the Archives issued a policy which said that offices could discard electronic versions of any government record if it had been printed out and filed. A coalition of researchers, journalists and historians concerned about the loss of information and the benefits of electronic access filed suit demanding that the Archives’ policy be withdrawn. National Archivist John W. Carlin defended the policy, citing the technical challenges of preserving electronic records and saying that the Government has no system capable of storing the huge volumes of electronic data generated by a computerised bureaucracy. The U.S. District Court ruled the Archives’ print-and-delete policy “null and void”.[10]

So how will the essential problems of preserving electronic information be solved? As yet there is no clear answer. For the short term, partial answers lie in open systems standards and decreasing dependency on proprietary hardware and software. For the long term, Jeff Rothenberg proposes a combination of approaches including the development of systems which can emulate obsolete hardware and software environments and which use detailed descriptions (metadata) of the documents. The theory is that one day, software could be created “that would let the hardware of the future mimic the hardware of today”.[11] The idea of using one computer to emulate another is not new. This is how Windows programs run on current Macintosh computers and how old video arcade games can be resurrected on modern machines. If this approach works, it will not be easy. Critics say that it is unlikely that complex, highly interactive systems will ever be documented to the level necessary to make this work. They point to the problem of long-term data preservation as “the [Mount] Everest of computer science”.[12]

While we wait for these or other solutions to appear, many of us with limited budgets and technical expertise will be choosing the most practical and affordable techniques from among those I mentioned. In many instances, files are being printed out or stored as ASCII code. In others, critical systems are being migrated to new hardware and software environments. In some cases, the Archives of Ontario is asking the offices which created the records to maintain and migrate them to agreed upon environments. Some of these are temporary and transitional measures, in the hope that problems that daunt us now will be solvable in the future.

What about the archival holdings we already maintain in their original paper and other analogue forms? Is digital technology the answer to our preservation needs? The answer is no, not now and maybe never. Are digital imaging, the Internet and other new technologies essential for enhancing access to and wider use of our existing collections? Emphatically, yes.

Within the Ontario Government, we are focusing our efforts on raising awareness of records issues and contributing to the development of corporate strategies for information technology where we bring a life-cycle information management perspective.

In addition to its traditional archival functions, the Archives of Ontario provides leadership for Ontario’s recorded information management (or RIM) program. This role is based on our authority in the Archives Act and powers delegated by Management Board of Cabinet. We develop RIM standards, guidelines and best practices to guide the management of the government’s records and we provide advice and policy-related training. A variety of plain language publications dealing with strategic and operational information management issues is available on the Archives WEB site. Because the destruction of records requires my authorization, we coordinate and have streamlined the process for determining records retention and disposal requirements. The new process emphasizes the need for departments to set priorities and not waste time on records and data which have limited value.

We are also involved in a number of corporate restructuring projects, all with substantial information management dimensions. One of the most important of these initiatives has been the development of an “Information and Information Technology Strategy” for the Ontario Government. As part of this Strategy, a high level enterprise information architecture is being developed. It identifies information as a corporate resource which must be managed throughout its life-cycle according to approved standards for recorded information management.

The Strategy requires government ministries to approach information and IT planning in a systematic way. Ministries must clearly link the way they create, collect and manage information to their annual business plan. They are also asked to report on how they manage their information holdings over their life-cycle and in all media.

The Archives contributed to the development of these “IM and IT Planning Guidelines” and has produced a bulletin on “Information Management and Business Planning” to help ministries in addressing information management issues. The bulletin also identifies the need to take proper steps to review and seek authorization for the disposal of those systems and data holdings which will not be made Year 2000 compliant.

We are also working with many ministries on the development of major I.T. systems. The most ambitious of these is the Integrated Justice Information Project. This linked group of document and workflow management systems will integrate and automate a number of record keeping and information sharing functions across the policing, courts and correctional services sectors in Ontario. Justice records are among the most important created in the public sector and among the most heavily researched collections at the Archives of Ontario. To be successful, the Integrated Justice Information System will have to ensure that these electronic records remain authoritative, reliable and secure for very long periods of time. Cases that have been reopened after many years because of new DNA or other evidence depend on the availability of earlier records. The new system will need to ensure that electronic records which are created are linked to related paper records so that the complete file is available. This will be critical, as I suspect it will be a very long time before police, lawyers and judges accept a completely paperless system.

Other important steps are being taken to deal with electronic records in a legal environment. The Uniform Law Conference of Canada has drafted a Uniform Electronic Evidence Act. The Act accepts that electronic documents are and will increasingly be submitted as evidence in litigation. It focuses instead on the need to ensure the reliability of the record keeping system in which the documents were created, transmitted and stored.

In Ontario, the Archives is trying to help provincial ministries and agencies to improve record keeping practices and design more effective electronic information systems. What are some of the most important things public sector IM and IT managers should do? Here is a list:

First, change the mindset of those who think of their records as either personal possessions or the worthless by-product of government. Information is a critical resource we manage as a public trust. Its availability and integrity must be protected for as long as the information has value, just as we would do with other assets -- human resources, finances and facilities.

Second, make sure staff understand their responsibility to document important activities, transactions and decisions. When information is missing or can’t be verified, effective and accountable program management, public access and legal rights are jeopardized.

Third, apply a life-cycle approach to the management of information in all media and systems. This includes protecting the accessibility of important data over changes in the technology environment.

Fourth, not all information is created equal. Toss out or delete information and data that have very temporary uses or are not official business records. These include “Let’s do lunch” emails and the first thirteen drafts of a report. One of our most popular publications is called “The Fine Art of Destruction”.

Fifth, provide training opportunities for staff in how to gather, evaluate, manage and use information. This will help them become real knowledge workers.

And, finally: use technology to redesign more efficient business processes and workflow. Automating old ways of managing programs and their information is a waste of time, energy and money. Look particularly at opportunities to streamline the flow of information and eliminate unnecessary paper creation and handling.

The time to make sure IT systems will function as effective record keeping systems is when they are being developed. After systems are in operation and important data is being arbitrarily or improperly deleted, it is too late. Life-cycle information management issues such as data retention, migration, disposal and archiving requirements must be dealt with when system functions and capabilities are being defined. Why can’t governments use their huge purchasing power to require that IT vendors incorporate key document management functions in new systems?

The uncritical acceptance of solutions developed largely with the private sector in mind is not sufficient. Certainly the private and public sectors have many similar requirements, at least in the short term, but the private sector has as yet no legally defined obligation regarding public access, and it is not bound by the public sector’s prime directive -- public accountability.

The failure to plan for the proper and effective management of information in I.T. systems impedes good public administration and the accountability of government. The blame doesn’t lie entirely with the I.T. community. It is no longer acceptable that both I.T. specialists and program managers ignore record keeping responsibilities and look for a technological quick fix. I would suggest that when the history of public administration in the 20th Century is written, the Year 2000 problem and its huge and unproductive costs will be seen as a major failure of public sector management. The problem was entirely foreseeable, but managers were too polite or uninformed to ask the technologists about what they were doing. For their part, technologists were preoccupied with immediate and narrowly defined objectives. As a result, technology drove management and neither the systems nor the information assets of government were being managed in the public interest. There is no longer an excuse for management’s failure to maintain the integrity and accessibility of critical records and data.

Clearly we are still adjusting to changes in how we conduct business: from technology management to information resource management; from physical documents to logical documents; from analogue to digital; from location specific to location neutral; from software dependent to software independent; from media preservation to information preservation.

The evolving information environment calls for new strategies, new responsibilities, new tools and new skills. In the short term, we will need to manage a mix of physical and electronic records. In the longer term, we will need to ensure the management of distributed records in on-line systems using “hidden” classification systems and other evolving capabilities of document management software. The permanent or archival record will need to be identified when systems are designed and tagged for automatic transfer to virtual archives in multiple locations using new standards and procedures.

We will need to redefine the skills which information managers and systems specialists require and create a team approach to issues which cross traditional professional boundaries. We need to link the objectives of the program manager, the technology skills of the systems specialist, the organizational focus of the records manager, the legal savvy of the access and privacy administrator, and the long-term perspective of the archivist.

This will not be easy and along the way, some electronic records of government, business and other sectors will be lost or become inaccessible. For most records which have very short term value, it will not matter. For others, important gaps will occur. How significant will the loss of this knowledge be? Is information from the past still of value? Can it help us to understand the present and plan for the future?

We must individually and collectively decide if these questions remain important as we address the challenges of rapid and disorienting technological change. Like Rick in Casablanca, we have to decide if we want to remember...or forget. The song, “As Time Goes By”, is a sentimental but potent reminder that some important things do not change. Each generation endeavours to learn from those who went before, seeking knowledge and even some wisdom in the recorded experience of the generations. If we remember anything, we must remember this.

Notes:

I wish to thank Andy Lipchak, Coordinator of Policy and Planning at the Archives of Ontario for his considerable assistance in the preparation of this paper.

Opinions expressed in this paper are those of the author and not necessarily those of the Government of Ontario.

[1] “Babbage, Charles” in Encyclopaedia Britannica, 15th Edition, 1988, Volume 1, p. 765; and “Jacquard Loom” in Encyclopaedia Britannica, 15th Edition, 1988, Volume 6, p. 467.
[2] Steven A. Rosell et al., “Governing in an Information Society”, Institute for Research on Public Policy, 1992, p. 91.
[3] “Glued to the Screen: An investigation into information addiction worldwide”, Reuters, 1997.
[4] “Social Studies”, The Globe and Mail, June 30, 1997.
[5] Hugh Winsor, “Liberal member tackles secrecy”, The Globe and Mail, October 21, 1998.
[6] William H. Honan, “At the National Archives, Technology’s Flip Side”, The New York Times, October 1, 1995.
[7] Ibid.
[8] Jeff Rothenberg, presentation to the Managing Electronic Records Conference (Cohasset Associates), Chicago, November 7, 1995.
[9] Mary Gooderham, “Electronic messages burying workers”, The Globe and Mail, May 14, 1997.
[10] Michael Cooper, “Federal Government Clings to Paper Records”, The New York Times, April 9, 1998.
[11] Stephen Manes, “Time and Technology Threaten Digital Archives”, The New York Times, April 7, 1998.
[12] Ibid.

=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=- Spotlight ICA

Established in 1968, ICA stands for the International Council for Information Technology in Government Administration. A non-profit international association, ICA promotes and facilitates the informal exchange of ideas, knowledge and experiences in the management, organizational impact and use of IT in government.

ICA’s current membership has representation from 23 nations: Australia, Austria, Canada, Cyprus, Denmark, Finland, France, Germany, Hungary, Ireland, Israel, Japan, Korea, Malta, Netherlands, Norway, Portugal, Slovakia, Spain, Sweden, Switzerland, UK and the U.S.A.

Through its annual conferences, study groups, regular publications and frequent contacts between members and associates, ICA provides officials in national governments with a vehicle for addressing key issues and emerging policies related to the life-cycle management of IT investments in and by the public sector. As such, ICA supports senior management in public administrations in the formulation of policies and approaches to improve the effectiveness and efficiency of government administration.

For more information about ICA and its activities, visit its web site at http://www.ica.ogit.gov.au/. ICA recently held its 32nd conference in Helsinki, Finland. Following are two articles presenting conference highlights from two perspectives: the first from Dr. Rainer Mantz, the current ICA Chair, and the second from Mr. John Riddle, Director General, Applications Management Services, Government Telecommunications and Informatics Agency.

The 32nd ICA Conference in Helsinki - a Flashback with Polar Lights

Dr. Rainer Mantz, ICA Chair

“Taking stock” was the common task that brought together the ICA community for their 32nd annual conference at Helsinki’s Kalastajatorppa Hotel, named after a former village of fishermen overlooking the shore of the Baltic Sea. But it was not the current year that ICA’s programme committee had in mind when they decided on stock-taking as the theme for this conference; rather, it was the end of the century, which happens to be a turn of the millennium as well.

Whenever this date is mentioned where experts in information technology meet these days, it will evoke at least some thoughts and remarks on the year 2000 – or Y2K – problem. Hence it did not come as a surprise to anybody that the first session of the conference was dedicated to this problem. Three countries - the Netherlands, Sweden, and the US - plus the OECD as a supranational organisation shared with the audience their experience regarding Y2K preparations and contingency planning. According to the OECD, member countries face an average risk of losses of up to 1% of GDP from the problem, which is not in itself an alarming figure. On the other hand, the OECD warns that the challenge of enabling small and medium enterprises to cope with the Y2K problem has not yet been fully met, and that the health sector as a whole must be considered critical.

This viewpoint was reinforced by Dr Verstege from the Netherlands. In that country, detailed Y2K project and contingency planning has been carried out, especially for the health care sector. Findings show that in the health sector all telecommunications, building management systems and the software for intensive care units and operating theatres are – literally – vital. To prepare for the year 2000, the Netherlands uses a combination of monitoring and supervision. Institutions that do not respond satisfactorily to questionnaires sent to them within the monitoring process will be inspected by supervisors.

By introducing the Swedish six-step model for handling the Y2K problem, Brigitta Nelson offered something like a rule of thumb for what must be done everywhere:

- create awareness,
- make an inventory,
- plan necessary action,
- prioritise what must be done,
- analyse affected components in detail, and
- adjust the systems.

Accordingly, Neil Stillman from the US presented a very similar five-phase scheme consisting of awareness, assessment, renovation, validation, and implementation. Mr Stillman focused his lecture on contingency planning, because he emphasized that repair measures are no longer feasible for an organisation that is still in one of the early phases today. Quoting a congressional evaluation, he said 9 important agencies in the US have made adequate progress with their Y2K preparations, 8 are making progress, though there is some reason for concern, and 7 agencies can only claim progress which is on the whole inadequate. In this context Mr Stillman reminded his audience that the effectiveness of contingency planning, important as it may be for the Y2K or any other problem, very much depends on a decision about the ”trigger point”: if you have a contingency plan but simply wait until the 1st of January 2000 to invoke it, it will certainly be too late.

After these words of severe warning it may have been something of a relief to hear at the beginning of the next session, from an old friend of the ICA, Ilmari Pietarinen from Finland, that the Central Co-ordinating Agencies for IT in government, the backbone of this international organisation, are still going strong. They may have changed their outfit, they certainly vary greatly in their organisation, and they undergo constant change - though on the whole at a slow rate - but their advisory role seems to remain central.

Next, a panel, consisting of Ian Barndt, Australia, John Riddle, Canada, and Neil Stillman, USA, looked more closely at one phenomenon used to co-ordinate IT in government: the Chief Information Officers (CIOs). In Australia this function has succeeded in bringing leadership, identifying opportunities for innovative solutions, and redesigning corporate services. On top of that, there seems to be a pronounced tendency to extend the role of CIOs to the management of information in general.

For Canada as well, John Riddle identified some indicators of a realignment of the CIO function towards the management of information. He mentioned that even the name had quite recently been changed from Chief Informatics Officer to Chief Information Officer. John Riddle underlined the need for an evolving role of inter-jurisdictional governance and for strategic directions in government IT use. This thought gains special momentum in view of the fact that among the 160 major players in the world economy (ranked according to their annual budgets), only 40 are states; all the others are industrial corporations. Industrial corporations invest energy in cultivating a corporate identity, whereas governments too often lose themselves in endless attempts at defining what their common infrastructure is supposed to be.

The audience learned from Neil Stillman that the U.S. established a federal council of CIOs in 1996. This council has successfully taken on a leadership role in the Y2K conversion efforts already mentioned, but finds it difficult to reach binding agreements on a federal IT architecture. Similarly, government agencies can ask for the CIO council’s advice on capital planning and investment practices, but involvement of the council is by no means mandatory.

The views of the panellists combined a variety of interesting facets into a vivid picture of the CIO phenomenon.

The second day of the conference began by shedding light on the state of cross-government solutions to on-line services. Martha Dorris of the U.S.A., who acted as session chairperson, recalled to the plenary that there was at first sight little reason why cross-government solutions should succeed at all – keeping in mind that the funding typically had to be raised by a champion that developed a solution later to be adopted by others, very often without any compensation.

Australia’s Ian Barndt then explained how the government’s own service provider Centrelink would enable its customers, i.e. Australian citizens, to electronically:

- by 1999, access all personal records,
- by 2001, ask for services 24 hours a day, 7 days a week, and
- by 2002, have 70% of all government services delivered.

A new device, called the web telephone, plays a key role in the strategy of Centrelink. These telephones offer a telephone receiver, a keyboard, a smart card reader, a screen, and a printer rolled into one. They can be installed like public call boxes and can already be used today: during a pilot project which started in September, customers can access electronic yellow and white pages, consult the Job Network’s job vacancy list, and pay electricity and telephone bills electronically.

The Norwegian Public Administration Network was introduced as an example of a cross-government system that addresses customers inside government. The Network in Norway supports procurement and paves the way for contract standardization for the benefit of civil servants in the central government, in municipalities and in local authorities. While ministries and other similarly large government organisations typically have little difficulty in observing the market themselves or hiring a private sector consultant for that purpose, this system enables easy access to low-risk and economically sound solutions at all levels of public administration.

Finland’s Kari Välmäki emphasised that in refurbishing any part of government or society, IT is a mighty tool, but nothing more than that. Knowing what kind of society we want must still precede any reform of, say, the social and health services. A reform of these services is now underway in Finland: using the client cards now being introduced, which enable electronic signatures and include the client’s personal and contact data, Finnish citizens will in the long run be able to:

 make service appointments from home,  communicate with responsible persons in the service chain without visiting their offices,  consult yellow pages,  access pharmacy and transport services, and  claim and receive refunds of medicine expenses.

Speaking of factors enabling or preventing co-operation across government departments and responsibilities, Bill Baker from the U.S. alarmed his audience by stating plainly that in many cases even basic communication among emergency response forces poses a severe problem. The ALERT system is meant to overcome these difficulties, which to date can even cost lives. ALERT stands for Advanced Law Enforcement and Response Technology, but the system can in fact improve all of public safety. Basically, ALERT allows the driver of a police patrol vehicle or a fire engine to access data and to control various input systems through a single colour touch-screen display. ALERT provides single-window access to

• radar,
• video,
• radio, and
• GPS.

ALERT is currently being tested in a pilot in the wider Washington, D.C. area, which necessarily involves several levels of the public sector: the State of Virginia, the municipality of Alexandria and some federal authorities.

Encouraging the audience to play an active role in the conference is a custom ICA has honoured for many years, this time through a new device: the Affinity Groups. Depending on their preferred topic, conference attendees met in the afternoon of the second day to discuss the role of government IT in one of four areas: public safety, education, health, and social security.

Nic Hopkins from the CCTA (UK) offered his audience food for thought in Thursday morning's session on the Internet/intranet. Firstly, he welcomed the Government Secure Intranet (GSI) for the British civil service as a strategic tool for change which might in the end turn out to be the means of bringing about the globalization of the public sector. As well, his explicit reference to a well-known non-electronic public network seems worthwhile quoting because it applies to any net: plumbing alone will not sell; you must offer hot water!

The Thursday morning session saw many more examples of worldwide or nationwide networks. Some ten years ago comparable networks were available only to big companies and very few government organisations, but now they are offered to everybody at virtually no cost because the necessary infrastructure, the Internet, is simply there.

Frank McDonough from the USA's General Services Administration cited the following Internet-based applications:

• a private company's intranet where travel claims for 7,000 staff members can be filled in and sent back to the company from all over the world, and then are managed by one person,
• Portugal's Infocid, an integrated system offering health services, social services and job information for citizens via kiosks, and
• the US Army procurement support system, which accepts electronic proposals from vendors.

On top of that, Frank ventured to predict that the Internet might well become the medium that breathes new life into legacy systems, because in web-enabled systems a mainframe computer can be integrated like any other server.

Shira Har brought us up to date on Israel's project to make statistical data available to citizens, aiming at a system that can visualize such data graphically at the user's choice, e.g. by producing pie charts on demand.

Nanne Solem Dahl and Michael Wright illustrated how web-enabled systems could, in the not too distant future, become the means of helping people find their way around public administration. Danish citizens who visit the government web site can choose among links to web pages covering virtually every life event, including birth, education, marriage, retirement and the death of a relative. Each page offers comprehensive advice on who in public administration can help.

To keep pace with innovation, this year's conference ended, apart from the traditional closing procedures, with a new way of glancing at recent developments in IT: the Vignettes, each one briefly illustrating a new product or an especially forward-looking solution.

We learned about:

• mobile phones belonging to the third generation of wireless communication (the first generation having been designed for analogue voice communication, the second for digital voice communication), which now enable multiple simultaneous data connections and will soon make wireless LANs a reality (from a Finnish manufacturer of these phones),

• significant progress in automated speech recognition and interesting approaches to automated translation in the field of weather forecasting, rounded off by the insight that fully automated translation still implies far-reaching restrictions on vocabulary and grammatical structure (from Canada),

• smart cards designed for use by all US citizens,

• geographical information systems, which can supply relevant data for all areas of government work because up to 30% of the information used by governments contains direct or implicit references to geographical data (from Hungary), and

• smart-card-based driving licenses, which might even help save lives by providing biomedical information such as the driver's blood group if, as the speaker from Israel very delicately put it, God forbid, an accident happens.

To me it seemed that the vignette session succeeded in modelling in miniature what the whole ICA conference is usually about: offering the participants, as it were, the motif, palette, and canvas to paint their own up-to-date picture of government IT in the world.

A Canadian Perspective on the 32nd ICA Conference

By John L. Riddle

The 1998 ICA Conference was held in Helsinki, Finland, and drew representatives from 26 countries (including all of the G-7) and 3 international agencies (COMNET, OECD, and the European Commission). This four-day conference, which is by invitation only, covered a wide range of IT/IM issues in a candid and most helpful manner. I wish to capture those ideas which can broadly be considered "new thinking" or which have particular relevance for Canadian public sector IT executives.

Year 2000

This is the dominant project in all jurisdictions. It is a worldwide endeavour, yet the approaches (and the seriousness) are not uniform. Against this backdrop, there are common themes: vendor compliance, contingency planning, funding, testing, measurement of performance, etc.

For European countries, the introduction of the Euro starting January 1, 1999, and conflicting regulations affecting a myriad of cross-border subjects (trade, transportation, banking, etc.) stand out as significant worries. On the latter matter of cross-border issues, it is not clear who is managing the dossier; the OECD has identified the lack of "mapping" of interfaces as a significant unresolved problem.

Most jurisdictions have zeroed in on Health Services as the most vulnerable sector ("there is no public tolerance for mistakes in this area"), with the Netherlands being the most aggressive and comprehensive in pursuing solutions. The Netherlands have included emergency services (ambulances), pharmaceutical companies (biomedical devices), care centres, handicapped/elderly residences, etc. in the scope of Health Services and are using an innovative adoption program in which major problems are assigned to large hospitals or specific suppliers for resolution on behalf of the country. Moreover, they are initiating a controlled rollover in January 1999 in order to simulate difficulties in problematic sectors such as health, energy, areas of food supply (water), and data centres running critical government applications. Sweden, as part of its contingency planning, has identified public media (TV/radio) as absolutely critical for informing the public and maintaining order in the event of a Year 2000 crisis. Since their climate and distributed population mirror Canada's, there may be a lesson here for us.

Various processes are being used to measure Year 2000 progress; no two are the same. The consequence of this random "homegrown" approach is that cross-border comparisons are meaningless. With the benefit of hindsight, a standard performance methodology (e.g. the Gartner methodology) was a relatively easy and non-controversial area where jurisdictions could have visibly cooperated. Canada is equally culpable: there are no standard measures across provinces or, for that matter, with our largest trading partner, the United States.

Two insights from the USA could affect the management of the Year 2000 problem in Canada. First, they are insisting on third-party verification of all federal/state mission-critical applications, effective immediately ("words are not enough – prove it"). Secondly, in their contingency planning and the identification of the trigger point to mobilize plans, they are advancing the activation date. The rationale is that managers and executives are always "too hopeful," and the complexity of multiple, integrated contingency plans starting in the same timeframe necessitates an early start. A further sobering thought from south of the border: based on an August 1998 review by the Y2K Council, which has Presidential visibility, only 11 of 25 federal departments are making satisfactory progress!

On a personal note, if the Year 2000 problem is not fixed, the indictment of the IT community will be profound and long lasting. It will create demand for professional credentials (e.g. project managers), for cross-industry benchmarking, for financial accountability (where's the meat? where's the efficiency? where's the productivity return?) and for more transparency and performance measures. Further, this client insistence will make no distinction between public sector and private sector; we are all in this together whether we like it or not.

IT/IM Governance

There are two aspects of governance (the process by which power is shared and decisions taken) that will be of interest to Canadian public servants. The first involves a follow-up study by ICA on the changing role of central coordinating agencies (CCAs) with respect to IT; our equivalent is the TBS. Essentially, there is a movement not to intervene in departmental responsibilities. For example, very few of the countries surveyed exercise control over IT budgets; rather, the majority of CCAs provide policy, advisory guidelines, and coordination in selected fields (e.g. security, privacy). If one links these CCA study results with the second aspect of governance, the phenomenon of Chief Information Officers, several interesting images start to emerge.

The CIO seems to be a role largely embraced by western governments, although in quite different forms: the UK has split the role between CCTA and CITU (one an operating agency, the other part of the Cabinet Office); Canada and Australia have a unitary approach with the position residing in the TBS and the Department of Finance respectively; and the USA has a CIO Council (currently 48 members) with a rotating chairperson. Broadly speaking, CIOs share a common mandate to deliver "on-line services" and to achieve "measurable efficiencies," whether the initiatives are called Connectedness, Centrelink, Channels to Market, Joined-up Government, etc. One cannot help asking whether these objectives (all with aggressive timelines) are realistic without some control of IT budgets and intervention in departmental responsibilities. Consider the remarkably candid survey results from the USA CIO Council on IT expenditures: departmental capital planning was considered by 54% to be ineffective or only satisfactory, and 48% viewed the funding of IT in their agency/department as driven more by crisis and ad hoc planning than by strategic considerations. These results were supplemented by a further admission that, without provisions for mandatory implementation and/or intervention, progress beyond a framework architecture (dare I say Blueprint) did not occur. For Canada these observations re-open considerations as to the role of the CIO in infrastructure, the control of IT expenditures, the definition of shared infrastructure and the timeless debate on optional and mandatory services.

Inter-Governmental Service Solutions

A variety of countries provided examples of horizontal services to citizens that necessitated significant co-operation across levels of government: municipal, regional, state and federal. Several common messages surfaced in presentations from Finland, Norway, the USA and Australia. Generally, persons who commit to intergovernmental solutions do so with a feeling of "making it work against all odds." They feel like volunteers for whom rewards and incentives for horizontalism either do not exist or are fleeting. This said, public servants continue to "volunteer." They represent a more resilient breed of executive who embrace the notion of "citizen engagement," build relationships, seek innovative funding arrangements, partner widely and work with a large service vision while incrementally grinding out results and outcomes. Since cross-jurisdictional linkages are the future, one might ask whether we are producing and developing such persons or letting "natural selection" determine the leaders in this arena; can we afford to be this passive?

Another lesson, derived from Centrelink (Australia), the Public Administration Network Project (Norway) and ALERT (USA), is the recognition that a separate organizational structure with different rules and processes, one that is "non-aligned" with any particular level of government, is more nimble and productive. This distinctiveness is different from the Canadian concept of SOAs or Agencies; rather, it is more akin to the relationship of General Motors to the Saturn plant: you have to start afresh while handpicking the talent you need to mobilize the vision. The case of ALERT, an on-site emergency response system (photos, biomedical tests, forms, access to databases, etc.) for police/fire vehicles, also serves to underscore how a disaster such as Oklahoma City can focus the mind and reduce red tape. Additionally, by using the University of Texas as the broker and dominant manager for ALERT, it provides an organizational model using academia which the Canadian public sector underemploys. Incidentally, the University of Texas gets all intellectual property and marketing rights for the ALERT software, from which it expects to make millions!

A concluding pearl of wisdom on inter-governmental solutions appears obvious but warrants repetition: "don't manufacture cooperation, build on it." If a community exists, if affinities are already established, then go with the flow. This is the case in Finland with the delivery of health and social services, which include real-time imaging and diagnostics as well as a smart card phone that enables "at home" patient monitoring in both urban and rural centres. This community of health professionals, including industry, was already working together and, with "light" government involvement, is dramatically increasing the scope and nature of services through information technology.

Internet/Intranet

Effectively, the Internet has taken the heat and noise out of discussions on a common infrastructure. It is, de facto, the common infrastructure. All developed countries are using the Internet to disseminate information. Some have added the ability to manipulate data and perform transactions, and many public and private organizations are making applications web-enabled (29 case studies were provided to ICA participants, including the example of MERX - Contracts Canada). Clearly, web-enabled applications represent an international trend and, if managed shrewdly, can generate significant benefits; for example, all Cisco (part of the telecommunications sector) applications are web-enabled for the use of its 7,000 employees worldwide, producing a return on investment of 1,000%, or $350 million saved.

Since the essential Internet issue, beyond the plumbing or infrastructure, is client/citizen self-help or empowerment, an interesting conundrum exists: how to provide information and services in a manner the user can intuitively find and access. The Internet is a different medium, so structuring services is vitally important; simply replicating a phone book or a library is unsatisfactory. Denmark is delving into how people move through the web, what patterns emerge, the effectiveness of diagrams/flowcharts/maps, differences by socio-economic profile or age, etc. Canada is assessing similar issues, most notably through the developers of Strategis. It is frustrating to realize that we have this immensely powerful communication medium and yet limited on-line knowledge about its users and their satisfaction with governments' attempts to serve them. Unless governments are sophisticated in how they provide Internet services (e.g. targeting by group, cross-jurisdictional services, life-cycle), then in a world of multiple distribution channels its impact will be diminished; it is hard to "align" without immediate measurements and service quality indicators.

To pursue the need for client information in a different direction, reflect on fraud detection in a worldwide electronic marketplace: does this inevitably increase pressure for a unique identifier (is a credit card sufficient?), what information management provisions will be required, will we countenance Internet matching of clients, or pattern recognition software, etc.? The adage that technology is not the problem, but rather culture, laws, societal norms and citizen acceptance, is a recurring theme.

Applied Specialty Technology

In an attempt to stay abreast of leading-edge applied technology, ICA introduced 15-minute "vignettes" on a variety of subjects, including geographic information systems, automated translation, voice recognition, smart cards (Israel has had a national smart card driver's license, including biomedical information, in use for seven years) and wireless cellular phones. It is this last example that should interest Canadian public servants, because this proven technology is coming to market so quickly. Nokia, a dominant cellular phone manufacturer, no longer thinks telephony. They are engineering and constructing exclusively with data transmission as their goal. The telephone will be a feature on their wireless products, as will a geographic locator, an integrated video camera, Internet access, etc., all weighing 40 grams. A prototype with these and other features exists now. This kind of portable power in one's pocket makes telework and hotelling somewhat passé. It also leads to the conclusion that we should put "everything" on the Internet, since one will access it anytime, anywhere via cellular technology.

Conclusions

From a personal perspective several trends or issues surface:

1. The Year 2000 problem has not galvanized and mobilized the energy of the global community. We are too ready to be comforted by data that are inconsistent. Contingency planning (including the management of communications) will be the growth area in 1999-2000.

2. Cross-jurisdictional integrated service delivery and the associated governance structures (i.e. CIOs) will bring about genuine innovation in serving citizens. The barriers around who leads, funding, visibility, etc. will fade, but incentives, participants' skill sets and selective intervention (compliance) will remain on the agenda.

3. Finally, there are two underlying threads that, because they were not explicitly addressed, are to me important harbingers and important opportunity areas: electronic document management (how can we have joined-up government without joined-up information?) and fundamental changes to the nature of work. Both subjects require our attention in a post-Year 2000 era.

In concluding this article I want to point out that, for those interested in more detailed information, Country Reports (generally 4-8 pages) are available from 17 countries. They should by now have been posted on ICA's web site at http://www.ica.ogit.gov.au/.

John Riddle is the Director General, Applications Management Services in the Government Telecommunications and Informatics Agency (GTIS). He is a very experienced executive with a long and distinguished career that involved executive stints with Statistics Canada, Treasury Board, and the Canadian Centre for Management Development (CCMD).

=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=- Data Base Publishing on the Internet

By Martin Podehl, Statistics Canada [email protected]

Internet as a dissemination channel

Since 1995, the Internet has emerged as an important dissemination vehicle for Statistics Canada. The first web site was conceived on the principle of "telling" visitors about Statistics Canada. The orientation quickly changed to that of a statistical information site, providing official statistics in a variety of formats to a variety of clientele. This supports the mandate of Statistics Canada "to collect, compile, analyze, abstract and publish statistical information relating to the commercial, industrial, financial, social, economic and general activities and condition of the people". Currently, the number of hits per day exceeds 200,000.

The advantages of the Internet as a dissemination channel have become quite obvious:

• one location where the variety of information published and released by Statistics Canada can be accessed regardless of time and distance;
• timely release of the latest information with instant access by clients;
• the opportunity to publish more in-depth information than would be feasible on paper;
• the opportunity to publish information in context by providing hyperlinks to related information such as details, explanatory notes, previously published information, quality indicators, underlying methodology, etc.;
• cost avoidance in physical distribution compared to paper publications, where each additional copy incurs costs for printing, order processing, shipping, billing, etc. On the Internet, the marginal cost of having an additional client access an existing piece of information is close to zero for both the client and Statistics Canada.

By now it is quite clear that electronic information services via the Internet, or its future variations, will become ubiquitous in society. The question is not whether, but when. The speed of the transition depends on many constantly changing factors: the adoption of microcomputers in homes; access to the Internet at work, school or home; the increase in communication bandwidth; the cost of Internet connections; and the user-friendliness of access, navigation and display.

In contrast to paper publications, the marginal cost of informing an additional client through the Internet is very low, if not close to zero. However, there are significant costs in operating an Internet site and in developing and updating content. In particular, as the content grows (Statistics Canada now has over 60,000 HTML (HyperText Markup Language) and PDF (Portable Document Format) pages on its web site), the costs of maintaining and updating individual HTML pages become significant. Industry figures indicate that, on average, 4 to 5 hours are currently required to manually create and then maintain one HTML page. Methods have to be employed through which such pages are created and/or updated in a dynamic and automated way from an organized set of information. This is referred to as data base publishing.

The principal concept of data base publishing is to separate the maintenance of the underlying information from the representation of its contents as HTML pages. This has two advantages:

• as new information is added to the database, new or updated HTML pages can be generated automatically, without any manual intervention or coding;
• by separating the two functions, improvements can be made to either function without necessarily affecting the other.

From the beginning, Statistics Canada has embraced database publishing as a fundamental design concept of its Internet service. Information on its site is grouped into categories called "information bins", with each bin representing a particular set of pages or documents of the same nature. Examples of our more popular bins are:

• The Daily: the daily news release;
• Canadian Statistics: a set of statistical tables about Canada's population, economy, etc.;
• CANSIM (CANadian Socio-economic Information Management): Statistics Canada's time series data base;
• Trade: a detailed data base of monthly commodity exports and imports;
• Downloadable publications: electronic versions of Statistics Canada's official publications;
• IPS (Information on Products and Services): a comprehensive catalogue of all products and services; and, most recently,
• The Statistical Profile of Canadian Communities: a set of tables from the recent 1996 Census covering 6,000 cities, communities and municipalities in Canada.

Some of these bins are actual databases in the sense of a DBMS (Data Base Management System). Others are organized sets of documents and pages. The following describes these bins in more detail and shows how data base publishing methods are used to make them accessible, and to inter-link them, on our Internet site.

The Daily

The most popular feature of the Statistics Canada web site is The Daily. The Daily is the vehicle for the first (official) release of statistical data and publications produced by Statistics Canada. It provides highlights of newly released data, with source information for more detailed inquiries. It contains weekly and monthly schedules of upcoming major news releases and announces new non-print products and new services. The Daily is released every working day at 8:30 am. It is written for the media but is also of great interest to analysts in government and industry. The Daily is a highly structured and thoroughly edited document. Major statistics are summarized under the rubric "Major Releases" with highlights, statistical graphs and summary tables. Other statistics are announced in short paragraphs. As well, The Daily references (as hyperlinks) the publication titles with their catalogue numbers and the matrix numbers of the time series in CANSIM (see below) which contain more details on the data just published by each statistical program.

Each issue of The Daily is added to a repository of all past issues. This growing set of individual issues functions as a database in the sense that keyword searches can be executed against all past issues. As well, links from other pages on our Internet site can reference specific issues of The Daily.

On the technical side, The Daily is produced every day as a fully encoded SGML (Standard Generalized Markup Language) document that is then rendered into a variety of dissemination and presentation formats. These formats include HTML documents on the web; print versions; e-mail messages to 1,500 subscribers of The Daily listserver; a voice-synthesized dial-up service for the visually impaired; and an ASCII text file which some secondary distributors download to their own information distribution networks.

Data base publishing in the case of The Daily means creating a structured document each day (text, tables, graphs, hyperlinks) from which all disseminated versions are derived, and adding the most recent issue as a new "record" to a repository for future access.
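
As an illustration of this single-source, many-renderings pattern, the short Python sketch below derives an HTML version and a plain-text (ASCII) version from one structured release record. The Release structure, field names and sample values are invented for the example; the production system is SGML-based rather than Python.

from dataclasses import dataclass
from typing import List

@dataclass
class Release:
    headline: str
    highlights: List[str]
    catalogue_no: str

def to_html(release):
    # Rendering for the web site.
    items = "".join(f"<li>{h}</li>" for h in release.highlights)
    return (f"<h2>{release.headline}</h2><ul>{items}</ul>"
            f"<p>Catalogue no. {release.catalogue_no}</p>")

def to_ascii(release):
    # Rendering for e-mail subscribers and secondary distributors.
    lines = [release.headline, "=" * len(release.headline)]
    lines += [f"* {h}" for h in release.highlights]
    lines.append(f"Catalogue no. {release.catalogue_no}")
    return "\n".join(lines)

# Illustrative record only; a real issue comes from the structured
# document produced by the editorial process.
todays_issue = Release(
    headline="Example major release",
    highlights=["First highlight of the release", "Second highlight"],
    catalogue_no="00-000-XXX",
)
html_version = to_html(todays_issue)
text_version = to_ascii(todays_issue)

Because both renderings are derived from the same record, the formats cannot drift apart, and adding a new distribution channel only means adding another rendering function.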

Time series database CANSIM

CANSIM is Statistics Canada's online time series database. All major socio-economic statistics are stored in great detail in CANSIM as time series of varying frequency and length, some beginning in 1914 (e.g. the monthly Consumer Price Index). The database is updated daily, and the latest data points are released at the same time as the summary information is released in The Daily. Currently, CANSIM contains about 700,000 time series. From 1973 until 1996, CANSIM data were made available to the public only through commercial online database services (e.g. Conference Board of Canada, Reuters, Wefa, Datastream, etc.) under license with Statistics Canada.

In 1996, Statistics Canada added its own commercial online dissemination service by interfacing a copy of CANSIM to its Internet site. This daily-updated database has become the source for two types of services:

• Direct online access to time series: Using an interface programmed with CGI (Common Gateway Interface) scripts for input specifications and HTML pages for output presentation, clients search the CANSIM directory metadata, select the time series of interest, specify the retrieval parameters, and pay the retrieval fee (unit pricing based on the number of time series requested). Payment is made by credit card via an electronic commerce service (operated by an Internet service provider and a bank). The time series can be downloaded to the user's microcomputer in a variety of formats. Recently a dynamic graphic display option was added. This interface, in a sense, offers the traditional online service for analytical experts, the innovation being the ease of use, the instant response via the Internet and the paperless payment method through e-commerce.

• Updating statistical tables on the Internet: Like many other national statistical offices, Statistics Canada publishes a statistical overview of Canada on its web site. This set of summary tables, referred to as Canadian Statistics, describes Canadians and their institutions. These free tables are grouped under four major themes: the Economy, the Land, the People, and the State.

In 1995, Canadian Statistics was launched with about 100 tables. Currently, there are 300 tables and the number is growing. Each table presents a certain subject, and its display has been optimized for the screen, i.e. scrolling is avoided where possible. The initial set of tables was created and kept up to date manually. It quickly became obvious that manual maintenance could not be sustained given the limited resources allocated. As most of the statistics are maintained in CANSIM, we hit upon the idea of updating the Canadian Statistics tables automatically from the Internet-interfaced copy of the CANSIM database. Software templates were developed for all tables whose data are derived from CANSIM. Each morning at 8:30 am precisely, an automated, clock-initiated process retrieves the latest data points from the CANSIM database, updates the tables, and posts them on the Internet site. The same process is also used for the Economic and Financial Data table, which is updated daily and corresponds to the data described on the International Monetary Fund's Dissemination Standards Bulletin Board (DSBB).
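
The Python sketch below conveys the flavour of such a clock-initiated update: wait until the release time, pull the latest data points from the time-series store, re-render the summary table and write out the HTML page. The series names, sample values and file name are placeholders for this example; the actual process runs against CANSIM, not this toy store.

import datetime
import time

def latest_data_points(series_names):
    # Stand-in for a retrieval from the time-series database;
    # the values here are invented purely for illustration.
    fake_store = {"series_a": 108.9, "series_b": 8.0}
    return {name: fake_store[name] for name in series_names}

def render_table(title, values):
    rows = "".join(f"<tr><td>{k}</td><td>{v}</td></tr>" for k, v in values.items())
    return f"<h2>{title}</h2><table>{rows}</table>"

def wait_until(hour, minute):
    # Sleep until the scheduled release time (e.g. 8:30 am).
    now = datetime.datetime.now()
    target = now.replace(hour=hour, minute=minute, second=0, microsecond=0)
    if target < now:
        target += datetime.timedelta(days=1)
    time.sleep((target - now).total_seconds())

if __name__ == "__main__":
    wait_until(8, 30)
    values = latest_data_points(["series_a", "series_b"])
    page = render_table("Latest indicators (illustrative values)", values)
    with open("latest_indicators.html", "w", encoding="utf-8") as f:
        f.write(page)  # in production, this step posts the page to the web site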

This update process of the Canadian Statistics tables is an excellent example of data base publishing. It has the following benefits:

• No human intervention is required to keep the tables up to date.
• The layout of all tables is consistent.
• The integrity of the figures is ensured, as they are retrieved from the verified and authorized database.
• The data are released in a timely manner and are always current.

The Canadian Statistics tables have become the most popular feature of Statistics Canada's web site (ahead of The Daily). In creating these tables we took advantage of the Internet's intrinsic ability to offer links from each table to more detailed information. For example, the specific time series in the CANSIM database can be accessed when a client wishes to view the complete historical series from which a table was derived.

CANSIM as the dissemination data warehouse

Encouraged by the success of using the existing CANSIM database as a source for electronic publishing, Statistics Canada is pursuing several developments to strengthen the role of CANSIM in this regard:

• CANSIM II: The underlying data base software for CANSIM is being redeveloped (using RDBMS software) to accommodate multi-dimensional tables, not just individual time series. CANSIM II will become the data warehouse for all macro data available on the Internet site, serving as the source for direct data access as well as for data base publishing with increased scope. (Exceptions are the data from the Housing and Population Census and the existing export/import commodity database, which continue to have their own data base systems for the time being.)

• Multi-dimensional table browsers: These software tools have recently become available (e.g. Beyond 20/20 from the company Ivation Inc., Ottawa). They allow flexible and convenient browsing of multi-dimensional tables (cubes) as two-dimensional presentations on the screen, providing powerful access to the flexibly structured database while preserving the easily absorbed presentation of statistics as flat tables or sets of time series. (A small illustrative sketch of such a two-dimensional slice follows this list.)

• Table creation software: Most paper publications in Statistics Canada consist of tables. As all statistics will be stored in CANSIM II, and as most publication titles will at some point be re-engineered to become electronic publications on the Internet, the opportunity exists to generate publication tables automatically from CANSIM II on the day of release (this is already done for some publications from CANSIM I). The required software for this function is being developed in Statistics Canada, based on SGML as the structured language for marking up tables for display in HTML on the Internet as well as in other formats.

• Custom publishing services: While demand for standard publications continues, there is a growing demand for customized, client-specific services. Database publishing from CANSIM II could be used to create custom publications for individual clients. Since there is already an electronic commerce interface on the Internet site, the associated costs can be conveniently charged to, and paid by, the client.
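
To give a feel for what such browsers do, the following small Python sketch holds a three-dimensional cube (region by year by sex) as a dictionary and fixes one dimension to produce a two-dimensional table. The dimensions and values are invented for illustration and bear no relation to the actual CANSIM II design or to Beyond 20/20.

# A toy cube keyed by (region, year, sex); the values are invented.
cube = {
    ("Region A", 1996, "All"): 100,
    ("Region A", 1997, "All"): 110,
    ("Region B", 1996, "All"): 200,
    ("Region B", 1997, "All"): 210,
}

def slice_2d(cube, fixed_dim_index, fixed_value):
    """Fix one dimension of the cube and return the remaining
    two-dimensional view as {(dim1, dim2): value}."""
    view = {}
    for key, value in cube.items():
        if key[fixed_dim_index] == fixed_value:
            reduced = tuple(v for i, v in enumerate(key) if i != fixed_dim_index)
            view[reduced] = value
    return view

# Example: hold sex = "All" fixed and present region by year.
table = slice_2d(cube, fixed_dim_index=2, fixed_value="All")
for (region, year), value in sorted(table.items()):
    print(f"{region:10} {year}  {value}")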

Catalogue and other meta data

Statistics Canada maintains and publishes two meta databases. The first, the IPS (Information on Products and Services) is already on the Internet. The second meta database, the IMDB (Integrated Meta Data Base) is derived from the SDDS (Statistical Data Documentation System) and will soon be available on the Internet.

The IPS is a comprehensive catalogue of all products and services offered by Statistics Canada. Each record pertains to a specific product or service and uses up to 60 fields to describe it in detail (e.g. catalogue number, author, abstract, subject keywords, price, contact, etc.). This database, containing about 6,000 records in both English and French, is maintained in an ORACLE DBMS on an internal file server and is continuously updated. Once a day, the latest changes are uploaded to the external Internet site and stored as HTML web pages (one page for each record). Currently, this set of HTML pages represents the IPS database on the Internet site and can be searched directly by clients looking for information. It is also accessible through links from other information bins on the site, e.g. The Daily, Canadian Statistics and CANSIM. IPS records also link to the ordering process for electronic (i.e. downloading from the Internet site) or physical product delivery. In the future, it is planned to store the actual IPS records in an SGML-enabled data base on the Internet site and to generate individual HTML pages automatically when an IPS record is requested, either through a search or through hyperlinks.
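
A hedged sketch of that incremental step, in Python: select only the records modified since the last upload and regenerate their pages. The ips_records table, its fields and the page layout are assumptions for the example; the real catalogue is maintained in ORACLE with its own upload process.

import datetime
import sqlite3

def regenerate_changed_pages(db_path, since, output_dir):
    """Rebuild one HTML page per catalogue record modified on or after `since`."""
    conn = sqlite3.connect(db_path)
    rows = conn.execute(
        "SELECT catalogue_no, title, abstract FROM ips_records WHERE modified >= ?",
        (since.isoformat(),),
    )
    for catalogue_no, title, abstract in rows:
        html = (f"<html><body><h1>{title}</h1>"
                f"<p>Catalogue no. {catalogue_no}</p><p>{abstract}</p></body></html>")
        with open(f"{output_dir}/{catalogue_no}.html", "w", encoding="utf-8") as f:
            f.write(html)
    conn.close()

# Example: regenerate everything changed in the last day (paths are illustrative).
# regenerate_changed_pages("ips.db", datetime.date.today() - datetime.timedelta(days=1), "webpages")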

The second meta database is a comprehensive description of the concepts, definitions, subjects, variables, methodologies and quality indicators of the statistical surveys and other source programs in Statistics Canada. This database was initiated in 1981 as the SDDS (Statistical Data Documentation System). It is now being enlarged and improved to become the IMDB (Integrated Meta Data Base). Each record pertains to a statistical source program such as a survey, an administrative data acquisition program, or the census. It also covers derived statistical programs, such as the various National Accounts programs, that produce statistics from primary or secondary data sources. Each record has a unique identification number (referred to as the "SDDS number") and up to 120 fields in which the various pieces of meta information about the source program are stored. In 1999, the SDDS/IMDB will be available on the Internet site for direct access through browsing or word searches, as well as through hyperlinks to and from the various other information bins using the SDDS number. For example, time series in CANSIM reference their source statistical program; hyperlinks from the CANSIM data directory to the SDDS/IMDB records will allow clients to check the source of the particular time series they have selected for access and downloading.

Downloadable Publications

Statistics Canada has started to convert publications from paper-only distribution to electronic distribution in the form of Internet-downloadable documents in HTML and PDF (specifically Adobe Acrobat) formats. This in itself cannot be classified as data base publishing. But if one views the total Internet site as a sort of structured "data base", then each publication issue can be regarded as a "record" within the publications bin, which in turn is part of the overall Internet data base. Similar to The Daily bin of all past issues, this publications bin can be searched by keywords, and hyperlinks can be used to link its records to records in other bins on our Internet site.

The Statistical Profile of Canadian Communities

This is the latest of Statistics Canada's information services on its Internet site, and it has instantly become very popular. Statistical profiles based on the recent 1996 Census are available for about 6,000 Canadian communities (cities, towns, villages, municipalities, Indian Reserves and Settlements, etc.), highlighting information on education, income and work, families and dwellings, as well as general population statistics. A mapping feature is available for viewing the location and boundaries of a community within Canada, and zooming can display streets within a community. One can access a profile by entering any of the 40,000 place names in Canada in the search field; the system will return the community that contains the specified place name. Various statistical tables can then be selected for online viewing. All of this is driven from two data bases: tables are automatically generated as HTML pages, with statistics extracted from an ACCESS data base using COLD-FUSION, and are tailored to the specific nature of certain communities, for example Indian Reserves; maps are generated using a tool kit from ESRI, with geographic files stored in another data base.
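
The lookup-and-render flow can be pictured with the small Python sketch below: a place name is matched to the community that contains it, and a profile table is generated for that community. The mapping, field names and figures are invented for illustration; the real service draws on an ACCESS database through COLD-FUSION and on ESRI tools for mapping.

# Invented mapping of place names to the community that contains them.
PLACE_TO_COMMUNITY = {
    "Sample Village": "Sample Municipality",
    "Sample Town": "Sample Municipality",
}

# Invented profile figures, keyed by community.
PROFILES = {
    "Sample Municipality": {"Population": 12345, "Dwellings": 4567},
}

def profile_page(place_name):
    """Return an HTML profile for the community containing `place_name`,
    or a not-found message."""
    community = PLACE_TO_COMMUNITY.get(place_name)
    if community is None:
        return "<p>No community found for that place name.</p>"
    rows = "".join(
        f"<tr><td>{label}</td><td>{value}</td></tr>"
        for label, value in PROFILES[community].items()
    )
    return f"<h2>Statistical profile: {community}</h2><table>{rows}</table>"

print(profile_page("Sample Village"))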

Issues

Data base publishing requires expert resources for the one-time development of the necessary databases, systems and procedures. For an occasional or infrequent publishing program, it may be simpler and cheaper to use software tools to create HTML pages manually from word-processing texts or spreadsheet data; HTML conversion tools have become easy to use. The trade-off between such a manual process and the automated data base publishing process needs to be evaluated case by case. On the other hand, once a database exists, new opportunities can be exploited.

Stringent data quality procedures have to be instituted to verify the accuracy of the information before it is entered into the database. This applies to both data and metadata. There has to be absolute confidence that the data in the database are "correct" so that automatic data base publishing can proceed without further manual verification of data quality. On the positive side, once an error has been found and corrected in the database, all future presentations extracted from it will be correct, whereas in widely distributed paper publications such errors could not be corrected.

As paper publishing is supplanted by Internet information services, the uptime of the Internet server becomes critical. If it is down, no one has access to the information. This becomes even more critical with data base publishing: if the database is down, nothing can be published.

The interface between extracting data from a data base and presenting them on clients' screens has to be based on robust, standard formats, so that a change in Internet presentation technology does not require a change in the data base access interface. Statistics Canada has had good experience with SGML in this regard. As much as possible, we build such interfaces using SGML as the interim format for transferring information from the data base layer to the presentation layer.
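
The principle can be sketched in Python using XML as a stand-in for the SGML interim format: the data layer emits a structured document, and a separate presentation step turns it into HTML. Element names and values here are assumptions for the example, not the formats actually used by Statistics Canada.

import xml.etree.ElementTree as ET

def data_layer():
    # Emit an intermediate structured document (a stand-in for SGML output).
    root = ET.Element("table", title="Example indicator")
    for label, value in [("Indicator A", "1.2"), ("Indicator B", "3.4")]:
        row = ET.SubElement(root, "row")
        ET.SubElement(row, "label").text = label
        ET.SubElement(row, "value").text = value
    return root

def presentation_layer(doc):
    # Transform the intermediate document into HTML for the web site.
    rows = "".join(
        f"<tr><td>{row.findtext('label')}</td><td>{row.findtext('value')}</td></tr>"
        for row in doc.findall("row")
    )
    return f"<h2>{doc.get('title')}</h2><table>{rows}</table>"

html = presentation_layer(data_layer())

Should the web presentation later move to a new markup or a different device, only the presentation step changes; the data layer and the intermediate format stay untouched.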

The current speed of technological change is phenomenal. New Internet access and presentation features are constantly being offered, particularly in the form of plug-ins. Of course, one should take advantage of such generally accessible features. On the other hand, many clients may not have the necessary client platform (e.g. 16-bit vs. 32-bit microcomputers) or the technical skills to deal with complicated downloads, etc. Thus a balance needs to be struck between forward-looking design and conservative assumptions about the skills and infrastructure on clients' premises.

Conclusion

The Internet has started to fundamentally change the way Statistics Canada can disseminate official statistics. The Internet offers opportunities to reach more clients with more information in a more timely way, while reducing the cost of the total dissemination process in the long run. The lower costs can only be achieved by automating as many steps as possible in the chain that produces statistics from collected survey data and puts them into the hands of clients.

In this chain, a data warehouse of published or publishable statistics (macro data) will play a pivotal role as a central staging area: survey and other statistical programs deposit their estimates into this data warehouse; the various dissemination processes retrieve data from the warehouse to be disseminated in a variety of formats and distribution channels, foremost through the Internet in the future.

Such a data warehouse must accommodate both the actual estimates as numeric values as well as all labeling, explanations, quality indicators, methodological notes, etc. associated with the statistics. It can then be the primary source for automatically publishing in an electronic format on the Internet in a variety of packages and formats.

Martin Podehl is Director of the Dissemination Division in the Marketing Branch at Statistics Canada. He can be reached at [email protected]

=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-

The contents of this file are copyright © 1998, 1999 by Sysnovators Ltd. Copying for other than personal reference without express permission of the publisher is prohibited. Further distribution of this material without express permission is forbidden. Permission should be obtained from Sysnovators Ltd., 613-746-5150, Internet: [email protected]

=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-
