Check the Quality of Your Information Support

Copyright 1991 CAUSE. From _CAUSE/EFFECT_ Volume 14, Number 1, Spring 1991. Permission to copy or disseminate all or part of this material is granted provided that the copies are not made or distributed for commercial advantage, the CAUSE copyright and its date appear, and notice is given that copying is by permission of CAUSE, the association for managing and using information resources in higher education. To disseminate otherwise, or to republish, requires written permission. For further information, contact CAUSE, 4840 Pearl East Circle, Suite 302E, Boulder, CO 80301, 303-449-4430, e-mail info@CAUSE.colorado.edu

CHECK THE QUALITY OF YOUR INFORMATION SUPPORT
by Gerald W. McLaughlin and Richard D. Howard

************************************************************************
Gerald W. McLaughlin is Director of Institutional Research and Planning Analysis at Virginia Polytechnic Institute and State University. Currently a co-chair of the ad hoc committee to develop a "best of CAUSE/EFFECT" publication, he is a past chair of the CAUSE Editorial Committee and has been president of the Association for Institutional Research (AIR), chair of the AIR Publications Board, and AIR Professional File Editor. In 1989, he was a recipient of the CAUSE/EFFECT Contributor of the Year Award.

Richard D. Howard is Director of Institutional Research at North Carolina State University, and has held the same position at West Virginia University. He is currently a member of the ad hoc committee to develop a "best of CAUSE/EFFECT" publication and of the Executive Committee of AIR, and he chairs AIR's Professional Development Committee. He has also served on the CAUSE Editorial Committee, co-authored many articles published in CAUSE/EFFECT, and received the 1989 CAUSE/EFFECT Contributor of the Year Award. His work in recent years has focused on data quality issues in distributed environments.
************************************************************************

ABSTRACT: Planning and decision support processes are tools which, ideally, are designed to contribute information to individuals responsible for the administration of our institutions of higher education. These processes can produce quality results only when they are supported by quality information. This article demonstrates how to develop a checklist for evaluating the usability and effectiveness of institutional data and information. Three primary personnel responsibilities and five primary activities are identified. These define a process spanning key operational aspects of the development and maintenance of information support. Standards and examples of specific items to "check" are suggested for each function. The article's focus is on the human and management aspects of the enterprise, rather than the technical elements of hardware and software.

Providing quality information for management in higher education presents continuing challenges and opportunities. Technology provides both the opportunity to distribute information activities to separate operating offices and the ability to integrate and network these different activities. The current call is to use technology to create a "single system image" which provides a seamless environment for the user.[1] Technologists are called upon to link together information islands with new hardware and software tools.
They are encouraged to build a network "navigator" that can:

* integrate mainframe and personal computers,
* control microcomputer software distribution,
* help manage microcomputer data, files, and diskettes, and
* provide the user with helpful tools.[2]

While some are presenting the "information archipelago" as an appropriate goal, still others are proposing that we seek the "information continent," which uses systems architecture and systems logic to integrate information capability.[3]

As technical capability and aspiration increase, however, so does the awareness that these super-systems must be managed. Along with integrated technology, there must be "a system of standards and procedures that coordinates the flow of information through the company and through its various forms for the purposes of keeping the information secure, recoverable, private, available, and accurate."[4] In other words, quality information management is required for quality information.

Given the complex input-process-output nature of information resource management, how can we improve the quality of information used on our campuses? There is no simple answer, particularly as we move toward distributed systems. Quality information management and quality information require a better understanding of the steps involved in providing information support. We need to establish standards of quality for each of the steps or functions. This requires that we find out who is involved with the various functions, and identify problems with our information support so that we can decide where best to start to improve quality. The model that follows is an initial effort to create a process and a product that can monitor information support and help improve the quality of information.

THE UNDERLYING MODEL

To monitor the quality of information support processes, both personnel responsibilities and information support functions must be considered. A model which interfaces five functional steps in the information support process with three areas of personnel responsibilities is described below. Once the functional steps of the model and the personnel responsibilities have been defined, a checklist of activities and responsibilities is identified which can act as a guideline in the evaluation of useful decision support information. It should be stressed that the utility of the model and checklist is totally dependent upon the willingness of the institution to create reliable and valid information. Without motivation to improve the process, the old "garbage in, garbage out" rule of computing will apply to the information support system.

Information support functions

In our model, the process by which information support is developed from the data typically collected to support various operational functions of the institution is based on five functions: selection, capture and storage, manipulation, delivery, and influence (usefulness).[5] These functions represent a closed loop, with the usefulness of information contributing to the criteria for selection, and hence to the capture of appropriate data elements and the manipulation and delivery of relevant information. It is a dynamic process which, in a decentralized or distributed environment, occurs at many points across the campus.

Selection. What processes and events are sufficiently important to measure? Selection involves positioning information development activities by identifying key areas or events and selecting data elements which measure or define the structures of those areas or events. Some measures should be taken from census databases (that is, databases created on a fixed date); others are valid when taken from the dynamic operating files.
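The census-database idea can be made concrete with a short sketch. The Python fragment below is only an illustration: the database file, table, and column names (enrollment, student_id, and so on) are hypothetical assumptions, and the article itself prescribes no particular software.

    # A minimal sketch of freezing a census extract from a dynamic
    # operating file. Table and column names here are hypothetical.
    import sqlite3

    def take_census_snapshot(db_path, census_date):
        """Copy the live enrollment table into a frozen census table."""
        conn = sqlite3.connect(db_path)
        table = "enrollment_census_" + census_date.replace("-", "")
        # The operating file keeps changing; this table never will,
        # so reports drawn from it remain stable over time.
        conn.execute(
            "CREATE TABLE " + table + " AS "
            "SELECT student_id, level, major_code, credit_hours "
            "FROM enrollment"
        )
        conn.commit()
        conn.close()
        return table

    # Example: freeze fall enrollment on the official census date.
    # take_census_snapshot("registrar.db", "1990-10-15")

Reports that must be reproducible are run against the frozen table; operational work continues against the live file.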
Capture and storage. How and when does one capture and store data? Data elements should be captured at their source and coded consistently in categories which can answer questions. Data must be stored securely and still be accessible to those needing them. It is critical that the capture of data be coordinated through a central unit to ensure that the characteristics of each data element are known by all users. This function is often referred to as the data administration role.

Manipulation. What do the data mean? The interpretation of data requires the full documentation of their capture--the sample, conditions, and timing. Standard analytical procedures are used to translate data into information. Often manipulation requires the integration of various databases (a brief sketch of such an integration follows these function descriptions). The specific analysis is heavily dependent on the analysts' perception of users' needs.

Delivery. How is information presented? Delivery provides the user with qualitative and quantitative information. Timing involves having the information available when it is needed. The needs, analytical capability, and decision-making ability of the user must be consistent with the reports. Standard reports and graphs support ease of interpreting results.

Influence. How useful is the information? There are certain key points in organizational activities where the use of information can influence the direction or outcome of the activity. Presenting information at these critical points reduces uncertainty, influences or creates power, and focuses future events. Often the information first needs to be converted to intelligence and integrated into the user's knowledge. Evaluation of the usefulness of the information for various purposes during this function provides insights into the selection function.
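Before turning to the people involved, a brief sketch may help ground the manipulation function just described. The Python fragment below joins two hypothetical extracts--a student census file and a course registration file--to produce a simple summary; all record layouts and field names are assumptions made for illustration.

    # A minimal sketch of integrating two databases during the
    # manipulation function. Record layouts are hypothetical.
    from collections import defaultdict

    students = [  # from an assumed student census extract
        {"student_id": "001", "college": "Engineering"},
        {"student_id": "002", "college": "Arts and Sciences"},
    ]
    registrations = [  # from an assumed course registration file
        {"student_id": "001", "credit_hours": 15},
        {"student_id": "002", "credit_hours": 12},
        {"student_id": "001", "credit_hours": 3},
    ]

    def credit_hours_by_college(students, registrations):
        """Join the two files on student_id and sum credit hours."""
        college_of = {s["student_id"]: s["college"] for s in students}
        totals = defaultdict(int)
        for reg in registrations:
            # Full documentation of capture matters here: an ID with
            # no match in the census file is an inconsistency that
            # the analyst must be able to trace and resolve.
            college = college_of.get(reg["student_id"], "UNMATCHED")
            totals[college] += reg["credit_hours"]
        return dict(totals)

    print(credit_hours_by_college(students, registrations))
    # {'Engineering': 18, 'Arts and Sciences': 12}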
Information support personnel

In general, there are three roles performed by people associated with the creation of information. While each has a specific role, all must be interdependent if the information development process is to be successful in creating useful information.

Technicians. These individuals are typically responsible for the collection, maintenance, and storage of the data. In general they are responsible for hardware and software issues and generally have not been involved in data quality issues. Recently, however, the appearance of data administration functions and information centers reflects increased pressures on the traditional computer center to address data quality issues with the users. This is a direct response to increased demands for decentralized processing capabilities.

Analysts. Typically, it is through these people that the integration and manipulation of the data occur and information is created and disseminated. Before the emergence of technical capabilities that made distributed processing feasible, these people were usually found in institutional research offices. Their responsibility is to provide the link between the computer center technical people and the users of the information the analysts create.

Users. Once information is created, these people apply it in decision-making and planning activities. Because they are the primary beneficiaries of the information development process, it is critical that they be involved in the identification of the initial data that feed the process.

Responsibility for the quality of information falls on all of the personnel involved in the information support process. Technicians have major responsibility for the reliability of the data that feed the process. Both internal and external validity are the primary responsibility of the analysts. The users of the information must take direct responsibility for construct and content validity. While the above discussion identifies the primary responsibilities of the individual personnel types, it must be emphasized that the overall quality of the information is dependent on the integrated efforts of all three types of personnel.

It is within the context of the distributed computing environment[6] that the personnel responsibilities and five functions we have described are especially critical in the development of information support. The five functions provide the basis of our checklist for monitoring the information support process. The checklist that follows, based on the model described, is offered as a starting point. In using the checklist, others are encouraged to modify the values and items according to the specific situation found at their institution. The use of a checklist, focused on the characteristics of a specific institution, will help ensure reliable, valid, and useful information from all components in a distributed environment.

THE CHECKLIST

Based on the interaction of personnel responsibilities and information support functions, standards and a series of example checklist items are presented below for use in monitoring information support activities.[7] For each functional area, these statements identify a generic set of activities and responsibilities that should be addressed within the specific organizational and management structure of each institution.

Selection

The standard for a quality approach to selecting measurement criteria is content validity--the items and measures stored in the databases should start from a set of beliefs about the issues important to the institution. The elements and items should be able to test beliefs, select the best beliefs, represent the realistic complexity of the beliefs, allow for multiple methods of looking at beliefs, allow for generalization to other databases, and support causal interpretation. The set of beliefs consists of assumptions about the institution, preferred strategies, and statistics used to indicate the institution's position. Checklist items for this support function include:

* Technicians and analysts are involved in goal setting at all levels of the institution.
* There are multiple measures in most key areas.
* Everyone has a good idea of the management processes of the institution.
* Most user questions can be answered from census databases or the fact book.
* Standard definitions exist for key concepts such as "faculty" and "student" (see the sketch following this list).
* There is a set of written guidelines for information resources management (IRM) available to users.[8]
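As a concrete illustration of the last two items, the following sketch encodes one hypothetical written definition of an official "student" as a single testable rule. The fields and thresholds are invented for illustration; an actual definition would come from the institution's IRM guidelines.

    # A hedged sketch: one invented campus rule for who counts as an
    # official "student." A real rule would come from the written
    # IRM guidelines, applied identically by every office.
    def is_official_student(record):
        """Apply a single written definition wherever it is needed."""
        return (
            record.get("enrolled_credit_hours", 0) > 0
            and record.get("admit_status") == "degree-seeking"
            and not record.get("withdrawn", False)
        )

    roster = [
        {"id": "001", "enrolled_credit_hours": 12,
         "admit_status": "degree-seeking"},
        {"id": "002", "enrolled_credit_hours": 0,
         "admit_status": "degree-seeking"},
    ]
    print(sum(1 for r in roster if is_official_student(r)))  # 1

Because every office applies the same rule, the registrar, the fact book, and an external report all produce the same count--the reliability standard taken up in the next section.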
Capture and storage

The standard for quality capture and storage of data is reliability--comparable measures should yield the same or similar results. In general, this requires that steps be taken in this function to meet three standards of quality: the data must provide a stable view of the institution, they must be captured with the same value regardless of the coder, and they must be internally consistent. Checklist items for this support function include:

* A data element dictionary is readily available to analysts.
* Responsibility for data is assigned to key administrators.
* Input is audited as it is entered.
* There is an administrative systems group which coordinates databases.
* Inconsistencies in databases are identified and resolved.
* RFPs require compatibility with local standards.

Manipulation and analysis

The standard for quality manipulation and analysis of data is internal validity--measures must be interpretable. This requires that documentation be available to explain what the data mean. If the data are "cleaned," what criteria are used? Procedures used to summarize and synthesize data must be understood so that results can be interpreted. Systematic errors and misunderstandings must be avoided. Checklist items for this support function include:

* Written procedures for coding data are available to analysts.
* Those who analyze the data use standard packages.
* User groups contain users, analysts, and technicians for all major databases.
* Administrators have analytical perspectives and confidence with computers.
* Distributed databases are easy to integrate.
* Rules exist for deriving official groups of faculty and students.

Delivery

The standard for delivery of quality information is external validity--information reported to the user must be generalizable to the decisions that need to be made or to the questions that need to be answered. It must contain facts which are based on events and individuals consistent with those about which decisions are being made. To be usable, it must be accessible to those doing the reporting. Checklist items for this support function include:

* Census databases are widely available to users.
* Standard graphic and analysis packages are used.
* A calendar of key decision dates is available to technicians.
* Periodic reports are in a standard format.
* Reports tell users the extent to which results can be generalized.
* There are resources on campus for those who want to learn to use the information system.

Influencing

The standard for producing useful information that can influence outcomes is construct validity--for the information to increase the knowledge or intelligence of the user, it must be useful for anticipating, explaining, or predicting future events. As such, it must be related to the constructs which span the issues related to the success of the institution. The user must accept the information and integrate it into knowledge about the key concepts. It must be sufficiently comprehensive to meet needs, yet focused, without excessive measures. Multiple indicators of a similar construct should give convergent results. Checklist items for this support function include:

* Members of the faculty use the information system.
* Users see the information as unbiased and reputable.
* Analysts are considered ethical.
* Key administrators often meet with those who provide the information.
* Executives make frequent use of the information.
* Information providers include those who share the values of higher education and who understand the management of the college or university.
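One way to put the checklist to work is to record it as data and tally the answers to "Do we have this here?" by function. The sketch below uses an assumed structure--the items are abbreviated from the article and the output format is invented--intended only as a starting point for an institution-specific tool.

    # A hedged sketch of the checklist as a self-audit. The items are
    # abbreviated from the article; the structure and output format
    # are assumptions, to be adapted to a specific institution.
    CHECKLIST = {
        "Selection": [
            "Technicians and analysts are involved in goal setting",
            "There are multiple measures in most key areas",
            "Standard definitions exist for key concepts",
        ],
        "Capture and storage": [
            "A data element dictionary is readily available",
            "Input is audited as it is entered",
            "Inconsistencies in databases are identified and resolved",
        ],
    }

    def score(answers):
        """answers maps item text to True/False ("Do we have this here?")."""
        for function, items in CHECKLIST.items():
            met = sum(1 for item in items if answers.get(item, False))
            print(f"{function}: {met}/{len(items)} items in place")
            for item in items:
                if not answers.get(item, False):
                    # Each "no" identifies a limit on information quality.
                    print(f"  problem area: {item}")

    score({"Input is audited as it is entered": True})

Extending the dictionary to all five functions, and weighting items that matter most locally, yields the kind of institution-specific monitoring program the summary below recommends.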
Summary

Meeting the technological and managerial requirements of future information systems will not be an easy task and may require modifications in organizational structures and/or processes.[9] The decision to improve the quality of current systems is an obvious starting point in preparing for the future. Without a plan, however, the complexity of information support, coupled with the plethora of traditional wisdom, will doom such a decision to failure.

Having an effective system requires the cooperation of those who provide the information (the technologists), those who work to focus the information on user needs (the analysts), and those who need the information (the users). The awareness of likely modifications in organizational structures and processes is as likely to produce turf battles as it is to produce harmonious cooperation. The purpose of creating a model such as the one presented here is to establish in everyone's mind the fact that we all depend on each other in the quest to produce quality information. If the circle is broken at any point, then the information support will be of lower quality. The identification of the three types of individuals who have primary roles in information support shows that cooperation is required if our processes are to produce reliable and valid information in a distributed computing environment.

The purpose of the standards is to show that prior research has produced standards of quality that apply to our functions. The standards also give baselines that stimulate thinking. If we can clarify operational definitions of these measures of reliability and validity, then we can better understand and explain to others what is involved in quality information support. The checklist represents the conversion of concepts and standards into specific characteristics, for each of which we should ask: "Do we have this here?" If the answer is no, then we have identified a problem that will limit the quality of our information support.

If you want to improve the quality of information on your campus, you might start by checking the degree to which the process achieves the standards discussed above. You can build your own checklist using the one presented in this article as a starting point. As you develop your own institution-specific support monitoring program, you may find that you need to develop multiple, specific checklists for different situations within a campus-wide distributed environment.

========================================================================

Footnotes

1 Robert C. Heterick, Jr., A Single System Image: An Information Systems Strategy, CAUSE Professional Paper Series #1 (Boulder, Colo.: CAUSE, 1988).

2 Antony Halaris and Lynda Sloan, "An Implemented Strategy for Campus Connectivity and Cooperative Computing," CAUSE/EFFECT, Winter 1989, pp. 36-41.

3 A. S. Targowski and T. F. Rienzo, "Managing Information Through Systems Architecture," Information Executive, Summer 1990, pp. 43-49.

4 A. C. Cloud, "Controlling the Integrated Information Web," Information Executive, Summer 1990, pp. 37-39.

5 Gerald W. McLaughlin and Josetta McLaughlin, "Barriers to Information Use: The Organizational Context," in Peter Ewell (ed.), Enhancing Information Use in Decision Making, New Directions for Institutional Research, 64 (San Francisco: Jossey-Bass, 1989), pp. 21-33.
McLaughlin, "Bridging the Gap Between the Data Base and User in a Distributed Environment," CAUSE/EFFECT, Summer 1989, pp. 19-25. 7 The ideas presented are meant to be suggestive; in reality standards can overlap several functions. 8 See Lore Balkan and Philip Sheldon, "Developing Guidelines for IRM: A Grassroots Process in a Decentralized Environment," CAUSE/EFFECT, Summer 1990, pp. 25-29, 33; also An Information Infrastructure for the Future, a document available to CAUSE members from the Exchange Library (#CSD- 0261) that deals with the establishment of an information infrastructure and organization to support information management activities. 9 Karen Miselis, "Information Systems--A Key Unifying Force on Campus," CAUSE/EFFECT, Summer 1989, pp. 50-53; James I. Penrod; Michael G. Dolence; and Judith V. Douglas, The Chief Information Officer in Higher Education, CAUSE Professional Paper Series #4 (Boulder, Colo.: CAUSE, 1990); Karen L. Miselis, "Organizing for Information Resource Management," in Jennifer B. Presley (ed.), Organizing Effective Institutional Research Offices, New Directions for Institutional Research, 66 (San Francisco: Jossey Bass, 1990), pp. 59-70. ========================================================================