
1. Business value replaces the financial perspective: “How do we look to management?”
2. User orientation replaces the customer perspective: “Are we satisfying our users’ needs?”
3. The internal perspective focuses on “Are we working efficiently?”
4. Future readiness replaces the renewal and growth perspective: “What technologies and opportunities/challenges are emerging?”18

Interestingly, the Navy’s model also encourages the use of what have come to be known as “soft measures.” These are success stories or “lessons learned” that communicate financial or other returns (e.g., the success of an operation) realized from a KM program.19 The differentiating characteristic of the Navy’s performance measurement system is its focus on what should be measured rather than how. This is clarified through a number of guiding principles that determine the variables that should be measured, the types of KM initiatives, and the various types of measures that can be used.

The First Guiding Principle: What to Measure. The Navy’s CIO clarified from the outset that it is not how you measure, but what you measure, that is important. The Toolkit explains that when it comes to knowledge assets or IC, there is a lot of confusion as to whether performance measures should calculate the value of the asset/capital or the effectiveness of the initiatives designed to leverage it.20 Without delving into theoretical discourse, the Navy’s model stresses the latter: the effectiveness of the KM initiatives is what should be measured.

But the question remains: what types of KM initiatives should be measured? The Navy classifies initiatives into three groups. The first is program and process initiatives, which relate to organization-wide activities. These are usually designed to streamline business practices and transfer best practices across the organization.21 The goal of these initiatives is to prevent “reinvention of the wheel” and duplication of error. An example of such an initiative is the management of customer relationships.

The second type of initiative relates to program execution and operation, including transferring expertise and getting the right knowledge in place to support the effective execution of operations. These initiatives should aim at facilitating collaboration and knowledge sharing to increase productivity, effectiveness, and quality. They apply to operations like R&D, manufacturing, and computer and software systems.22

The third type of initiative deals with personnel and training, that is, the development of the organization’s human capital. The goal of these initiatives is to foster employee satisfaction by improving quality of life and enhancing employees’ learning experience (e.g., fringe benefits management and distance education).
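To make the classification concrete, here is a minimal sketch in Python of how the three initiative groups might be represented; the class, field, and example names are hypothetical illustrations, not artifacts of the Navy’s Toolkit.

```python
from dataclasses import dataclass
from enum import Enum


class InitiativeType(Enum):
    """The three groups of KM initiatives in the Navy's model."""
    PROGRAM_AND_PROCESS = "program/process"          # organization-wide practices
    EXECUTION_AND_OPERATION = "execution/operation"  # knowledge for executing operations
    PERSONNEL_AND_TRAINING = "personnel/training"    # human capital development


@dataclass
class KMInitiative:
    name: str
    initiative_type: InitiativeType
    goal: str


# Hypothetical examples, one per group, echoing those mentioned in the text.
initiatives = [
    KMInitiative("Customer relationship management",
                 InitiativeType.PROGRAM_AND_PROCESS,
                 "Transfer best practices; avoid reinventing the wheel"),
    KMInitiative("Expertise transfer for operations",
                 InitiativeType.EXECUTION_AND_OPERATION,
                 "Get the right knowledge to the right operation"),
    KMInitiative("Distance education program",
                 InitiativeType.PERSONNEL_AND_TRAINING,
                 "Enhance employees' learning experience"),
]

for i in initiatives:
    print(f"{i.initiative_type.value}: {i.name} -- {i.goal}")
```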
The Navy’s model proceeds to identify the measures that can be used.

The Second Guiding Principle: Not All Measures Are the Same. Like the Intangible Asset Monitor model, the Navy’s model provides standards that point to the results that should be targeted in a KM initiative. These standards identify the final outcomes, the outputs, and the effectiveness of the initiative as a whole. Again, to measure the effectiveness of any of the three types of KM initiatives (program/process, execution/operation, and personnel/training), a mix of the three types of measures should be used.

Outcome indicators measure the impact of a KM initiative on the effectiveness of the organization as a whole. They attempt to measure things like increased productivity and the ability to meet strategic goals more effectively. Typical indicators include time, money, or personnel saved by implementing a practice, rates of change in operating costs, and improvement in quality.

Output indicators measure the direct process outputs for users, the lessons learned in capturing new business, and doing old business better. These measures attempt to monitor, in quantitative terms, how the initiative contributed to meeting the organization’s objectives. Typical indicators include time to solve problems, usefulness surveys, time to find experts, and user ratings of the value added by the initiative.

System indicators measure whether the individual systems are fully operational and deliver the highest level of service to users. They monitor the usefulness and responsiveness of identified practices and tools. Typical indicators for an IT system, for example, include number of hits, frequency of use, viability of the posted information, usability surveys, and contribution rate over time.

The indicators mentioned are specific to the initiative introduced, but they mainly aim at monitoring the effectiveness of initiatives in achieving identified goals. What makes the Navy model’s measures and indicators stand out is the assertion that measurement is only one step in a continuous process. These steps include designing, building, and implementing a program; designing performance measures; assessing those measures; and then returning to the design phase, as illustrated in Exhibit 6.4.23 Like the Navigator model, measures under the Navy’s model are not seen as indicators that have to be monitored consistently and remain the same over time, but rather as “a valuable means to focus attention on desired behaviors and results.”24 Distinctive to the Navy’s model is the use of the life cycle principle in designing the measures.

EXHIBIT 6.4 Performance Measures (a continuous cycle: design, build, and implement; measure; assess; and return to design)

The Third Guiding Principle: The Life Cycle of an Initiative. One of the main challenges that the authors of IC measurement systems face is measuring the flow rather than the stock of IC. The Navy’s model addresses the problem of flows by allowing the measures to change depending on the stage of the life cycle of the initiative being measured. This practice, according to the Navy’s Guide, is taken from the American Productivity and Quality Center’s (APQC) benchmark study of best practices in measuring KM initiatives. The APQC report25 states that a program goes through a number of stages in its life cycle, from preplanning, start-up, and pilot project to growth and expansion. Each stage determines the type of measures required. For example, the pilot project stage measures the success of the initiative in delivering real value to business objectives, such as efficiency rates achieved through the transfer of best practices. By adopting this system as a guiding principle, the Navy model tries to overcome the static nature of measurement by accommodating the dynamic nature of knowledge/value creation.
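To tie the second and third guiding principles together, the sketch below pairs the three measure types with life cycle stages. The Guide states only that each stage determines the measures required; the specific stage-to-measure mapping shown here is an assumption made for illustration, as are all names in the code.

```python
from enum import Enum


class MeasureType(Enum):
    OUTCOME = "outcome"  # organization-wide impact: time/money saved, quality gains
    OUTPUT = "output"    # direct outputs for users: time to find experts, user ratings
    SYSTEM = "system"    # system health: number of hits, frequency of use


class LifeCycleStage(Enum):
    PREPLANNING = "preplanning"
    START_UP = "start-up"
    PILOT = "pilot project"
    GROWTH = "growth and expansion"


# Assumed mapping, for illustration only: early stages lean on system and
# output indicators; mature stages add organization-wide outcome indicators.
STAGE_MEASURES = {
    LifeCycleStage.PREPLANNING: [MeasureType.SYSTEM],
    LifeCycleStage.START_UP: [MeasureType.SYSTEM, MeasureType.OUTPUT],
    LifeCycleStage.PILOT: [MeasureType.OUTPUT, MeasureType.OUTCOME],
    LifeCycleStage.GROWTH: [MeasureType.OUTCOME, MeasureType.OUTPUT,
                            MeasureType.SYSTEM],
}


def measures_for(stage: LifeCycleStage) -> list[MeasureType]:
    """Return the measure mix to design for an initiative at a given stage."""
    return STAGE_MEASURES[stage]


# A pilot-stage initiative would be assessed on output and outcome indicators,
# e.g., time to solve problems (output) and money saved by a practice (outcome).
print([m.value for m in measures_for(LifeCycleStage.PILOT)])
```

Re-selecting the mix as an initiative moves through its stages is what lets measurement follow the flow of knowledge/value creation rather than report on a static stock.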
The Navy’s model does not attempt to use the measures for external reporting, because one of the goals of measurement is to secure funding across the organization for KM initiatives and programs. As such, measures are used as transient communication tools that change according to the audience and the message intended to be delivered.

CONCLUSION

The Navy implemented KM by effecting a number of changes in its organizational structure, culture, and IT architecture. This chapter outlined those changes, using the framework presented in Chapter 5 as a guide. To implement KM effectively, the Navy introduced the CoP structure to loosen an otherwise rigid organizational structure. One of the Navy’s main means of doing this was to recognize knowledge sharing as one of its strategic objectives, highlighting how liberal knowledge sharing, rather than sharing on a need-to-know basis, would enable the Navy to achieve its mission of mastering the art of war. The emergence of KM champions at the commander level, coupled with the Navy’s awards for successful operationalization of KM strategies, gradually transformed the Navy’s secretive culture into one amenable to knowledge sharing. The Navy’s IT infrastructure also underwent major changes to respond to the knowledge needs of decision makers at all levels, which led to the development of the knowledge base. One of the drivers of the Navy’s success in KM is that it treats KM as a developing effort and hence involves academia, industry, and other government agencies to remain on the cutting edge. Because the Navy equates its final goal with becoming a learning organization, there is no limit to its success with KM.

NOTES

1. A. Bennet and Dan Porter, “The Force of Knowledge,” in A. Bennet (ed.), Handbook of Knowledge Management (New York: Springer-Verlag, to be published).
2. A. Bennet, “Knowledge Superiority as a Navy Way of Life,” Journal of the Institute for Knowledge Management, Spring/Summer 2001 (Vol. 3, No. 1), pp. 46, 48.
3. Department of the Navy, Information Management and Information Technology Strategic Plan, 2000/2001.
4. “Forward presence” is a U.S. maritime strategy assuring that forces are deployed in strategic locations worldwide to enable quick response. Supra note 3, p. 1.
5. The Knowledge Management Working Group is a U.S. government-wide group, sponsored by the Federal Chief Information Officers Council, formed to address issues relating to knowledge management; it includes experts from industry and academia.
6. A. Bennet and R. Neilson, “The Leaders of Knowledge Initiatives: Qualifications, Roles and Responsibilities,” in A. Bennet (ed.), Handbook of Knowledge Management (New York: Springer-Verlag, to be published).
7. The Toolkit.
8. Id.
9. A. Bennet, “Information Literacy: A New Basic Competency.” Available online at www.chips.navy.mil/archives/01_fall/information_literacy.htm.
10. Pilot study reported in note 1. The study identified success factors for KM as follows: culture 29 percent, processes 21 percent, metrics 19 percent, content 17 percent, leadership 10 percent, and technology 4 percent.
11. The Toolkit.
12. Supra note 1.
13. Id.
14. The Toolkit.
15. P. Senge, The Fifth Discipline: The Art and Practice of the Learning Organization (New York: Currency/Doubleday, 1990). Systems thinking is adapted from systems engineering, first pioneered by Jay Forrester as “Industrial Dynamics” in the 1960s.
16. Interview on January 10, 2002.
17. Department of the Navy CIO, Metrics Guide for Knowledge Management Initiatives (the Guide), August 2001, p. 9.
18. Id., p. 18.
19. The approach of managing and measuring knowledge through lessons learned or storytelling is also known as “anecdote management.” See Chapter 5 for more details.
20. Supra note 17, p. 5.
21. Id., pp. 29–30.
22. Id., pp. 43–44.
23. Id., p. 11.
24. Id., p. 5.
25. APQC, Measurement for Knowledge Management, February 2001. Available at www.apqc.org/free/articles/dispArticle.cfm?ProductID=1307&CFID=154242.