The Avoidance of
Real Knowledge Management

Rick Dove, Paradigm Shift International, www.parshift.com

Ever look at your desk on a day when you've got some really hard work ahead of you and decide that it's important to organize the desk first? Weed through all those documents and memos that have stacked up and decide which should be saved, where they should be filed, and which should be tossed? Of course you have to read through all of the stuff to make those determinations. All the while I'm doing it, I know it's procrastination - but I can't help myself - probably because I don't know how to attack the problem I'm avoiding. Eventually there's nothing left to read and file and nothing left to do but get on with my real work - though on occasion I've managed to stretch this information sorting and filing job across days - telling myself that it has to be done, that there's valuable information here that I won't be able to access if I don't catalog it - and I'll probably need some of that information when I finally tackle my real work, so this is really valuable preparation and organization. Riiight.
   Almost every company going after knowledge management now is guilty of the same thing - though it's dressed up differently and takes a lot longer. The first thing they generally do is try to identify and capture what knowledge they already have, and determine where their experts are and what their expertise is. Then they catalog and index all this information (it's not knowledge) and store it away in an on-line file system. Eventually they try to get people to use these files to save time by not reinventing solutions that are already known, and to save time by asking an expert for the answer instead of figuring it out for themselves. How much stuff do you suppose gets captured and cataloged and filed that is never of any use thereafter - either because it's really old news and inapplicable in new situations, or simply because nobody ever consults it?

Companies have managed knowledge,
and people have collaborated,
since time began.


But they point with pride to what they've come to think is a real knowledge management system. After all, most of the experts say that this is what has to be done. And the experts know because they've already done it for themselves and for others, and they'll help get it started for you, too. The results are refreshingly tangible: you can count the bytes of captured "knowledge", you can count the number of accesses to this so-called knowledge base, and you can count the man-hours and dollars involved in creating and maintaining this resource. Numbers that all measure some form of performance and prove that the company is taking knowledge management seriously.
   And there's more to do - more man-hours to measure and more information technology to buy. Those same experts and a few new ones can now show that it is at least equally important to develop and support communities of practice - one of the most effective forms of collaborative learning. This makes a lot of sense and has a nice clear path to follow: you buy stuff from computer vendors and from software vendors. Things like network servers and Lotus Notes, maybe even collaborative project notebooks and whiteboard services on the Internet. Once the infrastructure is up and running it's time to get some users and create those communities of practice - you know - people who have common work interests even though they reside in different places throughout the company, people who want to work together on the corporate intranet to swap knowledge and help each other do better faster.
   If you build it they will come. Not! The experts are back again, saying that they've learned from experience in their own company and with others that there isn't a natural pent-up demand for this new infrastructure - after the ribbon-cutting ceremony a few people are eager to exploit the system, but not many. So the experts are ready to help with the next phase: cultural reprogramming. They understand this phase well from first-hand experience - they've lived through their own indoctrination period successfully. Of course their people are all MBAs who naturally make a living by communicating and collaborating.
   But they'll show you that it works in industry, too. British Petroleum is living proof of that. BP began its $12 million pilot program in 1995, and "about a third of the money was spent on behavioral scientists who helped the people in the pilot programs learn how to work effectively in a virtual environment" [Harvard Business Review, Sep-Oct '97, p. 153]. BP has good bottom-line results to show for it: oil drillers who ran into really expensive problems got on the network and found others who had solved these very same or similar problems before. It's been quite effective for BP, and made good times of idle time for the drillers isolated on those drilling platforms - kind of like ham radio with a video link.
   Companies have been managing knowledge, and people have been collaborating in communities of practice, as long as there have been companies and people. This is not new stuff; this is not unnatural stuff. What is new, and useful, is the technology that can broaden the communities to include people who may never shake hands as well as people who are simply somewhere else. What is unnatural is the new abstraction of knowledge management into something artificial, something that can be dreamed up anew from logic and ideal visions. Something that can be purchased like a box of Cheerios - just add milk to nourish and reinvigorate the corporation.
  

   
   For a few years in the mid-nineties the Agility Forum provided a model of natural knowledge management facilitated and focused to a specific end. The Forum had two principal objectives: 1) to facilitate the discovery and creation of knowledge about enterprise agility, and 2) to facilitate the adoption and application of that knowledge by industry. It could have taken its meager funds and employed the more-than-competent resources available at Lehigh University to research the field, capture appropriate information, draw reasonable conclusions, and produce books and documents full of valuable information - which would likely go unread for the most part, especially by the people with line responsibility who needed the understanding the most.
   Or it could have pursued one of two paths suggested by various interested factions: 1) the directed approach, which convenes a blue-ribbon group to determine precisely what knowledge is needed and then engages research to specifically address the agenda, or 2) the grass-roots approach, which brings together people in the trenches facing real problems and facilitates collaboration among those with similar interests. Approach one runs the risk of generating a pile of knowledge that goes unused by real people. Approach two runs the risk of solving real problems that have little if any long-term strategic use.
   These same choices are faced by corporations every day: the dichotomy between those who would direct a solution from the top down and those who would encourage a solution from the bottom up. Both approaches can demonstrate advantages - and either can work quite well in a corporation with a strong unified culture aligned to the approach. More and more, however, we are finding mixed cultures at corporations as they merge across markets, across industries, and across seas. We're also finding strong control cultures opening the debate on emergent and empowered initiative, and we're finding consensus cultures beginning to see value in directed leadership.
   The Agility Forum blended approaches successfully. Not from infinite foreknowledge and wisdom - it simply had no choice, at least in the blending part. As to success, there are plenty of ways to blend and fail - compromise is often one. The Forum needed industry involvement at a committed and intimate level or it would never meet its second objective: the adoption and application of knowledge. The committed level meant that people with strategic responsibilities, and the control of resources, had to get behind the interest in agility. The intimate level meant that people fighting real problems had to work together to find solutions...to the problems that kept them up at night.
   The Forum convened a group it called the Strategic Analysis Working Group (SAWG), whose purpose was to identify an agenda of critical knowledge development necessary for understanding agility in organizations. Group composition rotated over time and was designed to represent various industries, labor unions, academic groups, government, and related organizations such as the National Center for Manufacturing Sciences. People were sought for their understanding of the issues, their real interest in, use for, and need of an agility knowledge base, and their ability to influence the communities they represented.
   The Forum provided a facilitation service group which did the logistical and administrative work for both the SAWG and the many individual collaborative groups that were independently formed among people with real needs and similar problems. This facilitation group administered the formation and logistics of collaborative learning groups, and accepted initiatives for new ones from virtually anywhere. However, they also "packaged" and sold the SAWG agenda, actively seeking people who had problems aligned with the strategic agenda, and then supported them in common pursuit. They also provided the information technology that stored and cataloged the results, and supported the communities of practice that emerged from the collaborative learning groups.
   The Forum wasn't active long enough in this industry-involvement mode to really mature the infrastructure it had started to build. Community of practice support, for instance, was only in its infancy, as were effective management and search methods for the knowledge repository. Notably, these IT areas were not the lead areas. The Forum led with real people's interest in real problems - both at the strategic level and at the operational level.
   This model can provide a strawman solution to satisfy those who find Knowledge Management too abstract and too distant from the real world of the organization, and at the same time satisfy both those who need a strategic approach and those who favor a grass roots activity.

© 1999 RKDove - Attributed Copies Permitted
Essay #053 - Published 5/99 in Automotive Manufacturing & Production, Gardner Publications


Would you like to offer some thoughts or add to the dialog? Your sending of a comment automatically grants us permission to edit and post at our discretion. Send your comment to
========= Reply =========================
From: "Darrell L. Ross" Darrell.Ross@ey.com, Date: Sun, 16 May
Loved your Cheerios quote!!

========= Reply =========================
From: m.crawford@deuk.co.uk (Mark Crawford), Date: Tue, 1 Jun 1999
What about measuring your performance? Models are good at showing you what to do, but how do you implement it?
How do you know how well you are managing? With direct measures it is much easier to stay motivated and to implement. Indirect measures such as cost savings and lead-time reduction do not provide adequate evaluation.

========= Reply =========================
From: (Rick Dove), Date: Tue, 1 Jun 1999
Good point, Mark. Indirect measures are often like statistics - you can use them to make any point you want, especially when it is difficult to show direct and unequivocal causal effect. But direct measures also present difficulties. They often measure the wrong things just because they are the only (or the easiest) things to measure directly.

Like measuring the hit count on a knowledge-repository web site, for instance. I'll pick on this issue to make my point, but any issue would do.

A hit on a web page could simply mean someone referenced it and found it inapplicable - often the case when running through a list of search engine results. This is actually worse than it sounds. It is a performance count in the positive direction that actually should be negative as it wasted someone's time.

Far worse, however, are the hits that a web page counts that are done by robots and not people. Search engine robots account for a large number of page hits as the engines attempt to update their indices frequently. These days the robotic "readers" are inflating page-hit numbers even higher by downloading an entire site for presumed off-line reading later by a human.

I am suspicious that I have some pages on my web site that have a higher percentage of robotic readers than human ones - but if I implemented a page-hit-count meter it would indiscriminately count them all. Friends have suggested that I place a hit count meter on the "home" page to impress visitors with the magnitude of the company they keep. Unfortunately more visitors enter the site through other portals than ever refer to the home page - as they follow a link to a specific piece of information. This is a growing phenomenon on our web site as an increasing number of university courses employ selected direct-linked pages as mandatory course reading.
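Separating the robotic readers from the human ones is in fact straightforward if you look at the server logs rather than a raw page-hit meter. The sketch below is only an illustration, not our actual setup: the log format is the common Apache "combined" layout, and the bot-identifying substrings and sample log lines are assumptions for the example.

```python
import re
from collections import Counter

# Combined-log-format line ends with: "request" status size "referer" "user-agent"
LOG_PATTERN = re.compile(r'"[^"]*" \d{3} \S+ "[^"]*" "(?P<agent>[^"]*)"$')

# Illustrative substrings that mark well-known crawlers; a real deployment
# would maintain a longer, regularly updated list.
BOT_TOKENS = ("bot", "crawler", "spider", "slurp")

def classify_hits(log_lines):
    """Split page hits into 'human' vs. 'robot' by User-Agent substring match."""
    counts = Counter()
    for line in log_lines:
        m = LOG_PATTERN.search(line)
        agent = m.group("agent").lower() if m else ""
        kind = "robot" if any(t in agent for t in BOT_TOKENS) else "human"
        counts[kind] += 1
    return counts

# Two hypothetical log entries: one browser visit, one crawler visit.
sample = [
    '1.2.3.4 - - [01/Jun/1999:10:00:00 +0000] "GET /essay053.htm HTTP/1.0" 200 5120 "-" "Mozilla/4.0"',
    '5.6.7.8 - - [01/Jun/1999:10:00:05 +0000] "GET /essay053.htm HTTP/1.0" 200 5120 "-" "Googlebot/2.0"',
]
print(classify_hits(sample))  # one human hit, one robot hit
```

A naive page-hit meter would report two hits for this sample; the User-Agent split shows that only one came from a person - which is exactly the distortion a raw count hides.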

I am pleased that some of our material is deemed useful in university courses. But counting these genuine human readers is also a misleading performance metric - for us - as our objective for this knowledge repository is to affect the way people approach the solution of real problems in business.

There are more examples of misleading metrics, but the point should be clear. Trying to measure the value of knowledge management activities is bound to distort the process, as people evaluated on these metrics begin to manage the metrics. It is a simple and seductive matter for a webmaster to run up a page-meter count, for instance, and it can even be done quasi-legitimately.

Better to design a KM practice that "pays as it goes" - meaning the acquisition/creation/cataloging activities have a direct and (relatively) immediate pay-off for the people and organizations involved. Creating a priesthood of knowledge managers will undoubtedly get lots of knowledge managed, but what we really want is to see useful knowledge applied effectively. If some KM activities require dedicated staff-work, the useful approach is to rotate real people through these positions. And the real measure of usefulness will be in their subsequent productivity and innovation when they return to line work.

This web-site knowledge repository pays as it goes. In the extreme, generating its content is reward enough. It is our way of attending to our own personal continual learning needs - all of which is eventually applied by us. Thank you for the opportunity to work through this line of thought.

© 1994-2005 Paradigm Shift International - Attributed Copies Permitted
Last modified: April 25, 2005