Re: Fruits of Learning LO1510

JOE_PODOLSKY@HP-PaloAlto-om4.om.hp.com
Mon, 5 Jun 95 08:59:33 -0700

Replying to LO1470 --

Like Barry Malis, I too am still looking for the "beef," if for no
other reason than to sell these great ideas to very focused and busy
business managers. The closest I've come is an article, described
below, by Roger Bohn, who has been referenced elsewhere in LO
discussions. I haven't found the beef here, but there's at least a
barbecue smell.

The attached, by the way, is from one of a series of "jottings" I send
to a few people in HP in a small attempt to provoke discussion and
dialogue in our widely diverse and distributed company.


Joe Podolsky
(podolsky@hpcc01.corp.hp.com)
===================================================================
...

The concepts of individual and organizational learning are
intuitively valid, but I've been trying to figure out a way of
demonstrating a direct tie between learning and business
results, i.e., financial and competitive success. I haven't yet
found that quantitative linkage.

But an article in the Fall 1994 issue of Sloan Management
Review may provide a starting point. Roger E. Bohn, a professor
at the University of California at San Diego, titled his paper
"Measuring and Managing Technical Knowledge" (pp. 61-73).

Bohn differentiates, as most of us do, between data,
information, and knowledge. He uses these definitions: "Data
are what come directly from sensors, reporting on the measured
level of some variable. Information is data that have been
organized or given structure - that is, placed in context - and
thus endowed with meaning...Knowledge goes further; it allows
the making of predictions, causal associations, or predictive
decisions about what to do" (pp. 61-62).

His key concept centers on eight "Stages of Knowledge."
These are the stages:

1 Complete ignorance; no knowledge exists.

2 Awareness; work is pure art; knowledge is tacit.

3 Measure; work is "pretechnical"; knowledge is written.

4 Control of the mean; but there are uncontrolled variations;
scientific methods are feasible; knowledge may be written and
even mechanized.

5 Process capability; variations can be controlled as well as
the mean; process "recipes" are employed.

6 Process characterization (know how); we can fine-tune the
process to alter output characteristics and reduce costs.

7 Know why; the process can be mathematically modeled and
simulated to achieve optimization.

8 Complete knowledge; the process and environment are so well
understood that all problems can be anticipated and prevented.

Bohn points out that, "There is a natural relationship between
degree of procedure and stage of knowledge. For example, in
order to automate a process, all key variables should be
understood at least to stage six and preferably to stage
seven...those portions of processes that are at low stages of
knowledge should be done using a high degree of expertise and
little automation...(but) the automation of a large, complex,
poorly understood...process leads to a large, complex, poorly
understood, unreliable, (and) expensive ...(albeit)
automated...process." (p. 67)
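
Just to make that rule of thumb concrete, here is a rough sketch, in
Python, of how the eight stages and Bohn's automation test might be
written down. The names and the threshold are my own restatement of
the list and quote above, not anything from the article:

from enum import IntEnum

# My own rough encoding of Bohn's eight stages (not from the article).
class KnowledgeStage(IntEnum):
    COMPLETE_IGNORANCE = 1        # no knowledge exists
    AWARENESS = 2                 # pure art; tacit knowledge
    MEASURE = 3                   # "pretechnical"; knowledge is written
    CONTROL_OF_THE_MEAN = 4       # mean controlled, variation is not
    PROCESS_CAPABILITY = 5        # variation controlled; recipes used
    PROCESS_CHARACTERIZATION = 6  # know how; process can be fine-tuned
    KNOW_WHY = 7                  # process can be modeled and simulated
    COMPLETE_KNOWLEDGE = 8        # all problems anticipated and prevented

def ready_to_automate(key_variable_stages):
    """Bohn's rule of thumb: automate only when every key variable is
    understood at least to stage six, and preferably to stage seven."""
    return all(stage >= KnowledgeStage.PROCESS_CHARACTERIZATION
               for stage in key_variable_stages)

# A process with one variable still at stage four should not be automated.
print(ready_to_automate([KnowledgeStage.KNOW_WHY,
                         KnowledgeStage.PROCESS_CHARACTERIZATION,
                         KnowledgeStage.CONTROL_OF_THE_MEAN]))  # False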

So try this logic on for size: Learning increases knowledge. More
knowledge allows us to move our processes up the stages of
knowledge. Operating at higher stages of knowledge allows us to
tune our processes to be more effective and efficient, thus
improving business results.

If this logic holds, then we need some sort of diagnostic tool
that relatively objectively identifies the stage of knowledge of
a process. Then, we could track the type and cost of the
learning process applied to the target business process and see
if the stage of knowledge in the business process has been
raised. We could then weigh the increase in revenues and decrease
in costs generated by the business process at the new stage
against the cost of the learning that enabled the improvements.
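
In crude terms, that test is just arithmetic. The little function
and the figures below are purely hypothetical, my own illustration
of the bookkeeping I have in mind:

# Back-of-the-envelope test (my own illustration, not Bohn's): did the
# gains from running the process at its new stage of knowledge exceed
# the cost of the learning that raised it there?

def learning_payoff(revenue_gain, cost_reduction, cost_of_learning):
    """Net benefit of the learning applied to one business process."""
    return (revenue_gain + cost_reduction) - cost_of_learning

# Hypothetical figures for a single target process.
net = learning_payoff(revenue_gain=150000,
                      cost_reduction=40000,
                      cost_of_learning=75000)
print(net)  # 115000 -- the learning more than paid for itself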

Bohn's article doesn't talk about a diagnostic tool like this,
but I can't believe it would be too hard to develop. It sounds
a lot like the Capability Maturity Model used by the Software
Engineering Institute at Carnegie Mellon University for
evaluating software development organizations.

What do you all think about this? Would a process like this
make a difference in justifying and encouraging learning in your
organizations? Do you know of any diagnostic tools to quantify
stages of knowledge? Is this an approach worth pursuing?

Joe