Good Business LO12556 - Joe's Jottings #69

JOE_PODOLSKY@HP-PaloAlto-om4.om.hp.com
Thu, 13 Feb 97 09:27:49 -0800

One nice thing about my job is that I don't have to worry too much about
ethics. Ethical behavior is pervasive at Hewlett-Packard; we think about
it only in the very rare instances when something bad happens. And, in
those few cases, the corporate reaction is generally more social than
bureaucratic; the error is quickly corrected, the offender is treated
appropriately, and the process usually is completed with respect and
privacy. This is Washington, D.C. - NOT!

Moreover, I'm a quality manager. I've got a charter to yell about any
problems I see, ethical or otherwise.

But, thanks to the Internet and the explosion of electronic commerce, all
of us in information technology need to give more thought to security,
which leads us directly to ethical issues. W. Edwards Deming said, "In
God we trust, but everyone else has to bring data." We can paraphrase
that and say, "In God we trust, but everyone else has to have a password."

IT professional organizations have long been advocates of ethics. The
International Federation for Information Processing (IFIP), a global
association of organizations such as the Association for Computing
Machinery, recently published a revised set of ethical guidelines. These
are discussed in the January 1997 issue of the ACM SIGCHI Bulletin in an
article by John Karat and Clare-Marie Karat. IFIP lists thirty codes,
grouped into these five categories:

- Respect (for individuals, for the public, for institutions, for quality
of life)

- Personal (or institutional) qualities (e.g., honesty, acceptance of
responsibility, courage)

- Information privacy and data integrity

- Production and flow of information (e.g., information about
specifications and tests should be available to involved people, even at
the risk of violating the privacy guideline)

- Attitude toward regulations (i.e., respect laws, regulations, and
professional standards)

The SIGCHI article points out there are broader (and grayer) ethical
issues not covered in the IFIP list, things like unequal distribution of
information and lack of respect for cultural diversity. My problem with
the IFIP list, however, is that it is like a set of laws in the Old West,
with nary a sheriff for miles around.

Deborah G. Johnson is a professor of philosophy at Rensselaer Polytechnic
Institute. She has written several books on computer ethics, and she
wrote an article for the January 1997 issue of _Communications of the ACM_
entitled, "Ethics Online." Johnson makes the connection between security
and ethics. She says that neither ethical pronouncements nor technical
fences such as firewalls and encryption will be enough to ensure data
integrity and privacy. She sees security as a moral issue. She says that,
"Our only hope is for individuals (online) to internalize norms of
behavior. That is how most behavior is controlled offline."

Johnson says that there are three special characteristics of online
communications that may affect moral behavior. The first of these is
"scope." "It seems, " Johnson writes, "to be the combination of vastness
of reach, immediacy, and availability to individuals for interactivity
that makes for something unusual here... We might think of scope as
power."

The second characteristic is "reproducibility." The basic moral problem
here is that we can easily kid ourselves into thinking that copying
something is OK. It's easy, cheap, and often undetectable. And the
person we copied from still has the original.

The third, and perhaps most basic ethical characteristic is "anonymity."
Johnson points out that, "... trust is difficult to develop in an
environment in which one cannot be sure of the identities of the people
with whom one is communicating." She observes that anonymity makes it
harder to detect and catch criminals, that it allows people to act without
the normal checks of social control, and that it creates doubts about the
source and, therefore, the integrity of shared information.
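Johnson's worry about doubts over the source and integrity of shared information is, in today's terms, exactly the problem that message authentication codes address. As a hedged illustration (my example, not anything Johnson describes), here is a minimal Python sketch using the standard library's hmac module: a secret shared by two parties lets the recipient check both who sent a message and that it was not altered in transit.

```python
import hashlib
import hmac

# A shared secret known only to the two communicating parties
# (a hypothetical value, for illustration only).
SECRET_KEY = b"a-key-both-parties-share"

def sign(message: bytes) -> str:
    """Produce a tag that vouches for the message's source and integrity."""
    return hmac.new(SECRET_KEY, message, hashlib.sha256).hexdigest()

def verify(message: bytes, tag: str) -> bool:
    """Accept a message only if its tag matches; compare_digest uses a
    constant-time comparison to avoid leaking information via timing."""
    return hmac.compare_digest(sign(message), tag)

original = b"Ship 500 units to the Palo Alto office."
tag = sign(original)

assert verify(original, tag)                    # untampered message: accepted
assert not verify(b"Ship 5000 units.", tag)     # altered message: rejected
```

An anonymous sender cannot forge the tag without the key, which is one small way technology can substitute for the trust that anonymity erodes.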

Johnson's proposed solution is, to me, rather weak. She basically says
that the online buyer must beware, that we must try to apply the same
standards of trust to cyberspace that we would offline. But, as she
points out earlier, online anonymity makes lying easy. We have thousands
of years of body language that warns us about offline liars unless they
are also good actors. But as the cartoon says, no one on the Internet
knows you're a dog.

In spite of Johnson's concerns about technological fences, that's what we
are basically using as the first lines of defense. In addition, systems
for authentication, authorization, and encryption are being developed to
combat the specter of anonymity. We'll just use our skills with
information technology to protect our information technology applications.
Hmmm. That's like taking a shot of scotch as a hangover cure.
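To make those "technical fences" concrete, here is a hedged sketch (my own illustration, not any particular system's implementation) of one authentication building block, using Python's standard hashlib: the system stores only a salted hash, so it can verify a password without ever being able to reveal it.

```python
import hashlib
import hmac
import os

def hash_password(password: str):
    """Return (salt, digest); only these are stored, never the password."""
    salt = os.urandom(16)  # a fresh random salt per user defeats lookup tables
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

def check_password(password: str, salt: bytes, digest: bytes) -> bool:
    """Recompute the hash with the stored salt and compare in constant time."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return hmac.compare_digest(candidate, digest)

salt, digest = hash_password("correct horse battery")
assert check_password("correct horse battery", salt, digest)
assert not check_password("wrong guess", salt, digest)
```

The many PBKDF2 iterations deliberately slow down each guess, which matters far more to an attacker trying millions of passwords than to a user typing one.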

My basic question is, "Are the ethical rules in cyberspace different from
those in the real world?" In the real world, people are expected
generally to obey laws and follow social norms. Media and advertising
have special "rules" to keep from misleading us. Some of those rules are
fuzzy, as we learned in the recent trial that found ABC liable after it
had an undercover reporter videotape bad food-handling practices.

What are the appropriate ethical rules for IT professionals? The IFIP
list shown above is pretty general. Take, for example, the issue of data
integrity. Should we have to analyze every file for its statistical
accuracy and place an appropriate warning label on each report from that
file?

How concerned should we be about the use of information technology? For
example, Cypress Semiconductor in San Jose uses "killer systems" to
automatically cut off various services from people who don't meet certain
task commitments. How would you feel if you were asked to build a system
like this?

Where do we draw the line? IT systems not only do what we want them to,
but they also invariably have unintended consequences. Vaughn Merlyn and
Sheila Smith of the Omega Point Consulting organization wrote an article
on this subject for the January 20, 1997 issue of _Computerworld's_
Leadership Series. They give two reasons for some of the unintended
consequences. They say, first, that we assume "that information systems
projects are more logical, straightforward, and free of political
considerations than they really are... The second level of failure is ...
a lack of thought about the larger context." They say that we are often
shortsighted in our ways of using technology, not realizing or not
communicating some of the implications of the choices we make.

These "unintended consequences" are problems we see all the time. Merlyn
and Smith talk mainly about business issues, such as projects that are
late and over budget and that cause various types of lost opportunities.
When our customers ask for things that may cause problems, how strongly
should we, as experienced IT people, voice our concerns?

And what about systems problems of the "normal accident" type that we
discussed in the last jottings, the kind caused by complexity that can
disrupt vital services or cause injury?

I don't know the answers to these questions, and I'd really like your
opinions on them.

I do believe, however, that security and ethics are two sides of the same
coin. The privileges of trust and openness are withdrawn when those
precepts are violated, and we must then put up electronic fences. In
application systems, the equivalents of firewalls are strict standards and
structured audits. But audits are generally most effective in business
settings. Information technology is moving beyond the office into living
rooms. If we have V-chips in our television sets, will we also have to
put their software equivalents in our Internet browsers?

Hewlett-Packard has continued to build its culture on the moral beliefs
of those of us following the examples set by Bill and Dave. But ethical
strength is also good business. Information technology has no one set of
role models on whose beliefs we can build. Instead, the on-going task is
in our hands.

Joe

-- 

JOE_PODOLSKY@HP-PaloAlto-om4.om.hp.com

Learning-org -- An Internet Dialog on Learning Organizations For info: <rkarash@karash.com> -or- <http://world.std.com/~lo/>