Vasant Raval, CISA, DBA, is
a professor of accountancy
at Creighton University
(Omaha, Nebraska, USA).
He is the coauthor of two
books on information systems
and security, and his teaching
and research interests
include information security
and corporate governance.
Opinions expressed in this
column are his own, not
those of Creighton University.
He can be reached at
vraval@creighton.edu.
Policy Vacuums
IT can be molded or shaped. The term “shape”
has its origin in the Old English word “scieppan,”
which means “to create.” We, as IT professionals,
create numerous objects by shaping IT. Broadly
defined, such objects include software,
infrastructure, applications and networks of
computers. As creators of these objects, we act
as their moral agents.
The notion of moral agency is important here
because, as moral agents who create these objects,
the values we choose (what is right and
what is wrong) are inherited by the objects
themselves. Creating a robot means creating an
object with certain characteristics, moral or
otherwise, built into it. And when we accept the
notion that our creations inherit our value
judgments, we must also accept our role as
moral agents in making permanent, through the
created objects, the values we choose.
As moral agents, our duty is to be well
informed—that is, to seek as much valuable
information as feasible to do the right thing. As
producers and communicators of information,
we need to, within our constraints and
opportunities, be morally right. For example, we
are accountable for the faithful representation
of our output, we should be free from
plagiarism, and we should be transparent in
our communication. Independent of our roles
as users or producers of information, our
moral evaluations affect the informational
environment (or space). For example, privacy
and confidentiality of personal information and
democracy in access to information are part
of our broader role in the information space.
Luciano Floridi suggests that when considering
information ethics, we should take into account
all three roles. Any consideration of only one of
these is microethics; according to Floridi, what
we need is an all-encompassing macroethics.1
J. H. Moor suggests that with rapid
developments in technology, a new approach to
ethics is required. He argues that the “logical
malleability” of computers has led to so-called
policy vacuums that require careful analysis
to fill.2 He extends the argument to other kinds
of technologies whose malleability creates
similar vacuums, for example, the malleability
of life (genetic technology), of material
(nanotechnology) and of mind (neurotechnology).
By nature, policy vacuums invite “fighting fires”:
the molded technology is embedded into a
business model (e.g., Facebook) or a strategic
informational object (e.g., a search engine)
before its ethical implications are worked out.
Thus, the approach to solving ethical dilemmas
becomes reactive rather than proactive, and this
can produce surprises leading to frustration and
costly cleanups.
Vision of Ethics as an Ongoing Dynamic Activity
In light of the rapid change in technology, any
assessment of ethical dimensions of information
is a frozen picture. Any change over time can
bring new ethical implications for us to identify,
sort out and address. Relying on a one-time
exercise, no matter how robust, would not
prevent ethical missteps. Therefore, changes
should be identified, documented and analyzed
not just from a micro perspective (such as a
change’s impact on the privacy of customer
data), but in the spirit of bridging the policy gap
that the change creates. The purpose is not
just to capture an instance of a possible ethical
compromise, but to identify and correct the
systemic attributes that could trigger
ethical missteps.