Computer underground Digest Wed Oct 21 1993 Volume 5 : Issue 82
ISSN 1004-042X
Editors: Jim Thomas and Gordon Meyer (TK0JUT2@NIU.BITNET)
Archivist: Brendan Kehoe
Shadow-Archivists: Dan Carosone / Paul Southworth
Ralph Sims / Jyrki Kuoppala
Ian Dickinson
Copy Editor: Etaoin Shrdlu, III
CONTENTS, #5.82 (Oct 21 1993)
File 1--Fair Info Practices with Comp. Supported Coop Work
File 2--LA Times does cyphertech; odds & ends
File 3--IGC Wins Social Responsibility Award
File 4--Full Description of Proposed "Hacker" Documentary
Cu-Digest is a weekly electronic journal/newsletter. Subscriptions are
available at no cost electronically from tk0jut2@mvs.cso.niu.edu. The
editors may be contacted by voice (815-753-0303), fax (815-753-6302)
or U.S. mail at: Jim Thomas, Department of Sociology, NIU, DeKalb, IL
60115.
Issues of CuD can also be found in the Usenet comp.society.cu-digest
news group; on CompuServe in DL0 and DL4 of the IBMBBS SIG, DL1 of
LAWSIG, and DL1 of TELECOM; on GEnie in the PF*NPC RT
libraries and in the VIRUS/SECURITY library; from America Online in
the PC Telecom forum under "computing newsletters;"
On Delphi in the General Discussion database of the Internet SIG;
on the PC-EXEC BBS at (414) 789-4210; and on: Rune Stone BBS (IIRG
WHQ) (203) 832-8441 NUP:Conspiracy; RIPCO BBS (312) 528-5020
CuD is also available via Fidonet File Request from 1:11/70; unlisted
nodes and points welcome.
EUROPE: from the ComNet in LUXEMBOURG BBS (++352) 466893;
In ITALY: Bits against the Empire BBS: +39-461-980493
ANONYMOUS FTP SITES:
AUSTRALIA: ftp.ee.mu.oz.au (128.250.77.2) in /pub/text/CuD.
EUROPE: nic.funet.fi in pub/doc/cud. (Finland)
UNITED STATES:
aql.gatech.edu (128.61.10.53) in /pub/eff/cud
etext.archive.umich.edu (141.211.164.18) in /pub/CuD/cud
ftp.eff.org (192.88.144.4) in /pub/cud
halcyon.com (202.135.191.2) in /pub/mirror/cud
ftp.warwick.ac.uk in pub/cud (United Kingdom)
COMPUTER UNDERGROUND DIGEST is an open forum dedicated to sharing
information among computerists and to the presentation and debate of
diverse views. CuD material may be reprinted for non-profit as long
as the source is cited. Authors hold a presumptive copyright, and
they should be contacted for reprint permission. It is assumed that
non-personal mail to the moderators may be reprinted unless otherwise
specified. Readers are encouraged to submit reasoned articles
relating to computer culture and communication. Articles are
preferred to short responses. Please avoid quoting previous posts
unless absolutely necessary.
DISCLAIMER: The views represented herein do not necessarily represent
the views of the moderators. Digest contributors assume all
responsibility for ensuring that articles submitted do not
violate copyright protections.
----------------------------------------------------------------------
Subject: File 1--Fair Info Practices with Comp. Supported Coop Work
Date: Wed, 20 Oct 1993 09:54:21 -0700
From: Rob Kling <kling@ICS.UCI.EDU>
Fair Information Practices with Computer Supported Cooperative Work
Rob Kling
Department of Information & Computer Science
and
Center for Research on Information Technology and Organizations
University of California at Irvine,
Irvine, CA 92717, USA
kling@ics.uci.edu
May 12, 1993 (v. 3.2)
Based on a paper which appears in SIGOIS Bulletin, July 1993
+++++++++++++
The term "CSCW" was publicly launched in the early 1980s. Like other
important computing terms, such as artificial intelligence, it was coined
as a galvanizing catch-phrase, and given substance through a lively stream
of research. Interest quickly formed around the research programs, and
conferences identified with the term advanced prototype systems, studies of
their use, key theories, and debates about them. CSCW offers special
excitement: new concepts and possibilities in computer support for work.
CSCW refers both to special products (groupware) and to a social movement
by computer scientists who want to provide better computer support for
people, primarily professionals, to enhance the ease of collaborating.
Researchers disagree about the definition of CSCW, but the current
definitions focus on technology. I see CSCW as a conjunction of certain
kinds of technologies, certain kinds of users (usually small self-directed
professional teams), and a worldview which emphasizes convivial work
relations. These three elements, taken together, differentiate CSCW from
other related forms of computerization, such as information systems and
office automation, which differ as much in their typical users and the
worldview describing the role of technology in work as in the technology
itself (Kling, 1991). CSCW is the product of a particular computer-based
social movement rather than simply a family of technologies (Kling and
Iacono, 1990).
The common technologies that are central to CSCW often record fine grained
aspects of people's activities in workplaces, such as typed messages, notes,
personal calendar entries, and videotapes of personal activity. Electronic
mail is the most popular of the CSCW technologies (Bullen and Bennett,
1991) and is a useful vehicle for examining some of the privacy issues in
CSCW. Many electronic mail messages contain personal communications which
include opinions and information which many senders would prefer not to be
public information. However, most electronic mail system users I have
spoken to are ignorant of the conditions under which their transmissions
will be maintained as private communications by their own organizations.
(They often assume that their electronic communications will be treated as
private by their organizations. Others are extremely sensitive to the
possible lack of privacy/security of email transmissions.)
Discussions of computerization and privacy are highly developed with
respect to personal record systems which contain information about banking,
credit, health, police, schooling, employment, insurance, etc. (Kling and
Dunlop, 1991:Section V). Definitions of personal privacy have been examined
in extensive literature about personal privacy and record-keeping systems.
Analysts have been careful to distinguish security issues (e.g., lock and
keys for authorized access) from privacy issues -- those which involve
people's control over personal information. There has also been significant
discussion of the interplay between privacy and other competing social
values. The privacy issues in CSCW have both important similarities to and
differences from the issues of personal record systems. We
can gain helpful insights by building on this body of sustained thinking
about privacy and record systems to advance our understanding of privacy
issues in CSCW.
Another related and helpful set of inquiries examines the surveillance of
workers in measuring activities related to quality of service and
individual productivity (Attewell, 1991; Kling and Dunlop, 1993). Some of
the most intensive fine grained electronic monitoring involves listening to
the phone calls of service workers such as reservationists, and
fine-grained productivity counts, such as the number of transactions that a
worker completes in a small time period. While all managers have ways of
assessing their subordinates' performance, clerks are most subject to these
fine grained forms of electronic surveillance. The CSCW community has
focussed on professionals as the key groups to use groupware and meeting
support systems. Consequently, electronic monitoring has seemed to be
implausible.
The computing community is beginning to be collectively aware of the
possible privacy issues in CSCW applications. Professionals who use CSCW
can lose privacy under quite different conditions than clerks who have
little control over the use of electronic performance monitoring systems.
And personal communications, like electronic mail or systems like gIBIS
which supports debates, record personally sensitive information under very
different conditions than do information systems for regulatory control
such as systems of motor vehicle, health and tax records.
The use of email raises interesting privacy issues. In the case of email,
privacy issues arise when people lose control over the dissemination of
their mail messages. When should managers be allowed to read the email of
their subordinates? One can readily conjure instances where managers would
seek access to email files. These can range from curiosity (such as when a
manager wonders about subordinates' gossip, and requests messages which
include his name in the message body), through situations in which a legal
agency subpoenas mail files as part of a formal investigation. A
different, but related set of issues can occur when a manager seeks mail
profiles: lists of people who send more than N messages a day, lists of
people who read a specific bulletin board or the membership of a specific
mailing list.
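To make concrete how little machinery such profiling requires, here is a
minimal sketch in Python. It is purely illustrative: the log file name, the
one-line-per-message format, the threshold, and the watched list name are
all invented for the example and are not drawn from any particular mail
system.

    from collections import Counter

    THRESHOLD = 50                 # hypothetical cutoff: "more than N messages"
    WATCHED_LIST = "dept-gossip"   # hypothetical mailing list of interest

    senders = Counter()
    posters_to_list = set()

    # Assumed log format: one line per message, "sender recipient-or-list"
    with open("maillog.txt") as log:
        for line in log:
            fields = line.split()
            if len(fields) < 2:
                continue
            sender, recipient = fields[0], fields[1]
            senders[sender] += 1
            if recipient == WATCHED_LIST:
                posters_to_list.add(sender)

    print("People sending more than", THRESHOLD, "messages:")
    for person, count in senders.most_common():
        if count > THRESHOLD:
            print("   ", person, count)

    print("People who posted to", WATCHED_LIST + ":")
    for person in sorted(posters_to_list):
        print("   ", person)

The point is not the code but the asymmetry it illustrates: nothing in such
a profile-building pass notifies the people being profiled, which is
precisely the kind of information practice that most organizations leave
unspecified.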
CSCW systems differ in many ways that pertain to informational control. For
example, systems such as email and conferencing systems retain electronic
information which can be reused indefinitely with little control by the
people who were writing with the system. One can imagine cases in which
managers may wish to review transcripts of key meetings held by computer
conferencing to learn the bases of specific decisions, who took various
positions on controversial issues, or to gain insight into their
subordinate's interactional styles. Other systems, such as voice and video
links, are often designed not to store information. But they can raise
questions about who is tuning in, and the extent to which participants are
aware that their communication system is "on." In the literature about
computerization and privacy, similar questions have been closely examined
-- regulating the duration of records storage, the conditions under which
people should be informed that a third party is seeking their records, and
conditions under which individuals may have administrative or legal
standing in blocking access to their records (See Dunlop and Kling, 1991,
Section V).
One of the peculiarities of CSCW in contrast with traditional record
keeping systems is the nature of the social settings in which systems are
being developed and explored. Most personal record systems are developed in
relatively traditional control-oriented organizations. In contrast, most
CSCW applications have been developed in academic and industrial research
labs. These settings are protective of freedom of speech and thought and
less authoritarian than many organizations which ultimately use CSCW
applications. In fact, relatively few CSCW applications, other than email
and Lotus Notes, are used by the thousands of people in traditional
organizations (Bullen and Bennett, 1991). Further, CSCW systems are
primarily designed to be used by professionals rather than technicians and
clerks. Professionals generally have more autonomy than clerks, who are
most subject to computerized monitoring (Attewell, 1991). As a consequence,
many CSCW developers don't face problems of personal privacy that may be
more commonplace when prototype systems are commercialized and widely used.
These contrasts between R&D with CSCW and the likely contexts of
application should not impede us from working hard to understand the
privacy issues of these new technologies. CSCW applications are able to
record more fine grained information about people's thoughts, feelings, and
social relationships than traditional record keeping systems. They can be
relatively unobtrusive. The subject may be unaware of any scrutiny. In R&D
labs, we often have norms of reciprocity in social behavior: monitoring can
be reciprocal. However, in certain organizations, monitoring may follow a
formal hierarchy of social relations. For example, supervisors can monitor
the phone conversations of travel reservationists and telephone operators,
but the operators cannot monitor their supervisors. The primary
(publicized) appropriations of "private email" have been in military
organizations, NASA, and commercial firms like Epson, rather than in
university and industrial laboratories.
CSCW creates a new electronic frontier in which people's rights and
obligations about access and control over personally sensitive information
have not been systematically articulated. I believe that we need to better
understand the nature of information practices with regard to different
CSCW applications that balance fairness to individuals and to their
organizations.
It is remarkable how vague the information practices regulating the use of
the few commonplace CSCW applications are. Yet we are designing and
building the information infrastructures for recording significant amounts
of information about people's thoughts and feelings which are essentially
private and not for arbitrary circulation, without the guidelines to
safeguard them. People who use computer and telecommunications applications
need to have a basic understanding about which information is being
recorded, how long it is retained (even if they "delete" information from
their local files), who can access information about them, and when they can
have some control over restricting access to their information.
In the late 1970s the U.S. Privacy Protection Study Commission developed a
set of recommendations for Fair Information Practices pertinent to personal
record keeping systems (PPSC, 1977:17-19). A concern of Commission members
was to maximize the extent to which record systems would be managed so that
people would not be unfairly affected by decisions which relied upon
records which were inaccurate, incomplete, irrelevant or not timely.
Commission members believed that record keeping systems in different
institutional settings should be regulated by different laws. For example,
people should have more control over the disclosure of their current
financial records than over the disclosure of their current police records.
On the other hand, the Commission proposed that each institutional arena
should be governed with an explicit set of Fair Information Practices. In a
similar way, different families of CSCW applications or different
institutional settings may be most appropriately organized with different
Fair Information Practices. In the case of CSCW applications, fairness may
have different meanings than in the case of decisions based upon personal
records systems.
We need fearless and vigorous exploratory research to shed clear light on
these issues. This rather modest position contrasts strongly with that
taken by Andy Hopper of Olivetti, one of the panelists at the plenary
session at CSCW '92. He was enthusiastic about the use of "active badges"
(Want, Hopper, Falcao, and Gibbons, 1992) and insisted on discussing only
their virtues. He argued that one can imagine many scenarios in which
people are harmed by some uses of a particular technology, but that
discussing such scenarios is usually pointless. Hopper's 1992 co-authored
article about active badges examines some of the privacy threats their use
can foster. But on the plenary panel he was critical of people who asked
serious questions about the risks, as well as the benefits of new CSCW
technologies. In this way, he took a position similar to that taken by
spokespeople of many industries, such as the automobile industry, who have
delayed serious inquiries and regulatory protections for environmental and
safety risks by insisting on unambiguous evidence of harm before
investigating plausible problems.
The active badge systems which Hopper described seem to be regulated by
Fair Information Practices in his own research laboratory (e.g., no long
term storage of data about people's locations, reciprocity of use,
discretion in use). These sorts of Fair Information Practices may be
required to help ensure that active badges are a convenient technology
which does not degrade people's working lives. Other kinds of information
practices, such as those in which location monitoring is non-reciprocal,
and non-discretionary may help transform some workplaces into electronic
cages. Hopper and his colleagues briefly mention such possibilities in
their 1992 ACM TOIS article about active badges. And their article deserves
some applause for at least identifying some of the pertinent privacy
problems which active badges facilitate. However they are very careful to
characterize fine grained aspects of the technological architecture of
active badges, while they are far from being comparably careful in
identifying the workplace information practices which can make active
badges either primarily a convenience or primarily invasive. I believe that
CSCW researchers should be paying careful attention to social practices as
well as to technologies. Richard Harper's (1992) ethnographic study of the
use of active badges in two research labs illustrates the kind of nuanced
analyses which we need, although Harper also glosses over the particular
information practices which accompanied the use of active badges in the two
labs.
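To make the contrast concrete, here is a minimal sketch, in Python, of how
practices like short retention, reciprocal lookup, and an audit trail might
be encoded in a badge-location service. Everything in it (the class, the
method names, the one-hour retention period) is invented for illustration
and is not taken from the actual system described by Want et al. (1992).

    import time

    RETENTION_SECONDS = 60 * 60   # hypothetical policy: keep sightings one hour

    class BadgeLocationService:
        """Toy location store enforcing three illustrative practices:
        short retention, reciprocal lookup, and an audit trail of queries."""

        def __init__(self):
            self.sightings = {}        # badge id -> (location, timestamp)
            self.participants = set()  # people who have agreed to be locatable
            self.audit_log = []        # who asked about whom, and when

        def report(self, badge_id, location):
            self.participants.add(badge_id)
            self.sightings[badge_id] = (location, time.time())

        def _expire_old(self):
            cutoff = time.time() - RETENTION_SECONDS
            self.sightings = {b: (loc, ts)
                              for b, (loc, ts) in self.sightings.items()
                              if ts >= cutoff}

        def locate(self, requester, target):
            self._expire_old()
            # Reciprocity: only people who are themselves locatable may ask.
            if requester not in self.participants:
                raise PermissionError("lookups require wearing a badge yourself")
            # Discretion: every query is recorded, so it can be reviewed.
            self.audit_log.append((requester, target, time.time()))
            entry = self.sightings.get(target)
            return entry[0] if entry else None

The point of the sketch is that such safeguards live in the surrounding
software and workplace rules rather than in the badge hardware itself,
which is why the information practices deserve at least as much scrutiny
as the technological architecture.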
Unfortunately, delays in understanding some risks of emerging technologies
have led the public to underestimate the initial magnitude of problems, and
to make collective choices which proved difficult to alter. Our design of
metropolitan areas in ways that make individually operated cars a virtual
necessity is an example. In the early stages of use, the risks of a new family of
technologies are often hard to discern (See Dunlop and Kling, 1991, Part
VI). When major problems develop to the point that they are undeniable,
amelioration may also be difficult.
I characterized CSCW, in part, as a social movement (Kling and Iacono,
1990). Most of us who study, develop, or write about CSCW enthusiastically
(and sometimes evangelistically) encourage the widespread use of these new
technologies. However, as responsible computer scientists, we should temper
our enthusiasms with appropriate professional responsibility. CSCW
applications open important organizational opportunities, but also open
privacy issues which we don't understand very well.
The new ACM Ethical Code (ACM, 1993) also has several provisions which bear
on privacy issues in CSCW. These include provisions which require ACM
members to respect the privacy of others (Section 1.7), to improve public
understanding of computing and its consequences (Section 2.7), and to
design and build information systems which enhance the quality of working
life (Section 3.2). The ACM's code is rather general and does not give much
specific guidance to practitioners. The CSCW research community is well
positioned to conduct the kinds of research into the social practices for
using these technologies which could shape meaningful professional
guidelines for their use in diverse organizations. Will we take a
leadership role in helping to keep CSCW safe for users and their
organizations?
=================================
Note: I appreciate discussions with Jonathan Allen, Paul Forester, Beki
Grinter, and Jonathan Grudin which helped clarify some of my key points.
REFERENCES
1. Association for Computing Machinery. 1993. "ACM Code of Ethics and
Professional Conduct." Communications of the ACM. 36(2)(Feb.):99-103.
2. Attewell, Paul. "Big Brother and the Sweatshop: Computer
Surveillance in the Automated Office" in Dunlop and Kling 1991.
3. Bullen, Christine and John Bennett. 1991. "Groupware in Practice: An
Interpretation of Work Experience" in Dunlop and Kling 1991.
4. Dunlop, Charles and Rob Kling (Ed). 1991. Computerization and
Controversy: Value Conflicts and Social Choices. Boston: Academic
Press.
5. Harper, Richard H.R. 1992. "Looking at Ourselves: An Examination of the
Social Organization of Two Research Laboratories" Proc. CSCW '92:
330-337.
6. Kling, Rob. 1991. "Cooperation, Coordination and Control in
Computer-Supported Work." Communications of the ACM
34(12)(December):83-88.
7. Kling, Rob and Charles Dunlop. 1993. "Controversies About
Computerization and the Character of White Collar Worklife." The
Information Society. 9(1)(Jan-Feb):1-29.
8. Kling, Rob and Suzanne Iacono. 1990. "Computerization Movements"
Chapter 19, pp. 213-236 in Computers, Ethics and Society, David Ermann,
Mary Williams & Claudio Guitierrez (eds.). New York: Oxford University
Press.
9. Privacy Protection Study Commission. 1977. Personal Privacy in an
Information Society, U.S. Government Printing Office, Washington D.C.
(briefly excerpted in Dunlop and Kling, 1991.)
10. Want, Roy, Andy Hopper, Veronica Falcao and Jonathan Gibbons. 1992.
"The Active Badge Location System" ACM Transactions on Information
Systems. 10(1)(January): 91-102.
------------------------------
Date: 05 Oct 93 03:09:50 EDT
From: Urnst Kouch <70743.1711@COMPUSERVE.COM>
Subject: File 2--LA Times does Cyphertech; odds & ends
((MODERATORS' NOTE: Urnst Kouch is editor of Crypt Newsletter, a 'zine
specializing in techno-political commentary, satire, and virus
information)).
CuD readers might want to look for the October 3 and 4 issues of The
L.A. Times. In a two-part series, the paper's "Column One" was devoted
to privacy/cryptography issues.
"Demanding the Ability to Snoop: Afraid new technology may foil
eavesdropping efforts, U.S. officials want phone and computer users to
adopt the same privacy code. The government would hold the only key"
was the title and subhead of Robert Lee Hotz's 60+ inch piece.
Hotz focused on the Clipper/Skipjack end of the story, in part,
because Mykotronx, Inc., the manufacturer of the chip for the National
Security Agency, is based in Torrance, Los Angeles County. The
newspiece did not delve into any of the recent events surrounding
Pretty Good Privacy and Phil Zimmerman. Pretty Good Privacy was
referred to as "one of the best codes . . . free and [it] can be
downloaded from computer network libraries around the world"; the
people who make up the citizen-supported cryptography movement as
"ragtag computerzoids."
The L.A. Times series also included statistics documenting the steady
rise in court-ordered wiretapping from 1985 to 1992 and the almost
100% increase in phones monitored by pen registers - which record
outgoing numbers - from 1,682 (1987) to 3,145 in 1992. These numbers
do not include monitoring by agencies such as the NSA, and the article said so.
Whitfield Diffie earned a boxed-out quote, too. "Recent years have seen
technological developments that diminish the privacy available to the
individual. Cameras watch us in the stores, X-ray machines search us
at the airport, magnetometers look to see that we are not stealing
from the merchants, and databases record our actions and
transactions."
The October 3 installment wrapped up with this succinct bit from
Diffie: "Cryptography is perhaps alone in its promise to give us more
privacy rather than less."
Moving on from The L.A. Times, readers could find interesting the
following hodgepodge of facts, which taken together, lend some
historical perspective to the continuing conflict between privately
developed cryptography and the government.
For example, in reference to the Clipper chip, take the old story of
Carl Nicolai and the Phasorphone.
In 1977 Nicolai had applied for a patent for the Phasorphone telephone
scrambler, which he figured he could sell for $100 - easily within the
reach of John Q. Public. For that, the NSA slapped a secrecy order on
him in 1978. Nicolai subsequently popped a nut, took his plight to
the media, and charged in Science magazine that "it appears part of a
general plan by the NSA to limit the freedom of the American people .
. . They've been bugging people's telephones for years and now
someone comes along with a device that makes this a little harder to
do and they oppose this under the guise of national security."
The media went berserk on the issue and the NSA's Bobby Ray Inman
revoked the Phasorphone secrecy order. If the cypherpunks have a
spiritual Godfather, or need a likeness to put on a T-shirt, Carl
Nicolai and his Phasorphone could certainly be candidates.
About the same time, Dr. George Davida of the University of Wisconsin
was also served with an NSA secrecy order, in response to a patent
application on a ciphering device which incorporated some advanced
mathematical techniques.
Werner Baum, chancellor of the University of Wisconsin's Milwaukee
campus, promptly denounced the NSA for messing with faculty academic
freedom. The Agency backed off.
Both setbacks only made the NSA more determined to exert ultimate
control over cryptography. In an interview in Science magazine the
same year, Bobby Inman stated that he would like to see the NSA
receive the same authority over cryptology that the Department of
Energy reserved for research which could be applied to atomic weapons,
according to James Bamford's "The Puzzle Palace." "Such authority
would grant to NSA absolute 'born classified' control over all
research in any way related to cryptology," reads his book.
Readers have also seen the acronym ITAR - for International Traffic in
Arms Regulations - used a lot in reference to the government's interest
in controlling private cryptography. ITAR springs from the Arms
Export Control Act of 1976, in which "The President is authorized to
designate those items which shall be considered as defense articles
and defense services." ITAR contains the U.S. Munitions List, the
Commodity Control List and the Nuclear Referral List which cover,
respectively, munitions, industrial and nuclear-related items.
Cryptographic technology falls into the Munitions List which is
administered by the Department of State, in consultation with the
Department of Defense. In this case, the NSA controls most of the
decision making.
The Arms Export Control Act (AECA) exists _primarily_ to restrict the
acquisition of biological organisms, missile technology, chemical
weapons and any items of use in production of nuclear bombs to
embargoed nations or countries thought inimical to the interests of
the United States. (Examples: South Africa, North Korea, Libya, Iran,
Iraq, etc.)
That the AECA is used as a tool to control the development of private
cryptography in the US is secondary to its original aim, but is a
logical consequence of four considerations which the ITAR lists as
determinators of whether a technological development is a defense
item. These are:
1. Whether the item is "inherently military in nature."
2. Whether the item "has a predominantly military application."
3. Whether an item has military and civil uses "does not in and of
itself determine" whether it is a defense item.
4. "Intended use . . . is also not relevant," for the item's
classification.
If your brain hasn't seized yet - often, this is what the government
counts on - you may have the gut feeling that the determinators are
sufficiently strong and vague to allow for the inclusion of just about
anything in the U.S. Munitions List or related lists of lists. That
would be about right.
Which is basically what Grady Ward has been yelling about, only he
doesn't kill you with jargon, bureaucrat-ese or Orwell-speak, God
bless him.
[Yes, you too can be an armchair expert on the topic using acronyms,
insider terms, secret handshakes and obscure facts and references to
go toe-to-toe with the best in this controversy. Just take advantage
of this little reading list:
1. Bamford, James. 1982. "The Puzzle Palace: Inside The National
Security Agency, America's Most Secret Intelligence Organization"
Penguin Books.
Nota Bene: The NSA really hated James Bamford, so much so that it
attempted to classify _him_, all 150,000 published copies of "The
Puzzle Palace," his notes and all materials he had gained under the
Freedom of Information Act. Of this, NSA director Lincoln D. Faurer
said, "Just because information has been published doesn't mean it
shouldn't be classified."
2. Foerstel, Herbert N. 1993. "Secret Science: Federal Control of
American Science and Technology" Praeger Publishers.
3. "Encyclopedia of the US Military", edited by William M. Arkin,
Joshua M. Handler, Julia A. Morrissey and Jacquelyn M. Walsh. 1990.
Harper & Row/Ballinger.
4. "The US and Multilateral Export Control Regimes," in "Finding
Common Ground" 1991. National Academy of Sciences, National Academy
Press.]
------------------------------
Date: Tue, 5 Oct 1993 21:02:30 EDT
From: Nikki Draper <draper@EUPHRATES.STANFORD.EDU>
Subject: File 3--IGC Wins Social Responsibility Award
BAY AREA COMPUTER NETWORK ORGANIZATION
WINS PRIZE FOR SOCIAL RESPONSIBILITY
Palo Alto, Calif., September 15, 1993 - Computer Professionals for
Social Responsibility (CPSR), the national public interest
organization based in Palo Alto, announced today that the Institute
for Global Communications (IGC) has been named the winner of the 1993
Norbert Wiener Award for Social and Professional Responsibility.
Beginning in 1986, CPSR has presented this award each year to a
distinguished individual who, through personal example, demonstrated a
deep commitment to the socially responsible use of computing
technology. In 1992, the CPSR Board expanded the nominations to
include organizations. IGC is the first organizational recipient of
this prestigious award.
"The award is particularly appropriate this year because of the
enormous interest in computer networks generated by the debate over
the proposed National Information Infrastructure (NII)," said Stanford
professor and CPSR Board president Eric Roberts. "IGC has worked
diligently to use network technology to empower previously
disenfranchised individuals and groups working for progressive change.
CPSR has a strong commitment to making sure that everyone has access
to the resources and empowerment that networks provide. IGC has been
providing such access ever since it was founded in 1986."
"We're honored to be recognized by CPSR and to be the Norbert Wiener
Award recipient," says Geoff Sears, IGC's Executive Director. "Of
course, this award honors not just IGC, but the efforts and
accomplishments of all our network members, our entire network
community."
Sears will accept the Wiener award at CPSR's annual meeting banquet in
Seattle, Washington, on Saturday, October 16th.
This year's annual meeting is a two-day conference entitled
"Envisioning the Future: A National Forum on the National Information
Infrastructure (NII)" that will bring together local, regional, and
national decision makers to take a critical look at the social
implications of the NII. The keynote speaker will be Bruce McConnell,
Chief of Information Policy at the Office of Information and
Regulatory Affairs in the Office of Management and Budget (OMB), who
will present his views on the major NII issues now facing the
administration. Other highlights of the meeting include Kit Galloway
of Electronic Cafe International in Santa Monica, California, as the
featured speaker at the banquet. Using videotapes and a live
demonstration with CPSR chapters, Kit will present an innovative
approach to electronic communication and discuss how the Electronic
Cafe concept has been used.
The Institute for Global Communications is a nonprofit computer
networking organization dedicated to providing low-cost worldwide
communication and information exchange pertaining to environmental
preservation, human rights, sustainable development, peace, and social
justice issues. IGC operates the PeaceNet, EcoNet, ConflictNet, and
LaborNet computer networks. With a combined membership of 10,000
individuals and organizations ranging in size and scope from United
Nations Commissions to local elementary schools, IGC members
contribute to more than 1200 conferences covering virtually every
environmental and human rights topic.
The Wiener Award was established in 1987 in memory of Norbert Wiener,
the originator of the field of cybernetics and a pioneer in looking at
the social and political consequences of computing. Author of the
book, The Human Use of Human Beings, Wiener began pointing out the
dangers of nuclear war and the role of scientists in developing more
powerful weapons shortly after Hiroshima.
Past recipients of the Wiener Award have been: Dave Parnas, 1987, in
recognition of his courageous actions opposing the Strategic Defense
Initiative; Joe Weizenbaum, 1988, for his pioneering work emphasizing
the social context of computer science; Daniel McCracken, 1989, for
his work organizing computer scientists against the Anti-Ballistic
Missile deployment during the 1960s; Kristen Nygaard of Norway, 1990,
for his work in participatory design; Severo Ornstein and Laura Gould,
1991, in recognition of their tireless energy guiding CPSR through
its early years; and Barbara Simons, 1992, for her work on human
rights, military funding, and the U.C. Berkeley reentry program for
women and minorities.
Founded in 1981, CPSR is a national, nonprofit, public-interest
organization of computer scientists and other professionals concerned
with the impact of computer technology on society. With offices in
Palo Alto, California, and Washington, D.C., CPSR challenges the
assumption that technology alone can solve political and social
problems.
For more information about CPSR, the annual meeting, or the awards
banquet, call 415-322-3778 or send email to <cpsr@cpsr.org>.
For more information about IGC, contact Sarah Hutchison, 415-442-0220
x117, or send email to <sarah@igc.apc.org>.
------------------------------
Date: Sat, 16 Oct 93 17:44:16 PDT
From: annaliza@NETCOM.COM (Annaliza T. Orquamada)
Subject: File 4--Full Description of Proposed "Hacker" Documentary
((MODERATORS' NOTE: In CuD 5.81, we ran a short description of a
proposed documentary film on "Hackers," which intends to be an
antidote to conventional media depictions of the topic. We asked for
a more lengthy description of the project and received the following
summary. We combined two files after a long day of teaching, and hope
we have not omitted or re-edited inappropriately. Any errors or
omissions are the result of our editing, and not necessarily gaps in
the original posts.
We have long argued that conventional media depictions of "hacking"
are flawed. The more we learn about the proposed documentary, the more
encouraged we are that there exist film makers with both the talent
and the knowledge to produce antidotes to Forbes Magazine's "Hackers in
the Hood," Geraldo's "Mad Hacker's Tea-party," and Dateline's modem
hysteria, to name just a few of the more egregious examples of media
madness. Annaliza's group may or may not tell the "hacker story" in a
way that will please everybody, but we remain impressed with her
meticulous research and her open-mindedness. She is about to begin a
cross-country jaunt to interview/film those willing to talk with her,
so if you have a story to tell, think about letting her know)).
=====================================
TREATMENT FOR DOCUMENTARY: UNAUTHORIZED ACCESS ONLY
16 October 1993
annaliza@netcom.com
Lately the media have widely publicized the on-going dilemmas of
computer security experts whose job it is to stop systems crackers
(what the media have labelled as hackers) from breaking into secure
systems. There have been accounts of teenagers being sentenced for
stealing information, running up phone bills of thousands of dollars,
and even for espionage.
What is the real threat? Who are these people who break into computer
systems? Why do they do it?
Since the computer was first put on line and hooked up to a phone,
there has always been a risk to security. Breaking into computers is
viewed by many hackers as a mental game of chess. Often computer
professionals tolerate such break-ins as nothing more than inquisitive
minds trying to see if they can outwit the security experts. Most
hackers, when caught, show no remorse. In fact, they rarely view
themselves as criminals. They even hold conventions in various global
locations, often inviting their prosecutors to join them. So why is
hacking such a threat? How does it affect the computer community?
Who are these hackers and what are their objectives? Is there any
positive side to hacking?
The focus of this documentary will be to follow the hackers and see
what motivates them. It will be to show how they feel about the
underground computer community, and their own place within it. What
are their stories and their explanations? Do they have a political
agenda, or are they just joyriding through computer systems? How do
they feel about the media and its sensationalized attitude towards
computer cracking and the "outlaw cyberpunk"? What do they think is
the future of the computer underground?
The hacker scene is fractionalized. There are many types of hackers.
Some work in solitude, others in groups. Some work with cellular phones,
others are interested in programming. Some hackers obtain passwords and
codes through the underground or by "social engineering" company
employees or by using electronic scanners to listen in on phone
conversations. Some hackers know computer systems so well that they
don't need passwords but can log on to the computer directly by using
various security holes.
In most countries hacking is now illegal, so everyone who does hack
risks major penalties, even prison. Some groups have a political
agenda, or at least some unwritten moral code concerning the right to
information. There are various interests in the hacker scene
depending on the individual.
Some use hacking for personal gain. Kevin Poulsen, a hacker from Los
Angeles, used his knowledge of the phone system to block phone lines
to a radio station to win a new Porsche (Littman, 1993).
Some hackers are into military systems. One case in particular
involved a group of hackers in Germany who sold computer software
programs to the KGB. Though the software given to the Russians was
freely available in the West, the group faced espionage charges. The
hackers who sold the software displeased many in the W. German Hacker
Underground who believed it to be morally wrong to hack for monetary
gain. The project itself was allegedly started to bring the Soviets'
military computer software up to a standard matching the Americans'.
It was called "Project Equalizer" (Hafner and Markoff, 1991; Stoll,
1989).
The documentary will aim to find out more about what the political
premise of the hackers is presently and what its role will be in the
future. Are hackers using their skills for political reasons? Will
individual hackers play a major role in influencing the radical left
or the radical right in the future? Are hackers being used as
government or corporate spies? How do the hackers feel about computer
politics? How do hacker politics vary according to the nationalities
of the hackers themselves?
To date, the media have concentrated on systems crackers as the
entirety of the hacker community. Even though the community is
fractionalized, each section interacts with the others. The
documentary will explore other parts of the underground.
Mark Ludwig, author of "The Little Black Book of Computer Viruses",
recently unleashed one of his latest virus programs at Def Con 1, a
hacker convention that was held in Las Vegas in July of 1993. The
virus infects the computer's hard drive, encrypting everything
automatically. The only way to recover the data is to know the secret
password. This sent a buzz through the conference: any information
stored on a hacker's hard drive could be made impossible to retrieve,
simply by rebooting the computer, should the Secret Service come
bursting through the door.
Some hackers see themselves as artists. These hackers are always
offended when one confuses them with systems crackers. They see
themselves as more of an intellectual elite and are very condescending
towards systems crackers. One such hacker was able to penetrate a
NASA satellite probe. When the satellite was launched into space, a
peace sign appeared on its monitor.
The hacking community is growing. Every year conventions are held in
the United States, Germany, France and Holland, as well as throughout
the world. SummerCon, HoHoCon, Def Con, and The Hacking at the End of
the Universe Conference are some of the best known. In August of
1993, The Hacking at the End of the Universe Conference was reported
as having over 600 attendees. This particular global conference, put
on by Hack-Tic, was held outside of Amsterdam in Holland. The speakers
ranged from hackers to security experts to Police Agents. The press
was everywhere. A spread even appeared in Newsweek Magazine (July 26,
1993: 58). Though most Cons are places for exchanging information,
meeting electronic friends, and generally having a good time,
sometimes there are problems. Last year at PumpCon arrests were made.
At Def Con, Gail Thackeray, a woman who spends much of her time
prosecuting hackers, started her speech by saying she wasn't there to
bust anyone. Another speaker, Dark Druid, was unable to talk about
his planned topic because his prosecutor happened to be sitting in the
audience.
More and more hackers are making headlines in the news. The AT&T
crash of 1990 (though caused by a faulty line of code in the
switching software) led to speculation in some media stories and
among law enforcement officials that hackers might have been
responsible.
So why are hackers such a threat??? What does a hacker do that could
affect the average person?? One of the objectives of the documentary
will be to explore the technology available to the hacker.
Hackers are experts on the phone systems; they have to be in order to
hack systems without being traced. The really good hackers are able
to dial into the phone systems and trick the phone computers into
believing that they are part of the system, or even that they are the
controller of the system. So how do the hackers do it? Where do they
obtain their information? How do they get onto systems? How do they
get out without being traced? What can they do with their hacking
abilities?
Kevin Poulsen, in the instance of the KIIS-FM radio contest, was able
to use his knowledge of the phone system to take control of the phone
lines and wait until 119 calls had been placed. On the 120th he
simply blocked all of the incoming lines to make sure that only his
call got through.
A prank by another hacker involved taking control of the phone system
and then using it to reroute the calls of a certain probation officer.
When someone called up the probation officer's office, the caller
would be connected to a phone sex service (Sterling, 1992: 98-99).
Some European hackers broke into South African computer systems during
the boycott against the Apartheid system. The hackers deleted files
in South Africa to disrupt the political system and also were able to
monitor which companies were breaking the boycott by monitoring
computer systems.
A serious case that was to initiate Operation Sundevil and lead to
many arrests involved a document called E-911. This document
(though later found to be obtainable through legal channels for about
$13.95) was obtained by a hacker on one of his jaunts through the
phone system computers. The document was kept by the hacker as a
souvenir. He sent the document to a friend who published it in an
electronic magazine called Phrack (an electronic hacker magazine
available on the internet). The phone company was furious that their
supposedly secure system had been breached and that proprietary
information was being spread throughout the hacker community. Not
only was this stolen/private property, the document contained
information pertaining to the 911 emergency services. Although the
document had been edited so that no harmful information was published,
the phone company remained furious. Once a hacker has gained root or
super-user privileges at a phone company switching station there is
always the potential threat that they could do some very real damage
(intentionally or unintentionally). If a hacker could re-route a
judge's phone calls or have an enemy's phone disconnected or make free
calls globally, what is to stop them from cutting off the 911
emergency systems??? This is why the U.S. Secret Service (the
branch of the government that is responsible for the prosecution of
most electronic crime) went so far as to break down the doors of
15-year-olds, guns drawn, and haul them and all of their equipment away. One
hacker was reportedly banned from even going within 100 yards of a
computer terminal.
Our documentary will also explore the ramifications of the hacker's
actions. Many hackers have been arrested, imprisoned, and had their
computers as well as their software confiscated. Are these arrests
always justified? Many innocent people have been questioned by the
Secret Service and FBI purely on suspicion of involvement in
computer-related crime. In fact, it was the FBI's
investigation of the alleged "theft" of Apple proprietary source code
and its curious questioning of Mitch Kapor, founder of Lotus
Development Corporation, and John Perry Barlow, former Grateful Dead
lyricist, that led to the founding of the Electronic Frontier
Foundation (EFF) (Sterling, 1992:
232-238). Phil Zimmerman, the creator of an electronic privacy
encryption program called PGP, has been subpoenaed by the U.S.
government for creating a program that ensured legitimate privacy.
Many people have had their equipment confiscated without ever being
charged with a crime. Are fundamental human rights being violated because
of the fear of the unknown?
Is this fear really justified? If hackers can take control of local
switching stations (and they can), why don't they wreak havoc? If
there is such a threat to the general public then why don't hackers
cause more serious damage?
"Bellcore clearly believes that hackers are nothing short of
terrorists. A security alert from November 1990 warns that "the
potential for security incidents this holiday weekend is significantly
higher than normal because of the recent sentencing of the three
former Legion of Doom members. These incidents may include Social
Engineering (gaining information by posing as a bellcore employee over
the telephone), computer intrusion, as well as possible physical
intrusion."'*
But how do the hackers see themselves?? How do they justify breaking
into Bellcore, electronically or physically? If hackers are such a
major threat then why do so many corporations using computers hooked
up to outside connections leave their electronic doors wide open?
As computers become more available and widespread throughout the
community, so does hacking. This documentary hopes to address the real
threats, as well as the hype. Is hacking "intellectual joyriding"?
Or is it serious criminal behavior?
By humanizing the hacker scene this documentary hopes to demystify the
sinister mythos surrounding what has been deemed by the media as 'the
outlaw hacker'. It is not the documentary's objective to make
judgements, only to try to understand.
The documentary will run approximately 30 minutes. Our objective will
be to film at various hacker conventions and meeting places in the
United States and Europe. We will be shooting on broadcast quality
video. The documentary crew will be leaving Los Angeles at the
beginning of December and going to wherever there are people who want
to get involved in the project. Ultimately, we hope to show the film
at conferences, festivals, and perhaps on high-quality TV (such as
Channel 4 in England or PBS in the U.S.). It will also be suitable for
classroom viewing and related educational purposes.
This documentary is about the hacker community itself. We are looking
for monetary donations from the underground or from people sympathetic
to the underground. In this way, we will be able to make the
documentary without corporate or film company control. Our group is
comprised of film makers who are involved in the scene itself. We are
also looking for any donation of services, e.g., Beta transfer time, an
off-line/on-line editing suite, sound equipment, videotape, etc.
If anyone would like to get involved in the project in any capacity,
whether it be to go in front of the camera, or relate a story or a
hack anonymously to my e-mail address, or donate funds, or equipment
or editing time, please get in touch.
This documentary hopes to be an open forum for hackers to relate their
stories and ideas about the past/present/future. We hope to be able
to dispel the hype spread by other sensationalized media who are only
looking for a good story and don't really care about the ramifications
of their actions.
Anyone who is interested in any aspect of this project, please contact
me, Annaliza, at annaliza@netcom.com.
* Taken from 2600 Magazine - The Hacker Quarterly - Volume Nine,
Number Four - Winter 1992-93.
BIBLIOGRAPHY
Hafner, Katie, and John Markoff. 1991. _Cyberpunk: Outlaws and
Hackers on the Computer Frontier._ New York: Simon and Schuster.
Littman, Jonathan. 1993. "The Last Hacker." _The Los Angeles Times
Sunday Magazine_. September 12: 18 ff.
Sterling, Bruce. 1992. _The Hacker Crackdown_. New York: Bantam.
Stoll, Cliff. 1989. _The Cuckoo's Egg_. New York: Doubleday.
------------------------------
End of Computer Underground Digest #5.82
************************************