Computer underground Digest Tue July 28, 1998 Volume 10 : Issue 42
ISSN 1004-042X
Editor: Jim Thomas (cudigest@sun.soci.niu.edu)
News Editor: Gordon Meyer (gmeyer@sun.soci.niu.edu)
Archivist: Brendan Kehoe
Shadow Master: Stanton McCandlish
Shadow-Archivists: Dan Carosone / Paul Southworth
Ralph Sims / Jyrki Kuoppala
Ian Dickinson
Field Agent Extraordinaire: David Smith
Cu Digest Homepage: http://www.soci.niu.edu/~cudigest
CONTENTS, #10.42 (Tue, July 28, 1998)
File 1--Encryption Policy Update (From Epic 510)
File 2--Are Gays' PCs PC?
File 3--How Technology Dumbs Down Language
File 4--FBI Asks Congress to Enhance Wiretap Powers (Epic 510)
File 5--NYT: Report Reveals Cost of Computer Incidents at Universities
File 6--RE: [NTSEC] Re: [Secure-NT] Followup to Rutstein review
File 7--Re: [NTSEC] Re: [Secure-NT] Followup to Rutstein review
File 8--REVIEW: "Personal Medical Information", Ross Anderson
File 9--Cu Digest Header Info (unchanged since 25 Apr, 1998)
CuD ADMINISTRATIVE, EDITORIAL, AND SUBSCRIPTION INFORMATION APPEARS IN
THE CONCLUDING FILE AT THE END OF EACH ISSUE.
---------------------------------------------------------------------
Date: Mon, 20 Jul 1998 18:18:18 -0400
From: EPIC-News List <epic-news@epic.org>
Subject: File 1--Encryption Policy Update (From Epic 510)
Published by the
Electronic Privacy Information Center (EPIC)
Washington, D.C.
http://www.epic.org
High-powered DES Cracker Developed
The Electronic Frontier Foundation announced on July 17 that it
has produced a DES cracking supercomputer, capable of brute
forcing a 56-bit DES key in four days or less. John Gilmore,
leader of the project, has published the source code, hardware
diagrams, and schematics in a book to encourage others to
duplicate his work. The Data Encryption Standard, developed in
1974 by IBM and the NSA, is possibly the most widely implemented
encryption algorithm in the world. The U.S. government has long
maintained that 56-bit DES offers adequate protection for
sensitive data.
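A rough back-of-the-envelope check (illustrative arithmetic only, not
EFF's published hardware figures; a four-day exhaustive sweep is assumed)
shows the key-testing rate such a machine implies. A few lines of Python:

    # How fast must a cracker test keys to sweep the full 56-bit DES
    # keyspace in four days?  (Illustrative arithmetic only.)
    keyspace = 2 ** 56                    # possible single-DES keys
    seconds = 4 * 24 * 60 * 60            # four days, in seconds

    worst_case_rate = keyspace / seconds  # try every key
    average_rate = worst_case_rate / 2    # key found halfway, on average

    print("Keys in the space: %d" % keyspace)
    print("Worst-case rate:   %.2e keys/second" % worst_case_rate)
    print("Average-case rate: %.2e keys/second" % average_rate)

The worst-case figure comes to roughly 2 x 10^11 keys per second, which is
why brute-forcing 56-bit DES is a hardware project rather than a desktop
exercise.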
Junger Decision
On July 7, a federal judge ruled in a closely followed encryption
case that source code does not enjoy First Amendment free speech
protection. Judge James Gwin of the U.S. District Court for the
Northern District of Ohio ruled that law professor Peter Junger
can not challenge encryption export restrictions on the ground
that they abridge his right to free speech on the Internet. In
his decision, Judge Gwin stated that "... exporting source code is
conduct that can occasionally have communicative elements.
Nevertheless, merely because conduct is occasionally expressive
does not necessarily extend First Amendment protection to it."
Professor Junger is expected to appeal the decision.
"ClearZone" Proposal
A group of 13 companies led by Cisco Systems announced on July 13
that they would develop a product called ClearZone, which would
enable routers to capture e-mail, URLs, and other data before they
are encrypted and sent over the network; the captured data could
then be turned over to law enforcement agencies. The proposal has serious
implications for personal privacy on the Internet, and many are
skeptical of Cisco's assertion that it will meet law enforcement's
demands and gain export approval.
New Crypto Export Guidelines
Secretary of Commerce William Daley announced on July 7 a new set
of guidelines for crypto exports for financial institutions such
as banks and credit card companies. U.S.-manufactured encryption
systems of any key length may be exported to a specified set of 45
countries by the financial firms once the products have been
subjected to a one-time examination by the Bureau of Export
Administration (BXA).
More information on encryption policy is available at:
http://www.crypto.org/
------------------------------
Date: Mon, 27 Jul 1998 20:17:42 -0700 (PDT)
From: David Batterson <davidbat@yahoo.com>
Subject: File 2--Are Gays' PCs PC?
Are Gays' PCs PC (Politically Correct)?
by David Batterson
Are gays/lesbians politically correct when we shell out money for
toys, tools, togs and travel? Should we be? While many of us do shop
at gay-owned and gay-friendly businesses, often we don't have those
choices. Or we simply go where the service and prices are best. Is
this wrong or right?
As a test, I polled (by e-mail) community leaders and openly-gay
business executives to find out what PCs they use, and if political
correctness played a part in their decision to buy those computers.
These were the questions:
(1) What computer (desktop and/or notebook) do you personally use (at
the office and/or home)........and why?
(2) Do you buy only from computer manufacturers that have a strong
commitment to employee diversity (including domestic partnership
benefits)? Should this be a consideration for GLBT consumers?
Following are the unedited responses. Replies appear in the order
they were received via e-mail.
Tim Gill, the openly-gay and very rich Chairman of the software
("QuarkXPress") firm Quark Inc. (www.quark.com), speaks out forcefully
on gay issues--both with his voice and his wallet. He helped fight
the infamous anti-gay Amendment 2 in Colorado while busy running this
top software firm, and gives speeches at gay organization meetings
and forums. His Gill Foundation has given $millions to GLBT and other
nonprofits. His response:
1. "I use Macs, and Windows machines of all varieties. Quark has
products for both Operating Systems, so I need to know both."
2. "There is no simple answer to this. However, if you're going to
boycott a company because they aren't gay-friendly, it is pointless
unless you tell the company what you've done. Send them a letter and
enclose a copy of the sales receipt for the system you did buy. Let
them know what they've lost. Or, if the system from the
non-gay-friendly company is lots cheaper, buy from them
and send the money you save to your favorite gay and lesbian
organization!
P.S. Here's a techno weenie observation about #2. You also save on
income taxes when you do this, so you should actually send in 25% more
than what you saved off the purchase (at least)."
Tom Rielly is CEO of PlanetOut (www.planetout.com), and co-founder of
Digital Queers (www.dq.com). Rielly replied: "I use a Mac and have
since August of 1984 because it's the best User Experience I've ever
tried. We do buy from computer makers with DP benefits. I think it
should be one consideration for queer consumers in addition to many
others."
The founder of AEGIS (AIDS Education Global Information System),
located on the Web at www.aegis.com, is Sister Mary Elizabeth. Sister
Mary said that "our laptop is a Toshiba 2105, a gift. My favorite
laptop is the IBM, as the keyboard has a better feel. Desktop, as
well as our servers, are internal designs based on the Intel Pentium
II 300 MHz CPU."
As for diversity, Sister Mary's position is: "As a nonprofit, with a
limited budget, relying on vendors with a strong commitment to
employee diversity is not always possible, as we have to consider
product reliability, support, and cost--factors to ensure that we can
stay online. I do feel this is an important issue, though, and I make
every effort to seek out and do business with GLBT supportive
organizations."
Craig Roberts is President of the mostly-gay-member San Diego
Democratic Club (members.aol.com/sddemoclub). Roberts said "I use my
PC at work for both work and personal needs. Consequently, I have no
say in what was purchased or from whom. I believe GLBT consumers
should always be cognizant of company policies on a variety of issues
(e.g., environmental concerns, diversity issues, labor policies, etc.)
when using their purchasing power."
Carrie Wong is Chairperson of Digital Queers and also COO of Niehaus
Ryan Wong, Inc. (www.nrwpr.com), a high-tech PR firm. Her replies
follow:
1. "At the office, I use a Gateway 2000 Pentium 200 processor w/32MB
RAM. It's our standard office computer that gives me enough power
and speed to keep connected, surf the Web and collaborate in real-time
with my co-workers. At home, I personally use a Macintosh PowerBook.
Even though our office has migrated almost entirely to PCs, I've stuck
with the Mac because I believe Apple is a good company (a heavy
supporter of domestic partnership rights and GLBT organizations such
as Digital Queers), and the Mac is an easy-to-use computer for remote
users like me."
2. "When it comes to purchasing computers, whether for work or
personal use at home, I'm afraid most people make buying decisions
primarily on price and functionality. I personally would support
buying a computer from any organization that has a strong commitment
to employee diversity, as long as the other requirements (price,
functionality) are met as well. In the PC business, not enough
'company culture' and 'ethics' topics are covered by the media. Most
computer trade publications focus on the hardware/software, so most
consumers don't get an insight into how a company performs on an
employee diversity commitment scale. That's unfortunate," Wong added.
Oregon State Representative George Eighmey is an openly-gay legislator
and attorney in Portland, who helped battle the infamous Oregon
Citizens Alliance and its gay-bashing Ballot Measure 9. His replies
follow.
1. "Office: Tangent-Pentium, Compaq Presario and Leading Edge WinPro
Home: MAC"
2. "I try, but I am not always successful either because of price,
availability or restrictions by the State of Oregon on sources of
purchase. Should this be a consideration for GLBT consumers?--Yes!
I belong to PABA (Portland Area Business Association), which is the
Portland area's own gay, lesbian and bi business group. We try to
purchase from each other as much as possible."
Jim Carroll at PFLAG wrote: "Kirsten Kingdon, PFLAG Executive
Director, forwarded your e-mail to me since I am the Director of
Administration." His responses were:
1. "Power Mac at work - due to assistance from Gill Foundation grant
and Digital Queers consultation."
2. "PFLAG's purchasing policies require an evaluation of a supplier's
inclusion of diversity employment policies (although domestic
partnership benefits is not specifically listed) for capital
purchases. PFLAG's purchasing policies were designed because it was
thought that this should be a consideration of GLBT consumers."
At deadline time, these were the only responses. Organizations
including NGLTF, HRC, GLAAD, Victory Fund, APLA and Shanti were
e-mailed to participate, but did not respond. The answers presented
here, however, should provide GLBT computer-users with enough
information to make up their own minds on this issue.
###
This article was written on a PC made by AST Research (on the HRC
list of gay-friendly companies). E-mail comments to
davidbat@yahoo.com. Copyright 1998, All Rights Reserved.
------------------------------
Date: Thu, 18 Jun 1998 12:45:57 -0400
From: Stephen Talbott <stevet@MERLIN.ALBANY.NET>
Subject: File 3--How Technology Dumbs Down Language
NETFUTURE
Technology and Human Responsibility
Issue #73 Copyright 1998 Bridge Communications June 18, 1998
Editor: Stephen L. Talbott (stevet@oreilly.com)
On the Web: http://www.oreilly.com/~stevet/netfuture/
You may redistribute this newsletter for noncommercial purposes.
*** How Technology Dumbs Down Language (162 lines)
From Steve Talbott <stevet@oreilly.com>
You've doubtless noticed that web search engines now offer on-the-spot
machine translation of foreign-language web pages. I'll spare you the
usual examples of comical translation. What worries me is not how bad
they are, but how we will go about making them better.
It's actually quite easy: all we need to do is to continue using ever
less evocative, less richly textured, less meaningful language. The more
we can resort to a flat, abstract, technical, and contentless vocabulary,
the more satisfactory the machine translation will be. If we could
finally learn to speak and write in something like a programming language,
we'd be blessed with near-perfect translations. Don't look for a *Moby
Dick* or *Leaves of Grass* to be written in this language, however.
But there's a second, complementary way for the translations to become
more acceptable: the reader can lower his standards of acceptance.
Commentator David Jolly tells us that, while computer translations were
once the butt of jokes, they are now taken quite seriously. He goes on:
But the real story is the Internet, because web-surfers aren't worried
about a publication-quality document; they just want to be able to
browse foreign websites. (CBS MarketWatch, May 13, 1998)
Of course, when we're "just browsing" we're not particularly concerned
about such things as depth of understanding, subtle distinctions, fidelity
to the source, and the intimate and sympathetic penetration of another
mind. These objectives, along with many others, fade into the background.
They may *need* to fade into the background on occasion. The concern on
the Net today is whether they are fading beyond retrieval.
In any case, all this underscores the question that a few people began
asking some years ago. In the convergence of human being and machine,
which is more fateful -- the machine's becoming more intelligent and
human-like, or the human being's becoming more machine-like? All the
commentary, all the prognostication, all the excitement seems focused on
the machine's generation-by-generation ascent -- which already suggests
that the human descent is well advanced.
Searching, Filtering, Blocking
------------------------------
The risks of machine translation are presenting themselves on several
fronts. To begin with, the widespread use of search engines encourages
authors to write for "searchability." The idea is to avoid the unexpected
(and therefore potentially more revelatory) word, and instead to appease
the audience's expectations. They will, after all, search according to
their expectations, and if they don't find you, what good will your words
do?
The same issues arise with filtering and blocking software. There is no
way -- and in principle never can be a way -- to implement a dependable
filter or blocker so long as our language remains alive and meaningful.
The blocking software must rely to one degree or another on past word
associations, automatically correlating certain words with particular
subjects and meanings. The result is that those whose intentions are not,
for example, pornographic, must avoid the "pornographic lexicon" or else
suffer blocking.
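A minimal sketch (the flagged-word list and threshold below are invented
for illustration; real products use larger lexicons and more elaborate
scoring) shows how such word-association blocking works, and why innocent
uses of the "pornographic lexicon" get caught along with the guilty:

    # Toy keyword blocker: score a page by counting words drawn from a
    # fixed lexicon and block anything at or above a threshold.
    FLAGGED = {"breast", "nude", "adult"}      # invented example lexicon
    THRESHOLD = 2

    def is_blocked(text):
        words = [w.strip(".,;:!?").lower() for w in text.split()]
        return sum(1 for w in words if w in FLAGGED) >= THRESHOLD

    # A medical page trips the filter as readily as anything else:
    print(is_blocked("Adult patients discuss breast cancer screening"))

The filter never reaches meaning at all; it has only past word
associations to go on, which is exactly the limitation described above.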
But -- as the study of meaning and metaphor has made abundantly clear --
the renewal of language and the extension of human understanding depend on
continual cross-fertilization between lexicons. Only in that way can we
counter the tendency for our language to harden into unrelated, narrow,
specialized usages that give us precision while eliminating expressive
power. Such specialized lexicons are ideal for capturing, in the most
prosaic terms, what we already know -- but disastrous for helping us to
take wing and transcend the previous limits of our understanding.
The concern for internationalization of web pages raises the same issues
yet again. Colorful, inventive, richly textured language is not only
difficult for foreigners to understand, but may also lead (we're told) to
unintended messages and even insults. The standard advice is to avoid
colloquialisms, unusual metaphors, and, in general, any unexpected use of
language.
While a genuine thoughtfulness may be at work in this advice, you will
find that I make none of the recommended accommodations in NETFUTURE. My
refusal is rooted in respect for the reader. To hear or read someone from
a different culture calls for a heroic effort of imagination and
sensitivity, and we do no favor to anyone by discounting this effort.
Personally, I would not want to encounter a foreign author in a watered-
down and patronizing form. Nor would I want to learn a foreign culture
through a compromised version of its language. Only the fullest and most
powerful use of language lends itself to the most profound grasp of the
speaker and his culture.
While I am not much of a stylist, I always try to do my best. I realize,
though, that this stance, taken in the wrong spirit, quickly becomes
arrogant. Certainly, for example, one-to-one communication calls for
profound mutual accommodation. The accommodation -- the willingness to
address the concrete individual in front of you -- is, in fact, nearly the
whole point.
But it happens that the mental effort and resourcefulness of imagination
required for this kind of accommodation is exactly what the machine-
reduction of language is now discouraging. You cannot accommodate to the
world of the other person without first doing the hard work of *entering*
it. The inability to achieve this work of imagination is surely
implicated in the various ethnic conflicts currently roiling the globe.
It is one of the characteristic paradoxes of the Net (a paradox lying, I'm
convinced, at the core of the entire technological enterprise) that the
tools designed to bridge the distance between peoples can operate in a
deeper way as tools for destroying even the bridges we already had.
Unspeaking the Creative Word
----------------------------
Voice recognition systems offer still another venue for the attack upon
language. But here it is no longer just the written word -- the word
already substantially detached from us -- that is at issue. It is more
directly we ourselves, in the fullest act of expression, who must adapt
ourselves to the machine's limitations. We must train ourselves toward
flatness, both in sound and meaning. But it is almost impossible to
achieve a given quality of voice without first achieving more or less the
same quality within oneself. Just how far is it healthy to practice inner
qualities of machine-likeness?
From ancient times the spoken voice -- the Word -- has been experienced as
the primary agent of creation. Still today we may occasionally hear dim
echoes of the Word's power, whether in song, or in dramatic presentation,
or at times when we are spooked, or in those intense, interpersonal
moments when everything hangs on the overtones of meaning and the soul-
gripping tonal qualities in the voice of the other.
I happen to believe that a lot hinges on our ability to rediscover, for
good or ill, the powers that stream into the world upon the current of the
human voice. It would, however, be a hard case to make to a computer-bred
generation. And, with our adaptation to machine translation, it promises
to become harder still.
Speaking of efforts to reform and simplify language, philologist Owen
Barfield has written,
Those who mistake efficiency for meaning inevitably end by loving
compulsion, even if it takes them, like Bernard Shaw, the best part of
a lifetime to get there .... Of all devices for dragooning the human
spirit, the least clumsy is to procure its abortion in the womb of
language; and we should recognize, I think, that those -- and their
number is increasing -- who are driven by an impulse to reduce the
specifically human to a mechanical or animal regularity, will continue
to be increasingly irritated by the nature of the mother tongue and
make it their point of attack. (Preface to second edition of *Poetic
Diction*)
Barfield wrote that in 1951. If he were writing today, I think he would
refer less to specific enemies of the mother tongue and more to the
emergence of a global logic of distributed intelligence and connectivity.
As we articulate more and more of our activities into the logical
operations of the computerized global system, we will also -- unless we
consciously resist the tendency -- sacrifice more and more of our creative
world of meaning, from which alone the future can arise.
(This is another illustration of my contention -- see NF #59 and 61 --
that the new threats of tyranny look less and less like issuing from
central, identifiable authorities, and more and more like properties of
"the system.")
==================
NETFUTURE is a newsletter and forwarding service dealing with technology
and human responsibility. It is hosted by the UDT Core Programme of the
International Federation of Library Associations. Postings occur roughly
once every week or two. The editor is Steve Talbott, author of "The
Future Does Not Compute: Transcending the Machines in Our Midst".
You may redistribute this newsletter for noncommercial purposes. You may
also redistribute individual articles in their entirety, provided the
NETFUTURE url and this paragraph are attached.
Current and past issues of NETFUTURE are available on the Web:
http://www.oreilly.com/~stevet/netfuture/
http://www.ifla.org/udt/netfuture/ (mirror site)
http://ifla.inist.fr/VI/5/nf/ (mirror site)
To subscribe to NETFUTURE, send an email message like this:
To: listserv@infoserv.nlc-bnc.ca
subscribe netfuture yourfirstname yourlastname
------------------------------
Date: Mon, 20 Jul 1998 18:18:18 -0400
From: EPIC-News List <epic-news@epic.org>
Subject: File 4--FBI Asks Congress to Enhance Wiretap Powers (Epic 510)
Published by the
Electronic Privacy Information Center (EPIC)
Washington, D.C.
http://www.epic.org
Last week, the FBI sought support from the Senate Appropriations
Committee for an amendment to the FY 1999 Justice Department
funding bill that would substantially amend the Communications
Assistance for Law Enforcement Act of 1994 (CALEA). The provision
would grant the Bureau new powers to conduct wiretaps and demand
changes to the nation's telephone system.
The amendment would limit the role of the Federal Communications
Commission (FCC) in mediating the current dispute between the FBI,
industry and public interest groups over the technical standards
implementing CALEA. It would require the FCC to adopt the current
draft standard and approve the controversial "punch list" of
additional surveillance features demanded by the FBI. Industry
and public interest groups would be precluded from commenting on
the standard.
The FBI proposal also would require phone companies to disclose
information on the "exact physical location" of cell phone
subscribers if a court finds that "there is a reason to believe
that the location information is relevant to a legitimate law
enforcement objective." Under this standard, no crime would be
necessary for judicial authorization. The proposal would also
permit law enforcement to obtain location information without a
warrant for any felony offense if they apply for a court order
within 48 hours.
EPIC and five other privacy groups wrote to Senator Ted Stevens
(R-AK), Chairman of the Senate Appropriations Committee, on July
17 urging him to reject the FBI proposal.
More information on the letter and CALEA is available at:
http://www.epic.org/privacy/wiretap/
------------------------------
Date: Tue, 28 Jul 1998 10:59:39 -0700
From: Jim Galasyn <blackbox@BBOX.COM>
Subject: File 5--NYT: Report Reveals Cost of Computer Incidents at Universities
July 27, 1998
Report Reveals Cost of Computer Incidents at Universities
By PAMELA MENDELS
A student receives an e-mail message with a fake warning that he
is a suspect in a Federal Bureau of Investigation child
pornography case. A hacker sets up a "Trojan horse" log-in screen
that captures the confidential passwords of 75 university
students. An innocent software upgrade leads to weeks of computer
crashes and disruption of service for students, faculty and
administration personnel.
These are three of the 30 incidents that researchers at the
University of Michigan uncovered in a recent report that examined
computer-related misdeeds and malfunctions in university settings.
The study took a look at computer snafus that had occurred from
about September 1996 to April 1998 at the 12 Midwestern
universities that make up the Committee on Institutional
Cooperation. The group, an academic consortium whose members
include the University of Chicago, Northwestern University, Purdue
University and the University of Minnesota, paid for the effort,
called the Incident Cost Analysis and Modeling Project.
The purpose was to get an idea of the kind of computer problems
that crop up at the universities and to estimate how much they
cost to handle.
<snip>
The study was prompted by concern that university lawyers and
insurers need a clearer picture of the kinds of mischief that
university computers can cause so they are better prepared to
manage the risk.
<snip>
In the 30 cases documented, researchers estimated that
universities spent about $1 million in cleanup costs. The money
paid for everything from new equipment to staff time, including
about 1,160 hours spent by one university computer specialist to
track down what eventually turned out to be a group of 20 to 30
hackers, one of whom had used a university computer account to try
to threaten a California-based Internet service provider.
Rezmierski emphasized that the study was not a scientific one --
and for a simple reason. Because no one knows about all of the
computer-related incidents that occur at the schools, researchers
could not select a random sampling of cases to examine.
<snip>
But hackers were far from the only source of headaches. Indeed,
other incidents involved old-fashioned theft, such as a break-in
at a university fundraising office. The stolen goods included a
computer containing sensitive information about 180,000 donors,
including their Social Security numbers, addresses and the amount
of money they contributed.
<snip>
And some serious incidents happened without any malicious intent.
For example, among the cases studied, the problem that cost the
most to solve occurred in a bumpy attempt to update the software
of a computer containing student files, financial information and
the school's Web page. After the upgrade, the system began
crashing frequently over a two-week period and then required
another week of repair before it functioned properly. It cost the
university about $14,300 to fix the problem, but students, staff
members and professors lost about another $175,000 in time that
could not be spent working on computer-dependent projects.
<snip>
------------------------------
Date: Mon, 27 Jul 1998 13:24:17 -0700
From: Jiva DeVoe <JivaD@MediServe.com>
Subject: File 6--RE: [NTSEC] Re: [Secure-NT] Followup to Rutstein review
Many of you are probably about to flame me for allowing such obvious
advocacy through our filters. Let me first say that followups to this
email to the effect of "<Insert OS here> rewlz" from either NT or Unix
camps will not be forwarded to the list. Secondly, the purpose of
letting this through was NOT to debate unix vs NT, but instead only to
bring up again the issue that Open Source is the way to go for security.
The reason the security auditing projects are so successful on Unix and
similar systems is that they are open-source platforms. The more people able to see
the source, the more eyes looking for potential problems, and the
quicker those problems can be uncovered and resolved. Followup messages
on the topic of Open Source merits (or lack thereof) in the security
community WILL be allowed through provided they contain no obvious
advocacy.
(Bill G., Free the NT source!)
-----Original Message-----
From--Adam Shostack [mailto:adam@homeport.org]
Sent--Monday, July 27, 1998 12:10 PM
To--dleblanc@mindspring.com
Subject--Re--[NTSEC] Re--[Secure-NT] Followup to Rutstein review
David, I think you may be falling into the labor theory of
value fallacy. As a systems administrator, before I became very
interested in security, I found books like Curry's to be very helpful.
It was clear. It was practical. It gave me most of what I needed in
about 200-300 pages. Books like that made it possible for me to have
reasonably secure systems. The fact that there was 20 years of
experience and understanding of the OS did not matter to me.
What matters is the fact that as a sysadmin, I could protect
my computers from attack, and I could do so reasonably well after
reading one smallish book.
I can not gain that understanding from one book on NT.
I agree with you that Rob's review of Rutstein was a bit on
the critical side, but on re-reading it, I was unable to find any
point to criticise. Thus, I am forced to conclude that Rob is
correct: There are no good books on NT security. Your analysis of why
this is, and what passable resources exist, is correct but
insufficient.
If you compare the linux audit project (or OpenBSD) to the NT
security efforts, you see a difference in the quality of the output
that is being produced on NT. I'll assert that a lack of openness
from MS is at least in part to blame for this. The heavy work of
disassembling and reverse engineering dramatically cuts the number of
white-hatted security experts willing and able to devote effort to NT.
Only when MS makes available security information that in the past has
been treated as internal secrets will we be able to really start
digging into the meat of NT, and finding and fixing security holes.
The security community is trying hard to understand NT.
Microsoft is not making it easier, and thus, the low quality of books
out there reflects on the low knowledge that exists.
Adam
PS: About the labor theory of value. Marx made the assertion
(here dramatically oversimplified) that the labor put into raw materials
is what makes them valuable. In fact, I prefer to eat a meal that,
say, Julia Child threw together in ten minutes to one that David has
slaved all day to make. Nothing personal David, and I'm sure you're a
fine chef, but I'm also confident that Julia Child is better. The
effort that goes into the product is not as important to me as a
gourmet as the quality of the food. And the food is better today on
the UNIX side of the bookshelf. :)
------------------------------
Date: Mon, 27 Jul 1998 21:20:01 -0400
From: David LeBlanc <dleblanc@mindspring.com>
Subject: File 7--Re: [NTSEC] Re: [Secure-NT] Followup to Rutstein review
At 03:10 PM 7/27/98 -0400, Adam Shostack wrote:
>It was clear. It was practical. It gave me most of what I needed in
>about 200-300 pages. Books like that made it possible for me to have
>reasonably secure systems. The fact that there was 20 years of
>experience and understanding of the OS did not matter to me.
[...snip...]
> The security community is trying hard to understand NT.
>Microsoft is not making it easier, and thus, the low quality of books
>out there reflects on the low knowledge that exists.
I think you're confusing "low quality" with "what I want to see". The
books that we're seeing right now are designed to explain how the OS works
so that people who are familiar with UNIX, Netware, etc., can use NT. They
typically contain what was known at the time. I think Rutstein has done a
good job explaining the areas he chose to tackle.
It seems a bit odd to be "reviewing" Rutstein now - the book was published
last year, and was probably written during 1996. We've learned an awful
lot since then. For example, Slade says "The suggestion to rename the
administrator account is fairly standard, but the renamed account may still
be vulnerable to attack because of identification of the security ID." I
agree with his point, but up until about a year ago, it wasn't widely known
how to get the admin's name. He's criticising something that wasn't
publicly known when Rutstein wrote the book - not a fair criticism. Should we
criticize Garfinkel and Spafford for failing to mention that NIS+ has
buffer overruns and, due to various failings, can be used to bring down an
entire network?
Perhaps instead of criticizing based on what isn't in the book, let's look
at what we need:
1) A clear explanation of how the mechanisms peculiar to NT operate - ACLs
(which behave a bit differently than u-g-w), privileges, auditing, how to
set policies - the usual admin kind of stuff (a rough sketch of the ACL vs.
u-g-w contrast appears after this list). Required reading for newbies -
the existing books do this pretty well.
2) An understanding of how to secure the file system and registry to
prevent trojans. This is an area where our understanding is evolving - NT
is normally used as a _personal_ machine, or is only accessed at the
console by admins. Under those conditions, trojans aren't that big a deal
- so I can hack myself - big deal. We are getting better at this as we go
along - the Coopers and Lybrand paper was a big step up from the books
available at the time, and Sutton's NSA paper is another big step up. If
you want concise coverage, that's the one to read.
3) What resources are made available across the network under what
conditions - this is an area where we're still sorting things out - and an
area where we can expect a lot of changes. Another evolving area is DCOM,
but DCOM itself is new.
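As a rough illustration of the contrast in point 1 (the principals, rights,
and evaluation rules below are simplified inventions, not the actual Win32
semantics), compare the familiar user/group/world bits with an ordered,
per-principal ACL:

    # Toy contrast: Unix u-g-w bits vs. an NT-style ordered ACL.
    def unix_allows(mode, is_owner, in_group, want_write):
        # mode is the usual 9-bit value, e.g. 0o640
        if is_owner:
            bits = (mode >> 6) & 0o7
        elif in_group:
            bits = (mode >> 3) & 0o7
        else:
            bits = mode & 0o7
        return bool(bits & (0o2 if want_write else 0o4))

    def acl_allows(acl, user, groups, right):
        # acl is an ordered list of (kind, principal, rights) entries;
        # the first matching entry wins, so a deny can trump a later allow.
        for kind, principal, rights in acl:
            matches = principal == "Everyone" or principal == user or principal in groups
            if matches and right in rights:
                return kind == "allow"
        return False   # no matching entry: denied by default

    acl = [("deny",  "Temps",      {"write"}),
           ("allow", "Accounting", {"read", "write"}),
           ("allow", "Everyone",   {"read"})]

    print(unix_allows(0o640, False, True, want_write=True))            # False
    print(acl_allows(acl, "chris", {"Temps", "Accounting"}, "write"))  # False

Per-principal entries, explicit denies, and ordering are the sort of
details a good NT security book has to spell out, because they have no
direct u-g-w analogue.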
Let's put it differently - just what is it that would be in a "high
quality" book on NT security?
------------------------------
Date: Fri, 24 Jul 1998 09:57:48 -0800
From: "Rob Slade, doting grandpa of Ryan and Trevor" <rslade@sprint.ca>
Subject: File 8--REVIEW: "Personal Medical Information", Ross Anderson
BKPRMDIN.RVW 980508
"Personal Medical Information", Ross Anderson, 1997, 3-540-63244-1,
U$45.00
%E Ross Anderson ross.anderson@cl.cam.ac.uk
%C 175 Fifth Ave., New York, NY 10010
%D 1997
%G 3-540-63244-1
%I Springer-Verlag
%O U$45.00 800-777-4643 fax: 201-348-4505 wborden@springer-ny.com
%P 250 p.
%T "Personal Medical Information: Security, Engineering, and Ethics"
The papers contained in this work were presented at a conference held
in Cambridge, UK, in June of 1996. Those attending were from medical,
legal, activist, legislative, and data security backgrounds. Most of
the material comes from the UK and German experience.
The first paper examines the purpose and ownership of medical
information: does the data belong to the patient or the NHS (National
Health Service), and what implications does ownership have for policy
regarding health information? This question is complicated by the
requirement for aggregated details in order to provide the proper
quality of service. In Germany, a "smart" card is being developed for
patient information and billing purposes; the debate and the various
options for the card are described in the second essay. Chapter three
looks generically (and in a rather jargon-laden manner) at the
distinctives of medical information systems. During rationalization
of the medical information systems of the German Democratic Republic
(GDR, East Germany) and the Federal Republic of Germany (West Germany)
the value of a central repository for cancer information was noted,
along with the danger of invasion of privacy in such consolidated
systems. The possibility of a distributed information system in which
patient information is held locally, but made available for non-
identifying epidemiological research is discussed in paper four. The
review of the use of information systems by general practitioners, in
chapter five, is general and anecdotal, rather than analytical.
The British Medical Association (BMA) has produced a policy paper on
the security and confidentiality of patient information. The sixth
essay takes issue with aspects of the BMA paper with particular
respect to acute care. Implementation of the policy in a multi-
practitioner practice in Yorkshire is noted in chapter seven. The BMA
policy is used as a case study for medical ethics analysis in chapter
eleven. Chapter twenty closes off the book with an update on the
policy.
Paper number eight is a somewhat simplistic view of a confidential
patient information architecture modelled on an ideal patient ward.
Unfortunately, it fails to account not only for real world situations,
but also for many important uses of medical information. Although
titularly involved with risk assessment, chapter nine is essentially a
statement of medical ethics in opposition to the surveillance of
patients used by for-profit managed care operations. With the
introduction of information technologies, wholesale modification of
institutions and systems is being undertaken, often with untoward
consequences. The aim of essay ten is to propose a model for re-
engineering that makes responsibility central to the enterprise in
order to avoid confidentiality problems. While many see patient
information as primarily business-related, chapter twelve looks at the
needs for data as a resource for research and treatment. Electronic
commerce tools are used to ensure confidentiality of patient
information transfer in paper thirteen. Similarly, public key
encryption is examined for the establishment of confidential auditing
of medical payments in essay fourteen. Chapter fifteen is a very
brief case study of the use of smart cards for medical data. The
philosophical review of medical ethics in chapter sixteen has only
tenuous connections to technology. Only an abstract is included for
presentation seventeen. Chapter eighteen is a review of privacy
policy in the United States. Nineteen is a case study from New
Zealand.
While the quality of the papers is uneven, the variety of viewpoints
is extremely valuable. Although there is a significant bias in favour
of patient confidentiality, some of the needs for sharing of
information are at least raised.
copyright Robert M. Slade, 1998 BKPRMDIN.RVW 980508
------------------------------
Date: Thu, 25 Apr 1998 22:51:01 CST
From: CuD Moderators <cudigest@sun.soci.niu.edu>
Subject: File 9--Cu Digest Header Info (unchanged since 25 Apr, 1998)
Cu-Digest is a weekly electronic journal/newsletter. Subscriptions are
available at no cost electronically.
CuD is available as a Usenet newsgroup: comp.society.cu-digest
Or, to subscribe, send a post with this in the "Subject:" line:
SUBSCRIBE CU-DIGEST
Send the message to: cu-digest-request@weber.ucsd.edu
DO NOT SEND SUBSCRIPTIONS TO THE MODERATORS.
The editors may be contacted by voice (815-753-6436), fax (815-753-6302)
or U.S. mail at: Jim Thomas, Department of Sociology, NIU, DeKalb, IL
60115, USA.
To UNSUB, send a one-line message: UNSUB CU-DIGEST
Send it to CU-DIGEST-REQUEST@WEBER.UCSD.EDU
(NOTE: The address you unsub must correspond to your From: line)
CuD is readily accessible from the Net:
UNITED STATES: ftp.etext.org (206.252.8.100) in /pub/CuD/CuD
Web-accessible from: http://www.etext.org/CuD/CuD/
ftp.eff.org (192.88.144.4) in /pub/Publications/CuD/
aql.gatech.edu (128.61.10.53) in /pub/eff/cud/
world.std.com in /src/wuarchive/doc/EFF/Publications/CuD/
wuarchive.wustl.edu in /doc/EFF/Publications/CuD/
EUROPE: nic.funet.fi in pub/doc/CuD/CuD/ (Finland)
ftp.warwick.ac.uk in pub/cud/ (United Kingdom)
The most recent issues of CuD can be obtained from the
Cu Digest WWW site at:
URL: http://www.soci.niu.edu/~cudigest/
COMPUTER UNDERGROUND DIGEST is an open forum dedicated to sharing
information among computerists and to the presentation and debate of
diverse views. CuD material may be reprinted for non-profit as long
as the source is cited. Authors hold a presumptive copyright, and
they should be contacted for reprint permission. It is assumed that
non-personal mail to the moderators may be reprinted unless otherwise
specified. Readers are encouraged to submit reasoned articles
relating to computer culture and communication. Articles are
preferred to short responses. Please avoid quoting previous posts
unless absolutely necessary.
DISCLAIMER: The views represented herein do not necessarily represent
the views of the moderators. Digest contributors assume all
responsibility for ensuring that articles submitted do not
violate copyright protections.
------------------------------
End of Computer Underground Digest #10.42
************************************