AIList Digest Tuesday, 9 Aug 1988 Volume 8 : Issue 42
Does AI kill? Sixth in a series ...
----------------------------------------------------------------------
Date: 26 Jul 88 21:33:29 GMT
From: bph@buengc.bu.edu (Blair P. Houghton)
Reply-to: bph@buengc.bu.edu (Blair P. Houghton)
Subject: Re: Does AI kill?
In a previous article, MORRISET@URVAX.BITNET writes:
>
> Suppose we eventually construct an artificially intelligent
> "entity." It thinks, "feels", etc... Suppose it kills someone
> because it "dislikes" them. Should the builder of the entity
> be held responsible?
>
You have to keep it strictly "artificial." I've always thought
of advanced neuromorphic devices as an artificial medium in
which *real* intelligence can occur. In which case the entity
would be held responsible.
Out of artificial-land: real children aren't held responsible,
unless they are.
--Blair
------------------------------
Date: 27 Jul 88 11:52:16 GMT
From: munnari!goanna.oz.au!jlc@uunet.UU.NET (Jacob L. Cybulski)
Subject: Re: does AI kill?
The Iranian Airbus disaster teaches us one thing about "AI techniques",
and that is that most AI companies forget that the end product
of AI research is just a piece of computer software that needs to be
treated like one, i.e. it needs to go through a standard software
life-cycle, and proper software engineering principles still apply to
it no matter how much intelligence is buried in its intestines.
And that is to say nothing of the need to train the system users.
Jacob
------------------------------
Date: 30 Jul 88 03:49:24 GMT
From: uplherc!sp7040!obie!wsccs!dharvey@gr.utah.edu (David Harvey)
Subject: Re: AI...Shoot/No Shoot
In article <854@lakesys.UUCP>, tomk@lakesys.UUCP (Tom Kopp) writes:
>
> I still don't understand WHY the computers mis-lead the crew as to the type of
> aircraft, especially at that range. I know that the Military has tested
I don't know what news sources you have been following lately, but it
was NOT the computer that misled the crew as to what kind of
airplane it was. The operators were switching between signals: one
known to be from a military aircraft and the one from the civilian plane.
Also, whoever decided that it was not important for the system to
determine the size of a plane, etc., really made an error in
judgement. You are bound to have civilian aircraft in and around battle
areas eventually. Don't expect any system to perform any better than
the designer has given it the capability to perform!
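Harvey's point can be illustrated with a toy sketch. This is entirely
hypothetical (the function and rules below are invented for illustration,
not how any real combat system works): a classifier that was never given
size or shape data can only echo whatever the attributed signal claims,
so the same physical aircraft gets opposite labels depending solely on
which signal the operators associate with it.

```python
def classify_contact(transponder_mode, altitude_trend):
    """Classify a radar contact using only the signals available.

    Hypothetical rule set: with no size/shape information, a contact
    attributed a military signal is assumed hostile, even if the
    aircraft is actually a civilian airliner whose track was confused
    with a nearby military one.
    """
    if transponder_mode == "military" and altitude_trend == "descending":
        return "assumed hostile"
    if transponder_mode == "civilian":
        return "assumed civilian"
    return "unknown"


# The same airliner, under two different signal attributions:
print(classify_contact("military", "descending"))  # assumed hostile
print(classify_contact("civilian", "climbing"))    # assumed civilian
```

The system performs exactly as designed; the error in judgement was
deciding which inputs it would never see.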
dharvey@wsccs (Dave Harvey)
------------------------------
Date: 1 Aug 88 19:53:38 GMT
From: cs.utexas.edu!sm.unisys.com!ism780c!logico!david@ohio-state.arpa
(David Kuder)
Subject: Re: AI...Shoot/No Shoot
In article <603@wsccs.UUCP> dharvey@wsccs.UUCP (David Harvey) writes:
>Also, whoever decided that it was not important to determine the size
>of a plane, et al, when they made the system really made an error in
>judgement. You are bound to have civilian aircraft in and around battle
>areas eventually.
Radar cross-section doesn't correlate well with visual cross-section.
This is the main idea behind "stealth" aircraft -- they have a small radar
cross-section.
Also in article <854@lakesys.UUCP> tomk@lakesys.UUCP (Tom Kopp) writes:
> I still don't understand WHY the computers mis-lead the crew as to the type of
> aircraft, especially at that range.
A common trick (tactic) is to hide behind a commercial aircraft.
Whether this is what the Iranians did or not, I don't know. If New York
is ever bombed it'll look like a 747 did it, though.
--
David A. Kuder Logicon, O.S.D.
{amdahl,uunet,sdcrdcf}!ism780c! \ 6300 Variel Ave. Suite H,
{ucbvax,decvax}!trwrb! -> !logico!david Woodland Hills, Ca. 91367
{genius,jplgodo,psivax,slovax}! / (818) 887-4950
------------------------------
End of AIList Digest
********************