AIList Digest Sunday, 20 Jan 1985 Volume 3 : Issue 6
Today's Topics:
Algorithms - Malgorithms, Dead Rats, and Cell Death,
Business - Expert Systems Funding,
Humor - Blender Speeds & Oxymorons & Software Development
----------------------------------------------------------------------
Date: Thu, 17 Jan 85 16:04:49 cst
From: archebio!gmck@Uiuc.ARPA
Subject: malgorithms and dead rats (and C.elegans)
The notion of a pessimal algorithm is apparently not all that new; I
came across this paragraph in a report by Vaughan Pratt in the 1979
Mathematical Foundations of Computer Science Proceedings (Springer LNCS
#74).
``While short inferences padded with irrelevancies may seem
uninteresting objects to count, they nevertheless are the bane
of automatic theorem provers. A nice, albeit extreme, example of how
irrelevancies ("dead rats," as they are sometimes called) can slow
things down is provided by the problem of testing existence of feasible
solutions in linear programming, i.e. testing rational satisfiability
of conjunctions of linear inequations. A "minimally infeasible" system
(one in which the removal of any inequality would lead to existence of
a solution, corresponding to the negation of a theorem containing no
dead rats) can be tested for infeasibility by treating the inequations
as equations and solving by Gaussian elimination to demonstrate
infeasibility in polynomial time. A non-minimal system cannot be
treated in this way because the spurious inequations interfere with the
elimination process; indeed, no polynomial-time decision method is
known for the non-minimal case. The simplex method can be viewed as
eliminating irrelevant inequations as it moves from vertex to vertex.''
A rather far-out implication of this view is that the existence
of global techniques for solving linear programming problems
(Khachiyan's and Karmarkar's algorithms) suggests that the phenomenon
of "jumping to conclusions" found in human reasoning might operate
along the same lines. I could go on and babble about right
hemispheres and geometrical reasoning and massively parallel
processing, but I don't think it would be productive.
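For the curious, here is a minimal sketch of the test Pratt describes:
my own illustration, not his code, assuming NumPy is available. The
three inequations below form a minimally infeasible system; treating
them as equations and checking consistency (a rank test, i.e. Gaussian
elimination) runs in polynomial time, and for the minimal case Pratt's
observation is that this inconsistency demonstrates the infeasibility
of the original inequations.
    import numpy as np
    # Minimally infeasible system:  x + y <= 0,  -x <= -1,  -y <= -1.
    # Dropping any single inequation leaves a satisfiable system.
    A = np.array([[ 1.0,  1.0],
                  [-1.0,  0.0],
                  [ 0.0, -1.0]])
    b = np.array([0.0, -1.0, -1.0])
    # Treat the inequations as equations.  Ax = b is consistent iff
    # rank(A) == rank([A | b]); here the ranks differ, so the equality
    # system is inconsistent.
    consistent = (np.linalg.matrix_rank(A) ==
                  np.linalg.matrix_rank(np.column_stack([A, b])))
    print("equations consistent:", consistent)   # prints: False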
Also, there's an elaborate (and sometimes hilarious) discussion
of "pessimal algorithms" in the latest SIGACT news. The authors should
be credited, but I've forgotten their names and don't have my copy
handy.
George McKee
Dept. of Genetics and Development
Univ. of Illinois, Urbana
p.s. I should also comment on the remarks made about the cell death
during the development of the nematode _Caenorhabditis_elegans_.
It might not be a good idea (then again it might) to think that this
is just an artifact of the "tinkering" mode of evolution. Work with
parallel grammars ("L-systems") has shown that systems with deletion
(cell death) are Turing-equivalent, while systems without are only
equivalent to finite automata, or some such less powerful serial system.
In other words, once a lineage has discovered programmed cell death,
it has a vastly larger space of possible morphologies that it can
evolve through. And of course the fittest ones are the ones that
survive to be studied.
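As a toy illustration of the point about deletion (my own sketch, not
taken from the L-systems work cited), here is a parallel rewriting
system in Python in which one symbol's production is the empty string,
the grammatical analogue of programmed cell death:
    rules = {
        "A": "AB",   # a cell that divides into two cell types
        "B": "AC",   # a division that produces one doomed daughter
        "C": "",     # programmed cell death: the symbol is deleted
    }
    def rewrite(word, steps):
        # Apply all productions in parallel, once per step, as L-systems do.
        for _ in range(steps):
            word = "".join(rules.get(symbol, symbol) for symbol in word)
        return word
    for n in range(6):
        print(n, rewrite("A", n))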
------------------------------
Date: Thu, 17 Jan 85 20:57:01 PST
From: Richard K. Jennings <jennings@AEROSPACE>
Subject: Expert Systems Funding
We have heard Bob Munck's views on how the DoD plans
to apply expert systems to Command and Control. As a DoD (Air Force)
planner my perspective is different, and reversed: it is the captive
contractors who come up with these harebrained schemes to extort money
from the government, and people like myself have to track them down and
discredit them.
Although not speaking for my employer, the basic approach that
I have seen (and personally agree with) is to use expert systems
to help operators and decision-makers manage the vast amounts of
information available. Expert systems, various methods of data fusion,
and computer generated animation are all seen as part of an attack
on the man-machine interface problem that plagues us all.
Replacing brains with silicon is dumb for many reasons, while
offloading tasks that can be better accomplished by computers under
the supervision of an expert is critical to freeing the expert from
drudgery and boredom. Another plus for expert systems is that
they offer a technology which may let experts exploit the benefits
of automation with less effort spent outside their field of interest.
Rich.
------------------------------
Date: Wed 16 Jan 85 07:44:22-PST
From: Russ Altman <ALTMAN@SUMEX-AIM.ARPA>
Subject: Labelling the Speeds on a Blender.
[Forwarded from the Stanford BBoard by Laws@SRI-AI]
I'm usually easy going about things like this. But this afternoon, I
actually sat down and read the ordering of speed labels on my Hamilton Beach
Blender. I think it would be challenging to create a knowledge base
and reasoning system that would come up with the following order.
In case you're not familiar with blenders, they can usually "blend" at different
rpm, and the companies usually feel obliged to give us "mnemonics" for
remembering which level to use:
SLOW END:
1. whip 8. grind
2. stir 9. mix
3. puree 10. grate
4. beat 11. pulverize
5. aerate 12. churn
6. crumb 13. blend
7. chop 14. liquefy
FAST END
I find this dissatisfying.
I would much rather have an algorithm run in O(puree) than O(churn).
------------------------------
Date: Fri, 18 Jan 85 15:55:12 est
From: William Spears <spears@nrl-aic>
Subject: Oxymorons ...
Our favorite example is: "government workers"
Bill Spears
(and others)
NSWC/Dahlgren
------------------------------
Date: Thu, 17 Jan 85 11:39 CDT
From: quintanar <quintanar%ti-eg.csnet@csnet-relay.arpa>
Subject: Software Development
RECENT DEVELOPMENTS IN SOFTWARE IN TEXAS
(first in a series)
by Dr. Joe Bob VonNeumann
Software Engineer, Texas Software and Military University Mass Destruction Lab
translated into English by Harry Bloomberg, University of Texas at Dallas
This is the first in a series of papers describing recent advances in
the science of software development at Texas S&M. These articles will be
written with the non-software type in mind, so that others may attempt to
understand software with the proper amounts of awe and reverence. This
month's paper will cover two topics: 1) a new sorting algorithm and 2) an
application of Ohm's Law to computing.
THE MONKEY SORT
The following pseudo-code describes a new sorting algorithm that was
discovered by a project at the Generally Dynamic Corporation that has since
been classified Top Secret so that the designers' names could not be
associated with their work.
Procedure monkey_sort(n, keys)
/ n is the number of items to be sorted /
/ keys is an array that contains the data to be sorted /
BEGIN
    / create all possible permutations of keys /
    do i = 1 to n! by n
        / it's below my dignity to write so trivial a procedure as permute,
          so some day I'll pay some kid just out of school to write it /
        / create the next unused permutation of the data in keys and save
          it in array memory, starting at memory(i) /
        call permute(keys, memory(i), n)
    end do
    / search for the permutation that happens to be in the right order /
    do i = 1 to n! by n
        j = 1
        while (j <> n and memory(j+i-1) < memory(j+i)) do
            j = j + 1
        end while
        if j = n then
            / we found the right permutation! /
            print (memory(i+k-1), k = 1 to n)
            exit
        end if
    end do
END monkey_sort
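For readers who want to try this at home, here is a runnable sketch of
the pseudo-code above in Python (my own rendering, with the standard
library's itertools supplying the permute step the author found beneath
his dignity). The running time is O(n * n!), as advertised.
    from itertools import permutations
    def monkey_sort(keys):
        # Generate every permutation and return the first one that
        # happens to be in non-decreasing order.
        for candidate in permutations(keys):
            if all(candidate[j] <= candidate[j + 1]
                   for j in range(len(candidate) - 1)):
                return list(candidate)
    print(monkey_sort([3, 1, 4, 1, 5, 9, 2, 6]))   # [1, 1, 2, 3, 4, 5, 6, 9]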
The above sort's creators have affectionately named it the "Monkey
Sort" after the old tale that if enough monkeys were given word processing
equipment they would eventually write either "War and Peace" or 500 lines of
error-free Fortran code. However, industrial software managers should be
warned that this is not a very effective software development methodology, as
case studies of many DoD sponsored projects have shown.
Monkey sort does fill two critical needs. First, utilization of
processors and memory on future VHSIC-based systems will improve by several
orders of magnitude. The extreme step of investing time and money in
efficient High Order Language compilers would therefore be justified.
Second, the systems engineer will now have a baseline against which to
compare other algorithms. This is so close to the way the RF community
measures antenna gains that we may refer to Monkey Sort as an Isotropic
Process. Algorithms may be compared in decibels, which project managers are
more likely to understand than bounding software by a polynomial. For
example, a new CS graduate, rather than reporting "One algorithm is O(2n) and
the other is only O(n)," can now tell his boss that one is 3 dB faster than
another relative to Isotropic.
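For the record, the 3 dB figure is ordinary decibel arithmetic rather
than a Texas S&M invention: a factor-of-two speedup is 10*log10(2) dB.
A one-line check in Python:
    import math
    print(round(10 * math.log10(2), 2), "dB")   # prints: 3.01 dB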
AN OHM'S LAW FOR COMPUTING
With the withdrawal of Texas Instruments, Timex, and Mattel from the
home computer market, we can now announce the vital role that these
corporations played in the discovery of the basic particle of computation.
Just as the atom can be split into electrons, neutrons and protons, it has
been discovered that software can be divided into automatrons.
Further, it has been shown that these particles obey the famous Ohm's
Law (V = I * R) with only slight modifications:
- "I" now represents automatron flow through the computer system.
- "V" represents the Electrocomputive Force (ECF). This may be
thought of as the demand for computing. Research indicates that ECF
is an O(2**n) function where n represents the number of times project
managers ask "Are all the bugs out of the software yet?"
- "R" is the computational resistance and is directly proportional to
the number of Ph.Ds assigned to a project.
The automatron was discovered recently with the aid of the Fermi
National Lab's 1024 Gbaud bit smasher. Thousands of surplus TI 99/4As, Timex
1000s and Aquarius systems were hurled into lead walls at speeds approaching
the speed of light. Whatever was not consumed by the energy of the collision
was collected into bit buckets and carefully measured and examined.
The key to the data analysis effort was the application of a basic
principle of modern computer science: bits are like energy, in that they may be neither
created nor destroyed. The Fermi Lab scientists found many parity and
checksum errors on their instrumentation tapes and came to the conclusion that
the missing bits could be explained only through the existence of previously
unknown discrete packets of computational power -- automatrons.
Research in the field of computational physics is increasing at an
alarming rate. Scientists at MIT have found that the automatron can be
further divided into even smaller particles called nu-trons. The recently
announced NU-machine and NU-bus extensively use materials that act as nu-tron
super-conductors. Unfortunately, further nu-tron applications have been
stymied by a patent infringement lawsuit filed by Nairobi University (NU).
This suit claims that a machine has been built from parts of two-horned four-
legged animals that roam the plains of Africa. This machine is called the
Gnu-machine.
NEXT: Dr. VonNeumann explores advances Texas S&M has made in the
field of Confusibility Theory.
Biographical data: Dr. Joe Bob VonNeumann earned his BSEE from Faber
College, his MSCS at Darwin College, and his PhD at the elite Swiss
Naval Academy (best known as the organization that lost a competition
with the Swiss Army to produce a multi-role pocket knife). Before
joining Texas S&M, Joe Bob developed high probability-of-kill
mechanical systems for use against rodent threats for DeConn Labs.
Dr. VonNeumann relaxes by fishing in the Trinity River (he recently
traded in his bass boat for a gefilte boat) and by chasing after
single women in bars (favorite opening line: "Do you do assembly
language?"). Dr. VonNeumann is currently on a Leave of Absence to
document the mating habits of software people so that he may discover
why there are so few of them.
Dr. VonNeumann would like to hear any feedback on this paper. He is
especially interested in directed automatron-beam weapon applications
for use in the "Star Wars" program.
------------------------------
End of AIList Digest
********************