Neuron Digest Volume 06 Number 47

Neuron Digest	Monday, 13 Aug 1990		Volume 6 : Issue 47 

Today's Topics:
Summary (long): pattern recognition comparisons


Send submissions, questions, address maintenance and requests for old issues to
"neuron-request@hplabs.hp.com" or "{any backbone,uunet}!hplabs!neuron-request"
Use "ftp" to get old issues from hplpm.hpl.hp.com (15.255.176.205).

------------------------------------------------------------

Subject: Summary (long): pattern recognition comparisons
From: Richard Fozzard <fozzard@boulder.Colorado.EDU>
Date: Thu, 26 Jul 90 10:34:46 -0600

[[ Editor's Note: I've deleted extraneous headers and inserted
"<***********>" between individual entries. -PM ]]

Here are the responses I got for my question regarding comparisons of
connectionist methods with traditional pattern recognition techniques.

I believe Mike Mozer (among others) puts it best:

"Neural net algorithms just let you do a lot of the same things that
traditional statistical algorithms allow you to do, but they are more
accessible to many people (and perhaps easier to use)."


Read on for the detailed responses. (Note: this does not include anything
posted to comp.ai.neural-nets, to save on bandwidth)

rich

========================================================================
Richard Fozzard "Serendipity empowers"
Univ of Colorado/CIRES/NOAA R/E/FS 325 Broadway, Boulder, CO 80303
fozzard@boulder.colorado.edu (303)497-6011 or 444-3168

<***********>

Date: Tue, 17 Jul 90 13:53:55 -0500
From: honavar@cs.wisc.edu (Vasant Honavar)
Subject: pattern recognition with nn



Honavar, V. & Uhr, L. (1989).
Generation, Local Receptive Fields, and Global Convergence
Improve Perceptual Learning in Connectionist Networks,
In: Proceedings of the 1989 International Joint Conference
on Artificial Intelligence, San Mateo, CA: Morgan Kaufmann.

Honavar, V. & Uhr, L. (1989).
Brain-Structured Connectionist Networks that Perceive and Learn,
Connection Science: Journal of Neural Computing, Artificial
Intelligence and Cognitive Research, 1, 139-159.

Le Cun, Y. et al. (1990).
Handwritten Digit Recognition With a Backpropagation Network,
In: Neural Information Processing Systems 2, D. S. Touretzky (ed.),
San Mateo, CA: Morgan Kaufmann, 1990.

Rogers, D. (1990).
Predicting Weather Using a Genetic Memory: A Combination of
Kanerva's Sparse Distributed Memory With Holland's Genetic Algorithms,
In: Neural Information Processing Systems 2, D. S. Touretzky (ed.),
San Mateo, CA: Morgan Kaufmann, 1990.

<***********>

Date: Tue, 17 Jul 90 16:39:12 EDT
From: perry@seismo.CSS.GOV (John Perry)

Richard,
It depends on which neural network you are using, and the underlying
complexity in separating pattern classes. We at ENSCO have developed
a neural network architecture that performs far better than
traditional algorithms. Mail me if you are interested.

John L. Perry
ENSCO, Inc.
5400 Port Royal Road
Springfield, Virginia 22151
703-321-9000
email: perry@dewey.css.gov, perry@beno.css.gov

<***********>

Date: 17 Jul 90 14:36:00 MDT
From: "Dave Shaw" <shaw_d@clipr.colorado.edu>
Subject: RE: Networks for pattern recognition problems?

Rich- our experience with the solar data is still inconclusive, but it
would seem to indicate that neural nets exhibit no distinct advantage over
more traditional techniques, in terms of 'best' performance figures. The
reason appears to be that although the task is understood to be
non-linear (which should presumably lead to better performance by
non-linear systems such as networks), there is not enough data at the
critical points to define the boundaries of the decision surface. This
would seem to be a difficulty that all recognition problems must deal
with.

Dave

<***********>

Date: Tue, 17 Jul 90 14:02:44 PDT
From: kortge@galadriel.Stanford.EDU (Chris Kortge)

You may know of this already, but Gorman & Sejnowski have a paper on
sonar return classification in Neural Networks Vol. 1, #1, pg 75, where a
net did better than nearest neighbor, and comparable to a person.
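For context, the nearest-neighbor baseline in that comparison is simple enough to sketch. This is a modern, hypothetical illustration; the toy "rock"/"mine" labels only echo the sonar task, and none of the code or data is from the paper:

```python
def nearest_neighbor(train, x):
    """1-NN: return the label of the training example closest to x."""
    def dist2(a, b):
        # Squared Euclidean distance; the square root is monotone, so skip it.
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    return min(train, key=lambda pair: dist2(pair[0], x))[1]

# Invented toy data standing in for sonar-return feature vectors.
train = [((0.0, 0.0), "rock"), ((0.2, 0.1), "rock"), ((1.0, 1.0), "mine")]
label = nearest_neighbor(train, (0.9, 0.8))
```

Nearest neighbor needs no training at all, which is partly why it serves as a common reference point in these comparisons.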

I would be very interested in obtaining your list of
"better-than-conventional-methods" papers, if possible (maybe the whole
connectionists list would, for that matter).

Thanks--
Chris Kortge
kortge@psych.stanford.edu

<***********>

Date: Tue, 17 Jul 90 17:06:14 CDT
From: galem@mcc.com (Gale Martin)

I do handwriting recognition with backprop nets and have anecdotal
evidence that the nets do better than the systems developed by some of
the research groups we work with. The problem with such comparisons is
that the success of the recognition systems depends on the expertise of
the developers. There will never be a definitive study.

However, I've come to believe that such accuracy comparisons miss the
point. Traditional recognition technologies usually involve a lot of
hand-crafting (e.g., selecting features) that you can avoid by using
backprop nets. For example, I can feed a net "close to" raw inputs
and the net learns to segment them into characters, extract features, and
classify the characters. You may be able to do this with traditional
techniques, but it will take a lot longer. Extending the work to
different character sets becomes prohibitive, whereas it is a simple task
with a net.

Gale Martin
MCC
Austin, TX

<***********>

From: "Ted Stockwell" <ted@aps1.spa.umn.edu>
Subject: Re: Networks for pattern recognition problems?
Date: Tue, 17 Jul 90 17:12:33 CDT

> Do you know of any references to work done using connectionist (neural)
> networks for pattern recognition problems? I particularly am interested
> in problems where the network was shown to outperform traditional algorithms.
>
> I am working on a presentation to NOAA (National Oceanic and Atmospheric
> Admin.) management that partially involves pattern recognition
> and am trying to argue against the statement:
> "...results thus far [w/ networks] have not been notably more
> impressive than with more traditional pattern recognition techniques."
>

This may not be quite what you're looking for, but here are a few
suggestions:

1) Pose the question to salespeople who sell neural network software. They
have probably faced the question before.

2) One advantage is that the network characterizes the classes for you.
Instead of spending days/weeks/months developing statistical models,
you can get a reasonable classifier by just handing the training data
to the network and letting it run overnight. It does the work for you,
so development costs should be much lower.

3) Networks seem to be more often compared to humans than to other
software techniques. I don't have the references with me, but I
recall that someone (Sejnowski?) developed a classifier for sonar
signals that performed slightly better than human experts (which *is*
the "traditional pattern recognition technique").
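The hand-the-data-over workflow in point (2) can be illustrated with the simplest learned classifier. A minimal sketch, assuming a plain perceptron and invented, linearly separable toy data; nothing here is from the original exchange:

```python
def train_perceptron(data, epochs=50, lr=0.1):
    """Fit a linear threshold unit to (features, label) pairs, labels in {0, 1}."""
    n = len(data[0][0])
    w, b = [0.0] * n, 0.0
    for _ in range(epochs):
        for x, y in data:
            pred = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
            err = y - pred  # -1, 0, or +1
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
            b += lr * err
    return w, b

def classify(w, b, x):
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0

# Invented toy data: class 1 when the two features sum to more than 1.
data = [((0.0, 0.0), 0), ((0.1, 0.2), 0), ((0.9, 0.8), 1), ((1.0, 1.0), 1)]
w, b = train_perceptron(data)
```

Scaling this idea up (more units, hidden layers, gradient descent) is what backprop nets add; the workflow of "hand over labeled data, wait, get a classifier" is the same.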


Ted Stockwell U of MN, Dept. of Astronomy
ted@aps1.spa.umn.edu Automated Plate Scanner Project

<***********>

From: mariah!yak@tucson.sie.arizona.edu
Date: Tue, 17 Jul 90 14:58:40 -0700

Dear Dr. Fozzard,

I read your email message on a call for pattern recog. problems for which
NN's are known to outperform traditional methods.

I've worked in statistics and pattern recognition for some while, and have
a fair number of publications.

I've been reading the neural net literature, and I'd be quite surprised if
you get convincing replies in the affirmative to your question. My opinion
is that stuff even from the '60s and '70s, such as the books by Duda
and Hart, and Gonzalez and Fu, implemented on standard computers, is still
much more effective than the methodology I've come across using NN
algorithms, which are mathematically much more restrictive.


In brief, if you hear of good solid instances favorable to NN's, please
let me know.

Sincerely,

Sid Yakowitz
Professor


<***********>


Date: Tue, 17 Jul 90 21:18:33 EDT
From: John.Hampshire@SPEECH2.CS.CMU.EDU

Rich,

Go talk with Smolensky out there in Boulder. He
should be able to give you a bunch of refs. See also works by Lippmann
over the past two years. Barak Pearlmutter and I are working on a paper
that will appear in the proceedings of the 1990 Connectionist Models
Summer School which shows that certain classes of MLP classifiers yield
(optimal) Bayesian classification performance on stochastic patterns.
This beats traditional linear classifiers...

There are a bunch of results in many fields showing that non-linear
classifiers outperform more traditional ones. The guys at NOAA aren't
up on the literature. One last reference: check the last few years of
NIPS and (to a lesser extent) IJCNN proceedings.

NIPS = Advances in Neural Information Processing Systems
Dave Touretzky ed., Morgan Kaufmann publishers
IJCNN = Proceedings of the International Joint Conference on
Neural Networks, IEEE Press

John


<***********>


Date: Tue, 17 Jul 90 20:51:28 MDT
From: Michael C. Mozer <mozer@neuron>
Subject: Re: Help for a NOAA connectionist "primer"

Your boss is basically correct. Neural net algorithms just let you do a
lot of the same things that traditional statistical algorithms allow you
to do, but they are more accessible to many people (and perhaps easier to
use).

There is a growing set of examples where neural nets beat out
conventional algorithms, but nothing terribly impressive. And it's
difficult to tell in these examples whether the conventional methods were
applied appropriately (or the NN algorithm in cases where NNs lose to
conventional methods for that matter).

Mike


<***********>


Date: Wed, 18 Jul 90 00:40:21 EDT
From: burrow@grad1.cis.upenn.edu (Tom Burrow)
Subject: procedural vs connectionist p.r.


Sorry, this isn't much of a contribution -- mostly a request for your
replies. If you are not going to repost them via the connectionist
mailing list, could you mail them to me?

Now, for my mini-contribution: Yann LeCun et al.'s work as seen in NIPS 90
on segmented character recognition is fairly impressive, and they claim
that their results are state of the art.

Tom Burrow


<***********>


Date: Wed, 18 Jul 90 09:32:08 CDT
From: jsaxon@cs.tamu.edu (James B Saxon)
Subject: Re: Networks for pattern recognition problems?

In article <23586@boulder.Colorado.EDU> you write:
>Do you know of any references to work done using connectionist (neural)
>networks for pattern recognition problems? I particularly am interested
>in problems where the network was shown to outperform traditional algorithms.
>
>I am working on a presentation to NOAA (National Oceanic and Atmospheric
>Admin.) management that partially involves pattern recognition
>and am trying to argue against the statement:
>"...results thus far [w/ networks] have not been notably more
>impressive than with more traditional pattern recognition techniques."
>
>I have always felt that pattern recognition is one of the strengths of
>connectionist network approaches over other techniques and would like
>some references to back this up.

Well, aside from the question of ultimate generality to which the answer
is "OF COURSE there are references to neural network pattern recognition
systems. The world is completely full of them!"


Anyway, maybe you'd better do some more research. Here are a couple off the
top of my head:

Kohonen is really hot in the area; he's been doing it for at least ten
years. Everybody refers to some aspect of his work.

I also suggest picking up a copy of the IJCNN '90 San Diego proceedings,
all 18 lbs of it. (International Joint Conference on Neural Networks.)
But for a preview: I happened to sit in on just the sort of presentation
you would have liked to hear. The title was "Meteorological Classification
of Satellite Imagery Using Neural Network Data Fusion"
Oh Boy!!! Big title!
Oh, it's by Ira G. Smotroff, Timothy P. Howells, and Steven Lehar, from
the MITRE Corporation (MITRE-Bedford Neural Network Research Group),
Bedford, MA 01730. Well, the presentation wasn't too hot; he sort of hand
waved over the "classification" of his meteorological data, though he
didn't describe what we were looking at. The idea was that the system
was supposed to take heterogeneous sensor data (I hope you know these:
GOES--IR and visual, PROFS database--wind profilers, barometers,
solarometers, thermometers, etc) and combine them. Cool huh. If they
had actually done this, I imagine the results would have been pretty
good. It seems though that they merely used an IR image and a visual
image and combined only these two. Their pattern recognition involved
typical modeling of the retina which sort of acts as a band pass filter
with orientations, thus it detects edges. Anyway, their claim was the
following: "The experiments described showed reasonably good
classification performance. There was no attempt to determine optimal
performance by adding hidden units [Hey, if it did it without hidden
units, it's doing rather well.], altering learning parameters, etc.,
because we are currently implementing self-scaling learning algorithms
which will determine many of those issues automatically. [Which reminds
me, Lehar works with Grossberg at Boston University. He's big on pattern
recognition too, both analog and digital. Check out Adaptive Resonance
Theory, or ART, ART2, ART3.]..."
Anyway it looks like a first shot and
they went minimal. There's lots they could add to make it work rather
well. In terms of performance, I'd just like to make one of those
comments... From what I saw at the conference, neural networks will
outperform traditional techniques in this sort of area. The conference
was brimming over with successful implementations.

Anyway... Enough rambling, from a guy who should be writing his thesis
right now... Good luck on your presentation!

Oh, I think it automatically puts my signature on.....

Did it?

/--------------------------------------------\  James Bennett Saxon
| "I ought to join the club and beat you     |  Visualization Laboratory
|  over the head with it." -- Groucho Marx   |  Texas A&M University
\--------------------------------------------/  jsaxon@cssun.tamu.edu


<***********>


Date: Wed, 18 Jul 90 10:38:16 EDT
From: Henri Arsenault <arseno@phy.ulaval.ca>
Subject: papers on pattern recognition

In response to your request about papers on neural nets in pattern
recognition, there is a good review in IEEE Transactions on Neural
Networks, vol. 1, p. 28: "Survey of Neural Network Technology for
Automatic Target Recognition," by M. W. Roth. The paper has many
references.

arseno@phy.ulaval.ca


<***********>

Date: Wed, 18 Jul 90 09:07:50 PDT
From: d38987%proteus.pnl.gov@pnlg.pnl.gov
Subject: NN and Pattern Recognition

Richard, We have done some work in this area, as have many other people.

I suggest you call Roger Barga at (509)375-2802 and talk to him, or send him
mail at:

d3c409%calypso@pnlg.pnl.gov

Good luck,

Ron Melton
Pacific Northwest Laboratory
Richland, WA 99352


<***********>


To: mozer@neuron
Subject: Re: Help for a NOAA connectionist "primer"

mike,
thanks for the input - it seems a cogent summary of the (many)
responses I've been getting. However, it seems just about no one has
really attempted a one-to-one sort of comparison using traditional
pattern recognition benchmarks. Just about everything I hear and read is
anecdotal.

Would it be fair to say that "neural nets" are more accessible, simply
because there is such a plethora of 'sexy' user-friendly packages for
sale? Or is back-prop (for example) truly a more flexible and
widely-applicable algorithm than other statistical methods with
uglier-sounding names?

If not, it seems to me that most connectionists should be having a bit of
a mid-life crisis about now.

rich


<***********>


Date: Wed, 18 Jul 90 11:16:22 MDT
From: Michael C. Mozer <mozer@neuron>
Subject: Re: Help for a NOAA connectionist "primer"

I think NNs are more accessible because the mathematics is so
straightforward, and the methods work pretty well even if you don't know
what you're doing (as opposed to many statistical techniques that require
some expertise to use correctly).

For me, the win of NNs is as a paradigm for modeling human cognition.
Whether the NN learning algorithms existed previously in other fields is
irrelevant. What is truly novel is that we're bringing these numerical
and statistical techniques to the study of human cognition. Also,
connectionists (at least the cog sci oriented ones) are far more
concerned with representation -- a critical factor, one that has been
much studied by psychologists but not by statisticians.

Mike


<***********>


From: Ron Cole <cole@cse.ogi.edu>
Subject: Re: Networks for pattern recognition problems?

Call Les Atlas at U Washington. He has an article coming out in the August
Proceedings of the IEEE comparing NNs and CART on 3 real-world problems.

Ron

Les Atlas: 206 685 1315


<***********>


Date: Wed, 18 Jul 90 10:59:24 PDT
From: bimal@jupiter.risc.com (Bimal Mathur)
Subject: pattern recognition

The net result of the experiments we have done in pattern classification
on two-dimensional data (i.e., image to features, then classifying the
features using a NN) is that there is no significant improvement in the
performance of the overall system.
-bimal mathur - Rockwell Int


<***********>


Date: Wed, 18 Jul 90 14:10:28 EDT
From: Chip Bachmann <PH706008@brownvm.brown.edu>
Subject: Re: Networks for pattern recognition problems?


An example of research directly comparing neural networks with
traditional statistical methods can be found in: R. A. Cole, Y. K.
Muthusamy, and L. Atlas, "Speaker-Independent Vowel Recognition:
Comparison of Backpropagation and Trained Classification Trees," in
Proceedings of the Twenty-Third Annual Hawaii International Conference on
System Sciences, Kailua-Kona, Hawaii, January 2-5, 1990, Vol. 1, pp.
132-141. The neural network achieves better results than the CART
algorithm, in this case for a twelve-class vowel recognition task. The
data was extracted from the TIMIT database, and a variety of different
encoding schemes was employed.

Tangentially, I thought that I would enquire if you know of any
postdoctoral or other research positions available at NOAA, CIRES, or U.
of Colorado. I completed my Ph.D. in physics at Brown University under
Leon Cooper (Nobel laureate, 1972) this past May; my undergraduate degree
was from Princeton University and was also in physics. My dissertation
research was carried out as part of an interdisciplinary team in the
Center for Neural Science here at Brown.

The primary focus of my dissertation was the development of an
alternative backward propagation algorithm which incorporates a gain
modification procedure. I also investigated the feature extraction and
generalization of backward propagation for a speech database of
stop-consonants developed here in our laboratory at Brown. In addition,
I discussed hybrid network architectures and, in particular, in a
high-dimensional, multi-class vowel recognition problem (namely with the
data which Cole et al. used in the paper which I mentioned above),
demonstrated an approach using smaller sub-networks to partition the
data. Such approaches offer a means of dealing with the "curse of
dimensionality."



If there are any openings that I might apply for, I would be happy to
forward my resume and any supporting materials that you might require.


Charles M. Bachmann
Box 1843
Physics Department &
Center for Neural Science
Brown University
Providence, R.I. 02912
e-mail: ph706008 at brownvm


<***********>



Date: Wed, 18 Jul 90 16:05:39 EDT
From: Charles Wilson x2080 <wilson@magi.ncsl.nist.gov>
Organization: National Institute of Standards and Technology
formerly National Bureau of Standards
Subject: character recognition

We have shown on a character recognition problem that neural networks are
as good in accuracy as traditional methods but much faster (on a parallel
computer), much easier to program (a few hundred lines of parallel
Fortran), and less brittle.

See C. L. Wilson, R. A. Wilkinson, and M. D. Garris, "Self-Organizing
Neural Network Character Recognition on a Massively Parallel Computer,"
Proc. of the IJCNN, vol. 2, pp. 325-329, June 1990.

<***********>


Date: 18 Jul 90 16:22:00 MDT
From: "Dave Shaw" <shaw_d@clipr.colorado.edu>
Subject: RE: Networks for pattern recognition problems?

To date we have compared the expert system originally built for the task
with many configurations of neural nets (based on your work), multiple
linear regression equations, discriminant analysis, many types of nearest
neighbor systems, and some work on automatic decision tree generation
algorithms. Performance is measured both by the ROC P sub a (which turns
out to be only a moderate indicator of performance, due to the unequal
n's in the two distributions), and by maximum percent correct, given the
optimal bias setting. All systems have been trained and tested on the
same sets of training and test data. As I indicated before, the story
isn't completely in yet, but it is very hard to show significant
differences between any of these systems on the solar flare task.
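The second measure, maximum percent correct at the optimal bias setting, amounts to sweeping a decision threshold over the classifier's outputs and keeping the best accuracy. A minimal sketch with invented scores and labels (not the solar flare data):

```python
def best_percent_correct(scores, labels):
    """Max fraction correct over all decision thresholds.
    scores: real-valued detector outputs; labels: 1 = event, 0 = no event."""
    cands = sorted(set(scores))
    # Midpoints between distinct scores, plus one threshold below and above all.
    thresholds = ([cands[0] - 1.0]
                  + [(a + b) / 2 for a, b in zip(cands, cands[1:])]
                  + [cands[-1] + 1.0])
    best = 0.0
    for t in thresholds:
        correct = sum((s > t) == bool(y) for s, y in zip(scores, labels))
        best = max(best, correct / len(labels))
    return best

scores = [0.1, 0.4, 0.35, 0.8, 0.7, 0.2]  # invented detector outputs
labels = [0, 0, 1, 1, 1, 0]
```

Unlike the ROC area, this single number depends on the class proportions in the test set, which is one reason neither measure alone settles the comparison.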

Dave


<***********>


Date: Wed, 18 Jul 90 16:45:41 EST
From: George Kaczowka <uvm-gen!idx!gsk@uunet.UU.NET>
Subject: Networks for pattern recognition problems?

Rich --

I don't know if this helps, but a company in Providence RI called NESTOR has
put together a couple of products, some of which have been customized
systems for customers solving pattern recognition problems. One I
remember was regarding bond trading in the financial world. I seem to
remember that the model outperformed the "experts" by at least 10-15%,
and that this was used (and is, as far as I know) by some on Wall Street.
I know that they have been in the insurance field for claim analysis as
well as physical pattern recognition. They were founded by a PhD out of
Brown University, and I am sure that you could obtain reference works
from them. I understand that they are involved in a few military
pattern recognition systems for fighters as well.
Good luck. I was interested in their work some time ago, but have been off
on other topics for over a year.
-- George --

------------------------------------------------------------
- George Kaczowka IDX Corp Marlboro, MA - gsk@idx.UUCP -
------------------------------------------------------------

<***********>


Date: Fri, 20 Jul 90 08:48:46 EST
From: marwan@ee.su.oz.AU (Marwan Jabri)
Subject: pattern recognition

We have been working on the application of neural nets to the pattern
recognition of ECG signals (medical). I will be happy to mail you some of
our very good results, which are better than what has been achieved using
conventional techniques. Is this the sort of thing you are looking for?
What medium do you want?

Marwan Jabri

-------------------------------------------------------------------
Marwan Jabri, PhD Email: marwan@ee.su.oz.au
Systems Engineering and Design Automation Tel: (+61-2) 692-2240
Laboratory (SEDAL) Fax: (+61-2) 692-3847
Sydney University Electrical Engineering
NSW 2006 Australia


<***********>


Date: Fri, 20 Jul 90 10:10:26 CST
From: PP219113@tecmtyvm.mty.itesm.mx
Subject: Re: Networks for pattern recognition problems?
Organization: Instituto Tecnologico y de Estudios Superiores de Monterrey

Hi. David J. Burr (in "Experiments on Neural Net Recognition of Spoken and
Written Text," IEEE Trans. on ASSP, vol. 36, no. 7, pp. 1162-68, July 1988)
suggests that neural nets and nearest-neighbor classification perform at
nearly the same level of accuracy. My own experience with character
recognition using neural nets actually suggests that neural nets perform
better than nearest neighbor and hierarchical clustering. (I suggest
talking to Prof. Kelvin Wagner, ECE, UC-Boulder.) See also "Survey of
Neural Net Tech for Automatic Target Recognition" by M. W. Roth, IEEE
Trans. on Neural Networks, March 1990, p. 28.

jose luis contreras-vidal


<***********>


Date: 20 Jul 90 15:32:00 MDT
From: "Dave Shaw" <shaw_d@clipr.colorado.edu>
Subject: RE: Networks for pattern recognition problems?

Rich- the network configurations we have used are all single hidden layer
of varying size (except for 1 network with no hidden layer). Hidden layer
size has been varied from 1 to 30 units. Input layer=17 units, output
layer=1 unit. All activation functions sigmoidal. As I indicated before,
there was essentially no difference between any of the networks. We are
moving towards a paper (one at least) and this work will likely be
included as part of my dissertation as well.
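A concrete (and entirely hypothetical) sketch of the smallest version of the architecture described: 17 inputs, one sigmoid hidden layer (5 units here, within the 1-30 range mentioned), one sigmoid output, trained by plain backprop on squared error. None of this is the authors' code or data:

```python
import math
import random

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def make_net(n_in=17, n_hid=5, seed=0):
    """Random small weights; each weight row carries a trailing bias term."""
    rng = random.Random(seed)
    w_hid = [[rng.uniform(-0.5, 0.5) for _ in range(n_in + 1)]
             for _ in range(n_hid)]
    w_out = [rng.uniform(-0.5, 0.5) for _ in range(n_hid + 1)]
    return w_hid, w_out

def forward(net, x):
    w_hid, w_out = net
    h = [sigmoid(sum(w[i] * xi for i, xi in enumerate(x)) + w[-1])
         for w in w_hid]
    y = sigmoid(sum(w_out[j] * hj for j, hj in enumerate(h)) + w_out[-1])
    return h, y

def train_step(net, x, target, lr=0.5):
    """One backprop update on squared error for one example; returns output."""
    w_hid, w_out = net
    h, y = forward(net, x)
    d_out = (y - target) * y * (1.0 - y)
    # Hidden deltas must use the pre-update output weights.
    d_hid = [d_out * w_out[j] * h[j] * (1.0 - h[j]) for j in range(len(h))]
    for j, hj in enumerate(h):
        w_out[j] -= lr * d_out * hj
    w_out[-1] -= lr * d_out
    for j, w in enumerate(w_hid):
        for i, xi in enumerate(x):
            w[i] -= lr * d_hid[j] * xi
        w[-1] -= lr * d_hid[j]
    return y
```

With no hidden layer at all (the one exception mentioned above), the model reduces to logistic regression, which is one way of seeing why such nets and traditional statistics can land so close together on a task like this.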

Dave



------------------------------

End of Neuron Digest [Volume 6 Issue 47]
****************************************
