Neuron Digest   Wednesday, 16 Sep 1992                Volume 10 : Issue 3 

Today's Topics:
Limitations of Neural Nets with Quadrature data
Re: Limitations of Neural Nets with Quadrature data
Post-doctoral Fellowship
"Expert Systems" on Macintosh
2nd Request for (p)Reprints on Simulated Annealing
Petition: Computer Scientists object to gov't report
Senior Academic Post Available
studentships available almost immediately


Send submissions, questions, address maintenance, and requests for old
issues to "neuron-request@cattell.psych.upenn.edu". The ftp archives are
available from cattell.psych.upenn.edu (128.91.2.173). Back issues
requested by mail will eventually be sent, but may take a while.

----------------------------------------------------------------------

Subject: Limitations of Neural Nets with Quadrature data
From: samodena@csemail.cropsci.ncsu.edu (S. A. Modena)
Organization: Crop Science Dept., NCSU, Raleigh, NC 27695-7620
Date: 26 Aug 92 03:22:03 +0000

[[ Editor's Note: This message and the next were taken from a BIONET
mailing list, but seemed to be of potential interest to Digest readers.
Any protein synthesizers wish to comment? -PM ]]

"Feedforward Networks for Supervised Training:

"
We simulate supervised training of neural network connection weights and
minimize specified measures of pattern-association error. .....we
consider only feedforward connections. We start with the LMS algorithm
for simple perceptrons....

.....

"The simple two-layer pattern associator (perceptrons) ...relates input-
and output-layer activations by

                          8
    [1]   layer2[i] =    SUM   W[i,k]*layer1[k]        (i = 1, 2, ..., 8)
                         k=1

"
Our objective is to train the connection weights W[i,k] so that the
perceptron associates eight given binary (0,1) output patterns (target
patterns) with eight corresponding given binary input patterns, say the
output (0,0,0,0,0,0,0,1) with the input (1,1,0,0,0,0,0,0), etc. This
will not always work; a perceptron based on equation [1] can distinguish
input patterns only if they are "linearly separable" by a hyperplane in
pattern space. We can still obtain useful results.
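As a rough numerical illustration of equation [1] and the LMS (delta-rule) training it describes — using made-up patterns and a learning rate of my own choosing, not anything from the book — the 8x8 linear associator can be sketched as:

```python
import numpy as np

# Sketch of equation [1] with LMS training.  The eight input/target
# patterns here are illustrative stand-ins (one-hot inputs, so they are
# trivially linearly separable), not the book's patterns.
rng = np.random.default_rng(0)

inputs = np.eye(8)                       # eight binary input patterns (rows)
targets = np.roll(np.eye(8), 1, axis=1)  # pattern k maps to pattern k+1

W = rng.normal(scale=0.1, size=(8, 8))   # connection weights W[i, k]
eta = 0.5                                # learning rate (arbitrary)

def mse(W):
    return np.mean((inputs @ W.T - targets) ** 2)

err0 = mse(W)
for _ in range(200):
    for x, t in zip(inputs, targets):
        y = W @ x                        # equation [1]: layer2 = W * layer1
        W += eta * np.outer(t - y, x)    # LMS weight update
err1 = mse(W)
assert err1 < err0                       # training reduces association error
```

Because the illustrative inputs are linearly independent, the error can be driven essentially to zero; with the book's overlapping binary patterns convergence would depend on their separability, as the excerpt notes.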

......skip a lot....

"As we already noted, the input-pattern classes corresponding to
different output patterns must be linearly separable. This means that
their pattern points must be separated by a hyperplane

W*layer1 = CONST

in n-dimensional space. This is, for instance, not true for the simple
two-dimensional binary input patterns (0,0), (0,1), (1,0), (1,1) needed
for the XOR (exclusive-OR) operation. As we add extra layers, we shall
obtain more general separation hypersurfaces made up of multiple
hyperplane segments.
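The XOR example can be made concrete with a tiny hand-wired network (my own illustrative weights): no single threshold unit computes XOR, but two hyperplanes (an OR unit and an AND unit) combined by an output unit do, exactly the "multiple hyperplane segments" the excerpt mentions:

```python
# Hand-wired 2-2-1 threshold network for XOR (illustrative weights).
def step(x):
    return 1 if x >= 0 else 0

def xor_net(x1, x2):
    h_or  = step(x1 + x2 - 0.5)      # hyperplane 1: x1 + x2 = 0.5 (OR)
    h_and = step(x1 + x2 - 1.5)      # hyperplane 2: x1 + x2 = 1.5 (AND)
    return step(h_or - h_and - 0.5)  # fire only between the two hyperplanes

for (x1, x2), want in {(0, 0): 0, (0, 1): 1, (1, 0): 1, (1, 1): 0}.items():
    assert xor_net(x1, x2) == want
```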

"
Hidden Layers and Non-Linear Operations

"Augmenting the basic pattern associator... with intermediate "hidden"
layers will not help if the relations between different layer activations
are *linear* like equation [1], for such a network acts exactly like the
equivalent linear two-layer network. Nonlinear operations are
*absolutely* necessary.

QUOTED from: Neural Network Experiments on Personal Computers
and Workstations
Granino A. Korn
"
A Bradford Book"
The MIT Press 1991 Cambridge
QA76.87.K67
ISBN 0-262-61073-6 (with computer diskette)
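Korn's claim that hidden layers buy nothing without nonlinearity is easy to check numerically. A quick sketch (my own code, arbitrary shapes and seed):

```python
import numpy as np

# Stacked *linear* layers collapse to a single weight matrix; adding a
# nonlinearity breaks the equivalence.
rng = np.random.default_rng(1)
W1 = rng.normal(size=(5, 8))   # input  -> hidden
W2 = rng.normal(size=(8, 5))   # hidden -> output

x = rng.normal(size=8)
two_layer = W2 @ (W1 @ x)      # "deep" but purely linear network
collapsed = (W2 @ W1) @ x      # the equivalent single-layer network
assert np.allclose(two_layer, collapsed)

# With a nonlinear hidden activation the collapse no longer holds:
nonlinear = W2 @ np.tanh(W1 @ x)
assert not np.allclose(nonlinear, collapsed)
```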

This reading brought to mind two things:

1:
Jack Cramer argued that not all information is in the primary DNA
sequence in the case of histone-core placement...or at least, it
is better to have supplementary information available to "simplify"
the problem of site detection.

Some argued that a neural net ought "to work" by analogy with the
perceptron "success" for promoter recognition (...feel free to
correct me).

I interpret the above to mean that perhaps a NN can do it all with
just DNA sequence, but that NONLINEAR part is the zinger. Then, what
nonlinear partial derivative does one use to estimate the error
terms? That seems to be the critical point: the difference between
a theoretically possible solution and an actual solution. Actually,
this is similar to Shannon's proofs: proving that a code with certain
properties exists is not equivalent to describing the implementation
parameters of the code.

2:
The problem involves information that is succinctly expressed
in two-bit-wide alphabets....the difficulty experienced by perceptrons
appears to have to do with the conciseness of expressing four alternative
states with a two-bit representation. Here I feel there is a hint
(or reinforcement) of why channel codes often are implemented by
widening the number of bits for representing letters of the alphabet.
There is a need for hyperplane separability, and that is easily
achieved by increasing the character width (as I described in
the CD-ROM encoding scheme somewhat earlier).
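The alphabet-widening point can be sketched concretely (encodings below are my own illustrative choices). Under a two-bit code for the four DNA bases, a class such as {A, T} versus {C, G} is the XOR of the two bits, so no single hyperplane separates it; under a widened four-bit one-hot code the same class reduces to a trivial linear threshold:

```python
import numpy as np

# Two-bit packed codes vs. widened one-hot codes for the DNA alphabet.
two_bit = {"A": (0, 0), "C": (0, 1), "G": (1, 0), "T": (1, 1)}
one_hot = {"A": (1, 0, 0, 0), "C": (0, 1, 0, 0),
           "G": (0, 0, 1, 0), "T": (0, 0, 0, 1)}

# Under the two-bit code, membership in {A, T} is "both bits equal",
# i.e. the complement of XOR -- not linearly separable:
assert {two_bit[b][0] ^ two_bit[b][1] for b in "AT"} == {0}
assert {two_bit[b][0] ^ two_bit[b][1] for b in "CG"} == {1}

# Under one-hot widening, a single weight vector does the job:
w = np.array([1, 0, 0, 1])          # hyperplane w.x = 1 picks out {A, T}
for base in "ACGT":
    fires = int(w @ np.array(one_hot[base]) >= 1)
    assert fires == (1 if base in "AT" else 0)
```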

These thoughts lead me to wonder whether proteins designed to interact
in specific ways with DNA must have a "contact layer" not unlike
Layer1 of a neural net, and then subsequent "hidden layers" (other
parts of the folded protein structure) to perform nonlinear activation
functions in order to achieve hypersurface separability. The LayerN
neurons can be represented in a protein by the "decision action",
which in the case of Eco RI is "0" or "1": cleave or not-cleave.
Actually in the case of Eco RI, it is 00, 01, 10, 11: no cleavage;
cleave left only; cleave right only; cleave both......four states
of decision result needed to describe wildtype Eco RI and the studied
variants.

Back to NNs:

"
As we add extra layers, we shall obtain more general separation
hypersurfaces made up of multiple hyperplane segments."

That statement intrigues me. It's a way to think about protein design
also.

Steve

+------------------------------------------------------------------+
| In person: Steve Modena AB4EL |
| On phone: (919) 515-5328 |
| At e-mail: nmodena@unity.ncsu.edu |
| samodena@csemail.cropsci.ncsu.edu |
| [ either email address is read each day ] |
| By snail: Crop Sci Dept, Box 7620, NCSU, Raleigh, NC 27695 |
+------------------------------------------------------------------+
Lighten UP! It's just a computer doing that to you.
OOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOO


------------------------------

Subject: Re: Limitations of Neural Nets with Quadrature data
From: toms@fcs260c2.ncifcrf.gov (Tom Schneider)
Organization: Frederick Cancer Research and Development Center
Date: 26 Aug 92 18:09:38 +0000

In article <1992Aug26.032203.20189@ncsu.edu> samodena@csemail.cropsci.ncsu.edu
(S. A. Modena) writes:

>Some argued that a neural net ought "to work" by analogy with the
>perceptron "success" for promoter recognition (...feel free to
>correct me).

Before that, ribosome binding sites; after: splice junctions.

>The problem involved with information that is succinctly expressed
>in two bit wide alphabets....the difficulty experienced by perceptrons
>appears to do with the conciseness of expressing four alternative
>states with a two-bit representation.

Actually, the original use of the perceptron did NOT have this problem
because yours truly suggested the alphabetic widening! The first use of
a neural net to distinguish binding sites from other sequences was Gary
Stormo's thesis:

@article{StormoPerceptron1982,
author =  "G. D. Stormo and T. D. Schneider and L. Gold
           and A. Ehrenfeucht",
title =   "Use of the {`Perceptron'} algorithm to distinguish translational
           initiation sites in {{\em E. coli}}",
year =    "1982",
journal = "Nucl. Acids Res.",
volume =  "10",
pages =   "2997-3011"}

Nobody has gotten around to trying it directly with 4 symbols, as far as
I know. It just did not seem like a good idea because it forces a bias
into the results.

Tom Schneider
National Cancer Institute
Laboratory of Mathematical Biology
Frederick, Maryland 21702-1201
toms@ncifcrf.gov

------------------------------

Subject: Post-doctoral Fellowship
From: RREILLY@ccvax.ucd.ie
Date: 09 Sep 92 10:33:00 -0100

Human Capital and Mobility Programme of the Commission of the European
Communities

Postdoctoral Fellowship
==============================================================

Applications are invited for an EC funded post-doctoral fellowship with
the connectionist research group in the Dept. of Computer Science,
University College Dublin, Ireland. The duration of the fellowship may
be between 6 and 12 months. Remuneration will be at a rate of 3,255
ECU/month (this covers subsistence, tax, social insurance, etc.). The
fellowship is open to EC citizens other than citizens of Ireland.

The research topics are:

(1) The connectionist modelling of eye-movement control
in reading, and

(2) The connectionist modelling of natural language
processing.

Interested candidates should send me a letter of application, a CV, and a
list of their publications. They should also indicate which research
topic, and what particular aspects of it, they are interested in working
on.

Since the closing date for receipt of applications is September 25,
candidates are encouraged to send their applications either by e-mail or
FAX.

Ronan Reilly
Dept. of Computer Science
University College Dublin
Belfield
Dublin 4
IRELAND

Tel.: +353.1.7062475
Fax : +353.1.2697262
e-mail: rreilly@ccvax.ucd.ie
=====================================================================


------------------------------

Subject: "
Expert Systems" on Macintosh
From: VEMURI@icdc.llnl.gov
Date: Fri, 11 Sep 92 13:37:00 -0800

Dear friends:

I am being asked to guest edit a special issue of a technical periodical.
The proposed topic of this special issue is "Expert Systems on Macintosh
Platforms". The scope of the topic is broad in the sense that we would be
interested in all issues dealing with expert systems, shells,
applications, hybrid systems containing expert systems and neural nets,
expert systems and fuzzy systems, etc. The only constraint is to confine
ourselves to those systems and case studies done on Apple Macintosh
personal computers.

Our intention is to include both invited and submitted articles of high
quality. They will all be thoroughly reviewed by a panel of experts.

At this time my questions are the following:
Is the Apple platform an unreasonable restriction?
Is there enough material out there from people working on Apple PCs?
Who are some of the important players?

Before I launch on a call for papers, I am testing the waters to see who
is doing what.

Can you help?

V. Vemuri
Professor
Dept. of Applied Science
University of California, Davis
(510) 294-4051

------------------------------

Subject: 2nd Request for (p)Reprints on Simulated Annealing
From: Lester Ingber <ingber@alumni.cco.caltech.edu>
Date: Sat, 12 Sep 92 11:47:12 -0800

2nd Request for (p)Reprints on Simulated Annealing

I posted the text below in July, and have received many interesting
papers which I will at least mention in my review. It is clear that many
researchers use something "
like" simulated annealing (SA) in their work
to approach quite difficult computational problems. They take advantage
of the ease of including complex constraints and nonlinearities into an
SA approach that requires a quite simple and small code, especially
relative to many other search algorithms.

However, the bulk of the papers I have seen use the standard Boltzmann
annealing, for which it has been proven sufficient to only use a log
annealing schedule for the temperature parameter in order to
statistically achieve a global optimal solution. This can require a
great deal of CPU time to implement, and so these papers actually
"
quench" their searches by using much faster temperature schedules, too
fast to theoretically claim they are achieving the global optimum.
Instead they have defined their own method of simulated quenching (SQ).
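The distinction can be sketched in a few lines (my own toy example, not from any of the submitted papers): the same Metropolis loop, run once with the provably sufficient Boltzmann log schedule and once with a fast geometric "quench". Only the former carries the theoretical global-optimum guarantee.

```python
import math
import random

def anneal(schedule, steps=10000, seed=0):
    """Metropolis simulated annealing on a toy multimodal 1-D cost."""
    rng = random.Random(seed)
    def cost(x):
        return x * x + 10 * math.sin(3 * x)   # illustrative cost function
    x = 4.0
    best = cost(x)
    for k in range(steps):
        T = schedule(k)
        x_new = x + rng.gauss(0, 1)           # candidate move
        dE = cost(x_new) - cost(x)
        if dE < 0 or rng.random() < math.exp(-dE / T):  # Metropolis accept
            x = x_new
            best = min(best, cost(x))
    return best

T0 = 10.0
boltzmann = lambda k: T0 / math.log(k + 2)    # log schedule: SA proper
quench    = lambda k: T0 * 0.999 ** k         # geometric schedule: SQ

print(anneal(boltzmann), anneal(quench))
```

On this trivial problem both schedules usually land near the global minimum; the point is that only the log schedule is slow enough for the convergence proofs to apply, which is exactly why real applications quench instead.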

In many of their problems this really is not much of an issue, as there
is enough additional information about their system to be able to claim
that their SQ is good enough, and the ease of implementation certainly
warrants its use. I.e., anyone familiar with trying to use other
"
standard" methods of nonlinear optimization on difficult problems will
appreciate this. I also appreciate that faster SA methods, such as I
have published myself, are not as easily implemented.

I would like to have more examples of:
(1) papers that have really used SA instead of SQ in difficult problems.
(2) proposed/tested improvements to SA which still have the important
feature of establishing at least a heuristic argument that a global
optimum can indeed be reached, e.g., some kind of ergodic argument.

The review is on SA, and I do not have the allotted space or intention to
compare SA to other important and interesting algorithms.

Thanks.

Lester

}I have accepted an invitation to prepare a review article on simulated
}annealing for Statistics and Computing. The first draft is due 15
}Jan 93.
}
}If you or your colleagues have performed some unique work using
}this methodology that you think could be included in this review,
}please send me (p)reprints via regular mail. As I will be
}making an effort to prepare a coherent article, not necessarily an
}all inclusive one, please do not be too annoyed if I must choose not
}to include/reference work you suggest. Of course, I will formally
}reference or acknowledge any inclusion of your suggestions/material
}in this paper. While there has been work done, and much more remains
}to be done, on rigorous proofs and pedagogical examples/comparisons,
}I plan on stressing the use of this approach on complex, nonlinear
}and even stochastic systems.
}
}I am a "
proponent" of a statistical mechanical approach to selected
}problems in several fields; some recent reprints are available via
}anonymous ftp from ftp.umiacs.umd.edu [128.8.120.23] in the pub/ingber
}directory. I am not a hardened "proponent" of simulated annealing;
}I welcome papers criticizing or comparing simulated annealing to
}other approaches. I already plan on including some references that
}are openly quite hostile to this approach.

# Prof. Lester Ingber #
# ingber@alumni.caltech.edu #
# P.O. Box 857 #
# McLean, VA 22101 [10ATT]0-700-L-INGBER #


------------------------------

Subject: Petition: Computer Scientists object to gov't report
From: niv@linc.cis.upenn.edu (Michael Niv)
Organization: University of Pennsylvania
Date: 13 Sep 92 01:10:18 +0000

[[ Editor's Note: While this message is not directly concerned with
Neural Networks, I feel it *is* applicable to many of the Digest
readership. If you are involved with, or have an interest in, computer
science research in the United States, please consider the issues this
petition seeks to address and decide whether you feel the following
effort is justified and/or supportable. -PM ]]

Petition: A Broader Agenda for Computer Science and Engineering

A petition sponsored by John McCarthy, Bob Boyer, Jack Minker,
John Mitchell, and Nils Nilsson.

Dear Colleagues in Computer Science, Cognitive Science, and Engineering:

We are asking you to join us in asking the Computer Science and
Telecommunications Board of the National Research Council to withdraw for
revision its report entitled ``Computing the Future: A Broader Agenda for
Computer Science and Engineering'', because we consider it misleading and
even harmful as an agenda for future research. Our objections include
its defining computer science in terms of a narrow set of applied
objectives, and its implication that the tone of computer science is to
be set by government agencies, university administrators and
industrialists and that computer scientists are just the ``soldiers on
the ground''.

There is much useful information in the report, but the Preface and the
Executive Summary characterize computer science in a way that no other
science would accept. Chapter 2, ``Looking to the Future of CS&E'', and
Chapter 3, ``A Core CS&E Research Agenda for the Future'' should also not
be accepted by computer scientists. The Report merges computer science
and computer engineering at the cost of abolishing computer science and
seriously narrowing computer engineering.

Besides individual scientists, we hope that some computer science
departments will collectively join in requesting the report's withdrawal.

Our campaign for the report's withdrawal is being conducted entirely by
electronic mail, and we will be grateful to anyone who forwards this
message to others who might be concerned. Email to
signatures@cs.stanford.edu will be counted as signing the petition, not
as necessarily agreeing to everything in this message. In fact, the
sponsors of this message are committed to the petition and not
necessarily to every detail of the message. We haven't taken the time to
hash out every detail. So ``sign'' if you endorse the petition.

[The Latex source of the arguments against the report document is
available by anonymous ftp from sail.stanford.edu under the name
/pub/jmc/whysign.tex. The other two documents are /pub/jmc/petition.tex
for the petition itself and /pub/jmc/preface.tex for the preface and
executive summary of the NRC report. A non-Latex email message
containing both the petition and arguments for it may be found under the
name /pub/jmc/petition-why.]

As of 1992 September 4, the sponsors of this request are
Bob Boyer, boyer@cs.utexas.edu,
John McCarthy, jmc@cs.stanford.edu,
Jack Minker, minker@cs.umd.edu,
John Mitchell, jcm@cs.stanford.edu
and
Nils Nilsson, nilsson@cs.stanford.edu. John McCarthy may be telephoned
at 415 723-4430.

(from Stevan Harnad's PSYCOLOQUY list)

------------------------------

Subject: Senior Academic Post Available
From: rohwerrj <rohwerrj@cs.aston.ac.uk>
Date: Tue, 15 Sep 92 18:03:13 +0000

**************************************************************************
Senior Academic Post Available
Dept. of Computer Science and Applied Mathematics
Aston University
**************************************************************************

The Aston University Department of Computer Science and Applied
Mathematics is building a research group in neural networks, genetic
algorithms and related subjects. The group, led by the department
chairman Professor David Bounds and lecturers Richard Rohwer and Alan
Harget, currently has 7 PhD students. The department is seeking a
new senior faculty member, preferably at Reader or Professorial level,
to augment this group. The candidate must have proven skills as a
research leader. The appointee will also be involved in some teaching
and fundraising and will be expected to actively build upon Aston's
close relationship with industry. There is no prescribed timetable
for filling this post.

The Department has substantial computing resources, including a Sequent
Symmetry and 2 large Sun networks. Space has been set aside for
expansion. Aston University is in Birmingham, a convenient central
England location with easy access to the rest of England and Wales.

Inquiries should be directed to:

Professor David Bounds
CSAM
Aston University
Aston Triangle
Birmingham B4 7ET
ENGLAND

(44 or 0) (21) 359-3611 x4243


------------------------------

Subject: studentships available almost immediately
From: rohwerrj <rohwerrj@cs.aston.ac.uk>
Date: Tue, 15 Sep 92 18:05:18 +0000

*****************************************************************************
PhD STUDENTSHIPS AVAILABLE in NEURAL NETWORKS
Dept. of Computer Science and Applied Mathematics
Aston University
*****************************************************************************

Funding has unexpectedly become available at the last minute for 1 or
possibly 2 PhD studentships in the Neural Networks group at Aston
University. Ideally the students would enroll in October 1992. The
group currently consists of Professor David Bounds, lecturers Richard
Rohwer and Alan Harget, and 7 PhD students. Current research projects
are drawn from Genetic Algorithms and Artificial Life, as well as
main-line neural network subjects such as local basis function
techniques and training algorithm research, with an emphasis on
recurrent networks. For further information please contact me at the
address below.

Richard Rohwer
Dept. of Computer Science and Applied Mathematics
Aston University
Aston Triangle
Birmingham B4 7ET
ENGLAND

Tel: (44 or 0) (21) 359-3611 x4688 (failing that, leave message at x4243)
FAX: (44 or 0) (21) 333-6215
rohwerrj@uk.ac.aston.cs <-- email communication preferred.


------------------------------

End of Neuron Digest [Volume 10 Issue 3]
****************************************
