Neuron Digest   Saturday, 22 Feb 1992                Volume 9 : Issue 8 

Today's Topics:
pygmalion
Postdoctoral position at Bellcore
position announcement
Graded Learning
Optimizing inductive bias
Postdoc at Oak Ridge
Advertisement - Research associate or officer position
Cognitive Science Job at NYU
Open letter to Dr. Shun-Ichi Amari


Send submissions, questions, address maintenance, and requests for old
issues to "neuron-request@cattell.psych.upenn.edu". The ftp archives are
available from cattell.psych.upenn.edu (128.91.2.173). Back issues
requested by mail will eventually be sent, but may take a while.

----------------------------------------------------------------------

Subject: pygmalion
From: "Robert Jandy, Computing Service" <ROBERT@rmcs.cranfield.ac.uk>
Date: Sat, 04 Jan 92 12:52:00 +0000

[[ Editor's Note: I don't even know to what this refers. Could someone
help out? -PM ]]

Does anyone have any idea where I could get a copy of the PYGMALION database?

Thanks,
Robert

Robert@rmcs.cran.ac.uk


------------------------------

Subject: Postdoctoral position at Bellcore
From: Joshua Alspector <josh@flash.bellcore.com>
Date: Thu, 23 Jan 92 09:54:10 -0500


The neural network research group at Bellcore is looking for a
post-doctoral researcher for a period of one year. The start date is
flexible but should be before August 1992.

Because of its inherently parallel nature, neural network technology is
particularly suited for the types of real-time computation needed in
telecommunications. This includes data compression, signal processing,
optimization, and speech and pattern recognition. Neural network
training is also suitable for knowledge-based systems where the rules are
not known or where there are too many rules to incorporate in an expert
system. The goal of our group is to bring neural network technology into
the telecommunications network.

Our approach to developing and using neural network technology
to encode knowledge by learning includes work on the following:
1) Development, analysis, and simulation of learning algorithms
and architectures.
2) Design and fabrication of prototype chips and boards suitable for
parallel, high-speed, neural systems.
3) Telecommunications applications.

We are interested in strong candidates in any of the above work areas,
but we especially encourage people who can demonstrate the usefulness of
the technology in telecommunications applications.

The successful candidate should have a demonstrated record of
accomplishment in neural network research, should be proficient at
working in a UNIX/C environment, and should be able to work interactively
in the applied research area at Bellcore.

Apply in writing to:

Joshua Alspector
Bellcore, MRE 2E-378
445 South St.
Morristown, NJ 07962-1910

Please enclose a resume, a copy of a recent paper, and the
names, addresses, and phone numbers of three referees.

------------------------------

Subject: position announcement
From: Colleen Seifert <seifert@csmil.umich.edu>
Date: Thu, 23 Jan 92 17:03:41 -0500

The University of Michigan Department of Psychology invites applications
for a tenure-track position in the area of Cognitive Modelling. We seek
candidates with primary interests and technical skills in cognitive
psychology, with special preference for individuals with particular
expertise in computational modelling (broadly defined, including
connectionist modelling). Due to time constraints, please indicate
interest via email (to "gmo@csmil.umich.edu") and also send a vita,
references, publications, and a statement of research and teaching
interests to: Cognitive Processes Search Committee, Dept. of Psychology,
University of Michigan, Ann Arbor, MI 48109.


------------------------------

Subject: Graded Learning?
From: Chris Milton <cpm@advanced-robotics-research-centre.salford.ac.uk>
Date: Fri, 24 Jan 92 14:01:11 +0000


I am trying to find out about a method of teaching neural networks called
Graded Learning, which is mentioned in Hecht-Nielsen's book.

A friend told me about this method and commented that, according to
Hecht-Nielsen, all the algorithms for performing Graded Learning are
copyrighted.

As far as I can make out, the basic method is to reduce a cost function
associated with each training pattern and to give the network some
information about how well it is doing based on that cost function,
rather than reducing the output error directly as in supervised learning.
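
[[ To make the contrast concrete, a minimal sketch follows, assuming the
"grade" is a single scalar score per training pattern rather than a full
error vector. It is an illustration of that reading only and is not taken
from Hecht-Nielsen's book or the copyrighted algorithms mentioned above. ]]

# Minimal sketch: supervised vs. "graded" updates for one linear unit.
# Illustrative only; not Hecht-Nielsen's (copyrighted) graded-learning
# algorithms.
import numpy as np

rng = np.random.default_rng(0)
w = rng.normal(size=3)           # weights of a single linear unit
x = np.array([0.5, -1.0, 2.0])   # one training pattern
target = 1.0                     # desired output for this pattern
lr = 0.1

# Supervised learning: the full error signal (target - output) drives the update.
output = w @ x
w_supervised = w + lr * (target - output) * x

# Graded learning, as understood above: the network only receives a scalar
# grade derived from the cost of its response; here we perturb the weights
# and keep the change only if the grade improves.
def grade(weights):
    return -(target - weights @ x) ** 2   # higher grade = lower cost

trial = w + rng.normal(scale=0.1, size=w.shape)
w_graded = trial if grade(trial) > grade(w) else w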

If anybody out there knows anything about Graded Learning methods I would
be very interested to hear from them.

Please send any info to cpm@uk.ac.salf.arrc.

Cheers and thanks in advance -

Chris Milton - The National Advanced Robotics Research Centre U.K.


------------------------------

Subject: Optimizing inductive bias
From: Bruce Lambert <U53076%UICVM.BITNET@bitnet.cc.cmu.edu>
Date: Sun, 26 Jan 92 23:32:17 -0600

Hi folks,

Recently Yoshua Bengio posted a note about using standard optimization
techniques to set the tunable parameters of neural nets. Dave Tcheng and
I have been working on the same basic idea at a more general level for
several years. Rather than optimizing just networks, we have developed a
framework for using optimization to search a large inductive bias space
defined by several different types of algorithms (e.g., decision tree
builders, nets, exemplar-based approaches, etc.). Given the omnipresent
necessity of tweaking biases to get good performance, automation of the
bias search seems very sensible. A couple of references to our work are
given below. We hope you find them useful.

-Bruce Lambert
Department of Pharmacy Administration
University of Illinois at Chicago

Tcheng, D., Lambert, B., Lu, S. C-Y., & Rendell, L. (1989). Building robust
learning systems by combining induction and optimization. In _Proc. 11th
IJCAI_ (pp. 806-812). San Mateo, CA: Morgan Kaufmann.

Tcheng, D., Lambert, B., Lu, S. C-Y., & Rendell, L. (1991). AIMS: An adaptive
interactive modelling system for supporting engineering decision making. In
L. Birnbaum & G. Collins (Eds.), _Machine learning: Proceedings of the eighth
international workshop_ (pp. 645-649). San Mateo, CA: Morgan Kaufmann.
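
[[ To make the bias-search idea above concrete, here is a minimal sketch
in which the inductive bias space is a set of (learner type,
hyperparameter) pairs scored on held-out data. The toy learners and data
are assumptions for illustration; this is not the AIMS system itself. ]]

# Minimal sketch: searching an "inductive bias space" -- here, a set of
# (learner type, hyperparameter) choices -- by scoring each candidate on
# held-out data.  Toy learners and data; illustrative only, not AIMS.
import numpy as np

rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, size=(200, 1))
y = np.sin(3 * X[:, 0]) + 0.1 * rng.normal(size=200)
X_tr, y_tr, X_va, y_va = X[:150], y[:150], X[150:], y[150:]

def knn_predict(k, X_train, y_train, X_test):
    # k-nearest-neighbour regression on a 1-D input
    d = np.abs(X_test[:, None, 0] - X_train[None, :, 0])
    idx = np.argsort(d, axis=1)[:, :k]
    return y_train[idx].mean(axis=1)

def poly_predict(degree, X_train, y_train, X_test):
    # least-squares polynomial fit
    coeffs = np.polyfit(X_train[:, 0], y_train, degree)
    return np.polyval(coeffs, X_test[:, 0])

# The bias space: every (learner, hyperparameter) combination considered.
bias_space = [("knn", k) for k in (1, 3, 5, 15)] + \
             [("poly", d) for d in (1, 2, 3, 5)]

def validation_error(bias):
    learner, param = bias
    if learner == "knn":
        pred = knn_predict(param, X_tr, y_tr, X_va)
    else:
        pred = poly_predict(param, X_tr, y_tr, X_va)
    return np.mean((pred - y_va) ** 2)    # quantity the search minimizes

best = min(bias_space, key=validation_error)
print("best bias:", best, "validation MSE:", validation_error(best))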


------------------------------

Subject: Postdoc at Oak Ridge
From: PA140068%UTKVM1.bitnet@CUNYVM.CUNY.EDU
Date: 27 Jan 92 11:16:00 -1100

We would like to announce an opening for a postdoctoral position in
Chemical Physics at Oak Ridge National Laboratory. The position involves
investigating the use of neural networks in chemistry and related
fields. In particular, our present effort is in the area of
computational chemistry, using molecular dynamics, molecular mechanics,
ab initio, Monte Carlo, and neural network methods to study various
properties of polymeric materials. Our group consists of 4 computational
chemists and 10 experimentalists working in the area of polymer and
physical chemistry. The applicant should have a Ph.D. in Chemical
Physics or an equivalent and be knowledgeable in the field of neural
networks. Ability to perform computer simulations using the molecular
dynamics method is highly desirable. Please send applications (CV and
list of publications) to:

Dr. Bobby G. Sumpter or Dr. Donald W. Noid
Chemistry Division
Oak Ridge National Laboratory
Oak Ridge, Tennessee 37831-6182

Thanks and best regards
Bobby Sumpter


------------------------------

Subject: Advertisement - Research associate or officer position
From: bogner@eleceng.adelaide.edu.au
Date: Thu, 30 Jan 92 12:21:54 +1100


University of Adelaide
SIGNAL PROCESSING AND NEURAL NETWORKS

RESEARCH ASSOCIATE or RESEARCH OFFICER

A research associate or research officer is required as soon as possible
to work on projects supported by the University, The Department of
Defence, and the Australian Research Council. Two prime projects are
under consideration and the appointee may be required to work on either
or both.

The aim of one project is to design an electronic sensor organ based on
known principles of insect vision. The insect's eye has specialised
preprocessing that provides measures of distance and velocity by
evaluating deformations of the perceived visual field. The work will
entail novel electronic design in silicon or gallium arsenide, software
simulation, and experimental work to evaluate and demonstrate performance.
This work is in collaboration with the Australian National University.
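
[[ For readers unfamiliar with this style of preprocessing, one classical
model is the delay-and-correlate (Reichardt) elementary motion detector.
The sketch below is a generic illustration with assumed parameters, not
the sensor design sought by the project. ]]

# Minimal sketch of a delay-and-correlate (Reichardt-type) elementary motion
# detector: each of two adjacent photoreceptors is correlated with a delayed
# copy of its neighbour.  Generic illustration; parameters are assumptions.
import numpy as np

def reichardt_response(left, right, delay):
    """Positive output for left-to-right motion, negative for right-to-left."""
    delayed_left = np.roll(left, delay)    # delayed copy of the left receptor
    delayed_right = np.roll(right, delay)  # delayed copy of the right receptor
    return np.mean(delayed_left * right - delayed_right * left)

# A moving sinusoidal luminance pattern sampled by two receptors a small
# distance apart: the right receptor sees the same signal slightly later.
t = np.linspace(0.0, 1.0, 1000)
phase_lag = 0.2                            # spatial offset as a phase lag
left = np.sin(2 * np.pi * 5 * t)
right = np.sin(2 * np.pi * 5 * t - phase_lag)

print(reichardt_response(left, right, delay=10))   # > 0: left-to-right motion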

The aim of the other project is to develop and investigate principles of
artificial neural networks for processing multiple signals obtained from
over-the-horizon radars. Investigation of wavelet functions for the
representation of signals may be involved. The work is primarily in the
area of exploration of algorithms and high-level computer software. This
work is in conjunction with DSTO.

DUTIES: In consultation with task leaders and specialist researchers to
investigate alternative design approaches and to produce designs for
microelectronic devices, based on established design procedures.
Communicate designs to manufacturers and oversee the production of
devices. Prepare data for experiments on applications of signal
processing and artificial neural networks. Prepare software for testing
algorithms. Assist with the preparation of reports.

QUALIFICATIONS: For the Research Associate, a PhD or other suitable
evidence of equivalent capability in research in engineering or computer
science. Exceptionally, a candidate with less experience but outstanding
ability might be considered. For the Research Officer, a degree in
electrical engineering or computer science with a good level of
achievement. Experience in signal processing would be an advantage.
Demonstrated ability to communicate fluently in written and spoken
English.

PAY and CONDITIONS: will be in accordance with University of Adelaide
policies, and will depend on qualifications and experience. Suitable
incumbents who do not already hold a higher degree may be able to count
some of the work undertaken towards one. Appointment may be made on
scales from $25,692 to $33,017 p.a. for the Research Officer or $29,600
to $38,418 p.a. for the Research Associate.

ENQUIRIES: Professor R. E. Bogner, Dept. of Electrical and Electronic
Engineering, The University of Adelaide, Box 498, Adelaide, Phone (08)
228 5589, Fax (08) 224 0464, E-mail bogner@eleceng.adelaide.edu.au


------------------------------

Subject: Cognitive Science Job at NYU
From: mis@cns.nyu.edu (Misha Pavel)
Date: Sun, 02 Feb 92 16:11:34 -0500


__________________________________________________________________________

                      N   N   Y   Y   U   U
                  *   NN  N    Y Y    U   U   *
                *  *  N N N     Y     U   U  *  *
                  *   N  NN     Y     U   U   *
                      N   N     Y      UUU

                         EXCITING JOB OPPORTUNITY
__________________________________________________________________________

The Center for Cognitive Science at NYU invites applications for a
tenure-track position with level open. Candidates should have developed
a strong interdisciplinary research program and have good computational
or mathematical skills. Preference will be given to the following areas
of research:

-> neural/adaptive networks
-> pattern recognition
-> motor control
-> learning and memory
-> reasoning
-> language
-> judgement and decision making.

The current composition of the interdisciplinary group promotes
collaborations between theoretical and experimental scientists, supported
by a rich computing environment, including a large number of SUN
workstations and SGI IRIS machines. Members of the Cognitive Science
group are expected to contribute to teaching courses in the Cognitive
Sciences curriculum and in one of the affiliated departments, typically
the Center for Neural Sciences, the Department of Psychology, the
Department of Computer Science, or the Department of Mathematics (Courant
Institute).

New York University is situated in the attractive Greenwich Village
section of Manhattan. Send a letter of interest, resume, and letters of
recommendation to:
 ______________________________________________
|                                              |
|    Cognitive Science Search Committee        |
|    6 Washington Place, 8th floor             |
|    New York University                       |
|    NY NY 10003                               |
|______________________________________________|


------------------------------

Subject: Open letter to Dr. Shun-Ichi Amari
From: Andras Pellionisz SL <pellioni@pioneer.arc.nasa.gov>
Date: Wed, 29 Jan 92 17:38:04 -0800

[[ Editor's Note: I know many in the field regard Dr. Pellionisz as
holding controversial opinions. He and I have corresponded and I feel he
brings up some very valid points which should be the source of
substantive debate. The letter below is the result. I encourage
responses, either in support or refutation, to the following letter. The
main issue, that of intellectual priority and proper citation, affects
all of us in research and forms the foundation of the modern scientific
tradition. Dr. Pellionisz' secondary issue, international competition
versus cooperation, is also worthy of discussion, though I would request
that responses to Neuron Digest remain factual and near the subject of
neural networks. I also certainly hope that Dr. Amari responds to the
rather serious charges in an appropriate forum. -PM ]]

Dear Peter: According to our previous exchange, after long deliberation,
I have put together the "Open Letter to Amari". Given that my personal
story is well in line with some global happenings, I trust that you will
find this contribution worthy of distribution.

Andras

* "Tensor-Geometrical Approach to Neural Nets" in 1985 and 91*
or
OPEN LETTER TO DR. SHUN-ICHI AMARI
by Andras J. Pellionisz


Dear Readers: Many of you may know that I have pioneered a
tensor-geometrical approach to neural nets for over a decade, with dozens
of publications on this subject.

Many of you may have seen a recent paper on the tensor geometry of neural
nets (by Dr. Amari) presented as "opening a new fertile field of neural
network research" (in 1991!) WITHOUT referencing ONE of the pre-existing
pioneering studies. Dr. Amari did not even cite his own paper (1985), in
which he criticized my pioneering work. This is unfair, especially since
the majority of readers were uninitiated in tensor geometry in '85, and
thus his early "criticism" greatly hampered the unfolding of the
tensor-geometry approach that he now takes. Unfortunately, Dr. Amari's
paper appeared in a journal of which he is a chief editor. Therefore, I
am turning directly to you with a copy of my letter (sent to Dr. Amari on
21st Oct. 1991; no response to date).

There may be two issues involved. Obviously, we are entering an era which
will be characterized by fierce competition in R&D worldwide, especially
between the US, Japan, and Europe. The question of the protocol of fair
competition in such a complex endeavor may be too nascent or too
overwhelming for me to address.

The costliness of pioneering, and fairness to long-existing standards of
academic protocol in acknowledging such initiatives, is a painful and
personal enough problem for me to have to shoulder.

===========================================

Dear Dr. Amari:

Thank you for your response to my E-mail sent to you regarding your paper
in the September issue (1991) of "Neural Networks", entitled "Dualistic
geometry of the manifold of higher-order neurons".

You offered two interpretations of why you featured a Geometrical
Approach in 1991 as "opening a new fertile field of neural network
research". One can see two explanations of why you wrote your paper
without mentioning any of my many publications from the decade prior to
yours, and without even mentioning your own paper (with Arbib), in which
you criticized in 1985 the geometrical-tensorial approach that I have
taken since 1980. I feel that one cannot accept both interpretations at
the same time, since they contradict one another. Thus, I feel compelled
to make a choice.

The opening (2nd and 3rd) paragraphs of your letter say: "As you know
very well, we criticized your idea of tensorial approach in our... paper
with M.Arbib. The point is that, although the tensorial approach is
welcome, it is too restrictive to think that the brain function is merely
a transformation between contravariant vectors and covariant vectors;
even if we use linear approximations, the transformation should be free
of the positivity and symmetry. As you may understand these two are the
essential restrictions of covariant-contravariant transformations. ...You
have interests in analyzing a general but single neural network. Of
course this is very important. However, what I am interested in is to
know a geometrical structures of a set of neural networks (in other
words, a set of brains). This is a new object of research."
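
[[ For readers less familiar with the tensor terminology in the quoted
passage, the standard textbook relation at issue (not drawn from either
author's papers) is the conversion between covariant and contravariant
components of a vector through a metric tensor:

$$ v^i = g^{ij} v_j , \qquad v_i = g_{ij} v^j , $$

where the metric $g_{ij}$ (and its inverse $g^{ij}$) is symmetric,
$g_{ij} = g_{ji}$, and positive definite, $x^i g_{ij} x^j > 0$ for
$x \neq 0$. A general linear network mapping $y = Wx$ imposes neither
restriction on $W$, which is the point being made in the quoted letter. ]]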

THIS FIRST INTERPRETATION, which you could easily have included in your
1991 paper, clearly features your work as a GENERALIZATION of my
decade-old geometrical initiative, which you deem "too restrictive". I
am happy that you still clearly detect some general features of my prior
work, which you describe as targeting a "single neural network", while
describing yours as concerned with a "set of neural networks". Still, it
is a fact that my work was never restricted to, e.g., a SINGLE
cerebellum, but was a geometrical representation of the total "set of all
cerebella", not even restricted to any single species (but, in full
generality, the metric tensor of the spacetime geometry). Thus the
characterization of your work as more general appears unsupported by your
letter. However, even if your argument were fully supported, in a
generalization of earlier studies an author would be expected to make
references, according to standard protocol, to the prior work being
generalized (as my "too restrictive" studies preceded yours by a decade).

In fact, you (implicitly) appear to accept this point by saying (later in
your letter): "Indeed, when I wrote that paper, I thought to refer to
your paper". Unfortunately, instead of doing so, you continue by offering
a SECOND ALTERNATIVE INTERPRETATION of your omission of any reference to
my work, by saying: "But if I did so, I could only state that it is
nothing to do with this new approach".

Regrettably, I find the two interpretations incompatible: (1) that your
work is a GENERALIZATION of mine, and (2) that your geometrical approach
has NOTHING TO DO with the geometrical approach that I initiated.

Since I have just returned from a visit to Germany (a country that
awarded me the Alexander von Humboldt Prize honoring my geometrical
approach to brain theory), I know that many in Germany as well as in the
US are curious to see how THEIR INTERPRETATION of the similarities of the
two tensor-geometrical approaches compares with Amari's and/or
Pellionisz's interpretation.

I cannot run the risk of throwing two diametrically opposed arguments in
the face of the audience (when they press me for comparisons of the
metric tensors you used in 1991 and those that I have used since 1980).
For my part, I will therefore take the less offensive of the
interpretations you offered, namely that your geometrical approach is in
some ways more general than the geometrical approach I took a decade
before. As for you, I will leave it to you how you compare your approach
to mine, should anyone press you to substantiate your claim in that
comparison.

I maintain the position proposed in my original letter, that it might be
useful if such a public comparison were offered by you, for the record,
at the earliest occasion of your choosing. For now, I shall remain most
cooperative in finding ways to make sure that appropriate credit is given
to my decade-old pioneering efforts (however "restrictive" you label the
early papers, and whether or not you have read any of those that I wrote
since 1982, the manuscript date of your 1985 critique). At this time, I
would like to refer to the wide selection of options taken by workers in
the past in similar situations.

Since by December 7, 1991, I will have made a strong public impact with
statements on this issue, I would most appreciate it if, during the
coming week or two, you could indicate (which I have no reason to doubt
at this time) your willingness to credit my costly pioneering efforts in
some appropriate fashion. As you so well know yourself, a geometrical
approach to brain theory is still not automatically taken by workers in
1991, and it certainly was rather costly for me to initiate more than a
decade ago, and to uphold, expand, experimentally prove in neuroscience,
and firmly establish in neural net theory in spite of criticisms.

Sincerely,

Dr. Andras J. Pellionisz


------------------------------

End of Neuron Digest [Volume 9 Issue 8]
***************************************
