Neuron Digest Tuesday, 10 Mar 1992 Volume 9 : Issue 10
Today's Topics:
Reply to Pellionisz' "Open Letter"
Re: Arbib's response to "open letter to Amari"
Re: Pellionisz' "Open Letter"
reply to the open letter to Amari
LVQ_PAK revision 2.0 available
Neural Networks School
Send submissions, questions, address maintenance, and requests for old
issues to "neuron-request@cattell.psych.upenn.edu". The ftp archives are
available from cattell.psych.upenn.edu (128.91.2.173). Back issues
requested by mail will eventually be sent, but may take a while.
----------------------------------------------------------------------
Subject: Reply to Pellionisz' "Open Letter"
From: hirsch@math.berkeley.edu (Morris W. Hirsch)
Date: Mon, 02 Mar 92 17:23:03 -0800
Dear Michael [Arbib]:
Bravo for your reply to Pellionisz, and for sending it to Neuron Digest!
I was going to reply to ND along similar but less knowledgeable lines-- I
may still do so if the so-called dispute continues. Yours, --MOE
Professor Morris W. Hirsch
Department of Mathematics
University of California
Berkeley, CA 94720 USA
e-mail: hirsch@math.berkeley.edu
Phones: NEW area code 510:
Office, 642-4318
Messages, 642-5026, 642-6550
Fax: 510-642-6726
------------------------------
Subject: Re: Arbib's response to "open letter to Amari"
From: Konrad Weigl <Konrad.Weigl@sophia.inria.fr>
Date: Wed, 04 Mar 92 12:24:22 +0100
Ref. Dr. Arbib's answer in Neuron Digest V9 #9
I am not familiar enough with the work of Pellionisz, or proficient
enough in the mathematics of general spaces, to judge the mathematical
rigour of his work; however, whatever that rigour, as far as I know he
was the first to link the concept of neural networks with non-Euclidean
geometry at all.
That he did so to analyze the dynamics of biological neural networks,
while Dr. Amari later used non-Euclidean geometry to give a metric to
spaces of different types of neural networks and a geometric
interpretation of learning, does not change that fact one iota.
This is not to denigrate Dr. Amari's contribution to the field, of course.
Konrad Weigl Tel. (France) 93 65 78 63
Projet Pastis Fax (France) 93 65 76 43
INRIA-Sophia Antipolis email Weigl@sophia.inria.fr
2004 Route des Lucioles
B.P. 109
06561 Valbonne Cedex
France
------------------------------
Subject: Re: Pellionisz' "Open Letter"
From: ishimaru@hamamatsu-pc.ac.jp (Kiyoto Ishimaru)
Date: Sat, 07 Mar 92 14:58:59 +0200
Dear Moderator:
The recent response of Dr. Arbib caught my attention. The following is
my opinion on the issue:
1) Citing or not citing in a paper should not be decided based on
   KINDNESS to earlier related research, but rather KINDNESS to
   those who are expected to read or come across the prospective
   paper.
2) An argument over similarity or dissimilarity, based on a
   "subjective" Riemannian space, could last forever without any
   positive result. The most important question, which should be
   discussed, is who first brought the tensor analysis "technique"
   into the NN field.
3) Political or social issues, such as Japan-bashing and fierce
   worldwide competition in R&D, should not be brought into this
   matter. That style of discussion yields no fruitful results, but
   only makes the issue more complicated, worse, and intangible.
4) Dr. Amari's comments in his letter, quoted by Dr. Pellionisz,
   "Indeed, when I wrote that paper, I thought to refer
   to your paper", and "But if I did so, I could only state
   that it has nothing to do with the geometrical approach
   that I initiated",
   may lead to the conclusion: if Dr. Pellionisz' paper had
   nothing to do with Dr. Amari's, citing Dr. Pellionisz would
   not have crossed his mind at all (but he actually did think
   it over). Direct public comment from Dr. Amari is urged on
   this point and others, so that both of them may receive a
   fair judgement.
Please understand that I make these comments with some hesitation, owing
to my lack of deep background in the effect of tensor analysis on the NN
field. However, I am pleased to express myself in this e-mail.
Thank you.
Sincerely Yours
Kiyoto Ishimaru
Dept of Computer Science
Hamamatsu Polytechnic College
643 Norieda
Hamamatsu 432
Japan
e-mail: ishimaru@hamamatsu-pc.ac.jp
------------------------------
Subject: reply to the open letter to Amari
From: Shun-ichi Amari <amari@sat.t.u-tokyo.ac.jp>
Date: Tue, 10 Mar 92 18:16:32 +0200
[[ Editor's Note: In a personal note, I thanked Dr. Amari for his
response. I had assumed, incorrectly, that Dr. Pellionisz had sent a
copy to Dr. Amari, who is not a Neuron Digest subscriber. I'm sure all
readers will remember that Neuron Digest is not a peer-refereed journal
but an informal forum for electronic communication. I hope the debate can
come to a fruitful conclusion. -PM ]]
Dear Editor :
     Professor Usui at Toyohashi Institute of Technology and Science
kindly let me know that there is an "open letter to Amari" in Neuron
Digest. I was surprised that an open letter to me was published without
being sent to me. Moreover, the letter asks me to answer yet again what
I have already answered to Dr. Pellionisz. I will try to repeat my
answer in more detail.
Reply to Dr. Pellionisz
by Shun-ichi Amari
1. Dr. Pellionisz accused me of holding two contradictory opinions: 1)
that my work is a generalization of his, and 2) that my approach has
nothing to do with his. This is incorrect. Once one reads my paper
("Dualistic geometry of the manifold of higher-order neurons", Neural
Networks, vol. 4 (1991), pp. 443-451; see also another paper,
"Information geometry of Boltzmann machines" by S. Amari, K. Kurata and
H. Nagaoka, IEEE Trans. on Neural Networks, March 1992), it is
immediately clear 1) that my work is in no way a generalization of his,
and 2) more strongly, that it has nothing to do with Pellionisz' work.
Dr. Pellionisz seems to be accusing me without having read or understood
my paper at all. I would like to ask the readers to read my paper.
For those readers who have not yet read my paper, I would like to
compare his work with mine in the following, because this is what Dr.
Pellionisz has carefully avoided.
2. His work can be summarized as follows: the main function of the
cerebellum is the transformation of a covariant vector into a
contravariant vector in a metric Euclidean space, since non-orthogonal
reference bases are used in the brain. He mentioned non-linear
generalizations and so on in words, but nothing scientific has been done
along this line.
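(As an illustration in standard tensor notation, which is a gloss and not
taken from either author's paper: if the brain employs a non-orthogonal
basis {e_i} with metric tensor g_{ij} = e_i \cdot e_j, then the covariant
components v_j of a vector are converted into contravariant components
v^i by the inverse metric,

    v^i = g^{ij} v_j, \qquad g^{ij} g_{jk} = \delta^i_k ;

in an orthonormal basis g_{ij} is the identity and the two kinds of
components coincide.)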
3. In my 1991 paper, I proposed a geometrical theory of the manifold of
parameterized non-linear systems, with special reference to the manifold
of non-linear higher-order neurons. I did not focus on any function of a
single neural network, but on the mutual relations among different neural
networks, such as the distance between two neural networks, the curvature
of a family of neural networks and its role, etc. Here the method of
information geometry plays a fundamental role. It uses a dual pair of
affine connections, which is a new concept in differential geometry, and
has been proved to be very useful for analyzing statistical inference
problems, multiterminal information theory, the manifold of linear
control systems, and so on (see S.Amari, Differential Geometrical Methods
of Statistics, Springer Lecture Notes in Statistics, vol.28, 1985 and
many papers referred to in its second printing). A number of
mathematicians are now studying this new subject. I have shown that the
same method of information geometry is applicable to the manifold of
neural networks, elucidating the capabilities and limitations of a family
of neural networks in terms of their architecture.
     I have opened, I believe, a new and fertile field that studies not
the behaviors of single neural networks, but the collective properties of
the set, or manifold, of neural networks, in terms of a new differential
geometry.
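(As background on the dual affine connections mentioned above, the
defining relation, standard in information geometry and stated here as a
gloss rather than a quotation from the paper, is that two affine
connections \nabla and \nabla^* are dual with respect to the metric g
when, for all vector fields X, Y, Z,

    X g(Y, Z) = g(\nabla_X Y, Z) + g(Y, \nabla^*_X Z) ;

the self-dual case \nabla = \nabla^* recovers the usual Levi-Civita
connection of Riemannian geometry.)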
4. Now we can discuss the point. Is my theory a generalization of his
theory? Definitely not. If A is a generalization of B, A should include
B as a special case. My theory does not include any of his tensorial
transformations. A network is merely a point of the manifold in my
theory. I have studied the collective behavior of the manifold, not the
properties of individual points.
5. The second point. One may ask whether, even if my theory is not a
generalization of his theory, it might have something to do with his
theory, so that I should have referred to his work. The answer is again
no. Dr. Pellionisz insists that he is a pioneer of tensor theory and my
theory is also tensorial. This is not true. My theory is
differential-geometrical, but it does not require any tensorial notation.
Modern differential geometry has been constructed without using tensorial
notations, although it is sometimes convenient to use them. As one sees
from my paper, its essential part is described without tensor notations.
In differential geometry, what is important is intrinsic structures of
manifolds such as affine connections, parallel transports, curvatures,
and so on. The Pellionisz theory has nothing to do with these
differential-geometrical concepts. He used the tensorial notation to
point out that the role of the cerebellum is a special type of linear
transformation, namely a covariant-contravariant linear transformation,
which M. A. Arbib and I have criticized.
6. Dr. Pellionisz claims that he is the pioneer of the tensorial theory
of neural networks. Whenever one uses a tensor, should one refer to
Pellionisz? This is ridiculous. Who would claim to be the pioneer of
using differential equations, linear algebra, probability theory, etc.
in neural network theory? These are just commonly used methods.
Moreover, the tensorial method itself has been used in neural network
research since early times. For example, in my 1967 paper (S. Amari, A
mathematical theory of adaptive pattern classifiers, IEEE Trans. on EC,
vol.16, pp.299-307) where I proposed the general stochastic gradient
learning method for multilayer networks, I used the metric tensor C
(p.301) in order to transform a covariant gradient vector to the
corresponding contravariant learning vector, although I suppressed the
tensor notation there. On p.303, however, I explicitly used the tensorial
notation in order to analyze the dynamic behavior of modifiable parameter
vectors.
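(In modern notation, the rule sketched above amounts to a preconditioned,
metric-aware gradient step; the following restatement is a paraphrase,
not a quotation from the 1967 paper:

    \Delta w^i = -\eta \, C^{ij} \frac{\partial L}{\partial w^j} ,

where L is the loss, \eta the learning rate, and C^{ij} the metric tensor
that raises the index of the covariant gradient \partial L / \partial w^j
to yield the contravariant update \Delta w^i.)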
     I never claim that Dr. Pellionisz should refer to this old paper,
because the tensorial method itself is in common use among all applied
mathematicians, and my old theory has nothing to do with his except that
both used a covariant-contravariant transformation and tensorial
notation, a common mathematical concept.
7. I do not like non-productive, time-consuming, and non-scientific
discussions like this. If one reads my paper, everything will melt away.
This has nothing to do with the fact that I happen to be a
co-editor-in-chief of Neural Networks, nor with any threat to the
intellectual property (of tensorial theory), the worldwide competition in
scientific research, etc., which Dr. Pellionisz hinted were in the
background.
     Instead, this reminded me of the horrible days when Professor M. A.
Arbib and I were preparing our rightful criticism of his theory (not a
criticism of using the tensor concept, but of his theory itself). I
repeatedly received astonishing interference, which I hope will never
happen again.
     I disclose my e-mail letter to Pellionisz below, hoping that he will
disclose his first letter, including his unbelievable request, because it
makes the situation and his desire clear. The reader will understand why
I do not want to continue fruitless discussions with him. I also request
that he read my paper and point out which concepts or theories in my
paper are generalizations of his.
The following is my old reply to Dr. Pellionisz, which he partly quoted
in his "open letter to Amari".
Dear Dr. Pellionisz:
Thank you for your e-mail remarking on my recent paper entitled
"Dualistic geometry of the manifold of higher-order neurons".
     As you know very well, we criticized your idea of the tensorial
approach in my joint paper with M. Arbib. The point is that, although
the tensorial approach is welcome, it is too restrictive to regard the
brain's function as merely a transformation between contravariant vectors
and covariant vectors; even if we use linear approximations, the
transformation should be free of the positivity and symmetry
restrictions. As you may understand, these two are the essential
restrictions of covariant-contravariant transformations.
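(To spell out those two restrictions in standard notation, as a gloss and
not part of the original letter: a covariant-contravariant transformation
is carried out by a metric tensor g_{ij}, which by definition satisfies

    g_{ij} = g_{ji}  (symmetry),  g_{ij} x^i x^j > 0  for all  x \neq 0
    (positivity),

whereas a general linear transformation realized by a matrix of
connection weights need satisfy neither condition.)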
     You are interested in analyzing a general but single neural network.
Of course this is very important. However, what I am interested in is
the geometrical structure of a set of neural networks (in other words, a
set of brains). This is a new object of research. Of course, I did some
work along this line in statistical neurodynamics, where a probability
measure is introduced on a manifold of neural networks, and physicists
later followed a similar idea (E. Gardner and others). However, there
the geometrical structure is implicit.
As you noted, I have written that my paper opens a new fertile field
of neural network research, in the following two senses: First, that we
are treating a set of networks, not the behavior of a single network.
There is a vast number of studies of single networks by analytical,
stochastic, tensorial, and many other mathematical methods. The point is
to treat a new object of research, a manifold of neural networks.
Secondly, I have proposed a new concept of dual affine connections, which
mathematicians have recently been studying in more detail as mathematical
research.
So if you have studied the differential geometrical structure of a
manifold of neural networks, I should refer to it. If you have proposed
a new concept of duality in affine connections, I should refer to it. If
you are claiming that you used tensor analysis in analyzing the behaviors
of single neural networks, it has nothing to do with the field which I
have opened.
     Indeed, when I wrote that paper, I thought to refer to your paper.
But if I did so, I could only state that it has nothing to do with this
new approach. Moreover, I would need to repeat our earlier criticism
again. I do not want to engage in such irrelevant discussions.
     If you read my paper, I think you will understand what is newly
opened by this approach. Since our criticism of your approach has
already been published, we do not need to repeat it again and again.
     I do hope your misunderstanding will be resolved by this mail and by
reading my paper.
Sincerely yours, Shun-ichi Amari
------------------------------
Subject: LVQ_PAK revision 2.0 available
From: lvq@cochlea.hut.fi (LVQ_PAK)
Date: Mon, 03 Feb 92 13:59:26 +0200
************************************************************************
* *
* LVQ_PAK *
* *
* The *
* *
* Learning Vector Quantization *
* *
* Program Package *
* *
* Version 2.0 (January 31, 1991) *
* *
* Prepared by the *
* LVQ Programming Team of the *
* Helsinki University of Technology *
* Laboratory of Computer and Information Science *
* Rakentajanaukio 2 C, SF-02150 Espoo *
* FINLAND *
* *
* Copyright (c) 1991 *
* *
************************************************************************
Public-domain programs for Learning Vector Quantization (LVQ) algorithms
are available via anonymous FTP on the Internet.
"What is LVQ?", you may ask --- See the following reference, then: Teuvo
Kohonen. The self-organizing map. Proceedings of the IEEE,
78(9):1464-1480, 1990.
In short, LVQ is a group of methods applicable to statistical pattern
recognition, in which the classes are described by a relatively small
number of codebook vectors, properly placed within each class zone such
that the decision borders are approximated by the nearest-neighbor rule.
Unlike in normal k-nearest-neighbor (k-nn) classification, the original
samples are not themselves used as codebook vectors; rather, they are
used to tune the codebook vectors. LVQ is concerned with the optimal
placement of these codebook vectors into the class zones.
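To make the idea concrete, the core of the LVQ1 rule can be sketched in C
(the language of the package) roughly as follows. This is an illustrative
sketch only, not code from LVQ_PAK, and all names in it are invented for
the example:

/* A minimal sketch of one LVQ1 training step; illustrative only,
   not the LVQ_PAK source. Codebook vectors are stored in one flat
   array of ncodes * d doubles, with class labels in labels[]. */
#include <stddef.h>
#include <float.h>

/* Squared Euclidean distance between two d-dimensional vectors. */
static double sqdist(const double *a, const double *b, size_t d)
{
    double s = 0.0, diff;
    size_t i;
    for (i = 0; i < d; i++) {
        diff = a[i] - b[i];
        s += diff * diff;
    }
    return s;
}

/* One LVQ1 update: find the codebook vector nearest to sample x and
   move it toward x if the class labels agree, away from x otherwise. */
void lvq1_step(double *codebook, const int *labels, size_t ncodes,
               size_t d, const double *x, int x_label, double alpha)
{
    size_t k, i, best = 0;
    double dist, best_dist = DBL_MAX;
    double *m;

    for (k = 0; k < ncodes; k++) {
        dist = sqdist(codebook + k * d, x, d);
        if (dist < best_dist) { best_dist = dist; best = k; }
    }
    m = codebook + best * d;
    for (i = 0; i < d; i++) {
        if (labels[best] == x_label)
            m[i] += alpha * (x[i] - m[i]);   /* attract toward sample */
        else
            m[i] -= alpha * (x[i] - m[i]);   /* repel from sample */
    }
}

Repeated over the training samples with a slowly decreasing alpha,
updates of this kind pull each codebook vector into its class zone, which
is what makes the nearest-neighbor decision borders approximate the true
class borders.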
This package contains all the programs necessary for the correct
application of certain LVQ algorithms in an arbitrary statistical
classification or pattern recognition task. Three algorithm options have
been selected for this package: LVQ1, LVQ2.1, and LVQ3.
This code is distributed without charge on an "as is" basis. There is no
warranty of any kind by the authors or by Helsinki University of
Technology.
In the implementation of the LVQ programs we have tried to use code that
is as simple as possible. The programs should therefore compile on
various machines without any machine-specific modifications to the code.
All programs have been written in ANSI C. The programs are available in
two archive formats, one for the UNIX environment, the other for MS-DOS.
Both archives contain exactly the same files.
These files can be accessed via FTP as follows:
1. Create an FTP connection from wherever you are to machine
"cochlea.hut.fi". The internet address of this machine is
130.233.168.48, for those who need it.
2. Log in as user "anonymous" with your own e-mail address as password.
3. Change remote directory to "/pub/lvq_pak".
4. At this point FTP should be able to get a listing of files in this
directory with DIR and fetch the ones you want with GET. (The exact
FTP commands you use depend on your local FTP program.) Remember
to use the binary transfer mode for compressed files.
The lvq_pak program package includes the following files:
- Documentation:
README short description of the package
and installation instructions
document.ps documentation in (c) PostScript format
document.ps.Z same as above but compressed
document.txt documentation in ASCII format
- Source file archives (which contain the documentation, too):
lvq_p2r0.exe Self-extracting MS-DOS archive file
lvq_pak-2.0.tar UNIX tape archive file
lvq_pak-2.0.tar.Z same as above but compressed
An example of FTP access is given below:
unix> ftp cochlea.hut.fi (or 130.233.168.48)
Name: anonymous
Password: <your email address>
ftp> cd /pub/lvq_pak
ftp> binary
ftp> get lvq_pak-2.0.tar.Z
ftp> quit
unix> uncompress lvq_pak-2.0.tar.Z
unix> tar xvfo lvq_pak-2.0.tar
See file README for further installation instructions.
All comments concerning this package should be
addressed to lvq@cochlea.hut.fi.
************************************************************************
------------------------------
Subject: Neural Networks School
From: "F. Ventriglia" <lc4a@icineca.bitnet>
Date: Wed, 04 Mar 92 14:19:46 +0700
INTERNATIONAL SCHOOL
on
NEURAL MODELLING and NEURAL NETWORKS
Capri (Italy) - September 27th-October 9th, 1992
Director F. Ventriglia
An International School on Neural Modelling and Neural Networks has been
organized under the sponsorship of the Italian Group of Cybernetics and
Biophysics of the CNR and of the Institute of Cybernetics of the CNR,
with the American Society for Mathematical Biology as co-sponsor.
The purpose of the school is to give young scientists, and senior
scientists migrating into the field, some landmarks in the inflationary
universe of research in neural modelling and neural networks. Toward
this aim, some well-known experts will give lectures in different areas
comprising neural structures and functions, single-neuron dynamics,
oscillations in small groups of neurons, statistical neurodynamics of
neural networks, and learning and memory. In the first part, some
neurobiological foundations and some formal models of single (or small
groups of) neurons will be presented. The topics will be:
TOPICS LECTURERS
1. Neural Structures * Szentagothai, Budapest
2. Correlations in Neural
Activity * Abeles, Jerusalem
3. Single Neuron Dynamics:
deterministic models * Rinzel, Bethesda
4. Single Neuron Dynamics:
stochastic models * Ricciardi, Naples
5. Oscillations in Neural
Systems * Ermentrout, Pittsburgh
6. Noise in Neural Systems * Erdi, Budapest
The second part will be devoted to Neural Networks, i.e. to models of
neural systems and of learning and memory.
The topics will be:
TOPICS LECTURERS
7. Mass action in Neural Systems * Freeman, Berkeley
8. Statistical Neurodynamics:
kinetic approach * Ventriglia, Naples
9. Statistical Neurodynamics:
sigmoidal approach * Cowan, Chicago
10.Attractor Neural Networks in
Cortical Conditions * Amit, Roma
11."Real" Neural Network
Models * Traub, Yorktown Heights
12.Pattern Recognition in
Neural Networks * Fukushima, Osaka
13.Learning in Neural Networks * Tesauro, Yorktown Heights
WHO SHOULD ATTEND
Applicants for the International School should be actively engaged in the
fields of biological cybernetics, biomathematics, or computer science,
and have a good background in mathematics. As the number of participants
must be limited to 70, preference may be given to students who are
specializing in neural modelling and neural networks and to professionals
who are seeking new material for biomathematics or computer science
courses.
SCHOOL FEES
The school fee is Italian Lire 500.000 and includes notes, lunches, and
coffee breaks for the duration of the School.
REGISTRATION
A limited number of grants (covering the registration fee of Lit.
500.000) is available. The organizer has applied to the Society for
Mathematical Biology for travel funds for participants who are members of
the SMB. Preference will be given to students, postdoctoral fellows, and
young faculty (1-2 years past the PhD).
PROCEDURE FOR APPLICATION
Applicants should contact:
Dr. F. Ventriglia
Registration Capri International School
Istituto di Cibernetica
Via Toiano 6
80072 - Arco Felice (NA)
Italy
Tel. (39-) 81-8534 138
E-Mail LC4A@ICINECA (bitnet)
Fax (39-) 81-5267 654
Tx 710483
------------------------------
End of Neuron Digest [Volume 9 Issue 10]
****************************************