Neuron Digest	Thursday, 15 Aug 1991		Volume 7 : Issue 45 

Today's Topics:
Administrivia - Last issue of ND for several weeks
Re: Neuron Digest V7 #42
manuscript available: Feature-based induction
RE: Rules and Neural Networks. Paper announcement.
dynamic reprstns and lang
Mind Machine Digest
Research Assistant


Send submissions, questions, address maintenance and requests for old issues to
"neuron-request@hplabs.hp.com" or "{any backbone,uunet}!hplabs!neuron-request"
Use "ftp" to get old issues from hplpm.hpl.hp.com (15.255.176.205).

----------------------------------------------------------------------

Subject: Administrivia - Last issue of ND for several weeks
From: "Neuron-Digest Moderator -- Peter Marvit" <neuron@hplabs.hpl.hp.com>
Date: Thu, 15 Aug 91 13:43:32 -0700

A reminder that this will be the last issue of Neuron Digest for at least
two weeks. If all goes well, mail to the old addresses should still
work, but I will not read it until approximately 1 September. Issues
will resume shortly thereafter. In the meantime, your patience will be
appreciated.

See you then!

-Peter

: Peter Marvit, Neuron Digest Moderator
: Courtesy of Hewlett-Packard Labs in Palo Alto, CA 94304 (415) 857-6646
: neuron-request@hplabs.hpl.hp.com OR {any backbone}!hplabs!neuron-request


------------------------------

Subject: Re: Neuron Digest V7 #42
From: rdj@demos.LANL.GOV (Roger D. Jones)
Date: Tue, 30 Jul 91 08:33:37 -0600

TECHNICAL REPORT LAUR-91-273 AVAILABLE

Nonlinear Adaptive Networks: A Little Theory, A Few Applications

by

R. D. Jones, Y. C. Lee, S. Qian, C. W. Barnes, K. R. Bisset,
G. M. Bruce, G. W. Flake, K. Lee, L. A. Lee, W. C. Mead,
M. K. O'Rourke, I. Poli, and L. E. Thode

Los Alamos National Laboratory

ABSTRACT: We present the theory of nonlinear adaptive networks and
discuss a few applications. In particular, we review the theory of
feedforward backpropagation networks. We then present the theory of the
Connectionist Normalized Linear Spline network in both its feedforward
and recurrent modes. Also, we briefly discuss the theory of Adaptive
Stochastic Cellular Automata. We then discuss applications to chaotic
time series, tidal prediction in Venice Lagoon, sonar transient
detection, control of nonlinear processes, balancing a double inverted
pendulum, and design advice for free electron lasers.
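
For readers who have not implemented a feedforward backpropagation network
before, the following is a minimal illustrative sketch in Python with numpy
-- not the code behind the report -- of a one-hidden-layer network trained
by backpropagation on a toy regression problem. The architecture, learning
rate, and toy data are assumptions made only for the example.

    import numpy as np

    rng = np.random.default_rng(0)

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    # Toy data: learn y = sin(x) on [-pi, pi].
    X = rng.uniform(-np.pi, np.pi, size=(200, 1))
    y = np.sin(X)

    n_hidden = 10                                   # assumed hidden size
    W1 = rng.normal(scale=0.5, size=(1, n_hidden))  # input -> hidden
    b1 = np.zeros(n_hidden)
    W2 = rng.normal(scale=0.5, size=(n_hidden, 1))  # hidden -> output
    b2 = np.zeros(1)
    lr = 0.05                                       # assumed learning rate

    for epoch in range(2000):
        h = sigmoid(X @ W1 + b1)         # forward: hidden activations
        y_hat = h @ W2 + b2              # forward: linear output
        err = y_hat - y                  # error for mean-squared-error loss

        dW2 = h.T @ err / len(X)         # backward: output-layer gradients
        db2 = err.mean(axis=0)
        dh = (err @ W2.T) * h * (1 - h)  # backprop through the sigmoid
        dW1 = X.T @ dh / len(X)
        db1 = dh.mean(axis=0)

        W2 -= lr * dW2; b2 -= lr * db2   # gradient-descent update
        W1 -= lr * dW1; b1 -= lr * db1

    y_fit = sigmoid(X @ W1 + b1) @ W2 + b2
    print("final MSE:", float(np.mean((y_fit - y) ** 2)))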

Send reprint requests to Roger Jones at rdj@lanl.gov


------------------------------

Subject: manuscript available: Feature-based induction
From: sloman%meme@Forsythe.Stanford.EDU
Date: Tue, 30 Jul 91 10:32:26 -0700

A compressed postscript version of the following paper has been placed in
the pub/neuroprose directory for anonymous ftp from
cheops.cis.ohio-state.edu. The paper concerns a very simple
connectionist model (n inputs, one output, and delta-rule learning) of
people's willingness to affirm a property of one natural-kind category
given confirmation of the property in other categories. The paper has
been submitted for publication.


Feature-Based Induction

Steven A. Sloman
Dept. of Psychology
University of Michigan

e-mail: sloman@psych.stanford.edu

Abstract

A connectionist model of argument strength is proposed that applies to
categorical arguments involving natural categories and predicates about
which subjects have few prior beliefs. An example is *robins have
sesamoid bones, therefore falcons have sesamoid bones*. The model is
based on the hypotheses that argument strength (i) increases with the
overlap between features of the combined premise categories and features
of the conclusion category; and (ii) decreases with the amount of prior
knowledge about the conclusion category. The model assumes a two-stage
process. First, premises are encoded by connecting the features of
premise categories to the predicate. Second, conclusions are tested by
examining the degree of activation of the predicate upon presentation of
the features of the conclusion category. The model accounts for 13
qualitative phenomena and shows close quantitative fits to several sets
of argument-strength ratings.
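
To make the two-stage process concrete, here is a minimal illustrative
sketch in Python (numpy), not Sloman's implementation: a single predicate
unit is connected to feature inputs by delta-rule learning on the premise
categories, and argument strength is read off as the unit's activation for
the conclusion category's features. The feature vectors and categories
below are hypothetical.

    import numpy as np

    def encode_premises(premise_features, lr=0.5, epochs=50):
        """Stage 1: connect premise-category features to the predicate
        unit with delta-rule learning toward an output of 1."""
        w = np.zeros(premise_features.shape[1])
        for _ in range(epochs):
            for f in premise_features:
                out = 1.0 / (1.0 + np.exp(-w @ f))  # predicate activation
                w += lr * (1.0 - out) * f           # delta rule, target = 1
        return w

    def argument_strength(w, conclusion_features):
        """Stage 2: activation of the predicate when the conclusion
        category's features are presented."""
        return 1.0 / (1.0 + np.exp(-w @ conclusion_features))

    # Hypothetical binary feature vectors; more shared features between
    # premise and conclusion categories yield a stronger argument.
    robin  = np.array([1, 1, 0, 1, 0], dtype=float)
    falcon = np.array([1, 0, 1, 1, 0], dtype=float)  # shares 2 features
    horse  = np.array([0, 1, 1, 0, 1], dtype=float)  # shares 1 feature

    w = encode_premises(np.vstack([robin]))
    print("robins -> falcons:", argument_strength(w, falcon))
    print("robins -> horses: ", argument_strength(w, horse))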


------------------------------

Subject: RE: Rules and Neural Networks. Paper announcement.
From: MFMISMIC%HMARL5.BITNET@vma.CC.CMU.EDU
Date: Fri, 02 Aug 91 17:43:00 +0100


The following paper has been published in the proceedings of the
European Simulation Multiconference 91 in Copenhagen:

HOMOMORPHIC TRANSFORMATION FROM
NEURAL NETWORKS TO RULE BASES

Author: Michael Egmont-Petersen,
Computer Resources International a/s
and
Copenhagen Business School

Abstract: In this article a method to extract the knowledge induced in a
neural network is presented. The method explicates the relation between a
network's inputs and its outputs, and stores this relation as logic
rules. The feasibility of the method is studied by means of three test
examples. The results show that the method is usable, though it has some
drawbacks; one is that it sometimes generates a large number of rules.
For fast retrieval, these rules can be stored in a B-tree.
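
As a rough illustration of the general idea -- not Egmont-Petersen's
algorithm -- one naive way to turn a small trained network into logic rules
is to enumerate every binary input combination and record the network's
output for each, which also shows why the number of rules can explode
(2^n for n binary inputs). The tiny stand-in network below is an
assumption; Python's standard library has no B-tree, so a sorted table
stands in for one.

    from itertools import product

    def net(x1, x2, x3):
        """Stand-in for a trained network: one hand-set threshold unit."""
        return int(0.8 * x1 + 0.7 * x2 - 0.5 * x3 > 0.6)

    # Enumerate every binary input combination and record the network's
    # output, yielding one IF-THEN rule per combination.
    rules = {}
    for bits in product([0, 1], repeat=3):
        rules[bits] = net(*bits)

    # A sorted table stands in for the B-tree suggested for fast retrieval.
    for inputs in sorted(rules):
        print(f"IF (x1, x2, x3) = {inputs} THEN output = {rules[inputs]}")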

Contents:

1. Introduction
2. Synthesizing Rule Bases Parsimoniously
3. Description of the Experiments
4. Practical Applicability of the Algorithm
5. Conclusion

Hardcopies of the paper are available. Please send requests to the
following address in the Netherlands:

Institute of Medical Statistics and Informatics
University of Limburg
Postbus 616
NL-6200 MD Maastricht
The Netherlands

att. Michael Egmont-Petersen


Michael Egmont-Petersen


------------------------------

Subject: dynamic reprstns and lang
From: Robert Port <port@iuvax.cs.indiana.edu>
Date: Tue, 06 Aug 91 01:35:21 -0500

This paper will be presented at the Cognitive Science Society
meeting later this week. It proposes that dynamic systems suggest ways
to expand the range of representational systems.

`REPRESENTING ASPECTS OF LANGUAGE'
by Robert F. Port (Departments of Linguistics and Computer Science)
and Timothy van Gelder (Department of Philosophy)
Cognitive Science Program, Indiana University, Bloomington.

We provide a conceptual framework for understanding similarities and
differences among various schemes of compositional representation,
emphasizing problems that arise in modelling aspects of human language.
We propose six abstract dimensions that suggest a space of possible
compositional schemes. Temporality and dynamics turn out to play a key
role in defining several of these dimensions. From studying how
schemes fall into this space, it is apparent that there is no single
crucial difference between AI and connectionist approaches to
representation. Large regions of the space of compositional schemes
remain unexplored, such as the entire class of active, dynamic models
that do composition in time. These models offer the possibility of
parsing real-time input into useful segments, and thus potentially into
linguistic units like words and phrases. A specific dynamic model
implemented in a recurrent network is presented. This model was
designed to simulate some aspects of human auditory perception but has
implications for representation in general.
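
For readers unfamiliar with dynamic, recurrent models of this kind, the
sketch below (Python/numpy) shows the generic idea, not the authors'
specific model: a simple recurrent network steps through its input in
time, and its evolving hidden state serves as a representation of the
sequence so far. All sizes, weights, and inputs are illustrative
assumptions.

    import numpy as np

    rng = np.random.default_rng(1)
    n_in, n_hid = 4, 8                               # assumed sizes
    W_in  = rng.normal(scale=0.3, size=(n_in, n_hid))
    W_rec = rng.normal(scale=0.3, size=(n_hid, n_hid))

    def run(sequence):
        """Step through an input sequence one element at a time; the
        hidden state depends on the current input and on the history."""
        h = np.zeros(n_hid)
        states = []
        for x in sequence:
            h = np.tanh(x @ W_in + h @ W_rec)
            states.append(h.copy())
        return states

    sequence = rng.uniform(size=(10, n_in))          # a 10-step input stream
    states = run(sequence)
    print(len(states), "hidden states; last:", np.round(states[-1], 2))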


The paper can be obtained from the Neuroprose archive at Ohio State
University: ftp to cheops.cis.ohio-state.edu, log in as anonymous with
neuron as the password, cd to pub/neuroprose, and get port.langrep.ps.Z.
After uncompressing, print it with lpr (under Unix) to a PostScript
printer.
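
If you prefer to script the retrieval, the session above could be written,
for example, with Python's standard ftplib as sketched below (the 1991 host
no longer exists, so take it purely as an illustration of the steps).

    from ftplib import FTP

    # Illustrative only: this host is long gone and will not connect today.
    with FTP('cheops.cis.ohio-state.edu') as ftp:
        ftp.login('anonymous', 'neuron')    # per the instructions above
        ftp.cwd('pub/neuroprose')
        with open('port.langrep.ps.Z', 'wb') as f:
            ftp.retrbinary('RETR port.langrep.ps.Z', f.write)
    # Then, under Unix: uncompress port.langrep.ps.Z && lpr port.langrep.ps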

Robert Port, Dept of Linguistics,
Memorial Hall, Indiana University, 47405
812-855-9217



------------------------------

Subject: Mind Machine Digest
From: Dana Nibby <D_NIBBY@UNHH.UNH.EDU>
Date: Sun, 11 Aug 91 21:14:08 -0400

[[ Editor's Note: While somewhat tangential to the mainstream of Neural
Network research, this mailing list may be of interest to a few readers.
I'll reserve my own opinions to a later date... -PM ]]


//  MIND MACHINE DIGEST  //
//      mailing list     //


* Devoted to the discussion and dissemination of information on:

MIND MACHINES -- consciousness-altering electronic devices,
with an emphasis on the practical application of these devices:
   -- Building your own devices
   -- Experiences with these devices
      (home-built or commercial units)
   -- Speculations on the human-potential value of said devices
   -- Reviews of commercial units
   -- Information files available:
      ** Schematics, patent lists, reference lists, etc.

NON-ELECTRONIC DEVICES:

-- Float tanks

-- Using motion to induce ASCs (altered states of consciousness)


Other topics of discussion include:

Sensory Deprivation, Sensory Overload, Hypnosis,
Trance States, Subliminal Technology


RULES OF THE LIST
-----------------

[1.1] No profane language will be tolerated.
Personally, I am in no way offended by profane
language, but there will always be those folks
who are.

[1.2] Speak freely and honestly. If you have an unpopular
opinion, fine, just be able to support your opinion
intelligently without attacking others personally.
Please keep flame wars to private e-mail.

[1.3] This is not a science-fiction book discussion list.
If you want to delve deeply into the discussion
of SF authors/works, please do it elsewhere.
The books most likely to be discussed here are
non-fiction.

[1.4] Exchange of information is encouraged. One of the
primary goals of this list is to be a crossroads
and network for the exchange of mind machine
information.

[1.5] Don't be intimidated. This list is not meant
exclusively for techies (although techies are
certainly welcome). Technical discussions are
welcome, but the only prerequisite needed here is
an interest in mind machines. Discussion of
subjective experience is encouraged.

[1.6] There will be no discussion of virtual reality.
Space does not permit it. If VR is your primary
concern, there is a VR newsgroup/mailing list.
(I do not know the addresses)


TO SUBSCRIBE TO THE LIST:


send e-mail to:

D_NIBBY@UNHH.UNH.EDU


Leave your full name, mailing address (please make sure
you give me the Internet version of your address), and a
description of your specific interests and/or experience
with mind machines. Those who provide no description
whatsoever will not be considered for a subscription. If
you have no background with mind machines, that's fine --
just say so.


NOTE: If you do not receive a digest within two
weeks, please re-apply. All the work around
here is done manually, and requests for
subscriptions sometimes get zapped by mistake.
If there is a zero in your address, please
indicate that it IS a zero and not the
letter 'O', where applicable.

I will add you to the list manually, as there is no LISTSERV
facility at this node.



TO UNSUBSCRIBE:


send e-mail to:

D_NIBBY@UNHH.UNH.EDU



Leave a message saying that you wish to unsubscribe,
along with your full name and mailing address.


TO POST MESSAGES:

send mail to the same address

The list is sent out in digest form at approximately 2-4 day
intervals. All posts become property of the list manager.



Virtually

Dana

------------------------------

Subject: Research Assistant
From: "D.Sbarbaro" <gnmv73@udcf.glasgow.ac.uk>
Date: Thu, 15 Aug 91 14:25:01 +0100

[[ Editor's Note: The closing date for this position is 23 AUGUST! -PM ]]


Control Group
Department of Mechanical Engineering
University of Glasgow

Applications are invited for a postgraduate Research Assistant post which
has become available within the Control Group of the Mechanical
Engineering Department. The project is entitled Neural Networks for
Modelling and Control of Nonlinear Dynamical Systems and is funded by the
Science and Engineering Research Council. The duration of this post is
three years. The project will be directed by Dr Ken Hunt and Professor
Peter Gawthrop.

Intending applicants should have a good first degree in a relevant
discipline. Suitable disciplines include engineering, computing science
or mathematics.

The appointment will be made on the Research Staff Grade RA1B. The
successful candidate is expected to register for the PhD degree.

Enquiries and applications, including a C.V., should be addressed to Dr K
J Hunt, Department of Mechanical Engineering, University of Glasgow,
Glasgow G12 8QQ, Scotland.

(Tel: 041-339-8855 ext 4406/4349,
Fax: 041-330-4343). E-mail: ken@uk.ac.gla.eng.ctrl

Closing Date for Applications is Friday 23 August 1991.



------------------------------

End of Neuron Digest [Volume 7 Issue 45]
****************************************
