Neuron Digest   Wednesday,  3 Jun 1992                Volume 9 : Issue 24 

Today's Topics:
(P)reprints by ftp
Announcement of papers extending Delta-Bar-Delta
Book - Single Neuron Computation
Book Advert-CV,GCV, et al
TR - Development of Schemata During Event Parsing:
Reinforcement Learning Special Issue of Machine Learning
Preprint available: Synchronization and label-switching
TR - Neural Networks And Genetic Algorithm For Economic Forecasting
Neural Chess Announcement


Send submissions, questions, address maintenance, and requests for old
issues to "neuron-request@cattell.psych.upenn.edu". The ftp archives are
available from cattell.psych.upenn.edu (128.91.2.173). Back issues
requested by mail will eventually be sent, but may take a while.

----------------------------------------------------------------------

Subject: (P)reprints by ftp
From: Arun Jagota <jagota@cs.Buffalo.EDU>
Date: Sun, 24 May 92 16:37:17 -0500

Latex sources of the following and other (p)reprints are available via ftp:

ftp ftp.cs.buffalo.edu (or 128.205.32.3 subject-to-change)
Name : anonymous
> cd users/jagota
> get <file-you-want>

Efficiently Approximating Max-Clique in a Hopfield-style Network
  Oral presentation at IJCNN'92 Baltimore. File: ijcnn92.tex

Representing Discrete Structures in a Hopfield-style Network
  Book chapter (to appear). File: chapter.tex

A Hopfield-style Network with a Graph-theoretic Characterization
  Journal article (to appear). File: JANN92.tex

Problems? Contact jagota@cs.buffalo.edu Arun Jagota


------------------------------

Subject: Announcement of papers extending Delta-Bar-Delta
From: Rich Sutton <rich@gte.com>
Date: Tue, 26 May 92 16:14:08 -0500

Dear Learning Researchers:

I have recently done some work extending earlier work by Jacobs and others
on learning-rate-adaptation methods. The three papers announced below
extend it in the directions of machine learning, optimal linear
estimation, and psychological modeling, respectively. Information on how
to obtain copies is given at the end of the message.
-Rich Sutton


To appear in the Proceedings of the Tenth National Conference on
Artificial Intelligence, July 1992:

ADAPTING BIAS BY GRADIENT DESCENT:
AN INCREMENTAL VERSION OF DELTA-BAR-DELTA

Richard S. Sutton
GTE Laboratories Incorporated

Appropriate bias is widely viewed as the key to efficient learning and
generalization. I present a new algorithm, the Incremental
Delta-Bar-Delta (IDBD) algorithm, for the learning of appropriate biases
based on previous learning experience. The IDBD algorithm is developed
for the case of a simple, linear learning system---the LMS or delta rule
with a separate learning-rate parameter for each input. The IDBD
algorithm adjusts the learning-rate parameters, which are an important
form of bias for this system. Because bias in this approach is adapted
based on previous learning experience, the appropriate testbeds are
drifting or non-stationary learning tasks. For particular tasks of this
type, I show that the IDBD algorithm performs better than ordinary LMS
and in fact finds the optimal learning rates. The IDBD algorithm extends
and improves over prior work by Jacobs and by me in that it is fully
incremental and has only a single free parameter. This paper also
extends previous work by presenting a derivation of the IDBD algorithm as
gradient descent in the space of learning-rate parameters. Finally, I
offer a novel interpretation of the IDBD algorithm as an incremental form
of hold-one-out cross validation.
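
For readers who want a concrete picture of the per-input learning-rate idea,
here is a minimal sketch of one IDBD-style update step for a linear unit.
The exact update equations follow my reading of the paper (single meta
learning rate theta, per-input rates alpha_i = exp(beta_i), and an auxiliary
trace h_i) and may differ from it in detail; this is an illustration, not
code from the paper.

import numpy as np

def idbd_step(w, beta, h, x, target, theta=0.01):
    # One IDBD-style step for a linear unit y = w.x, with per-input
    # learning rates alpha_i = exp(beta_i) and meta learning rate theta.
    delta = target - np.dot(w, x)              # prediction error
    beta = beta + theta * delta * x * h        # meta-level gradient step
    alpha = np.exp(beta)                       # per-input learning rates
    w = w + alpha * delta * x                  # base-level (LMS-like) step
    h = h * np.maximum(0.0, 1.0 - alpha * x * x) + alpha * delta * x
    return w, beta, h

# usage sketch: w = np.zeros(n); beta = np.full(n, np.log(0.05));
# h = np.zeros(n); then call idbd_step once per (x, target) example.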

--------------------------------------------------------------------
Appeared in the Proceedings of the Seventh Yale Workshop on Adaptive
and Learning Systems, May 1992, pages 161-166:

GAIN ADAPTATION BEATS LEAST SQUARES?

Richard S. Sutton
GTE Laboratories Incorporated

I present computational results suggesting that gain-adaptation
algorithms based in part on connectionist learning methods may improve
over least squares and other classical parameter-estimation methods for
stochastic time-varying linear systems. The new algorithms are evaluated
with respect to classical methods along three dimensions: asymptotic
error, computational complexity, and required prior knowledge about the
system. The new algorithms are all of the same order of complexity as
LMS methods, O(n), where n is the dimensionality of the system, whereas
least-squares methods and the Kalman filter are O(n^2). The new methods
also improve over the Kalman filter in that they do not require a
complete statistical model of how the system varies over time. In a
simple computational experiment, the new methods are shown to produce
asymptotic error levels near that of the optimal Kalman filter and
significantly below those of least-squares and LMS methods. The new
methods may perform better even than the Kalman filter if there is any
error in the filter's model of how the system varies over time.
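
The paper's own gain-adaptation algorithms are not reproduced here, but the
complexity claim above can be made concrete with a generic comparison: an
LMS-style step touches each of the n weights once (O(n)), while a recursive
least-squares / Kalman-style step must also update an n-by-n matrix (O(n^2)).
The sketch below only illustrates that difference.

import numpy as np

def lms_step(w, x, target, alpha=0.1):
    # O(n): each of the n weights is touched once
    delta = target - w @ x
    return w + alpha * delta * x

def rls_step(w, P, x, target, lam=1.0):
    # O(n^2): the n-by-n matrix P must be updated as well
    Px = P @ x
    k = Px / (lam + x @ Px)            # gain vector
    w = w + k * (target - w @ x)
    P = (P - np.outer(k, Px)) / lam
    return w, P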

------------------------------------------------------------------------
To appear in the Proceedings of the Fourteenth Annual Conference of
the Cognitive Science Society, July 1992:

ADAPTATION OF CUE-SPECIFIC LEARNING RATES
IN NETWORK MODELS OF HUMAN CATEGORY LEARNING

Mark A. Gluck, Paul T. Glauthier
Center for Molecular and Behavioral Neuroscience, Rutgers

and Richard S. Sutton
GTE Laboratories Incorporated

Recent engineering considerations have prompted an improvement to the
least mean squares (LMS) learning rule for training one-layer adaptive
networks: incorporating a dynamically modifiable learning rate for each
associative weight accelerates overall learning and provides a mechanism
for adjusting the salience of individual cues (Sutton, 1992). Prior
research has established that the standard LMS rule can characterize
aspects of animal learning (Rescorla & Wagner, 1972) and human category
learning (Gluck and Bower, 1988). We illustrate how this enhanced LMS
rule is analogous to adding a cue-salience or attentional component to
the psychological model, giving the network model a means of
distinguishing between relevant and irrelevant cues. We then demonstrate
the effectiveness of this enhanced LMS rule for modelling human
performance in two non-stationary learning tasks for which the standard
LMS network model fails to account for the data (Hurwitz, 1990; Gluck,
Glauthier & Sutton, in preparation).




To obtain copies of these papers, please send an email request to
jpierce@gte.com. Be sure to include your physical mailing address.


------------------------------

Subject: Book - Single Neuron Computation
From: Kathleen Tibbetts <ktibbetts@igc.org>
Date: Wed, 27 May 92 15:13:23 -0800

[[ Editor's Note: To Ms. Tibbetts' query, as long-time readers know,
publication in this Digest is at my discretion. I am generally quite
generous in my standards, rejecting only crassly commercial announcements
and topics inappropriate to this forum. However, simple product and
publication notices are welcome, provided they have a maximum of
information and a minimum of hype. -PM ]]

I am not sure whether this is appropriate for inclusion in Neuron Digest.
I did not see a statement of your policy on such postings in any of the
issues I had on file. If this runs contrary to the Digest's policy, my
apologies in advance. Please don't hesitate to contact me by the means
below if you have any questions at all.

Sincerely,

Kathleen M. Tibbetts
Acquisitions Editor
Computer Science

------------------------------------------------------------
Academic Press                    Phone: (617) 876-3901
955 Massachusetts Avenue          Fax:   (617) 661-3608
Cambridge, MA 02139               email: ktibbetts@igc.org
------------------------------------------------------------



***************************************************************************

Academic Press announces the publication of:



SINGLE NEURON COMPUTATION

Edited by

Thomas McKenna
Joel Davis
Steven F. Zornetzer

Office of Naval Research
Biological Intelligence Program
Arlington, Virginia



The twenty-two original contributions to this volume provide a
comprehensive overview of computational approaches to understanding
single neuron structure. The focus on cellular-level processes is
two-fold: From a computational neuroscience perspective, a thorough
understanding of the information processing performed by single neurons
leads to an understanding of circuit- and systems-level activity. From
the standpoint of artificial neural networks (ANNs), a single real neuron
is as complex an operational unit as an entire ANN; formalizing the complex
computations performed by real neurons is essential to the design of
enhanced processor elements for use in the next generation of ANNs.



CONTENTS:

I. COMPUTATION IN DENDRITES AND SPINES

1 Electronic Models of Neuronal Dendrites and Single Neuron Computation
- William R. Holmes and Wilfrid Rall

2 Canonical Neurons and Their Computational Organization
- Gordon M. Shepherd

3 Computational Models of Hippocampal Neurons
- Brenda J. Claiborne, Anthony M. Zador, Zachary F. Mainen, and
Thomas H. Brown

4 Hebbian Computations in Hippocampal Dendrites and Spines
- Thomas H. Brown, Anthony M. Zador, Zachary F. Mainen, and
Brenda J. Claiborne

5 Synaptic Integration by Electro-diffusion in Dendritic Spines
- Terrence J. Sejnowski and Ning Qian

6 Dendritic Morphology, Inward Rectification, and the Functional
Properties of Neostriatal Neurons
- Charles J. Wilson

7 Analog and Digital Processing in Single Nerve Cells: Dendritic
Integration and Axonal Propagation
- Idan Segev, Moshe Rapp, Yair Manor, and Yosef Yarom

8 Functions of Very Distal Dendrites: Experimental and Computational
Studies of Layer 1 Synapses on Neocortical Pyramidal Cells
- Larry J. Cauller and Barry W. Connors



II. ION CHANNELS AND PATTERNED DISCHARGE, SYNAPSES, AND NEURONAL
SELECTIVITY

9 Ionic Currents Governing Input-Output Relations of Betz Cells
- Peter C. Schwindt

10 Determination of State-Dependent Processing in Thalamus by Single
Neuron Properties and Neuromodulators
- David A. McCormick, John Huguenard, and Ben W. Strowbridge

11 Temporal Information Processing in Synapses, Cells, and Circuits
- Philip S. Antón, Richard Granger, and Gary Lynch

12 Multiplying with Synapses and Neurons
- Christof Koch and Tomaso Poggio

13 A Model of the Directional Selectivity Circuit in the Retina:
Transformations by Neurons Singly and in Concert
- Lyle J. Borg-Graham and Norberto M. Grzywacz


III. NEURONS IN THEIR NETWORKS

14 Exploring Cortical Microcircuits: A Combined Anatomical, Physiological,
and Computational Approach
- Rodney J. Douglas and Kevan A. C. Martin

15 Evolving Analog VLSI Neurons
- M. A. Mahowald

16 Relations between the Dynamical Properties of Single Cells and Their
Networks in Piriform (Olfactory) Cortex
- James M. Bower

17 Synchronized Multiple Bursts in the Hippocampus: A Neuronal Population
Oscillation Uninterpretable without Accurate Cellular Membrane Kinetics
- Roger D. Traub and Richard Miles


IV. MULTISTATE NEURONS AND STOCHASTIC MODELS OF NEURON DYNAMICS

18 Signal Processing in Multi-Threshold Neurons
- David C. Tam

19 Cooperative Stochastic Effects in a Model of a Single Neuron
- Adi R. Bulsara, William C. Schieve, and Frank E. Moss

20 Critical Coherence and Characteristic Times in Brain Stem
Neuronal Discharge Patterns
- Karen Z. Selz and Arnold J. Mandell

21 A Heuristic Approach to Stochastic Models of Single Neurons
- Charles E. Smith

22 Fractal Neuron Firing Patterns
- Malvin C. Teich




ISBN 0-12-484815-X $55.00 hardcover April, 1992 644 pp.


U.S. and Canadian Customers may call toll-free 1-800-321-5068
or Fax 1-800-336-7377 Mon. - Fri. 8:30 AM to 7:00 PM Eastern Time.

Free shipping and handling with prepaid orders.
Visa, Mastercard, and American Express accepted, or send check or
money order to:

Academic Press
HBJ Order Fulfillment Department #18182
6277 Sea Harbor Drive
Orlando, FL 32887

In Europe call: 081-300-3322

Or write:
Academic Press
Book Marketing Department
24-48 Oval Road
London NW1 7DX, U.K.


------------------------------

Subject: Book Advert-CV,GCV, et al
From: Grace Wahba <wahba@stat.wisc.edu>
Date: Wed, 27 May 92 20:43:00 -0600

BOOK ADVERT - CV, GCV, DF SIGNAL, The BIAS-VARIANCE TRADEOFF
AND ALL THAT ....

Spline Models for Observational Data by G. Wahba v 59 in the SIAM
NSF/CBMS Series in Applied Mathematics

Although this book is written in the language of statistics it covers a
number of topics that are increasingly recognized as being of importance
to the computational learning community. It is well known that models
such as neural nets, radial basis functions, spline and other Bayesian
models that are adapted to fit the data very well may in fact overfit the
data, leading to large generalization error. In particular, minimizing
generalization error, aka the bias-variance tradeoff, is discussed in
the context of smooth multivariate function estimation with noisy data.

Here, reducing the bias (fitting the data well) increases the variance (a
proxy for the generalization error) and vice versa. Included is an
in-depth discussion of ordinary cross validation, generalized cross
validation and unbiased risk as criteria for optimizing the bias-
variance tradeoff. The role of "degrees of freedom for signal" as well as
the relationships between Bayes estimation, regularization, optimization
in (reproducing kernel) Hilbert spaces, splines, and certain radial basis
functions are covered, as well as a discussion of the relationship
between generalized cross validation and maximum likelihood estimates of
the main parameter(s) controlling the bias-variance tradeoff, both in the
context of a well-known prior for the unknown smooth function, and in
the general context of (smooth) regularization.
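
As a concrete illustration of one of the criteria mentioned above, here is a
minimal sketch of generalized cross validation for a plain ridge-regression
influence matrix. The spline setting of the book, and its more efficient ways
of computing the trace term, are not reproduced here; this is only a sketch
under that simplified assumption.

import numpy as np

def gcv_score(X, y, lam):
    # GCV(lam) = (1/n)||y - A y||^2 / [ (1/n) tr(I - A) ]^2,
    # with influence matrix A = X (X'X + n*lam*I)^{-1} X'.
    n, p = X.shape
    A = X @ np.linalg.solve(X.T @ X + n * lam * np.eye(p), X.T)
    resid = y - A @ y
    return (resid @ resid / n) / ((np.trace(np.eye(n) - A) / n) ** 2)

# usage sketch: pick lam by minimizing gcv_score over a grid, e.g.
# lams = np.logspace(-6, 1, 30)
# best = min(lams, key=lambda l: gcv_score(X, y, l))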

....................

Spline Models for Observational Data, by Grace Wahba v. 59 in the
CBMS-NSF Regional Conference Series in Applied Mathematics, SIAM,
Philadelphia, PA, March 1990. Softcover, 169 pages, bibliography, author
index. ISBN 0-89871-244-0

List Price $24.75, SIAM or CBMS* Member Price $19.80 (Domestic 4th class
postage free, UPS or Air extra)

May be ordered from SIAM by mail, electronic mail, or phone:
e-mail (internet) service@siam.org

SIAM P. O. Box 7260 Philadelphia, PA 19101-7260 USA

Toll-Free 1-800-447-7426 (8:30-4:45 Eastern Standard Time, USA) Regular
phone: (215)382-9800 FAX (215)386-7999

May be ordered on American Express, Visa or Mastercard, or paid by check
or money order in US dollars, or may be billed (extra charge).

*CBMS member organizations include AMATC, AMS, ASA, ASL, ASSM, IMS, MAA,
NAM, NCSM, ORSA, SOA and TIMS.


------------------------------

Subject: TR - Development of Schemata During Event Parsing:
From: Steve Hanson <jose@tractatus.siemens.com>
Date: Wed, 27 May 92 22:38:42 -0500


The following paper (NOT posted on neuro-prose) can be obtained by sending
a note, with your postal address, to kic@learning.siemens.com.

To Appear in Cognitive Science Conference, July 1992, Indiana University.


DEVELOPMENT of SCHEMATA DURING EVENT PARSING:
Neisser's Perceptual Cycle as a Recurrent Connectionist Network

Catherine Hanson                        Stephen José Hanson
Department of Psychology                Learning Systems Department
Temple University                       SIEMENS Research
Philadelphia, PA 19122                  Princeton, NJ 08540

Phone: 215-787-1279                     Phone: 609-734-3360
EMAIL: cat@astro.ocis.temple.edu        EMAIL: jose@tractatus.siemens.com

Abstract

Event boundary judgements depend on schema activation and subsequently
affect encoding of perceptual action sequences. Past work has either
focused on process level descriptions (Neisser) without computational
implications or on knowledge structure level descriptions (Schank's
"scripts") without also providing process level descriptions at a
computational level. The present work combines both process level
descriptions and learned knowledge structures in a simple recurrent
connectionist network.

The recurrent connectionist network is used to model humans' event
parsing judgements of two kinds of video-taped event sequences. The
network can accommodate the complex event boundary judgement time-series
and makes predictions about how schemata are activated, what role they
play during encoding, and how they develop during learning.
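
For readers unfamiliar with the architecture, the sketch below shows a
generic simple recurrent (Elman-style) forward pass, in which the hidden
state at each time step depends on the current input and the previous hidden
state. It is only an illustration of the network class, not the authors'
model of event parsing; weight matrices and output nonlinearity are
assumptions for the sketch.

import numpy as np

def srn_forward(inputs, W_in, W_rec, W_out):
    # Hidden state at time t depends on input t and hidden state t-1,
    # so the network can carry context across an event sequence.
    h = np.zeros(W_rec.shape[0])
    outputs = []
    for x in inputs:
        h = np.tanh(W_in @ x + W_rec @ h)                    # recurrent context
        outputs.append(1.0 / (1.0 + np.exp(-(W_out @ h))))   # sigmoid output
    return outputs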

Areas: Cognitive Psychology, Connectionist Models, AI




Stephen J. Hanson
Learning Systems Department
SIEMENS Research
755 College Rd. East
Princeton, NJ 08540



------------------------------

Subject: Reinforcement Learning Special Issue of Machine Learning
From: Rich Sutton <rich@gte.com>
Date: Thu, 28 May 92 13:03:55 -0500

Those of you interested in reinforcement learning may want to get a
copy of the special issue on this topic of the journal Machine
Learning. It just appeared this week. Here's the table of contents:


Vol. 8, No. 3/4 of MACHINE LEARNING (May, 1992)

Introduction: The Challenge of Reinforcement Learning
----- Richard S. Sutton (Guest Editor)

Q-Learning
----- Christopher J. C. H. Watkins and Peter Dayan

Practical Issues in Temporal Difference Learning
----- Gerald Tesauro

Transfer of Learning by Composing Solutions for Elemental Sequential Tasks
----- Satinder Pal Singh

Simple Gradient-Estimating Algorithms for Connectionist Reinforcement Learning
----- Ronald J. Williams

Temporal Differences: TD(lambda) for general Lambda
----- Peter Dayan

Self-Improving Reactive Agents Based on Reinforcement Learning,
Planning and Teaching
----- Long-ji Lin

A Reinforcement Connectionist Approach to Robot Path Finding
in Non-Maze-Like Environments
----- Jose del R. Millan and Carme Torras


Copies can be ordered from:            Outside North America:
Kluwer Academic Publishers             Kluwer Academic Publishers
Order Department                       Order Department
P.O. Box 358                           P.O. Box 322
Accord Station                         3300 AH Dordrecht
Hingham, MA 02018-0358                 The Netherlands
tel. 617-871-6600
fax. 617-871-6528


------------------------------

Subject: Preprint available: Synchronization and label-switching
From: Alfred_Nischwitz <alfred@lnt.e-technik.tu-muenchen.de>
Date: Tue, 02 Jun 92 10:57:24 +0100


The following paper has been accepted for publication in the proceedings
of the International Conference on Artificial Neural Networks '92 in
Brighton:

SYNCHRONIZATION AND LABEL-SWITCHING IN NETWORKS OF
LATERALLY COUPLED MODEL NEURONS

by Alfred Nischwitz, Peter Klausner, and Andreas von Oertzen
   Lehrstuhl fuer Nachrichtentechnik
   Technische Universitaet Muenchen
   Arcisstrasse 21, D-8000 Muenchen 2, Germany

and Helmut Gluender
   Institut fuer Medizinische Psychologie
   Ludwig-Maximilians-Universitaet
   Goethestrasse 31, D-8000 Muenchen 2, Germany

ABSTRACT:

Necessary conditions for impulse synchronization in non-oscillating
networks of laterally coupled 'integrate-and-fire' model neurons are
investigated. The behaviour of such networks for homogeneous stimulations
as well as for differently stimulated subpopulations is studied. In the
first case, synchronization accurate to fractions of the impulse duration
can be achieved by either lateral inhibition or lateral excitation and in
the second case, good and independent synchronization is obtained within
subpopulations, if they are separated by unstimulated neurons.
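
As a rough illustration of the kind of model referred to, the sketch below
simulates a small population of leaky 'integrate-and-fire' units with uniform
lateral coupling. The parameters, coupling scheme, and reset rule are
illustrative assumptions, not those of the paper.

import numpy as np

def simulate(n=20, steps=1000, dt=1.0, tau=20.0, v_th=1.0,
             drive=0.06, w_lat=0.02, seed=0):
    rng = np.random.default_rng(seed)
    v = rng.uniform(0.0, v_th, n)        # membrane potentials
    spikes = np.zeros(n, dtype=bool)
    raster = []
    for _ in range(steps):
        # each neuron receives a common drive plus input from the
        # other neurons that fired on the previous step
        lateral = w_lat * (spikes.sum() - spikes)
        v = v + (dt / tau) * (-v) + drive + lateral
        spikes = v >= v_th
        v[spikes] = 0.0                  # reset after firing
        raster.append(spikes.copy())
    return np.array(raster)              # steps x n spike raster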

Hardcopies of the paper are available. Please send requests via email or
to the following address in Germany:

Alfred Nischwitz
Lehrstuhl fuer Nachrichtentechnik
Technische Universitaet Muenchen
Arcisstrasse 21, D-8000 Muenchen 2, F.R.Germany
email: alfred@lnt.e-technik.tu-muenchen.de

Alfred Nischwitz


------------------------------

Subject: TR - Neural Networks And Genetic Algorithm For Economic Forecasting
From: Goh Tiong Hwee <thgoh@iss.nus.sg>
Date: Wed, 20 May 92 16:54:53 +0700

I have placed the following paper in the neuroprose archive.

Hardcopy requests by snailmail to me at the institute.

Thanks to Steve Pollack for providing the archive service.

Neural Networks And Genetic Algorithm For Economic Forecasting
Francis Wong, PanYong Tan
Institute of Systems Science
National University of Singapore


Abstract:

This paper describes the application of an enhanced neural network and
genetic algorithm to economic forecasting. Our proposed approach has
several significant advantages over conventional forecasting methods such
as regression and the Box-Jenkins methods. Apart from being simple and
fast in learning, a major advantage is that no assumptions need to be made
about the underlying function or model, since the neural network is able
to extract hidden information from the historical data. In addition, the
enhanced neural network offers selective activation and training of
neurons based on the instantaneous causal relationship between the
current set of input training data and the output target. This causal
relationship is represented by the Accumulated Input Error (AIE) indices,
which are computed based on the accumulated errors back-propagated to the
input layers during training. The AIE indices are used in the selection
of neurons for activation and training. Training time can be reduced
significantly, especially for large networks designed to capture temporal
information. Although neural networks represent a promising alternative
for forecasting, the problem of network design remains a bottleneck that
could impair widespread application in practice. The genetic algorithm
is used to evolve optimal neural network architectures automatically,
thus eliminating the many pitfalls associated with human-engineering
approaches. The proposed concepts and design paradigm were tested on
several real applications (please email thgoh@iss.nus.sg for a copy of
the software), including forecasts of GDP, air passenger arrivals, and
currency exchange rates.
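
The abstract does not give the exact definition of the AIE indices, so the
sketch below shows only one plausible reading: accumulate, for each input
unit, the magnitude of the error back-propagated to it during training, and
treat large accumulated values as evidence of a strong input-output coupling.
The network, function names, and details are hypothetical and may differ from
the paper's method.

import numpy as np

def train_with_aie(X, y, hidden=8, epochs=50, lr=0.1, seed=0):
    rng = np.random.default_rng(seed)
    n, d = X.shape
    W1 = rng.normal(0.0, 0.1, (hidden, d))
    W2 = rng.normal(0.0, 0.1, hidden)
    aie = np.zeros(d)                      # accumulated input error indices
    for _ in range(epochs):
        for x, t in zip(X, y):
            h = np.tanh(W1 @ x)
            err = (W2 @ h) - t             # scalar output error
            dh = err * W2 * (1.0 - h * h)  # error at the hidden layer
            aie += np.abs(W1.T @ dh)       # error back-propagated to each input
            W2 -= lr * err * h
            W1 -= lr * np.outer(dh, x)
    return W1, W2, aie                     # large aie[i] -> input i matters more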

ftp Instructions:

unix> ftp archive.cis.ohio-state.edu (or 128.146.8.52)
Name: anonymous
Password: neuron
ftp> cd pub/neuroprose
ftp> binary
ftp> get wong.nnga.ps.Z
ftp> quit
__________________________________________________________
Tiong Hwee Goh
Institute of Systems Science
National University of Singapore
Heng Mui Keng Terrace
Kent Ridge
Singapore 0511.
Telephone:(65)7726214
Fax :(65)7782571


------------------------------

Subject: Neural Chess Announcement
From: David Kanecki <kanecki@cs.uwp.edu>
Date: Sat, 23 May 92 19:12:59 -0600

I will be presenting a paper on the neural chess program at the 1992
Summer Computer Simulation Conference in Sparks, Nevada, being held
July 27-30. In this presentation I will examine three matches played by the
computer chess program. For conference proceedings, please contact the
Society for Computer Simulation, P.O. Box 17900, San Diego, CA
92177-7900.

Below is the title and abstract of the chess paper:


PAPER PRESENTATION - NEURAL CHESS

Simulation as an intelligent, thinking computer program
as Neural Chess

David H. Kanecki
Blood Research Institute
Milwaukee, WI 53226

Abstract

This paper will cover three parts:
1) Overview of past and present neural computer programs.
2) Basic flow charting and simulation in neural chess.
3) Neural network chess program and demonstration.

An intelligent thinking computer program of neural chess has been
developed. The neural chess program is unique in that it has "no
database or game tree", but has "real time neural update" using a
biological model as its basis. The program has been tested and can beat a
human opponent and has fought to a stalemate against itself.

The above work represents 10 years of research. I have used neural
networks to solve a classic artificial intelligence problem, playing a
game of chess. The program uses an artificial neural network that is
organized using a biological system model. One advantage of neural
networks over game trees, databases, and other methods is that they are
dynamic structures that can respond to real time challenges. Due to the
dynamic nature of the neural networks, the program has been able to beat
a human opponent and fight to a stalemate by using real time neural
adaptation and learning.

The neural network uses a biological architecture as the basis for its
decisions. The central idea in the neural chess program is the atomic
neuron. By linking many atomic neurons together, the program can reason
about its moves and its opponent's moves by constructing an atomic mind of
its opponent.

The system consists of thousands of lines of code, with logic subroutines
and interpretive and descriptive analysis of the chess game. Many options
are considered for each chess move.

In this paper I will examine three matches involving four computer and two
human opponents. Based upon the overall play, the computer was more
consistent than an actual player. Also, there are some traits that a person
and the computer player have in common. Thus, atomic neural networks in a
chess decision making environment show intelligent problem solving
behavior. Also, I can submit game tactics for evaluation by the neural
chess program.

Keywords: Simulation methodologies, AI in simulation, AI/KBS in simulation,
atomic neuron, atomic mind, simulating biological neural networks,
decision making application using neural networks




Additional Comments and Remarks about pending work:
---------------------------------------------------------------

The thinking computer program opens a new window on systems that can be
implemented through neural networks and special subroutines.

In phase 2, I am studying the learning and teaching capability of neural
technology as it applies to a chess match run by computer and two famous
master chess matches. The analysis feature of the program will allow
neural decision making between players. The results of this work will be
presented in a paper at a future date.

The thinking computer program with specific subroutines is used to develop
systems such as neural chess. Similarly, other subroutines can develop
other thinking computer systems for use in biology, chemistry, electrical
engineering, and other fields, limited only by what can be sensed or
programmed.

If anyone would like to submit a chess scenario and enclose a
self-addressed return envelope, I would be glad to send the responses the
computer made for the moves (use the e-mail address or postal address:
P.O. Box 93, Kenosha, WI 53141).

I would like to see some of you at the conference. I would also like to
thank the moderator for letting me share my viewpoints and this development
with him and others, and to thank those on the network for their feedback
and support.

"The key to sucess, is accomplishment with positive results ---
Using better people, places, things, and resources."


David H. Kanecki, Bio. Sci., A.C.S.


------------------------------

End of Neuron Digest [Volume 9 Issue 24]
****************************************
