NEURON Digest       24 FEB 1987       Volume 2 Number 6 

Topics in this digest --
Queries - genetic algorithms &
Simulator Query
Replies - Arpanet link in the future
News - Rochester Connectionist Simulator release in April
Seminars/Courses - USING UNCERTAINTY TO SOLVE ANALOGIES (CUNY) &
Evaluating Adaptive Networks (Stanford) &
BoltzCONS (Berkeley) &
2/20 Grad AI Seminar (CMU) &
Thesis defense - David H. Ackley (CMU) &
Learning to Represent Knowledge in Brain-Style ... (CMU) &
Self-Organizing Neural Pattern Recognition (BU)
Conferences/Call for papers - Cognitive Science Society submissions &
1987 International Symposium on CIRCUITS AND SYSTEMS &
UNIV. IOWA CONFERENCE ON LEARNING & MEMORY

----------------------------------------------------------------------

Date: 14-FEB-1987 10:40
From: TENORIO@EE.ECN.PURDUE.EDU
Subject: genetic algorithms

Mike,
Could you give me any pointers to the genetic algorithm literature?
I am doing research in mapping rules into neural nets, which have only
the local effect that you mentioned.

--ft.

[I am not sure exactly what you mean by local effect - or the mapping
of rules. But if you are interested in translating expert system-type
production rules into weights of a neural network for execution, you
will want to look into the work of Jerome Feldman & Dana Ballard of
Univ of Rochester (Proc AAAI-86, Vol 1, pg 203), Claude Cruz-Young
of IBM-San Jose, and maybe even some folks at CMU (Scott Fahlman and
Dave Touretzky) - MTG]
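
[As a concrete illustration of one simple case -- an editorial sketch,
not taken from any of the papers cited above -- a production rule whose
condition is a conjunction can be compiled into a single linear
threshold unit: weights of +1 for positive conditions, -1 for negated
ones, and a threshold equal to the number of positive conditions. In
Python:

    # Hypothetical sketch: compiling the rule "IF a AND b AND NOT c
    # THEN fire" into one linear threshold unit.
    def rule_unit(inputs, weights, threshold):
        # Fires (returns 1) when the weighted sum of 0/1 inputs
        # reaches the threshold.
        s = sum(w * x for w, x in zip(weights, inputs))
        return 1 if s >= threshold else 0

    weights, threshold = [1, 1, -1], 2   # encodes: a AND b AND NOT c
    assert rule_unit([1, 1, 0], weights, threshold) == 1   # rule applies
    assert rule_unit([1, 1, 1], weights, threshold) == 0   # blocked by c
    assert rule_unit([1, 0, 0], weights, threshold) == 0   # b missing

The cited papers go well beyond this, of course.]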

------------------------------

Date: 23-FEB-1987 16:17
From: RAVI%CS.DUKE.EDU@DUKE.CS.DUKE.EDU
Subject: Simulator Query

Can anyone provide me with pointers to simulator programs for Connectionist
Networks / General Value-Passing Networks?

I am proposing to build a simulator for a project, but would much prefer
to use an existing one if it were available. The program I am proposing
would take a description of a network and translate it into a program to
simulate it (much like what YACC does with parsers). By having it build
programs, a high degree of generality can be achieved, as the definitions
of what the nodes and links are can be written in. Also, like YACC, the
simulation compiler would only build the simulation engine, allowing
the user to add any interface he desires.
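
[To make the YACC analogy concrete, here is a small editorial sketch --
not Michael's design; the description format and names are invented -- of
a "simulation compiler" that translates a network description into the
source code of a program that simulates it:

    # Hypothetical sketch: translate a network description into Python
    # source for one synchronous update step, YACC-style.
    net = {
        "units": ["a", "b", "out"],
        "links": [("a", "out", 0.5), ("b", "out", 0.5)],  # (src, dst, weight)
    }

    def compile_net(net):
        # Emit source code; what a node or link computes could be
        # "written in" here, giving the generality described above.
        lines = ["def step(act):", "    new = dict(act)"]
        for u in net["units"]:
            terms = [f"{w!r} * act[{s!r}]" for s, d, w in net["links"] if d == u]
            if terms:
                lines.append(f"    new[{u!r}] = " + " + ".join(terms))
        lines.append("    return new")
        return "\n".join(lines)

    exec(compile_net(net))   # defines step(), the simulation engine
    print(step({"a": 1.0, "b": 1.0, "out": 0.0}))   # {'a': 1.0, 'b': 1.0, 'out': 1.0}

As with YACC, the generated step() is only the engine; the interface
around it is left to the user.]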

Any comments would be appreciated. I would be happy to send my notes on
the simulator ideas to anyone interested.

Thanks

Michael Lee Gleicher                     (-: If it looks like I'm wandering
Duke University                          (-: around like I'm lost . . .
Now appearing at : duke!ravi             (-:
Or P.O.B. 5899 D.S., Durham, NC 27706    (-: It's because I am!

------------------------------

Date: 16-FEB-1987 23:59
From: <several different folks>
Subject: Arpanet link in the future

> [although I can store information on my computer, I cannot give
> others access to it! I am unsure of the proper action to take
> here - any ideas? - MTG]

I have had several replies to this comment indicating that TI will
soon get their own ARPANet connection. When this occurs, I will
be able to place files into publicly accessible directories. I
will let you know when this happens.

------------------------------

Date: 17-FEB-1987 16:40
From: GODDARD@CS.ROCHESTER.EDU
Subject: Rochester Connectionist Simulator release in April

In mid-April we will be releasing a much improved version of the simulator.
The Rochester simulator allows construction and simulation of arbitrary
networks with arbitrary unit functions. It is designed to be easily
extensible both in unit structures and user interface, and includes a facility
for interactive debugging during network construction. The simulator is
written in C and currently runs here on Suns, Vaxen and the BBN Butterfly
multiprocessor (and should run on any UNIX machine). There is a graphics
interface package which runs on a Sun under suntools, and is primarily
designed for interactive display of the flow of activation during network
simulation. The simulator is easy to use for novices, and highly flexible
for those with expertise.
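
[For readers unfamiliar with this style of tool, a bare sketch of the
general shape -- invented here, not the Rochester code -- in Python:

    # Units carry arbitrary user-supplied unit functions; links carry
    # weights; step() does one synchronous update of the whole network.
    class Unit:
        def __init__(self, fn):
            self.fn = fn          # arbitrary unit function: inputs -> activation
            self.inputs = []      # list of (source Unit, weight) pairs
            self.act = 0.0

    def step(units):
        new = [u.fn([src.act * w for src, w in u.inputs]) for u in units]
        for u, a in zip(units, new):
            u.act = a

    a = Unit(lambda xs: 1.0)                  # clamped source unit
    b = Unit(lambda xs: max(0.0, sum(xs)))    # thresholded sum
    b.inputs = [(a, 0.8)]
    step([a, b]); step([a, b])
    print(b.act)                              # 0.8

The actual simulator adds construction tools, interactive debugging and
the graphics interface on top of a core of this kind.]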

We are now collecting names and addresses of people and sites interested
in receiving a copy of the simulator when released in April. The preferred
method for dissemination is via electronic mail, but we will also send tape
and possibly disk copies. To get on the distribution list, send mail to
costanzo@cs.rochester.edu giving your name and addresses (both physical
and electronic). This address is for the distribution list ONLY, for other
questions see below. It is possible that there will be some kind of minimal
licensing agreement required, for a nominal fee.

There are many papers, journal articles and technical reports which give
an idea of the connectionist research and philosophy here at Rochester.
A complete list of these is in "Rochester Connectionist Papers: 1979-1985",
by Feldman, Ballard, Brown and Dell, Computer Science TR 172. For this or
any other technical report, write to:

Peggy Meeker
Department of Computer Science
University of Rochester
Rochester, NY 14627

The previous version of the simulator with some documentation is available
immediately via electronic mail from me (see addresses below). However
you are advised to wait for the April release, as the documentation will be
much better. Any other questions about the simulator should also be addressed
to me.

Nigel Goddard

goddard@cs.rochester.edu
...!seismo!rochester!goddard

------------------------------

Date: 15-FEB-1987 05:03
From: DROGERS%FARG.UMICH@UMIX.CC.UMICH.EDU
Subject: USING UNCERTAINTY TO SOLVE ANALOGIES (CUNY)

[Note: While this work isn't really "neural", the essential stochastic,
parallel nature of the architecture may appeal to some readers of this
list.]

Title: USING UNCERTAINTY TO SOLVE ANALOGIES
Speaker: David Rogers, Cognitive Science and Machine Intelligence Laboratory
From: University of Michigan, Ann Arbor
Place: Brooklyn College, CUNY
Date: 4 March 1987 (contact the CS Department for the time)
Netmail: drogers@farg.UMICH.EDU

ABSTRACT:
Analogy involves the conceptual mapping of one situation onto another,
assigning correspondences between objects in each situation.
Uncertainty concerning the values of the objects' attributes or the
``correct'' category of an object is commonly considered a nuisance
of little theoretical importance. In contrast, in this approach
uncertainty is central: all attributes are to some degree uncertain,
and category assignment of objects is fluid. Thanks to this
all-pervading uncertainty (rather than despite it), this architecture
allows the system to represent the multiple, often conflicting pressures
that guide our perceptions of situations in an analogy. Further,
parallelism without global control is intrinsic to this architecture.
Control is distributed throughout the system, at the level of its most
primitive entities -- objects -- each object communicating with a small
number of other objects in the world.

I will present a domain that uses deceptively simple strings of letters,
followed by a description of the architecture used to solve problems
in this domain. Finally, some results from a program written to implement
these ideas will be presented.
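
[A purely illustrative sketch of the control style described -- not the
author's program; the strings and the scoring rule are invented: each
object holds an uncertain guess about its correspondent, and a randomly
chosen object revises its guess using only its immediate neighbors'
current guesses.

    import random
    random.seed(0)

    src, dst = "abc", "abd"
    mapping = [random.randrange(len(dst)) for _ in src]   # uncertain guesses

    def local_score(i, j):
        # Prefer mappings that keep neighbors in the same relative order.
        s = 0
        if i > 0:             s += 1 if mapping[i - 1] < j else -1
        if i < len(src) - 1:  s += 1 if mapping[i + 1] > j else -1
        return s

    for _ in range(200):      # asynchronous, locally controlled updates
        i = random.randrange(len(src))
        scores = [local_score(i, j) for j in range(len(dst))]
        best = max(scores)
        mapping[i] = random.choice([j for j, s in enumerate(scores) if s == best])

    # Tends to settle on the order-preserving map a->a, b->b, c->d.
    print(list(zip(src, [dst[j] for j in mapping])))

No object ever sees the whole mapping; coherence emerges from the local,
stochastic updates.]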

------------------------------

Date: 17-FEB-1987 00:17
From: GLUCK@PSYCH.STANFORD.EDU
Subject: Evaluating Adaptive Networks (Stanford)

From Conditioning to Categorization:
Evaluating Adaptive Networks as Models of Human Learning

Mark A. Gluck
Stanford University

Friday, February 20th, 3:15pm
Room 100, Jordan Hall/Bldg. 420

ABSTRACT

We used adaptive network theory to extend the Rescorla-Wagner/LMS
model of associative learning to phenomena of human learning
and judgment. In three experiments, subjects learned to categorize
hypothetical patients with particular symptom patterns as
having certain diseases. When one disease is far more likely
than another, the model predicts that subjects will substantially
overestimate the diagnosticity of the more valid symptom for the
Rare disease. This illusory diagnosticity is a learned form of
"base-rate neglect" which has been frequently observed in studies
of probability judgments. The results of Experiments 1 and 2
provided clear support for this prediction in contradistinction
to predictions from probability matching, exemplar retrieval, or
simple prototype learning models. Experiment 3 contrasted the
network model to one predicting pattern-probability matching when
patients always had four symptoms (chosen from four opponent
pairs) rather than the presence/absence of each of four symptoms,
as in Experiment 1. The results again supported the Rescorla-
Wagner/LMS learning rule as embedded within an adaptive network.
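
[For readers who have not seen it, the Rescorla-Wagner/LMS rule itself
is small. The sketch below -- the standard form of the rule, written
here for illustration with invented base rates, not the authors' code --
shows how unequal disease frequencies push the weights around:

    import random
    random.seed(1)

    def rw_update(w, cues, outcome, lr=0.1):
        # One trial of Rescorla-Wagner/LMS: move each present cue's weight
        # so as to reduce the prediction error (lambda - sum of w*x).
        prediction = sum(wi * ci for wi, ci in zip(w, cues))
        error = outcome - prediction
        return [wi + lr * error * ci for wi, ci in zip(w, cues)]

    # Cues A, B, C predict "rare disease?".  The common disease (cues A+B)
    # occurs three times as often as the rare one (cues B+C).
    w = [0.0, 0.0, 0.0]
    for _ in range(2000):
        if random.random() < 0.75:
            w = rw_update(w, [1, 1, 0], 0)    # common disease trial
        else:
            w = rw_update(w, [0, 1, 1], 1)    # rare disease trial
    print([round(x, 2) for x in w])   # near [-0.33, 0.33, 0.67]

The shared cue B ends up far less predictive of the rare disease than
its distinctive cue C -- the kind of asymmetry the experiments probe.]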

------------------------------

Date: 17-FEB-1987 16:28
From: not(LAWS@SRI-STRIPE.ARPA)
Subject: BoltzCONS (Berkeley)

Time: Tuesday, February 24, 3:30-5:00
Place: Building T-4, Room 200
Speaker: David S. Touretzky, Computer Science Department,
Carnegie Mellon University
Title: ``BoltzCONS: Representing and Transforming Recursive
Objects in a Neural Network''

BoltzCONS is a neural network in which stacks and trees are
implemented as distributed activity patterns. The name
reflects the system's mixed representational levels: it is a
Boltzmann Machine in which Lisp cons cell-like structures
appear as an emergent property of a massively parallel
distributed representation. The architecture employs three ideas
from connectionist symbol processing -- coarse coded distributed
memories, pullout networks, and variable binding spaces -- that
first appeared together in Touretzky and Hinton's neural
network production system interpreter. The distributed memory
is used to store triples of symbols that encode cons cells, the
building blocks of linked lists. Stacks and trees can then be
represented as list structures, and they can be manipulated via
associative retrieval. BoltzCONS' ability to recognize shallow
energy minima as failed retrievals makes it possible to
traverse binary trees of unbounded depth nondestructively
without using a control stack. Its two most significant
features as a connectionist model are its ability to represent
structured objects, and its generative capacity, which allows
it to create new symbol structures on the fly.
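
[A toy rendering of just the coarse-coded triple memory -- an editorial
sketch, far simpler than BoltzCONS, with invented parameters:

    import hashlib

    def units_for(triple, n_units=512, k=16):
        # The coarse code: map a triple to k pseudo-random units.
        out = set()
        for i in range(k):
            h = hashlib.md5(f"{triple}:{i}".encode()).hexdigest()
            out.add(int(h, 16) % n_units)
        return out

    memory = set()
    for cell in [("A", "B", "t1"), ("B", "C", "t2")]:   # two cons-like cells
        memory |= units_for(cell)                        # superimpose codes

    def present(triple):
        # Associative retrieval: fraction of the probe's code that is active.
        code = units_for(triple)
        return len(code & memory) / len(code)

    print(present(("A", "B", "t1")))   # 1.0  -> stored
    print(present(("X", "Y", "t3")))   # low  -> a "failed retrieval"

Recognizing the second case as a failed retrieval is the analogue of the
shallow-energy-minimum test mentioned above.]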

A toy application for BoltzCONS is tree transformation. An
attached neural network production system contains a set of
rules for performing a transformation by issuing control
signals to BoltzCONS and exchanging symbols with it. Working
together, the two networks are able to cooperatively transform
a parse tree for ``John kissed Mary'' into the tree for ``Mary
was kissed by John.''

---------------------------------------------------------------
UPCOMING TALKS
Feb 19 (Thursday): Wolfgang Wahlster, Computer Science, University
of Saarbruecken, W. Germany
Feb 24: Richard Rhodes, Linguistics, UCB
Mar 10: Fred Dretske, Philosophy, University of Wisconsin;
currently at UC Berkeley
Mar 31: Terry Sejnowski, Biophysics, Biology, Johns Hopkins;
California Institute of Technology

------------------------------

Date: 18-FEB-1987 12:28
From: not(LAWS@SRI-STRIPE.ARPA)
Subject: 2/20 Grad AI Seminar (CMU)

This week's seminar is being given by Mark Derthick. As usual, we
will meet Friday in 7220 at 3:15. Everyone is invited.

Title: Common sense reasoning without deduction

"I will spend 20 minutes describing some shortcomings of traditional AI
approaches to common sense reasoning, and an alternative which is
related to Johnson-Laird's theory of Mental Models. I have a
connectionist implementation which demonstrates the advantages of this
approach for dealing with contradictory information. After this short
introduction we can discuss its merits. I urge you to look at my
would-be AAAI paper (especially the first 4 sections and the appendix)
before the discussion. Copies are in my office."

(For further details, Mark can be contacted directly at mad@g.cs.cmu.edu)

------------------------------

Date: 19-FEB-1987 12:12
From: not(LAWS@SRI-STRIPE.ARPA)
Subject: Thesis defense - David H. Ackley (CMU)


David H. Ackley
Carnegie Mellon Computer Science doctoral dissertation defense
Tuesday, February 24, 1987 at 1pm
Wean Hall 5409

"Stochastic iterated genetic hillclimbing"

Abstract

In the "black box function optimization" problem, a search strategy is
required to find an extremal point of a function without knowing the
structure of the function or the range of possible function values.
Solving such problems efficiently requires two abilities. On the one
hand, a strategy must be capable of "learning while searching": It must
gather global information about the space and concentrate the search in
the most promising regions. On the other hand, a strategy must be
capable of "sustained exploration": If a search of the most promising
region does not uncover a satisfactory point, the strategy must redirect
its efforts into other regions of the space.

This dissertation describes a connectionist learning machine that
produces a search strategy called "stochastic iterated genetic
hillclimbing" (SIGH). Viewed over a short period of time, SIGH displays
a coarse-to-fine searching strategy, like simulated annealing and
genetic algorithms. However, in SIGH the convergence process is
reversible. The connectionist implementation makes it possible to
"diverge" the search after it has converged, and to recover
coarse-grained information about the space that was suppressed during
convergence. The successful optimization of a complex function by SIGH
usually involves a series of such converge/diverge cycles.

SIGH can be viewed as a generalization of a genetic algorithm and a
stochastic hillclimbing algorithm, in which genetic search discovers
starting points for subsequent hillclimbing, and hillclimbing biases the
population for subsequent genetic search. Several search
strategies---including SIGH, hillclimbers, genetic algorithms, and
simulated annealing---are tested on a set of illustrative functions and
on a series of graph partitioning problems. SIGH is competitive with
genetic algorithms and simulated annealing in most cases, and markedly
superior in a function where the uphill directions usually lead \away/
from the global maximum. In that case, SIGH's ability to pass
information from one coarse-to-fine search to the next is crucial.
Combinations of genetic and hillclimbing techniques can offer dramatic
performance improvements over either technique alone.
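
[A bare illustration of the genetic-search-plus-hillclimbing combination
the abstract describes -- emphatically not SIGH itself, and on a toy
objective that hillclimbing alone would in fact solve; only the control
structure is the point:

    import random
    random.seed(0)

    def f(bits):                    # toy objective: count the 1 bits
        return sum(bits)

    def hillclimb(x):
        # Flip single bits as long as doing so improves f.
        improved = True
        while improved:
            improved = False
            for i in range(len(x)):
                y = x[:]; y[i] ^= 1
                if f(y) > f(x):
                    x, improved = y, True
        return x

    pop = [[random.randint(0, 1) for _ in range(16)] for _ in range(8)]
    for _ in range(5):              # a few generate/refine cycles
        pop.sort(key=f, reverse=True)
        parents = pop[:4]           # genetic search proposes starting points
        children = []
        for _ in range(8):
            p, q = random.sample(parents, 2)
            cut = random.randrange(1, 15)
            children.append(hillclimb(p[:cut] + q[cut:]))   # then refine
        pop = children              # refined points bias the next generation
    print(f(max(pop, key=f)))       # 16, the optimum

In SIGH the two phases are realized in a single reversible connectionist
process rather than as separate steps.]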

------------------------------

Date: 22-FEB-1987 00:58
From: not(LAWS@SRI-STRIPE.ARPA)
Subject: Learning to Represent Knowledge in Brain-Style ... (CMU)


AI SEMINAR

TOPIC: Learning to Represent Knowledge in Brain-Style Computational
Networks

SPEAKER: David E. Rumelhart, U. of California, San Diego

WHEN: Tuesday, February 24, 1987, 3:30 pm

WHERE: Wean Hall 5409

***********************************************
IF YOU WISH TO MEET WITH THE SPEAKER ON TUESDAY
MORNING, PLEASE CONTACT MARCE, X8818, mlz@d
***********************************************

ABSTRACT: The issue of knowledge representation is central to Cognitive
Science and Artificial Intelligence. The difference between an easy problem
and a difficult one is often in the way knowledge is represented. This is as
true in brain-style "connectionist" networks as in symbol processing systems.
It is frequently the choice of representation which makes the difference
between a system which works and a system which does not. Most learning
systems are stuck with the representations with which their creators endowed them.
The real learning question is "How can we learn the representations
appropriate to the environment in which we find ourselves?" It was the
inability to modify its rather impoverished representations which led to the
rejection of the Perceptron. Because of the simplicity and uniformity of
representations within "connectionist" systems we have now been able to
develop learning procedures which not only learn "what to do" in various
situations, but learn "how to represent" the information they must deal with.
This immeasurably increases the power of such networks. The nature of this
learning procedure and a number of examples illustrating the impact of such
learning will be described.
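
[The learning procedure referred to is presumably backpropagation, the
generalized delta rule of Rumelhart, Hinton & Williams (1986). A minimal
sketch -- details mine, not the talk's -- of a network learning its own
hidden representation for XOR, the standard example of a task a
fixed-representation Perceptron cannot solve:

    import math, random
    random.seed(0)

    def sig(x): return 1 / (1 + math.exp(-x))

    H = 3                                     # 2 inputs -> H hidden -> 1 output
    w_h = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(H)]
    w_o = [random.uniform(-1, 1) for _ in range(H + 1)]
    data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 0)]

    for _ in range(30000):
        x, t = random.choice(data)
        h = [sig(w[0]*x[0] + w[1]*x[1] + w[2]) for w in w_h]
        y = sig(sum(w_o[j]*h[j] for j in range(H)) + w_o[H])
        d_o = (t - y) * y * (1 - y)           # error signal at the output
        for j in range(H):                    # backpropagate into hidden layer
            d_h = d_o * w_o[j] * h[j] * (1 - h[j])
            w_h[j][0] += 0.5 * d_h * x[0]
            w_h[j][1] += 0.5 * d_h * x[1]
            w_h[j][2] += 0.5 * d_h
        for j in range(H):
            w_o[j] += 0.5 * d_o * h[j]
        w_o[H] += 0.5 * d_o

    for x, t in data:
        h = [sig(w[0]*x[0] + w[1]*x[1] + w[2]) for w in w_h]
        y = sig(sum(w_o[j]*h[j] for j in range(H)) + w_o[H])
        print(x, t, round(y, 2))   # should approach 0, 1, 1, 0 (depends on init)

The hidden units' learned weights are the learned representation: no one
told the network which internal features to use.]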

------------------------------

Date: 23-FEB-1987 01:03
From: not(LAWS@SRI-STRIPE.ARPA)
Subject: Seminar: Self-Organizing Neural Pattern Recognition


Joint Seminar
DYNAMICAL SYSTEMS IN BIOLOGY and CENTER FOR ADAPTIVE SYSTEMS

"A MASSIVELY PARALLEL REAL-TIME ARCHITECTURE FOR A
SELF-ORGANIZING NEURAL PATTERN RECOGNITION MACHINE:
ACTIVE REGULATION BY ATTENTION AND EXPECTATION"

by
STEPHEN GROSSBERG
Boston University

Wednesday, February 25, 1987
1:00 -- 3:00 p.m.
Boston University
111 Cummington Street
Room 149


-- PLEASE POST --


------------------------------

Date: Sun, 15 Feb 87 16:12:44 pst
From: norman@nprdc.arpa (Donald A. Norman)
Subject: Cognitive Science Society submissions


Geoff Hinton noted that the submission requirements for the Cognitive
Science Society were ambiguous, at best. His concern prompted the
society to issue a clarification, which I have just picked up from the
cog-engineering UUCP usenet bulletin board. That isn't really the
correct spot for it to go, so I am forwarding it to the spots I know
about. Someone should forward a copy to Ken Laws to put in the AI
bulletin board (I have misplaced his mailing address, else I would do
it). I take no credit for this. Credit Geoff. However, I believe
the Cognitive Science Society now to be the best conference for a mix
of computational and experimental studies of cognition. It is
becoming the home for connectionist reports. AAAI and IJCAI are too
large, too focussed upon industry, and too much interested in those
other parts of AI, the parts that do not deal with natural (human and
animal) cognition. The Cognitive Science Society has been continually
improving in the depth and quality of its papers. Help this continue.


REVISION OF CALL FOR PAPERS
Cognitive Science Society Conference
July 16-18, 1987
University of Washington, Seattle

NEW DUE DATE: March 16, 1987 (Camera-ready submissions)

There are 2 changes to the original announcement: the submission deadline
has been postponed until March 16, 1987, and the requirement for camera-ready
copy has been clarified (it IS required on the original submission).
Please address any questions to Prof. Hunt.

Cognitive Science Society
Announcement of Meeting and Preliminary call for Papers

The Ninth Annual Conference of the Cognitive Science Society will be held
on July 16-18, 1987 at the University of Washington. The dates have been
chosen to allow people to attend this conference and the conference
of the American Association for Artificial Intelligence, which meets in
Seattle earlier in the week. The conference will feature symposia and
invited speakers on the topics of mental models, educational and
industrial applications of cognitive science, discourse comprehension,
the relation between cognitive and neural sciences, and the use of
connectionist models in the cognitive sciences. The conference schedule will
include paper sessions and a poster session, covering the full range
of the cognitive sciences. The proceedings of the conference will be
published by L. Erlbaum Associates.

Submitted papers are invited. These should cover original, unreported
work, research or analysis related to cognition. All submissions for
paper and poster sessions will be refereed.

All submitted papers and posters must include the following:

Author's name, address, and telephone number.
Set of four or fewer topic area keywords.
Four copies of the full paper (4000 words maximum) or poster
(2000 words maximum). Each copy should include a 100-250
word abstract.
Indication of preference for paper or poster session.

All papers MUST adhere to the following rules for preparation of
camera-ready copy. NOTE: Papers will NOT be sent back after
acceptance for modification. The accepted paper will be sent
directly to the publisher.

1 inch margins on both sides, top, and bottom.
Single spaced text. Figures centered on type page at
top or bottom.
Titles and author's names and institutions centered at
top of first page.
One line between title heading and text.
Use American Psychological Association publication format.
Authors are responsible for obtaining permission to reprint
published material.

Send submissions to Earl Hunt, Department of Psychology,
University of Washington, Seattle, WA 98195

Submissions are due by MARCH 16, 1987. NOTE NEW DATE

All members of the Cognitive Science society will receive a further
mailing discussing registration, accommodation, and travel.

------------------------------

Date: 19-FEB-1987 04:59
From: 8414902%UWACDC.BITNET@WISCVM.WISC.EDU
Subject: Conference Announcement


1987 International Symposium on CIRCUITS AND SYSTEMS
Philadelphia, PA
May 4-7, 1987

Special Session on Artificial Neural Systems and Applications

Les Atlas and Robert Marks, Session Co-Chairmen
(For more information contact Les Atlas at 206-545-1315)

Papers to be Presented:

"Foundations of Neurocomputer Design"
Robert Hecht-Nielsen
Hecht-Nielsen Neurocomputer Corporation

"The Original Adaptive Neural-Net Broom Balancer"
Bernard Widrow
Stanford University

"Opto-Electronic Analogs of Self-Organizing Neural Networks"
Nabil Farhat
University of Pennsylvania

"Flow of Activation Computing"
Claude Cruz
IBM Palo Alto Scientific Center

"An Evaluation of the Neural Network Models"
B. Kumar and M. Lemmon
Carnegie Mellon University

"Convergence in Neural Memories"
Soura Dasgupta, Anjan Ghosh, and Robert Cuykendall
University of Iowa

"A Signal Space Interpretation of Neural Net Processors"
James Ritcey, L. Atlas, A. Somani, and R. Marks
University of Washington
F. Holt and D. Nguyen
The Boeing Company

------------------------------

Date: 20-FEB-1987 00:57
From: DYER@STILTON.WISC.EDU
Subject: Workshop on Computer Architecture for Pattern Analysis and Machine Intelligence


CALL FOR PAPERS

_______________

1987 WORKSHOP ON COMPUTER ARCHITECTURE FOR
PATTERN ANALYSIS AND MACHINE INTELLIGENCE

_______________

Seattle, Washington
October 5 - 7, 1987


CAPAMI-87 will focus on new architectures and associated
algorithms designed for artificial intelligence
applications. This workshop is a successor of the Computer
Architecture for Pattern Analysis and Image Database
Management workshops which were held in '81, '83, and '85.
The emphasis of the program will be the presentation of
significant new contributions plus panel and discussion
sessions in which attendees can actively compare and
contrast their methods. Papers will be reviewed by the
Program Committee. No parallel sessions are planned.

_______________

TOPICS

* Computer Vision and Image Processing Architectures
* Architectures for Inference Engines and Rule-Based Systems
* Knowledge-Based Machines and Systems
* Neural Network based Architectures
* VLSI and Systolic Implementations
* Parallel Algorithms for AI Problems on these Architectures
* Parallel Matching and Reasoning Algorithms

_______________

PAPER SUBMISSION INSTRUCTIONS

Authors should submit four (4) copies of a complete paper by
APRIL 15, 1987 to:

Charles R. Dyer
Department of Computer Science
University of Wisconsin
1210 W. Dayton St.
Madison, WI 53706

Authors will be notified of the acceptance of their papers
by June 1, 1987. Final camera-ready papers are due by July
15, 1987.

_______________

WORKSHOP ORGANIZATION


Workshop Chair: Steven L. Tanimoto

Program Chair: Charles R. Dyer

Program Committee: Christopher M. Brown, Michael J. B. Duff,
Robert M. Haralick, Ramesh Jain, John R. Kender, H. T. Kung,
James J. Little, Azriel Rosenfeld, Jorge L. C. Sanz,
Leonard M. Uhr, Jon A. Webb

------------------------------

Date: 23-FEB-1987 16:25
From: BLAWELPD%UIAMVS.BITNET@WISCVM.WISC.EDU
Subject: UNIV. IOWA CONFERENCE ON LEARNING & MEMORY

LEARNING AND MEMORY: THE BIOLOGICAL SUBSTRATES: April 4-5, 1987
Sponsored by the College of Liberal Arts and the Department of Psychology,
The University of Iowa, Iowa City, Iowa.
To be held at the Holiday Inn, 210 South Dubuque Street, Iowa City, Iowa.

Speakers and presentation titles are:

ALKON, DANIEL L.: Head, Section of Neural Systems, Laboratory of Biophysics, Woods
Hole, MA.
"Membrane and Molecular Mechanisms of Classical Conditioning in Mollusc and
Mammal"

DAMASIO, ANTONIO R.: Head, Department of Neurology, The University of Iowa.
"Anatomical Correlates of Amnesia and the Understanding of Human Memory"

DAVIS, MICHAEL: Department of Psychiatry, Yale University School of Medicine.
"A Neural Analysis of Fear Conditioning"

GOLDMAN-RAKIC, PATRICIA: Department of Neuroanatomy, Yale School of Medicine.
"Regulation of Behavior by Representational Memory: Role of Prefrontal Cortex"

GORMEZANO, I.: Department of Psychology, The University of Iowa.
"Conditioning and the CS Trace"

GREENOUGH, WILLIAM T.: Department of Psychology, University of Illinois at
Urbana-Champaign.
"Experience-Dependent Synaptogenesis as a Plausible Memory Mechanism"

HARVEY, JOHN A.: Departments of Psychology and Pharmacology, The University of
Iowa.
"Pharmacology of Learning and Memory"

LYNCH, GARY S.: Department of Psychobiology, University of California, Irvine.
"Structure Function Relationships and the Study of Memory"

ROUTTENBERG, ARYEH: Department of Psychology, Northwestern University.
"Protein Kinase C: From Translocation to Information"

SCHNEIDERMAN, NEIL: Department of Psychology, University of Miami, Coral Gables.
"Neural Control of Classical Differential Conditioning of Bradycardia in
Rabbits"

THOMPSON, RICHARD F.: Department of Psychology, Stanford University.
"The Essential Memory Trace Circuit for a Basic Form of Associative Learning"

WOODY, CHARLES D.: Department of Anatomy, University of California, Los Angeles.
"Cellular Mechanisms of Eyeblink Conditioning in Mammals"

ZOLA-MORGAN, STUART: Department of Psychiatry, University of California, San
Diego.
"The Effects of Damage to the Diencephalon and to the Medial Temporal Region:
Findings From Monkeys and From Cases of Human Amnesia"

There is no registration fee, but preregistration is encouraged. Optional
luncheon and dinner tickets are available and must be purchased prior to the
symposium. For more information contact either:
I. Gormezano or John A. Harvey
Department of Psychology
The University of Iowa
Iowa City, IA 52242
OR
John P. Welsh
blawelpd%UIAMVS.BITNET@WISCVM.WISC.EDU


------------------------------

End of NEURON Digest
********************
