Neuron Digest   Wednesday,  2 Dec 1992                Volume 10 : Issue 22 

Today's Topics:
Re: Beginning references
Re: beginning references on NN for Carl Vettore, Univ. of Verona
Very Fast Simulated Reannealing version 6.20
Position in cognitive psychology - U Mich
Postdocs - computational neuroscience
Vision postdoc position
(1) HKP review in neuroprose and (2) NIPS time series workshop program
NIPS workshop - REAL biological computation


Send submissions, questions, address maintenance, and requests for old
issues to "neuron-request@cattell.psych.upenn.edu". The ftp archives are
available from cattell.psych.upenn.edu (130.91.68.31). Back issues
requested by mail will eventually be sent, but may take a while.

----------------------------------------------------------------------

Subject: Re: Beginning references
From: arbib@cs.usc.edu (Michael Arbib)
Date: Sat, 28 Nov 92 15:39:31 -0800

Michael A. Arbib: Brains, Machines and Mathematics, Second Edition,
Springer-Verlag, 1987. This places connectionism in a historical
setting, introduces some of the main concepts (relatively briefly),
places it in the context of theoretical computer science, and includes
chapters on self-reproducing automata and Gödel's Incompleteness Theorem
(both proofs and philosophical implications).

It's not the best introduction to connectionism as a standalone subject,
but is unrivalled for readers who want a "situated" view of the subject
along the above lines.

While advertising my books, I should also put in a word for "The
Metaphorical Brain 2: Neural Networks and Beyond", Wiley-Interscience,
1989. The special strength of this relative to other books on
connectionism is that it provides strong links to both Computational
Neuroscience AND Artificial Intelligence. It also argues for Schema
Theory: using schemas as a coarse-grain methodology for cooperative
computation to complement the fine-grain methodology of neural networks.
It closes with a view of Sixth Generation Computing based on this
multi-level schema/NN methodology.

Michael Arbib
Center for Neural Engineering
University of Southern California
Los Angeles, CA 90089-2520, USA
Tel: (213) 740-9220
Fax: (213) 746-2863
arbib@pollux.usc.edu


------------------------------

Subject: Re: beginning references on NN for Carl Vettore, Univ. of Verona
From: kanal@cs.UMD.EDU (Laveen N. Kanal)
Date: Sun, 29 Nov 92 10:55:43 -0500


The two volumes by Maureen Caudill and Charles Butler from MIT Press
should be a good place to start. The complete reference is cited in the
following list of references for a course/seminar that I am offering
next semester.



CONNECTIONIST MODELING OF INTELLIGENT SYSTEMS
CMSC 727, SPRING 1993
Instructor: Laveen Kanal
Time: Tues. & Thurs., 5:00 to 6:15 p.m.
(If feasible, may be changed to meet once a week for
a longer period at a time convenient to the attendees)

This is a graduate course/seminar on Neural Networks and other connectionist
dynamical systems and hybrid models currently being proposed for the
modeling of intelligent systems for pattern recognition and problem-solving.
The purpose is to introduce some of the theory and applications of such
systems, discuss their role and potential in developing intelligent
artificial systems, their relationships to other areas of A.I. and
the question of local versus nonlocal representations.
There will be lectures by the instructor, by invited
persons working in these areas, and possibly by some students.
Students registered for the course will
be expected to do some projects using the suggested lab texts or other
simulators. A mid semester report and a final project/paper will be required
for a grade in the course.

Text: Roberto Serra, Gianni Zanarini, COMPLEX SYSTEMS AND COGNITIVE PROCESSES,
Springer-Verlag, 1990. ISBN 0-387-51393-0

Lab text: Maureen Caudill, Charles Butler, Understanding Neural Networks:
Computer Explorations, Vols. 1 & 2, The MIT Press.
(These volumes cover most of the major NN paradigms in an easy-to-follow
manner, and a PC or Mac software diskette is included.)
Vol. 1: ISBN 0-262-53099-6 (IBM-compatible disk included)
        ISBN 0-262-53102-X (Macintosh-compatible disk included)

Vol. 2: ISBN 0-262-53100-3 (IBM-compatible disk included)
        ISBN 0-262-53103-8 (Macintosh-compatible disk included)

Optional lab text:
Adam Blum, Neural Networks in C++: An Object-Oriented Approach to
Building Connectionist Systems, John Wiley & Sons, Inc., 1992.
ISBN 0-471-55201-1 (book/IBM disk set)
ISBN 0-471-53847-7 (paperback book only)

Useful supplementary text:

J. Hertz, A. Krogh, R.G. Palmer, An Introduction to the Theory of Neural
Computation, Addison-Wesley, 1991.
(2nd edition to appear shortly.)

Supplementary References:

L.N. Kanal, On Pattern, Categories, and Multiple Reality, TR, UMD, 1993

L.N. Kanal & S. Raghavan, Hybrid Systems-A Key to Intelligent Pattern
Recognition, Proc. IJCNN, 1992.

J. Hendler, Papers on Hybrid Systems

J.A. Reggia, Papers on Competition-Based Local Processing for Spreading
Activation Mechanisms in Neural Networks; also Chap.7 in
Peng and Reggia, Abductive Inference Models for Diagnostic
Problem-Solving, Springer-Verlag, 1990

Todd C. Moody, PHILOSOPHY & ARTIFICIAL INTELLIGENCE, Prentice Hall, 1993
ISBN 0-13-663816-3

Satosi Watanabe, PATTERN RECOGNITION: HUMAN & MECHANICAL, John Wiley, 1985
ISBN 0-471-80815-6

Gerald M. Edelman, BRIGHT AIR, BRILLIANT FIRE: On the Matter of the Mind,
Basic Books, 1992. ISBN 0-465-05245-2

George Lakoff, WOMEN, FIRE, and DANGEROUS THINGS--What Categories Reveal
About the Mind, Univ. of Chicago Press, 1987
ISBN 0-226-46803-8

I.K. Sethi & A.K. Jain (Eds), ARTIFICIAL NEURAL NETWORKS AND STATISTICAL
PATTERN RECOGNITION, North-Holland, 1991
ISBN 0-444-88741-5 (paperback)

K. Yasue, M. Jibu, and Karl Pribram, A Theory of Non-Local Cortical
Processing in the Brain, Appendices to K. Pribram, BRAIN & PERCEPTION,
Lawrence Erlbaum Associates, 1991. ISBN 0-89859-995-4

P.A. Flach, R.A. Meersman (Eds), FUTURE DIRECTIONS IN A.I., North-Holland,
1991. ISBN 0-444-89048-3


Papers in IEEE Trans. on Neural Networks, the journal Neural Networks,
Neural Computation, and Proceedings of Conferences on Neural Networks,
Genetic Algorithms, Complex Systems, and Hybrid Systems.


Hope this helps.

L.K.


------------------------------

Subject: Very Fast Simulated Reannealing version 6.20
From: Lester Ingber <ingber@alumni.cco.caltech.edu>
Date: Mon, 30 Nov 92 07:17:21 -0800

VERY FAST SIMULATED REANNEALING (VFSR) (C)

Lester Ingber ingber@alumni.caltech.edu
and
Bruce Rosen rosen@ringer.cs.utsa.edu

The good news is that the people who have gotten our beta version of
VFSR to work on their applications are very pleased. The bad news is
that, because of some blunders made in the process of making the code
user-friendly, the code could not be used as a standalone function
call without modification. This bug is corrected, and some other fixes
and changes are included in version 6.20.

This version is now updated in netlib@research.att.com. It will
eventually find its way into the other NETLIB archives.

To access the new version:

Interactive
local% ftp research.att.com
Name (research.att.com:your_login_name): netlib
Password: [type in your_login_name or anything]
ftp> cd opt
ftp> binary
ftp> get vfsr.Z
ftp> quit
local% uncompress vfsr.Z
local% sh vfsr

Electronic Mail Request
local% mail netlib@research.att.com
[mail netlib@ornl.gov]
[mail netlib@ukc.ac.uk]
[mail netlib@nac.no]
[mail netlib@cs.uow.edu.au]
send vfsr from opt
^D [or however you send mail]

Lester


|| Prof. Lester Ingber ingber@alumni.caltech.edu ||
|| P.O. Box 857 ||
|| McLean, VA 22101 703-848-1859 = [10ATT]0-700-L-INGBER ||


------------------------------

Subject: Position in cognitive psychology - U Mich
From: zhang@psych.lsa.umich.edu
Date: Fri, 27 Nov 92 10:12:04 -0500

Position in Cognitive Psychology
University of Michigan

The University of Michigan Department of Psychology invites applications
for a tenure-track position in the area of Cognition, beginning September
1, 1993. The appointment will most likely be made at the Assistant
Professor level, but it is possible at any rank. We seek candidates with
primary interests and technical skills in cognitive psychology. Our
primary goal is to hire an outstanding cognitive psychologist, and thus
we will look at candidates with any specific research interest. We have
a preference for candidates interested in higher mental processes or for
candidates with computational modeling skills (including connectionism).
Responsibilities include graduate and undergraduate teaching, as well as
research and research supervision. Send curriculum vitae, letters of
reference, copies of recent publications, and a statement of research and
teaching interests no later than January 8, 1993 to: Gary Olson, Chair,
Cognitive Processes Search Committee, Department of Psychology,
University of Michigan, 330 Packard Road, Ann Arbor, Michigan 48104. The
University of Michigan is an Equal Opportunity/Affirmative Action
employer.



------------------------------

Subject: Postdocs - computational neuroscience
From: Ken Miller <ken@cns.caltech.edu>
Date: Sun, 29 Nov 92 06:29:29 -0800


POSTDOCTORAL POSITIONS
COMPUTATIONAL NEUROSCIENCE
UNIVERSITY OF CALIFORNIA, SAN FRANCISCO

I will soon be beginning a new lab at UCSF, and anticipate several positions
for postdocs beginning in 1993 and 1994 (prospective graduate students are
also encouraged to apply to the UCSF Neuroscience Program). The lab will
focus on understanding both development and mature processing in the
cerebral cortex. Theoretical, computational, and experimental approaches
will be taken. Candidates should have skills relevant to one or more of
those approaches. The most important criteria are demonstrated scientific
ability and creativity, and a deep interest in grappling with the details of
neurobiology and the brain.

Past work has focused on modeling of development in visual cortex under
Hebbian and similar ``correlation-based" rules of synaptic plasticity. The
goal has been to understand these rules in a general way that allows
experimental predictions to be made. Models have been formulated for the
development of ocular dominance and orientation columns. A few references
are listed below.
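
To give the flavor of such correlation-based rules, here is a toy
sketch (ours, not the models in the references below): Hebbian growth
driven by an input correlation matrix, with subtractive normalization
and saturation bounds, segregates two anticorrelated inputs the way
left-eye/right-eye afferents segregate into ocular dominance:

import numpy as np

def hebbian_development(C, steps=1000, lr=0.01, seed=0):
    rng = np.random.default_rng(seed)
    w = rng.uniform(0.4, 0.6, size=C.shape[0])
    for _ in range(steps):
        w += lr * (C @ w)          # Hebbian growth driven by correlations
        w -= w.mean() - 0.5        # subtractive normalization (fixed total)
        w = np.clip(w, 0.0, 1.0)   # saturation bounds
    return w

# Anticorrelated "left/right eye" inputs:
C = np.array([[1.0, -0.5],
              [-0.5, 1.0]])
print(hebbian_development(C))      # one weight wins, the other goes to 0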

Future work of the lab will extend the developmental modeling, and will also
take various approaches to understanding mature cortical function. These
will include detailed biophysical modeling of visual cortical networks,
many-cell recording from visual cortex, and use of a number of theoretical
methods to guide and interpret this recording. There will also be
opportunities for theoretical forays in new directions, in particular in
collaborations with the other Neuroscientists at UCSF. Facilities to
develop new experimental directions that are relevant to the lab's program,
for example slice studies and use of optical methods, will also exist.

I will be part of the Keck Center for Systems Neuroscience at UCSF, which
will be a very interactive environment for Systems Neurobiology. Other
members will include:
* Alan Basbaum (pain systems);
* Allison Doupe (song learning in songbirds);
* Steve Lisberger (oculomotor system);
* Michael Merzenich (adult cortical plasticity);
* Christof Schreiner (auditory system);
* Michael Stryker (visual system, development and plasticity);
Closely related faculty members include Roger Nicoll (hippocampus, LTP);
Rob Malenka (hippocampus, LTP); Howard Fields (pain systems); and Henry
Ralston (spinal cord and thalamus).

Please send a letter describing your interests and a C.V., and arrange to
have three letters of recommendation sent to

Ken Miller
Division of Biology 216-76
Caltech
Pasadena, CA 91125
ken@cns.caltech.edu

Some References:

Miller, K.D. (1992). ``Models of Activity-Dependent Neural Development."
Seminars in the Neurosciences, 4:61-73.

Miller, K.D. (1992). ``Development of Orientation Columns Via Competition
Between ON- and OFF-Center Inputs." NeuroReport 3:73-76.

MacKay, D.J.C. and K.D. Miller (1990). ``Analysis of Linsker's simulations
of Hebbian rules," Neural Computation 2:169-182.

Miller, K.D. (1990). ``Correlation-based mechanisms of neural development,"
in Neuroscience and Connectionist Theory, M.A. Gluck and D.E. Rumelhart,
Eds. (Lawrence Erlbaum Associates, Hillsdale NJ), pp. 267-353.

Miller, K.D., J.B. Keller and M.P. Stryker (1989). ``Ocular dominance
column development: analysis and simulation," Science 245:605-615.

Miller, K.D., B. Chapman and M.P. Stryker (1989). ``Responses of cells in
cat visual cortex depend on NMDA receptors," Proc. Nat. Acad. Sci. USA
86:5183-5187.


------------------------------

Subject: Vision postdoc position
From: axon@cortex.rutgers.edu (Ralph Siegel)
Date: Tue, 01 Dec 92 18:45:54 -0500


PLEASE POST

Postdoctoral position available in the analysis of structure-from-motion
in primates (visual psychophysics, electrophysiology, primate studies).

Contact: Ralph Siegel
Center for Molecular and Behavioral Neuroscience
Rutgers, The State University
197 University Avenue
Newark, NJ 07102
phone: 201-648-1080 x3261
fax: 201-648-1272

email: axon@cortex.rutgers.edu


Term: 24 months, beginning 2/1/93 or later
Salary: NIH levels

Please send statement of research interests, curriculum vitae, and
names of three references.


------------------------------

Subject: (1) HKP review in neuroprose and (2) NIPS time series workshop program
From: Andreas Weigend <weigend@dendrite.cs.colorado.edu>
Date: Sun, 29 Nov 92 04:30:55 -0700

Two things: (1) a paper in neuroprose and (2) the program for a NIPS workshop.


(1) Book review of Hertz-Krogh-Palmer in neuroprose:

My 17-page book review (for Artificial Intelligence) is available via ftp,
ftp archive.cis.ohio-state.edu (anonymous, neuron)
cd pub/neuroprose
binary
get weigend.hkp-review.ps.Z
(then uncompress and lpr)


(2) The updated program for the time series NIPS workshop at Vail this Friday:

"Time Series Analysis and Predic____"

John Moody Mike Mozer Andreas Weigend
moody@cse.ogi.edu mozer@cs.colorado.edu weigend@cs.colorado.edu

--------------------------------------------------------------------------
| Several new techniques are now being applied to the problem of |
| predicting the future behavior of a temporal sequence and deducing |
| properties of the system that produced the time series. Both |
| connectionist and non-connectionist techniques will be discussed. |
| Issues include: |
| - algorithms and architectures, |
| - model selection, |
| - performance measures, |
| - iterated single-step vs direct multi-step prediction, |
| - short term vs long term prediction, |
| - growth of error with prediction time, |
| - presence or absence of deterministic chaos, |
| - number of degrees of freedom of the system, |
| - amount of noise in the data, |
| - robust prediction and estimation, |
| - detection and classification of signals in noise, etc. |
--------------------------------------------------------------------------

Intended audience: connectionists active in time series analysis.

Half the available time has been reserved for discussion and
informal presentations. Lively audience participation is encouraged.


7:30-9:30 General Overviews (20 minutes each) and Discussion.

John MOODY: Time Series Modeling: Classical Methods and
Nonlinear Generalizations
Mike MOZER: Neural nets for temporal sequence processing

Andreas WEIGEND: Ideas from the SFI competition for prediction
and analysis

4:30-6:30 Special Topics. Talks (10-15 minutes each),
Discussion and Ad-Hoc Presentations.


The rest of this message contains the abstracts of the talks.

John MOODY <moody@cse.ogi.edu>:
I present an overview of classical linear time series modeling methods,
including AR, MA, ARMA, ARIMA, and state-space representations, and discuss
their strengths and weaknesses. I then describe how nonlinear
generalizations of these models can be constructed.
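
As a minimal sketch of the classical linear case (nonlinear
generalizations replace the least-squares regression below with, for
example, a network):

import numpy as np

def fit_ar(x, p):
    # Least-squares fit of a linear AR(p) model: x[t] ~ sum_j a_j * x[t-j]
    X = np.column_stack([x[p - j:len(x) - j] for j in range(1, p + 1)])
    a, *_ = np.linalg.lstsq(X, x[p:], rcond=None)
    return a

# Recover the coefficients of a synthetic AR(2) process:
rng = np.random.default_rng(0)
x = np.zeros(500)
for t in range(2, 500):
    x[t] = 0.6 * x[t - 1] - 0.3 * x[t - 2] + rng.normal(scale=0.1)
a = fit_ar(x, 2)             # approximately [0.6, -0.3]
x_next = a @ x[-1:-3:-1]     # one-step-ahead forecast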

Mike MOZER <mozer@cs.colorado.edu>:
I present a general taxonomy of neural net architectures for processing
time-varying patterns. This taxonomy subsumes existing architectures in the
literature, and points to several promising architectures that have yet to
be examined. I also discuss some experiments on predicting future values
of a financial time series (US dollar--Swiss franc exchange rates) from the
Santa Fe competition, and make suggestions for future work on this series.
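
One familiar member of such a taxonomy, sketched minimally here (the
names are ours), is the simple recurrent (Elman-style) unit, in which
a context layer feeds the previous hidden state back in as input:

import numpy as np

class ElmanCell:
    def __init__(self, n_in, n_hid, seed=0):
        rng = np.random.default_rng(seed)
        self.Wx = rng.normal(0.0, 0.5, (n_in, n_hid))   # input weights
        self.Wh = rng.normal(0.0, 0.5, (n_hid, n_hid))  # context weights
        self.h = np.zeros(n_hid)
    def step(self, x):
        # New hidden state depends on the input and the previous state
        self.h = np.tanh(x @ self.Wx + self.h @ self.Wh)
        return self.h

cell = ElmanCell(1, 8)
states = [cell.step(np.array([v]))
          for v in np.sin(np.linspace(0.0, 6.0, 50))]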

Andreas WEIGEND <weigend@cs.colorado.edu>:
For prediction, I first present `low-pass embedding', a generalization
of the usual delay line that corresponds to filtered delay coordinates. I
then focus on the estimation of prediction errors, including a generalization
that predicts the evolution of the entire probability density function.
For analysis, I first present `deterministic vs stochastic plots' (DVS)
and then the information theoretic measure called `redundancy' that allows
characterization of the underlying dynamic system without prediction.
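
One plausible reading of filtered delay coordinates, sketched for
concreteness (the exact construction is Weigend's): each successive
embedding coordinate is a one-pole low-passed, delayed copy of the
previous one, so deep coordinates summarize the remote past smoothly
rather than sampling it at isolated lags:

import numpy as np

def lowpass_embed(x, dim, alpha=0.5):
    x = np.asarray(x, dtype=float)
    s = np.empty((dim, len(x)))
    s[0] = x
    for d in range(1, dim):
        s[d, 0] = s[d - 1, 0]
        for t in range(1, len(x)):
            # one-pole low-pass of the previous coordinate, one step back
            s[d, t] = (1 - alpha) * s[d, t - 1] + alpha * s[d - 1, t - 1]
    return s.T   # row t is the embedding vector at time t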

Volker TRESP <tresp@inf21.zfe.siemens.de>:
Like many physiological processes, the blood glucose / insulin metabolism
is highly nonlinear and involves multiple time scales and multi-dimensional
interactions. We present a model of the blood glucose / insulin metabolism of
a diabetic patient. The model is a hybrid "compartment" / neural network
model and was trained with data from a diabetic patient using the dynamic
backpropagation algorithm. We demonstrate how our model can be used both for
prediction of blood glucose levels and control of the patient's therapy.
(Joint work with J. Moody and W.R. Delong)
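
The general structure of such hybrid models can be sketched
schematically (this is not Tresp's actual model; names and constants
are illustrative): a first-principles compartment equation carries the
known dynamics, and a trained network supplies the unmodeled residual
flux.

import numpy as np

def hybrid_step(x, u, dt, nn_residual, k=0.1):
    dx = -k * x + u            # known linear compartment dynamics
    dx = dx + nn_residual(x)   # learned correction for unmodeled effects
    return x + dt * dx

# nn_residual can be any fitted regressor mapping state -> flux, e.g.:
# x_next = hybrid_step(np.array([5.0]), 0.2, 0.1, lambda x: 0.01 * np.tanh(x))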

William FINNOFF:
In financial and economic applications, data sets are typically small
and noisy. The standard "black box" approach to network learning tends to
overfit the training data and thus generalizes poorly. In this talk, we
will discuss the microeconomic foundations of neural network model
structures used to perform economic forecasting. Further, we will describe
a variety of extended regularization techniques used to prevent overfitting.
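
For readers unfamiliar with the idea, the simplest member of this
family is plain weight decay, which in the linear case reduces to
ridge regression (a textbook baseline, not Finnoff's extended
techniques):

import numpy as np

def ridge_fit(X, y, lam=0.1):
    # Penalizing lam * sum(w**2) shrinks the weights, trading a little
    # bias for much lower variance on small, noisy data sets.
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)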

Eric WAN <wan@isl.stanford.edu>:
A neural network for time series prediction which uses Finite Impulse
Response (FIR) linear filters to provide dynamic interconnectivity is
presented. The FIR network and associated training algorithm are reviewed.
Examples from the Santa Fe competition in prediction and dynamic modeling
of laboratory data and simulated chaotic time series are used to
demonstrate the potential of the approach.
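
A minimal sketch of one FIR-network layer (our notation): every
connection is a small FIR filter, so the layer amounts to a temporal
convolution followed by the usual sigmoidal nonlinearity.

import numpy as np

def fir_layer(x, W):
    # x: (T, n_in) input sequence; W: (taps, n_in, n_out) filter weights
    taps, n_in, n_out = W.shape
    T = x.shape[0]
    y = np.zeros((T, n_out))
    for k in range(taps):
        y[k:] += x[:T - k] @ W[k]   # contribution of inputs k steps back
    return np.tanh(y)

# e.g. fir_layer(np.ones((100, 3)), np.zeros((5, 3, 4))) -> (100, 4) output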

Fernando PINEDA <fernando@aplcomm.jhuapl.edu>:
A fast and elegant numerical algorithm for estimating generalized
dimensions and coarse-grained mutual information will be presented.
The relationship to other more well known algorithms will be discussed.
Examples from the Santa Fe time series analysis competition will be used
to demonstrate how to use the algorithm for choosing delay times for
delay coordinate embeddings.
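
A common simple relative of this computation, sketched for
concreteness (Pineda's algorithm is faster and more refined): estimate
the coarse-grained mutual information between x(t) and x(t + lag) from
a joint histogram, and take the embedding delay at its first local
minimum.

import numpy as np

def mutual_information(x, lag, bins=16):
    a, b = x[:-lag], x[lag:]
    pxy, _, _ = np.histogram2d(a, b, bins=bins)
    pxy /= pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

# Choose the delay at the first local minimum of the curve:
# curve = [mutual_information(x, L) for L in range(1, 50)]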

Fu-Sheng TSUNG <tsung@cs.ucsd.edu>:
When modeling a system that generates a time series, what is known about
the system constrains the architecture of the model. As an example, I will
present a recurrent network model of a lobster neural circuit, discuss what
we learned from the model, where the model failed, and possible improvements
from using a pair of sigmoids as a "near-oscillator" primitive for a neuron.
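
A minimal sketch of such a two-sigmoid "near-oscillator" (parameters
illustrative, not Tsung's circuit): a fast self-exciting unit coupled
to a slow inhibitory one produces sustained oscillations.

import numpy as np

def near_oscillator(T=200.0, dt=0.01):
    n = int(T / dt)
    xs, ys = np.zeros(n), np.zeros(n)
    x, y = 0.1, 0.0
    for i in range(n):
        dx = -x + np.tanh(2.0 * x - 2.0 * y)   # fast, self-exciting unit
        dy = (-y + np.tanh(2.0 * x)) / 5.0     # slow inhibitory feedback
        x += dt * dx
        y += dt * dy
        xs[i], ys[i] = x, y
    return xs, ys   # xs shows sustained relaxation-like oscillations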


------------------------------

Subject: NIPS workshop - REAL biological computation
From: Jim Schwaber <schwaber@eplrx7.es.duPont.com>
Date: Wed, 25 Nov 92 10:11:01 -0500


-----------NIPS 92 WORKSHOP----------------------

Real Applications of Real Biological Circuits

or

"If back-prop is not enough how will we get more?"

or

"Is anybody really getting anywhere with biology?"

---------------------------------------------------

When: Friday, Dec. 4th
====

Intended Audience: Those interested in detailed biological modeling.
================== Those interested in nonlinear control.
Those interested in neuronal signal processing.
Those interested in connecting the above.


Organizers:
===========
Richard Granger Jim Schwaber
granger@ics.uci.edu schwaber@eplrx7.es.dupont.com

Agenda:
=======

Morning Session, 7:30 - 9:30, Brain Control Systems and Chemical
---------------               Process Control

Jim Schwaber Brainstem reflexes as adaptive controllers
Dupont

Babatunde Ogunnaike Reverse engineering brain control systems
DuPont

Frank Doyle Neurons as nonlinear systems for control
Purdue

John Hopfield Discussant
Caltech

Afternoon Session, 4:30 - 6:30, Real biological modeling, nonlinear
-----------------               systems and signal processing

Richard Granger Signal processing in real neural systems: is
UC Irvine it applicable?

Gary Green The single neuron as a nonlinear system - its
Newcastle Volterra kernels as described by neural networks.


Program:
========

We anticipate that the topic will generate several points of view.
Thus, presenters will restrict themselves to a very, very few slides
intended to make a point for discussion. Given that there now are
concrete examples of taking biological principles to application, we
expect the discussion will center more on how, and at what level,
rather than whether "reverse engineering the brain" is useful.

Granger (UC Irvine):
-------
The architectures, performance rules and learning rules of most artificial
neural networks are at odds with the anatomy and physiology of real
biological neural circuitry. For example, mammalian telencephelon
(forebrain) is characterized by extremely sparse connectivity (~1-5%),
almost entirely lacks dense recurrent connections, and has extensive lateral
local circuit connections; inhibition is delayed-onset and relatively
long-lasting (100s of milliseconds) compared to rapid-onset brief excitation
(10s of milliseconds), and they are not interchangeable. Excitatory
connections learn, but there is very little evidence for plasticity in
inhibitory connections. Real synaptic plasticity rules are sensitive to
temporal information, are not Hebbian, and do not contain "supervision"
signals in any form related to those common in ANNs.

These discrepancies between natural and artificial NNs raise the question of
whether such biological details are largely extraneous to the behavioral and
computational utility of neural circuitry, or whether such properties may
yield novel rules that confer useful computational abilities to networks
that use them. In this workshop we will explicitly analyze the power and
utility of a range of novel algorithms derived from detailed biology, and
illustrate specific industrial applications of these algorithms in the fields
of process control and signal processing.
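
As a toy illustration of two properties from this list (all numbers
illustrative; this is not Granger's model): a rate network with ~3%
sparse excitatory connectivity and slow, long-lasting global
inhibition set against fast excitation.

import numpy as np

def sparse_layer_demo(n=200, p_conn=0.03, steps=600, dt=1.0,
                      tau_exc=10.0, tau_inh=150.0, seed=0):
    rng = np.random.default_rng(seed)
    # Sparse excitatory connectivity (~1-5% as in telencephalon)
    W = (rng.random((n, n)) < p_conn) * rng.random((n, n))
    W /= W.sum(axis=1, keepdims=True) + 1e-12   # normalize total input
    x = rng.random(n)                           # static external drive
    e, inh, trace = np.zeros(n), 0.0, []
    for _ in range(steps):
        # Rapid-onset, brief excitation vs delayed, long-lasting inhibition
        e += dt / tau_exc * (-e + np.maximum(W @ e + x - inh, 0.0))
        inh += dt / tau_inh * (-inh + e.mean())
        trace.append(e.mean())
    return np.array(trace)   # activity rises, then slow inhibition damps it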

Ogunnaike (DuPont):
-----------
REVERSE ENGINEERING BRAIN CONTROL SYSTEMS:
EXPLORING THE POTENTIAL FOR APPLICATIONS IN CHEMICAL PROCESS CONTROL.
=====================================================================

The main motivation for our efforts lies in the simple fact that there
are remarkable analogies between the human body and the chemical
process plant. Furthermore, it is known that while the brain has been
quite successful in performing its task as the central supervisor of
intricate control systems operating under conditions which leave very
little margin for error, the control computer in the chemical process
plant has not been so successful.


We have been concerned with seeking answers to the following question:

``Is it possible to `reverse engineer' a biological control system
and use the understanding to develop novel approaches to chemical
process control systems design and analysis?''

Our discussion will provide an overview of the tentative answers we
have to date. We will first provide a brief summary of the salient
features and main problems of chemical process control; we will then
introduce the biological control system under study (the baroreceptor
vagal reflex); finally we will present an actual industrial process
whose main features indicate that it may benefit from the knowledge
garnered from the neurobiological studies.

Doyle (Purdue):
------
We are focusing our research on two levels:
1) Neuron level: investigating novel building blocks for process
modeling applications which are motivated by realistic biological
neurons.
2) Network Level: looking for novel approaches to nonlinear dynamic
scheduling algorithms for process control and modeling (again,
motivated by biological signal processing in the baroreceptor reflex).

Green (Newcastle):
-------
I would love to tell the NIPS people about Volterra series,
especially as we have now made a connection between neural networks,
Volterra series, and the differential-geometric representation of
networks. This allows us to say why one, two, or more layers are
necessary for a particular analytic problem. We can also say how to
invert nets whose mappings are homeomorphic. More importantly for us
biologists, we can turn the state equations of membrane currents into
approximate Volterra kernels using neural networks, which I think (!)
helps in understanding the dynamics. This gives a solution to the
differential equations, albeit an approximate one in practical terms.
The equations are invertible and therefore allow a formal link between
current clamp and voltage clamp at the equation level. The method we
have used to do this is of interest to chemical engineering people
because we can use the same concepts in nonlinear control. It appears
at first glance that we can link the everyday use of neural networks
to well-established theory through a study of the tangent spaces of
networks. We construct a state-space model of a plant and calculate
the differential of the rate of change of the output with respect to
the input; we then calculate the same quantity for a neural network
and compare coefficients. The solution of the resulting set of
simultaneous equations for the coefficients produces a network which
is formally equivalent to the solution of the original differential
equation that defined the state equations.


We will be making the claim that analytic solutions of nonlinear
differential equations are possible using neural networks for some
problems. For all other problems an approximate solution is possible,
but the architecture that must be used can be defined. Last, I'll show
how this is related to the old techniques using Volterra series and
why the kernels and inverse transforms can be extracted directly from
networks. I think it is a new method of solving what is a very old
problem. All in 20 minutes!
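
The flavor of extracting kernels from a network can be sketched in a
few lines (an illustration of the idea, not Green's method):
Taylor-expanding a one-hidden-layer tanh network about zero input
yields its low-order Volterra kernels directly from the weights.

import numpy as np

def volterra_kernels(W, b, c):
    # Kernels of y(x) = c @ tanh(W x + b), expanded about x = 0
    t = np.tanh(b)
    d1 = 1.0 - t ** 2                 # tanh'
    d2 = -2.0 * t * d1                # tanh''
    h0 = c @ t                        # constant term
    h1 = (c * d1) @ W                 # linear kernel
    h2 = 0.5 * np.einsum('j,ji,jk->ik', c * d2, W, W)  # quadratic kernel
    return h0, h1, h2

# e.g. for a net with 6 hidden units and 3 inputs:
# rng = np.random.default_rng(0)
# h0, h1, h2 = volterra_kernels(rng.normal(size=(6, 3)),
#                               rng.normal(size=6), rng.normal(size=6))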


------------------------------

End of Neuron Digest [Volume 10 Issue 22]
*****************************************
