November 15, 1990.


Natural and artificial neurons. Natural and artificial neural systems. Artificial neural networks. Model of the neuron. Analog implementation of the model. Object identification. Associative memory. Learning. Circuit realization of the weighting coefficient. Circuit realization of the integrated analog neuron-type circuit. Specific applications imitating human organs. Optoelectronic ANN. Neurocomputers. Dedicated digital ANN VLSI circuits. Digital neuron-type circuit. Gate array implementation of ANN.
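The "model of the neuron" listed above is usually the weighted-sum-and-squash unit. A minimal sketch, assuming a sigmoid activation; this illustrates the mathematical model only, not the analog circuit realization of the weighting coefficients discussed in the talk:

```python
import math

def neuron(inputs, weights, bias):
    """Artificial neuron model: weighted sum of inputs, squashed by a sigmoid.

    Illustrative sketch only (names and parameters are this example's, not the
    talk's); returns an output in (0, 1).
    """
    activation = sum(w * x for w, x in zip(weights, inputs)) + bias
    return 1.0 / (1.0 + math.exp(-activation))

# With inputs and weights cancelling out, the activation is 0 and the
# sigmoid returns its midpoint 0.5.
y = neuron([0.5, -1.0], [2.0, 1.0], 0.0)
```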

The talk will deal with four problems related to Hopfield neural networks:

Adaptive systems. Features: Goal seeking and learning. Definition of learning. Learning systems. Self-conscious systems.
Types of learning: Supervised, reinforcement and self-organization. Learning in artificial intelligence systems.
Neural learning. Biological background. Learning algorithms. Implementations.
Neural algorithm based schemas. Animal learning theory.
Neurocomputing learning systems. Learning controllers. Learning for pattern classification. Learning to control dynamic systems.

The lecture presents an information processing system applicable to automatic empirical modeling of natural phenomena. It consists of an array of sensors, a self-organizing memory, an estimator and an array of actuators. Its operation corresponds to an optimal representation of the probability distribution of measured data by a set of adaptive prototypes. The adaptation rule of the prototypes is derived from the maximum entropy principle and describes an optimal self-organization of formal neurons. When only incomplete information is obtained by partial observation of the phenomenon, the prototypes are applicable to the retrieval of the missing information by estimation of the conditional average. The operation of a corresponding system is demonstrated by the recognition of acoustic emission sources on the basis of detected signals and by prediction of a chaotic time series.
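The two stages described above can be sketched in miniature. This is a hypothetical illustration: a simple winner-take-all update stands in for the lecture's maximum-entropy adaptation rule, and a Gaussian-kernel weighting stands in for its conditional-average estimator; all function names and parameters are this example's assumptions.

```python
import math

def adapt_prototypes(data, k, lr=0.1, epochs=20):
    """Self-organize k prototype vectors over the data.

    Stand-in rule: each sample pulls its nearest prototype a small step
    toward itself (competitive learning), so the prototypes come to
    represent the distribution of the measured data.
    """
    protos = [list(data[i]) for i in range(k)]  # init from first k samples
    for _ in range(epochs):
        for x in data:
            # nearest prototype wins and is moved toward the sample
            w = min(protos, key=lambda p: sum((pi - xi) ** 2
                                              for pi, xi in zip(p, x)))
            for i, xi in enumerate(x):
                w[i] += lr * (xi - w[i])
    return protos

def conditional_average(protos, observed, missing_index, width=0.5):
    """Retrieve a missing component given partial observation.

    Estimates the missing component as a kernel-weighted average of the
    prototypes' values, conditioned on the observed components.
    """
    num = den = 0.0
    for p in protos:
        d2 = sum((p[i] - v) ** 2 for i, v in observed.items())
        wgt = math.exp(-d2 / (2 * width ** 2))
        num += wgt * p[missing_index]
        den += wgt
    return num / den

# Two clusters of 2-D measurements, around (0, 0) and (1, 1)
data = [(0.0, 0.0), (1.0, 1.0), (0.1, -0.1), (0.9, 1.1),
        (-0.1, 0.1), (1.1, 0.9)]
protos = adapt_prototypes(data, k=2)
# Observe only the first coordinate (value 1.0); estimate the second
estimate = conditional_average(protos, observed={0: 1.0}, missing_index=1)
```

The estimate lands near 1.0, since the prototype representing the (1, 1) cluster dominates the conditional average.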

Recent interest in artificial neural networks (ANNs) centres around their capabilities of self-learning and generalisation, which are achieved in an implicit (distributed, connectionist) fashion rather than explicitly as in a conventional algorithm. While ANNs cannot do anything that could not be done by a "conventional" algorithm, they can perhaps do it faster or with less effort in acquiring the explicit knowledge required for a solution. This lecture will consider the sort of functions that an ANN can perform, according to details of information representation (analogue/digital), learning algorithm (supervised/unsupervised) and net topology (temporal/non-temporal). The importance of these functions to pattern recognition will be emphasised. Examples will be given of practical applications in speech and image recognition, and in machine translation.
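The supervised case of the learning-algorithm distinction above can be illustrated with the simplest such rule, the perceptron update, where a teacher signal (the labelled target) drives the weight changes. A minimal sketch, not taken from the lecture:

```python
def train_perceptron(samples, epochs=10, lr=1.0):
    """Supervised learning sketch: adjust weights from labelled examples.

    `samples` are (inputs, target) pairs with targets 0 or 1; the error
    between target and output drives each weight update.
    """
    n = len(samples[0][0])
    w = [0.0] * n
    b = 0.0
    for _ in range(epochs):
        for x, t in samples:
            y = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
            err = t - y  # teacher signal: difference from the label
            for i in range(n):
                w[i] += lr * err * x[i]
            b += lr * err
    return w, b

# Learning logical AND from four labelled examples
and_data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_perceptron(and_data)
```

An unsupervised rule, by contrast, would adapt the weights from the input statistics alone, with no target labels.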

Prof. Đuro Koruga - the inspirer of the meeting

Prof. Damper and Prof. Litovski in conversation