A dynamic systems approach to pattern recognition
Description
In this thesis we attempt to unify methods for supervised and unsupervised learning in new hybrid schemes, assuming feed forward neural networks as the underlying computing structure. Feed forward neural networks generally have a multilayer architecture, i.e., layers of units stacked on top of each other. Here we treat learning as a sequence of transformations from the input space to the required output space; in the context of neural networks, the process is viewed as recoding followed by curve fitting. In the models we propose, specific attention is given to the development of internal representations in the hidden layers, using unsupervised learning algorithms to extract significant features inherent in the training pattern set. The final output layer then handles the labeling or curve fitting by means of a supervised learning algorithm. Thus, by synthesizing meaningful representations in the hidden layers, we address the root causes of well-known problems in feed forward networks, such as sensitivity to architecture and initial conditions, overtraining, and related issues. The new models, the Hybrid Charge Clustering Network (HCCN) and the Hybrid Kohonen Network (HKN), are referred to as heterogeneous neural network models because they are heterogeneous in both their architecture and their adaptive algorithms.
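To make the two-stage idea concrete, the following is a minimal illustrative sketch, not the thesis's HCCN or HKN algorithms: a Kohonen-style competitive layer first learns prototype vectors that recode the input (the unsupervised stage), and a delta-rule sigmoid unit then fits labels on the recoded activations (the supervised stage). All names, layer sizes, and learning rates are assumptions chosen for the toy example.

```python
import numpy as np

# Illustrative sketch of an unsupervised-then-supervised hybrid.
# Assumed toy data and parameters; not the thesis's exact models.

rng = np.random.default_rng(0)

# Toy two-class data: two Gaussian blobs in 2-D.
X = np.vstack([rng.normal(-1, 0.3, (50, 2)), rng.normal(+1, 0.3, (50, 2))])
y = np.hstack([np.zeros(50), np.ones(50)])

# --- Stage 1: unsupervised hidden layer (Kohonen-style competitive learning) ---
n_hidden = 6
W = rng.normal(0, 0.5, (n_hidden, 2))           # prototype vectors
for epoch in range(30):
    lr = 0.5 * (1 - epoch / 30)                 # decaying learning rate
    for x in rng.permutation(X):
        winner = np.argmin(np.linalg.norm(W - x, axis=1))
        W[winner] += lr * (x - W[winner])       # move winning prototype toward input

def hidden(X):
    # Gaussian activation of each prototype: the "recoding" step.
    d = np.linalg.norm(X[:, None, :] - W[None, :, :], axis=2)
    return np.exp(-(d ** 2) / 0.5)

# --- Stage 2: supervised output layer (delta rule on recoded inputs) ---
H = hidden(X)
V = rng.normal(0, 0.1, n_hidden)                # output weights
b = 0.0
for epoch in range(200):
    out = 1 / (1 + np.exp(-(H @ V + b)))        # sigmoid output
    err = y - out
    V += 0.1 * H.T @ err / len(X)               # gradient-style weight update
    b += 0.1 * err.mean()

pred = (1 / (1 + np.exp(-(hidden(X) @ V + b))) > 0.5).astype(float)
print("training accuracy:", (pred == y).mean())
```

The sketch only illustrates the division of labor described above: the hidden layer is shaped by the data alone, and only the output layer sees the labels.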