Parallel distributed processing in the complex domain
Description
This work extends neural network models from the real to the complex domain and discusses the issues involved. Special emphasis is placed on the investigation of activation functions in different contexts; the usual real-domain models become special cases. In particular, the Perceptron procedure and Backpropagation are considered in detail. It is found that Complex Perceptron models are intrinsically more powerful than Rosenblatt's Perceptron. In particular, the Continuous Complex Perceptron can learn, in a finite number of steps, all functions from $\mathbb{R}^{n}$ to the interval $(0,1)$ that it can represent. Complex Domain Backpropagation is derived so that it admits suitable activation functions, which are investigated. One such function is found, and the circuit implementation of the corresponding neuron is given. Hardware circuits consisting of such neurons can be used to process sinusoidal signals, all at the same frequency. A Two-phase Backpropagation algorithm is introduced. In the first phase it constrains the search space of Backpropagation by incorporating prior information, in the form of linear discriminants or principal components, into the learning process. In the second phase the constraints are removed and the usual Backpropagation takes over to further decrease the error.
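For concreteness, the sketch below trains a single complex neuron by gradient descent with a magnitude-squashing, phase-preserving activation $f(z) = z/(c + |z|/r)$, a candidate of the kind discussed in the complex-backpropagation literature. An activation of this shape is needed because a bounded entire function is constant (Liouville's theorem), so a suitable complex activation must be non-analytic; since $f$ preserves phase, it is also a natural fit for same-frequency sinusoids represented as phasors. The specific function, the constants $c$ and $r$, and the single-neuron setting are illustrative assumptions, not the thesis's exact construction.

```python
import numpy as np

def step(w, x, t, lr=0.1, c=1.0, r=1.0, eps=1e-12):
    """One gradient step for a single complex neuron y = f(w . x) on
    E = 0.5*|y - t|^2, via Wirtinger calculus. Because f(z) = z/(c + |z|/r)
    is non-analytic, both df/ds and df/d(conj s) enter the chain rule."""
    s = np.dot(w, x)                     # complex net input (no conjugation)
    d = c + np.abs(s) / r
    y = s / d                            # |y| < r, phase of s preserved
    e = y - t
    f_s = 1.0 / d - np.abs(s) / (2.0 * r * d**2)              # real-valued
    f_sbar = -s**2 / (2.0 * r * (np.abs(s) + eps) * d**2)
    grad = np.conj(x) * (np.conj(e) * f_sbar + e * f_s)       # ~ dE/d(conj w)
    return w - lr * grad, 0.5 * abs(e)**2

rng = np.random.default_rng(0)
w = rng.standard_normal(3) + 1j * rng.standard_normal(3)
x = rng.standard_normal(3) + 1j * rng.standard_normal(3)
t = 0.4 + 0.2j                           # target inside the range of f
for _ in range(300):
    w, err = step(w, x, t)
print(f"final error: {err:.2e}")         # should approach zero
```

Note that with purely real data the two Wirtinger terms combine into the ordinary real derivative $f'(s)$, so the update reduces to standard real-domain backpropagation, consistent with the real models arising as special cases.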
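The two-phase idea can likewise be made concrete with a small real-valued sketch: in the first phase the hidden weights are pinned to the top principal components of the input (one of the two kinds of prior information mentioned) and only the output layer is trained; in the second phase the constraint is dropped and ordinary backpropagation updates all weights. Hard freezing as the constraint mechanism, the network shape, and all hyperparameters below are illustrative assumptions.

```python
import numpy as np

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

def two_phase_backprop(X, T, k=4, lr=0.5, n1=500, n2=500, seed=0):
    """X: (n_samples, n_in); T: (n_samples, n_out), entries in (0, 1).
    Phase 1: hidden weights W1 fixed to top-k principal components,
    only W2 is trained. Phase 2: constraint removed, full backprop."""
    rng = np.random.default_rng(seed)
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    W1 = Vt[:k].T                          # (n_in, k): principal directions
    W2 = 0.1 * rng.standard_normal((k, T.shape[1]))
    for phase, steps in ((1, n1), (2, n2)):
        for _ in range(steps):
            H = sigmoid(X @ W1)            # hidden activations
            Y = sigmoid(H @ W2)            # network output
            dY = (Y - T) * Y * (1 - Y)     # output delta for 0.5*sum((Y-T)^2)
            if phase == 2:                 # constraints removed in phase 2
                dH = (dY @ W2.T) * H * (1 - H)
                W1 -= lr * X.T @ dH / len(X)
            W2 -= lr * H.T @ dY / len(X)
    return W1, W2
```

Pinning W1 reduces the first phase to fitting the output layer over fixed, informative features, which constrains the search space as described; the second phase then starts full backpropagation from that solution rather than from random weights.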