## Approximation by Fully Complex Multilayer Perceptrons (2003)

Citations: 20 (5 self)

### BibTeX

@MISC{Kim03approximationby,
  author = {Taehwan Kim and Tülay Adalı},
  title  = {Approximation by Fully Complex Multilayer Perceptrons},
  year   = {2003}
}

### Abstract

We investigate the approximation ability of a multilayer perceptron (MLP) network when it is extended to the complex domain. The main challenge in processing complex data with neural networks has been the lack of bounded and analytic complex nonlinear activation functions: by Liouville's theorem, a function that is both bounded and analytic on the entire complex plane must be constant. To avoid this conflict between boundedness and analyticity, a number of ad hoc MLPs have traditionally been employed, e.g., using two real-valued MLPs, one processing the real part and the other the imaginary part. However, since nonanalytic functions do not satisfy the Cauchy-Riemann conditions, they lead to degenerate backpropagation algorithms that compromise the efficiency of nonlinear approximation and learning in the complex vector field. A number of elementary transcendental functions (ETFs) derivable from the entire exponential function e^z that are analytic are defined as fully complex activation functions and are shown […]
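The distinction the abstract draws between split and fully complex activations can be checked numerically. The sketch below (not from the paper; function names are mine) estimates the Cauchy-Riemann residual |∂f/∂x + i·∂f/∂y| by central differences: it vanishes for an analytic ETF such as tanh(z), but not for the split activation tanh(Re z) + i·tanh(Im z).

```python
import cmath
import math

def fully_complex_tanh(z: complex) -> complex:
    """Analytic ETF activation: tanh extended to the complex plane."""
    return cmath.tanh(z)

def split_tanh(z: complex) -> complex:
    """Split (non-analytic) activation: real tanh on each component."""
    return math.tanh(z.real) + 1j * math.tanh(z.imag)

def cauchy_riemann_residual(f, z: complex, h: float = 1e-6) -> float:
    """|df/dx + i*df/dy| at z; zero exactly when f satisfies
    the Cauchy-Riemann conditions there."""
    dfdx = (f(z + h) - f(z - h)) / (2 * h)          # derivative along the real axis
    dfdy = (f(z + 1j * h) - f(z - 1j * h)) / (2 * h)  # derivative along the imaginary axis
    return abs(dfdx + 1j * dfdy)

z0 = 0.5 + 0.2j
print(cauchy_riemann_residual(fully_complex_tanh, z0))  # near zero: analytic
print(cauchy_riemann_residual(split_tanh, z0))          # clearly nonzero
```

Because the split activation violates Cauchy-Riemann, its "gradient" must be assembled from separate real partial derivatives, which is the degeneracy in backpropagation the abstract refers to.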