A neural network learning for adaptively extracting cross-correlation features between two high-dimensional data streams



IEEE Transactions on Neural Networks 15(6): 1541-1554, 2004



This paper proposes a novel cross-correlation neural network (CNN) model for finding the principal singular subspace of the cross-correlation matrix between two high-dimensional data streams. We introduce a novel nonquadratic criterion (NQC) for searching for the optimum weights of two linear neural networks (LNNs). The NQC exhibits a single global minimum, attained if and only if the weight matrices of the left and right neural networks span, respectively, the left and right principal singular subspaces of the cross-correlation matrix; all other stationary points of the NQC are (unstable) saddle points. We develop an adaptive algorithm based on the NQC for tracking the principal singular subspace of the cross-correlation matrix between two high-dimensional vector sequences. The NQC algorithm provides fast online learning of the optimum weights of the two LNNs, and its global asymptotic stability is analyzed. The NQC algorithm offers several key advantages, such as faster convergence, which are illustrated through simulations.
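To make the setting described in the abstract concrete, the sketch below illustrates the underlying task only; it is not the paper's NQC learning rule. It tracks the rank-r left and right principal singular subspaces of the cross-correlation matrix of two paired streams using a running estimate of the matrix and one orthogonal-iteration (QR) step per sample. The function name track_cross_correlation_subspace, the forgetting factor, and the synthetic data are all hypothetical choices for this illustration.

```python
# Minimal illustrative sketch (NOT the paper's NQC rule): track the rank-r
# left/right principal singular subspaces of C = E[x y^T] from paired
# samples, using a running estimate of C and one orthogonal-iteration
# (QR) step per sample. All names and parameters here are hypothetical.
import numpy as np

def track_cross_correlation_subspace(x_stream, y_stream, rank, forget=0.01, seed=0):
    """Return orthonormal bases (U, V) estimating the left/right principal
    singular subspaces of the cross-correlation matrix of the two streams."""
    rng = np.random.default_rng(seed)
    n, m = x_stream.shape[1], y_stream.shape[1]
    C_hat = np.zeros((n, m))                             # running cross-correlation estimate
    U, _ = np.linalg.qr(rng.standard_normal((n, rank)))  # left basis, n x r
    V, _ = np.linalg.qr(rng.standard_normal((m, rank)))  # right basis, m x r
    for x, y in zip(x_stream, y_stream):
        C_hat = (1.0 - forget) * C_hat + forget * np.outer(x, y)
        U, _ = np.linalg.qr(C_hat @ V)                   # one power step for the left subspace
        V, _ = np.linalg.qr(C_hat.T @ U)                 # one power step for the right subspace
    return U, V

if __name__ == "__main__":
    # Two correlated high-dimensional streams sharing a rank-r coupling.
    rng = np.random.default_rng(1)
    T, n, m, r = 5000, 30, 20, 3
    A, B = rng.standard_normal((n, r)), rng.standard_normal((m, r))
    z = rng.standard_normal((T, r))
    X = z @ A.T + 0.1 * rng.standard_normal((T, n))
    Y = z @ B.T + 0.1 * rng.standard_normal((T, m))
    U, V = track_cross_correlation_subspace(X, Y, rank=r)
    # Compare with the batch SVD of the sample cross-correlation matrix:
    # the smallest principal-angle cosine should be close to 1.
    Ub, _, _ = np.linalg.svd(X.T @ Y / T)
    cosines = np.linalg.svd(U.T @ Ub[:, :r], compute_uv=False)
    print("left-subspace alignment (min cosine):", round(float(cosines.min()), 3))
```

The paper's NQC algorithm instead adapts the weight matrices of the two linear networks online via the nonquadratic criterion summarized in the abstract; the sketch above shares only the problem setting and the intended fixed point (the principal singular subspaces).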

Accession: 048083537

PMID: 15565780

DOI: 10.1109/tnn.2004.838523


Related references

Neural network learning algorithms for tracking minor subspace in high-dimensional data stream. IEEE Transactions on Neural Networks 16(3): 513-521, 2005

Learning a single-hidden layer feedforward neural network using a rank correlation-based strategy with application to high dimensional gene expression and proteomic spectra datasets in cancer detection. Journal of Biomedical Informatics 83: 159-166, 2018

A neural network model of adaptively timed reinforcement learning and hippocampal dynamics. Brain Research. Cognitive Brain Research 1(1): 3-38, 1992

Reinforcement learning on slow features of high-dimensional input streams. PLoS Computational Biology 6(8), 2010

Fast neural network ensemble learning via negative-correlation data correction. IEEE Transactions on Neural Networks 16(6): 1707-1710, 2005

Stochastic Neural Network Approach for Learning High-Dimensional Free Energy Surfaces. Physical Review Letters 119(15): 150601, 2017

A neural network model extracting features from speech signals. Systems and Computers in Japan 19(3): 32-45, 1988

The neural signature of extracting emotional content from rapid visual streams at multiple presentation rates: A cross-laboratory study. Psychophysiology 55(12): e13222, 2018

A closed-form neural network for discriminatory feature extraction from high-dimensional data. Neural Networks 14(9): 1201-1218, 2001

Cross-entropy embedding of high-dimensional data using the neural gas model. Neural Networks 18(5-6): 727-737, 2005

A sparse structure learning algorithm for Gaussian Bayesian Network identification from high-dimensional data. IEEE Transactions on Pattern Analysis and Machine Intelligence 35(6): 1328-1342, 2013

Extracting Topsoil Information from EM38DD Sensor Data using a Neural Network Approach. Soil Science Society of America Journal 73(6): 2051-2058, 2009

Extracting road information from recorded GPS data using snap-drift neural network. Neurocomputing 73(1-3): 24-36, 2009

Extracting features for a brain-computer interface by self-organising fuzzy neural network-based time series prediction. Conference Proceedings 6: 4371-4374, 2004