TY - GEN
ID - cogprints7179
UR - http://cogprints.org/7179/
A1 - Iqbal, Ridwan Al
TI - Using Feature Weights to Improve Performance of Neural Networks
Y1 - 2011/01/25/
N2 - Different features have different relevance to a particular learning problem. Some features are less relevant, while others are very important. Instead of selecting the most relevant features through feature selection, a learning algorithm can be given this knowledge of feature importance, based on expert opinion or prior learning. Learning can be faster and more accurate if learners take feature importance into account. Correlation aided Neural Networks (CANN) is presented as such an algorithm. CANN treats feature importance as the correlation coefficient between the target attribute and each feature. CANN modifies the standard feed-forward neural network to fit both the correlation values and the training data. Empirical evaluation shows that CANN is faster and more accurate than the two-step approach of feature selection followed by a standard learning algorithm.
AV - public
KW - Feature weight
KW - Feature ranking
KW - Hybrid Learning
KW - Correlation
KW - Constraint learning
ER -
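
The abstract describes the idea only at a high level; the sketch below is an illustrative interpretation of the feature-weighting concept (Pearson correlation coefficients between each feature and the target used as input weights for an ordinary feed-forward network), not the CANN algorithm from the paper. It assumes NumPy and scikit-learn; the helper name and the toy data are hypothetical.

import numpy as np
from sklearn.neural_network import MLPClassifier

def correlation_feature_weights(X, y):
    # Pearson correlation between each feature column and the target,
    # taken in absolute value as a per-feature importance weight.
    corr = [np.corrcoef(X[:, j], y)[0, 1] for j in range(X.shape[1])]
    return np.abs(np.nan_to_num(np.array(corr)))

# Toy data for illustration: feature 0 is informative, feature 1 is noise.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = (X[:, 0] > 0).astype(int)

w = correlation_feature_weights(X, y)

# Scale inputs by their correlation-derived weights, then train a
# standard feed-forward network on the weighted features.
clf = MLPClassifier(hidden_layer_sizes=(8,), max_iter=1000, random_state=0)
clf.fit(X * w, y)
print("feature weights:", w, "train accuracy:", clf.score(X * w, y))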