Error-Based Noise Filtering During Neural Network Training
Fahad Alharbi, Khalil El Hindi, Saad Al-Ahmadi. 2020
The problem of dealing with noisy data in neural network-based models has been receiving increasing attention from researchers aiming to mitigate its consequences for learning. Some researchers enhance the data in a preprocessing step before training, while others attempt to make the learning model itself aware of noise and thus able to handle noisy instances. We propose a simple and efficient method, which we call Error-Based Filtering (EBF), that filters noisy instances during training for supervised learning in neural network-based models. EBF is independent of the model architecture and can therefore be incorporated into any neural network-based model. Our approach is based on monitoring and analyzing the distribution of per-instance values of the loss (error) function during training. In addition, EBF can be integrated with semi-supervised learning to take advantage of the identified noisy instances and improve classification. An advantage of EBF is that it achieves competitive performance compared with other state-of-the-art methods while adding far fewer tasks to the training procedure. Our evaluation on three well-known benchmark datasets demonstrates an improvement in classification accuracy in the presence of noise.
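The abstract does not give the exact filtering rule, but the core idea of flagging instances by their position in the per-instance loss distribution can be sketched as follows. This is a hypothetical illustration, not the authors' algorithm: the helper name `filter_by_loss` and the mean-plus-k-standard-deviations threshold are assumptions made here for concreteness.

```python
import numpy as np

def filter_by_loss(losses, k=2.0):
    """Flag training instances whose loss is unusually high relative to
    the batch/epoch loss distribution (a common proxy for label noise).

    This is an illustrative sketch: the threshold mean + k*std is an
    assumption, not the rule used by EBF in the paper.
    """
    losses = np.asarray(losses, dtype=float)
    threshold = losses.mean() + k * losses.std()
    keep_mask = losses <= threshold  # True for instances kept for training
    return keep_mask, threshold

# Example: nine small per-instance losses and one large outlier.
per_instance_losses = [0.1] * 9 + [5.0]
keep, thr = filter_by_loss(per_instance_losses)
print(keep)  # the outlier instance is flagged (False), the rest kept
```

In a training loop, such a mask could be recomputed each epoch from the recorded per-instance losses, with flagged instances either excluded from the gradient update or, as the abstract suggests, handed to a semi-supervised component as unlabeled data.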