By Mohammad Mahdi Bejani & Mehdi Ghatee
Published: Neurocomputing, 2021.
Model selection is challenging, and popular Convolutional Neural Network (CNN) architectures often carry more parameters than a task requires, which causes overfitting in real applications. In addition, extracted hidden features tend to be lost as the number of convolutional layers increases. We address both problems with a small set of auxiliary loss functions. To this end, an optimization problem is stated to select the layers with the highest contributions to the training process, and an impact-growth adaptation procedure adjusts the weights of the auxiliary losses. The resulting method, Least Auxiliary Loss-functions with Impact Growth Adaptation (Laliga), provides a principled framework for selecting the best auxiliary-loss configuration for CNN training. Laliga preserves the hidden features and represents the feature space with non-redundant, more relevant features; it also uses singular value decomposition to regularize the weights. The theoretical results show that Laliga decreases overfitting substantially. Although the algorithm is applicable to all CNN models, its results are especially promising for Visual Geometry Group (VGG) networks. With Laliga, VGG models reach testing accuracies of 99.7%, 92.3%, and 73.4% on the MNIST, CIFAR-10, and CIFAR-100 datasets, indicating that Laliga outperforms many regularization methods in the dropout family. On the more complicated Caltech-101 and Caltech-256 datasets, its accuracies reach 66.1% and 33.2%, which are better than dropout and close to the results of Adaptive Spectral Regularization (ASR), while Laliga converges faster than ASR. Finally, we analyze the results of Laliga in a transportation case study.
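To make the abstract's core mechanism concrete, the sketch below shows the generic pattern of a total objective built from a main loss plus weighted auxiliary losses, together with a hypothetical impact-growth-style update that increases the weight of an auxiliary loss that is still improving and decreases it otherwise. This is an illustrative simplification under assumed names (`combined_loss`, `adapt_weights`, step size `eta`), not the paper's exact Laliga rule.

```python
import numpy as np

def combined_loss(main_loss, aux_losses, weights):
    """Total objective: main loss plus weighted auxiliary losses."""
    return main_loss + sum(w * l for w, l in zip(weights, aux_losses))

def adapt_weights(weights, prev_losses, curr_losses, eta=0.1):
    """Hypothetical impact-growth update (not the paper's exact rule):
    grow the weight of an auxiliary loss whose value decreased this step
    (it is contributing to training), shrink it otherwise, then
    renormalize so the weights sum to one."""
    new = []
    for w, prev, curr in zip(weights, prev_losses, curr_losses):
        improvement = prev - curr
        new.append(max(0.0, w * (1.0 + eta * np.sign(improvement))))
    total = sum(new) or 1.0
    return [w / total for w in new]

# Example: the first auxiliary loss improved (1.0 -> 0.8), the second
# worsened (1.0 -> 1.2), so weight shifts toward the first.
weights = adapt_weights([0.5, 0.5], [1.0, 1.0], [0.8, 1.2])
total = combined_loss(1.0, [0.8, 1.2], weights)
```

In an actual training loop the losses would come from selected intermediate layers of the CNN, and the adapted weights would rescale their gradients at the next step.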