Abstract:
Recently, small-scale convolutional neural networks (CNNs) have become a hot research topic because they require fewer training parameters and shorter training time, although their accuracy is generally lower than that of large-scale CNNs. To this end, in this paper we propose a small-scale CNN with sparse representation (a sparse small-scale CNN) for image classification. First, since the Gabor filter is a representative sparse representation, we integrate it into the standard AlexNet to establish a sparse Gabor-AlexNet architecture that better extracts image features. Second, to further improve the performance of this Gabor-AlexNet, a learnable Gabor-AlexNet is constructed for image classification at the cost of only a few additional learnable parameters. Finally, we test the Gabor-AlexNet on small-scale datasets, including Extended Yale B, CASIA-FaceV5, Original Brodatz, and DTD. The proposed method outperforms the standard AlexNet, used as the classification benchmark, in both precision and speed. Moreover, our learnable Gabor-AlexNet is also superior to the Gabor-AlexNet in classification performance on the CIFAR-10 dataset. © 2021 IEEE.
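To make the architecture described in the abstract concrete, the following is a minimal PyTorch-style sketch of how a Gabor filter bank with learnable parameters might replace AlexNet's first convolutional layer. The class name, parameter initialisation, and kernel/stride settings are illustrative assumptions, not the paper's actual configuration.

```python
# Hypothetical sketch of a learnable Gabor convolution layer that could stand in
# for AlexNet's first conv layer. All settings here are assumptions for illustration.
import math
import torch
import torch.nn as nn
import torch.nn.functional as F


class LearnableGaborConv2d(nn.Module):
    """Convolution whose kernels are Gabor functions with learnable
    orientation, scale, wavelength, phase, and aspect-ratio parameters."""

    def __init__(self, in_channels, out_channels, kernel_size=11, stride=4, padding=2):
        super().__init__()
        self.kernel_size = kernel_size
        self.stride = stride
        self.padding = padding
        # One small set of Gabor parameters per (out, in) kernel, so only a few
        # extra learnable parameters are added compared with a fixed Gabor bank.
        shape = (out_channels, in_channels)
        self.theta = nn.Parameter(torch.rand(shape) * math.pi)         # orientation
        self.sigma = nn.Parameter(torch.full(shape, kernel_size / 4))  # envelope width
        self.lambd = nn.Parameter(torch.full(shape, kernel_size / 2))  # wavelength
        self.psi = nn.Parameter(torch.zeros(shape))                    # phase offset
        self.gamma = nn.Parameter(torch.ones(shape))                   # aspect ratio

    def forward(self, x):
        k = self.kernel_size
        half = (k - 1) / 2
        coords = torch.linspace(-half, half, k, device=x.device)
        yy, xx = torch.meshgrid(coords, coords, indexing="ij")
        xx = xx.view(1, 1, k, k)   # broadcast grid against per-kernel parameters
        yy = yy.view(1, 1, k, k)
        theta = self.theta.unsqueeze(-1).unsqueeze(-1)
        sigma = self.sigma.unsqueeze(-1).unsqueeze(-1)
        lambd = self.lambd.unsqueeze(-1).unsqueeze(-1)
        psi = self.psi.unsqueeze(-1).unsqueeze(-1)
        gamma = self.gamma.unsqueeze(-1).unsqueeze(-1)
        # Standard Gabor kernel: Gaussian envelope times a cosine carrier.
        x_rot = xx * torch.cos(theta) + yy * torch.sin(theta)
        y_rot = -xx * torch.sin(theta) + yy * torch.cos(theta)
        envelope = torch.exp(-(x_rot ** 2 + (gamma * y_rot) ** 2) / (2 * sigma ** 2))
        carrier = torch.cos(2 * math.pi * x_rot / lambd + psi)
        weight = envelope * carrier                  # (out, in, k, k)
        return F.conv2d(x, weight, stride=self.stride, padding=self.padding)


# Usage sketch: swap AlexNet's first 11x11 convolution for the Gabor layer.
# gabor_first = LearnableGaborConv2d(3, 64, kernel_size=11, stride=4, padding=2)
# features = gabor_first(torch.randn(1, 3, 224, 224))  # -> (1, 64, 55, 55)
```

The kernels are regenerated from the Gabor parameters on each forward pass, so gradients flow into a handful of scalars per kernel rather than into a full dense weight tensor, which is consistent with the abstract's claim of only a small increase in learnable parameters.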
Year: 2021
Page: 425-431
Language: English
SCOPUS Cited Count: 1
ESI Highly Cited Papers on the List: 0