Fast DCNN based on FWT, intelligent dropout and layer skipping for image retrieval

Copyright © 2017 Elsevier Ltd. All rights reserved.

The Deep Convolutional Neural Network (DCNN) is a powerful tool for object and image classification and retrieval. However, the training stage of such networks is highly demanding in terms of storage space and time, and their optimization remains a challenging problem. In this paper, we propose a fast DCNN based on the Fast Wavelet Transform (FWT), intelligent dropout and layer skipping. The proposed approach improves image retrieval accuracy as well as search time, thanks to three key advantages. First, features are computed rapidly using the FWT. Second, the proposed intelligent dropout method keeps or drops a unit according to its measured efficiency rather than at random. Third, an image can be classified using the efficient units of earlier layer(s), skipping all subsequent hidden layers and connecting directly to the output layer. Our experiments were performed on the CIFAR-10 and MNIST datasets and the obtained results are very promising.
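A minimal sketch (not the authors' code) of the intelligent-dropout idea described in the abstract: units are kept according to a measured efficiency score instead of being dropped at random. The efficiency measure (mean absolute activation over a mini-batch), the keep ratio, and all names below are illustrative assumptions; the paper defines its own criterion on features obtained with the Fast Beta Wavelet Transform.

import numpy as np

def intelligent_dropout_mask(activations, keep_ratio=0.5):
    # Keep only the most "efficient" units of a layer. Efficiency is
    # approximated here by the mean absolute activation over a mini-batch
    # (an assumption for illustration). The top keep_ratio fraction of
    # units is retained, instead of dropping units uniformly at random
    # as in standard dropout.
    efficiency = np.abs(activations).mean(axis=0)   # one score per unit
    k = max(1, int(keep_ratio * efficiency.size))
    threshold = np.sort(efficiency)[-k]             # k-th largest score
    return (efficiency >= threshold).astype(activations.dtype)

# Toy usage: a mini-batch of 8 samples through a layer with 16 units.
rng = np.random.default_rng(0)
acts = rng.normal(size=(8, 16))
mask = intelligent_dropout_mask(acts, keep_ratio=0.25)
pruned = acts * mask    # inefficient units are silenced before the next layer

Under the layer-skipping idea, when the efficient units of an earlier layer already discriminate well enough, the remaining hidden layers can be bypassed and those units connected directly to the output layer; a mask of this kind would then select which early-layer units are routed forward.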

Media type:

E-article

Year of publication:

2017

Published:

2017

Contained in:

Neural Networks: the official journal of the International Neural Network Society - 95 (2017), 1 Nov., pages 10-18

Language:

English

Contributors:

ElAdel, Asma [Author]
Zaied, Mourad [Author]
Amar, Chokri Ben [Author]

Links:

Full text

Topics:

Deep Convolutional Neural Network
Fast Beta wavelet transform
Image classification & retrieval
Intelligent dropout
Journal Article

Notes:

Date Completed 23.04.2018

Date Revised 10.12.2019

published: Print-Electronic

Citation Status MEDLINE

doi:

10.1016/j.neunet.2017.07.015

Funding institution / project title:

PPN (catalog ID):

NLM275158322