dc.contributor.author: Karabayır, İbrahim
dc.contributor.author: Akbilgiç, Oğuz
dc.contributor.author: Taş, Nihat
dc.date.accessioned: 2021-12-12T17:01:50Z
dc.date.available: 2021-12-12T17:01:50Z
dc.date.issued: 2021
dc.identifier.issn: 2162-237X
dc.identifier.issn: 2162-2388
dc.identifier.uri: https://doi.org/10.1109/TNNLS.2020.2979121
dc.identifier.uri: https://hdl.handle.net/20.500.11857/3310
dc.description.abstract: Gradient-based algorithms have been widely used in optimizing the parameters of deep neural network (DNN) architectures. However, the vanishing gradient remains one of the common issues in the parameter optimization of such networks. To cope with the vanishing gradient problem, in this article, we propose a novel algorithm, evolved gradient direction optimizer (EVGO), which updates the weights of DNNs based on the first-order gradient and a novel hyperplane we introduce. We compare the EVGO algorithm with other gradient-based algorithms, such as gradient descent, RMSProp, Adagrad, momentum, and Adam, on the well-known Modified National Institute of Standards and Technology (MNIST) data set for handwritten digit recognition by implementing deep convolutional neural networks. Furthermore, we present empirical evaluations of EVGO on the CIFAR-10 and CIFAR-100 data sets by using the well-known AlexNet and ResNet architectures. Finally, we implement an empirical analysis for EVGO and other algorithms to investigate the behavior of the loss functions. The results show that EVGO outperforms all the algorithms in comparison for all experiments. We conclude that EVGO can be used effectively in the optimization of DNNs, and also, the proposed hyperplane may provide a basis for future optimization algorithms. [en_US]
dc.language.iso: eng [en_US]
dc.publisher: IEEE-Inst Electrical Electronics Engineers Inc [en_US]
dc.relation.ispartof: IEEE Transactions on Neural Networks and Learning Systems [en_US]
dc.identifier.doi: 10.1109/TNNLS.2020.2979121
dc.rights: info:eu-repo/semantics/closedAccess [en_US]
dc.subject: Optimization [en_US]
dc.subject: Training [en_US]
dc.subject: Neural networks [en_US]
dc.subject: Learning systems [en_US]
dc.subject: Handwriting recognition [en_US]
dc.subject: Machine learning algorithms [en_US]
dc.subject: Deep learning [en_US]
dc.subject: CIFAR [en_US]
dc.subject: convolutional neural networks (CNNs) [en_US]
dc.subject: deep learning [en_US]
dc.subject: evolved gradient direction optimizer (EVGO) [en_US]
dc.subject: gradient methods [en_US]
dc.subject: handwritten digit recognition [en_US]
dc.subject: machine learning [en_US]
dc.title: A Novel Learning Algorithm to Optimize Deep Neural Networks: Evolved Gradient Direction Optimizer (EVGO) [en_US]
dc.type: article
dc.authorid: akbilgic, oguz/0000-0003-0313-9254
dc.authorid: Karabayir, Ibrahim/0000-0002-7928-176X
dc.department: Fakülteler, İktisadi ve İdari Bilimler Fakültesi, Ekonometri Bölümü
dc.identifier.volume: 32 [en_US]
dc.identifier.startpage: 685 [en_US]
dc.identifier.issue: 2 [en_US]
dc.identifier.endpage: 694 [en_US]
dc.relation.publicationcategory: Makale - Uluslararası Hakemli Dergi - Kurum Öğretim Elemanı (Article - International Refereed Journal - Institutional Faculty Member) [en_US]
dc.identifier.wos: WOS:000616310400018 [en_US]
dc.identifier.pmid: PubMed: 32481228 [en_US]
dc.authorwosid: akbilgic, oguz/F-9407-2013
dc.authorwosid: Karabayir, Ibrahim/AAC-3262-2019
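
The abstract above describes EVGO as a first-order, gradient-direction optimizer compared against gradient descent, RMSProp, Adagrad, momentum, and Adam. This record does not include the EVGO update rule itself, so the following is only a minimal PyTorch sketch of how a custom first-order optimizer of this kind could plug into a standard training setup. The class name FirstOrderOptimizer, the learning rate, and the plain gradient-descent step used as a stand-in for EVGO's hyperplane-based direction are assumptions for illustration, not the authors' implementation.

    # Hypothetical sketch: a custom first-order optimizer wired into PyTorch's
    # torch.optim.Optimizer interface. The real EVGO update (gradient combined
    # with the paper's hyperplane-derived direction) is NOT reproduced here;
    # step() falls back to plain gradient descent as a placeholder.
    import torch
    from torch import nn
    from torch.optim import Optimizer

    class FirstOrderOptimizer(Optimizer):
        """Placeholder for an EVGO-like, gradient-direction optimizer."""
        def __init__(self, params, lr=1e-3):
            super().__init__(params, dict(lr=lr))

        @torch.no_grad()
        def step(self):
            for group in self.param_groups:
                for p in group["params"]:
                    if p.grad is None:
                        continue
                    # EVGO would combine p.grad with its hyperplane-based
                    # direction here; this line is ordinary gradient descent.
                    p.add_(p.grad, alpha=-group["lr"])

    # Toy usage with MNIST-shaped inputs (28x28 grayscale images assumed).
    model = nn.Sequential(nn.Flatten(), nn.Linear(28 * 28, 10))
    opt = FirstOrderOptimizer(model.parameters(), lr=0.1)
    x, y = torch.randn(32, 1, 28, 28), torch.randint(0, 10, (32,))
    loss = nn.functional.cross_entropy(model(x), y)
    loss.backward()
    opt.step()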

