
EbookBell.com

Most ebook files are in PDF format, so you can easily read them using software such as Foxit Reader or directly in the Google Chrome browser.
Some ebook files are released by publishers in other formats such as .azw, .mobi, .epub, or .fb2. You may need to install specific software, such as Calibre, to read these formats on mobile or PC.

Please read the tutorial at this link:  https://ebookbell.com/faq 


We offer FREE conversion to the popular format you request; however, this may take some time. Please email us right after payment, and we will provide the service as quickly as possible.


For unusual file formats or broken links (if any), please refrain from opening a dispute. Instead, email us first, and we will assist within 6 hours at most.

EbookBell Team

Enhancing Deep Learning Performance Using Displaced Rectifier Linear Unit, David Macêdo

  • SKU: BELL-62651896
Enhancing Deep Learning Performance Using Displaced Rectifier Linear Unit, David Macêdo
$ 31.00 $ 45.00 (-31%)

5.0

58 reviews

Enhancing Deep Learning Performance Using Displaced Rectifier Linear Unit by David Macêdo: instant download after payment.

Publisher: Editora Dialética
File Extension: EPUB
File size: 6.06 MB
Author: David Macêdo
Language: English
Year: 2022

Product description

Enhancing Deep Learning Performance Using Displaced Rectifier Linear Unit by David Macêdo: instant download after payment.

Recently, deep learning has had a significant impact on computer vision, speech recognition, and natural language understanding. In spite of these remarkable advances, recent performance gains in deep learning have been modest and usually rely on increasing the depth of the models, which often requires more computational resources such as processing time and memory. To tackle this problem, we turned our attention to the interplay between activation functions and batch normalization, which is now virtually ubiquitous. In this work, we propose the activation function Displaced Rectifier Linear Unit (DReLU), conjecturing that extending the identity function of ReLU into the third quadrant enhances compatibility with batch normalization. Moreover, we used statistical tests to compare the impact of distinct activation functions (ReLU, LReLU, PReLU, ELU, and DReLU) on the learning speed and test accuracy of state-of-the-art VGG and Residual Networks...
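The idea of extending ReLU's identity segment into the third quadrant can be sketched as follows. This is a minimal illustrative implementation, not the author's code; the displacement value `delta = 0.05` is an assumption, since the excerpt above does not specify it.

```python
import numpy as np

def relu(x):
    # Standard ReLU: identity for x > 0, zero otherwise.
    return np.maximum(x, 0.0)

def drelu(x, delta=0.05):
    # DReLU sketch: the identity segment is extended into the
    # third quadrant down to -delta; inputs below -delta are
    # clamped to -delta. The value of delta here is illustrative.
    return np.maximum(x, -delta)

x = np.array([-1.0, -0.02, 0.0, 0.5])
print(relu(x))   # negative inputs are zeroed
print(drelu(x))  # inputs above -delta pass through unchanged
```

Unlike ReLU, DReLU can output small negative values, which is the property the text conjectures improves compatibility with batch-normalized pre-activations (whose mean is centered near zero).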

Related Products