
Replacing pooling functions in Convolutional Neural Networks by linear combinations of increasing functions

Neural Netw. 2022 May 6;152:380-393. doi: 10.1016/j.neunet.2022.04.028. Online ahead of print.

ABSTRACT

Traditionally, Convolutional Neural Networks use the maximum or the arithmetic mean to reduce the features extracted by convolutional layers in a downsampling process known as pooling. However, there is no strong argument for settling on either of the two functions and, in practice, the choice turns out to be problem-dependent. Furthermore, both options ignore possible dependencies among the data. We believe that a combination of both functions, as well as of additional ones that may retain different information, can benefit the feature extraction process. In this work, we replace traditional pooling with several alternative functions. In particular, we consider linear combinations of order statistics and generalizations of the Sugeno integral, extending the latter’s domain to the whole real line and setting the theoretical base for their application. We present an alternative pooling layer based on this strategy, which we name the “CombPool” layer. We replace the pooling layers of three architectures of increasing complexity with CombPool layers and show empirically, over multiple datasets, that linear combinations outperform traditional pooling functions in most cases. Furthermore, combinations with either the Sugeno integral or one of its generalizations usually yield the best results, making them a strong candidate for most architectures.

PMID:35605303 | DOI:10.1016/j.neunet.2022.04.028
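
The abstract does not give the layer's exact formulation, but a minimal sketch conveys the idea. The PyTorch code below (the paper does not state a framework) implements a hypothetical CombPool-style layer as a learned convex combination of max pooling, average pooling, and a simplified Sugeno-integral pooling with the symmetric cardinality measure. The per-channel parameterization, the softmax mixing, and the use of the classical [0, 1]-domain Sugeno integral (rather than the paper's real-line extension) are all illustrative assumptions, not the authors' method.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


def _unfold_windows(x, kernel_size, stride):
    # Gather each pooling window as a vector of its values: (B, C, k*k, L).
    b, c, _, _ = x.shape
    patches = F.unfold(x, kernel_size, stride=stride)
    return patches.view(b, c, kernel_size * kernel_size, -1)


def _sugeno_pool(x, kernel_size, stride):
    # Sugeno-integral pooling with the cardinality measure mu(A) = |A| / n:
    # sort the n window values ascending and compute
    #   max_i min(x_(i), (n - i + 1) / n).
    # The classical definition assumes values in [0, 1]; the paper extends
    # the domain to the whole real line, which this sketch does not reproduce.
    vals = _unfold_windows(x, kernel_size, stride)
    n = vals.shape[2]
    vals, _ = vals.sort(dim=2)
    mu = torch.arange(n, 0, -1, dtype=x.dtype, device=x.device) / n
    return torch.minimum(vals, mu.view(1, 1, n, 1)).amax(dim=2)


class CombPool2d(nn.Module):
    """Hypothetical CombPool-style layer: a learned convex combination of
    max, mean and (simplified) Sugeno-integral pooling."""

    def __init__(self, channels, kernel_size=2, stride=2):
        super().__init__()
        self.k, self.s = kernel_size, stride
        # One mixing weight per channel and per pooled function
        # (an assumed parameterization; the abstract does not specify one).
        self.mix = nn.Parameter(torch.zeros(3, 1, channels, 1, 1))

    def forward(self, x):
        b, c, h, w = x.shape
        oh = (h - self.k) // self.s + 1
        ow = (w - self.k) // self.s + 1
        pooled = torch.stack([
            F.max_pool2d(x, self.k, self.s),
            F.avg_pool2d(x, self.k, self.s),
            _sugeno_pool(x, self.k, self.s).view(b, c, oh, ow),
        ])  # (3, B, C, oh, ow)
        weights = torch.softmax(self.mix, dim=0)  # convex combination
        return (weights * pooled).sum(dim=0)
```

In use, such a layer would simply replace an existing pooling layer, e.g. swapping `nn.MaxPool2d(2)` for `CombPool2d(channels=64)`, with the mixing weights trained jointly with the rest of the network.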

