
Generalized Domain Conditioned Adaptation Network

IEEE Trans Pattern Anal Mach Intell. 2021 Mar 1;PP. doi: 10.1109/TPAMI.2021.3062644. Online ahead of print.

ABSTRACT

Domain Adaptation (DA) attempts to transfer knowledge from a labeled source domain to an unlabeled target domain without requiring target supervision. Recent advanced methods conduct DA mainly by aligning domain distributions. However, their performance degrades severely when the source and target domains exhibit a large discrepancy. We argue that this limitation may be attributed to insufficient exploration of domain-specialized features, because most works concentrate solely on domain-general feature learning while using fully-shared convolutional networks (convnets). In this paper, we relax the fully-shared convnets assumption and propose the Domain Conditioned Adaptation Network, which introduces a domain conditioned channel attention module to excite channel activations separately for each domain. Such a partially-shared convnets module allows low-level domain-specialized features to be explored appropriately. Furthermore, we develop the Generalized Domain Conditioned Adaptation Network to automatically determine whether domain channel activations should be modeled separately in each attention module, so that critical domain-dependent knowledge can be adaptively extracted according to the gap in domain statistics. Meanwhile, to effectively align high-level feature distributions across the two domains, we further deploy feature adaptation blocks after the task-specific layers, which explicitly mitigate the domain discrepancy. Extensive experiments on four cross-domain benchmarks demonstrate that our approaches outperform existing methods, especially on very tough cross-domain learning tasks.
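To illustrate the core idea of the domain conditioned channel attention module, here is a minimal NumPy sketch. It assumes a squeeze-and-excitation style attention in which the excitation branch (the two fully-connected layers) is duplicated per domain while everything else is shared; the class name, weight shapes, and reduction ratio are illustrative assumptions, not the authors' actual implementation.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def relu(x):
    return np.maximum(x, 0.0)

class DomainConditionedChannelAttention:
    """Hypothetical sketch: squeeze-and-excitation style channel attention
    with a separate excitation branch per domain (0 = source, 1 = target)."""

    def __init__(self, channels, reduction=4, seed=0):
        rng = np.random.default_rng(seed)
        r = channels // reduction
        # One pair of FC weights per domain; the shared convnet would sit outside.
        self.w1 = [rng.standard_normal((channels, r)) * 0.1 for _ in range(2)]
        self.w2 = [rng.standard_normal((r, channels)) * 0.1 for _ in range(2)]

    def __call__(self, x, domain):
        # x: (batch, channels, H, W); squeeze via global average pooling.
        s = x.mean(axis=(2, 3))                               # (batch, channels)
        # Excite with the domain-specific branch, then rescale channels.
        e = sigmoid(relu(s @ self.w1[domain]) @ self.w2[domain])
        return x * e[:, :, None, None]

attn = DomainConditionedChannelAttention(channels=8)
x = np.ones((2, 8, 4, 4))
y_src = attn(x, domain=0)   # source-domain channel activations
y_tgt = attn(x, domain=1)   # target-domain channel activations
```

Because the two branches hold different weights, the same input is rescaled differently per domain, which is the "domain-specialized feature" effect the abstract describes; the generalized variant would additionally learn whether to share or split each branch.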

PMID:33646945 | DOI:10.1109/TPAMI.2021.3062644

By Nevin Manimala
