Domain Adaptation

MSCDA: Multi-level semantic-guided contrast improves unsupervised domain adaptation for breast MRI segmentation in small datasets [Neural Networks 2023]

Deep learning (DL) applied to breast tissue segmentation in magnetic resonance imaging (MRI) has received increased attention in the last decade; however, domain shift, which arises from different vendors, acquisition protocols, and biological heterogeneity, remains an important but challenging obstacle on the path towards clinical implementation. In this paper, we propose a novel Multi-level Semantic-guided Contrastive Domain Adaptation (MSCDA) framework to address this issue in an unsupervised manner. Our approach incorporates self-training with contrastive learning to align feature representations between domains. In particular, we extend the contrastive loss by incorporating pixel-to-pixel, pixel-to-centroid, and centroid-to-centroid contrasts to better exploit the underlying semantic information of the image at different levels. To resolve the data imbalance problem, we utilize a category-wise cross-domain sampling strategy to sample anchors from target images and build a hybrid memory bank to store samples from source images. We have validated MSCDA on a challenging task of cross-domain breast MRI segmentation between datasets of healthy volunteers and invasive breast cancer patients. Extensive experiments show that MSCDA effectively improves the model's feature alignment capabilities between domains, outperforming state-of-the-art methods. Furthermore, the framework is shown to be label-efficient, achieving good performance with a smaller source dataset. The code is publicly available at https://github.com/ShengKuangCN/MSCDA.
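
As an aside on how such multi-level contrasts can be assembled, the PyTorch sketch below combines pixel-to-pixel, pixel-to-centroid, and centroid-to-centroid InfoNCE terms, with anchors drawn from target pixels and positives/negatives drawn from a source memory bank. This is a minimal reading of the abstract, not the authors' released code (see the linked repository); the helper names, the temperature, and the simple per-class sampling are illustrative assumptions.

```python
# Hypothetical sketch of MSCDA-style multi-level contrastive terms (not the authors' code).
# Assumes per-pixel embeddings with source labels and target pseudo-labels are available.
import torch
import torch.nn.functional as F


def info_nce(anchor, positive, negatives, tau=0.1):
    """Standard InfoNCE: one positive per anchor vs. a shared bank of negatives."""
    anchor = F.normalize(anchor, dim=-1)
    positive = F.normalize(positive, dim=-1)
    negatives = F.normalize(negatives, dim=-1)
    pos = (anchor * positive).sum(-1, keepdim=True) / tau            # (N, 1)
    neg = anchor @ negatives.t() / tau                               # (N, K)
    logits = torch.cat([pos, neg], dim=1)
    target = torch.zeros(len(anchor), dtype=torch.long, device=anchor.device)
    return F.cross_entropy(logits, target)


def class_centroids(feats, labels, num_classes):
    """Mean feature vector (centroid) per class; feats: (N, D), labels: (N,)."""
    cents = []
    for c in range(num_classes):
        mask = labels == c
        cents.append(feats[mask].mean(0) if mask.any() else feats.new_zeros(feats.size(1)))
    return torch.stack(cents)                                        # (C, D)


def multi_level_contrast(tgt_feats, tgt_pseudo, bank_feats, bank_labels, num_classes=2):
    """Combine pixel-to-pixel, pixel-to-centroid and centroid-to-centroid contrasts."""
    src_cents = class_centroids(bank_feats, bank_labels, num_classes)
    tgt_cents = class_centroids(tgt_feats, tgt_pseudo, num_classes)
    loss = 0.0
    for c in range(num_classes):
        anchors = tgt_feats[tgt_pseudo == c]                         # target anchors of class c
        pos_pix = bank_feats[bank_labels == c]                       # same-class source pixels
        neg_pix = bank_feats[bank_labels != c]                       # other-class source pixels
        if len(anchors) == 0 or len(pos_pix) == 0 or len(neg_pix) == 0:
            continue
        other = torch.arange(num_classes, device=src_cents.device) != c
        # pixel-to-pixel: each target pixel vs. one sampled same-class source pixel
        idx = torch.randint(len(pos_pix), (len(anchors),), device=anchors.device)
        loss = loss + info_nce(anchors, pos_pix[idx], neg_pix)
        # pixel-to-centroid: target pixels vs. the same-class source centroid
        loss = loss + info_nce(anchors, src_cents[c].expand_as(anchors), src_cents[other])
        # centroid-to-centroid: target centroid vs. source centroid of the same class
        loss = loss + info_nce(tgt_cents[c : c + 1], src_cents[c : c + 1], src_cents[other])
    return loss
```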


Regularized Semi-paired Kernel CCA for domain adaptation  [IEEE-TNNLS 2018, [PDF]]

Domain adaptation learning is one of the fundamental research topics in pattern recognition and machine learning. This paper introduces a Regularized Semi-Paired Kernel Canonical Correlation Analysis (RSP-KCCA) formulation for learning a latent space for the domain adaptation problem. The optimization problem is formulated in the primal-dual setting, where side information can be readily incorporated through regularization terms. The proposed model learns a joint representation of the data across different domains by solving a generalized eigenvalue problem or a linear system of equations in the dual. The approach is naturally equipped with the out-of-sample extension property, which plays an important role in model selection. Furthermore, the Nyström approximation technique is used to alleviate the computational burden caused by the large matrices involved in the eigendecomposition. Experimental results are given to illustrate the effectiveness of the proposed approaches on synthetic and real-life datasets.
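
For readers new to the dual formulation, the NumPy/SciPy sketch below solves a standard regularized, fully paired, two-view kernel CCA as a generalized eigenvalue problem. It only illustrates the underlying machinery and is not the semi-paired or Nyström-accelerated RSP-KCCA of the paper; the RBF kernel and the regularization constant are illustrative assumptions.

```python
# Minimal sketch: regularized kernel CCA as a generalized eigenvalue problem in the dual.
# Fully paired two-view setting; the RBF kernel and `reg` are illustrative choices.
import numpy as np
from scipy.linalg import eigh
from scipy.spatial.distance import cdist


def rbf_kernel(A, B, sigma=1.0):
    return np.exp(-cdist(A, B, "sqeuclidean") / (2 * sigma ** 2))


def kcca(X, Y, reg=1e-3, n_components=2, sigma=1.0):
    """Dual (kernel) CCA on paired samples X (n, dx) and Y (n, dy).
    Returns dual coefficients (alpha, beta) spanning the shared latent space."""
    n = X.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n                # centring matrix
    Kx = H @ rbf_kernel(X, X, sigma) @ H               # centred Gram matrices
    Ky = H @ rbf_kernel(Y, Y, sigma) @ H
    Z = np.zeros((n, n))
    # Generalized eigenvalue problem:
    #   [[0, Kx Ky], [Ky Kx, 0]] v = rho [[Kx^2 + reg I, 0], [0, Ky^2 + reg I]] v
    A = np.block([[Z, Kx @ Ky], [Ky @ Kx, Z]])
    B = np.block([[Kx @ Kx + reg * np.eye(n), Z], [Z, Ky @ Ky + reg * np.eye(n)]])
    vals, vecs = eigh(A, B)                            # ascending eigenvalues
    top = vecs[:, ::-1][:, :n_components]              # leading canonical directions
    return top[:n], top[n:]                            # alpha, beta: each (n, n_components)


# Out-of-sample extension (ignoring centring of the test kernel for brevity):
#   z_new = rbf_kernel(X_new, X_train, sigma) @ alpha
```

The out-of-sample projection sketched in the last comment is what allows new points to be mapped into the learned latent space, which is the property the abstract highlights for model selection.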

Cross-domain neural-kernel networks  [Pattern Recognition Letters 2019, [PDF]]

This paper introduces a novel cross-domain neural-kernel network architecture for the semi-supervised domain adaptation problem. The proposed model consists of two stream neural-kernel networks, corresponding to the source and target domains, which are enriched with a coupling term. Each stream combines a neural network layer with an explicit feature map constructed by means of random Fourier features. The introduced coupling term aims at enforcing correlations among the outputs of the intermediate layers of the two streams, as well as encouraging the two networks to learn a shared representation of the data from both the source and target domains. Experimental results are given to illustrate the effectiveness of the proposed approaches on synthetic and real-life datasets.
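
As a sketch of how such a two-stream neural-kernel architecture might be wired up, the PyTorch snippet below stacks a neural layer, an explicit random Fourier feature map, and a linear head in each stream, and adds a simple coupling penalty that pulls the two streams' intermediate representations together. The layer sizes, the RFF bandwidth, and the statistics-matching form of the coupling term are illustrative assumptions, not the paper's exact formulation.

```python
# Illustrative two-stream neural-kernel sketch (assumed reading of the abstract, not the paper's code).
import math
import torch
import torch.nn as nn


class RandomFourierFeatures(nn.Module):
    """Explicit feature map approximating an RBF kernel: phi(x) = sqrt(2/D) * cos(x W + b)."""
    def __init__(self, in_dim, out_dim, sigma=1.0):
        super().__init__()
        self.register_buffer("W", torch.randn(in_dim, out_dim) / sigma)
        self.register_buffer("b", 2 * math.pi * torch.rand(out_dim))

    def forward(self, x):
        return (2.0 / self.W.shape[1]) ** 0.5 * torch.cos(x @ self.W + self.b)


class Stream(nn.Module):
    """One stream: a neural layer, the explicit RFF map, and a linear prediction head."""
    def __init__(self, in_dim, hid_dim, rff_dim, n_classes):
        super().__init__()
        self.hidden = nn.Sequential(nn.Linear(in_dim, hid_dim), nn.Tanh())
        self.rff = RandomFourierFeatures(hid_dim, rff_dim)
        self.head = nn.Linear(rff_dim, n_classes)

    def forward(self, x):
        h = self.hidden(x)          # intermediate representation used by the coupling term
        return h, self.head(self.rff(h))


def coupling_loss(h_src, h_tgt):
    """One simple coupling penalty: match mean and variance of the two streams'
    intermediate representations so they describe a shared space."""
    mean_gap = (h_src.mean(0) - h_tgt.mean(0)).pow(2).sum()
    var_gap = (h_src.var(0) - h_tgt.var(0)).pow(2).sum()
    return mean_gap + var_gap


# Joint objective (schematic): supervised loss on each stream plus the coupling penalty, e.g.
#   h_s, logits_s = source_stream(x_src)
#   h_t, logits_t = target_stream(x_tgt_labeled)
#   loss = ce(logits_s, y_src) + ce(logits_t, y_tgt) + lam * coupling_loss(h_s, h_t)
```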

Manual labeling of sufficient training data for diverse application domains is a costly and laborious task, and is often prohibitive. Therefore, designing models that can leverage rich labeled data in one domain and be applicable to a different but related domain is highly desirable. In particular, domain adaptation or transfer learning algorithms seek to generalize a model trained in a source domain to a new target domain. Recent years have witnessed increasing interest in these types of models due to their practical importance in real-life applications. In this paper, we provide a brief overview of recent techniques with both shallow and deep architectures for domain adaptation models.