PnP-AdaNet: plug-and-play adversarial domain adaptation network at unpaired cross-modality cardiac segmentation
Type
Working Paper
Abstract
Deep convolutional networks have demonstrated state-of-the-art performance on various challenging medical image processing tasks. Leveraging images from different modalities for the same analysis task holds large clinical benefits. However, the generalization capability of deep networks on test data sampled from a different distribution remains a major challenge. In this paper, we propose PnP-AdaNet (plug-and-play adversarial domain adaptation network) for adapting segmentation networks between different modalities of medical images, e.g., MRI and CT. We tackle the significant domain shift by aligning the feature spaces of the source and target domains at multiple scales in an unsupervised manner. With an adversarial loss, we learn a domain adaptation module that flexibly replaces the early encoder layers of the source network, while the higher layers are shared between the two domains. We validate our domain adaptation method on cardiac segmentation in unpaired MRI and CT, with four different anatomical structures. The average Dice reached 63.9%, a significant recovery from the complete failure (Dice score of 13.2%) observed when an MRI segmentation network is tested directly on CT data. In addition, our proposed PnP-AdaNet outperforms many state-of-the-art unsupervised domain adaptation approaches on the same dataset. Experimental results with comprehensive ablation studies demonstrate the excellent efficacy of our proposed method for unsupervised cross-modality domain adaptation. Our code is publicly available at: https://github.com/carrenD/Medical-Cross-Modality-Domain-Adaptation
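The plug-and-play idea described in the abstract can be illustrated structurally: a source segmenter's early encoder layers are swapped for an adversarially trained domain adaptation module (DAM) at test time, while the higher layers stay shared. The sketch below is a minimal toy illustration of that swap, not the authors' implementation; the dense layers, dimensions, and names (`dam_layers`, `forward`) are illustrative assumptions standing in for the paper's convolutional encoder.

```python
import numpy as np

rng = np.random.default_rng(0)

def layer(dim_in, dim_out):
    # Toy fully connected ReLU layer standing in for a conv block.
    W = rng.standard_normal((dim_in, dim_out)) * 0.1
    return lambda x: np.maximum(x @ W, 0.0)

# Source (MRI) segmenter: early encoder layers + shared higher layers.
mri_early_layers = [layer(32, 64), layer(64, 64)]
shared_higher_layers = [layer(64, 64), layer(64, 4)]  # 4 cardiac structures

# Domain adaptation module (DAM): same interface, would be trained
# adversarially on CT so its features align with the MRI feature space.
dam_layers = [layer(32, 64), layer(64, 64)]

def forward(x, early_layers):
    # Run the chosen early layers, then the shared higher layers.
    for f in early_layers + shared_higher_layers:
        x = f(x)
    return x

x_mri = rng.standard_normal((1, 32))
x_ct = rng.standard_normal((1, 32))

y_mri = forward(x_mri, mri_early_layers)  # source path
y_ct = forward(x_ct, dam_layers)          # target path: DAM plugged in
```

Because the DAM matches the input/output shapes of the layers it replaces, the shared higher layers and segmentation head are reused unchanged for the target modality.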
Date Issued
2019-01-01
Date Acceptance
2019-07-08
Citation
IEEE Access
ISSN
2169-3536
Publisher
Institute of Electrical and Electronics Engineers (IEEE)
Journal / Book Title
IEEE Access
Identifier
http://arxiv.org/abs/1812.07907v1
Subjects
cs.CV
Publication Status
Accepted