MaxStyle: adversarial style composition for robust medical image segmentation

File: 2206.01737v1.pdf, 2.54 MB, Adobe PDF (embargoed; request a copy)
Title: MaxStyle: adversarial style composition for robust medical image segmentation
Authors: Chen, C
Li, Z
Ouyang, C
Sinclair, M
Bai, W
Rueckert, D
Item Type: Conference Paper
Abstract: Convolutional neural networks (CNNs) have achieved remarkable segmentation accuracy on benchmark datasets where training and test sets are from the same domain, yet their performance can degrade significantly on unseen domains, which hinders the deployment of CNNs in many clinical scenarios. Most existing works improve model out-of-domain (OOD) robustness by collecting multi-domain datasets for training, which is expensive and may not always be feasible due to privacy and logistical issues. In this work, we focus on improving model robustness using a single-domain dataset only. We propose a novel data augmentation framework called MaxStyle, which maximizes the effectiveness of style augmentation for model OOD performance. It attaches an auxiliary style-augmented image decoder to a segmentation network for robust feature learning and data augmentation. Importantly, MaxStyle augments data with improved image style diversity and hardness by expanding the style space with noise and searching for the worst-case style composition of latent features via adversarial training. With extensive experiments on multiple public cardiac and prostate MR datasets, we demonstrate that MaxStyle leads to significantly improved out-of-distribution robustness against unseen corruptions as well as common distribution shifts across multiple unseen sites and unknown image sequences, under both low- and high-training-data settings. The code can be found at
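The core idea described in the abstract (mixing channel-wise style statistics of latent features, perturbing them with noise, and picking the composition that is hardest for the segmentation network) can be sketched as follows. This is a minimal, hedged illustration in NumPy, not the paper's implementation: the function names (`style_stats`, `compose_style`, `maxstyle_augment`) are invented here, and a random search over mixing coefficients stands in for the paper's gradient-based adversarial optimization.

```python
import numpy as np

def style_stats(feat, eps=1e-6):
    # Channel-wise mean and std over spatial dims; feat has shape (C, H, W).
    mu = feat.mean(axis=(1, 2), keepdims=True)
    sigma = feat.std(axis=(1, 2), keepdims=True) + eps
    return mu, sigma

def compose_style(feat, mu_mix, sigma_mix):
    # Normalize away the original style, then apply the mixed statistics.
    mu, sigma = style_stats(feat)
    return (feat - mu) / sigma * sigma_mix + mu_mix

def maxstyle_augment(feat, feat_ref, seg_loss, n_trials=8,
                     noise_scale=0.1, rng=None):
    """Search for the style composition that maximizes the segmentation
    loss. Here a simple random search over the interpolation coefficient
    replaces the adversarial (gradient-based) search of the paper."""
    rng = rng or np.random.default_rng(0)
    mu_a, sig_a = style_stats(feat)
    mu_b, sig_b = style_stats(feat_ref)
    worst, worst_loss = feat, -np.inf
    for _ in range(n_trials):
        lam = rng.uniform(0.0, 1.0)  # style interpolation coefficient
        # Expand the style space with additive noise on the statistics.
        mu_mix = lam * mu_a + (1 - lam) * mu_b \
            + noise_scale * rng.standard_normal(mu_a.shape)
        sig_mix = lam * sig_a + (1 - lam) * sig_b \
            + noise_scale * rng.standard_normal(sig_a.shape)
        cand = compose_style(feat, mu_mix, np.abs(sig_mix))
        loss = seg_loss(cand)
        if loss > worst_loss:  # keep the hardest style so far
            worst, worst_loss = cand, loss
    return worst
```

In the actual framework, `seg_loss` would evaluate the segmentation network on the decoded style-augmented image, and the worst-case statistics would be found by ascending the loss gradient rather than by sampling.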
Date of Acceptance: 5-May-2022
Copyright Statement: Copyright reserved
Conference Name: Medical Image Computing and Computer Assisted Intervention (MICCAI) 2022
Keywords: eess.IV
Notes: Early accepted by MICCAI 2022
Publication Status: Accepted
Start Date: 2022-09-18
Finish Date: 2022-09-22
Conference Place: Singapore
Embargo Date: This item is embargoed until publication
Appears in Collections: Computing
Department of Brain Sciences