Efficient neural network verification via layer-based semidefinite relaxations and linear cuts
File(s)
main.pdf (481.1 KB)
Accepted version
Author(s)
Batten, Ben
Kouvaros, Panagiotis
Lomuscio, Alessio
Zheng, Yang
Type
Conference Paper
Abstract
We introduce an efficient and tight layer-based semidefinite relaxation for verifying local robustness of neural networks. The improved tightness results from combining semidefinite relaxations with linear cuts. We obtain a computationally efficient method by decomposing the semidefinite formulation into layer-wise constraints. By leveraging chordal graph decompositions, we show that the formulation presented here is provably tighter than current approaches. Experiments on a set of benchmark networks show that the proposed approach enables the verification of more instances than other relaxation methods. The results also demonstrate that the proposed SDP relaxation is one order of magnitude faster than previous SDP methods.
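The abstract describes the approach only at a high level. As a rough, hedged illustration of the kind of formulation involved, the sketch below poses an SDP relaxation of a single ReLU layer tightened with standard linear "triangle" cuts, using cvxpy with the SCS solver. It is not the authors' exact layer-based formulation and omits their chordal decomposition; the layer sizes, weights, bounds, and the objective vector c are illustrative assumptions.

```python
# Minimal sketch (assumptions, not the paper's method): SDP relaxation of one
# ReLU layer y = ReLU(W x + b) with a box-bounded input, plus linear cuts.
import numpy as np
import cvxpy as cp

np.random.seed(0)
nx, ny = 2, 2                        # illustrative layer dimensions
W = np.random.randn(ny, nx)
b = np.random.randn(ny)
lx, ux = -np.ones(nx), np.ones(nx)   # input box bounds
# Pre-activation bounds from interval arithmetic
lz = W.clip(max=0) @ ux + W.clip(min=0) @ lx + b
uz = W.clip(min=0) @ ux + W.clip(max=0) @ lx + b

n = 1 + nx + ny                      # moment matrix over v = [1, x, y]
P = cp.Variable((n, n), symmetric=True)
one, x, y = 0, slice(1, 1 + nx), slice(1 + nx, n)

cons = [P >> 0, P[one, one] == 1]
# Relaxed quadratic ReLU encoding: y >= 0, y >= Wx + b, y ∘ (y - Wx - b) = 0
z_lin = W @ P[x, one] + b            # first-moment proxy for z = Wx + b
cons += [P[y, one] >= 0, P[y, one] >= z_lin]
for i in range(ny):
    cons += [P[1 + nx + i, 1 + nx + i]
             == W[i, :] @ P[x, 1 + nx + i] + b[i] * P[1 + nx + i, one]]
# Input box bounds as quadratic constraints on the diagonal
for j in range(nx):
    cons += [P[1 + j, 1 + j] <= (lx[j] + ux[j]) * P[1 + j, one] - lx[j] * ux[j]]
# Linear "triangle" cuts on unstable neurons (lz < 0 < uz)
for i in range(ny):
    if lz[i] < 0 < uz[i]:
        cons += [P[1 + nx + i, one]
                 <= uz[i] / (uz[i] - lz[i]) * (z_lin[i] - lz[i])]

# Toy robustness query: maximise c^T y; the property holds if the bound is < 0.
c = np.array([1.0, -1.0])
prob = cp.Problem(cp.Maximize(c @ P[y, one]), cons)
prob.solve(solver=cp.SCS)
print("upper bound on c^T y:", prob.value)
```

In a multi-layer setting, the paper's layer-based construction would instead impose one such semidefinite block per layer rather than a single monolithic matrix, which is what the chordal decomposition argument exploits; the single-layer block above is only meant to convey the shape of the constraints.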
Date Issued
2021-08-19
Date Acceptance
2021-04-29
Citation
Proceedings of the 30th International Joint Conference on Artificial Intelligence (IJCAI-21), 2021, pp.2184-2190
Publisher
IJCAI
Start Page
2184
End Page
2190
Journal / Book Title
Proceedings of the 30th International Joint Conference on Artificial Intelligence (IJCAI-21)
Copyright Statement
© 2021 The Author(s).
Identifier
https://www.ijcai.org/proceedings/2021/301
Source
International Joint Conference on Artificial Intelligence (IJCAI 2021)
Publication Status
Published
Start Date
2021-08-19
Finish Date
2021-08-26
Coverage Spatial
Montreal, Canada
Date Publish Online
2021-08-19