Algorithms for tensor network contraction ordering

File: Schindler_2020_Mach._Learn.__Sci._Technol._1_035001.pdf (published version, 2.51 MB, Adobe PDF)
Title: Algorithms for tensor network contraction ordering
Authors: Schindler, F
Jermyn, AS
Item Type: Journal Article
Abstract: Contracting tensor networks is often computationally demanding. Well-designed contraction sequences can dramatically reduce the contraction cost. We apply simulated annealing and genetic algorithms, two common discrete optimization techniques, to this ordering problem. We benchmark their performance, as well as that of the commonly used greedy search, on physically relevant tensor networks. Where computationally feasible, we also compare them with the optimal contraction sequence obtained by an exhaustive search. Furthermore, we present a systematic comparison with state-of-the-art tree decomposition and graph partitioning algorithms in the context of random regular graph tensor networks. We find that the algorithms we consider consistently outperform a greedy search given equal computational resources, with an advantage that scales with tensor network size. We compare the obtained contraction sequences and identify signs of highly non-local optimization, with the more sophisticated algorithms sacrificing run-time early in the contraction for better overall performance.
Issue Date: 1-Sep-2020
Date of Acceptance: 20-May-2020
URI: http://hdl.handle.net/10044/1/105854
DOI: 10.1088/2632-2153/ab94c5
ISSN: 2632-2153
Publisher: IOP Publishing
Start Page: 1
End Page: 13
Journal / Book Title: Machine Learning: Science and Technology
Volume: 1
Issue: 3
Copyright Statement: © 2020 The Author(s). Published by IOP Publishing Ltd. Original Content from this work may be used under the terms of the Creative Commons Attribution 4.0 licence.
Publication Status: Published
Open Access location: https://iopscience.iop.org/article/10.1088/2632-2153/ab94c5/pdf
Online Publication Date: 2020-07-02
Appears in Collections: Condensed Matter Theory; Physics
