JuliaMatrices/LowRankApprox.jl: v0.2.3
Title: | JuliaMatrices/LowRankApprox.jl: v0.2.3 |
Authors: | Olver, S; Ho, K; Kelman, T; Slevinsky, M |
Item Type: | Software / Code |
Abstract: | This Julia package provides fast low-rank approximation algorithms for BLAS/LAPACK-compatible matrices based on some of the latest technology in adaptive randomized matrix sketching. Currently implemented algorithms include:
sketch methods:
  random Gaussian
  random subset
  subsampled random Fourier transform
  sparse random Gaussian
partial range finder
partial factorizations:
  QR decomposition
  interpolative decomposition
  singular value decomposition
  Hermitian eigendecomposition
  CUR decomposition
spectral norm estimation
By "partial", we mean essentially that these algorithms are early-terminating, i.e., they are not simply post-truncated versions of their standard counterparts. There is also support for "matrix-free" linear operators described only through their action on vectors. All methods accept a number of options specifying, e.g., the rank, estimated absolute precision, and estimated relative precision of approximation.
Our implementation borrows heavily from the perspective espoused by N. Halko, P. G. Martinsson, and J. A. Tropp, "Finding structure with randomness: Probabilistic algorithms for constructing approximate matrix decompositions", SIAM Rev. 53 (2): 217-288, 2011, except that we choose the interpolative decomposition (ID) as our basic form of approximation instead of matrix range projection. The reason is that the latter requires an expensive matrix-matrix multiplication to contract to the relevant subspace, while the former can sometimes be computed much faster, depending on the accelerated sketching strategy employed.
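To make the trade-off concrete, the following package-independent sketch (plain LinearAlgebra, not the package's internal algorithm) contrasts range projection, which needs the full Q' * A contraction, with an ID built from a column-pivoted QR, which reuses columns of A itself:

```julia
using LinearAlgebra

A = randn(1000, 60) * randn(60, 800)     # numerically low-rank test matrix
k = 60

# Range projection: build an orthonormal basis Q for the approximate column space
# from a random sketch, then contract with the full matrix-matrix product Q' * A.
Q = Matrix(qr(A * randn(size(A, 2), k)).Q)
B = Q * (Q' * A)                         # rank-k approximation; needs the big multiply

# Interpolative decomposition: pick k "skeleton" columns of A via column-pivoted QR
# and express the remaining columns in their basis; A itself is never contracted.
F  = qr(A, ColumnNorm())                 # on Julia < 1.7 use qr(A, Val(true))
sk = F.p[1:k]                            # skeleton (kept) column indices
rd = F.p[k+1:end]                        # redundant column indices
T  = F.R[1:k, 1:k] \ F.R[1:k, k+1:end]   # interpolation matrix: A[:, rd] ≈ A[:, sk] * T

@show norm(A - B)                        # range-projection error (Frobenius)
@show norm(A[:, rd] - A[:, sk] * T)      # ID reconstruction error (Frobenius)
```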
This package has been developed with performance in mind, and early tests have shown large speedups over similar codes written in MATLAB and Python (and even some in Fortran and C). For example, computing an ID of a Hilbert matrix of order 1024 to relative precision ~1e-15 takes:
~0.02 s using LowRankApprox in Julia
~0.07 s using SciPy in Python (calling a Fortran backend; see PyMatrixID)
~0.3 s in MATLAB
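A timing of this kind can be reproduced in outline as below; the rtol keyword follows the package README for this version and is an assumption here, and absolute numbers will of course depend on the machine and BLAS.

```julia
using LowRankApprox

n = 1024
A = [1 / (i + j - 1) for i in 1:n, j in 1:n]   # Hilbert matrix of order 1024

# The rtol keyword name is an assumption based on the v0.2.x README.
idfact(A, rtol = 1e-15)        # first call includes JIT compilation; discard
@time idfact(A, rtol = 1e-15)  # subsequent calls are the ones to compare
```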
This difference can be attributed both to algorithmic improvements and to some low-level optimizations. |
Content Version: | v0.2.3 |
Issue Date: | 25-Apr-2019 |
URI: | http://hdl.handle.net/10044/1/69369 |
DOI: | https://doi.org/10.5281/zenodo.1254147 |
Copyright Statement: | https://github.com/JuliaMatrices/LowRankApprox.jl/blob/v0.2.3/LICENSE.md |
Keywords: | JuliaMatrices/LowRankApprox.jl |
Appears in Collections: | Faculty of Natural Sciences - Research Data |