Universal probabilistic generative models for non-Gaussian signals
File | Description | Size | Format
---|---|---|---
Li-S-2021-PhD-Thesis.pdf | Thesis | 27.48 MB | Adobe PDF
Title: | Universal probabilistic generative models for non-Gaussian signals |
Authors: | Li, Shengxi |
Item Type: | Thesis or dissertation |
Abstract: | We live in a rapidly evolving era of Big Data, with modern rich-content signals that exhibit large variance, complicated structures and large data volumes; this all calls for robust, explainable and expressive generative models. Probabilistic generative models aim to learn signal behaviours and to provide a human-level understanding in an unsupervised way, pursuing the ultimate goal of artificial intelligence. Yet, existing models are mostly based on the Gaussian distribution which, despite its mathematical tractability, is limited in its robustness and in its flexibility to accommodate the typically unbalanced and heavy-tailed nature of modern signals. In this thesis, we address these issues through the class of elliptical distributions, and propose novel and efficient generative models with guaranteed robustness and enhanced expressibility. Deep neural networks are also addressed in this context to further improve the model expressive power, whereby we introduce a novel structure that seamlessly combines an auto-encoder and a generative adversarial net (GAN). This makes it possible to achieve semantically meaningful learning, thus ensuring model explainability. To suit the needs of machine intelligence for real-world data, we first generalise the existing symmetric elliptical distribution to accommodate asymmetric signals, commonplace in practice, by employing a directional von Mises-Fisher (vMF) distribution and introducing a skewed elliptical distribution. In such a generative model, the generating process is clear, physically meaningful and generic, allowing for explicit probability density functions (pdfs) to be obtained. More importantly, the proposed generalisation is proved to share all the desirable properties of the elliptical distribution, such as easy manipulation and practical implementation.
We further provide a stable method to estimate the proposed model, and its superior performance is verified both theoretically and through experimental results. Next, we address the paradigm of learning from multi-modal signals by proposing the general elliptical mixture model (EMM), which is shown to be sufficiently expressive to fit arbitrary real-valued distributions. Another important property of the EMM is its robustness, which is quantified through influence functions as theoretical bounds. However, existing EMMs are limited to several specific types of elliptical distributions. We therefore propose a unifying framework for computable and identifiable EMMs. To solve EMMs universally and efficiently, we establish a statistical manifold of elliptical distributions; by identifying the mismatch in their statistical manifolds, a reformulation trick and a redesigned cost are proposed, so as to allow for an equivalent statistical manifold and to achieve extremely fast convergence. To overcome the problem of spurious convergence to local optima when learning EMMs, we further propose a relaxed Wasserstein distance for significantly enhanced discrepancy measurement between probabilistic models; the proposed distance is proved to be an upper bound of the Wasserstein distance and, more importantly, admits both explicit statistical manifold metrics and the corresponding operations. For the universal manifold solver, we propose an adaptive stochastic gradient descent method to accelerate convergence. The flexible EMMs, with robustness quantified in this way and estimated through efficient and universal solvers, are shown to almost reach the global optimum. The focus then shifts to employing deep neural nets in generative models, to further enhance model expressibility for complicated signals.
In particular, we propose to use characteristic functions (CFs), a powerful tool that contains all the information about any general distribution, to evaluate distributions of signals with high dimensions and complicated structures. In the analysis, we first prove the theoretical completeness of this evaluation and show that, benefiting from the inherently complex-valued CF, its phase and amplitude parts possess physical meanings. More importantly, we introduce a reciprocal theory to make the CFs complete when comparing two signals transformed through deep neural networks. The proposed reciprocal CF generative adversarial net (RCF-GAN) therefore seamlessly combines the auto-encoder and the GAN, and learns a semantic low-dimensional manifold for both generation and reconstruction, with enhanced model explainability. We further propose an optimal sampling strategy under the umbrella of flexible elliptical distributions, together with an anchor design to improve the convergence speed. Experimental results show that the proposed RCF-GAN achieves state-of-the-art generation and reconstruction on images, whilst enjoying a simple yet stable training procedure. Finally, for data observed on irregular domains, the graph link prediction problem is considered as a probabilistic generative model, making it possible to learn graphs via statistics from graph signals. We show that most of the proposed generative models can be applied to this task, and further propose a graph RCF-GAN (GRCF-GAN), which learns meaningful representations of graph nodes and achieves state-of-the-art performance. |
Content Version: | Open Access |
Issue Date: | Jun-2021 |
Date Awarded: | Aug-2021 |
URI: | http://hdl.handle.net/10044/1/91942 |
DOI: | https://doi.org/10.25560/91942 |
Copyright Statement: | Creative Commons Attribution NonCommercial NoDerivatives Licence |
Supervisor: | Mandic, Danilo |
Sponsor/Funder: | Lee Family Scholarship |
Funder's Grant Number: | -- |
Department: | Electrical and Electronic Engineering |
Publisher: | Imperial College London |
Qualification Level: | Doctoral |
Qualification Name: | Doctor of Philosophy (PhD) |
Appears in Collections: | Electrical and Electronic Engineering PhD theses |
This item is licensed under a Creative Commons License
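As a minimal illustration of the CF-based idea described in the abstract (a characteristic function contains all the information about a distribution, so two sample sets can be compared through their empirical CFs), the following Python sketch estimates a simple CF discrepancy. The function names (`empirical_cf`, `cf_distance`) and the Gaussian frequency-sampling scheme are illustrative assumptions for this sketch, not the thesis's actual RCF-GAN loss.

```python
import numpy as np

def empirical_cf(samples, freqs):
    # Empirical characteristic function phi(t) = E[exp(i <t, x>)],
    # estimated by the sample mean over the rows of `samples`.
    # samples: (n, d) array, freqs: (m, d) array -> (m,) complex array
    return np.exp(1j * samples @ freqs.T).mean(axis=0)

def cf_distance(x, y, num_freqs=256, scale=1.0, seed=0):
    # Monte-Carlo CF discrepancy: average squared modulus of the
    # difference between the two empirical CFs, evaluated at
    # frequencies drawn from a Gaussian weighting distribution
    # (an assumed choice; it emphasises low frequencies).
    rng = np.random.default_rng(seed)
    t = rng.normal(scale=scale, size=(num_freqs, x.shape[1]))
    return float(np.mean(np.abs(empirical_cf(x, t) - empirical_cf(y, t)) ** 2))
```

Because the empirical CF is a bounded average, this discrepancy is stable even for heavy-tailed samples where moment-based statistics diverge; two draws from the same distribution yield a value near zero, while a shifted distribution is flagged by the phase of the CF.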