October 23, 2018
Title: Methods in Homology Inference
Ph.D. Candidate: Nicholas Cavanna
Major Advisor: Dr. Donald Sheehy
Associate Advisors: Dr. Sridhar Duggirala, Dr. Thomas Peters, Dr. Alexander Russell
Day/Time: Tuesday, October 23rd, 2018 at 1:00 pm
Location: HBL Class of 1947 Conference Room
Abstract: High-dimensional data analysis techniques are increasingly needed in academic and industrial settings such as statistics, machine learning, the biological sciences, and engineering. Such data can often be viewed as a geometric object in Euclidean space, and topological data analysis has arisen as an approach that uses this inherent geometry to categorize and interpret data by examining algebraic objects associated to its geometric representations. The most common algebraic structure used is homology, an efficiently computable topological invariant that, informally, counts the holes in a space. Generally speaking, homology inference is the computation of the homology of an underlying space from a point set sampled from it. In this proposal we focus on approaches to homology inference that make novel assumptions about both the domain and the input.
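To make the notion of homology concrete for readers outside the area: the Betti numbers (ranks of the homology groups over a field) can be computed from the ranks of boundary matrices of a simplicial complex. The following is a minimal, illustrative sketch over GF(2), not code from the proposal; the function names are my own.

```python
from itertools import combinations

def gf2_rank(matrix):
    """Rank of a 0/1 matrix over GF(2), by Gaussian elimination."""
    rows = [row[:] for row in matrix]
    rank = 0
    for col in range(len(rows[0]) if rows else 0):
        pivot = next((r for r in range(rank, len(rows)) if rows[r][col]), None)
        if pivot is None:
            continue
        rows[rank], rows[pivot] = rows[pivot], rows[rank]
        for r in range(len(rows)):
            if r != rank and rows[r][col]:
                rows[r] = [a ^ b for a, b in zip(rows[r], rows[rank])]
        rank += 1
    return rank

def boundary_matrix(faces, simplices):
    """Rows: (d-1)-faces; columns: d-simplices; entry 1 iff face of."""
    index = {f: i for i, f in enumerate(faces)}
    mat = [[0] * len(simplices) for _ in faces]
    for j, s in enumerate(simplices):
        for f in combinations(s, len(s) - 1):
            mat[index[f]][j] = 1
    return mat

def betti_numbers(complex_, max_dim):
    """Betti numbers of a complex given as sorted vertex tuples."""
    dims = {d: sorted(s for s in complex_ if len(s) == d + 1)
            for d in range(max_dim + 2)}
    ranks = {0: 0}  # the 0th boundary map is zero
    for d in range(1, max_dim + 2):
        ranks[d] = gf2_rank(boundary_matrix(dims[d - 1], dims[d])) if dims[d] else 0
    return [len(dims[d]) - ranks[d] - ranks[d + 1] for d in range(max_dim + 1)]

# A hollow triangle has the homology of a circle: one component, one loop.
circle = [(0,), (1,), (2,), (0, 1), (0, 2), (1, 2)]
print(betti_numbers(circle, 1))  # -> [1, 1]
```

Filling in the triangle (adding the 2-simplex `(0, 1, 2)`) kills the loop, giving Betti numbers [1, 0].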
We prove that, in Euclidean space, adaptive sampling under the Euclidean metric is nearly the converse of uniform sampling with respect to a path-based adaptive metric borrowed from robot motion planning. This allows us to construct a series of interleavings that ultimately yields a homology inference method for the sampled domain, computed from the homology of Euclidean balls whose radii are proportional to a first-order approximation of the adaptive metric. Furthermore, by applying the Nerve Theorem, we may instead compute the homology of the space under examination from the nerves of these approximate metric balls.
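The nerve of a cover records which cover elements have a common intersection; the Nerve Theorem says that for a good cover the nerve captures the topology of the union. As a toy illustration (my own example, not from the proposal), for a cover of a segment by closed intervals the nerve is easy to compute exactly, since intervals intersect iff the largest left endpoint is at most the smallest right endpoint:

```python
from itertools import combinations

def nerve(intervals, max_dim=2):
    """Nerve of a cover by closed intervals: a subset of cover elements
    spans a simplex iff the intervals share a common point."""
    simplices = []
    for k in range(1, max_dim + 2):
        for idx in combinations(range(len(intervals)), k):
            lo = max(intervals[i][0] for i in idx)
            hi = min(intervals[i][1] for i in idx)
            if lo <= hi:  # nonempty common intersection
                simplices.append(idx)
    return simplices

cover = [(0, 2), (1, 3), (2, 4)]  # three intervals covering [0, 4]
print(nerve(cover))
```

Here every subset of the cover intersects, so the nerve is a full triangle: contractible, matching the fact that the union [0, 4] is contractible.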
We also prove that one can use clique complexes to check for k-coverage of a subset of a domain by a collection of metric balls centered at sensors. In the process, we generalize seminal work by De Silva & Ghrist on homological sensor networks to the case of k-coverage, while streamlining their original proof through a novel use of a relative version of Alexander Duality. We then prove that, given such coverage by the sample, a subsampling scheme can be used to infer the homology of the original domain under some minimal regularity assumptions on the domain and its boundary.
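For readers unfamiliar with the objects involved: the clique (Vietoris-Rips) complex of a sensor network has a simplex for every clique in the graph connecting sensors whose coverage balls overlap, and k-coverage of a point means it lies in at least k balls. The sketch below only constructs these objects by brute force; it is my own illustration and is not the coverage criterion proved in the proposal.

```python
from itertools import combinations
from math import dist

def rips_clique_complex(points, radius, max_dim=2):
    """Clique complex of the graph joining sensors whose coverage
    balls of the given radius overlap (distance <= 2 * radius)."""
    n = len(points)
    edge = lambda i, j: dist(points[i], points[j]) <= 2 * radius
    simplices = [(i,) for i in range(n)]
    for k in range(2, max_dim + 2):
        simplices += [c for c in combinations(range(n), k)
                      if all(edge(i, j) for i, j in combinations(c, 2))]
    return simplices

def is_k_covered(point, sensors, radius, k):
    """Brute-force check: does `point` lie in at least k sensor balls?"""
    return sum(dist(point, s) <= radius for s in sensors) >= k

sensors = [(0.0, 0.0), (1.0, 0.0), (0.5, 0.8)]
print(rips_clique_complex(sensors, radius=0.6))   # all balls pairwise overlap
print(is_k_covered((0.5, 0.3), sensors, radius=0.7, k=2))  # -> True
```

The point of the homological approach is precisely to certify coverage from the clique complex alone, i.e., from pairwise communication data, without the geometric brute-force check used here.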
We culminate the proposal with a result that relaxes the traditional cover assumption required for a nerve to preserve the topology of the underlying space to a parametrized assumption on the cover, called the ε-good cover condition. Using such a cover, we prove a tight bottleneck distance bound between the so-called persistence diagrams of the nerve filtration and of the underlying space filtration, yielding an approximate parametrized homology inference method and broadening the applicability of nerves to situations where one lacks precise sampling guarantees.
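The bottleneck distance mentioned above is the standard metric on persistence diagrams: the smallest δ such that the points of the two diagrams can be matched (with points allowed to match to the diagonal) so that every matched pair is within δ in the sup-norm. For tiny diagrams it can be computed by brute force over matchings; the following is an illustrative, exponential-time sketch of that textbook definition, not an algorithm from the proposal.

```python
from itertools import permutations

def bottleneck(diag1, diag2):
    """Brute-force bottleneck distance between two small persistence
    diagrams (lists of (birth, death) pairs).  Each diagram is padded
    with the diagonal projections of the other's points, so matching a
    point to the diagonal is allowed; exponential in diagram size."""
    proj = lambda p: ((p[0] + p[1]) / 2,) * 2  # nearest diagonal point
    a = [(p, False) for p in diag1] + [(proj(q), True) for q in diag2]
    b = [(q, False) for q in diag2] + [(proj(p), True) for p in diag1]

    def cost(x, y):
        (p, p_diag), (q, q_diag) = x, y
        if p_diag and q_diag:   # diagonal matched to diagonal: free
            return 0.0
        return max(abs(p[0] - q[0]), abs(p[1] - q[1]))  # sup-norm

    return min(max(cost(x, y) for x, y in zip(a, perm))
               for perm in permutations(b))

d1 = [(0.0, 1.0)]
d2 = [(0.0, 1.1), (0.5, 0.6)]
print(bottleneck(d1, d2))  # close to 0.1
```

The short-lived feature (0.5, 0.6) is absorbed by the diagonal at cost 0.05, so the distance is governed by matching (0.0, 1.0) to (0.0, 1.1).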