MADDD seminar 2021-11-03: David Glickenstein: Discrete conformal geometry and adversarial examples of neural networks
From Matthias Koeppe
Abstract: Discrete differential geometry is a field that uses the ideas and techniques of differential geometry to study discretely parametrized objects such as point clouds and polyhedral surfaces representing brains, protein molecules, and other 3D geometric shapes. We will explore ideas for discrete versions of conformal geometry that originate in Thurston's work on a circle-packing version of the Riemann Mapping Theorem. The connection with other discretized objects, such as the finite volume Laplacian, leads to an axiomatization and classification of all possible discrete conformal geometries. In the second part of the talk, we will explore adversarial examples of neural network classifiers. Deep neural networks have been wildly successful at solving many complicated problems such as image classification. However, neural network classifiers have been found to be vulnerable to adversarial examples: for many natural images there exists a very small perturbation that causes the classifier to assign a different label. We will look at some techniques and definitions developed to understand why adversarial examples exist.
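To make the notion of an adversarial example concrete, here is a minimal sketch of the fast gradient sign method (FGSM) idea on a toy logistic-regression classifier. All weights, inputs, and the perturbation budget below are illustrative assumptions for this sketch, not material from the talk, which concerns deep neural network classifiers.

```python
import math

# Hypothetical trained weights and bias for a toy logistic-regression model
# (illustrative assumptions, not from the talk).
W = [1.0, -2.0, 0.5]
B = 0.1

def score(x):
    """Linear score W.x + B of the logistic model."""
    return sum(wi * xi for wi, xi in zip(W, x)) + B

def predict(x):
    """Class 1 iff sigmoid(score) > 0.5, i.e. iff score > 0."""
    return 1 if score(x) > 0 else 0

def fgsm(x, y, eps):
    """Perturb x by eps in the sign of the loss gradient w.r.t. x.

    For logistic loss, the gradient w.r.t. x is (sigmoid(score) - y) * W,
    so FGSM takes a step of size eps (in the sup-norm) along its sign.
    """
    p = 1 / (1 + math.exp(-score(x)))
    return [xi + eps * math.copysign(1.0, (p - y) * wi)
            for xi, wi in zip(x, W)]

x = [0.2, 0.3, -0.1]         # a "natural" input
y = predict(x)               # take the model's own label as ground truth
x_adv = fgsm(x, y, eps=0.5)  # a small sup-norm perturbation
print(predict(x), predict(x_adv))  # the label flips: 0 1
```

Each coordinate of `x_adv` differs from `x` by at most `eps`, yet the predicted class changes, which is exactly the phenomenon the abstract describes for image classifiers.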