Principled Interpolation in Normalizing Flows

Publication: Contributions in collected editions · Conference proceedings papers · Research · peer-reviewed

Authors

Generative models based on normalizing flows are very successful at modeling complex data distributions with simpler ones. However, straightforward linear interpolations show unexpected side effects, as interpolation paths lie outside the region where samples are observed. This is caused by the standard choice of Gaussian base distributions and can be seen in the norms of the interpolated samples, which lie off the data manifold. This observation suggests that changing the interpolation scheme should generally result in better interpolations, but it is not clear how to do so unambiguously. In this paper, we solve this issue by enforcing a specific manifold and, hence, changing the base distribution, to allow for a principled way of interpolation. Specifically, we use the Dirichlet and von Mises–Fisher base distributions on the probability simplex and the hypersphere, respectively. Our experimental results show superior performance in terms of bits per dimension, Fréchet Inception Distance (FID), and Kernel Inception Distance (KID) scores for interpolation, while maintaining the generative performance.
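The norm effect described in the abstract can be reproduced in a few lines. The sketch below is illustrative only (the dimension `d` and the use of spherical interpolation are assumptions for the demo, not the paper's method): high-dimensional Gaussian samples concentrate on the shell of radius √d, so the linear midpoint of two samples falls well inside that shell, whereas a path constrained to the unit hypersphere — the support of a von Mises–Fisher base distribution — keeps every interpolant on the manifold.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 3072  # illustrative dimension, e.g. a 32x32x3 image latent (assumption)
z0 = rng.standard_normal(d)
z1 = rng.standard_normal(d)

# Gaussian mass concentrates on the shell of radius sqrt(d); the linear
# midpoint of two (nearly orthogonal) samples has norm about sqrt(d/2),
# i.e. it leaves the region where samples are observed.
mid = 0.5 * (z0 + z1)
print(np.linalg.norm(z0) / np.sqrt(d))   # close to 1
print(np.linalg.norm(mid) / np.sqrt(d))  # close to 1/sqrt(2), ~0.71

def slerp(a, b, t):
    """Spherical linear interpolation between unit vectors a and b."""
    omega = np.arccos(np.clip(a @ b, -1.0, 1.0))
    return (np.sin((1 - t) * omega) * a + np.sin(t * omega) * b) / np.sin(omega)

# Constraining endpoints and path to the unit hypersphere keeps every
# interpolant exactly on the manifold: all norms stay at 1.
u0, u1 = z0 / np.linalg.norm(z0), z1 / np.linalg.norm(z1)
for t in (0.0, 0.25, 0.5, 0.75, 1.0):
    print(np.linalg.norm(slerp(u0, u1, t)))  # each ~ 1.0
```

Spherical interpolation is shown here only as the geometric counterpart to the norm problem; the paper's contribution is to change the base distribution itself so that interpolation is principled by construction.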
Original language: English
Title: Machine Learning and Knowledge Discovery in Databases. Research Track: European Conference, ECML PKDD 2021, Bilbao, Spain, September 13–17, 2021, Proceedings, Part II
Editors: Nuria Oliver, Fernando Pérez-Cruz, Stefan Kramer, Jesse Read, Jose A. Lozano
Number of pages: 16
Place of publication: Cham
Publisher: Springer Nature AG
Publication date: 09.2021
Pages: 116–131
ISBN (print): 978-3-030-86519-1
ISBN (electronic): 978-3-030-86520-7
Publication status: Published - 09.2021
Event: European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases - ECML PKDD 2021 - Virtual, Online
Duration: 13.09.2021–17.09.2021