- Invertible recurrent inference machines (Putzky and Welling, 2019) (generic example)
- Generative models with maximum likelihood via the change-of-variable formula (example)
- Glow: Generative flow with invertible 1x1 convolutions (Kingma and Dhariwal, 2018) (generic example, source)
GPU support
GPU acceleration is provided via Flux/CuArray. To run on the GPU, move both the input and the network layers to the GPU via `|> gpu`:
```julia
using InvertibleNetworks, Flux

# Input dimensions
nx = 64
ny = 64
k = 10
batchsize = 4

# Input image: nx x ny x k x batchsize
X = randn(Float32, nx, ny, k, batchsize) |> gpu

# Activation normalization layer
AN = ActNorm(k; logdet=true) |> gpu

# Test invertibility: apply the forward pass, then recover the input
Y_, logdet = AN.forward(X)
X_ = AN.inverse(Y_)
```
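On a machine without a GPU, the same layers run on the CPU by simply omitting the `|> gpu` moves. A minimal sketch of numerically checking invertibility on the CPU (the tolerance `rtol=1f-5` is an illustrative choice, not a value prescribed by the package):

```julia
using InvertibleNetworks

# CPU version of the example above
X = randn(Float32, 64, 64, 10, 4)
AN = ActNorm(10; logdet=true)

# Forward pass returns the output and the log-determinant term
Y, logdet = AN.forward(X)

# Applying the inverse should recover the input up to floating-point error
X_ = AN.inverse(Y)
@assert isapprox(X, X_; rtol=1f-5)
```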
Yann Dauphin, Angela Fan, Michael Auli and David Grangier, "Language modeling with gated convolutional networks", Proceedings of the 34th International Conference on Machine Learning, 2017. https://arxiv.org/pdf/1612.08083.pdf
Laurent Dinh, Jascha Sohl-Dickstein and Samy Bengio, "Density estimation using Real NVP", International Conference on Learning Representations, 2017. https://arxiv.org/abs/1605.08803
Diederik P. Kingma and Prafulla Dhariwal, "Glow: Generative Flow with Invertible 1x1 Convolutions", Conference on Neural Information Processing Systems, 2018. https://arxiv.org/abs/1807.03039
Keegan Lensink, Eldad Haber and Bas Peters, "Fully Hyperbolic Convolutional Neural Networks", arXiv Computer Vision and Pattern Recognition, 2019. https://arxiv.org/abs/1905.10484
Patrick Putzky and Max Welling, "Invert to learn to invert", Advances in Neural Information Processing Systems, 2019. https://arxiv.org/abs/1911.10914
Jakob Kruse, Gianluca Detommaso, Robert Scheichl and Ullrich Köthe, "HINT: Hierarchical Invertible Neural Transport for Density Estimation and Bayesian Inference", arXiv Statistics and Machine Learning, 2020. https://arxiv.org/abs/1905.10687
Authors
Philipp Witte, Georgia Institute of Technology (now Microsoft)
Gabrio Rizzuti, Utrecht University
Mathias Louboutin, Georgia Institute of Technology