Motivation:
How can we train generative models with partial, noisy observations?
Why do we care?
In many settings, it is expensive or even impossible to obtain fully-observed samples, but economical to obtain partial, noisy samples.
This paper proposes:
- AmbientGAN: train the discriminator not on the raw data domain but on the measurement domain (see the sketch after this list)
- A way to train generative models on noisy, corrupted, or partially missing data, without any clean images
- A proof that it is theoretically possible to recover the true data distribution even when the measurement process is not invertible
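The first bullet is the core trick: the generator still outputs full images, but a simulated measurement is applied to them before the discriminator sees anything, so the discriminator only ever compares measurement-domain samples. Below is a minimal PyTorch-style sketch of one training step under that setup; `G`, `D`, `measure`, and `z_dim` are my own illustrative names (a concrete `measure` is sketched under "Measurements?" below), not the authors' code.

```python
import torch
import torch.nn.functional as F

def ambient_gan_step(G, D, opt_G, opt_D, y_real, measure, z_dim=100):
    # y_real: a batch of real samples ALREADY in the measurement domain.
    # measure: simulates the known measurement process f_theta (differentiable).
    n = y_real.size(0)
    real_lbl = torch.ones(n, 1, device=y_real.device)
    fake_lbl = torch.zeros(n, 1, device=y_real.device)

    # Discriminator step: real measurements vs. measurements of generated images
    z = torch.randn(n, z_dim, device=y_real.device)
    y_fake = measure(G(z)).detach()
    d_loss = (F.binary_cross_entropy_with_logits(D(y_real), real_lbl)
              + F.binary_cross_entropy_with_logits(D(y_fake), fake_lbl))
    opt_D.zero_grad()
    d_loss.backward()
    opt_D.step()

    # Generator step: fool D *through* the differentiable measurement f_theta
    z = torch.randn(n, z_dim, device=y_real.device)
    g_loss = F.binary_cross_entropy_with_logits(D(measure(G(z))), real_lbl)
    opt_G.zero_grad()
    g_loss.backward()
    opt_G.step()
    return d_loss.item(), g_loss.item()
```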
MODEL ARCHITECTURE
Generative Adversarial Networks
Limitation of GANs
- Require good (or fully-observed) training samples
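For reference, the standard GAN objective (Goodfellow et al., 2014) takes its real-data expectation over fully-observed samples x ~ p_x^r, which is exactly where clean training data is needed:

\[
\min_G \max_D \;\; \mathbb{E}_{x \sim p_x^r}\big[\log D(x)\big] \;+\; \mathbb{E}_{z \sim p_z}\big[\log\big(1 - D(G(z))\big)\big]
\]

AmbientGAN keeps the same game but moves the discriminator to the measurement domain (written here with the vanilla GAN loss; the paper allows other loss functions), comparing real measurements against simulated measurements of generated images:

\[
\min_G \max_D \;\; \mathbb{E}_{y \sim p_y^r}\big[\log D(y)\big] \;+\; \mathbb{E}_{z \sim p_z,\; \theta \sim p_\theta}\big[\log\big(1 - D(f_\theta(G(z)))\big)\big]
\]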
Related work:
- Compressed sensing addresses this problem by exploiting models of the data structure, such as sparsity
- Bora et al., ICML 2017, “Compressed Sensing using Generative Models” (recovery objective sketched after this list)
- Compressed sensing is a very promising area and can give impressive results (worth a deeper look)
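Roughly, Bora et al. replace the sparsity prior with a pretrained generator G: given compressed measurements y ≈ Ax, they search the latent space for a code whose image matches the measurements (notation mine; the paper also adds a regularizer on z):

\[
\hat{z} \;=\; \arg\min_{z}\; \lVert A\,G(z) - y \rVert_2^2, \qquad \hat{x} = G(\hat{z})
\]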
Chicken and Egg
- Bora et al. showed that the recovery problem can be solved with a small number of measurements by using generative models as priors
- But what if it is not even possible to gather good data in the first place?
- How can we collect enough data to train a generative model to start with?
Measurements?
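The paper studies several measurement models f_θ, e.g. Block-Pixels (each pixel independently zeroed with probability p), Convolve+Noise, Block-Patch, Keep-Patch, and Pad-Rotate-Project. A minimal sketch of two of them, with parameter names of my choosing:

```python
import torch
import torch.nn.functional as F

def block_pixels(x, p=0.5):
    # Block-Pixels: independently zero out each pixel location with probability p
    # (the same mask is applied across channels). x: (N, C, H, W) image batch.
    mask = (torch.rand(x.size(0), 1, x.size(2), x.size(3), device=x.device) > p).float()
    return x * mask

def convolve_noise(x, kernel, noise_std=0.1):
    # Convolve+Noise: blur each channel with a known kernel, then add Gaussian noise.
    # kernel: (k, k) tensor with odd k, shared across channels (depthwise convolution).
    c = x.size(1)
    weight = kernel.expand(c, 1, *kernel.shape).contiguous()
    y = F.conv2d(x, weight, padding=kernel.size(-1) // 2, groups=c)
    return y + noise_std * torch.randn_like(y)
```

Any of these can be plugged in as the `measure` function in the training-step sketch above.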
Results
Conclusion
- It is possible to train the generator without fully-observed data
- In theory, the true distribution can be recovered by training the generator when the measurement process is known and differentiable and the induced mapping between data and measurement distributions is invertible, even if individual measurements are not (stated more precisely below)
- Empirically, a good data distribution can be recovered even when the measurement process is not precisely known
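Informally, the theoretical argument (my paraphrase of the paper's lemma): at the optimum the generated and real measurement distributions match, and if the measurement process determines the data distribution uniquely, then matching in the measurement domain forces matching in the data domain:

\[
p_y^g = p_y^r \;\;\text{and}\;\; \big(p_x \mapsto p_y \text{ induced by } f_\theta \text{ is injective}\big) \;\Longrightarrow\; p_x^g = p_x^r
\]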
Possible Applications?
- OCR
- Gayoung’s webtoon data
- Adding reconstruction loss and cyclic loss
- Learnable f(·) via fully-connected (FC) layers
- etc.
Questions to consider:
- CycleGAN vs. AmbientGAN?
Source: https://www.slideshare.net/thinkingfactory/introduction-to-ambient-gan