The Theory of Wasserstein GANs

Introduction Consider two distributions $P_0$ and $P_{\theta}$, as shown in the figure. $P_0$ places unit mass at the origin and is zero elsewhere; similarly, $P_{\theta}$ is non-zero only at $\theta$. This is the classic example of two parallel lines. Let’s check how different... [Read More]
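
A quick preview of the comparison the post works out, following the standard parallel-lines example from the WGAN paper; the values below assume $\theta \neq 0$, so the two supports are disjoint:

```latex
% Divergences between P_0 and P_theta when their supports are disjoint
\begin{align*}
  KL(P_0 \parallel P_\theta) &= +\infty  \\  % no absolute continuity
  JS(P_0, P_\theta)          &= \log 2   \\  % constant, no useful gradient
  W(P_0, P_\theta)           &= |\theta|     % smooth in theta
\end{align*}
```

Only the Wasserstein distance varies smoothly with $\theta$, which is the core motivation for WGANs.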

Why Does Batch Normalization Work?

Introduction Batch normalization is now used by default in many deep learning models, and it improves the training of networks. It is claimed that it does so by handling internal covariate shift (ICS). ICS refers to the phenomenon of changes in the distribution of a... [Read More]
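
A minimal NumPy sketch of the batch-normalization transform itself (training-time batch statistics only; the function name and tensor shapes are illustrative assumptions, not from the post):

```python
import numpy as np

def batch_norm(x, gamma, beta, eps=1e-5):
    """Normalize a mini-batch per feature, then scale and shift.

    x: (batch, features); gamma, beta: learnable (features,) parameters.
    """
    mu = x.mean(axis=0)                     # per-feature batch mean
    var = x.var(axis=0)                     # per-feature batch variance
    x_hat = (x - mu) / np.sqrt(var + eps)   # zero mean, unit variance
    return gamma * x_hat + beta             # restore representational power
```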

The Theory of Generative Adversarial Networks

Introduction In order to generate new samples from a distribution, it is necessary to know the data distribution ($p_{data}$). In generative adversarial networks (GANs), $p_{data}$ is approximated through a generative distribution $p_g$. A GAN consists of two components, a... [Read More]
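
For reference, the minimax objective from the original GAN paper (Goodfellow et al., 2014), in which the generator $G$ and discriminator $D$ play a two-player game:

```latex
\min_G \max_D V(D, G) =
  \mathbb{E}_{x \sim p_{data}}\bigl[\log D(x)\bigr]
  + \mathbb{E}_{z \sim p_z}\bigl[\log\bigl(1 - D(G(z))\bigr)\bigr]
```

At the global optimum, $p_g = p_{data}$ and $D(x) = \tfrac{1}{2}$ everywhere.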

The Theory of Variational Autoencoders

Introduction The conventional autoencoder is used to reconstruct the input sample. It uses an encoder that projects the data into a low-dimensional, or latent, space. A decoder then projects this low-dimensional representation back to the original dimension. This set-up is trained in an end-to-end manner to bring the reconstructed... [Read More]
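
A minimal PyTorch sketch of the conventional autoencoder described above (layer sizes and the MSE reconstruction loss are illustrative assumptions):

```python
import torch.nn as nn
import torch.nn.functional as F

class AutoEncoder(nn.Module):
    def __init__(self, input_dim=784, latent_dim=32):
        super().__init__()
        # encoder: project the input into the low-dimensional latent space
        self.encoder = nn.Sequential(
            nn.Linear(input_dim, 128), nn.ReLU(), nn.Linear(128, latent_dim))
        # decoder: project the latent code back to the original dimension
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim, 128), nn.ReLU(), nn.Linear(128, input_dim))

    def forward(self, x):
        return self.decoder(self.encoder(x))

# end-to-end training minimizes reconstruction error, e.g.
# loss = F.mse_loss(model(x), x)
```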

GCTI-SN

GCTI-SN is a Geometry-inspired, Chemical-invariant, and Tissue-invariant Stain Normalization method. It corrects for illumination variation, stain-chemical variation, and stain-quantity variation in a unified framework by exploiting the geometry of the underlying color vector space. While existing stain... [Read More]
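
For context, stain-normalization methods that work in the color vector space typically start by mapping RGB intensities into optical density, where stain contributions combine additively (Beer-Lambert law). The sketch below shows only that generic pre-processing step, not the GCTI-SN algorithm itself:

```python
import numpy as np

def rgb_to_optical_density(rgb, bg_intensity=255.0, eps=1e-6):
    """Beer-Lambert conversion: OD = -log(I / I_0).

    Generic pre-processing common to stain normalization; NOT GCTI-SN.
    bg_intensity assumes an 8-bit white background.
    """
    i = np.clip(np.asarray(rgb, dtype=np.float64), eps, bg_intensity)
    return -np.log(i / bg_intensity)   # stains become additive in OD space
```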