Strange Fruit (2020)
There is a phenomenon known as “mode collapse.” It is not fully understood why it happens, only that it happens constantly. It is the only constant in latent space. In this project, Mal explored mode collapse in depth. “I train the model until a full collapse happens. I iterate on the dataset by augmenting, duplicating, and looping in generated images from previous ticks to steer the collapse. Then I step a few ticks backward, and that’s when I discovered partial collapse.”
A partial collapse is a state in which a generative adversarial network (GAN) can only produce a small subset of possible outputs. Some degree of mode collapse occurs when the dataset lacks diversity (the images are too similar) or contains too few images. Put another way: the less diverse the dataset, the sooner mode collapse sets in.
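One rough way to see the difference between a diverse generator and a collapsed one, sketched here as a hypothetical diagnostic (not part of Mal’s actual process), is to measure the spread of a model’s outputs: a collapsed generator emits near-identical samples, so the mean pairwise distance between them drops toward zero.

```python
import numpy as np

def collapse_score(samples):
    """Mean pairwise distance between generated samples.

    Near zero suggests a full collapse (identical outputs);
    a small but nonzero value can indicate a partial collapse
    onto a few modes."""
    n = len(samples)
    dists = [np.linalg.norm(samples[i] - samples[j])
             for i in range(n) for j in range(i + 1, n)]
    return float(np.mean(dists))

rng = np.random.default_rng(0)

# Stand-in for a healthy generator: outputs spread across many modes.
diverse = rng.normal(0.0, 1.0, size=(50, 8))

# Stand-in for a collapsed generator: every output is nearly
# the same point, with only tiny noise around it.
collapsed = np.tile(rng.normal(0.0, 1.0, size=8), (50, 1))
collapsed += rng.normal(0.0, 1e-3, size=collapsed.shape)

print(collapse_score(diverse) > collapse_score(collapsed))  # True
```

The arrays here are toy stand-ins for generator outputs; in practice one would compare batches of images (or their feature embeddings) from different training ticks the same way.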