A Closer Look at Multimodal Representation Collapse

Abhra Chaudhuri, Anjan Dutta, Tu Bui, Serban Georgescu

¹Fujitsu Research of Europe   ²University of Surrey

When noisy features of one modality become entangled, in the fusion head, with predictive features of another (an event whose probability grows with the number of modalities), training settles on a sub-optimal solution in which the noisy features mask the predictive value of the first modality. Freeing up rank bottlenecks allows such features to be denoised along independent dimensions without affecting the other modality, while letting the first modality's predictive features contribute to reducing the loss.
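
The intuition can be made concrete with a toy least-squares fusion problem. The sketch below (PyTorch; not the paper's code, and all variable names are illustrative) models the entanglement as a rank-1 encoder that forces modality A's predictive and noisy features into one shared dimension; the fusion head can then only downweight the mixture as a whole, whereas one extra dimension lets it zero out the noise while keeping the signal.

    # Toy sketch: a rank bottleneck entangles modality A's signal with its
    # noise, so the optimal fusion head downweights modality A wholesale.
    import torch

    torch.manual_seed(0)
    n = 4096
    sig_a = torch.randn(n, 1)        # predictive feature of modality A
    noi_a = 3.0 * torch.randn(n, 1)  # noisy feature of modality A
    sig_b = torch.randn(n, 1)        # predictive feature of modality B
    y = sig_a + sig_b                # target depends on both modalities

    def fusion_loss(z):
        # Closed-form least-squares fusion head over features z.
        w = torch.linalg.lstsq(z, y).solution
        return ((z @ w - y) ** 2).mean().item()

    # Rank-1 encoder for A: signal and noise share one dimension.
    z_entangled = torch.cat([sig_a + noi_a, sig_b], dim=1)
    # Rank-2 encoder for A: signal and noise on independent dimensions.
    z_free = torch.cat([sig_a, noi_a, sig_b], dim=1)

    print("entangled loss:", fusion_loss(z_entangled))  # ~0.9: A mostly ignored
    print("freed-up loss:", fusion_loss(z_free))        # ~0.0: noise weighted ~0

In the entangled case the least-squares weight on modality A's mixture is roughly 0.1, i.e. the modality has effectively collapsed onto modality B; freeing the bottleneck restores its full contribution.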

Abstract

We aim to develop a fundamental understanding of modality collapse, a recently observed empirical phenomenon wherein models trained for multimodal fusion tend to rely on only a subset of the modalities, ignoring the rest. We show that modality collapse happens when noisy features from one modality are entangled, via a shared set of neurons in the fusion head, with predictive features from another, effectively masking out positive contributions from the predictive features of the former modality and leading to its collapse. We further prove that cross-modal knowledge distillation implicitly disentangles such representations by freeing up rank bottlenecks in the student encoder, denoising the fusion-head outputs without negatively impacting the predictive features from either modality. Based on the above findings, we propose an algorithm that prevents modality collapse through explicit basis reallocation, with applications to handling missing modalities. Extensive experiments on multiple multimodal benchmarks validate our theoretical claims.
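
As a rough illustration of the cross-modal knowledge distillation setup the abstract refers to, the sketch below pairs a two-modality student with a frozen unimodal teacher through a standard temperature-scaled distillation loss. This is a generic KD objective under assumed names (FusionStudent, distill_step, tau, lam); it is not the paper's implicit-disentanglement analysis or the proposed basis-reallocation algorithm.

    # Hedged sketch of cross-modal KD: a multimodal student regularized
    # toward a frozen teacher trained on one modality. Generic KD recipe;
    # names and hyperparameters are illustrative assumptions.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class FusionStudent(nn.Module):
        # Hypothetical two-modality student with a linear fusion head.
        def __init__(self, d_a, d_b, d, n_cls):
            super().__init__()
            self.enc_a = nn.Linear(d_a, d)
            self.enc_b = nn.Linear(d_b, d)
            self.head = nn.Linear(2 * d, n_cls)

        def forward(self, x_a, x_b):
            z = torch.cat([self.enc_a(x_a), self.enc_b(x_b)], dim=1)
            return self.head(z)

    def distill_step(student, teacher_logits, x_a, x_b, y, tau=2.0, lam=0.5):
        logits = student(x_a, x_b)
        task = F.cross_entropy(logits, y)
        # KL to the frozen unimodal teacher, softened by temperature tau.
        kd = F.kl_div(F.log_softmax(logits / tau, dim=1),
                      F.softmax(teacher_logits / tau, dim=1),
                      reduction="batchmean") * tau ** 2
        return task + lam * kd

A training loop would call, e.g., loss = distill_step(student, teacher(x_b).detach(), x_a, x_b, y) and backpropagate through the student only.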

BibTeX


    @inproceedings{chaudhuri2025mmcollapse,
      title={A Closer Look at Multimodal Representation Collapse},
      author={Abhra Chaudhuri and Anjan Dutta and Tu Bui and Serban Georgescu},
      booktitle={International Conference on Machine Learning},
      year={2025}
    }

Last updated: 28 May 2025 | Template Credit: Nerfies