Transitivity Recovering Decompositions: Interpretable and Robust Fine-Grained Relationships

1University of Exeter   2University of Trento   3University of Tübingen  
4MPI for Informatics   5The Alan Turing Institute   6University of Surrey  

Instead of learning representations of emergent relationships as abstract aggregations of views, our method deconstructs the input, latent, and class representation (proxy) spaces into graphs, ensuring that every stage along the inference path is interpretable.

Abstract

Recent advances in fine-grained representation learning leverage local-to-global (emergent) relationships for achieving state-of-the-art results. The relational representations relied upon by such methods, however, are abstract. We aim to deconstruct this abstraction by expressing them as interpretable graphs over image views. We begin by theoretically showing that abstract relational representations are nothing but a way of recovering transitive relationships among local views. Based on this, we design Transitivity Recovering Decompositions (TRD), a graph-space search algorithm that identifies interpretable equivalents of abstract emergent relationships at both instance and class levels, and with no post-hoc computations. We additionally show that TRD is provably robust to noisy views, with empirical evidence also supporting this finding. The latter allows TRD to perform on par with or even better than the state-of-the-art, while being fully interpretable. Implementation is available at https://github.com/abhrac/trd.
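The core idea of recovering transitive relationships among local views can be illustrated with a toy sketch. This is not the paper's actual algorithm (see the linked repository for that); it merely assumes that views are embedded as vectors, that an edge connects views whose cosine similarity exceeds a threshold, and that transitive relationships are then recovered via a transitive closure over that graph:

```python
import numpy as np

def view_graph(views, threshold=0.8):
    """Build an adjacency matrix over local view embeddings: an edge
    connects two views whose cosine similarity exceeds `threshold`.
    (Hypothetical construction for illustration only.)"""
    v = views / np.linalg.norm(views, axis=1, keepdims=True)
    sim = v @ v.T
    adj = (sim > threshold).astype(int)
    np.fill_diagonal(adj, 0)  # no self-loops
    return adj

def transitive_closure(adj):
    """Warshall-style closure: a relationship (i, k) is recovered
    whenever (i, j) and (j, k) already hold."""
    reach = adj.astype(bool).copy()
    n = len(reach)
    for j in range(n):
        # connect every predecessor of j to every successor of j
        reach |= np.outer(reach[:, j], reach[j, :])
    return reach.astype(int)

# Example: views 0-1 and 1-2 are directly related; the closure
# recovers the transitive relationship 0-2 as an explicit edge.
adj = np.array([[0, 1, 0],
                [1, 0, 1],
                [0, 1, 0]])
closure = transitive_closure(adj)
print(closure[0, 2])  # → 1
```

The point of the sketch is that the recovered edges live in an explicit graph, so every inferred relationship can be inspected directly rather than being folded into an opaque aggregated representation.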

BibTeX


@inproceedings{chaudhuri2023TRD,
    title={Transitivity Recovering Decompositions: Interpretable and Robust Fine-Grained Relationships},
    author={Abhra Chaudhuri and Massimiliano Mancini and Zeynep Akata and Anjan Dutta},
    booktitle={Thirty-seventh Conference on Neural Information Processing Systems},
    year={2023},
    url={https://openreview.net/forum?id=wUNPmdE273}
}

Last updated: 30 November 2023 | Template Credit: Nerfies