Neural relational inference for disaster multimedia retrieval

Research output: Journal contributions › Journal articles › Research › peer-review


Events around the world are increasingly documented on social media, especially by the people experiencing them, as these platforms grow in popularity. As a consequence, social media has become a valuable source of data for understanding those events. Because of their destructive potential, natural disasters are of particular interest to response operations and environmental monitoring agencies. However, the sheer volume of information makes it challenging to identify content relevant to those events. In this paper, we use a relational neural network model to identify such content. The model is particularly suitable for unstructured text, that is, text with no particular arrangement of words, such as tags, which is commonplace in social media data. In addition, our method can be combined with a CNN to handle multimodal data where both text and visual data are available. We perform experiments in three scenarios evaluating different modalities: visual, textual, and both. Our method achieves competitive performance on each modality by itself, while significantly outperforming the baseline in the multimodal scenario. We also demonstrate the behavior of the proposed method in other applications through additional experiments on the CUB-200-2011 multimodal dataset.
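The abstract notes that relational models suit unordered text such as tags because they aggregate over all pairs of inputs rather than relying on word order. As an illustrative sketch only (not the paper's exact architecture), the relation-network pattern of Santoro et al. can be expressed with NumPy, where `g` and `f` stand in for learned MLPs:

```python
import numpy as np

def relation_network(objects, g, f):
    """Relation-network-style aggregation: apply g to every ordered
    pair of distinct objects, sum the results, then apply f.
    Permutation-invariant, so it suits unordered inputs like tags."""
    n = len(objects)
    pair_sum = sum(
        g(objects[i], objects[j])
        for i in range(n) for j in range(n) if i != j
    )
    return f(pair_sum)

# Toy stand-ins for the learned functions (hypothetical, for illustration):
g = lambda a, b: np.concatenate([a, b])   # pairwise "relation" features
f = lambda s: s / np.linalg.norm(s)       # normalize the aggregate

# Toy 2-d embeddings of three tags
embeddings = [np.array([1.0, 0.0]), np.array([0.0, 1.0]), np.array([1.0, 1.0])]
score = relation_network(embeddings, g, f)
```

Because the sum runs over all pairs, shuffling `embeddings` leaves `score` unchanged, which is exactly the property that makes this family of models a fit for tag-like text. In a multimodal setting, CNN image features could simply be appended to `objects` before aggregation.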

Original language: English
Journal: Multimedia Tools and Applications
Volume: 79
Issue number: 35-36
Pages (from-to): 26735-26746
ISSN: 1380-7501
DOIs
Publication status: Published - 09.2020
Externally published: Yes

Research areas

  • Information retrieval, Machine learning, Multimodal, Natural language processing, Neural networks
  • Business informatics