Self-supervised Siamese Autoencoders

Research output: Contributions to collected editions/works › Article in conference proceedings › Research › peer-review

Standard

Self-supervised Siamese Autoencoders. / Baier, Friederike; Mair, Sebastian; Fadel, Samuel G.
Advances in Intelligent Data Analysis XXII: 22nd International Symposium on Intelligent Data Analysis, IDA 2024, Proceedings. ed. / Ioanna Miliou; Panagiotis Papapetrou; Nico Piatkowski. Springer Science and Business Media Deutschland, 2024. p. 117-128 (Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics); Vol. 14641 LNCS).

Research output: Contributions to collected editions/works › Article in conference proceedings › Research › peer-review

Harvard

Baier, F, Mair, S & Fadel, SG 2024, Self-supervised Siamese Autoencoders. in I Miliou, P Papapetrou & N Piatkowski (eds), Advances in Intelligent Data Analysis XXII: 22nd International Symposium on Intelligent Data Analysis, IDA 2024, Proceedings. Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), vol. 14641 LNCS, Springer Science and Business Media Deutschland, pp. 117-128, 22nd International Symposium on Intelligent Data Analysis - IDA 2024, Stockholm, Sweden, 24.04.24. https://doi.org/10.1007/978-3-031-58547-0_10

APA

Baier, F., Mair, S., & Fadel, S. G. (2024). Self-supervised Siamese Autoencoders. In I. Miliou, P. Papapetrou, & N. Piatkowski (Eds.), Advances in Intelligent Data Analysis XXII: 22nd International Symposium on Intelligent Data Analysis, IDA 2024, Proceedings (pp. 117-128). (Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics); Vol. 14641 LNCS). Springer Science and Business Media Deutschland. https://doi.org/10.1007/978-3-031-58547-0_10

Vancouver

Baier F, Mair S, Fadel SG. Self-supervised Siamese Autoencoders. In Miliou I, Papapetrou P, Piatkowski N, editors, Advances in Intelligent Data Analysis XXII: 22nd International Symposium on Intelligent Data Analysis, IDA 2024, Proceedings. Springer Science and Business Media Deutschland. 2024. p. 117-128. (Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)). Epub 2024 Apr 16. doi: 10.1007/978-3-031-58547-0_10

BibTeX

@inbook{3f08d33ada4a45ec97f00119de3c2294,
title = "Self-supervised Siamese Autoencoders",
abstract = "In contrast to fully-supervised models, self-supervised representation learning only needs a fraction of data to be labeled and often achieves the same or even higher downstream performance. The goal is to pre-train deep neural networks on a self-supervised task, making them able to extract meaningful features from raw input data afterwards. Previously, autoencoders and Siamese networks have been successfully employed as feature extractors for tasks such as image classification. However, both have their individual shortcomings and benefits. In this paper, we combine their complementary strengths by proposing a new method called SidAE (Siamese denoising autoencoder). Using an image classification downstream task, we show that our model outperforms two self-supervised baselines across multiple data sets and scenarios. Crucially, this includes conditions in which only a small amount of labeled data is available. Empirically, the Siamese component has more impact, but the denoising autoencoder is nevertheless necessary to improve performance.",
keywords = "denoising autoencoder, image classification, pre-training, representation learning, Self-supervised learning, Siamese networks, Informatics, Business informatics",
author = "Friederike Baier and Sebastian Mair and Fadel, {Samuel G.}",
note = "Publisher Copyright: {\textcopyright} The Author(s), under exclusive license to Springer Nature Switzerland AG 2024.; 22nd International Symposium on Intelligent Data Analysis - IDA 2024, IDA 2024 ; Conference date: 24-04-2024 Through 26-04-2024",
year = "2024",
doi = "10.1007/978-3-031-58547-0_10",
language = "English",
isbn = "978-3-031-58546-3",
series = "Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)",
publisher = "Springer Science and Business Media Deutschland",
pages = "117--128",
editor = "Ioanna Miliou and Panagiotis Papapetrou and Nico Piatkowski",
booktitle = "Advances in Intelligent Data Analysis XXII",
address = "Germany",
url = "http://www.wikicfp.com/cfp/servlet/event.showcfp?copyownerid=90704&eventid=176233, http://ida2024.org/",
}

RIS

TY - CHAP

T1 - Self-supervised Siamese Autoencoders

AU - Baier, Friederike

AU - Mair, Sebastian

AU - Fadel, Samuel G.

N1 - Conference code: 22

PY - 2024

Y1 - 2024

N2 - In contrast to fully-supervised models, self-supervised representation learning only needs a fraction of data to be labeled and often achieves the same or even higher downstream performance. The goal is to pre-train deep neural networks on a self-supervised task, making them able to extract meaningful features from raw input data afterwards. Previously, autoencoders and Siamese networks have been successfully employed as feature extractors for tasks such as image classification. However, both have their individual shortcomings and benefits. In this paper, we combine their complementary strengths by proposing a new method called SidAE (Siamese denoising autoencoder). Using an image classification downstream task, we show that our model outperforms two self-supervised baselines across multiple data sets and scenarios. Crucially, this includes conditions in which only a small amount of labeled data is available. Empirically, the Siamese component has more impact, but the denoising autoencoder is nevertheless necessary to improve performance.

AB - In contrast to fully-supervised models, self-supervised representation learning only needs a fraction of data to be labeled and often achieves the same or even higher downstream performance. The goal is to pre-train deep neural networks on a self-supervised task, making them able to extract meaningful features from raw input data afterwards. Previously, autoencoders and Siamese networks have been successfully employed as feature extractors for tasks such as image classification. However, both have their individual shortcomings and benefits. In this paper, we combine their complementary strengths by proposing a new method called SidAE (Siamese denoising autoencoder). Using an image classification downstream task, we show that our model outperforms two self-supervised baselines across multiple data sets and scenarios. Crucially, this includes conditions in which only a small amount of labeled data is available. Empirically, the Siamese component has more impact, but the denoising autoencoder is nevertheless necessary to improve performance.

KW - denoising autoencoder

KW - image classification

KW - pre-training

KW - representation learning

KW - Self-supervised learning

KW - Siamese networks

KW - Informatics

KW - Business informatics

UR - http://www.scopus.com/inward/record.url?scp=85192241043&partnerID=8YFLogxK

UR - https://www.mendeley.com/catalogue/99889785-310e-31ca-8786-f93a9453f8b6/

U2 - 10.1007/978-3-031-58547-0_10

DO - 10.1007/978-3-031-58547-0_10

M3 - Article in conference proceedings

AN - SCOPUS:85192241043

SN - 978-3-031-58546-3

T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)

SP - 117

EP - 128

BT - Advances in Intelligent Data Analysis XXII

A2 - Miliou, Ioanna

A2 - Papapetrou, Panagiotis

A2 - Piatkowski, Nico

PB - Springer Science and Business Media Deutschland

T2 - 22nd International Symposium on Intelligent Data Analysis - IDA 2024

Y2 - 24 April 2024 through 26 April 2024

ER -
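
The abstract above describes SidAE as a combination of a Siamese network and a denoising autoencoder. As a rough, hypothetical illustration only (not the authors' published code), such a combined objective could pair a denoising reconstruction loss with a SimSiam-style agreement loss between two augmented views; the module names, stop-gradient placement, and equal loss weighting below are assumptions of this sketch.

import torch
from torch import nn
import torch.nn.functional as F

class SiameseDenoisingAESketch(nn.Module):
    """Hypothetical sketch of a Siamese denoising autoencoder objective."""

    def __init__(self, encoder: nn.Module, decoder: nn.Module, predictor: nn.Module):
        super().__init__()
        self.encoder = encoder      # shared backbone applied to both views
        self.decoder = decoder      # reconstructs the clean image from a latent code
        self.predictor = predictor  # small MLP head, as in SimSiam-style objectives

    def forward(self, noisy_view1: torch.Tensor, noisy_view2: torch.Tensor,
                clean_image: torch.Tensor) -> torch.Tensor:
        # Encode two corrupted/augmented views of the same clean image.
        z1, z2 = self.encoder(noisy_view1), self.encoder(noisy_view2)

        # Denoising-autoencoder branch: reconstruct the clean image from one view.
        recon_loss = F.mse_loss(self.decoder(z1), clean_image)

        # Siamese branch: maximize agreement between the two views
        # (negative cosine similarity with a stop-gradient on the target).
        p1, p2 = self.predictor(z1), self.predictor(z2)
        sim_loss = -0.5 * (F.cosine_similarity(p1, z2.detach(), dim=-1).mean()
                           + F.cosine_similarity(p2, z1.detach(), dim=-1).mean())

        # Equal weighting of the two terms is an assumption of this sketch.
        return recon_loss + sim_loss

After pre-training the encoder with such an objective, its features could be evaluated on an image-classification downstream task (for example with a linear probe), matching the evaluation setup described in the abstract.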
