Self-supervised Siamese Autoencoders

Research output: Contributions to collected editions/works › Article in conference proceedings › Research › peer-review

Standard

Self-supervised Siamese Autoencoders. / Baier, Friederike; Mair, Sebastian; Fadel, Samuel G.
Advances in Intelligent Data Analysis XXII: 22nd International Symposium on Intelligent Data Analysis, IDA 2024, Proceedings. ed. / Ioanna Miliou; Panagiotis Papapetrou; Nico Piatkowski. Springer Science and Business Media Deutschland, 2024. p. 117-128 (Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics); Vol. 14641 LNCS).

Harvard

Baier, F, Mair, S & Fadel, SG 2024, Self-supervised Siamese Autoencoders. in I Miliou, P Papapetrou & N Piatkowski (eds), Advances in Intelligent Data Analysis XXII: 22nd International Symposium on Intelligent Data Analysis, IDA 2024, Proceedings. Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), vol. 14641 LNCS, Springer Science and Business Media Deutschland, pp. 117-128, 22nd International Symposium on Intelligent Data Analysis - IDA 2024, Stockholm, Sweden, 24.04.24. https://doi.org/10.1007/978-3-031-58547-0_10

APA

Baier, F., Mair, S., & Fadel, S. G. (2024). Self-supervised Siamese Autoencoders. In I. Miliou, P. Papapetrou, & N. Piatkowski (Eds.), Advances in Intelligent Data Analysis XXII: 22nd International Symposium on Intelligent Data Analysis, IDA 2024, Proceedings (pp. 117-128). (Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics); Vol. 14641 LNCS). Springer Science and Business Media Deutschland. https://doi.org/10.1007/978-3-031-58547-0_10

Vancouver

Baier F, Mair S, Fadel SG. Self-supervised Siamese Autoencoders. In Miliou I, Papapetrou P, Piatkowski N, editors, Advances in Intelligent Data Analysis XXII: 22nd International Symposium on Intelligent Data Analysis, IDA 2024, Proceedings. Springer Science and Business Media Deutschland. 2024. p. 117-128. (Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)). Epub 2024 Apr 16. doi: 10.1007/978-3-031-58547-0_10

Bibtex

@inbook{3f08d33ada4a45ec97f00119de3c2294,
title = "Self-supervised Siamese Autoencoders",
abstract = "In contrast to fully-supervised models, self-supervised representation learning only needs a fraction of data to be labeled and often achieves the same or even higher downstream performance. The goal is to pre-train deep neural networks on a self-supervised task, making them able to extract meaningful features from raw input data afterwards. Previously, autoencoders and Siamese networks have been successfully employed as feature extractors for tasks such as image classification. However, both have their individual shortcomings and benefits. In this paper, we combine their complementary strengths by proposing a new method called SidAE (Siamese denoising autoencoder). Using an image classification downstream task, we show that our model outperforms two self-supervised baselines across multiple data sets and scenarios. Crucially, this includes conditions in which only a small amount of labeled data is available. Empirically, the Siamese component has more impact, but the denoising autoencoder is nevertheless necessary to improve performance.",
keywords = "denoising autoencoder, image classification, pre-training, representation learning, Self-supervised learning, Siamese networks, Informatics, Business informatics",
author = "Friederike Baier and Sebastian Mair and Fadel, {Samuel G.}",
note = "Publisher Copyright: {\textcopyright} The Author(s), under exclusive license to Springer Nature Switzerland AG 2024.; 22nd International Symposium on Intelligent Data Analysis - IDA 2024, IDA 2024 ; Conference date: 24-04-2024 Through 26-04-2024",
year = "2024",
doi = "10.1007/978-3-031-58547-0_10",
language = "English",
isbn = "978-3-031-58546-3",
series = "Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)",
publisher = "Springer Science and Business Media Deutschland",
pages = "117--128",
editor = "Ioanna Miliou and Panagiotis Papapetrou and Nico Piatkowski",
booktitle = "Advances in Intelligent Data Analysis XXII",
address = "Germany",
url = "http://www.wikicfp.com/cfp/servlet/event.showcfp?copyownerid=90704&eventid=176233, http://ida2024.org/",
}

RIS

TY - CHAP

T1 - Self-supervised Siamese Autoencoders

AU - Baier, Friederike

AU - Mair, Sebastian

AU - Fadel, Samuel G.

N1 - Conference code: 22

PY - 2024

Y1 - 2024

N2 - In contrast to fully-supervised models, self-supervised representation learning only needs a fraction of data to be labeled and often achieves the same or even higher downstream performance. The goal is to pre-train deep neural networks on a self-supervised task, making them able to extract meaningful features from raw input data afterwards. Previously, autoencoders and Siamese networks have been successfully employed as feature extractors for tasks such as image classification. However, both have their individual shortcomings and benefits. In this paper, we combine their complementary strengths by proposing a new method called SidAE (Siamese denoising autoencoder). Using an image classification downstream task, we show that our model outperforms two self-supervised baselines across multiple data sets and scenarios. Crucially, this includes conditions in which only a small amount of labeled data is available. Empirically, the Siamese component has more impact, but the denoising autoencoder is nevertheless necessary to improve performance.

AB - In contrast to fully-supervised models, self-supervised representation learning only needs a fraction of data to be labeled and often achieves the same or even higher downstream performance. The goal is to pre-train deep neural networks on a self-supervised task, making them able to extract meaningful features from raw input data afterwards. Previously, autoencoders and Siamese networks have been successfully employed as feature extractors for tasks such as image classification. However, both have their individual shortcomings and benefits. In this paper, we combine their complementary strengths by proposing a new method called SidAE (Siamese denoising autoencoder). Using an image classification downstream task, we show that our model outperforms two self-supervised baselines across multiple data sets and scenarios. Crucially, this includes conditions in which only a small amount of labeled data is available. Empirically, the Siamese component has more impact, but the denoising autoencoder is nevertheless necessary to improve performance.

KW - denoising autoencoder

KW - image classification

KW - pre-training

KW - representation learning

KW - Self-supervised learning

KW - Siamese networks

KW - Informatics

KW - Business informatics

UR - http://www.scopus.com/inward/record.url?scp=85192241043&partnerID=8YFLogxK

UR - https://www.mendeley.com/catalogue/99889785-310e-31ca-8786-f93a9453f8b6/

U2 - 10.1007/978-3-031-58547-0_10

DO - 10.1007/978-3-031-58547-0_10

M3 - Article in conference proceedings

AN - SCOPUS:85192241043

SN - 978-3-031-58546-3

T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)

SP - 117

EP - 128

BT - Advances in Intelligent Data Analysis XXII

A2 - Miliou, Ioanna

A2 - Papapetrou, Panagiotis

A2 - Piatkowski, Nico

PB - Springer Science and Business Media Deutschland

T2 - 22nd International Symposium on Intelligent Data Analysis - IDA 2024

Y2 - 24 April 2024 through 26 April 2024

ER -
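The abstract above describes SidAE as a combination of a denoising autoencoder (reconstruct a clean input from a corrupted view) with a Siamese objective (make embeddings of two views of the same input agree). The following is a minimal toy sketch of that idea only; the single-layer encoder/decoder, Gaussian noise model, negative-cosine agreement term, and loss weighting `alpha` are illustrative assumptions, not the architecture or loss from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def encode(W, x):
    # Toy encoder: a single tanh layer standing in for a deep network.
    return np.tanh(W @ x)

def decode(V, z):
    # Toy linear decoder mapping the embedding back to input space.
    return V @ z

def sidae_loss(W, V, x, noise_scale=0.1, alpha=0.5):
    # Two independently corrupted views of the same clean input x.
    x1 = x + noise_scale * rng.standard_normal(x.shape)
    x2 = x + noise_scale * rng.standard_normal(x.shape)
    z1, z2 = encode(W, x1), encode(W, x2)
    # Denoising term: reconstruct the clean input from a noisy view.
    recon = np.mean((decode(V, z1) - x) ** 2)
    # Siamese term: reward agreement between the two views' embeddings
    # (negative cosine similarity, so lower loss = closer embeddings).
    cos = z1 @ z2 / (np.linalg.norm(z1) * np.linalg.norm(z2) + 1e-8)
    return recon - alpha * cos

# Tiny example: 8-dimensional input, 4-dimensional embedding.
d, k = 8, 4
W = 0.1 * rng.standard_normal((k, d))
V = 0.1 * rng.standard_normal((d, k))
x = rng.standard_normal(d)
loss = sidae_loss(W, V, x)
```

In a real pre-training setup, `loss` would be minimized over W and V by gradient descent on unlabeled data, after which the encoder is reused as a feature extractor for the downstream classifier.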
