Learning to Summarise Related Sentences
Publication: Contributions to collected editions › Articles in conference proceedings › Research › peer-reviewed
Standard
Tzouridis, E., Nasir, J. A., & Brefeld, U. (2014). Learning to Summarise Related Sentences. In COLING 2014 - 25th International Conference on Computational Linguistics, Proceedings of COLING 2014: Technical Papers (pp. 1636-1647). Dublin: Association for Computational Linguistics (ACL).
RIS
TY - CHAP
T1 - Learning to Summarise Related Sentences
AU - Tzouridis, Emmanouil
AU - Nasir, Jamal Abdul
AU - Brefeld, Ulf
N1 - Conference code: 25
PY - 2014
Y1 - 2014
N2 - We cast multi-sentence compression as a structured prediction problem. Related sentences are represented by a word graph so that summaries constitute paths in the graph (Filippova, 2010). We devise a parameterised shortest path algorithm that can be written as a generalised linear model in a joint space of word graphs and compressions. We use a large-margin approach to adapt parameterised edge weights to the data such that the shortest path is identical to the desired summary. Decoding during training is performed in polynomial time using loss augmented inference. Empirically, we compare our approach to the state-of-the-art in graph-based multi-sentence compression and observe significant improvements of about 7% in ROUGE F-measure and 8% in BLEU score, respectively.
AB - We cast multi-sentence compression as a structured prediction problem. Related sentences are represented by a word graph so that summaries constitute paths in the graph (Filippova, 2010). We devise a parameterised shortest path algorithm that can be written as a generalised linear model in a joint space of word graphs and compressions. We use a large-margin approach to adapt parameterised edge weights to the data such that the shortest path is identical to the desired summary. Decoding during training is performed in polynomial time using loss augmented inference. Empirically, we compare our approach to the state-of-the-art in graph-based multi-sentence compression and observe significant improvements of about 7% in ROUGE F-measure and 8% in BLEU score, respectively.
KW - Informatics
KW - Business informatics
UR - http://www.scopus.com/inward/record.url?scp=84943749207&partnerID=8YFLogxK
UR - https://www.mendeley.com/catalogue/71fe9cf0-6be3-3329-b3d8-e329a11a5d88/
M3 - Article in conference proceedings
SN - 978-1-941643-26-6
T3 - COLING 2014 - 25th International Conference on Computational Linguistics, Proceedings of COLING 2014: Technical Papers
SP - 1636
EP - 1647
BT - COLING 2014 - 25th International Conference on Computational Linguistics, Proceedings of COLING 2014
PB - Association for Computational Linguistics (ACL)
CY - Dublin
T2 - 25th International Conference on Computational Linguistics - COLING 2014
Y2 - 23 August 2014 through 29 August 2014
ER -
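The abstract above casts multi-sentence compression as finding a shortest path through a word graph (Filippova, 2010): shared words from related sentences are merged into nodes, edges follow sentence order, and a summary is a START-to-END path. The paper learns parameterised edge weights with a large-margin method; the following minimal sketch instead uses hand-chosen weights and plain Dijkstra search to illustrate the decoding step only. The toy graph, its node names, and the edge weights are all hypothetical, not from the paper.

```python
import heapq

def shortest_path(graph, start, end):
    # Dijkstra over a weighted word graph: pops partial paths by
    # accumulated cost and returns the cheapest START->END path,
    # which serves as the compression candidate.
    queue = [(0.0, start, [start])]
    seen = set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == end:
            return path, cost
        if node in seen:
            continue
        seen.add(node)
        for nxt, weight in graph.get(node, {}).items():
            if nxt not in seen:
                heapq.heappush(queue, (cost + weight, nxt, path + [nxt]))
    return None, float("inf")

# Hypothetical word graph built from two related sentences
# (shared words merged; "the" kept distinct per position):
#   "the cat sat on the mat"
#   "the cat lay on the mat"
graph = {
    "START": {"the1": 1.0},
    "the1": {"cat": 1.0},
    "cat": {"sat": 1.0, "lay": 1.0},
    "sat": {"on": 1.0},
    "lay": {"on": 1.5},  # illustrative weight penalising this edge
    "on": {"the2": 1.0},
    "the2": {"mat": 1.0},
    "mat": {"END": 1.0},
}

path, cost = shortest_path(graph, "START", "END")
print(" ".join(w.rstrip("12") for w in path[1:-1]))  # -> "the cat sat on the mat"
```

In the paper these edge weights are not fixed but parameterised and fitted so that the shortest path coincides with the reference summary; the search itself stays polynomial-time, which is what makes loss-augmented inference during training tractable.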