Transformer with Tree-order Encoding for Neural Program Generation

Publication: Contributions to edited volumes › Papers in conference proceedings › Research

Standard

Transformer with Tree-order Encoding for Neural Program Generation. / Thellmann, Klaudia-Doris; Stadler, Bernhard; Usbeck, Ricardo et al.
Conference XXX. 2022.

Vancouver

Thellmann KD, Stadler B, Usbeck R, Lehmann J. Transformer with Tree-order Encoding for Neural Program Generation. In: Conference XXX. 2022. doi: 10.48550/arXiv.2206.13354

Bibtex

@inbook{56b3c87266394178bbbbf7097e04d0af,
title = "Transformer with Tree-order Encoding for Neural Program Generation",
abstract = "While a considerable number of semantic parsing approaches have employed RNN architectures for code generation tasks, there have been only a few attempts to investigate the applicability of Transformers for this task. Including hierarchical information of the underlying programming language syntax has proven to be effective for code generation. Since the positional encoding of the Transformer can only represent positions in a flat sequence, we have extended the encoding scheme to allow the attention mechanism to also attend over hierarchical positions in the input. Furthermore, we have realized a decoder based on a restrictive grammar graph model to improve the generation accuracy and ensure the well-formedness of the generated code. While we did not surpass the state of the art, our findings suggest that employing a tree-based positional encoding in combination with a shared natural-language subword vocabulary improves generation performance over sequential positional encodings.",
keywords = "cs.CL, cs.AI, 68T07, 68T50, I.2.7, Informatics",
author = "Klaudia-Doris Thellmann and Bernhard Stadler and Ricardo Usbeck and Jens Lehmann",
note = "Most of this paper was authored in late 2020 and early 2021",
year = "2022",
month = may,
day = "30",
doi = "10.48550/arXiv.2206.13354",
language = "English",
booktitle = "Conference XXX",
}
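
The abstract describes extending the Transformer's positional encoding, which natively represents only positions in a flat sequence, so that attention can also attend over hierarchical positions in the input. A minimal sketch of one way such a tree-order encoding could be realized follows; the root-to-node path scheme, the class name TreePositionalEncoding, and the tensor layout are illustrative assumptions, not the paper's exact formulation.

    import torch
    import torch.nn as nn

    # Sketch: represent each AST node by its root-to-node path of child
    # indices, embed each path step per tree level, and sum over levels.
    class TreePositionalEncoding(nn.Module):
        def __init__(self, d_model: int, max_depth: int = 32, max_branch: int = 64):
            super().__init__()
            # Index 0 is reserved as padding for nodes shallower than max_depth.
            self.step_embed = nn.Embedding(max_branch + 1, d_model)
            # Learned per-level weights let near-root and deep steps differ.
            self.level_scale = nn.Parameter(torch.ones(max_depth, 1))

        def forward(self, paths: torch.Tensor) -> torch.Tensor:
            # paths: (batch, seq_len, max_depth) of 1-based child indices, 0 = pad
            steps = self.step_embed(paths)                # (B, S, depth, d_model)
            return (steps * self.level_scale).sum(dim=2)  # (B, S, d_model)

    # Usage: tree positions for three nodes of a small AST, padded to depth 4.
    enc = TreePositionalEncoding(d_model=16, max_depth=4, max_branch=8)
    paths = torch.tensor([[[0, 0, 0, 0],    # root
                           [1, 0, 0, 0],    # first child of the root
                           [1, 2, 0, 0]]])  # second child of that child
    pos = enc(paths)  # (1, 3, 16); added to token embeddings as usual

As with the standard sequential encoding, the result is added to the input embeddings; the difference is that two nodes that are close in the tree receive similar encodings even when they are far apart in the flattened token sequence.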

RIS

TY - CHAP

T1 - Transformer with Tree-order Encoding for Neural Program Generation

AU - Thellmann, Klaudia-Doris

AU - Stadler, Bernhard

AU - Usbeck, Ricardo

AU - Lehmann, Jens

N1 - Most of this paper was authored in late 2020 and early 2021

PY - 2022/5/30

Y1 - 2022/5/30

N2 - While a considerable number of semantic parsing approaches have employed RNN architectures for code generation tasks, there have been only a few attempts to investigate the applicability of Transformers for this task. Including hierarchical information of the underlying programming language syntax has proven to be effective for code generation. Since the positional encoding of the Transformer can only represent positions in a flat sequence, we have extended the encoding scheme to allow the attention mechanism to also attend over hierarchical positions in the input. Furthermore, we have realized a decoder based on a restrictive grammar graph model to improve the generation accuracy and ensure the well-formedness of the generated code. While we did not surpass the state of the art, our findings suggest that employing a tree-based positional encoding in combination with a shared natural-language subword vocabulary improves generation performance over sequential positional encodings.

AB - While a considerable number of semantic parsing approaches have employed RNN architectures for code generation tasks, there have been only a few attempts to investigate the applicability of Transformers for this task. Including hierarchical information of the underlying programming language syntax has proven to be effective for code generation. Since the positional encoding of the Transformer can only represent positions in a flat sequence, we have extended the encoding scheme to allow the attention mechanism to also attend over hierarchical positions in the input. Furthermore, we have realized a decoder based on a restrictive grammar graph model to improve the generation accuracy and ensure the well-formedness of the generated code. While we did not surpass the state of the art, our findings suggest that employing a tree-based positional encoding in combination with a shared natural-language subword vocabulary improves generation performance over sequential positional encodings.

KW - cs.CL

KW - cs.AI

KW - 68T07

KW - 68T50

KW - I.2.7

KW - Informatics

U2 - 10.48550/arXiv.2206.13354

DO - 10.48550/arXiv.2206.13354

M3 - Article in conference proceedings

BT - Conference XXX

ER -
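
The abstract also mentions a decoder based on a restrictive grammar graph model that ensures the well-formedness of the generated code. Below is a minimal sketch of grammar-constrained decoding in that general spirit; the toy grammar, the token-to-state mapping, and the constrained_argmax helper are hypothetical stand-ins, not the paper's actual model.

    import torch

    # Sketch: a grammar graph maps each decoder state to the set of states
    # that may legally follow it; logits of tokens outside that set are
    # masked to -inf, so only well-formed programs can be emitted.
    GRAMMAR_GRAPH = {
        "<start>": {"name", "number"},
        "name": {"(", "<end>"},
        "(": {"name", "number", ")"},
        "number": {")", "<end>"},
        ")": {")", "<end>"},
    }

    def constrained_argmax(logits: torch.Tensor, prev_state: str,
                           token_state: dict) -> int:
        """Return the highest-scoring token id whose grammar state is
        reachable from prev_state in the grammar graph."""
        allowed = GRAMMAR_GRAPH.get(prev_state, set())
        mask = torch.full_like(logits, float("-inf"))
        for tok_id, state in token_state.items():
            if state in allowed:
                mask[tok_id] = 0.0
        return int(torch.argmax(logits + mask))

    # Usage with a 4-token toy vocabulary.
    token_state = {0: "name", 1: "number", 2: "(", 3: ")"}
    logits = torch.tensor([0.1, 2.0, 0.3, 0.4])
    # After "(", an operand or ")" is legal, so "number" (id 1) is chosen;
    # from "<start>", ")" would be masked out even if it scored highest.
    next_id = constrained_argmax(logits, "(", token_state)

Because the constraint is applied as a logit mask, it composes with greedy decoding or beam search alike: any token that would take the derivation outside the grammar graph simply cannot be selected.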
