Transformer with Tree-order Encoding for Neural Program Generation

Research output: Contributions to collected editions/works › Article in conference proceedings › Research

While a considerable number of semantic parsing approaches have employed RNN architectures for code generation tasks, there have been only a few attempts to investigate the applicability of Transformers to this task. Including hierarchical information about the underlying programming language syntax has proven effective for code generation. Since the positional encoding of the Transformer can only represent positions in a flat sequence, we have extended the encoding scheme to allow the attention mechanism to also attend over hierarchical positions in the input. Furthermore, we have realized a decoder based on a restrictive grammar graph model to improve generation accuracy and to ensure the well-formedness of the generated code. While we did not surpass the state of the art, our findings suggest that employing a tree-based positional encoding in combination with a shared natural-language subword vocabulary improves generation performance over sequential positional encodings.
Original language: English
Title of host publication: Conference XXX
Number of pages: 10
DOIs
Publication status: In preparation - 30.05.2022
Externally published: Yes

Bibliographical note

This paper was, for the most part, authored in late 2020 and early 2021.

Research areas

  • cs.CL, cs.AI, 68T07, 68T50, I.2.7
  • Informatics
