Graph Conditional Variational Models: Too Complex for Multiagent Trajectories?

Research output: Journal contributions › Conference article in journal › Research › peer-review

Standard

Graph Conditional Variational Models: Too Complex for Multiagent Trajectories? / Rudolph, Yannick; Brefeld, Ulf; Dick, Uwe.
In: Proceedings of Machine Learning Research, Vol. 137, 2020, p. 136-147.

Bibtex

@article{210f9c82d3b5449c990ce45577c5185a,
title = "Graph Conditional Variational Models: Too Complex for Multiagent Trajectories?",
abstract = "Recent advances in modeling multiagent trajectories combine graph architectures such as graph neural networks (GNNs) with conditional variational models (CVMs) such as variational RNNs (VRNNs). Originally, CVMs have been proposed to facilitate learning with multi-modal and structured data and thus seem to perfectly match the requirements of multi-modal multiagent trajectories with their structured output spaces. Empirical results of VRNNs on trajectory data support this assumption. In this paper, we revisit experiments and proposed architectures with additional rigour, ablation runs and baselines. In contrast to common belief, we show that prior results with CVMs on trajectory data might be misleading. Given a neural network with a graph architecture and/or structured output function, variational autoencoding does not seem to contribute statistically significantly to empirical performance. Instead, we show that well-known emission functions do contribute, while coming with less complexity, engineering and computation time. ",
keywords = "Informatics, Business informatics",
author = "Yannick Rudolph and Ulf Brefeld and Uwe Dick",
note = "Publisher Copyright: {\textcopyright} Proceedings of Machine Learning Research 2020.; 34th Conference on Neural Information Processing Systems - NeurIPS 2020: Neural Information Processing Systems Online Conference 2020, NeurIPS 2020; Conference date: 06-12-2020 Through 12-12-2020",
year = "2020",
language = "English",
volume = "137",
pages = "136--147",
journal = "Proceedings of Machine Learning Research",
issn = "2640-3498",
publisher = "MLResearch Press",
url = "https://neurips.cc/virtual/2020/public/index.html, https://proceedings.mlr.press/v137/",

}

RIS

TY - JOUR

T1 - Graph Conditional Variational Models: Too Complex for Multiagent Trajectories?

AU - Rudolph, Yannick

AU - Brefeld, Ulf

AU - Dick, Uwe

N1 - Conference code: 34
N1 - 34th Conference on Neural Information Processing Systems - NeurIPS 2020

PY - 2020

Y1 - 2020

N2 - Recent advances in modeling multiagent trajectories combine graph architectures such as graph neural networks (GNNs) with conditional variational models (CVMs) such as variational RNNs (VRNNs). Originally, CVMs have been proposed to facilitate learning with multi-modal and structured data and thus seem to perfectly match the requirements of multi-modal multiagent trajectories with their structured output spaces. Empirical results of VRNNs on trajectory data support this assumption. In this paper, we revisit experiments and proposed architectures with additional rigour, ablation runs and baselines. In contrast to common belief, we show that prior results with CVMs on trajectory data might be misleading. Given a neural network with a graph architecture and/or structured output function, variational autoencoding does not seem to contribute statistically significantly to empirical performance. Instead, we show that well-known emission functions do contribute, while coming with less complexity, engineering and computation time.

AB - Recent advances in modeling multiagent trajectories combine graph architectures such as graph neural networks (GNNs) with conditional variational models (CVMs) such as variational RNNs (VRNNs). Originally, CVMs have been proposed to facilitate learning with multi-modal and structured data and thus seem to perfectly match the requirements of multi-modal multiagent trajectories with their structured output spaces. Empirical results of VRNNs on trajectory data support this assumption. In this paper, we revisit experiments and proposed architectures with additional rigour, ablation runs and baselines. In contrast to common belief, we show that prior results with CVMs on trajectory data might be misleading. Given a neural network with a graph architecture and/or structured output function, variational autoencoding does not seem to contribute statistically significantly to empirical performance. Instead, we show that well-known emission functions do contribute, while coming with less complexity, engineering and computation time.

KW - Informatics

KW - Business informatics

UR - http://www.scopus.com/inward/record.url?scp=85163213279&partnerID=8YFLogxK

M3 - Conference article in journal

VL - 137

SP - 136

EP - 147

JO - Proceedings of Machine Learning Research

JF - Proceedings of Machine Learning Research

SN - 2640-3498

T2 - 34th Conference on Neural Information Processing Systems - NeurIPS 2020

Y2 - 6 December 2020 through 12 December 2020

ER -
