Proceedings of the 2023 4th International Conference on Education, Knowledge and Information Management (ICEKIM 2023)

GTT-Bert: Pre-training of Graph-To-Tree Model for Math Word Problems

Authors
Ruolin Dou1, *, Dong Liang1, *, Nan Wang1, *, Junxuan Wang1, *
1Beijing University of Posts and Telecommunications, Beijing, China
*Corresponding authors. Emails: idouruolin@bupt.edu.cn, liangdong@bupt.edu.cn, wn98@bupt.edu.cn, Wjunxuan@bupt.edu.cn
Available Online 30 June 2023.
DOI
10.2991/978-94-6463-172-2_121
Keywords
education; math word problem; natural language processing; pre-training model; representation learning
Abstract

Math word problems (MWPs) are an important task in intelligent education and natural language processing. Existing models for solving MWPs mainly follow sequence-to-sequence (Seq2Seq), sequence-to-tree (Seq2Tree), or graph-to-tree (Graph2Tree) approaches. Graph2Tree models capture the relationships and ordering among quantities well. However, existing Graph2Tree models usually represent the input text sequence as word vectors through a plain embedding layer, which fails to capture the numerical attributes of quantities and their contextual information. We propose a pre-training model based on the Graph2Tree structure. Experimental results show that a Graph2Tree model equipped with our pre-training model significantly outperforms existing Graph2Tree models.
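As a minimal sketch of the idea described in the abstract: the plain embedding layer of a Graph2Tree solver is replaced by a pre-trained BERT encoder, so each input token (including number tokens) receives a contextual representation rather than a static word vector. The checkpoint name, the example problem, and the downstream steps noted in comments are illustrative assumptions, not the paper's actual code or configuration.

import torch
from transformers import AutoTokenizer, AutoModel

# Hypothetical checkpoint choice; the abstract does not name a specific one.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
encoder = AutoModel.from_pretrained("bert-base-uncased")

problem = "Tom has 3 apples and buys 5 more. How many apples does he have now?"

# Tokenize and obtain contextual token representations. Unlike a static
# embedding layer, each number token's vector reflects its role in context.
inputs = tokenizer(problem, return_tensors="pt")
with torch.no_grad():
    outputs = encoder(**inputs)

token_reprs = outputs.last_hidden_state  # shape: (1, seq_len, hidden_size)

# A Graph2Tree solver would next build a quantity graph over the number
# tokens, use these contextual vectors as initial node features, and decode
# the solution expression with a tree-structured decoder.
print(token_reprs.shape)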

Copyright
© 2023 The Author(s)
Open Access
Open Access This chapter is licensed under the terms of the Creative Commons Attribution-NonCommercial 4.0 International License (http://creativecommons.org/licenses/by-nc/4.0/), which permits any noncommercial use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license and indicate if changes were made.


Volume Title
Proceedings of the 2023 4th International Conference on Education, Knowledge and Information Management (ICEKIM 2023)
Series
Atlantis Highlights in Computer Sciences
Publication Date
30 June 2023
ISBN
978-94-6463-172-2
ISSN
2589-4900

Cite this article

TY  - CONF
AU  - Ruolin Dou
AU  - Dong Liang
AU  - Nan Wang
AU  - Junxuan Wang
PY  - 2023
DA  - 2023/06/30
TI  - GTT-Bert: Pre-training of Graph-To-Tree Model for Math Word Problems
BT  - Proceedings of the 2023 4th International Conference on Education, Knowledge and Information Management (ICEKIM 2023)
PB  - Atlantis Press
SP  - 1151
EP  - 1158
SN  - 2589-4900
UR  - https://doi.org/10.2991/978-94-6463-172-2_121
DO  - 10.2991/978-94-6463-172-2_121
ID  - Dou2023
ER  -