A Symmetric Metamorphic Relations Approach Supporting LLM for Education Technology

Research output: Chapters, Conference Papers, Creative and Literary Works › RGC 32 - Refereed conference paper (with host publication) › peer-review



Detail(s)

Original language: English
Title of host publication: Proceedings - 2024 International Symposium on Educational Technology (ISET 2024)
Publisher: Institute of Electrical and Electronics Engineers, Inc.
Publication status: Published - Jul 2024

Conference

Title: 10th International Symposium on Educational Technology (ISET 2024)
Location: The Galaxy International Convention Center
Place: Macao
City: Macao
Period: 29 July - 1 August 2024

Abstract

Question-Answering (Q&A) educational websites are widely used as self-learning platforms, and pre-trained large language models (LLMs) play a crucial role in maintaining their content quality. Despite their usefulness, LLMs still fall short of human performance. To tackle this issue, we propose leveraging symmetric Metamorphic Relations (MRs) to enhance LLMs' performance by improving their machine common sense, with the goal of ensuring that learners receive more relevant content. This work presents an empirical experiment using one specific symmetric MR, three LLMs, and a publicly available dataset of labelled Stack Overflow data. We employ the symmetric MR to generate training data that augments the machine common sense of LLMs. Additionally, we prepare a separate set of training data consisting of labelled Stack Overflow data for comparison purposes. By comparing the results of a common ability test and the predictions made by LLMs trained with the different training datasets, we assess the practicality of our proposed approach. Our experimental results demonstrate that a BERT-based LLM trained with MR-generated data outperforms a BERT-based LLM trained solely with regular labelled data. This outcome highlights the effectiveness of symmetric MRs in enhancing LLMs' performance by improving their machine common sense. Subsequent studies can extend our approach to other domains related to education technology and explore additional MRs to further enhance the learning experience of students.
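To illustrate the idea of MR-based data augmentation described above, the following is a minimal sketch. The abstract does not specify the concrete symmetric MR used in the paper, so this example assumes a hypothetical relation of the form "swapping the order of two paired text segments of a labelled Stack Overflow post must not change its content-quality label"; all class names, field names, and labels are illustrative, not the authors' implementation.

```python
# Sketch of symmetric-MR-based training data augmentation (illustrative only).
# Assumed MR: swapping the two text segments of a labelled post does not change
# its quality label, so each source example yields one follow-up example with
# the same label.

from dataclasses import dataclass
from typing import List


@dataclass
class LabelledPost:
    """A labelled post: two text segments plus a content-quality label."""
    title: str
    body: str
    label: str  # hypothetical label set, e.g. "high_quality" / "low_quality"


def apply_symmetric_mr(post: LabelledPost) -> LabelledPost:
    """Generate the follow-up example by swapping the two segments.

    Under the assumed symmetric MR, the source and follow-up examples must
    receive the same label, so the original label is carried over unchanged.
    """
    return LabelledPost(title=post.body, body=post.title, label=post.label)


def augment_with_mr(dataset: List[LabelledPost]) -> List[LabelledPost]:
    """Return the original examples plus their MR-generated follow-ups."""
    augmented: List[LabelledPost] = []
    for post in dataset:
        augmented.append(post)
        augmented.append(apply_symmetric_mr(post))
    return augmented


if __name__ == "__main__":
    seed = [
        LabelledPost(
            "How do I reverse a list in Python?",
            "Use reversed() or slicing with [::-1].",
            "high_quality",
        )
    ]
    for example in augment_with_mr(seed):
        print(example)
```

The augmented set (original examples plus MR follow-ups) would then be used to fine-tune a BERT-based classifier, which the paper compares against the same model fine-tuned on the regular labelled data alone.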

Research Area(s)

  • Content quality prediction, large language model, question-answering (Q&A) website, metamorphic relations, machine common sense, natural language processing, data augmentation

Bibliographic Note

Since this conference is yet to commence, the information for this record is subject to revision.

Citation Format(s)

A Symmetric Metamorphic Relations Approach Supporting LLM for Education Technology. / CHAN, Pak Yuen Patrick; Keung, Jacky.
Proceedings - 2024 International Symposium on Educational Technology (ISET 2024). Institute of Electrical and Electronics Engineers, Inc., 2024.
