Cross-lingual transfer learning for statistical type inference

Zhiming Li, Xiaofei Xie*, Haoliang Li, Zhengzi Xu, Yi Li, Yang Liu

*Corresponding author for this work

Research output: Chapters, Conference Papers, Creative and Literary Works › RGC 32 - Refereed conference paper (with host publication) › peer-review

2 Citations (Scopus)

Abstract

Hitherto, statistical type inference systems have relied entirely on supervised learning approaches, which require laborious manual effort to collect and label large amounts of data. Most Turing-complete imperative languages share similar control- and data-flow structures, which makes it possible to transfer knowledge learned from one language to another. In this paper, we propose a cross-lingual transfer learning framework, PLATO, for statistical type inference, which allows us to leverage prior knowledge learned from the labeled dataset of one language and transfer it to others, e.g., Python to JavaScript, Java to JavaScript, etc. PLATO is powered by a novel kernelized attention mechanism that constrains the attention scope of the backbone Transformer model so that the model is forced to base its predictions on features commonly shared among languages. In addition, we propose a syntax enhancement that augments learning on the feature overlap among language domains. Furthermore, PLATO can also improve the performance of conventional supervised type inference by introducing cross-lingual augmentation, which enables the model to learn more general features across multiple languages. We evaluated PLATO under two settings: 1) in the cross-domain scenario, where the target-language data is unlabeled or only partially labeled, PLATO outperforms state-of-the-art domain transfer techniques by a large margin, e.g., it improves the Python-to-TypeScript baseline by +14.6%@EM and +18.6%@weighted-F1; and 2) in the conventional monolingual supervised scenario, PLATO improves the Python baseline by +4.10%@EM and +1.90%@weighted-F1 with the introduction of cross-lingual augmentation.
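The core idea the abstract describes is restricting where the Transformer may attend so that predictions rest on features shared across languages. As a rough illustration only, the sketch below shows scaled dot-product attention with a boolean kernel mask that zeroes out attention to positions outside a permitted scope (here a simple band mask stands in for the kernel; the paper's actual kernel construction, and all names in this snippet, are hypothetical simplifications, not PLATO's implementation):

```python
import numpy as np

def kernelized_attention(Q, K, V, kernel_mask):
    """Scaled dot-product attention whose scope is constrained by a
    boolean kernel mask: positions where kernel_mask is False receive
    (effectively) zero attention weight."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)               # (n, n) raw attention scores
    scores = np.where(kernel_mask, scores, -1e9)  # block out-of-scope positions
    # numerically stable softmax over each query's in-scope keys
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V

# Toy example: 4 tokens, dimension 8, band mask of radius 1 as the "kernel"
rng = np.random.default_rng(0)
n, d = 4, 8
Q, K, V = (rng.standard_normal((n, d)) for _ in range(3))
idx = np.arange(n)
kernel_mask = np.abs(idx[:, None] - idx[None, :]) <= 1
out = kernelized_attention(Q, K, V, kernel_mask)
```

The `-1e9` fill is the standard masking trick: after the softmax, out-of-scope positions contribute essentially nothing, so the model can only aggregate information from the positions the kernel admits.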
Original language: English
Title of host publication: ISSTA '22 - Proceedings of the 31st ACM SIGSOFT International Symposium on Software Testing and Analysis
Editors: Sukyoung Ryu, Yannis Smaragdakis
Place of Publication: New York
Publisher: Association for Computing Machinery
Pages: 239-250
ISBN (Print): 9781450393799
DOIs
Publication status: Published - 2022
Event: 31st ACM SIGSOFT International Symposium on Software Testing and Analysis (ISSTA 2022) - Virtual, Online, Korea, Republic of
Duration: 18 Jul 2022 - 22 Jul 2022
https://conf.researchr.org/home/issta-2022

Publication series

Name: ISSTA - Proceedings of the ACM SIGSOFT International Symposium on Software Testing and Analysis

Conference

Conference: 31st ACM SIGSOFT International Symposium on Software Testing and Analysis (ISSTA 2022)
Abbreviated title: ISSTA ’22
Place: Korea, Republic of
Period: 18/07/22 - 22/07/22
Internet address

Bibliographical note

Full text of this publication does not contain sufficient affiliation information. With consent from the author(s) concerned, the Research Unit(s) information for this record is based on the existing academic department affiliation of the author(s).

Research Keywords

  • Deep Learning
  • Transfer Learning
  • Type Inference
