Abstract
Chart-to-code generation is a critical task in automated data visualization, translating complex chart structures into executable programs. While recent Multi-modal Large Language Models (MLLMs) improve chart representation, existing approaches still struggle to achieve cross-type generalization, memory efficiency, and modular design. To address these challenges, this paper proposes C2C-MoLA, a multimodal framework that combines Mixture of Experts (MoE) with Low-Rank Adaptation (LoRA). The MoE component uses a complexity-aware routing mechanism with domain-specialized experts and load-balanced sparse gating, dynamically allocating inputs to experts based on learnable structural metrics such as element count and chart complexity. LoRA enables parameter-efficient updates for resource-conscious tuning, further supported by a tailored training strategy that aligns routing stability with semantic accuracy. Experiments on Chart2Code-160k show that the proposed model improves generation accuracy by up to 17%, reduces peak GPU memory by 18%, and accelerates convergence by 20% compared with standard fine-tuning and LoRA-only baselines, particularly on complex charts. Ablation studies validate key design choices, such as using 8 experts and rank-8 LoRA, and confirm scalability for real-world multimodal code generation.
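The abstract does not include an implementation, but a minimal PyTorch sketch may help make the MoE-plus-LoRA combination concrete: each expert is a low-rank (LoRA) residual over a shared frozen projection, and a sparse top-k router conditions on token features plus structural metrics, with a load-balancing auxiliary loss. All names here (`LoRAExpert`, `MoLALayer`, `struct_feats`) and the choice of two structural features are illustrative assumptions, not the paper's actual code; the Switch-Transformer-style balancing loss is a stand-in for whatever load-balancing objective the authors use.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class LoRAExpert(nn.Module):
    """One expert = a rank-r low-rank residual (B @ A) on top of a shared frozen projection."""
    def __init__(self, d_in: int, d_out: int, rank: int = 8):
        super().__init__()
        self.A = nn.Parameter(torch.randn(rank, d_in) * 0.01)  # down-projection
        self.B = nn.Parameter(torch.zeros(d_out, rank))        # up-projection, zero-init

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x @ self.A.T @ self.B.T

class MoLALayer(nn.Module):
    """Sparse top-k mixture of LoRA experts with complexity-aware routing (illustrative)."""
    def __init__(self, d_in: int, d_out: int, num_experts: int = 8,
                 rank: int = 8, top_k: int = 2, num_struct_feats: int = 2):
        super().__init__()
        self.base = nn.Linear(d_in, d_out)            # frozen backbone weight
        self.base.weight.requires_grad_(False)
        self.base.bias.requires_grad_(False)
        self.experts = nn.ModuleList(
            LoRAExpert(d_in, d_out, rank) for _ in range(num_experts))
        # Router conditions on token features plus structural metrics
        # (e.g. element count and a chart-complexity score -- assumed here).
        self.router = nn.Linear(d_in + num_struct_feats, num_experts)
        self.top_k = top_k

    def forward(self, x: torch.Tensor, struct_feats: torch.Tensor):
        # x: (tokens, d_in); struct_feats: (tokens, num_struct_feats)
        probs = F.softmax(self.router(torch.cat([x, struct_feats], -1)), -1)
        weights, idx = probs.topk(self.top_k, dim=-1)           # (tokens, k)
        weights = weights / weights.sum(-1, keepdim=True)       # renormalize top-k

        # Dense evaluation for clarity; a real implementation would
        # dispatch each token only to its selected top-k experts.
        expert_out = torch.stack([e(x) for e in self.experts], dim=1)  # (T, E, d_out)
        sparse_w = torch.zeros_like(probs).scatter(-1, idx, weights)   # (T, E)
        out = self.base(x) + (sparse_w.unsqueeze(-1) * expert_out).sum(1)

        # Switch-style load-balancing auxiliary loss: fraction of tokens
        # whose top-1 choice is expert e, times mean router prob for e.
        frac = F.one_hot(idx[:, 0], probs.size(-1)).float().mean(0)
        aux_loss = (frac * probs.mean(0)).sum() * probs.size(-1)
        return out, aux_loss
```

The defaults (8 experts, rank-8 LoRA, top-2 routing) mirror the configuration the abstract's ablation study reports as optimal; only the router and the low-rank `A`/`B` matrices are trainable, which is what keeps the update parameter-efficient.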
| Original language | English |
|---|---|
| Title of host publication | 32nd Asia-Pacific Software Engineering Conference (APSEC 2025) |
| Publisher | IEEE |
| Number of pages | 12 |
| Publication status | Presented - 5 Dec 2025 |
| Event | 32nd Asia-Pacific Software Engineering Conference (APSEC 2025), Wynn Palace, Macao, 2 Dec 2025 → 5 Dec 2025 (https://conf.researchr.org/home/apsec-2025) |
Conference
| Conference | 32nd Asia-Pacific Software Engineering Conference (APSEC 2025) |
|---|---|
| Abbreviated title | APSEC 2025 |
| City | Macao |
| Period | 2/12/25 → 5/12/25 |
| Internet address | https://conf.researchr.org/home/apsec-2025 |
Research Keywords
- Chart-to-Code Generation
- Multi-Modal Learning
- Mixture of Experts (MoE)
- Low-Rank Adaptation (LoRA)