Abstract
Novel concepts are essential for design innovation and can be generated with the aid of data stimuli and computers. However, current generative design algorithms focus on diagrammatic or spatial concepts that are either too abstract to understand or too detailed for early-phase design exploration. This paper explores the use of generative pre-trained transformers (GPT) for natural language design concept generation. Our experiments involve the use of GPT-2 and GPT-3 for different creative reasoning tasks in design. Both show reasonably good performance for verbal design concept generation. © The Author(s), 2022.
| Original language | English |
|---|---|
| Title of host publication | INTERNATIONAL DESIGN CONFERENCE - DESIGN 2022 |
| Editors | Mario Štorga, Stanko Škec, Tomislav Martinec, Dorian Marjanović |
| Publisher | Cambridge University Press |
| Pages | 1825-1834 |
| DOIs | |
| Publication status | Published - May 2022 |
| Externally published | Yes |
| Event | 17th International Design Conference (DESIGN 2022) - Virtual, Croatia |
| Duration | 23 May 2022 → 26 May 2022 |
| Internet address | https://www.designconference.org/past-events |
Publication series
| Name | Proceedings of the Design Society |
|---|---|
| Volume | 2 |
| ISSN (Print) | 2732-527X |
Conference
| Conference | 17th International Design Conference (DESIGN 2022) |
|---|---|
| Place | Croatia |
| Period | 23/05/22 → 26/05/22 |
| Internet address | https://www.designconference.org/past-events |
Research Keywords
- early design phase
- generative design
- generative pre-trained transformer
- idea generation
- natural language generation
Publisher's Copyright Statement
- This full text is made available under CC-BY-NC-ND 4.0. https://creativecommons.org/licenses/by-nc-nd/4.0/