Observations on Guitar Music Produced by AI Reverberation and Professional Sound Engineer

Research output: Journal Publications and Reviews (RGC: 21, 22, 62) - Publication in refereed journal, peer-reviewed

Detail(s)

Original language: English
Pages (from-to): 1-7
Number of pages: 7
Journal / Publication: International Journal of Music Science, Technology and Art
Volume: 5
Issue number: 1
Online published: 16 Jan 2023
Publication status: Published - Jan 2023

Abstract

Artificial intelligence (AI) technologies have been applied in music production to create various sound effects, including reverberation. However, such applications have not yet been fully explored in research studies. This paper reports results from a study comparing reverberation processing applied to six guitar recordings, each with musical phrasing, by AI software and by two professional sound engineers. Audio features were extracted using the MIR Toolbox, and perceptual ratings on semantic scales were collected in two listening tests (N = 10, N = 33). Logistic regression was carried out on the two datasets in parallel. An increase in perceived Wetness or a decrease in perceived Clarity was associated with a higher probability that the reverberation was produced by the AI rather than by a human. Among the extracted audio features, lower Brightness, Rolloff, and Centroid (all indicators of a darker, low-frequency-emphasized sound) were associated with a higher probability of AI processing. This study contributes to an understanding of the differences between AI- and human-generated audio effects used in music production. Copyright © 2023 Author et al., licensed to IJMSTA.
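The pipeline described in the abstract (extract spectral descriptors, then fit a logistic regression on them) can be sketched roughly as below. This is not the authors' code: the study used the MATLAB MIR Toolbox on real guitar recordings, whereas this sketch computes analogous Centroid, Rolloff, and Brightness descriptors from scratch with NumPy on synthetic tones and fits the classifier with scikit-learn. The 1500 Hz brightness cutoff and 85% rolloff threshold are assumptions (common MIR Toolbox defaults; the paper's settings are not stated), and the synthetic "darker = AI" labels only mimic the reported association.

```python
# Illustrative sketch only, not the paper's implementation.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

SR = 22050  # sample rate in Hz

def spectral_features(y, sr=SR):
    """Centroid (Hz), rolloff at 85% energy (Hz), brightness above 1500 Hz."""
    mag = np.abs(np.fft.rfft(y))
    freqs = np.fft.rfftfreq(len(y), d=1.0 / sr)
    centroid = np.sum(freqs * mag) / np.sum(mag)
    energy = mag ** 2
    cum = np.cumsum(energy) / np.sum(energy)
    rolloff = freqs[np.searchsorted(cum, 0.85)]      # 85% energy threshold
    brightness = energy[freqs >= 1500].sum() / energy.sum()
    return [centroid, rolloff, brightness]

def make_clip(dark, seed):
    """Synthetic harmonic tone: darker clips have faster harmonic decay."""
    rng = np.random.default_rng(seed)
    t = np.arange(SR) / SR
    decay = 1.5 if dark else 0.3
    y = sum(np.exp(-decay * k) * np.sin(2 * np.pi * 220 * k * t)
            for k in range(1, 20))
    return y + 0.01 * rng.standard_normal(SR)

# Label 1 = "AI" (darker, per the reported finding), 0 = "human" (brighter).
X = [spectral_features(make_clip(dark=i % 2 == 1, seed=i)) for i in range(8)]
labels = [i % 2 for i in range(8)]
model = make_pipeline(StandardScaler(), LogisticRegression()).fit(X, labels)

# A darker held-out clip should score a higher "AI" probability.
p_dark = model.predict_proba([spectral_features(make_clip(True, 99))])[0, 1]
p_bright = model.predict_proba([spectral_features(make_clip(False, 99))])[0, 1]
print(p_dark > p_bright)
```

The logistic-regression direction here mirrors the abstract's result: lower centroid, rolloff, and brightness push the predicted probability toward the "AI" class.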

Research Area(s)

  • Music Perception, Reverberation, Listening Test
