Symmetry-guided gradient descent for quantum neural networks

Research output: Journal Publications and Reviews / RGC 21 - Publication in refereed journal / peer-review


Author(s)

Bian, Kaiming; Zhang, Shitao; Meng, Fei et al.

Detail(s)

Original language: English
Article number: 022406
Journal / Publication: Physical Review A
Volume: 110
Issue number: 2
Online published: 5 Aug 2024
Publication status: Published - Aug 2024

Abstract

Many supervised learning tasks have intrinsic symmetries, such as translational and rotational symmetry in image classification. These symmetries can be exploited to enhance performance. We formulate the symmetry constraints in a concise mathematical form and design two ways to incorporate them into the cost function, thereby shaping the cost landscape in favor of parameter choices that respect the given symmetry. Unlike methods that alter the quantum neural network's circuit ansatz to impose symmetry, our method changes only the classical postprocessing of gradient descent, which is simpler to implement. We call the method symmetry-guided gradient descent (SGGD). We illustrate SGGD on entanglement classification of Werner states and on two classification tasks in a two-dimensional feature space. In both cases, the results show that SGGD can accelerate training, improve generalization, and remove vanishing gradients, especially when the training data is biased. © 2024 American Physical Society.
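The abstract does not spell out the constraints' explicit mathematical form, so the following is only a rough sketch of the penalty-style strategy it describes: a symmetry-violation term is added to the cost, reshaping the landscape in favor of symmetry-respecting parameters, and ordinary gradient descent is run on the result. Every concrete choice here (the toy surrogate model in base_cost, the reflection constraint in symmetry_violation, the penalty weight mu) is a hypothetical stand-in, not the authors' construction.

```python
import numpy as np

def base_cost(theta, data, labels):
    """Stand-in for the QNN's empirical loss on the training set."""
    preds = np.tanh(data @ theta)  # toy classical surrogate model
    return np.mean((preds - labels) ** 2)

def symmetry_violation(theta):
    """Toy constraint: favor parameters obeying the 'reflection' symmetry
    theta[i] == theta[n-1-i]; the paper's constraints would replace this."""
    return np.sum((theta - theta[::-1]) ** 2)

def guided_cost(theta, data, labels, mu=0.5):
    # Symmetry-guided cost: original cost plus a weighted violation penalty,
    # one of the two incorporation strategies the abstract mentions.
    return base_cost(theta, data, labels) + mu * symmetry_violation(theta)

def numerical_grad(f, theta, eps=1e-6):
    """Central finite differences; a real QNN would use parameter-shift rules."""
    grad = np.zeros_like(theta)
    for i in range(theta.size):
        e = np.zeros_like(theta)
        e[i] = eps
        grad[i] = (f(theta + e) - f(theta - e)) / (2 * eps)
    return grad

rng = np.random.default_rng(0)
data = rng.normal(size=(64, 4))
labels = np.sign(data @ np.ones(4))  # toy labels for illustration
theta = rng.normal(size=4)

for step in range(200):
    # Only the classical postprocessing changes: the same gradient-descent
    # update is applied to the reshaped cost; the circuit ansatz is untouched.
    g = numerical_grad(lambda t: guided_cost(t, data, labels), theta)
    theta -= 0.1 * g

print("trained parameters:", theta)
```

After training, theta converges toward a (near-)reflection-symmetric configuration, illustrating how the penalty biases optimization toward symmetry-respecting parameters without modifying the model itself.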

Citation Format(s)

Symmetry-guided gradient descent for quantum neural networks. / Bian, Kaiming; Zhang, Shitao; Meng, Fei et al.
In: Physical Review A, Vol. 110, No. 2, 022406, 08.2024.
