TY - CHAP
T1 - Analysis on the Boltzmann Machine with Random Input Drifts in Activation Function
AU - Lu, Wenhao
AU - Leung, Chi-Sing
AU - Sum, John
PY - 2020
Y1 - 2020
N2 - The Boltzmann machine (BM) model is able to learn the probability distribution of input patterns. However, in analog realization, there are thermal noise and random offset voltages of amplifiers. These realization issues affect the behaviour of the neurons’ activation function, and they can be modelled as random input drifts. This paper analyzes the activation function and state distribution of BMs under the random input drift model. Since the state of a neuron is determined by its activation function, random input drifts may change the behaviour of a BM. We show that the effect of random input drifts is equivalent to raising the temperature factor. Hence, from the Kullback–Leibler (KL) divergence perspective, we propose a compensation scheme to reduce the effect of random input drifts. In our derivation of the compensation scheme, we assume that the input drift follows a Gaussian distribution. Surprisingly, our simulations show that the proposed compensation scheme also works very well for other distributions.
AB - The Boltzmann machine (BM) model is able to learn the probability distribution of input patterns. However, in analog realization, there are thermal noise and random offset voltages of amplifiers. These realization issues affect the behaviour of the neurons’ activation function, and they can be modelled as random input drifts. This paper analyzes the activation function and state distribution of BMs under the random input drift model. Since the state of a neuron is determined by its activation function, random input drifts may change the behaviour of a BM. We show that the effect of random input drifts is equivalent to raising the temperature factor. Hence, from the Kullback–Leibler (KL) divergence perspective, we propose a compensation scheme to reduce the effect of random input drifts. In our derivation of the compensation scheme, we assume that the input drift follows a Gaussian distribution. Surprisingly, our simulations show that the proposed compensation scheme also works very well for other distributions.
KW - Activation function
KW - Boltzmann machine
KW - Noise
KW - State distribution
UR - http://www.scopus.com/inward/record.url?scp=85097440054&partnerID=8YFLogxK
UR - https://www.scopus.com/record/pubmetrics.uri?eid=2-s2.0-85097440054&origin=recordpage
U2 - 10.1007/978-3-030-63836-8_14
DO - 10.1007/978-3-030-63836-8_14
M3 - RGC 12 - Chapter in an edited book (Author)
SN - 978-3-030-63835-1
VL - Part III
T3 - Lecture Notes in Computer Science (including subseries Theoretical Computer Science and General Issues)
SP - 162
EP - 171
BT - Neural Information Processing
A2 - Yang, Haiqin
A2 - Pasupa, Kitsuchart
A2 - Leung, Andrew Chi-Sing
A2 - Kwok, James T.
A2 - Chan, Jonathan H.
A2 - King, Irwin
PB - Springer, Cham
T2 - 27th International Conference on Neural Information Processing (ICONIP 2020)
Y2 - 18 November 2020 through 22 November 2020
ER -