We present a simplified computational rule for the back-propagation formulas of artificial neural networks. Specifically, we provide a generic two-step rule for the back-propagation algorithm in matrix notation, one that incorporates both the forward and backward phases of the computations involved in learning. This recursive rule propagates the changes to all synaptic weights in the network, layer by layer, efficiently. In particular, we use it to compute both the up and down partial derivatives of the cost function for all the connections feeding into the output layer.
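As a point of reference for the layer-by-layer forward/backward computation the abstract describes, the following is a minimal sketch of back-propagation in matrix notation for a small feed-forward network. It is a generic illustration, not the paper's specific two-step rule; the architecture, activation choice, and all variable names are assumptions.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
X = rng.standard_normal((4, 3))          # 4 samples, 3 input features
y = rng.standard_normal((4, 2))          # 2 target outputs per sample
W1 = rng.standard_normal((3, 5)) * 0.1   # input -> hidden weights
W2 = rng.standard_normal((5, 2)) * 0.1   # hidden -> output weights

# Forward phase: propagate activations layer by layer.
Z1 = X @ W1
A1 = sigmoid(Z1)
Z2 = A1 @ W2                             # linear output layer
cost = 0.5 * np.mean(np.sum((Z2 - y) ** 2, axis=1))

# Backward phase: propagate error terms (deltas) layer by layer.
delta2 = (Z2 - y) / X.shape[0]           # dC/dZ2 for the mean squared-error cost
grad_W2 = A1.T @ delta2                  # dC/dW2: gradients into the output layer
delta1 = (delta2 @ W2.T) * A1 * (1 - A1) # dC/dZ1 via the chain rule
grad_W1 = X.T @ delta1                   # dC/dW1: gradients into the hidden layer
```

Each backward step reuses the quantities cached during the forward pass (`X`, `A1`), which is what makes the recursive, layer-by-layer formulation efficient.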
Keywords: Artificial Neural Networks, Deep Learning, Back-propagation, Feed-Forward
| Primary Language | English |
|---|---|
| Subjects | Software Engineering (Other) |
| Section | Articles |
| Authors | |
| Early View Date | June 3, 2023 |
| Publication Date | June 16, 2023 |
| Acceptance Date | May 8, 2023 |
| Published in Issue | Year 2023, Volume: 6, Issue: 1 |
International Journal of Informatics and Applied Mathematics