Backpropagation through structure
Backpropagation through structure (BPTS) is a gradient-based technique for training recursive neural networks, of which recurrent neural networks are a special case. It was described in a 1996 paper by Christoph Goller and Andreas Küchler.[1]
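The core idea can be illustrated with a minimal sketch: a recursive network applies the same weight matrix at every internal node of a tree to combine child representations, and BPTS pushes the loss gradient back down the tree, accumulating gradients for the shared weights at each node. The code below is an illustrative sketch, not the paper's implementation; all names (`Node`, `forward`, `backward`) and the toy loss are assumptions made for the example.

```python
import numpy as np

D = 2  # dimensionality of node representations (chosen for the sketch)
rng = np.random.default_rng(0)
W = rng.standard_normal((D, 2 * D)) * 0.1  # shared weights, reused at every node

class Node:
    """A binary-tree node: leaves carry a feature vector, internal nodes combine children."""
    def __init__(self, left=None, right=None, vec=None):
        self.left, self.right, self.vec = left, right, vec
        self.h = None   # representation computed in the forward pass
        self.z = None   # concatenated child representations (internal nodes only)

def forward(node):
    if node.vec is not None:                 # leaf: representation is its feature vector
        node.h = node.vec
    else:                                    # internal node: h = tanh(W [h_left; h_right])
        forward(node.left)
        forward(node.right)
        node.z = np.concatenate([node.left.h, node.right.h])
        node.h = np.tanh(W @ node.z)
    return node.h

def backward(node, dh, dW):
    """Push the gradient dh at this node down the tree, accumulating
    the gradient of the shared weight matrix W (this is the BPTS step)."""
    if node.vec is not None:                 # leaf: no parameters below
        return
    da = dh * (1.0 - node.h ** 2)            # backprop through tanh
    dW += np.outer(da, node.z)               # shared W: gradients sum over all nodes
    dz = W.T @ da
    backward(node.left, dz[:D], dW)
    backward(node.right, dz[D:], dW)

# Toy tree ((x1, x2), x3) with loss L = 0.5 * ||h_root||^2, so dL/dh_root = h_root.
x1, x2, x3 = (rng.standard_normal(D) for _ in range(3))
root = Node(Node(Node(vec=x1), Node(vec=x2)), Node(vec=x3))
h_root = forward(root)
dW = np.zeros_like(W)
backward(root, h_root, dW)
print(dW.shape)  # gradient for the single shared weight matrix
```

Because the same `W` appears at every internal node, its gradient is the sum of per-node contributions, exactly as in backpropagation through time for recurrent networks, but following the topology of an arbitrary tree rather than a chain.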
Notes and References
1. Goller, Christoph; Küchler, Andreas (1996). "Learning Task-Dependent Distributed Representations by Backpropagation Through Structure". Proceedings of International Conference on Neural Networks (ICNN'96). Vol. 1. pp. 347–352. doi:10.1109/ICNN.1996.548916. CiteSeerX 10.1.1.49.1968. S2CID 6536466. ISBN 0-7803-3210-5.