In machine learning, the backpropagation algorithm is used to compute the gradients of the loss with respect to a model's parameters. The most common starting point is to use the techniques of single-variable calculus to understand how backpropagation works. However, the real challenge comes when the inputs are not scalars but vectors, matrices, or tensors. By Strahinja Stefanovic.
In this post, we will learn how to deal with inputs such as vectors, matrices, and tensors of higher ranks. We will see how backpropagation with vectors and tensors is performed in computational graphs, using both single-variable and multi-variable derivatives. In this tutorial you will find:
- Vector Derivatives
- Backpropagation with Vectors
- Backpropagation with Tensors
- Backpropagation with Vectors and Tensors in Python using PyTorch
This is a very detailed article with charts, algorithms, and PyTorch code examples explaining important concepts. Ideal for any data scientist!
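As a quick taste of what the tutorial covers, here is a minimal sketch of backpropagation through a vector input in PyTorch. The specific tensors and values are illustrative assumptions, not taken from the article itself:

```python
import torch

# A vector input and a matrix of weights, both tracked by autograd
# (hypothetical values chosen for illustration).
x = torch.tensor([1.0, 2.0, 3.0], requires_grad=True)
W = torch.tensor([[1.0, 0.0, 0.0],
                  [0.0, 2.0, 0.0]], requires_grad=True)

y = W @ x        # matrix-vector product: y has shape (2,)
loss = y.sum()   # reduce to a scalar so backward() can be called
loss.backward()  # backpropagation fills x.grad and W.grad

print(x.grad)    # dloss/dx = W.T @ ones(2) = tensor([1., 2., 0.])
print(W.grad)    # dloss/dW = outer(ones(2), x)
```

Note that `backward()` must be called on a scalar (here, the summed output); gradients for every tensor created with `requires_grad=True` then accumulate in their `.grad` attributes.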