The Frobenius norm does two things here. Look at a single row of the error matrix: that row is the reconstruction-error vector for one observation. Since the Frobenius norm sums the squares of all matrix entries, the columns within a row get squared and added, and that expression, looked at carefully, is exactly the squared L2 norm of that observation's reconstruction error. Summing across the rows then adds up the reconstruction error of every observation, which is the outer sigma we had before the matrix form was introduced.
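A minimal NumPy sketch of this identity, using illustrative random matrices X and X_hat (these names and shapes are assumptions for the example, not taken from the original derivation): the squared Frobenius norm of the error matrix equals the sum over rows of each observation's squared L2 error.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative data: 5 observations with 3 features each (assumed for this sketch)
X = rng.normal(size=(5, 3))        # original observations
X_hat = rng.normal(size=(5, 3))    # some reconstruction of X
E = X - X_hat                      # error matrix; row i is observation i's reconstruction-error vector

# Squared Frobenius norm: square every entry of E and add them all up
frob_sq = np.linalg.norm(E, 'fro') ** 2

# Outer sum over observations of the squared L2 norm of each row's error vector
row_l2_sq_sum = np.sum(np.linalg.norm(E, axis=1) ** 2)

print(frob_sq, row_l2_sq_sum)      # the two quantities agree up to floating-point error
assert np.isclose(frob_sq, row_l2_sq_sum)
```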