Computational Linear Algebra 9: PageRank with Eigen Decompositions

Course materials available here: https://github.com/fastai/numerical-linear-algebra
SVD is intimately connected to the eigen decomposition, so we will now learn how to calculate eigenvalues for a large matrix. We will use DBpedia, a large dataset of Wikipedia links; the principal eigenvector of the link matrix gives the relative importance of the different Wikipedia pages (this is the basic idea behind Google's PageRank algorithm).
Topics covered:
- Full vs Reduced Factorizations
- Matrix Inversion is Unstable
- DBpedia Dataset
- Power Method
This material is reviewed in the Lesson 10 video.
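The power method listed above can be sketched in a few lines of NumPy: repeatedly multiply a vector by the matrix and normalize, and it converges to the principal eigenvector. The tiny 3-page link matrix below is made up for illustration; it is not the course's actual DBpedia code.

```python
import numpy as np

def power_method(A, iters=100, eps=1e-10):
    """Estimate the principal eigenvector of A by repeated multiplication.

    Illustrative sketch; the course applies the same idea to the much
    larger (sparse) DBpedia link matrix.
    """
    n = A.shape[0]
    v = np.ones(n) / n                       # start from a uniform vector
    for _ in range(iters):
        v_new = A @ v
        v_new /= np.linalg.norm(v_new)       # normalize to avoid over/underflow
        if np.linalg.norm(v_new - v) < eps:  # stop once the vector stabilizes
            break
        v = v_new
    return v_new

# Toy 3-page link matrix (column j holds page j's outbound link weights);
# the entries are hypothetical, chosen only so each column sums to 1.
A = np.array([[0. , 0.5, 0.5],
              [0.5, 0. , 0.5],
              [0.5, 0.5, 0. ]])
v = power_method(A)
```

Convergence depends on the gap between the largest eigenvalue and the rest: each iteration shrinks the other eigenvector components by the ratio of their eigenvalues to the dominant one.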

Course overview blog post: http://www.fast.ai/2017/07/17/num-lin-alg/
Taught in the University of San Francisco MS in Analytics (MSAN) graduate program: https://www.usfca.edu/arts-sciences/graduate-programs/analytics
Ask questions about the course on our fast.ai forums: http://forums.fast.ai/c/lin-alg