- Project: fastai/numerical-linear-algebra-v2
- Repository: https://github.com/fastai/numerical-linear-algebra-v2
- Language: Jupyter Notebook (100.0%)

# Computational Linear Algebra for Coders

This course is focused on the question: how do we do matrix computations with acceptable speed and acceptable accuracy?

This course is being taught in the University of San Francisco's Masters of Science in Data Science program, summer 2018 (for graduate students studying to become data scientists). The course is taught in Python with Jupyter Notebooks, using libraries such as Scikit-Learn and Numpy for most lessons, as well as Numba (a library that compiles Python to C for faster performance) and PyTorch (an alternative to Numpy for the GPU) in a few lessons.

You can find the 2017 version of the course here.

## Table of Contents

The following listing links to the notebooks in this repository, rendered through the nbviewer service.

Topics Covered:

### 0. Course Logistics
### 1. Why are we here?

We start with a high-level overview of some foundational concepts in numerical linear algebra. A small illustration of the two axes involved is sketched below.
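The following demo (not from the course notebooks) makes the central question concrete: vectorized code is dramatically faster than naive loops, and a mathematically equivalent formula can lose all its accuracy in floating point.

```python
import numpy as np
import timeit

# Speed: the same matrix-vector product, vectorized vs. a pure-Python loop.
n = 500
A = np.random.randn(n, n)
x = np.random.randn(n)

def matvec_loop(A, x):
    out = np.zeros(len(A))
    for i in range(len(A)):
        for j in range(len(x)):
            out[i] += A[i, j] * x[j]
    return out

print(timeit.timeit(lambda: A @ x, number=10))              # BLAS-backed, fast
print(timeit.timeit(lambda: matvec_loop(A, x), number=10))  # orders of magnitude slower

# Accuracy: 1 - cos(t) and 2*sin(t/2)**2 are equal on paper, but for tiny t
# the first cancels catastrophically in float64 while the second does not.
t = 1e-8
print(1 - np.cos(t))         # 0.0 -- every significant digit lost
print(2 * np.sin(t / 2)**2)  # ~5e-17, the correct answer
```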
### 2. Background Removal with SVD

Another application of SVD is to identify the people and remove the background of a surveillance video.
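As a rough sketch of the idea, with a tiny synthetic stand-in for the video (the real notebook works on actual surveillance footage): stack the flattened frames as columns of a matrix; the background is what every frame shares, so a rank-1 truncated SVD recovers it, and the residual is the moving foreground.

```python
import numpy as np

# Synthetic stand-in: `frames` is an (n_pixels, n_frames) matrix whose
# columns are flattened grayscale frames of a video.
rng = np.random.default_rng(0)
background = rng.random((100, 1))        # one static scene, as a column
frames = np.tile(background, (1, 50))    # repeated across 50 frames
frames[20:25, 10:15] += 0.5              # a "person" passing through

# A rank-1 truncated SVD captures what is common to every frame: the background.
U, s, Vt = np.linalg.svd(frames, full_matrices=False)
low_rank = s[0] * np.outer(U[:, 0], Vt[0])   # rank-1 reconstruction

people = frames - low_rank               # residual: the moving foreground
```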
### 3. Topic Modeling with NMF and SVD

We will use the newsgroups dataset to try to identify the topics of different posts. We use a term-document matrix that represents the frequency of the vocabulary in the documents. We factor it using NMF and SVD, and compare the two approaches.
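A condensed sketch of that pipeline using scikit-learn; the parameter choices here (`max_features=2000`, `n_components=5`, the three categories) are illustrative, not the notebook's:

```python
from sklearn.datasets import fetch_20newsgroups
from sklearn.decomposition import NMF, TruncatedSVD
from sklearn.feature_extraction.text import TfidfVectorizer

# A few categories keep the demo small; the vectorizer builds the
# term-document matrix described above (documents as rows here).
cats = ['comp.graphics', 'sci.space', 'rec.sport.baseball']
posts = fetch_20newsgroups(subset='train', categories=cats,
                           remove=('headers', 'footers', 'quotes')).data
vectorizer = TfidfVectorizer(stop_words='english', max_features=2000)
vectors = vectorizer.fit_transform(posts)
vocab = vectorizer.get_feature_names_out()

def show_topics(components, n_words=8):
    # The largest entries in each component are that topic's defining words.
    return [' '.join(vocab[i] for i in comp.argsort()[-n_words:][::-1])
            for comp in components]

print(show_topics(NMF(n_components=5).fit(vectors).components_))
print(show_topics(TruncatedSVD(n_components=5).fit(vectors).components_))
```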
### 4. Randomized SVD
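Randomized SVD approximates the top singular vectors of a large matrix by first projecting it onto a small random subspace. A minimal sketch of the standard range-finder approach (scikit-learn ships a production version as `sklearn.utils.extmath.randomized_svd`):

```python
import numpy as np

def randomized_svd(A, k, n_oversample=10):
    """Approximate the top-k SVD of A via a randomized range finder."""
    m, n = A.shape
    # 1. Sample the range of A with a random Gaussian test matrix.
    Omega = np.random.randn(n, k + n_oversample)
    Y = A @ Omega
    # 2. Orthonormalize to get a basis Q for (approximately) the range of A.
    Q, _ = np.linalg.qr(Y)
    # 3. Project A onto that basis and decompose the resulting small matrix.
    B = Q.T @ A                      # (k + p) x n, cheap to decompose
    U_small, s, Vt = np.linalg.svd(B, full_matrices=False)
    U = Q @ U_small                  # lift back to the original space
    return U[:, :k], s[:k], Vt[:k]

A = np.random.randn(500, 300)
U, s, Vt = randomized_svd(A, k=10)
```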
### 5. LU Factorization
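For orientation, a bare-bones LU factorization via Gaussian elimination, without the partial pivoting that a stable implementation (such as `scipy.linalg.lu`) would add:

```python
import numpy as np

def lu(A):
    """LU factorization without pivoting; for illustration only,
    since a zero (or tiny) pivot would break it."""
    U = A.astype(float).copy()
    n = U.shape[0]
    L = np.eye(n)
    for j in range(n - 1):
        for i in range(j + 1, n):
            L[i, j] = U[i, j] / U[j, j]       # multiplier
            U[i, j:] -= L[i, j] * U[j, j:]    # eliminate below the pivot
    return L, U

A = np.array([[2., 1., 1.], [4., 3., 3.], [8., 7., 9.]])
L, U = lu(A)
assert np.allclose(L @ U, A)
```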
### 6. Compressed Sensing of CT Scans with Robust Regression

Compressed sensing is critical to allowing CT scans with lower radiation: the image can be reconstructed with less data. Here we will learn the technique and apply it to CT images.

### 7. Predicting Health Outcomes with Linear Regressions
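For section 7 (with a nod to section 6's robust-regression theme), a hedged sketch using scikit-learn's diabetes dataset as an assumed stand-in for the health-outcomes data: ordinary least squares next to a Huber-loss fit, which down-weights outliers instead of squaring them.

```python
from sklearn.datasets import load_diabetes
from sklearn.linear_model import LinearRegression, HuberRegressor
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split

# Assumed dataset for illustration: a disease-progression target
# measured one year after baseline, with ten clinical features.
X, y = load_diabetes(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

ols = LinearRegression().fit(X_train, y_train)
huber = HuberRegressor().fit(X_train, y_train)  # robust to outliers

print(mean_absolute_error(y_test, ols.predict(X_test)))
print(mean_absolute_error(y_test, huber.predict(X_test)))
```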
### 8. How to Implement Linear Regression
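As a preview of what "implementing" means here, a minimal sketch on synthetic data: the normal equations give the textbook solution but square the condition number of X, while a QR/SVD-based least-squares solve is the numerically safer route.

```python
import numpy as np

# Synthetic problem: y = X @ true_coef plus a little noise.
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(200), rng.random((200, 3))])  # intercept column
true_coef = np.array([1.0, 2.0, -3.0, 0.5])
y = X @ true_coef + 0.01 * rng.standard_normal(200)

# Normal equations: solve (X^T X) coef = X^T y. Simple, but it squares
# the condition number of X, which can wreck accuracy.
coef_normal = np.linalg.solve(X.T @ X, X.T @ y)

# The stabler route: a least-squares solve (SVD-based under the hood).
coef_lstsq, *_ = np.linalg.lstsq(X, y, rcond=None)

print(coef_normal)
print(coef_lstsq)
```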
### 9. PageRank with Eigen Decompositions

We have applied SVD to topic modeling, background removal, and linear regression. SVD is intimately connected to the eigen decomposition, so we will now learn how to calculate eigenvalues for a large matrix. We will use DBpedia data, a large dataset of Wikipedia links, because here the principal eigenvector gives the relative importance of different Wikipedia pages (this is the basic idea of Google's PageRank algorithm). We will look at three different methods for calculating eigenvectors, of increasing complexity (and increasing usefulness!).
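The simplest of those methods is the power method: repeatedly multiplying a vector by the (damped) link matrix converges to the principal eigenvector. A toy sketch on a hypothetical four-page graph, not the DBpedia data:

```python
import numpy as np

# Toy link graph: entry (i, j) is 1 if page j links to page i.
links = np.array([[0, 0, 1, 0],
                  [1, 0, 0, 0],
                  [1, 1, 0, 1],
                  [0, 1, 0, 0]], dtype=float)
links /= links.sum(axis=0)   # normalize columns: a random surfer's transitions

# Google matrix: follow links with probability `damping`, else jump anywhere.
damping, n = 0.85, links.shape[0]
G = damping * links + (1 - damping) / n

# Power method: repeated multiplication converges to the principal eigenvector.
r = np.full(n, 1 / n)
for _ in range(100):
    r = G @ r
    r /= r.sum()

print(r)   # relative importance of each page
```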
### 10. Implementing QR Factorization
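As a preview, the most direct of the QR algorithms, classical Gram-Schmidt; in practice a stable implementation would prefer modified Gram-Schmidt or Householder reflections.

```python
import numpy as np

def gram_schmidt_qr(A):
    """Classical Gram-Schmidt QR, for illustration."""
    m, n = A.shape
    Q = np.zeros((m, n))
    R = np.zeros((n, n))
    for j in range(n):
        v = A[:, j].copy()
        for i in range(j):
            R[i, j] = Q[:, i] @ A[:, j]
            v -= R[i, j] * Q[:, i]     # remove components along earlier columns
        R[j, j] = np.linalg.norm(v)
        Q[:, j] = v / R[j, j]
    return Q, R

A = np.random.randn(6, 4)
Q, R = gram_schmidt_qr(A)
assert np.allclose(Q @ R, A)
assert np.allclose(Q.T @ Q, np.eye(4))
```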
## Why is this course taught in such a weird order?

This course is structured with a top-down teaching method, which is different from how most math courses operate. Typically, in a bottom-up approach, you first learn all the separate components you will be using, and then you gradually build them up into more complex structures. The problems with this are that students often lose motivation, don't have a sense of the "big picture", and don't know what they'll need.

Harvard Professor David Perkins has a book, *Making Learning Whole*, in which he uses baseball as an analogy. We don't require kids to memorize all the rules of baseball and understand all the technical details before we let them play the game. Rather, they start playing with just a general sense of it, and then gradually learn more rules/details as time goes on.

If you took the fast.ai deep learning course, that is what we used. You can hear more about my teaching philosophy in this blog post or this talk I gave at the San Francisco Machine Learning meetup.

All that to say, don't worry if you don't understand everything at first! You're not supposed to. We will start using some "black boxes" or matrix decompositions that haven't yet been explained, and then we'll dig into the lower-level details later.

To start, focus on what things DO, not what they ARE.