Regression is Projection

Sam Levey

1 month ago

1,202 views

Comments:

@MatthewBricker - 23.03.2025 20:13

Great video. This actually helped a lot with my intuition in Linear Algebra 3.

@roshan0405 - 24.03.2025 06:17

Great video, man.

@akosrupp232 - 24.03.2025 09:41

Great work. Please do more videos with a black background; it's better for watching at night.

@gronedure2245 - 24.03.2025 20:13

Hello, economist here. Incredible; it answers a lot of questions I had but was too lazy to look up. I really should get back into maths.

@jaopredoramires - 25.03.2025 20:36

Absolutely loving the channel! Please keep it up. Can't wait to see the next ones.

@erickappel4120 - 25.03.2025 22:05

Excellent video!!!

@johnk8174 - 26.03.2025 03:42

Make more videos!

@kiraninam - 29.03.2025 10:13

Excellent. The last part was really impressive.

@minerharry - 30.03.2025 21:28

Could you maybe go through the logic of A A^T v (or the full A (A^T A)^-1 A^T v 👀) from the perspective of your transpose video? So far I have that A is a linear map from the component space of your subspace into real space (where x, y, etc., and v live), and that A^T maps v into that component space in a dimension-removing manner (since rank(A) is the same as rank(A^T)); so it is that first multiplication by A^T that really does the dimension collapse of the projection. What I'm struggling with is why the multiplication by A gives us the result we want. It feels like there's some fundamental dot-product relationship between x, y, v, maybe v - p, and p that should dictate why the dot-product-preserving nature of A vs. A^T matters, but I don't think I fully internalized the first projection-matrix video enough to understand what it is.

OH, and I guess that means (A^T A)^-1 is doing some kind of operation in that component space, one that cancels out the non-orthonormality of A? But without understanding why the components make sense in the first place, I don't know if I can think it through. It might be the subject of a cool ellipse animation, though, like you did with the SVDs of A and (A^-1)^T in the transpose video.
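
A minimal NumPy sketch of the question above (the matrix A and vector v are made-up illustrative values, not anything from the video): it checks that with non-orthonormal columns A A^T alone is not a projection, that the full A (A^T A)^-1 A^T is, and that once the columns are orthonormalized the (A^T A)^-1 factor becomes the identity, which is one way to see it as cancelling the non-orthonormality.

import numpy as np

# Columns of A span a 2-D plane in R^3; they are deliberately NOT orthonormal.
A = np.array([[1.0, 1.0],
              [0.0, 1.0],
              [0.0, 1.0]])
v = np.array([1.0, 2.0, 3.0])

# The full projection onto col(A): P = A (A^T A)^-1 A^T.
P = A @ np.linalg.inv(A.T @ A) @ A.T
p = P @ v

print(np.allclose(P @ P, P))          # True: P is idempotent, i.e. a projection
print(np.allclose(A.T @ (v - p), 0))  # True: the residual is orthogonal to col(A)

# With non-orthonormal columns, A A^T alone fails the projection test:
M = A @ A.T
print(np.allclose(M @ M, M))          # False: A A^T is not idempotent here

# Orthonormalize the columns (A = Q R); then Q^T Q = I, the (A^T A)^-1 factor
# is the identity, and Q Q^T reproduces the same projection P.
Q, _ = np.linalg.qr(A)
print(np.allclose(Q @ Q.T, P))        # True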

@yangsong6111 - 04.04.2025 00:19

Thanks so much; seeing the video makes the idea so clear.

@tonywang7933 - 07.05.2025 06:47

When will the next video be out? Can't wait. Although I don't need it for my course, I want to learn.

@guidosalescalvano9862 - 13.05.2025 20:28

Your videos are incredibly good. I binged your channel. I wish you made more videos! May I suggest explaining principal component analysis to further expand on projection and least-squares solutions?

@guidosalescalvano9862 - 13.05.2025 22:46

What you said about dot products not changing after projection was such an eye-opener. I really hope you can do principal component analysis.
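
A quick numerical check of that property (my paraphrase of the comment, using the same illustrative A, v, and P as in the sketch above): the projection p = P v keeps its dot product with any vector already in the subspace, because the part the projection removes, v - p, is orthogonal to the subspace.

import numpy as np

A = np.array([[1.0, 1.0],
              [0.0, 1.0],
              [0.0, 1.0]])
v = np.array([1.0, 2.0, 3.0])
P = A @ np.linalg.inv(A.T @ A) @ A.T

a = A @ np.array([2.0, -1.0])  # an arbitrary vector inside col(A)
# a . v == a . (P v), since v - P v is orthogonal to every such a.
print(np.isclose(a @ v, a @ (P @ v)))  # True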

@PIREngineering - 14.05.2025 10:37

Excellent explanation, especially the least squares part!

@tommyli9961 - 23.05.2025 13:38

Great video. Very insightful.
