Comments:
Best video about decision trees thus far.
Why is the impurity at the decision node "color = green" equal to 0.62?
How do you decide when to use a decision tree vs. a random forest?
Hey, can anyone explain why we square the probabilities when computing the Gini impurity? Why can't we compute it without squaring?
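For the squaring question (asked a few times in this thread): the Gini impurity is the probability of mislabeling a random item from a node if you assign it a label drawn from that node's own class distribution; summing p_i * (1 - p_i) over the classes and simplifying is what produces the squares, since sum_i p_i * (1 - p_i) = 1 - sum_i p_i^2. A small self-contained check (the labels are the video's five-fruit toy set, as far as I recall):

    from collections import Counter

    def gini(labels):
        """Gini impurity: 1 minus the sum of squared class proportions."""
        n = len(labels)
        return 1.0 - sum((c / n) ** 2 for c in Counter(labels).values())

    def gini_as_misclassification(labels):
        """Equivalent form without writing squares: sum over classes of p * (1 - p),
        i.e. the chance of drawing class i and then relabeling it as anything else."""
        n = len(labels)
        return sum((c / n) * (1 - c / n) for c in Counter(labels).values())

    labels = ['Apple', 'Apple', 'Grape', 'Grape', 'Lemon']
    print(gini(labels))                       # ~0.64
    print(gini_as_misclassification(labels))  # same ~0.64; the squares appear once you expand p*(1-p)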
Such a good video! Everything is very clear to me now.
The script is tightly edited. Much appreciated.
He is a nice person.
Thank you, Josh, for preparing and explaining this presentation as well as the software to help with understanding the topics. Great job!
I'm using your algorithm but I have some questions. How do I get in touch with you?
I am crying tears of joy! How can you articulate such complex topics so clearly!
Thank you, sir, thank you so much.
Love the music!
I've never seen any other channel like this. So deep and perfect.
Holy moly, Jesus, holy moly!
You are creating a question with only one value; what if I want a question like "Is it GREEN OR YELLOW?"? So basically I would have to test all combinations of two values to find the best info_gain for a particular attribute. Furthermore, we could test all possible question sizes. Would that give a better result, or is it better to build each question from a single attribute value?
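On the multi-value question above: the from-scratch code in the video only asks single-value questions, but you can certainly score subset-valued questions too; it just multiplies the number of candidate splits to evaluate, and a deeper tree of single-value questions can usually express the same partition, just with more levels. A rough sketch of the enumeration, using hypothetical helper names (candidate_value_subsets, partition_on_subset) rather than the video's exact functions:

    from itertools import combinations

    def candidate_value_subsets(rows, col, max_size=2):
        """Enumerate subsets of a categorical column's values, e.g. {'Green'}, {'Green', 'Yellow'}."""
        values = sorted(set(row[col] for row in rows))
        for size in range(1, max_size + 1):
            for subset in combinations(values, size):
                yield set(subset)

    def partition_on_subset(rows, col, value_subset):
        """Split rows by membership in the subset (a set-valued 'question')."""
        true_rows = [r for r in rows if r[col] in value_subset]
        false_rows = [r for r in rows if r[col] not in value_subset]
        return true_rows, false_rows

    # Toy rows in [color, diameter, label] form; score each candidate with your info gain
    # function and skip subsets that leave one side empty.
    rows = [['Green', 3, 'Apple'], ['Yellow', 3, 'Apple'], ['Red', 1, 'Grape'],
            ['Red', 1, 'Grape'], ['Yellow', 3, 'Lemon']]
    for subset in candidate_value_subsets(rows, col=0, max_size=2):
        true_rows, false_rows = partition_on_subset(rows, 0, subset)
        print(sorted(subset), len(true_rows), len(false_rows))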
Does anyone know how to code pruning for this tree?
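On pruning: the video doesn't cover it, but one simple approach is reduced-error pruning, where you walk the tree bottom-up and collapse a decision node into a leaf whenever that doesn't hurt accuracy on a held-out validation set. A minimal sketch under assumed class and helper names (Leaf, DecisionNode, and an accuracy(rows, node) function), not the companion code's exact API:

    class Leaf:
        """Terminal node: keeps the rows that reached it and predicts their majority label."""
        def __init__(self, rows):
            self.rows = rows

    class DecisionNode:
        """Internal node: a question plus true/false subtrees."""
        def __init__(self, question, true_branch, false_branch):
            self.question = question
            self.true_branch = true_branch
            self.false_branch = false_branch

    def prune(node, val_rows, accuracy):
        """Reduced-error pruning: collapse a subtree into a Leaf whenever that does not
        reduce accuracy on the validation rows. For brevity this scores the whole
        validation set at every node; a fuller version would route only the rows
        that actually reach each node."""
        if isinstance(node, Leaf):
            return node
        node.true_branch = prune(node.true_branch, val_rows, accuracy)
        node.false_branch = prune(node.false_branch, val_rows, accuracy)
        collapsed = Leaf(val_rows)
        if accuracy(val_rows, collapsed) >= accuracy(val_rows, node):
            return collapsed
        return node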
Excuse me, I'm still not clear on how the value of 0.64 comes out. Can you explain a little more?
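For the 0.64 and 0.62 values asked about in this thread (and the 1/5 and 4/5 weights asked about further down): assuming the five-example fruit set from the video (2 apples, 2 grapes, 1 lemon), 0.64 is the impurity of the whole node, 0.625 (displayed as roughly 0.62) is the impurity of the four rows that answer "no" to "Is the color green?", and the weights 1/5 and 4/5 are simply the fraction of rows sent to each side. Worked out:

    def gini(labels):
        n = len(labels)
        return 1.0 - sum((labels.count(c) / n) ** 2 for c in set(labels))

    parent = ['Apple', 'Apple', 'Grape', 'Grape', 'Lemon']    # whole node
    true_side = ['Apple']                                      # rows where color == Green
    false_side = ['Apple', 'Grape', 'Grape', 'Lemon']          # everything else

    print(gini(parent))       # 1 - (2/5)^2 - (2/5)^2 - (1/5)^2 ≈ 0.64
    print(gini(true_side))    # 1 - 1^2 = 0.0
    print(gini(false_side))   # 1 - (1/4)^2 - (2/4)^2 - (1/4)^2 = 0.625, shown as ~0.62

    # Information gain of the split: parent impurity minus the size-weighted child impurities.
    gain = gini(parent) - (1/5) * gini(true_side) - (4/5) * gini(false_side)
    print(gain)               # ≈ 0.14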
So much value in just 10 minutes; this is gold.
Wait a minute... you are calculating information gain based on Gini? I thought information gain was tied to entropy, i.e., you choose the split with the largest drop in entropy.
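On Gini vs. entropy: strictly speaking, information gain is usually defined with entropy (as in ID3/C4.5), and what the video computes is sometimes called Gini gain, but the recipe is identical, parent impurity minus the size-weighted child impurities, with either impurity measure plugged in, so many sources use "information gain" loosely for both. A small comparison on toy labels:

    import math
    from collections import Counter

    def gini(labels):
        n = len(labels)
        return 1.0 - sum((c / n) ** 2 for c in Counter(labels).values())

    def entropy(labels):
        n = len(labels)
        return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

    def gain(parent, left, right, impurity):
        """Generic gain: parent impurity minus the size-weighted child impurities."""
        n = len(parent)
        return (impurity(parent)
                - (len(left) / n) * impurity(left)
                - (len(right) / n) * impurity(right))

    parent = ['Apple', 'Apple', 'Grape', 'Grape', 'Lemon']
    left, right = ['Apple'], ['Apple', 'Grape', 'Grape', 'Lemon']   # e.g. "Is the color green?"
    print(gain(parent, left, right, gini))     # Gini gain, ≈ 0.14
    print(gain(parent, left, right, entropy))  # entropy-based information gain, ≈ 0.32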
How come in the calculation of the Gini impurity we subtract the square of the probability of each label?
Thank you very much for this video! I learned a lot about the Gini impurity and how it is used to pick the best questions to split the data!
Great video and such clear code to accompany it! I learned a lot :)
Can anyone help me out? I'm getting an error like "couldn't convert int value into float" from the dataset.
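On the float-conversion error: I can only guess without the traceback, but this often happens when a numeric column (like the diameter) is read from a file as strings, so casting or comparing it later fails. A hedged sketch of loading a CSV and casting selected columns, with a placeholder filename and column index:

    import csv

    def load_rows(path, numeric_cols=(1,)):
        """Read a CSV and cast the listed column indexes to float; leave the rest as text."""
        rows = []
        with open(path, newline='') as f:
            for row in csv.reader(f):
                rows.append([float(v) if i in numeric_cols else v
                             for i, v in enumerate(row)])
        return rows

    # rows = load_rows('fruits.csv', numeric_cols=(1,))   # e.g. diameter in column 1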
How do we train this model and save it?
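On saving: the from-scratch tree is just nested Python objects, so once it's built you can pickle it and load it back later for predictions; the only catch is that the node classes must be importable when you unpickle. A minimal sketch (build_tree and classify below stand for whatever the companion code calls them):

    import pickle

    def save_tree(tree, path):
        """Serialize a trained from-scratch tree (plain Python objects) to disk."""
        with open(path, 'wb') as f:
            pickle.dump(tree, f)

    def load_tree(path):
        """Load a previously saved tree; the leaf/decision-node classes must be importable here."""
        with open(path, 'rb') as f:
            return pickle.load(f)

    # Usage sketch:
    # tree = build_tree(training_data)
    # save_tree(tree, 'fruit_tree.pkl')
    # tree = load_tree('fruit_tree.pkl')
    # print(classify(['Red', 1], tree))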
Amazing, and thanks for sharing the code <3
Has anyone run into 'TypeError: cannot unpack non-iterable NoneType object' at the end of the code?
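On the NoneType unpack error: that message usually means something like `a, b = some_function(...)` where the function hit a branch with no explicit return, so it handed back None. A tiny reproduction of the pattern to look for (best_split here is a hypothetical stand-in, not the video's function):

    def best_split(rows):
        """Hypothetical stand-in that should return (gain, question) but forgets to on one branch."""
        if not rows:
            return            # implicitly returns None instead of a (gain, question) pair
        return 0.14, "Is the color green?"

    gain, question = best_split([['Green', 3, 'Apple']])   # unpacking works here
    try:
        gain, question = best_split([])                     # None comes back...
    except TypeError as err:
        print(err)   # ...and unpacking it raises: cannot unpack non-iterable NoneType object

So the thing to check is that every return path in the function whose result you unpack actually returns a tuple.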
I built a tree from scratch, but I'm stuck on making a useful plot like the one you can get from sklearn. Any help?
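On plotting: a from-scratch tree can be exported as Graphviz DOT text and rendered to an image, which gets close to sklearn's plot_tree. A minimal sketch assuming decision nodes expose question/true_branch/false_branch and leaves expose predictions (rename to match your own classes):

    def to_dot(node, lines=None, node_id=0):
        """Emit Graphviz DOT for a from-scratch tree; returns (lines, last_id_used).
        Render with: dot -Tpng tree.dot -o tree.png"""
        if lines is None:
            lines = ['digraph Tree {']
        my_id = node_id
        if hasattr(node, 'predictions'):                         # leaf
            lines.append(f'  n{my_id} [shape=box, label="{node.predictions}"];')
            return lines, my_id
        lines.append(f'  n{my_id} [label="{node.question}"];')   # decision node
        lines, true_id = to_dot(node.true_branch, lines, my_id + 1)
        lines, false_id = to_dot(node.false_branch, lines, true_id + 1)
        lines.append(f'  n{my_id} -> n{my_id + 1} [label="True"];')
        lines.append(f'  n{my_id} -> n{true_id + 1} [label="False"];')
        if my_id == 0:
            lines.append('}')
        return lines, false_id

    # dot_text, _ = to_dot(my_tree)
    # open('tree.dot', 'w').write('\n'.join(dot_text))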
So INSTRUCTIVE. Thank you so much for your clear & precise explanation.
Care to explain where the 4/5 and 1/5 come from? There are 11 training examples and the left side only has 8; the numbers don't add up.
I understood it as: when the Gini impurity of the parent node is zero, the information gain from any child split is also zero, so we don't have to ask any more questions to classify. Is that right?
Incredible!
Awesome tutorial, many thanks!
This video has saved my life 😆
Sooo dooope!!!!
Helpful 🔥🔥🔥
😮😮😮😮
Why does the Gini impurity have a square in it?
Is it called information gain or Gini gain? I'm trying to figure it out from multiple sources and I'm kind of confused.
Hello sir, is it possible to classify animal camera-trap images and sort them into folders automatically? This could be done using machine learning and computer vision techniques. Please make a video. I work in the forest department; the cameras capture up to 18 lakh photographs, and sorting them one by one is a problem. Please help us.
The Gini impurity function in the code does not output the same values listed in the video. It's quite confusing.
We can teach it to you, but we cannot learn it for you.
Thanks! That was helpful :)
You are awesome, man! But why does the second question, about whether the color is yellow, separate out only the apple when the two grapes are red? Or is it because the grapes were already taken care of by the false branch of the earlier split?
Why is the impurity 0.62 after partitioning on "Is the color green?" in the left subtree?
One of the best courses I've ever seen.
Incredibly clear explanation. Thank you!
Great to watch it.