Implementing KNN is this easy? That was my first thought after I saw this video. Really, the way it is explained and shown here is remarkable. It not only shows KNN but also how powerful plain Python is when used sensibly with a library like NumPy. The entire idea is very useful for beginners like me. I am now an AssemblyAI subscriber. I am going to not only watch but follow along with all the videos of this playlist to get a better understanding of Python, NumPy, Pandas, and data science. Thank you AssemblyAI for sharing.
Short and simple. I like the way you explained the KNN in simple words.
Thank you!
This is the best series to learn ML.
🎓🔥🔥
Imma recommend it to all my ml enthusiast friends ✌🏻
Thank you!
Awesome! Without using predefined KNN functions, just the distance formula. Explained very well.
Fun fact: for the distance between points in KNN, you can omit the square-root portion of the Euclidean distance function for efficiency. The square-root function is monotonic, so if a < b then sqrt(a) < sqrt(b).
I found this
"While k-means often uses squared distances internally for computational efficiency"
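The fun fact above is easy to check in code: because sqrt is monotonic, sorting by squared distance gives exactly the same neighbour ranking as sorting by true Euclidean distance. A minimal sketch with made-up random data:

```python
import numpy as np

# Toy data: 20 training points in 4 dimensions, plus one query point.
rng = np.random.default_rng(0)
X_train = rng.normal(size=(20, 4))
x = rng.normal(size=4)

sq_dists = np.sum((X_train - x) ** 2, axis=1)  # squared distance, no sqrt
true_dists = np.sqrt(sq_dists)                 # full Euclidean distance

# Both orderings are identical, so KNN picks the same neighbours either way.
assert np.array_equal(np.argsort(sq_dists), np.argsort(true_dists))
```

Skipping the sqrt saves one vectorized call per query, which matters when you compute distances to every training point for every prediction.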
Great tutorial. I also added tie-breaking functionality in case a tie occurs in the most frequent label.
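One possible tie-break, sketched below, is to prefer the tied label whose nearest neighbour is closest; the `vote` helper and the `neighbours` list of (distance, label) pairs are hypothetical names, not from the video:

```python
from collections import Counter

def vote(neighbours):
    """Majority vote over (distance, label) pairs, breaking ties by distance."""
    counts = Counter(label for _, label in neighbours)
    ranking = counts.most_common()          # [(label, count), ...] sorted by count
    best_count = ranking[0][1]
    tied = [label for label, c in ranking if c == best_count]
    if len(tied) == 1:
        return tied[0]
    # Tie: pick the label whose closest neighbour has the smallest distance.
    return min(tied, key=lambda lab: min(d for d, l in neighbours if l == lab))
```

For example, with neighbours `[(0.1, 'a'), (0.2, 'b'), (0.3, 'b'), (0.4, 'a')]` both labels appear twice, and `'a'` wins because its nearest neighbour (distance 0.1) is closer than `'b'`'s (distance 0.2).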
wonderfully done with a lot of clarity
Very easy to follow after I created my own implementation. Very similar to my own, except I elected to use a priority queue to keep track of the k nearest instead of a sort (because keeping track of indices was a pain, and it was getting late). Coded mine in C# without third-party libraries. I like that NumPy offers an argsort method here; it comes in handy.
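The priority-queue idea above translates directly to Python's standard library: `heapq.nsmallest` keeps only the k best candidates, running in O(n log k) instead of the O(n log n) of a full sort. A minimal sketch (the `k_nearest` helper name is made up for illustration):

```python
import heapq
import math

def k_nearest(x, points, k):
    """Return the k points closest to x, nearest first, using a heap."""
    return heapq.nsmallest(k, points, key=lambda p: math.dist(x, p))
```

For example, `k_nearest((0, 0), [(1, 1), (5, 5), (0.5, 0)], 2)` returns the two closest points, `[(0.5, 0), (1, 1)]`.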
I don't understand why we add '[0][0]' when building the list of most_commons? 8:04
The Counter returns the sorted counts of all possible outcomes, i.e. a list of tuples where each tuple holds the label and the count: (label, count). You only want the most common one, i.e. the first element in the list, and you only want the label, not the count, i.e. the first element of that tuple, which is also accessed using [0]. Therefore you need to apply [0][0].
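The indexing described above can be seen in two steps with a toy list of labels:

```python
from collections import Counter

labels = ["cat", "dog", "cat", "cat", "dog"]
ranking = Counter(labels).most_common()  # [('cat', 3), ('dog', 2)]

most_common = ranking[0]     # first (label, count) tuple: ('cat', 3)
prediction = ranking[0][0]   # just the label: 'cat'
```

So `[0]` selects the top tuple and the second `[0]` selects the label inside it.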
The free course is appreciated, but I have trouble understanding some of the terms and the thoughts behind certain functions.
Short and simple, no complications.
Nice and concise. Love it.
Short and simple, Thank you very much
You're very welcome!
I am getting the error "No module named KNN". Please help me resolve this problem.
How can I plot the graph again to see if it turned those blues into the green?
Whoa, excellent video! It was well explained, thanks! 😁😁👍🤩
You're very welcome :)
I love this tutorial so much
Awesome!
Great video!
Glad you enjoyed it
I like to follow this course from Lesson 1, what is the link that i need to start here?
How did you visualised the data ?
Thank you for sharing
Thanks for watching!
What about the regression case?
There is no regression in KNN; it is a classification algorithm.
@KarthickKenny. One can apply KNN when the response variable is continuous
@@andrea-mj9ce You have to apply a regression algorithm in that case, not KNN.
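For the record, KNN does extend to regression: instead of a majority vote, you predict the mean target of the k nearest training points. A minimal NumPy sketch (the `knn_regress` helper name is made up, not from the video):

```python
import numpy as np

def knn_regress(X_train, y_train, x, k=3):
    """Predict a continuous value as the mean target of the k nearest points."""
    dists = np.sqrt(np.sum((X_train - x) ** 2, axis=1))  # Euclidean distances
    nearest = np.argsort(dists)[:k]                      # indices of k closest
    return y_train[nearest].mean()
```

This mirrors `sklearn.neighbors.KNeighborsRegressor`, which does exist in scikit-learn alongside the classifier.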
Thank you, abla ("big sister" in Turkish)!
amazing job
Thank you
You're very welcome :)
Please explain every line of code in more detail.
Good job, I like it. KNN doesn't work well with images, I believe, right?
great simple tutorial but how do i plot a graph with the knn?
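One way to plot the predictions, sketched under the assumption that you trained on the iris data as in the video: colour a scatter of the first two features by the predicted class. This uses scikit-learn's classifier for brevity; your scratch `KNN` class would work the same way.

```python
import matplotlib
matplotlib.use("Agg")  # render off-screen so no display is required
import matplotlib.pyplot as plt
from sklearn import datasets
from sklearn.neighbors import KNeighborsClassifier

# Fit on the first two iris features so the plot stays 2-D.
X, y = datasets.load_iris(return_X_y=True)
clf = KNeighborsClassifier(n_neighbors=5).fit(X[:, :2], y)
pred = clf.predict(X[:, :2])

plt.scatter(X[:, 0], X[:, 1], c=pred, cmap="viridis", edgecolor="k")
plt.xlabel("sepal length (cm)")
plt.ylabel("sepal width (cm)")
plt.savefig("knn_predictions.png")
```

Points that KNN assigns to a different class than their true label will show up in the "wrong" colour compared with a plot of the true labels.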
Amazing!
numpy error in vscode???
How do I set up my machine with all these libraries???
pip
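Expanding on that one-word answer, one common way is a virtual environment plus pip; the package names below are the libraries the video actually uses:

```shell
# Create an isolated environment and install the libraries from the video.
python -m venv venv
. venv/bin/activate              # on Windows: venv\Scripts\activate
pip install numpy scikit-learn matplotlib
```

Note the package is installed as `scikit-learn` but imported as `sklearn`.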
Great video! Thank you for making it.
Got this error in Colab. ModuleNotFoundError: No module named 'KNN' when running from KNN import KNN
You're welcome, Santiago! You should include the KNN Python file we develop in the video in the file system of the Colab notebook. That should get rid of the error! :)
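In Colab you can create that file straight from a cell, either with the `%%writefile KNN.py` magic or programmatically as below. The class body here is a minimal stand-in, not the video's full implementation; the point is only that `from KNN import KNN` succeeds once `KNN.py` exists in the working directory.

```python
import importlib
import sys
from pathlib import Path

# Write a stand-in KNN.py (replace the body with the class from the video).
Path("KNN.py").write_text(
    "class KNN:\n"
    "    def __init__(self, k=3):\n"
    "        self.k = k\n"
)

# Make sure the current directory is importable, then import the class.
sys.path.insert(0, ".")
KNN = importlib.import_module("KNN").KNN
clf = KNN(k=5)
```

Once the real file is in place, `from KNN import KNN` works and the ModuleNotFoundError disappears.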
There is no teacher on this planet who can explain Python and machine learning in a proper sequence and an entertaining way. I don't know what she is doing in this video. Also, she is not explaining whatever she is typing; it all reads like Chinese to me.
Great exchange.
wow, she knows her stuff.
👌👌👍👍👍👍
nice
i love you
Thanks; however, this Euclidean distance function needs to be corrected.
It's actually OK, I'd say.
@@osviiii Yep, I checked that; I was just a little confused.
Are you Turkish
Yes!
@@AssemblyAI That "i" in range pronunciation gave it away :D
Thank you for the practice... but it's an exact copy of this one, th-cam.com/video/ngLyX54e1LU/w-d-xo.html, created 4 years ago.
Yes! Pat works with me too, we decided to do a new run of his videos :)
How to implement KNN from scratch… import numpy and sklearn ¯\_(ツ)_/¯
You know, NumPy was used to calculate sqrt and sum in the Euclidean distance, and argsort to get the indexes of the k smallest distances.
From sklearn we imported datasets to use the iris dataset, and train_test_split to split our dataset into the train part and the test part.
Using KNN directly amounts to using
from sklearn.neighbors import KNeighborsClassifier
No class implementation, just a call.
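The pipeline listed above, end to end, with sklearn's classifier as the drop-in replacement for the scratch class (the split parameters are illustrative, not necessarily the video's):

```python
import numpy as np
from sklearn import datasets
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

# Load iris and split into train/test, as in the video.
X, y = datasets.load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=1234
)

# The one-line replacement for the scratch KNN class.
clf = KNeighborsClassifier(n_neighbors=5)
clf.fit(X_train, y_train)
accuracy = np.mean(clf.predict(X_test) == y_test)
```

The scratch implementation and this one should land at essentially the same accuracy, since both do brute-force Euclidean KNN with majority voting.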
I like this approach; it is so helpful. Curious how it compares with sklearn's sklearn.neighbors.KNeighborsClassifier 😃
I went through all of these AssemblyAI lessons, making each one work perfectly. Then I redid each one using scikit-learn classes. In every case, I was able to drop in the sklearn equivalent and get the same or better results. A good entrée into scikit-learn.