Believe it or not, this is the best quickselect explanation on YouTube. For the first time in my life, I can write this algorithm without memorization.
This is the best quick select explanation indeed. Other videos never explain what each element of the code actually means.
I was so scared of quickselect before this video, thank you so much for breaking it down in the most understandable way. You're a great tutor!
I can say this video is the best explanation of the quickselect algorithm on YouTube. I read an article on GeeksforGeeks and only found myself more confused because of poor variable naming such as x and y, a weird way of partitioning like right - left, and so on. The method and explanation in this video make more sense! Thanks CS Robot!
Never go to geeksforgeeks. It is by far the worst resource on the Internet because none of the articles are reviewed and you don't need any credentials to write one.
One of the better explanations out there. The thing with partition is that there are various ways to do it, and so many nuances with the boundary conditions if you are not careful. There is another method using while loops which I have always found a tad more difficult. This one is more straightforward.
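For reference, the straightforward "pivot at the end, single scan" scheme the comments describe matches the Lomuto partition. A minimal sketch in Python (variable names are mine, not necessarily the video's):

```python
def partition(arr, l, r):
    """Lomuto-style partition: the pivot is the last element, arr[r].

    After the call, the pivot sits at its final sorted index, with
    smaller-or-equal elements to its left and larger ones to its right.
    """
    pivot = arr[r]
    i = l  # invariant: arr[l..i-1] holds elements <= pivot
    for j in range(l, r):
        if arr[j] <= pivot:
            arr[i], arr[j] = arr[j], arr[i]
            i += 1
    arr[i], arr[r] = arr[r], arr[i]  # drop the pivot into its slot
    return i
```

The boundary nuance the comment mentions is exactly the `i` invariant: `i` always points at the first slot not known to hold a value `<=` pivot, which is why the final swap lands the pivot correctly.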
This is an unbelievably concise yet helpful explanation - thanks a ton
What a great, down to earth explanation.
Straight to the point and a no-nonsense explanation! Thanks for explaining the partition function; now I can easily remember the solution.
Fantastic explanation. Very clear and concise and thorough. Thank you!
Keep posting these kinds of videos
Great use of an example and step-by-step implementation explanation.
Keeping it simple
brilliant. One of the best algorithm explanations here. fine work thank you!
lowkey the best quickselect explanation ever what
Best Quick Sort explanation❤❤❤❤
Super good explanation video. This deserves more views
This is best as far as I have seen on YT
I was really stuck at the i and j pointers.
nailed it 🔥
thank you for your explanation it is super clear and concise thank you
really appreciate your way of explaining😇😇
You easily had the best video on it, thank you!
best video ever for quickselect
Man you make such great videos, why'd you stop?
thank you so much this helped me understand very quickly
Perfect explanation! Thank you
thank you so much. that was a great explanation
Awesome explanation. Thanks!!!
Does this work if the last element in the array is the largest? Because arr[i] and arr[r] swap at the end of each iteration, right?
Great explanation, thank you
great video , it deserves more views
Very good explanation, thank you!
How do you make these graphics ? Is it a presentation ? Slides , or something like manim ?
You give a good explanation. Thanks, it helped me a lot.
Shouldn't the time complexity be denoted big theta of n and big O of n^2?
Great Explanation, thank you
Fill the empty cells in the rows with their basic possibilities, given the numbers in the sub-grid, row, and column; no need to do any more working out. Now go from left to right or right to left: ignore the data in your own cell and see which numbers aren't covered by the possibilities in the other cells or the cells already solved. If no possibilities are deduced, move to the next empty cell and wipe the info in the cell you just tested. If you do find a possible number or two from this test, then the solution is that number or one of those numbers. It's a very quick O(n) test that trips the game of sudoku up and ruins it for everyone.
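The test described in that comment is essentially sudoku's "hidden single" rule: a digit that only one cell in a unit can take must go in that cell. A rough sketch, assuming candidate sets for one unit (row, column, or box) have already been computed; all names here are illustrative:

```python
def hidden_single(candidates):
    """Find forced cells within one sudoku unit (row, column, or box).

    `candidates` maps each empty cell index to its set of possible
    digits.  If a cell holds a digit no other cell in the unit can
    take, that digit is forced there (a "hidden single").
    """
    forced = {}
    for idx, cands in candidates.items():
        others = set()
        for i, c in candidates.items():
            if i != idx:
                others |= c  # digits some other cell could still take
        unique = cands - others
        if len(unique) == 1:
            forced[idx] = next(iter(unique))
    return forced
```

Scanning each unit once this way is linear in the number of cells, which matches the comment's "very quick" claim.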
Good work!
good explanation, thanks
This is the first time I can understand an algorithm on the first try.
really well explained
great explanation
really really good video
great video!
Isn't the third largest element 10 here?
Love from india
thanks
Selecting the last element as the pivot is risky; selecting a random one is much better.
lol no, if the order is random, which it almost always is, it's the same
@@Fran-kc2gu It depends on the input, as you say, but you hit the worst case O(N^2) if the list is already sorted, which is a denial-of-service attack vector. If you have a critical timeout you must meet, you should even go with median-of-medians.
How can we adapt the partition function to select a random pivot? Just pivot = arr[randint(l, r)] doesn't work.
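The reason reading `arr[randint(l, r)]` alone fails is that the usual last-element partition assumes the pivot lives at `arr[r]`. One common fix, sketched here under that assumption, is to swap the randomly chosen element into the last slot first and then partition as before:

```python
import random

def randomized_partition(arr, l, r):
    """Pick a random index in [l, r], swap that element into the last
    slot, then run the usual last-element (Lomuto) partition unchanged."""
    p = random.randint(l, r)         # inclusive on both ends
    arr[p], arr[r] = arr[r], arr[p]  # the random pivot now sits at arr[r]
    pivot = arr[r]
    i = l
    for j in range(l, r):
        if arr[j] <= pivot:
            arr[i], arr[j] = arr[j], arr[i]
            i += 1
    arr[i], arr[r] = arr[r], arr[i]
    return i
```

Only the first two lines differ from the fixed-pivot version; the partition invariant itself is untouched.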
@@surters, if your list is unsorted, arguably the last slot will follow a random distribution. Your worst case depends not only on the list but on the kth position you're trying to find. It is true that a random selection can still reduce the chance of the worst case, but it really is something you should judge by your use case. The real-world distribution of your lists may not hit the worst-case scenario as often as other distributions would. If your list is small, using random may actually incur more overhead than just selecting a fixed pivot. Don't let big-O complexity blind you to the underlying complexity and overhead of the functions you call in your algorithms, or to the fact that the complexity depends on factors external to the algorithm, like the expected distribution of your list.
For a DOS attack, your user would have to control the list and search position and the specific implementation would need to affect the shared resources appreciably. Your advice thus is highly specific to a very specific use case and implementation that isn't broadly of concern.
@@JimBob1937 If occasionally hitting O(N^2) is acceptable and the data doesn't come from potentially hostile 3rd-party input, a predictable pivot could be OK. If you're dealing with potentially hostile 3rd-party inputs, depending on a predictable pivot is not advisable; picking a random pivot might be good enough. If, on the other hand, you can never afford to hit O(N^2), then median of medians is an option. There is tons of literature on the matter.
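For anyone curious about the median-of-medians option mentioned here, the core idea is a deterministic pivot choice: split the data into groups of 5, take each group's median, and recursively take the median of those medians. A hedged sketch of just the pivot-selection step (not a full select implementation):

```python
def median_of_medians(arr):
    """Deterministic pivot choice for quickselect.

    Splits into groups of 5, takes each group's median, and recurses
    on the list of medians.  The returned value is guaranteed to lie
    between roughly the 30th and 70th percentile of `arr`, which is
    what caps quickselect at O(n) worst case."""
    if len(arr) <= 5:
        return sorted(arr)[len(arr) // 2]
    medians = [
        sorted(group)[len(group) // 2]
        for group in (arr[i:i + 5] for i in range(0, len(arr), 5))
    ]
    return median_of_medians(medians)
```

The trade-off the thread alludes to is real: the constant factors are noticeably worse than a random pivot, so it's usually reserved for hard worst-case requirements.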
The time complexity is wrong; it's O(n^2) in the worst case, if the pivot keeps landing on the largest or smallest element of the array.
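That degradation is easy to observe by counting comparisons. A small sketch (iterative quickselect with a Lomuto-style last-element partition assumed; names are mine):

```python
def quickselect_count(arr, k):
    """Quickselect with a fixed last-element pivot, counting element
    comparisons so the worst case is visible."""
    comparisons = 0
    l, r = 0, len(arr) - 1
    while l < r:
        pivot, i = arr[r], l
        for j in range(l, r):
            comparisons += 1
            if arr[j] <= pivot:
                arr[i], arr[j] = arr[j], arr[i]
                i += 1
        arr[i], arr[r] = arr[r], arr[i]
        if k == i:
            break
        elif k < i:
            r = i - 1
        else:
            l = i + 1
    return arr[k], comparisons

# On already-sorted input the last element is always the maximum, so
# each pass only shrinks the range by one:
value, cost = quickselect_count(list(range(100)), 0)
# cost = 99 + 98 + ... + 1 = 4950 comparisons for n = 100, i.e. ~n^2/2
```

On shuffled input the same routine averages closer to 2n comparisons, which is why the video's O(n) figure holds in expectation but not in the worst case.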