"you'd have to be quite a good programmer to work out how to do something that runs that badly..."
Fantastic quote!
Really glad you are liking it - there are PLENTY more to come!
PS: Tell your mates about us!
>Brady
An interesting consideration is the mechanism of storage. At one defense contractor I worked at, we had machines with no real RAM to speak of, and even transient data would be stored on tape media. Because of the linear seek times, a ping-pong bubble sort was always the fastest method in practice: it only ever touched adjacent elements, so it did useful work as the tape wound one way or the other instead of paying for seeks.
Nice that you included bogo sort!
To this day, he is still random sorting...
Indeed it was not me... Most of the computerphile videos are made by Sean Riley (we always say who made the film in the full video description if you want to double check)
I'll still be making a few myself and come along for the odd interview (just because I like being involved!) - but Sean is the main man and you can usually assume he is the man behind the camera/edit!
>Brady
Good video, but I'd like to challenge Alex's theory at 4:50 that smaller lists are "nearly sorted and therefore BubbleSort is faster" when he's using uniformly random data. The data rather suggests a tipping point between bubble and merge somewhere around 20-40 elements, which is about when you'd outgrow a 64-byte cache line. That most likely explains why bubble sort is initially much faster: doing 1000 swaps on 32 elements sitting in L1 cache is less CPU work than a handful of recursive method calls in the JVM, each allocating 2 arrays and copying data.
It should remind everyone that Big-O is just theory that ignores how computers really work. These days cache misses dominate performance, unlike in the '90s when multiplications were slow and memory was fast.
I'd also like to point out that his approach to micro-benchmarking Java code is prone to producing misleading data. For small input sizes he is running in interpreted mode (the JIT typically only kicks in after ~10k iterations), and you don't want to call `nanoTime()` in a tight loop: it's an expensive call and will dominate the cost for smaller input sizes. Look into JMH next time you want to benchmark in Java.
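To make that concrete, here's a minimal sketch of what a JMH harness for this comparison could look like (class name, sizes and seed are illustrative, not from the video; JMH handles warmup and timing so you never touch `nanoTime()` yourself):

```java
import java.util.Random;
import java.util.concurrent.TimeUnit;
import org.openjdk.jmh.annotations.*;

@State(Scope.Thread)
@BenchmarkMode(Mode.AverageTime)
@OutputTimeUnit(TimeUnit.NANOSECONDS)
public class SortBench {
    @Param({"32", "1024"})        // list sizes under test (illustrative values)
    int size;

    int[] data;

    @Setup
    public void fill() {
        data = new Random(42).ints(size).toArray(); // fixed seed for reproducibility
    }

    @Benchmark
    public int[] bubble() {
        int[] a = data.clone();   // clone so every invocation sorts fresh data
        for (int i = 0; i < a.length - 1; i++)
            for (int j = 0; j < a.length - 1 - i; j++)
                if (a[j] > a[j + 1]) { int t = a[j]; a[j] = a[j + 1]; a[j + 1] = t; }
        return a;                 // returning the result defeats dead-code elimination
    }
}
```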
Hi, we will put the links in the description when they are available - at the time of writing this, the two videos 'teased' at the end are not yet available and the annotation simply says 'coming soon' - hope that helps >Sean
It's astounding to see how common it is for these professors or engineers to always have a Rubik's cube on their desk.
Quick sort does this:
1) Choose a random pivot; the pivot is what the numbers of the list are compared against
2) Create 2 lists, "greater" and "less"
3) Go through each number (except the pivot) in the list. If it's greater than the pivot, add it to "greater". If it's less than or equal, add it to "less"
4) Recursion! Repeat 1-3 on "less" and on "greater", then concatenate: sorted "less", then the pivot, then sorted "greater". It's hard to explain, but relatively easy to code
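For anyone who wants it in code, a minimal Java sketch of exactly those steps (list-based rather than the usual in-place partitioning, so it mirrors the description literally):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Random;

// Sketch of the list-based quicksort described above.
static List<Integer> quickSort(List<Integer> xs) {
    if (xs.size() <= 1) return xs;                       // base case: already sorted
    int pivot = xs.get(new Random().nextInt(xs.size())); // 1) random pivot
    List<Integer> less = new ArrayList<>();              // 2) the two lists
    List<Integer> greater = new ArrayList<>();
    boolean pivotSkipped = false;
    for (int x : xs) {
        if (x == pivot && !pivotSkipped) { pivotSkipped = true; continue; } // skip pivot once
        if (x > pivot) greater.add(x); else less.add(x);                    // 3) partition
    }
    List<Integer> sorted = new ArrayList<>(quickSort(less));               // 4) recurse
    sorted.add(pivot);
    sorted.addAll(quickSort(greater));
    return sorted;
}
```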
This video would have helped me a lot back when I was taking basic algorithms. You should do more videos on algorithms. Maybe Quicksort - I always thought it was beautiful how well it worked.
I can't wait for more on sorting. I know it very well but it is very nice to hear someone explain it so well in just 10 minutes.
So far I've liked all the Computerphile videos, but I think this is one of the better ones. I'm a 16 year old self taught programmer and I think this channel is a great way of introducing new people to computing. Great job Brady!
Thanks... Sean, unlike me, knows how to use After Effects...
>Brady
Please don't stop putting videos up on Computerphile!
This is my favorite channel Brady!
That paper they write on takes me back to my primary school days. Nice touch. Well done
Thank you for making this video. This shows that the audience is really important to you guys.
Algorithm complexity is something I have wanted to learn about for a while, and this video has given me a basic look. Keep the good work up!
I really liked how long and in-depth this episode was. I noticed that most of the Computerphile episodes were pretty short.
It will be covered soon! >Sean
That is the most mindblowing computer-related thing I've heard in a long time.
If he were to explain it with pictures like he did these, it would be incredibly easy to understand. Sure, the concept of a pivot and recursion might be slightly more difficult than these concepts, but that's why we're here!
Just a few months ago we were doing sorting algorithms in my CS course and I was bored enough to implement 20 different sorting algorithms and benchmark them. Here's my top 8 algorithms for 1 mil elements:
1. Bit-adaptive radix sort (can do 200 mil numbers in 2 seconds on my PC)
2. Flashsort
3. Introsort
4. Mergesort
5. Quicksort
6. Shellsort (very simple to implement)
7. Heapsort
8. Smoothsort (look it up, it's really cool)
The best sorting algorithms use recursion. The video was excellent. Your channel is getting great!
Bubble sort is the simplest one to implement, and a good introduction to algorithms in general. Also, it serves as an excellent example of how two algorithms that might seem more or less equally efficient to untrained eyes at first glance can actually perform very differently.
I would like to see more on programming languages, their history, pros and cons, basic abilities of each, thank you!
I want to see radix sort, and a discussion of how you can beat the theoretical limits if you're willing to break the rules. (If you have a limited number of values, you can sort really fast by putting the cards in piles by number and never comparing them with each other.) A lot of the biggest improvements in CS come from solving the problem you actually have to solve, rather than the general case.
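For the curious, a minimal sketch of that "piles" idea (counting sort), assuming the values are ints in a known range [0, k):

```java
// Counting sort: no element is ever compared with another.
static int[] countingSort(int[] a, int k) {
    int[] counts = new int[k];
    for (int x : a) counts[x]++;            // drop each value into its pile
    int[] out = new int[a.length];
    int i = 0;
    for (int v = 0; v < k; v++)             // read the piles back in order
        for (int c = 0; c < counts[v]; c++)
            out[i++] = v;
    return out;                             // O(n + k) time, no comparisons
}
```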
For those wanting a lovely visual way of seeing sorting algorithms, the appropriately named sorting-algorithms site has them all for comparison. Its problem sizes can't get too large, but it's one of the best references for various cases I know of.
This channel is so precious
We could compute the running time of each method and take the best one before sorting!! What an awesome sorting algorithm!
Makes a huge, huge difference in real programming. The software I am coding has an "industry standard" to meet, which is to process 1 million records of 200 fields inside 15 minutes.
15mins = 900 seconds to do 1 million, which is less than 1ms per record.
Shaving fractions of ms off of processing times is incredibly important at times.
I never understood merge sort till today. Thanks!
Algorithms and data structures exam next Friday. Those sorting videos are pretty good for understanding. Thanks, Brady :)
Quicksort will be covered soon! >Sean
Bubble sort will always be relevant, because it's easy to remember, easy to write and relatively quick on extremely small datasets, along with having extremely low overhead.
Rumour has it he is still there to this day.
Gnome sort, as far as I can recall, steps back one position after each swap, and that seems to make more sense than starting again from the beginning.
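Something like this, as a rough sketch:

```java
// Gnome sort: walk forward; after each swap, step back one position.
static void gnomeSort(int[] a) {
    int i = 0;
    while (i < a.length) {
        if (i == 0 || a[i - 1] <= a[i]) {
            i++;                                         // in order: move forward
        } else {
            int t = a[i]; a[i] = a[i - 1]; a[i - 1] = t; // out of order: swap
            i--;                                         // ...and step back
        }
    }
}
```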
Bubble sort doesn't rely on a list's last element being the largest. Every time you pass over the list, the largest element within the unsorted portion (on the left) is moved into the sorted portion (on the right). Thus the list is sorted from the back, placing values in descending order of magnitude.
In fact, "Timsort", which is the built-in library sort in a bunch of languages, is a combination of merge sort and insertion sort (which is kind of a better-organized bubble sort, which does the same number of comparisons, but only O(n) swaps). It's particular worthwhile to have great performance for lists that are either already in order or nearly in order.
Nicely explained video. Computing science concepts like Big O notation and sorting are really interesting.
Btw: thumbs up for including BogoSort. There's probably only one sorting algorithm that's even crazier: Intelligent Design Sort. It says something like "Look at your data. Some higher being has decided that this is the order you need. Therefore consider it sorted" Intelligent Design Sort uses O(1) time (constant time) in all cases.
Wow! Great demonstration illustrating Big O notation
Oh man, you did sorting without going through Quicksort! That's the most famous algorithm in computer science!!! Great video regardless, too bad this wasn't here last semester when I was taking Data Structures and Algorithms.
It would be awesome if someone came up with English subtitles for these videos (just like Numberphile!), because English is not my mother tongue and you all have funny accents. Thanks for the channel!!
I wish the title included Big O notation! I was recently looking up more information on the subject and this was a much better explanation than the rest!! =)
Done ;) >Sean
Sorting algorithms are tough to make O(n!), but there are algorithms for other tasks that are worse than O(n!) by a long shot. Things like O(n^n) have come up in my own mathematical research (case generation and resolution for a problem).
People don't watch these videos to learn sorting algorithms or serious computer science. Please look at the general flow of the channel. Perhaps they want to show the most intuitive, yet rather slow, algorithm and then compare it to others. It is so intuitive, it is the one I reinvented when I first started programming as a kid and needed to sort a list, but no one had told me yet that you could reuse the code of others.
Every one of these algorithms should be taught, together with its advantages and disadvantages. And yes, bubblesort has its advantages, but not for most developers:
- it is a stable sort
- uses no extra memory
- algorithm is very small and easy
And it surely should not be about whether an algorithm is more understandable. This is about the hard facts about different solutions to a problem; you need to understand them, and then you should use the best solution for your situation.
You can look it up yourself if you want. The idea behind the algorithm is what matters; the actual implementation is basically turning what he said into code. An algorithm shouldn't be seen as a piece of code, but as a general strategy to solve a problem.
I rank radix sort above the sorting algorithms brought up in this video. It works for pretty much all numbers that a computer can use, and would also work for characters (since those can be cast as integers).
Of course, it can't be used when all you're given is a method that compares two objects at a time, since it doesn't work that way, but still.
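A minimal sketch of a least-significant-digit radix sort for non-negative ints, one byte per pass (negative numbers would need an extra tweak for the sign bit):

```java
// LSD radix sort: four stable counting-sort passes, one byte at a time.
static void radixSort(int[] a) {
    int[] buf = new int[a.length];
    int[] src = a, dst = buf;
    for (int shift = 0; shift < 32; shift += 8) {
        int[] counts = new int[257];
        for (int x : src) counts[((x >>> shift) & 0xFF) + 1]++;    // count pile sizes
        for (int i = 0; i < 256; i++) counts[i + 1] += counts[i];  // pile start offsets
        for (int x : src) dst[counts[(x >>> shift) & 0xFF]++] = x; // stable scatter
        int[] t = src; src = dst; dst = t;                         // swap buffers
    }
    // after four passes (an even number), the sorted data is back in 'a'
}
```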
It's good to see some actual computer science in computerphile. Also, quicksort rules, merge sort drools!
How does a computer "know" that 3 is a lower number than 4?
Indeed, one extra statement isn't going to affect the performance in a negative manner. There are, by the way, a lot of little tweaks you could use to speed up bubblesort (on some occasions). You could go from left to right, then switch and sort from right to left, then back from left to right, and so on.
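That left-right-left variant is usually called cocktail shaker sort; a minimal sketch:

```java
// Cocktail shaker sort: bubble in both directions, shrinking the
// unsorted window from both ends.
static void cocktailSort(int[] a) {
    int lo = 0, hi = a.length - 1;
    boolean swapped = true;
    while (swapped) {
        swapped = false;
        for (int i = lo; i < hi; i++)                // left-to-right pass
            if (a[i] > a[i + 1]) { int t = a[i]; a[i] = a[i + 1]; a[i + 1] = t; swapped = true; }
        hi--;                                        // largest now at the right edge
        for (int i = hi; i > lo; i--)                // right-to-left pass
            if (a[i - 1] > a[i]) { int t = a[i]; a[i] = a[i - 1]; a[i - 1] = t; swapped = true; }
        lo++;                                        // smallest now at the left edge
    }
}
```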
Excellent video! Please continue on this route here.
The speed is determined by the number of operations done. Comparisons and actions both take a constant amount of time. The time taken by different kinds of operations might not be the same, but the same kind of operation always takes the same time. Algorithms can be analyzed to find out the minimum, maximum and average number of operations needed. Even though merge sort may perform fewer operations than bubble sort, it is initially slower because its individual operations cost more, but with large inputs it still wins.
6:20 "As I said earlier" I can't remember him saying that!
"n" is the number of items in the list. The overall efficiency is the function of "n" that tells you the total number of steps in the algorithm. For Bubble Sort, worst-case it has to swap every pair of items on every pass. Whenever it does, it needs to do another pass. So that's n passes and n swaps each--n^2. Merge sort splits the list in half each time, so the total number of splits is log-base-2 of n. Then it compares each element together for each pair, so that's n comparisons. So n x log n.
I appreciate the effort you are putting to explain this.
That could take a while. They use a lot of different techniques combined and a lot of them need a quite solid understanding of how data structures, parallel computing and networks work. Have a look at lectures like "Advanced Data Structures" and "Parallel Algorithms" if you're still interested.
Yes! Many databases use techniques very similar to what's used in heapsort. If you keep your heap structure you can search, add and remove in log(n) (worst case) time. Well, it's not exactly the same thing, but it is extremely similar and uses the same idea behind it. As a sorting algorithm it loses out against mergesort in the real world, but the idea of using a tree structure gets used all over the place.
Look up red-black trees, they're quite ingenious ;)
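As a tiny illustration of the heap idea, here is heapsort in spirit, leaning on Java's built-in binary heap (a sketch only; real heapsort builds the heap in place inside the array):

```java
import java.util.PriorityQueue;

// Push everything onto a binary heap, then pull it back out in order.
static int[] heapSortSketch(int[] a) {
    PriorityQueue<Integer> heap = new PriorityQueue<>();
    for (int x : a) heap.add(x);                 // O(log n) per insert
    int[] out = new int[a.length];
    for (int i = 0; i < out.length; i++)
        out[i] = heap.poll();                    // O(log n) per removal, smallest first
    return out;
}
```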
Recursive thinking doesn't always require recursive functions. Mergesort for example can easily be rewritten using loops, even though the algorithm uses recursive concepts (as in reducing a problem to simpler subproblems).
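A minimal sketch of that loop-based (bottom-up) mergesort for an int array:

```java
// Bottom-up merge sort: merge runs of width 1, 2, 4, ... with no recursion.
static void mergeSortIterative(int[] a) {
    int n = a.length;
    int[] buf = new int[n];
    for (int width = 1; width < n; width *= 2) {
        for (int lo = 0; lo < n; lo += 2 * width) {
            int mid = Math.min(lo + width, n);
            int hi = Math.min(lo + 2 * width, n);
            int i = lo, j = mid, k = lo;          // merge a[lo..mid) and a[mid..hi)
            while (i < mid && j < hi) buf[k++] = (a[i] <= a[j]) ? a[i++] : a[j++];
            while (i < mid) buf[k++] = a[i++];
            while (j < hi)  buf[k++] = a[j++];
        }
        System.arraycopy(buf, 0, a, 0, n);        // runs of 2*width are now sorted
    }
}
```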
So to answer your question, in terms of how algorithms are usually compared, no. Algorithms are analyzed with a slightly simplified way to calculate the number of actions to take - an algorithm that does 2*n actions and an algorithm that does 20*n actions are considered to have the same running time (n operations). The other one is 10 times slower, but it is ALWAYS 10 times slower - unlike in the case of bubble sort and merge sort, where bubble sort just gets worse and worse.
The kraft paper has been replaced by the classic and very appropriate dot-matrix printing paper. Excellent!
I think it's important to note that in mergesort, the lists aren't split in the way that he split them: the first half is fully sorted before the second half moves at all.
You should do another one that shows why n*lg(n) is the best you can hope for in a comparison-based sorting algorithm. Then show why/how counting sort can beat it with O(n) but isn't usable on many sorting problems.
Nice video and explanation for those who are new to algorithm analysis.
Wow, that last algorithm is so amazing :)
Wow, this is exactly what I thought you should do next.
This was nice. Would've been nicer to have had this before my workshop about collections and sorting, but whatever.
Shuffle sorts are the best kind of sorts, because at least you had fun!
xkcd actually talked about that a bit - how Google tends to fudge the numbers.
To anyone interested in reading about it, the article is posted in the xkcd blog archives under February 2011.
Thank you so much for this video. It really helped me understand the concepts of these algorithms.
This was awesome! Do one on quick sort too! Also do one where you answer the question "How does Google return search results from billions and billions of websites almost instantly?"
I'm new to programming, so looking at the end code with knowledge of its purpose and process is very helpful.
A bit later on, after other videos, a video on the for loop and its OO sibling foreach could be interesting: how it leads to spatial and temporal locality in memory, and how it can be unrolled and executed in parallel depending on branching and speculative execution. There's easily 5 minutes there, and if you take your time to cover it in some depth with simple explanations you could get 15 minutes that would be interesting ;)
I don't understand why merge sort is always given an initial recursive decomposition step. You can form the initial base lists by simply collecting the elements 2 at a time, with the last element standing alone should the number of elements be odd.
That is correct, because it will always do the same number of steps for the same sized list (illustrated by the best and worst case scenarios both being n x logn).
Bubble sort is a zombie. It will never die, no matter how many times you try to kill it. It always comes back.
At the end of the universe, when the heat death is almost complete, and there's almost nothing left in the entire universe... someone will be there teaching bubble sort. It will never, ever, ever die. Even though it's the most horrible sorting algorithm that anybody has ever devised.
:D So, so glad you posted the code. This got me back into some programming after a long hiatus for work! I managed to incorporate a simple pre-load system so you don't have to reprogram the app each time you want to change the range of your test XP Simple enough, but it was still super fun :D
Amazing work with the animations! Makes merge sort so much easier to understand!
Best video on this channel so far! Great job!
I haven't been a huge fan of this channel Brady, even though I was really looking forward to it being launched. That being said, I really enjoyed this video! This is exactly the kind of stuff I would like to see!
Thumbs up from me!
Also, they probably wanted to explain some other sorting algorithms first, because quicksort falls back to one of these O(n²) algorithms when its current part is under a certain size - as you saw, n² can be quicker than the others when the size is low enough... at least that's what I remember from my algorithms and data structures classes :)
If you want to try a fast and easy-to-implement sorting algorithm, take a look at comb sort. It is a little-known algorithm, basically bubble sort but with a decreasing gap between the compared elements, and in practice it performs comparably to quicksort!
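A minimal comb sort sketch, using the usual shrink factor of about 1.3:

```java
// Comb sort: bubble sort with a shrinking gap; finishes as a plain
// bubble sort once the gap reaches 1.
static void combSort(int[] a) {
    int gap = a.length;
    boolean swapped = true;
    while (gap > 1 || swapped) {
        gap = Math.max(1, (int) (gap / 1.3));    // shrink the gap each pass
        swapped = false;
        for (int i = 0; i + gap < a.length; i++) {
            if (a[i] > a[i + gap]) {
                int t = a[i]; a[i] = a[i + gap]; a[i + gap] = t;
                swapped = true;
            }
        }
    }
}
```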
You are correct, it is generally implemented with recursion.
I was just about to comment about it being called bogo sort, but he got it in! I'm happy now.
There is also parallel universe sorting:
Shuffle the stack at random. Then check whether the stack is sorted.
If it isn't sorted, you go to a parallel universe where it is sorted.
That you still have to query with incredible speed. Even with indices you need multiple systems in place to make that search fast. Search is still one of the hardest problems in the industry, because it all revolves around cache invalidation, which is super hard.
Sorting Algorithms are a thing of beauty!
Computerphile, I subscribed before this video finished loading.
This is the best video yet!
We looked at bubble sorting in the decision maths unit in A level Maths. The different sorting methods are quite interesting :-)
What a great channel! Brings me back to my university days.
Yes, you can do something like if (#elements < someNumber) {bubblesort;} else {mergesort;} if that's what you mean - whether initially, or at some small enough sublist size, to speed up sorting the lower collections. Hybrid sorts are a good way of speeding the process up, but they are generally harder to implement, require more knowledge, and aren't necessarily worth it unless you're doing a lot of sorting.
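A minimal sketch of such a hybrid, assuming a CUTOFF below which the quadratic sort wins in practice (the exact value is machine-dependent and best found by measuring):

```java
// Hybrid sort: bubble sort for tiny ranges, merge sort above the cutoff.
static final int CUTOFF = 32;   // hypothetical threshold; tune by benchmarking

static void hybridSort(int[] a, int lo, int hi) {   // sorts a[lo..hi)
    if (hi - lo <= CUTOFF) {                         // small range: low overhead wins
        for (int i = lo; i < hi - 1; i++)
            for (int j = lo; j < hi - 1 - (i - lo); j++)
                if (a[j] > a[j + 1]) { int t = a[j]; a[j] = a[j + 1]; a[j + 1] = t; }
        return;
    }
    int mid = lo + (hi - lo) / 2;                    // large range: divide and conquer
    hybridSort(a, lo, mid);
    hybridSort(a, mid, hi);
    int[] merged = new int[hi - lo];                 // merge the two sorted halves
    int i = lo, j = mid, k = 0;
    while (i < mid && j < hi) merged[k++] = (a[i] <= a[j]) ? a[i++] : a[j++];
    while (i < mid) merged[k++] = a[i++];
    while (j < hi)  merged[k++] = a[j++];
    System.arraycopy(merged, 0, a, lo, merged.length);
}
```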
You should talk about linked list sorting using bidirectional (doubly-linked) lists.
It's a super fast sort for large sets of data.
You push your values to the right or left of existing values by link swapping, and you keep track of both ends of the list plus the halfway and quarter-way marks depending on how big the list gets; using those points you can quickly reduce the number of comparisons.
Exactly - it has to be for a pretty specific reason. The systems I work on use recursion to generate formula-based pricing, where one price can be based on another price, based on another price, etc. - the recursion moves down through the price hierarchy. It would have been very hard to achieve the same result without it. It sure can be confusing trying to work out pricing errors, though.
What about a 2^b^n sort? It checks what the range of values for the data type is, generates a list of random numbers equal to the size of the list, and checks if it both has the same values as the original list, and makes sure that it's sorted. It would probably be called the Coincidence Sort.
There are many many algorithms that have steps that seem spread like a tree. If an algorithm runs in some sort of logarithmic time, guessing it uses a divide-and-conquer approach somewhere is probably accurate.
The last sorting algorithm is the fastest sorting algorithm on quantum computers.
They can test every combination at the same time, and return the right one.
Yes, it's easy enough to tell on its own, but you can make an algorithm that determines whether to use bubble or merge sort, then sends the list to the right one.
Hey Computerphile, Really loving the videos on this new channel, especially this one! Would you be able to do a video on recursion and its applications in algorithms, and further, how to write algorithms using recursion. I would really like to know the thought process behind how to write good recursive programs. Thanks :)