Please return this is the best series on programming problem solving I've seen 😢
As an IOI and ICPC world finalist and IOI coach, these two videos are the best videos on DP that I've seen so far
The way you explained DP on your previous video (make observations, think in terms of subproblems, write the recurrence relation, etc.) is pretty much how I explain DP to my students.
But I usually don't cover iterative dp optimizations as shown in this video out of fear of confusing my students, but I'll start trying from now on. (I hope you don't mind if I copy parts of your explanation)
I'm honored that you would like to copy parts, feel free!
Yeah, I also think iterative optimizations are confusing and can be distracting, but I think it's very important to at least show them, because they convey the most important idea: DP always has a recurrence relation, even if it doesn't look like it.
I feel this instills confidence that approaching the problem recursively is the correct path forward and that they won't have to learn "the better iterative approach" next. This way, they can focus on building that recursive mathematical intuition and understand that iterative is an optimization, not a separate way of thinking.
This is literally the best approach to a DP question in an interview
I've already watched this video, specifically opened it to thank you. I used to be so scared of DP problems. House Robber seemed very hard to me. Now, I literally solved it in 5 mins. No other tutorial or video has helped me this much regarding DP or recursion.
best explanation of DP i've ever seen, love the idea of not relying on tables
Dude, you changed my programming life
i like that the examples force u to think, pretty proud of solving all of them correctly
babe wake up decodingintuition just posted a new video
If you are doing competitive programming in C, you can use function macros instead of functions and still use the recurrence relation without handling all the call-stack logic: just write a recursive solution and let the preprocessor do its job for you.
love the video, but that function naming scheme is criminal
I'm going to need to rewatch this in the future. There are a lot of techniques that I've never seen before!
You're very good at teaching. And thanks for the slides and explanations between coding, and the really well laid out solution.
Even though the video is really enjoyable and pleasant to watch, the DP approach can be conceived of more mathematically: it can be thought of as a topological sort of a DAG. Those three words account for the tables mentioned here, but also for a much broader range of cases. The rule of thumb (better yet, the slick trick) presented in the video then boils down to visualizing the ways the DAG could be validly sorted. Yes, I'm trying to prove that thinking in math is a powerful tool and can be an insightful approach to writing algorithms, especially for those enlightened individuals who spend their time dawdling over LeetCode problems 😂. Thank you for coming to my TED talk.
Yep, a correct order of dependencies is a valid topological sort on the dependency dag.
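The topological-sort view in this thread can be made concrete with a small sketch (my own illustration, not code from the video): an iterative DP's loop order is just one valid topological order of the subproblem dependency DAG. Here each Fibonacci subproblem k depends on k-1 and k-2, and any order the sorter produces guarantees dependencies are computed first.

```python
from graphlib import TopologicalSorter

def fib_via_toposort(n):
    # Dependency DAG of subproblems: k depends on k-1 and k-2.
    deps = {k: {k - 1, k - 2} for k in range(2, n + 1)}
    dp = {0: 0, 1: 1}  # base cases
    # static_order() yields any valid topological order, so every read
    # below hits an already-computed value.
    for k in TopologicalSorter(deps).static_order():
        if k not in dp:
            dp[k] = dp[k - 1] + dp[k - 2]
    return dp[n]
```

The usual `for k in range(2, n + 1)` loop is simply one particular topological order of this same DAG.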
never seen floyd mayweather implemented as recursive dp 😱
also the idea that you don't need to think about inner loop order when outer loop dependency is strictly one directional blew my mind 💪🏽
another video that will carry a generation 🔥🔥
I think it's actually really neat implementing it recursively, because you can actually understand how and why it works. I've always seen it in its iterative, fully space-saved form.
You are DP GOD🙏 It's all math down the root. So clear. I'll rewatch this multiple times. THIS is TRUE problem solving.
Yes! Haven't watched it yet but glad to see you again
Very nice video. Hoping to get a video on backtracking also.
damn so good. took over an hour to rewatch these two videos with the help of chatgpt explaining a few things 😅 but this has really demystified those cryptic leetcode solutions ❤🥺
Bro you are literally the goat among goats I have never seen anyone explain better than you!
I've found success in explaining recurrence relations as states and state transitions, because that is what they are, just not the way many people think about them. This is also the way I write recursive functions and how I explain writing them to others: think state and state transitions, then think stopping point. Then, if needed, think about how to store values in states directly, and about ordering the operations to calculate the right states first, so you avoid storing data you don't need, if that is indeed an issue.
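That checklist can be sketched on a tiny example (hypothetical, not from the video): counting the ways to climb n stairs taking 1 or 2 steps at a time.

```python
def climb(n):
    # state: number of stairs already climbed
    # transitions: +1 step or +2 steps
    # stopping point: 0 stairs climbed has exactly one way (do nothing)
    # storage + ordering: fill smaller states first so every transition
    # reads a finished value
    ways = [0] * (n + 1)
    ways[0] = 1
    for s in range(1, n + 1):
        ways[s] = ways[s - 1] + (ways[s - 2] if s >= 2 else 0)
    return ways[n]
```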
such a concise description of the benefits of iteration in dp *i have no idea what im talking about*
I have been blessed with another masterpiece!
Thanks :) I am really grateful to you for showing me a whole new dimension of problem-solving.
Really funny jokes + best explanations of DP back to back?? How does this channel exist, and how do you do it??
this video made me wonder how hard it'd be to implement a compiler that can take a pure recursive math function like that and turn it into an iterative approach
might be a fun pet language to make, boundschecking at a compile time level & the like
You know I've actually thought about that too. It's like so systematic I'm pretty sure it's possible. Let me know if you end up doing it!
Just subscribed after watching the DP video that popped up on my feed and another one is up already! I must be lucky.
Great video Mr. Effort. Glad you are making videos.
Also, how did you get to your level? I wanna be smart guy like you:(
That Breaking Bad joke was amazing.
Love your content bro ❤.Need more
9:33 It took me a bit to understand the motivation behind "size decreases every time" -> "small before big", but it just clicked for me:
If you were to start from n instead of 0 in this example (where say n=10), the body of your loop will be indexing into an array at a lower index (say 10 - 1), which has not been initialized yet, which is bad. So you are just explaining that the order with which you read values from the table matters, because all the values start uninitialized. The lesson from all of your puzzles is to check for every argument whether it is increasing or decreasing over time! :)
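A minimal sketch of this point (my own example): with small-before-big, every dependency is filled before it's read; flip the loop direction and the body reads cells that are still uninitialized.

```python
def fib_small_before_big(n):
    dp = [None] * (max(n, 1) + 1)
    dp[0], dp[1] = 0, 1
    for k in range(2, n + 1):           # dp[k-1], dp[k-2] already written
        dp[k] = dp[k - 1] + dp[k - 2]
    return dp[n]

def fib_big_before_small(n):
    dp = [None] * (max(n, 1) + 1)
    dp[0], dp[1] = 0, 1
    for k in range(n, 1, -1):           # first read is dp[n-1], still None
        dp[k] = dp[k - 1] + dp[k - 2]   # -> TypeError for n >= 3
    return dp[n]
```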
I gotta say I was kind of frustrated up till around 11:27, because we weren't really shown how the recursive function would be slightly rewritten to use a lookup table. I was stubbornly not wanting to spoil myself by skipping ahead, so I was just constantly rewinding stuff in the first 10 minutes for like an hour straight, causing me to feel kind of overwhelmed. I would have appreciated it if you had started rewriting the recursive function a little sooner, though I get why you explained it the way you did. I still gained an equal amount of knowledge at the end of the day, so I can't complain.
Yeah whether you accept it or not you have to think about the tables from time to time
12:45 lol love that u just typed "sdfksjfslf" to clear the highlight. love vim
Lol, you noticed! I don't like having the highlight stay.
@@DecodingIntuition fwiw, you might find :noh worth memorising :3
@@ShadowKestrel I just bound it to ",", so when I press it, it calls :noh. Much more intuitive.
mashing the keyboard achieves the same effect (without having to use shift) + releases stress, so it's optimal.
Let's go we got a sequel!
Dropped everything to watch this!
LOL I love that gatekeeping spirit
Hi. Thanks for this amazing content.
I have watched both of your videos but don’t get it.
Would you consider making one video with a better, slower explanation?
I can try in the future, but I'm finding it hard to fit in the topics I already want to cover.
The way I think about it, recursive solutions *are* iterative solutions, they just have a load of overhead on top of the stack in exchange for not needing to think about the stack
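A sketch of that idea (my own, under this comment's interpretation): the same Fibonacci recursion, but with the call stack managed by hand instead of paid for in function-call overhead.

```python
def fib_explicit_stack(n):
    memo = {0: 0, 1: 1}
    stack = [n]                              # our hand-rolled call stack
    while stack:
        k = stack[-1]
        if k in memo:
            stack.pop()                      # "return" from this frame
        elif k - 1 in memo and k - 2 in memo:
            memo[k] = memo[k - 1] + memo[k - 2]
            stack.pop()                      # both deps ready: compute & return
        else:                                # "call" the missing subproblems
            stack.extend(d for d in (k - 1, k - 2) if d not in memo)
    return memo[n]
```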
this is legitimately the DP bible
holy shit i'm late but the GOAT IS BACK
great explanation! this made things so clear. how do you get those vim motions in leetcode?
19:06 the funny last minute switch from (probably) a simple calloc and an assignment to a list concatenation trading a line of code for 6ms (± noise)
Woe is me, the difference between the two runs was 1ms more than I said
best memes, best content, best dp
At 4:25 I assume you mean that you can allocate and initialize an array at *compile-time*, not runtime, to save time & memory.
yep, misspoke, but the slides are right
Keep it up dude 😎👏
continue posting videos until the end of time
peak content
It's over for iterative dp deniers.
I tried to apply a similar approach to the one showcased in the video to other problems, such as Minimum Cost for Tickets or Longest Palindromic Subsequence (I tried really hard on this one), with no luck. I believe I am unable to do so because these follow a slightly different pattern: instead of a "skip/take" choice, they "take" from a list of options. Could you please consider doing an upcoming video about these more specific types of problems, showing how we could approach them using this method of identifying bounds and dependencies so our recursive calls can be converted into values? I have no clue what I could be doing wrong.
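For reference, here is a sketch of the "take from a list of options" shape this comment describes, using Minimum Cost for Tickets. The names and structure are my own guess at how the video's method would apply, not code from the video.

```python
from functools import lru_cache

def min_cost_tickets(days, costs):
    # days: sorted travel days; costs: [1-day, 7-day, 30-day] pass prices
    travel = set(days)
    last = days[-1]

    @lru_cache(maxsize=None)
    def cost_from(d):                   # min cost to cover all travel >= day d
        if d > last:
            return 0                    # past the last trip: nothing to pay
        if d not in travel:
            return cost_from(d + 1)     # not traveling today: skip for free
        # "take" the best of a list of options instead of one skip/take pair
        return min(costs[0] + cost_from(d + 1),
                   costs[1] + cost_from(d + 7),
                   costs[2] + cost_from(d + 30))

    return cost_from(days[0])
```

The same bounds-and-dependencies analysis still applies: d only ever increases, so an iterative version would fill d from large to small.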
Thank you so much
Amazing videos, instant sub
Now do one for greedy problems
At 11:00, why is the direction of the bit shifted parameter irrelevant here? It seems like the "take" item (and therefore "ans") depends on the value of dp where bs is less than its value when inputted into the function. You can tell that the value of bs inputted into this next recursive iteration is lower because if bs & (1
How do you begin to understand this. The recursive approach made some sense but I'm lost here.
What part are you lost at? It's not a different approach, we are just making optimizations. We're just manually managing the control flow, so we have to do a few extra steps that we wouldn't have to do recursively.
@@DecodingIntuition I've rewatched both videos now. At a high level it makes sense and I get the concepts. I find it difficult to apply on my own as I don't understand the "why" behind a lot of the rules and relations. Especially the space saving trick with the array 😅
Ah that's just practice and exploring it yourself. An important part is just to spend some time with it to make it comfortable.
That's how I learned it. I read someone else's stuff, it kinda made sense; I did problems, drew the parallels, and then started making my own system to reliably do it, then tested that system on problems to see how well it works.
It's not easy, but it's doable in a repeatable way. And that's what I really wanted to show.
this video confused me even more. i hope one day it makes sense to me
Hmm. I tried reversing the order of x in the final solution, so x goes from 'amount' down to 'c' and got the result for the number of ways if you can use at most one of each coin rather than an arbitrary amount.
That's because there's a different hidden recurrence relation. Since we are iterating in the opposite direction of the x dependencies, we are grabbing the old values. So "take" isn't calculating dp[i][x-coins[i]] anymore, it's actually calculating dp[i+1][x-coins[i]].
This matches the behavior you are seeing: when we take a coin, we decrease the amount and move to the next coin, i + 1 (we aren't considering reusing it anymore).
The space saving trick is concise but hides a ton of details.
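A compact sketch of both hidden recurrences (my own code, not the video's): only the direction of the inner loop changes, yet one version counts unlimited coin reuse and the other counts each coin at most once.

```python
def ways_unbounded(coins, amount):
    dp = [0] * (amount + 1)
    dp[0] = 1
    for c in coins:
        for x in range(c, amount + 1):       # ascending: dp[x-c] is the NEW
            dp[x] += dp[x - c]               # value -> coin c can be reused
    return dp[amount]

def ways_at_most_once(coins, amount):
    dp = [0] * (amount + 1)
    dp[0] = 1
    for c in coins:
        for x in range(amount, c - 1, -1):   # descending: dp[x-c] is the OLD
            dp[x] += dp[x - c]               # value -> coin c used <= once
    return dp[amount]
```

For coins [1, 2, 5] and amount 5, the first counts 4 ways (reuse allowed) while the second counts only the subset {5}.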
will be a good video i can tell
question: why do you sound exactly like the anakin skywalker dialectic memes? also great video
it's because i'm the chosen one
Hey, based on what observation that made you come up with 2D array dp, instead of some techinque else
This is a sequel to the first video linked in the description. I explain my thought process there.
I wish you explained why this works even when initialising everything to None. It's a pretty important part of understanding the solution, but it's glossed over. I also wish there were some kind of explanation of why the recurrence relation exists, or how to get it, rather than just saying there is a recurrence relation, like you did in the last video. Honestly, this explanation will not make sense to anybody who actually wishes to know what the algorithm is doing and can't just rely on abstract mathematical relations.
1. I did explain and it wasn't glossed over. If you handle the order correctly, the values will be overwritten before they are used.
2. This is the same problem. Same recurrence relation. Nothing changed. It's a part 2; why would I re-explain what I spent 20 minutes on before?
3. It shouldn't make sense to people who don't want to rely on abstract mathematical relations. That's the point. If you want to do well in DP or just algorithms in general without relying on abstract mathematical relations, good luck because algorithms are mathematical and if you want to be good you should embrace that.
@@DecodingIntuition
1. I don't think that's a proper explanation, but just an assurance. I had to trace out the algorithm to realise what you had meant. Yeah it gets overwritten, but it would have been nice to just immediately see *why* handling the order correctly would result in them being overwritten, and that it's because of the main base case and handling of when x == 0.
2. I mean that it wasn't explained, in the same way I feel it wasn't really explained in the last video. Just my opinion that it could use some more depth. In the video, you describe the actions, then say "I have a recurrence relation, I know this is right, I can start coding," and then you start coding. But... why is the relation intact? You can somewhat figure it out by inspecting the rest of the code, deducing that it's the number of ways I can make the required sum after using this coin + the number of ways I can make this sum with the next coin. As far as I remember, you don't ever even say that.
3. I *want* to embrace, that's why I'm watching! And don't get me wrong, this is likely the best video there is on TH-cam discussing this approach to DP. But it's very hard to make a direct jump from thinking about this algorithmically and visually to thinking about it inductively without some steps in between. Seemingly your target audience are exactly the people who haven't looked at DP through the lens of mathematical relations to this extent, so it would just be nice to help bridge that gap a bit. I can't instantly reconcile your approach with the one that's already seared into my memory and I'm sure many people can't do that super easily either. All I'm saying is some stepping stones into this way of thinking would have been nice.
I'm not trying to say this video was bad or anything, just what I struggled with grasping and what could have helped. You don't have to cater to it. I'm just putting it out there in case you want to consider it for your future work. Thanks anyway.
1. Fair. I think it's good that you went and worked through it, though. There are way too many details to address. My goal was to convey the bigger picture: there exists a reliable way to get from a recursive solution to an iterative one, and ultimately an iterative DP solution isn't a different way of thinking, it's an optimization. When I think about it, this video wasn't explaining why it works so much as that it does work, hoping that this is enticing enough for you to explore it yourself (and to trust that thinking recursively is worthwhile vs visualizing the table every time).
2 and 3. True, there are details that I handwaved a bit. I do try to minimize that, but there are an infinite number of details, and at some point spending too much time on them distracts from the bigger picture. That being said, I never said it was easy; it's just not as monstrously unapproachable as many make it out to be. You'll still have to put in the work to understand it. My goal was just to show a reliable path, not an easy one (because that doesn't really exist). My hope was that any of the details I chose to omit were small enough that they could be worked out. But idk, it's a tough decision to make.
Thanks for the feedback, I'll consider this in the future.
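My reading of the None-initialization point discussed above, as a sketch (not the video's exact code): every cell starts as None, and iterating in dependency order guarantees each cell is written before any other cell reads it.

```python
def count_ways(coins, amount):
    n = len(coins)
    dp = [[None] * (amount + 1) for _ in range(n + 1)]
    dp[n] = [1] + [0] * amount          # base row: only x == 0 is makeable
    for i in range(n - 1, -1, -1):      # later coins first
        for x in range(amount + 1):     # smaller amounts first
            skip = dp[i + 1][x]                                 # row below: filled
            take = dp[i][x - coins[i]] if x >= coins[i] else 0  # same row, smaller x
            dp[i][x] = skip + take      # None is replaced before anyone reads it
    return dp[0][amount]
```

If the loop order were wrong, `skip + take` would hit a `None` and raise a TypeError, which is exactly why None is a useful sentinel here.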
what about finding the longest path in a binary tree
9:41 doesn't the first recursive call start at 0 and go to n? How is it decreasing?
For example, 'for left_size in range(size)' means left_size is 0->n. Then we call numTrees_ofSize(left_size) which is 0->n
range() goes up to but not including.
So it goes 0 to n-1. #justpythonthings
@@DecodingIntuition Right but it's not decreasing
range(size), highest is size - 1, size - 1 < size. It is decreasing.
There's underlying logic not being explained here on why the left recursive call is considered decreasing with your method
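To spell out the logic this thread is debating (my own sketch, not the video's code): dp[size] only ever reads strictly smaller sizes, so filling small-to-big means every read hits an initialized cell, even though the loop variable `left` itself counts upward.

```python
def num_trees(n):
    # dp[s] = number of BSTs with s nodes (Catalan numbers)
    dp = [0] * (n + 1)
    dp[0] = 1                               # empty tree
    for size in range(1, n + 1):
        for left in range(size):            # left in 0..size-1, always < size
            dp[size] += dp[left] * dp[size - 1 - left]
    return dp[n]
```

"Decreasing" refers to the argument of the recursive call relative to the caller (left < size), not to the direction the loop variable moves.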
I have underschtänd
1:24 🤖🤖🤖🤖
uhhhhhhh what??? I don't understand a word you're saying. where am I?
blind man
home cec
Why not name the function params properly instead of the long-ass name + keeping track of what i, j, k mean?
the better you get at math the worse your software development becomes
The reason is I wanted it to read like natural language.
So numWays_startingWith_toMake(i, x) closely matches "number of ways starting with the i th coin to make x amount"
vs something like
numWays(startCoin, amountToMake)
That code was made in part 1, which was meant to show how recurrence relations can be converted from English statements. You can probably argue it works for the second one too, but that's just how I decided to do it.
@@DecodingIntuition it doesn't really read naturally to me as someone who doesn't usually do much math. I can see how it can if you're used to funny cryptic math characters and unreadable single-letter variables, but definitely not otherwise. I really don't enjoy when the mathy style of writing code gets brought into programming; it's always just so terrible to read, especially considering that it usually doesn't even have verbose function names. It makes sense why it is like that in math, where funny characters don't need localization, but for programming, which is already in English, it's just a downgrade in readability.
this is just instructive, you can name it however makes most sense to you
Your videos are so good, but can you PLEASE code your next video in a language like Java or C? Python is so ugly it's almost prizewinning.
I can't take "python is ugly" seriously if you suggest java as an alternative. I've seen clean C++ and elegant python, but java is always terrible to look at.
Do clojure
Wrong DP video 🥲
witchcraft
Didn't understand anything 😅 this is probably black magic