Congratulations on getting the sponsorship deal!
Thanks, they're very easy to work with! Absolutely recommend them :)
I’m actually dating them rn. Like right. Now.
I just can’t deal with type annotations on an add function. Is this not DEVOlution? Next we’ll be calling it “addi”, then “addf”, “adds”, “addl”… I can keep going… “addc”…
Very nice collab! The deprecated decorator is much-needed functionality, especially for libraries. Coming from a Java background, not having this decorator in the main Python language surprised me.
May I ask what you mean by that?
Because Java indeed has one.
You could argue that the semantics are not the same, but syntactically they are ^^.
@@sir_no_name1478 He said Python is missing one, not that Java is missing one.
@@Nicoder6884 oh yeah can't read it seems ^^
One of the best uses of the cache decorator is in recursive functions. Every recursive call gets cached, meaning if you call factorial(n) and then factorial(n+2), only 2 extra recursive calls will be made, for a total of n+2 calls. Any call to factorial(m) for m <= n will also be entirely cached.
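A minimal sketch of that effect, using functools.cache on a simple factorial (not the video's exact code):
```python
from functools import cache

@cache
def factorial(n: int) -> int:
    return 1 if n <= 1 else n * factorial(n - 1)

factorial(10)                  # 10 misses: factorial(10) down to factorial(1)
factorial(12)                  # only 2 new calls: factorial(12) and factorial(11); factorial(10) is a hit
print(factorial.cache_info())  # CacheInfo(hits=1, misses=12, maxsize=None, currsize=12)
```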
Fibonacci is the go-to function for demonstrating the cache decorator.
A good demo would be some use cases that show the advantages of @cache vs @lru_cache, and when to use each.
If you input numbers like 1000 you get a RecursionError.
Also isn't it
@@cycrothelargeplanet you can change the limit with sys.setrecursionlimit()
When it comes to retries, you may want to have a look at the Tenacity library.
E.g. Tenacity allows for random wait times, exponential wait times, or retrying only on certain exception types.
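A rough sketch of what that looks like with Tenacity (the decorator arguments are from its public API; the wrapped function is just a stand-in):
```python
from tenacity import retry, retry_if_exception_type, stop_after_attempt, wait_random_exponential

@retry(
    retry=retry_if_exception_type(ConnectionError),      # only retry on this exception type
    wait=wait_random_exponential(multiplier=1, max=30),  # jittered exponential backoff, capped at 30s
    stop=stop_after_attempt(5),                          # give up after 5 attempts
)
def fetch_data() -> dict:
    raise ConnectionError("flaky network")               # stand-in for a real network call
```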
Also has a cool name!
Yep, retrying only on certain exceptions is great. Catching any exception is just nasty.
Simply the best videos for people learning Python. You're a highly effective communicator who has a teaching spirit. Thank you for helping!
Thanks for the kind words :)
Thanks!
Thank you for the support! :)
Tenacity’s retry decorator is really good and quite expressive.
atexit: for most applications, use a context manager or a try/finally block to close that database. atexit mainly applies to long-running Python programs that run as a server or daemon.
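For context, a small sketch of the context-manager approach mentioned above, using sqlite3 as a stand-in for whatever database you're closing:
```python
import sqlite3
from contextlib import contextmanager

@contextmanager
def open_db(path: str):
    conn = sqlite3.connect(path)
    try:
        yield conn      # hand the connection to the with-block
    finally:
        conn.close()    # runs even if the with-block raises

with open_db("app.db") as conn:
    conn.execute("SELECT 1")
```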
I have started using custom decorators to register functions and classes into data computation scripts, like effects on audio files, and specific element parsers for an XML file parser. I feel like decorators are pretty handy for stuff like that.
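Presumably something like this registration pattern; the effect name and signature here are made up for illustration:
```python
from typing import Callable

EFFECTS: dict[str, Callable] = {}

def register_effect(name: str) -> Callable:
    """Register a function in the EFFECTS table under the given name."""
    def decorator(func: Callable) -> Callable:
        EFFECTS[name] = func
        return func  # return the function unchanged; registration is the only side effect
    return decorator

@register_effect("reverse")
def reverse_effect(samples: list[float]) -> list[float]:
    return samples[::-1]

print(EFFECTS["reverse"]([0.1, 0.2, 0.3]))  # [0.3, 0.2, 0.1]
```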
atexit: Doug Hellmann's book on the Python 3 standard library (pages 995-997) mentions three conditions under which atexit handlers will not be invoked: 1) the program dies because of a signal, 2) os._exit() is invoked directly, or 3) a fatal error is detected in the interpreter.
atexit can survive some signals, but not all of them. SIGINT is handled by Python (it becomes KeyboardInterrupt), so atexit handlers still run; SIGTERM is not handled by default, so atexit won't run unless you install a signal handler that exits cleanly. Also, if cleanup handlers take too long, the OS or orchestrator might terminate the process anyway, so beware of that (Docker does this, for example: first it sends SIGTERM, and if the process is still running after 10 seconds, it sends SIGKILL). SIGQUIT is likewise not handled by Python by default. SIGSTOP cannot be caught or handled on most systems, and SIGKILL forcefully terminates the process no matter what on every OS I know of. As for os._exit(), it calls the C-level _exit() directly, so it skips atexit and all other interpreter cleanup.
@@tomwin6975 Thank you for the insight.
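If you do want atexit handlers to run on SIGTERM (e.g. under Docker), one common pattern is to install a handler that calls sys.exit(), so the interpreter shuts down normally; a minimal sketch:
```python
import atexit
import signal
import sys

@atexit.register
def cleanup() -> None:
    print("closing resources")

def handle_sigterm(signum, frame) -> None:
    # sys.exit() raises SystemExit, so the interpreter shuts down normally
    # and the registered atexit handlers get a chance to run.
    sys.exit(0)

signal.signal(signal.SIGTERM, handle_sigterm)

# ... long-running server/daemon loop goes here ...
```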
Once I finish the project I'm working on, I'll watch a lot of your videos and try to apply some new learnings to my old Python projects. Really informative stuff you're putting out.
Some code that might be useful to my project, a source that looks well worthy of a sub, and a literal belly laugh. Thanks for that mate.
I like the sponsor!😂
A fun follow-up video would be a brief overview of how to create your own decorators! I've done so for some client-specific code and it's a fun exercise 😁
I've also written the retry and timing decorators a few times for various projects; I wish they were standardized in a built-in module.
Probably one of the coolest things I ever created that I unfortunately no longer have was a decorator that would turn a function into a tkinter form automatically.
In what way? Like it would create a form for inputting arguments, and displaying function results?
@@LF-Me yeah exactly
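I don't know how the original looked, but a bare-bones sketch of the idea might use inspect.signature to build one input field per parameter (as_form and greet are made-up names):
```python
import inspect
import tkinter as tk

def as_form(func):
    """Attach a .launch() helper that shows a tiny tkinter form for the function."""
    def launch() -> None:
        root = tk.Tk()
        root.title(func.__name__)
        entries: dict[str, tk.Entry] = {}
        for name in inspect.signature(func).parameters:
            tk.Label(root, text=name).pack()
            entry = tk.Entry(root)
            entry.pack()
            entries[name] = entry
        result = tk.Label(root, text="")
        result.pack()

        def submit() -> None:
            kwargs = {name: entry.get() for name, entry in entries.items()}
            result.config(text=str(func(**kwargs)))  # note: all inputs arrive as strings

        tk.Button(root, text="Run", command=submit).pack()
        root.mainloop()

    func.launch = launch  # the original function stays callable as before
    return func

@as_form
def greet(name: str) -> str:
    return f"Hello, {name}!"

# greet.launch()  # opens a window with a "name" field and a Run button
```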
Thanks for the great video!
Obviously you know it, but I prefer one-liners for these:
sum(1 for letter in text if letter in vowels)
Excellent content, thanks for sharing it (:
For those thinking about counting vowels in a str, the version below has time complexity of O(n)
```python
def count_vowels(input_str: str) -> int:
    vowels = 'aeiouAEIOU'
    return sum(1 for char in input_str if char in vowels)
```
You can also use lru_cache instead of cache; it will automatically evict the entries that haven't been used recently once the cache is full.
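A quick sketch of the difference: lru_cache takes a maxsize, and cache_info() shows what happens when entries get evicted (the function and numbers are just for illustration):
```python
from functools import lru_cache

@lru_cache(maxsize=2)            # keep only the 2 most recently used results
def square(n: int) -> int:
    return n * n

square(1); square(2); square(3)  # computing 3 evicts the result for 1
square(1)                        # recomputed, because it was evicted
print(square.cache_info())       # CacheInfo(hits=0, misses=4, maxsize=2, currsize=2)
```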
Super informative, thank you!
This was really helpful, I was struggling with writing this into every function where I wanted to measure time or apply retry functionality.
These are excellent (as are ALL of your videos). Thank you! 😺
Very cool stuff, Thank You!
It would be helpful if you showed the decorator code in the video as well.
The sponsor shout out was the best lol 😂😆😂
Absolutely epic😎
It's appreciable ❤
Amazing🎉
finally a channel that uses a real IDE
Shout-out to the funcy library, while we're here!
I like to add a filter for exception types when writing a retry decorator. When fetching some stuff, it might fail due to connection issues or rate limiting. Then it is fine to retry it. But if you parse or transform the result in the same function, it will always fail if something is different than expected, so there's usually no need to retry it.
And, my favourite case, if you cause a KeyboardInterrupt at the same time, would it retry it as well?
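On that last question: if the filter tuple only contains Exception subclasses, KeyboardInterrupt (a BaseException) is never swallowed, so it won't be retried. A sketch of that kind of filtered retry decorator (not the video's exact implementation; names and defaults are made up):
```python
import time
from functools import wraps

def retry(exceptions: tuple = (ConnectionError, TimeoutError), retries: int = 3, delay: float = 1.0):
    """Retry only on the given exception types; anything else (incl. KeyboardInterrupt) propagates."""
    def decorator(func):
        @wraps(func)
        def wrapper(*args, **kwargs):
            for attempt in range(1, retries + 1):
                try:
                    return func(*args, **kwargs)
                except exceptions:
                    if attempt == retries:
                        raise            # out of attempts: re-raise the last error
                    time.sleep(delay)
        return wrapper
    return decorator

@retry(exceptions=(ConnectionError,), retries=3)
def fetch() -> None:
    raise ConnectionError("flaky network")  # a ValueError here would NOT be retried
```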
Instead of writing your own retry decorator, use backoff. It's also compatible with asyncio. But it's an additional dependency.
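For reference, backoff's decorator looks roughly like this (on_exception and expo are its documented entry points; the function body is a stand-in), and it works on async def functions too:
```python
import backoff

@backoff.on_exception(backoff.expo, ConnectionError, max_tries=5)
def fetch_data() -> dict:
    raise ConnectionError("flaky network")  # stand-in for a real request
```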
What is the name of the VS Code theme used by Carberra?
*Has a video where Carberra pronounces his name*
still says "car bear uh"
lol
1:37
if retries < 1 or delay <= 0:
    raise ValueError('Are you high, mate?')
has me ROFL'ing!
does anyone know which color scheme carberra was using?
If the computation of a function is fixed and it will produce the same result every time, why would I want to use cache and not just store the output in a variable?
Because what you're talking about is a constant, and creating a constant for every single input scenario is just not ideal, and probably not possible due to the infinite number of possible inputs.
The connect() function never returns, because of the raise. The correct return annotation would be -> NoReturn.
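I.e. something along these lines (the body is a stand-in, not the video's code):
```python
from typing import NoReturn

def connect() -> NoReturn:
    # Always raises, so it never returns normally; NoReturn tells the type checker that.
    raise ConnectionError("could not connect to the database")
```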
Retry is already handled by the requests library. cache_clear was interesting.
0:49 Did PyCharm create the time import for you automatically, or was it just a jump-cut? That's really convenient if it's PyCharm. I moved from PyCharm to VS Code when I needed the paid-for features. Meh, VS Code probably does the same thing if PyCharm does, and I'll never bother to set it up.
Yeah it was PyCharm :)
You can enable auto imports in VS Code by clicking the curly braces next to Python in the bottom status bar and enabling import completions in the menu that pops up.
@@rmHawk765 Huh, it works! Hats off, thanks so much!
What did you use to make "->" appear as an actual arrow "→" at 11:18?
Ligatures; fonts in VS Code and PyCharm support them.
Python decorators are a lot less cool if you have to step into them while debugging... ;) I actually now prefer utility functions with a lambda, or resources.
These are cool, thanks.
Though it's my personal opinion that retry and get_time shouldn't be decorators; you could just call the higher-order function whenever you need it instead of affecting the original function.
Are the two not equivalent? Except the decorator influences all calls to the wrapped function, whereas changing the call site only influences one. A decorator is just a higher-order function?
@@DrGreenGiant That is exactly what I mean. I just think it's better to keep your functions less coupled and use the "decorator" as a normal function, so e.g. when you want to test your code that runs 3 times, you can work with the original function that runs once and not the version that runs 3 times or whatever.
@@spaghettiking653 Ah, I see what you mean, yes. I guess it depends on the use case. I can imagine a worker function being sent to a pool, for example, where it would probably be easier to temporarily decorate the worker than to change the call site, especially if there is already some higher-order stuff going on with partials. But like you say, I can imagine many other cases where it would be much better to wrap the call site rather than the definition. More food for thought, ty!
@@DrGreenGiant Ty yourself, I didn't give much thought to threads or any other use cases like that :)
So these aren't 5 Python decorators, they're 5 decorators written by your average Python fan.
Cache and atexit are part of the standard library.
It would probably be better to use a set instead of a string: vowels = set('aeiouAEIOU') and then if letter in vowels: ...
good voice. thx
sponsor made me laugh :)
Can you help me understand how is the @atexit.register functionality different from Context Manager functionality in Python?
Context managers are for cleanly disposing resources like files and database connections when they're no longer needed. This even happens when an exception is thrown in a "with" block. atexit is for global cleanup at the end of the program, which is usually not needed with small scripts.
Functionally, atexit is like placing your whole program in a context manager and running the supplied function in the `finally` block.
What is the name of the IDE he uses?
Just using memory is not a memory leak
I don't remember anyone stating that :)
4:40
@cache internally builds a mapping (essentially a dict, but a bit fancier) between the inputs and outputs of each of your function calls. Since this creates a reference to those objects, they don't get garbage collected as long as the function is around, which is typically the lifetime of the program, long after you're done with them. This can be nasty if you @cache an object method: it prevents the object from being garbage collected at all.
Good time stamp, but I said: "can lead to memory leaks".
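A small sketch of that method-caching effect with a made-up class; the cache stores (self, item) keys, so the instance stays alive:
```python
import gc
import weakref
from functools import cache

class Repo:
    @cache                   # the cache lives on the class-level function and keys on (self, item)
    def lookup(self, item: int) -> int:
        return item * 2

repo = Repo()
repo.lookup(1)
ref = weakref.ref(repo)
del repo
gc.collect()
print(ref() is None)         # False: the cache still references the instance
```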
Your retry count is wrong. If you have four retries, there should be five total attempts. The first attempt is not a retry.
nice =D
I want pip install indently 😊
Yours is as good as it comes
My favourite is @mark from pytest
pytest rules, I just love how you can make a fixture out of other fixtures; you only need to import the composite fixture and it'll seamlessly mix in the parametrization of all the fixtures it's using under the hood.
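A tiny sketch of that composition (fixture names are made up): the test only asks for the composite fixture and still runs once per param of the underlying one.
```python
import pytest

@pytest.fixture(params=["sqlite", "postgres"])
def backend(request):
    return request.param

@pytest.fixture
def connection(backend):          # composite fixture, built on top of the parametrized one
    return f"connection to {backend}"

def test_connection(connection):  # runs twice, once per backend param
    assert connection.startswith("connection to")
```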
A good set, but I was a little disappointed that some things were not called by their proper names: cache is memoization. Also surprised that wraps from functools wasn't mentioned.
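For anyone unfamiliar, functools.wraps copies the wrapped function's metadata onto the wrapper; a minimal sketch (the decorator here is made up for illustration):
```python
from functools import wraps

def logged(func):
    @wraps(func)   # preserves __name__, __doc__, etc. of the decorated function
    def wrapper(*args, **kwargs):
        print(f"calling {func.__name__}")
        return func(*args, **kwargs)
    return wrapper

@logged
def add(a: int, b: int) -> int:
    """Add two numbers."""
    return a + b

print(add.__name__, "-", add.__doc__)  # "add - Add two numbers." rather than "wrapper"
```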
retrying is a python module which is present already
If you find it remember to share it with the rest of us :)
Tenacity is a Python package that implements a retry decorator
pip install retrying
Hope it helps ;).. love your content! Learned a lot from it. Keep sharing your knowledge!!
Ahaha, I didn't know you meant "retrying" as in that was the module name, thanks for sharing!
1:37 Are you high, mate?
this is the kind of stuff I do for side cases in functions
I give it a casual error 💀
You don’t count the amount of vowels. You count the number of vowels. For some reason this distinction, which is elementary school grammar, has completely collapsed within the last 2 years. Same for less/fewer.
Thanks for the English lesson :)
You clearly know what he meant
Typical YouTuber... overly simple examples. Nobody should be logging with print. Now, how does your decorator know what logger to use? Also, make connect async, since sync is mostly legacy for such code now.
If you want to show off your knowledge on beginner tutorials on the internet, try including some examples to help other people, otherwise university might be a better environment for your critique :)
Wow, deprecated and atexit.register ones are really useful
Gonna use them in my project, thank you❤
2:43: wait, didn’t you say at the start of the video that it was sponsored by Indently? Then why do you say it isn’t sponsored? 🩳
woosh