This is my favorite Rust talk that I've seen. I also really like ones on the more advanced language features, but when it comes to selling the idea behind Rust, talks typically show off the power of "safe-land." This is the first time I really got a strong intuition for how Rust safely builds its abstractions up.
Dangers of C++ starts at 5:10; Unsafe Rust intro starts at 14:47; Title slide at 19:50
The difference is in HOW you write unsafe code. If the programmer has to do something too advanced for the Rust compiler to comprehend, you can take over control. There's nothing wrong with calling an unsafe function that has no flaws (i.e. is sound).
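To illustrate the point about taking over control: the usual pattern is to hide the unsafe operation behind a safe function that checks the invariant the unsafe code relies on. A minimal sketch (the function name `first_byte` is made up for the example; `get_unchecked` is the real standard-library method):

```rust
// A safe wrapper around an unsafe operation: callers of `first_byte`
// never write `unsafe`, because the function itself upholds the
// invariant (index in bounds) that `get_unchecked` requires.
fn first_byte(bytes: &[u8]) -> Option<u8> {
    if bytes.is_empty() {
        None
    } else {
        // SAFETY: we just checked the slice is non-empty, so index 0 is valid.
        Some(unsafe { *bytes.get_unchecked(0) })
    }
}

fn main() {
    assert_eq!(first_byte(b"hi"), Some(b'h'));
    assert_eq!(first_byte(b""), None);
}
```

The unsafety is contained: if `first_byte` is written correctly once, no caller can misuse it to cause undefined behavior.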
"Why do we even have that lever?!" Lmfao ^_^
Fun and useful talk, thanks!
Wait, what happens if the vector's length is already usize::MAX and then we push again? I don't see how the code presented deals with this potential problem.
It panics; it says so in the docs. push panics if the new capacity exceeds isize::MAX bytes, so you can never actually reach a length of usize::MAX.
Fantastic
What was going on at the very end when he mentioned the book?
O'Reilly discontinued their PDF offerings. Your only options are the Print Edition, Kindle Edition, and Safari Subscription.
The real talk starts at 5:10 :)
If you think the part before that wasn't relevant, you might have missed the metaphor ;-)
Better but now you depend on a huge ecosystem of crates with unsafe sprinkled all over. Not just the stdlib.
As a Rust learner, I think it is being sold wrongly.
The word "undefined behaviour" is thrown around too casually. The CPP conference people tend to abstract away hardware and concrete implementations, hence they just label anything not specified by the C++ specs as "undefined behaviour" because they cannot be sure how the implementations will handle such situations.
But almost all popular compilers do the most reasonable thing in these situations. Printing uninitialised memory will print whatever bits were stored in its place, overflowing an integer will result in it wrapping around to the smallest storable number, etc. This is not something CPP programmers would worry about while coding. This is fear-mongering. I fear I'm getting myself into a community full of these people, but damn, the promise is too great not to give it a shot, and so far, I like it.
I'd like to respond to that: in all these situations, is wrapping behaviour and printing undefined bits what you want? You can achieve all of that in Rust too, but it forces you to explicitly say that that behaviour is what you want. If I'm summing positive integers in an array and get a negative result, is that what I wanted, or will it break some assumption I make further on? Rust doesn't forbid this behaviour: there's a wrapping add, which lets you say you're sure you want it, and if you don't want to spell that out every time, there are wrapping integer types that wrap on every operation. You can read uninitialized memory, or even memory behind an arbitrary pointer, inside an unsafe block. I feel like the purpose of Rust is to help you not get bitten by these things when you're not expecting it, making it easier to do the right thing.
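A quick sketch of the explicit overflow-handling APIs mentioned above (all of these are real standard-library items: `wrapping_add`, `checked_add`, `saturating_add`, and `std::num::Wrapping`):

```rust
use std::num::Wrapping;

fn main() {
    let a: u8 = 250;

    // Plain `a + 10` would panic in debug builds; instead you pick
    // the overflow behaviour you actually want:
    assert_eq!(a.wrapping_add(10), 4);      // opt in to wraparound once
    assert_eq!(a.checked_add(10), None);    // detect overflow instead
    assert_eq!(a.saturating_add(10), 255);  // clamp at the maximum

    // Or use Wrapping<T> so every operation on the value wraps:
    let w = Wrapping(a) + Wrapping(10u8);
    assert_eq!(w.0, 4);
}
```

The point is that wraparound is still available, but it shows up in the source as a deliberate choice rather than a silent default.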
All those things *are* undefined behavior, by the standard.
So basically, don't use C/C++ lol
"You forgot to initialize the variable". Lol. Close. Noob.