“I’ve been coding in C and C++ for 40 years…
NoW tHat IM fInAlLy gEtTinG gOoD aT iT”
Unfortunately, it's not that far from the truth...
This really got me
*cries in C#* ...sorry
Same for me, and I'm decent at C/C++. Haven't quite reached 'good' yet 😁😁
The fact that he was making windows only makes it funnier
Dave: "Let me tell you about the time I updated a million lines of code with a single commit"
My first thought: "What... *_intentionally?"_*
That's just a bad habit IMHO. But I cannot change even one line in the Windows kernel so you can ignore my opinion.
It was several years ago when commit best practices didn't exist.
@@rowenkylee5627 I'm not sure that would need "best practices" - but I guess "commit" in general was new enough that maybe "commit common sense" hadn't really come to be yet?
I mean, if it was just a find and replace then it's not that impressive. And I could see it making more sense to do it all at once, because it's a no-win situation.
The rule is to commit one update at a time, so it's easy to check where it fails.
There is nothing better than having someone who is well versed in the nuance of something extremely technical, state in simple terms how something works. Great video and love the channel! Please keep making these!
And then go on to tell you, "This complexity has been solved, please don't go through the pains that I have" - unless you need to do some very specific optimisations.
I hope you know I literally grew up and fell in love with using everything you created. I can't believe I found the guy responsible for creating just about every useful big Windows tool.
The more I know about C++, the more I know that I don't know anything about C++...
"Its not stupid if it works" ... the 1 pearl of wisdom I've learned in 20 years of IT support.
Love watching these videos, hearing these stories & hearing all this wisdom in general. It has helped me retrospectively understand a lot of issues I've tried to fix in the past.
Thank you for sharing your stories & wisdom, it's greatly appreciated!
The thing is, "it's not stupid if it works" only applies until it stops working.
@@darksunrise957 No shit..
"Its not stupid if it works"
I wrote a match-3 engine once, and I had a setup where, if you cleared any 3+ items and new ones came down and settled into their positions, they wouldn't cause other matches - thus no chain reaction where the user just sits there and the game plays itself... Anyway, it was all working great, but once in a blue moon a match would happen, and I was like "?". Going over the code, I just couldn't see where the bug was coming from; the logic looked sound. Over and over I went, printing it off, reading it in bed or even on the bog... Nothing stood out, and because it was so rare in occurrence, it was harder to debug. I tried everything... And then I changed one part of the condition in an if statement, which didn't look right, but I had a go anyway, and it never came back... And I still go over that code in my head and think: what's it doing? Because, believe it or not, I still don't know how the change worked in terms of the overall logic behind it! :D
I feel like Dave is filling in all the mysteries of my life from the last 26 years.
True this.
Having done C++ on/off for 20 years now this was really useful to know!
Thanks! I was trying to think of stuff that might surprise everyone!
@@DavesGarage As a game dev I did know about everything shown here. Messing with memory is my bread and butter, after all. But I really like how clearly everything is explained, including what the point of those tricks is and what their advantage is in certain cases. It's great at giving people a peek behind the curtain of C++.
I really liked the mention of using alloca if you really know your memory lifetimes. (I've done that a few times before.)
@@DavesGarage My knowledge is mostly from Stroustrup's C++98 version of the book; it was quite a challenge to read as a 16-year-old.
@@gideonunger7284 Since you're a game dev, I'm curious - did you ever use fibers (instead of mere threads) when working on a game? I'm wondering why these aren't used more widely, ever since seeing them referenced as an easter egg in The Sims 4.
@@kvdrr I heard of fibers as a memory handler, used by the team behind the _Last of Us_ remake. But that's aimed at apps that allocate so much memory they challenge the machine's capacity.
As a self-taught individual who primarily works professionally in C# with no formal training and is trying to learn C++, sprinkling in "tidbit" videos like this has been helping me stay focused while expanding on the basics and connecting them to some more real-world samples. Thank you for all these great resources! I would love to hear your thoughts on how to improve studying for someone in my position.
More like this! Could gather 'round and listen for hours
hehe, I love tips and tricks from the best language ever.
Learning from the master is always more fascinating. Thank you. I expect your videos will inspire more people to become systems developers.
All of this new operator stuff is common in game engines. Cool to see it covered succinctly AND in the most straightforward way possible. Awesome!
I'm a casual tinkerer and I've gotten it into my head that 'I don't need all this fancy C++ abstraction nonsense.'
Hopefully this series will help me with understanding when these approaches can be applicable, in the *real world!!!*
I didn't realise you could overload new and delete; that example was clear and informative.
Interesting video, thank you!
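For anyone curious what overloading new and delete for a class actually looks like, here is a minimal sketch (the Widget type and the malloc/free forwarding are just placeholders, not the video's code):

#include <cstddef>
#include <cstdio>
#include <cstdlib>
#include <new>

// Hypothetical example: class-scope operator new/delete let you route a
// single type's allocations through your own allocator, add logging,
// pooling, leak tracking, and so on.
struct Widget {
    int value = 0;

    static void* operator new(std::size_t size) {
        std::printf("Widget::operator new, %zu bytes\n", size);
        if (void* p = std::malloc(size))
            return p;
        throw std::bad_alloc{};
    }

    static void operator delete(void* p) noexcept {
        std::printf("Widget::operator delete\n");
        std::free(p);
    }
};

int main() {
    Widget* w = new Widget;  // calls Widget::operator new, then the constructor
    delete w;                // calls the destructor, then Widget::operator delete
}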
I always start these videos watching it on a second screen while playing a game. I end up pausing the game, staring to the right screen and then opening up Google over my game to search for tidbits you mention. Thanks for making me a single task person for 15 minutes a week.
Definitely appreciate the C++ deep dives as well as doing the more mundane more efficiently. Thanks Dave!
fun and informative!
whenever I write in modern C++ and see some code like:
Foo *f = new Foo[100];
I consider one of these alternatives:
auto ptr = std::make_unique<std::array<Foo, 100>>();
auto ptr2 = std::make_unique<Foo[]>(100);
auto ptr3 = std::make_shared<Foo[]>(100);
auto values = std::vector<Foo>(100);
each having pros and cons, and each allocating the same number of elements on the heap
Dave, you are an amazing instructor! And, having been a Java guy for so long now, I realized just how much C++ I had forgotten... and I had never thought about overloading new and delete before, mind blown. Looking forward to the next video!
In Java you would need a private constructor and then a factory method to achieve a similar effect?
As someone with ADHD, the pace of these videos is great! I was captivated the entire time and learned a nice handful of new info. Thanks a bunch for these videos :)
As someone with ADD, I found it a little slow in the middle part, when I started getting impatient about when exactly he'd mention smart pointers.
As someone with ADHD, I’m 4 minutes in, very engaged, but also reading the comments haha
@@davak72 Ha - sometimes happens to me as well, but I try to force myself to focus on the video alone when I really want to get all of it. I've noticed that no matter how much it feels like I can focus on both the video and the comments, it usually doesn't work that well in reality and I'll end up missing half the video :D Oh yeah, I forgot to say I'm writing this as someone with ADHD ;D I'm betting there's plenty of us here ;)
Would love to hear the inside story of the Cairo project. I remember the hype at the time...
I remember that BYTE magazine that featured that.
40 years as a C/C++ programmer, and you are finally getting good at it... well, good for you! For some reason, that makes me think I should pursue another career hahaha
I understood about 25% of what you said, but I'm always willing to listen to an expert in programming tell me how they solved things.
This is without a doubt my favourite channel
I'm always humbled and fascinated by your videos, Dave. Keep it up
4:30 actually the std::size function does this already, no need to write any macros or templates anymore.
Would it even work for a dynamically allocated array?
@@ahmedinelec Yeah, you have std::vector::size()
@@kwkfortythree39 im talking about a new* allocated one
@@ahmedinelec I now, my suggestion was to avoid using those unsafe constructs and use std::vector.
@@kwkfortythree39 While that works and is preferable in most situations, there are still situations when you're doing low-level code and a normal array is all you need.
Vectors and other containers are convenient but come with some overhead and functions that you might not need.
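To illustrate the std::size point from this thread, a quick sketch of where it works and where it fundamentally cannot (my own example, not from the video):

#include <cstddef>
#include <iterator>  // std::size (C++17)
#include <vector>

int main() {
    int raw[10] = {};
    std::vector<int> vec(10);

    std::size_t a = std::size(raw);  // 10 - works for built-in arrays
    std::size_t b = std::size(vec);  // 10 - works for anything with a size() member
    (void)a; (void)b;

    int* heap = new int[10];
    // std::size(heap) would not compile: a pointer returned by new[] carries
    // no element count, which is why vector (or a smart pointer) is preferred.
    delete[] heap;
}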
I'm impressed! The closest thing to that I ever did was rebuilding the OS on a CDC 1700 when I was totally stoned.
You have to think carefully before hitting enter when you do that.
Thanks Dave!
I'm an aspie, and after a hard depression years ago I was no longer able to code. I'm not sure why, but during one of your quick tips in this video, it's like something switched in my head...
I picked up a C++ pocket ref and somehow it now makes sense again!
The placement new was a useful refresher for me (28 years C/C++). Loved the million lines of code story, look forward to that bigger episode. The 'name in the header makes you the owner' issue is a classic :)) I've had to deal with a fair share of disbelief about my lack of knowledge of a file at times.
Also happened to me once when I had to copy the code of a module to a new package in Java. Git regarded them as new files, so it put my name in the git blame XD
Thanks for sharing!
Keep up the great work. Glad you're spending your time making interesting videos.
Dude, you're so experienced in the dark arts of C++. These vids are so interesting to watch and hear the nuance of all these low level features. Kudos man.
"now that I'm finally getting good at it"
Legendary humor
Fantastic! 5:58 - the new OS is being born... I love that (disguised) passion of yours talking about the creation of what you love and make...
Congratulations!
I have been doing C/C++ for just over 22 years and love this!
Thanks Dave, 30+ years and I still learnt something new after watching this video. As they say, "Every Day's a School Day" 👍
Bet that was a nervous moment checking in all those changes with a "What if and Have I" moment 😂
I'm just starting to learn C++, again. I'm liking this series because it helps me shift out of C thinking.
I've been wondering how the compiler knows what to do with the delete[] operator FOREVER. Thanks very much for that.
Well it's a question of a system call. It's the kernel's responsibility to keep track of allocated threads, mem, cpu, disk, etc. delete[] basically is your program telling the system "Hey, I'm done with this bit of memory. You can have it back". The C++ compiler need not interfere further than just performing a system call with the pointer as an argument.
@@vedantganesh6923 I don't think that's right. For one thing, C++ allocators tend to get a large chunk of memory allocated to the program before it even starts main. When you call new or even malloc it can then usually find space in that pre allocated block for what you need.
More importantly, the delete and delete[] operators both call the destructor, which the system doesn't know anything about.
@@vedantganesh6923 Absolute nonsense explanation; you're confusing syntax stuff with the actual implementation. LOL
I'm pretty sure that when you grab memory, the size of that memory is stored somewhere, hence why you can just free a pointer with no size attached. So if the compiler can read that value, it can determine how many elements are in the array.
@@Mozartenhimer The compiler? No.
The size of the memory behind the pointer is stored within the malloc() implementation, which is part of some underlying standard library - on Windows, it's the MSVCRT, the C runtime of the Visual C and C++ compiler that is linked in automatically, on desktop Linux, it's the glibc or eglibc, on embedded systems, something like Newlib. On FreeBSD, malloc is implemented by jemalloc library. For compatibility reasons, and to avoid more bloat than necessary, default C++ operators like new/delete generally leverage the underlying C implementation. At least this used to be the case, now it's got obscured behind layers of stuff, because there is now aligned allocation in C++, which means it must be preconfigured to use some malloc which offers this feature, where you can specify the alignment granule per allocation, and it might or might not be the same as used by C on the same system, and it will generally turn out to be, but not using the same standard C API. Sorry, maybe a bit of a digression.
It is however not a syscall, as that would be insanely slow. malloc uses syscalls to request pages from the operating system, so chunks of memory that are 4kb in size or multiple of that.
To figure out what size information malloc actually stores, you need to consider the interface. There are three functions: malloc, realloc and free. None of these allow you to query the size of an allocation.
Crucially, malloc has no need to know the number of bytes that you requested, just the number of bytes that it returned. Most modern implementations are derived in one way or another, often more ideologically than in implementation, from Doug Lea's dlmalloc, which is a bucket system. So it has buckets of a number of sizes under a page in size, so let's say 16, 24, 32, 48, 64, 128, 256, 512, 1024 etc - it will have all the power-of-two sizes and can have a bunch of useful interim sizes. Each bucket is a contiguous chunk of memory that stores allocations of that given size, so for example a bucket that contains 128 elements of size 32 exists, and its capacity is 4 KB in total. When a malloc request comes in, so you say malloc(28), it rounds it up to 32, and goes to the pool of buckets with 32-byte elements, and finds the first bucket that has some free space in it, and returns the address of that element, and marks that space as used in the bucket's bitmap. If no suitable bucket exists, it makes a new one by requesting a page from the operating system.
So as far as malloc is concerned, the capacity of that allocation that you requested as 28, is really 32. It will store the addresses of buckets in some sort of tree, and it will walk that tree to find the address, and it will find the bucket it belongs to, and hey look that's a bucket with 32-byte elements. And if you ask to "free", it will just edit the occupancy bitmap to indicate that this slot is free in the bucket, and if the bucket was at that point fully occupied, it will re-add the bucket to the pool of buckets with space available. If you request to realloc, if the new size is under 32, it might as well leave the allocation just where it was and do nothing. Otherwise, you should be able to imagine what needs to happen to satisfy a larger reallocation request.
So ultimately, malloc doesn't generally know the original allocation size, just some sort of upper-bound approximation of it.
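A toy sketch of the size-class rounding described above - just the "28 becomes 32" step, with a made-up bucket list; a real allocator also tracks per-bucket occupancy and page management:

#include <cstddef>
#include <cstdio>

// Hypothetical size classes, loosely following the description above:
// a few small sizes, then powers of two up to a page.
constexpr std::size_t kSizeClasses[] = {16, 24, 32, 48, 64, 128, 256, 512, 1024, 2048, 4096};

std::size_t round_to_size_class(std::size_t request) {
    for (std::size_t c : kSizeClasses)
        if (request <= c)
            return c;
    return request;  // larger requests would be served page-by-page instead
}

int main() {
    std::printf("malloc(28) lands in the %zu-byte bucket\n", round_to_size_class(28));
}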
"The problem is that before Unicode programmers pretty much assumed that a character is synonymous with a byte...".
UTF-16: hold my beer
This format, the stories with tips in between, is fantastic. I wasn't expecting much from the tips, but I was happily surprised/impressed. It is very apparent your skill is on an entirely different level than other senior devs I know of. I hope you'll cover some of the more niche C++20 topics some time.
Doing C/C++ for 30 years, yeah, I'm still learning a few things
WE LOVE YOU DAVE!!
The Standard Library is part of the C++ language. Unless you are stuck in 1994, please use it. See std::min (1:28), std::size (15:42).
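For example (assuming the macros in question were the classic MIN/ARRAYSIZE kind), std::min also avoids the double-evaluation trap:

#include <algorithm>  // std::min
#include <cstdio>
#include <iterator>   // std::size

// Classic macro shown for comparison - note it evaluates its arguments twice.
#define MIN(a, b) ((a) < (b) ? (a) : (b))

int main() {
    int values[8] = {3, 1, 4, 1, 5, 9, 2, 6};
    std::printf("elements: %zu\n", std::size(values));  // instead of an ARRAYSIZE macro

    int i = 0, j = 0;
    int a = MIN(i++, 10);       // i ends up incremented twice
    int b = std::min(j++, 10);  // j is incremented exactly once
    std::printf("i=%d a=%d, j=%d b=%d\n", i, a, b);
}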
I learned more about C/C++ from this one video than I ever learned in school.
Thanks for another great video Dave. As an old Pascal/Delphi programmer, having only moved to the C++ realm in the last 10 years or so for microcontroller programming, this was very interesting.
Pascal/Delphi still exist?!
These are not stupid tricks, this is pure knowledge. Thank you for sharing.
Great presentation. I'm looking forward to watching more of your videos.
I love how he says that after 40 years he's finally starting to be good at it! This is a fantastic programmer!!!
damn, if these are the stupid tricks and they're this far over my head then the smart tricks will probably liquify my brain instantly
Funny, when you were describing the different RAM types, I was thinking "this sounds like my old Amiga RAM" and then you mentioned the Amiga as an example. Loved that computer.
I'm not a programmer but I can still follow along with the words and the logic. It's strangely peaceful. Thanks, Dave.
Love your style Dave... have you thought of creating a C++ programming training course taking someone from beginner noob to super advanced leet coder. You have so much experience and knowledge as well as a natural ability to convey complex concepts effortlessly... a CS course by you will be a win for the whole industry.
Gonna second this. There are things about C++ that dissuade me from using it and keep me with C; having the mystery taken out of it would go a long way towards converting me.
I agree... I mean, I know C++, but as a tinkerer I never keep up with the standard. Like, his explanation of "new" was great, because I would never have gone into it far enough to learn that... And, well, now I know, and he put it across so well too!
I don't think that leet coders are good programmers, but that is just my opinion. Maybe you know your data structures and some algorithms for them, but that's it. Dave could teach how to really program some good software; that would be a nice course.
You know... MS should have had you doing this stuff while you worked there, because when you see "Windows", it is what it is... Windows. You tend to forget that there are people behind the OS, and they have reasons and decisions to make which we as users may respond to with "What have they done that for?". I know they have developer portals, with videos and such, but these are aimed at professionals for the most part, so it is refreshing to see videos like this on average-joe platforms like YouTube where us programming tinkerers can see it... A bit of cocktail music in the background, explaining in layman's language, a drop of humour, all the while popping up little sample code that makes sense, all bundled with little stories from days gone by.
I like it! 👍
Gad. I had to port the icon code from Win95 to NT, and the caching tricks took me quite a while to grok. Just before I was to check in two months' worth of work, my HD crashed and I had not backed up the data. (SLM SUCKED.) That got me fired from Systems, unfortunately. I also did a lot of header file cleanup and the Win16-32 porting layer, so I can relate to getting your name on tons of files while not really knowing too much about them. Many years later I got interviewed for the Messenger team and they thought I was a god because my name was all over the NT header files. Love your reminiscing.
I did Borland C. Gave up on Borland C++ because Borland's manuals were too wordily redundant. Finally learned how to inherit in C# using Unity3D. It took a while. I just didn't care about object programming because it has a lot of overhead. Tight code was better, and OOP just had too much overhead for my small needs. Today I use objects, but it's a judicious mix of just getting it done vs. do I need to reuse this code. This vid emphasises that knowing the tricks of a language is knowing its nuances. Great vid.
Dave is sooooo cool ! He's like the computer geek dad you've always wanted
"Thanks to C++ inheritance it's a little more complicated than that."
If I had a nickel for every time...
So far above my head that I'm looking for O2. But I really enjoy your insights, stories and cool delivery.
Tks for your time
Will
This was great fun! Thanks Dave!
(Also, never change the interface declared by a spec. Don't override new to catch bad_alloc and return null. This will bite you some day in the future.)
So glad you threw in the cpp11 smart pointers at the end. These days I get a little scared when I see the new operator :)
I've rarely used new over the years; I avoid it when possible. You can write complete libraries depending on the STL without using it once directly. Anytime I see new used directly, I start looking very closely to see (1) why they used it and (2) how they are ensuring the memory is released. Smart pointers are the way to go.
Smart pointers have the disadvantage of being slower. It depends on the implementation and the compiler used. The Clang compiler does a good job optimizing smart pointers. But it's always best to decompile the binary and check.
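Rough context for the overhead point (a sketch, not a benchmark): unique_ptr with the default deleter is essentially a zero-overhead wrapper around the raw pointer, while shared_ptr maintains a control block with an atomic reference count, so copying it is what actually costs.

#include <memory>

struct Foo { int x = 0; };  // placeholder type for illustration

int main() {
    // unique_ptr: pointer-sized with the default deleter, no reference count;
    // the delete happens in its destructor and usually optimizes away entirely.
    auto u = std::make_unique<Foo>();

    // shared_ptr: make_shared allocates the object and control block together;
    // every copy bumps an atomic reference count.
    auto s = std::make_shared<Foo>();
    auto s2 = s;  // atomic increment here

    return u->x + s2->x;
}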
It's quite serendipitous that I come across placement new in your video, having asked about it in an interview just the other day! I thoroughly enjoy your videos and always try to glean off as much as possible from them. I wish I had such insights when I was starting out with C++ programming.
I’ve played with C/C++ for nearly 30 years but never really deeply. Although I understood all the terms and concepts here I’ll probably never need to use them. Still, I enjoyed this video and hope you keep more like these as well as your LED videos coming in the future.
This was awesome! I really enjoy this amount of complexity and detail. I learned a lot but it was still understandable
Very helpful. I've written a lot of C and very very little C++, so really took a lot with me here.
Seems scary to override new with a version on the stack instead of the heap though. Now when someone comes along, makes a "new whatever" and expects it'll live beyond the scope, it'll be a debugging nightmare
Guess that's why there was the remark about it being for some internal class that's never returned outside (so possibly not even visible/accessible from the outside).
Yep, it was clever but not something I'd suggest because the caller has to know. That's why I said "not judicious" I guess!
I don't see how this could possibly work though? The memory returned by _alloca has automatic storage duration so its lifetime ends when the calling function returns. If you overload operator new, the calling function in question is operator new itself, so the lifetime is until the end of operator new, meaning the pointer cannot be used in the function that called new, as that would be a classic use-after-(implicit-)free. Accessing the memory (object) is undefined, and the value of the pointer to that memory itself is even indeterminate (C99's wording; C++ is slightly different but essentially the same core idea); see things like the WG21 "pointer zap" proposal for providing concrete semantics of this.
I could only see it working if you in fact changed the caller to use _alloca and placement new?
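Something like the caller-side pattern suggested here, sketched with a standard stack buffer plus placement new (the MSVC-specific _alloca version would look similar; Foo is just a stand-in type):

#include <cstddef>
#include <new>

struct Foo {
    int value;
    explicit Foo(int v) : value(v) {}
};

int main() {
    // The storage lives in this function's frame, so the lifetime problem
    // described above disappears: the object cannot outlive its caller here.
    alignas(Foo) std::byte storage[sizeof(Foo)];

    Foo* f = new (storage) Foo(42);  // placement new: construct, don't allocate

    int v = f->value;
    f->~Foo();  // with placement new you must run the destructor yourself
    return v == 42 ? 0 : 1;
}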
I wish you were my OS teacher when I took it. These tips are great!
I noticed when allocating bigger arrays on the stack, it's actually slower than on the heap.
After some disassembly and googling, I found that the program checks each and every page by reading/writing one byte per page. It almost always causes a cache miss, and can massively slow down the program.
I wonder why. I mean, I know why - because the process will absolutely explode if you don't write to a page before reading it. But as far as language guarantees go, every read of memory you the programmer personally hadn't explicitly written to is undefined behaviour! So it would be entirely par for the course for the compiler to be the arsehole here and let you collect the damage, for the sake of performance. What am I overlooking?
Checked release build? Because what i'm thinking, when you do a huge allocation, stack capacity is limited, and you can run out of it; but if you don't touch all pages, then the explosion will happen in the next push or call after you have allocated, which is not really explicit enough, since it was that allocation at fault and not some other operation. So those writes are something you may want in a debug build but not necessarily in a release one?
This is a mitigation for the Stack Clash vulnerability.
So it's faster if you don't have a big amount of data, otherwise it's better to use malloc, right? I wonder if there's a way to guesstimate the sizes that are faster to allocate on the stack, based on how large a CPU cache the system has...
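As a rough rule of thumb (a sketch; the exact cutoff is platform-specific, and the probing above only matters once a frame spans multiple 4 KB pages): small fixed-size scratch buffers are fine on the stack, while large or runtime-sized ones belong on the heap.

#include <cstddef>
#include <vector>

// Small, fixed-size scratch space: stays well under a page, so no probing cost.
void process_small() {
    char scratch[256] = {};
    (void)scratch;
}

// Large or runtime-sized buffers: prefer the heap (or a reused buffer) to avoid
// per-page stack probes and the risk of blowing the stack entirely.
void process_large(std::size_t n) {
    std::vector<char> scratch(n);
    (void)scratch;
}

int main() {
    process_small();
    process_large(1u << 20);  // 1 MiB would be risky as a stack array
}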
“Pucker factor” nice term
My mind expanded with this video, and then I needed a vodka tonic.
Thanks Dave, I’m learning C++ as we speak
It really goes to show you know what you're talking about when you describe detailed knowledge of C++ stupid. It shows a respect of when to use it, and knowing that you probably never really want to.
Thanks very much, this was very useful to know for a big portfolio project.
This guy rules. If only I could keep up with his super-fast manner of speech! Each time he's explaining something, I need to rewind several times to actually understand the idea behind what he's saying. It's like reading a technical reference vs. reading a popular magazine. While you may just briefly look through the latter and catch the idea conveyed by the article, the former requires a lot more diligence at reading, in conjunction with a serious thought process, to meaningfully consume each paragraph and understand the many important details it reveals. More often than not, you'd have to read a sentence twice or even thrice.
My favorite c++ trick is to give bad solutions and watch as the helpful community of coders come to correct you by the thousands
thank god for RAII and smart pointers now ... no new/delete for me ever
Thanks Dave, it's a pleasure !
I don’t know anything about computers. I just watch you while I smoke 💨🎄
Speaking of stupid C++ tricks, I'm suddenly reminded of the time I wrote a template class which used void pointers to automatically transfer the contents of one variable into another variable (regardless of type) with no safety checks whatsoever. Think of it sort of like a type cast - but way, way more dangerous and amateurish.
The funniest part is - I can't even remember what I wrote it for in the first place! Probably to make some spaghetti code work instead of completely rewriting it, knowing me.
Y'all should be very glad I didn't write (any part of) your operating system! It would likely be an awful unmaintainable mess, LOL.
That PR with 1 million changes must have been a sight to behold ;)
So you were responsible for making the entire Windows API work with wide characters natively?! That causes us so many headaches even today! If only you'd allowed us to use UTF-8 after it was invented, instead of forcing us to use ANSI when working with narrow characters. That would have saved us so much pain.
Thumbs up for mentioning Amiga's ChipRAM and FastRAM :)
The funny thing is, a couple years later UCS-2 was already toast, and replaced with UTF-16 and, more importantly, UTF-8, to accommodate more than the Basic Multilingual Plane. All of the work that was done on wide chars turned out to be basically pointless, and is a permanent blemish on many computer systems that survived it, including Java and JavaScript which both have to deal with exposing UCS-2 oversights and UTF-16 hacks/bolt-ons. The byte order mark still trips people up, it's hilarious.
Thankfully for the Linux and Unix folks, UTF-8 caught on before UCS-2, so things are generally much less of a mess there.
What a mess; at least the Rust ecosystem has a good system though, strings have USV/codepoint access traits, and underneath they are UTF-8 strings like most strings in the wild today.
How does the Rust language compare to C/C++ when it comes to speed? I've been thinking of learning Rust, but some things are easier to do with C++, like speed-optimized pointer magic and specialized assembly (ASM) functions.
@@rowenkylee5627 In general, you will be getting the same or better performance, depending on the code. The most aggressive uses of C++ are exceedingly unsafe, and the same is true in Rust, but Rust is not going to stop you from doing unsafe things if you ask it.
Rust is more composable though, and its built-in data structures are really good, so there's a decent chance that the fastest code you can afford to write and maintain in C++ is slower than the code written with the same effort in Rust.
@@rowenkylee5627 Go for it! Most beloved language 6 years in a row (according to StackOverflow) for a reason.
It was a little advanced for me - I've mostly worked in C#, PHP, and JavaScript, so some of the C++ pointer syntax goes over my head - but I was still able to learn something about better memory management and tools I could use for it. I enjoyed this.
An interesting thing about placement new is that you can construct individual elements of an array using parameters if you want. But it is a bit involved. I've done it once or twice. Be sure to call their destructors too when you're done.
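A minimal sketch of that technique - constructing array elements in place with parameters and destroying them by hand afterwards (Foo is just an illustrative type):

#include <cstddef>
#include <new>

struct Foo {
    int value;
    explicit Foo(int v) : value(v) {}
};

int main() {
    constexpr std::size_t kCount = 4;

    // Raw, suitably aligned storage for kCount elements; no constructors run yet.
    alignas(Foo) std::byte storage[sizeof(Foo) * kCount];
    Foo* items = reinterpret_cast<Foo*>(storage);

    // Construct each element in place, passing whatever parameters you like.
    for (std::size_t i = 0; i < kCount; ++i)
        new (items + i) Foo(static_cast<int>(i) * 10);

    int sum = 0;
    for (std::size_t i = 0; i < kCount; ++i)
        sum += items[i].value;  // 0 + 10 + 20 + 30

    // As noted above, call the destructors yourself when done
    // (conventionally in reverse order of construction).
    for (std::size_t i = kCount; i-- > 0; )
        items[i].~Foo();

    return sum == 60 ? 0 : 1;
}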
This guy edits in 1080p in the style of the '90s. I love him.
Java developer here. I know enough c++ to be able to read the code and reverse engineer most things.
The middle* was a little dense with “jargon” but still found the end of the video helpful.
* - only referring to the last half of the video, first half was fun stories of the past.
Great video. I've learned new things about C++ thanks to that (things that really nobody talks about).
Always learning something new from your vids!
pretty much all of this went over my head, but it was a joy to listen to. hopefully i'll learn enough C++ one day to know what's going in at least the first 5 minutes of the video
I am not a c++ developer yet I liked your deep understanding of the subject as well as the interesting stories behind the code
I haven’t programmed in C++ in ages. I forgot all about placement new.
What is your language of choice nowadays?
This was very helpful and educational for me.
14:20 I worked on more than one C/C++ compiler, including very early DOS-hosted ones (before MS bought Lattice's), and the first 808x-native C++ compiler (predecessors all used Cfront). This meant I maintained my own run-time library, and toward the end, this included code for [global] operator new & delete.
When I added the array versions, I just followed the paradigm used in the heap design, which was to record the length of the block, and then just return the address of the next word to the user (after invoking the default ctor for each element). This meant we were *absolutely* dependent on the user releasing using the corresponding operator delete[], so that we could back up over the count before commencing with the element-destroying loop.
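A hypothetical sketch of the layout being described - an element count "cookie" stored just in front of the block, which is exactly why the matching delete[] form matters (real compilers and runtimes emit this bookkeeping themselves):

#include <cstddef>
#include <cstdio>
#include <cstdlib>

// Record the element count, then hand the caller the address just past it.
void* alloc_counted(std::size_t count, std::size_t elem_size) {
    auto* block = static_cast<std::size_t*>(
        std::malloc(sizeof(std::size_t) + count * elem_size));
    block[0] = count;
    return block + 1;
}

// "Back up over the count" to recover how many elements to destroy.
std::size_t stored_count(void* user_ptr) {
    return static_cast<std::size_t*>(user_ptr)[-1];
}

void free_counted(void* user_ptr) {
    std::free(static_cast<std::size_t*>(user_ptr) - 1);
}

int main() {
    void* p = alloc_counted(100, sizeof(int));
    std::printf("cookie says %zu elements\n", stored_count(p));
    free_counted(p);
}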
Great video! I think it's awesome you ported explorer to Windows NT on MIPS. :) Thanks for posting!
Your shirt today is such a winner!
Really nice video 👏👏👏, long live C/C++
Where were you 20 years ago when I was learning C and C++? I could have avoided wasting so much time figuring out things the hard way! You know, trial and error and stopping with the first thing that works, even if it wasn't the best or most effective/efficient way to solve a problem, only to have to come back later to address some error or efficiency issue.
I suppose having to learn to solve and fix problems made the lessons I learned all the more effective.
Being a citizen of embedded C who dwells in C++ here and there, I found all of this very fascinating. I didn't even know about alloca.
"Bob handed me his dollar...." That's where I started chuckling.
I remember wondering how Microsoft was going to cope with the introduction of the Euro into Europe, with the € symbol not being an ASCII character. Some of the first attempts were not very successful until everything went 16-bit.
Coming from the .NET ecosystem and on my journey towards code optimization, I found this information quite intriguing and important! Thanks for the video!
Code optimization, you say. If you need speed, specialized assembly (ASM) functions are the way to go. Compilers still do stupid stuff that causes slowdowns, especially Microsoft's Visual Studio compiler. It's infamously bad. There are many compiler bugs Microsoft refuses to fix, as usual.
There are probably two things in life most people will never master...The Bible, and C++...
I always learn something new watching someone talk about C++...
This is also a moving goalpost; compilers are currently implementing proposed C++23 features.
Only God knows C/C++, He helped me sort 100,000,000 random numbers in .28 of a sec.
1:07 - my video buffered for a few seconds and went to a black screen after you said "C++ quick tip, like this". I thought it was a joke for a sec lol
This is why you fix warnings, not just errors.