C is such a simple language that you could learn the syntax in a day, learn the important parts of the standard library in less than a week, and learn how to use pointers.... eventually.
@@saeedbaig4249I loved c and hated all other languages when i studying After my school, i had learned so many other langs, and I never even used c in my job or higher studies, but nothing same as c
@@loganhello not necessarily, it's read or write in memory that you are not allowed to read from/write to. You can read garbage from memory that you haven't allocated but that is undefined behaviour.
Becoming a programmer these days is like coming in to a movie theater halfway through the movie. Computer Science history weighs heavy over everything. After decades of messing around with computers and programming I understand the approach of learning things in the order they were invented- everything builds on what came before. Everyone should learn C first if they want everything that follows to be many time easier- because you will have context and won't need to constantly ask: "Why TF did they do it that way?" you'll already know.
I went from BASIC to assembly to C back in the late 80's and early 90's. By the time I got to C, I already knew what "i++" and "++i" meant, how they were different, and why that difference was significant. I had already programmed position independent and re-entrant systems, and understood the benefits and costs involved. I understood why Motorola CPU's were/are far better at it than Intel. I agree with you that learning programming history is a huge, highly underrated benefit.
It would be insane to teach 15 year olds C, which is when we start learning some programming where I'm from. We start with Python. As someone whose first language was Python, I haven't struggled that much with C honestly.
Seems similar to the way one learns studio or fine art at the professional level. Nothing exists in a vacuum and the best artists reference what came before. Ongoing dialogue
EXACTLY! I keep asking myself the same question: "WHY DID THEY DO IT THIS WAY". I can't focus due to this. Can somebody tell me how I can overcome this?
Being a CS student, transitioning from classes where C is the only language to now building modern apps with Kotlin/Swift and other various API's and languages. I now really appreciate the simplicity of C.
I work as a software engineer writing C code on embedded linux, and while the C language may be simple, the bugs can be some of the worst ever. I'm finding I much prefer higher level languages. Also, the lack of a string type or decent string library (string.h is trash IMO) really hurts its usability. Also, legacy C code written by EEs who didnt know good programming practices is some of the nastiest stuff ever created lol
If you ever programmed anything more advanced than an introduction to programming "Hello World" program in C, you would appreciate the simplicity of higher level languages.
@@unheilbar AHAHAHAHA! Oh that's good. About one college course that praised AGILE and OOP like they were made by the second-coming of Jesus himself dude.
Quick remark, gcc actually means GNU Compiler Collection and not GNU C Compiler, the name GNU C Compiler was dropped around 1999 (in April to be precise) because C was only one of gcc's frontend.
C is challenging to master because to truly use it well, one must understand fundamentally understand how computers work. This means having some ideas about registers and ASM syntax, while not required, greatly enhances interpreting why certain design choices were made in C. C allows interfacing to hardware in straight-forward ways (custom registers, memory mapped peripherals) which don't really make sense in higher level languages. So C excels at low and mid level tasks where performance and resource allocation are the most important criteria. But it is difficult to be time efficient if you have to mix complex dynamic data, such JSON or web content.
Well, it's true about almost(?) every language. If you want to master it - you MUST understand, what happens under the hood. Without understanding you can get relatively far, but other people will point at you and laugh.
Yep. C is just a high level form of Assembly. The output from many C compilers is an Assembly program which is then fed to the Assembler to create object code and then to the Linker to create executable code.
@@bf945 That the case with many (all?) compiled languages, but the thing specific to C is that since C is simple (not necessarily easy), its easy to reason about generated Assembly, specially if you have habit of reading Assembly listing file of some compiler, you get a general idea what assembly instructions your compiler might generate from some C code. This is what makes C so joyful to work with in Systems programming.
@@alexeynezhdanov2362 As a C# dev...no you don't. I routinely have trouble where I can't find anyone on my team who knows whether the performance of a LINQ statement will be better written one way over another, for example. In that environment, performance almost never comes up in consideration, it's all about how clear and maintainable the code is, and if it follows a set of arbitrary style guidelines that are defined vaguely enough no one can agree on (or even really cares) what they mean. Java's the same way...and when you add in front-end web development and consider a lot of backend stuff is written on the assumption that microservices or other tech magic will do all the optimization you need (I mean, to be fair most of the time is spent executing web calls anyway), that's a sizeable chunk of the industry that doesn't require any low-level knowledge for better or for worse.
"Platform-dependent" in the context of compilers doesn't really imply dependence on a particular operating system; it means the native code is generated for a specified processor architecture, such as ARM or x86.
I'm 99% sure this is wrong, since compiled code will use subroutines defined in ram by the OS, so an executable for linux won't work on Windows even if they are run by machines with the exact same specs
Well it depends, if what you want to do is emcompassed by the architecture, then the resulting assembly will be the same: (adding two numbers from the stack together) mov edx, DWORD PTR [rbp-4] mov eax, DWORD PTR [rbp-8] add eax, edx But if you need anything provided by the OS (so most things done by the standard library), the compiler output will probably be different: (exiting program, Linux vs Windows) mov eax, 60 ; sys_exit system call mov edi, 0 ; return code syscall ; equivalent to int 0x80 mov ecx, 0 ; return code call ExitProcess ; WinApi function add rsp, XXX ; clean up stack
@Z3U5 Different compilers have different behaviour, but GCC allows the implicit conversion, so (char*) isn't necessary. Plus, char on most systems is 1 byte, so though a good practice, not multiplying by sizeof(char) wasn't the cause of error either. The only thing I can think of is that his malloc failed to allocate memory, and hence he tried to access null + 1 address, which failed.
@ZRU There is no problem with the original code he wrote unless he change it somewhere and didn't included it in the video. He never access any illegal memory address. void* can be safely cast to any type except a function pointer.
@@hi_arav C is one of the few languages that require you to develop your own standard library to even use the language efficiently. Libc is just way too basic to be reasonable.
I think the *type* syntax of C is quite complicated and I've seen plenty of people get confused by it. I mostly mean declaration and use of anything that has to do with pointers. Pointers aren't complicated but the C syntax without proper explanation can make people think that they are.
Tip with malloc: if you cant remember how much bytes u need to store u can do malloc(sizeof([datatype here]) * [how much of those datatypes you want]) e.g: char* word = malloc(sizeof(char)*4) Now with chars its easy (1 char = 1 byte) however this technique works well when ur making struc arrays
C was my second language I ever learned. I find it kind of funny that your explanation of C is mostly what happens in the machine for _every language_ but that most people take for granted since so many popular languages have an interpreter and/or sandboxed environment holding their hand.
12/10/2011: 10 years ago, Dennis Ritchie past away, leaving the world the legacy of the operational systems and the internet that we use today. He is the true legend, not Steve Jobs.
and alot of insecure, messy legacy codebases. C is certainly a huge accomplishement, but couldn't we just have bounded buffers? real strings? were those couple of bytes really too much? even as an optional addon?
@@cobaltno51 Yes, those couple of bytes were too much in many cases back then. Today we are spoiled by super fast and high memory computers, but people literally counted the bits back in the day. Even C was considered a "heavy" language at some point.
If we look at the algorithm, next would be C++ in 100 seconds, C# in 100 seconds, Assembly in 100 seconds and finally Binary in 100. Last time in the react native, flutter and kotlin series, I predicted correct
I was brought up on C - I bloody love the language (doing realtime embedded software and device drivers with it is a rollercoaster until you know what you're doing).
The coolest thing I ever had to do in C was implementing sensory input on an Arduino microcontroller. It was a bizarre combination of bit manipulation, machine language, and regular C that you could only do in something as low-level as C itself.
I remember writing an employees management system using C during my first year of university. Learnt a lot, including the fact that I don't wanna write C code for work.
@@mattmurphy7030 I admire your kind. The guys who write C code and do embedded stuff. I love it, but I just haven't had a chance to do it extensively. I feel like that's where the real stuff is at.
@@justapugontheinternet It is. That's pretty much what I do for a living. It takes time to get used to. That being said, with embedded processors being a lot more powerful than they used to be (the Cortex A53 is an industry favorite right now) we're able to splurge a lot more and go C++ rather then straight C, since we aren't as performance constrained as the days when we used 286's and 68000k's as our embedded processors of choice.
@@gamerk316 Not sure about you but I loved the 68000 architecture. It was so elegant compared to the x86 stuff. I've been fortunate to have been working with C and C++ for 25 years but now being pushed into the Java world, I'm hating it.
Indeed Bell Labs and Xerox Parc technologies and invented paradigms are still either directly or indirectly used today - these people were the knee's bees. Amazing visionaries and engineers all of them.
@@PhilipAlexanderHassialis What I saw at Xerox Parc's graphical user interface demonstration was a regression. Its no longer "normal" to use a command line to get exactly what you want done, fast. It's all point and grunt like cave men.
2:04 if needed, you can mimic an object-oriented programming language approach in C by having function pointers inside a structure, which then would be accessible through the arrow operator. An instance of this structure could be compared to an object with its getters, setters and methods. Very nice video, I haven't used C lang for quite a few years and that's a great throwback with all those countless segfaults
This is what SDL does, iirc. struct some_sdl_interface { func_ptr method; }; struct some_sdl_interface* create() { ... st->method = method_ptr; ... } create()->method(); I think SDL does that for a lot of its abstractions.
One of my all-time favourite things is Exercise 1-23 at the end of chapter 1 in “The C Programming Language” It says “Write a program to check a C program for rudimentary syntax errors like unbalanced parentheses brackets and braces.” This at the end of the first chapter!! We only learned the very basics, and now Kernighan/Ritchie expect us to do that! I think I actually went back and did this exercise after reading chapter 5. But it still remains with me 40 years later as something that just seemed so “Wow!” …
It should be said that malloc/calloc allocates memory on the heap and you should only use free on heap allocated memory. Stack allocated memory will be deleted automatically after it goes out of scope.
One could nitpick about your use of terminology. Objects on the stack aren't "deleted" per se. The language manages the stack automatically, without the programmer having to write code to do so explicitly, and this management is essentially just updating the stack pointer register. In some kind of vague sense when a function is exited and the stack pointer is changed to what it was before the function was called, that stack space is kind of "deleted", but that's a bit of a misnomer. (In C++ the use of that term would be more apt because in that case the language actually literally calls destructors for objects on the stack, and in this case "deleting" them would be an appropriate term to use, in a sense.)
But did you ever create an array of characters without remembering to terminate it with a '/0'??? A noob move that everyone learning C has to go through 🤣
You do finish learning the syntax. I learned most of c++ when I was in middle school in a couple of months. You just need a good tutorial and a lot of determination.
@@sxs512 I'm glad to hear it's possible. I've only been learning since September, so I think I've still got a bit of time to go. Most online tutorials are anemic, though, so I think I'll need to memorize the class textbook or something.
Honestly C is one of the most amazing language I've ever used. Once you get through the concept of pointers, you'll find out how powerful they actually are. It's one of the most easy-to-use language to do simple parallel tasks without blowing up your memory limit.
One of the most easy-to-use languages to do simple parallel tasks? Are you serious? Pretty much _any_ other programming language that has support for threads is simpler to use for parallel tasks than C.
@@DjVortex-w The nice thing about C is you can access the same memory chunks in multiple threads more easily. You don't have to call the mutex APIs if the threads are not going to work at the very same position in memory. I'm not sure about Golang though. The synchronization in Java is a bit convoluted but most of the APIs are nicer indeed. Running multi-thread in Python is nonetheless a nightmare. In many cases the GIL gets into the way and makes multi-threading meaningless (e.g. numpy arrays are managed by GIL in the version I used). In my project I had to get both temporal and spatial efficiency and C was the optimal solution to it.
@@mu11668B That doesn't make C the easiest language to run multiple threads. It just makes it easier to access shared memory without mutexes. If that's the measure of easiness, then C++ is better on both fronts: It's very easy to create multiple threads (with 100% standard code, no need for non-standard libraries) and you can likewise bypass mutexes if you want. (And, in fact, the C++ standard library also offers support for efficient atomic variables. I don't know how many other programming languages do. And yes, C does too, I know.)
I was in college studying computer science circa 1980. For programming we were studying FORTRAN and IBM OS360 assembly language. A friend of mine was interning at Bell Labs and he said he was learning a language that was so much better than FORTRAN called "C" and an operating system so much better than IBM OS 360 called "UNIX". After being a professional FORTRAN programmer for a number of years I started to use C and it was so much better than FORTRAN and the whole developer environment was miles ahead of anything else at the time.
With C you have to check your math (especially floating point division) on each processor type you're planning on working on. With FORTRAN (and COBOL) you don't. No matter the platform you get the same results. C uses 2's complement to store numbers. FORTRAN and COBOL - BCD. 2's complement is faster, but sometimes wrong. BCD is slower, but always correct. There's a reason R uses the ancient (and well debugged) FORTRAN statistical libraries and not the schlocky C equivalents. That isn't to say that C is inherently bad. But using C as a replacement for FORTRAN in many cases is. Now, C for integer based math? - like a rocket powered toboggin. To the point that I convert my floats to integers and only do integer math.
This makes me want to revisit C, didn’t like it when I learnt it at uni because struggled with pointers, but that all seems more interesting to me now!
th-cam.com/video/zuegQmMdy8M/w-d-xo.html I highly encourage you to watch it, I started learning C with this video. It isn't as hard as you think, you could maybe watch 30 minutes of this video everyday and then watch the Data Structures video from the selfsame youtuber. If you want to learn even more about it, then Jacob Sorber's videos are great too.
C is actually really cool, just absurdly verbose, by design. And you can target just about anything. There is even SDCC, a C compiler that can, for instance, compile for the GameBoy!
YES. I was looking for this. Essentially everything in C that is seen as "FALSE" (not always but often defined by pre-compiler) equates to 0 (0x0, etc, not '0' though). Anything NOT "FALSE" (or !0) is seen as "TRUE" or essentially equal to 1. However, anything "TRUE" == 1 is not always true. It's tricky but the only real definition of "FALSE" should always equal 0 and anything non-zero is true. In extension to that, the only error code that means "no error" is 0 itself. Meaning your program ran perfectly with no issue/bug. Anything non-zero isn't necessarily an error, but any "non-flawless" execution. The nuance is necessary and this hasn't been stated enough in the comments (to my knowledge).
Been programming in C for 3 decades, so, at this point, I can't even use a language that doesn't allow me to work with pointers and manage my own memory. And I'm better off for it.
I learned python then C and now C++. It genuinely feels so much better to deal with the memory directly and you can be really creative with it. You have to know what you're doing though.
It depends completely on what you are doing. If I'm writing a shell script, for example, I couldn't care one iota about memory management, and I'm completely fine with the shell script interpreter doing whatever it does, as long as it does it correctly. Manual memory management in this context wouldn't make an iota of sense.
I love that I can actually understand every bit of this video now! I watched this when I first started learning c++ about a year ago. I’ve now taken two computer science classes and a web authoring class for html, css, JavaScript. As well as a course in Microsoft power shell. Nice to see I’ve progressed. Lol
I had a great lecturer so he helped me (and a bunch of others) through the week or two it took for me to grasp pointers and arrow notation. After this, C became my favourite language to program in as I love how strict it is. And as it’s statically typed you know exactly how the data flows about in your program. And as long as you use them correctly, pointers make for very easy access to variables from anywhere in your program.
Erm.... I would not say C is a strict language at all - you can REALLY screw stuff up with it. I saw it almost has a higher-level assembly language in some respects. Try looking at Ada for a strict language.
@@JoolsParker By strict I meant mostly that it was statically typed and it was my first statically typed language (I started programming only in uni therefore I was very late to programming). Being able to shoot yourselves in the foot is true though, as we all definitely have. However C is far more rigid, compared to vanilla JS, which is what a lot of viewers of this channel (including myself) program in.
@@thedrunknmunky6571 If you enjoy the strictness of C, C++ is significantly more strict with types and casting. C++ has exceptions as well, so the shooting yourself in the foot stuff can be caught, like accessing outside of array bounds.
I am a C# programmer and I recently started learning C to program with arduino. My prior knowledge of C# really helped me understand basic programming fundamentals and syntax. Both are definitely great and useful languages!
That video, like always, was fantastic. I think it'd be quite interesting to see you cover one of the Lisp languages so Common Lisp, Clojure/ClojureScript, or even Scheme (in that case it'd most likely be Racket as many of the other Scheme implementations are too minimalist by themselves or are extension languages instead of general purpose programming languages).
Clojure! Clojure! Clojure is one of the BEST languages I've ever seen. It takes the good parts of functional programming & good parts of lisp while getting rid of worse parts. It works on JVM so you can use java libraries. Clojure repl and the tooling around it is amazing, the language really focused on the repl. You can even use the repl with clojurescript to run clojure code that then evaluates ON THE BROWSER IN REAL TIME LIKE BLACK MAGIC.
Such an efficient overview, as someone with familiarity with languages built on top of C but no experience in C this sheds a lot of light for 100 seconds. Thank you Fireship!
@pntg0n!kyuu Because they represent two completely different things. (.) is the "access" operator, i.e. access a field in this struct. the "->" operator isn't really unique, its actually a combination of two operators. "->" is equivalent to "(*).", as in "deference this pointer to get a concrete object and then access this element in this object". This is why they're two separate operators. "->" is used exclusively on POINTERS, i.e. references/addresses, while "." is used on CONCRETE objects. It allows you to look at code and quickly figure out if a variable is a reference (pointer) or concrete object. It also lets the compiler easily statically check the program, as it will detect if "->" is used with the wrong type and if "." is used with the wrong type. Languages that only use "." can't do that. Also, languages that only use "." generally don't have pointers. See: C# and Java.
@pntg0n!kyuu I didn't know C# has pointers, but I would say that "." and "->" are both used very often and they're definitely not interchangeable because pointers aren't like references in C++, you can arbitrarily access a method without first dereferencing a pointer. I think it makes sense from a technical view, i.e. pointers must be dereferenced before using a method/member variable, and from a user point of view, i.e. "->" clearly indicates a pointer, "." does not.
@pntg0n!kyuu The majority of people haven't actually done proper programming to know how problematic implicit type declaration is. It's very ambiguous and shouldn't be used when you have lots of incompatible variable types, especially when you want to have multiple *uninitialised* variables of different types.
I enjoyed C so much. I recommend everyone to try C, you will just learn a whole lot about how computers work. Edit: You can C how the computer works, ta dum tiss.
c is such a wonderful simple language, if you're not new to programming, this video has everything you need to get you started in c. awesome job man! p.s. i watch these videos even if i don't care or already know the programming language/tool because they are quite fun to watch. awesome job man!
@@cobaltno51 neither C is really as simple, numerous UB's and quirks are there in the standard, they are there for a reason though, and you don't usually encounter them
C will forever remain my favorite programming language. Of course other languages suit certain demands better. But for bare metal power in a high level programming language, c is simply unbeatable imvho.
I'm a greybeard architect now - over 30 years building code. I mentor, because everyone should. I try to get the younger devs to learn a little C (not C++). My suggestions to them are: learn a little C; understand how IO works in your OS (buffer sizes, how data is laid to storage, and IP packet construction); and how memory works in conjunction with the kernel. They are amazed how much performance can improve if you just align your data to the device, and minimize runs through the IO subsystem. Those that do eventually become much better designers and developers overall. Most 4th-generation languages - and nearly all OO types - hide a lot of details, which is good (Java is my goto). But knowing how it works under the covers saves a lot of problems.
"C does not support OO" if you know how to use C you can make OO it is hacky but you can. There is a book, "Compiler design in C" that all their examples are written in C, and explains how the code is OO.
It certainly is possible. When I was in my data structures course, I was the only crazy person who did the assignments in C (linked lists, binary trees, dynamic queues and stacks, etc.) Objects are just structs with function pointers.
that's mimic. not support. It is possible to write C code in style of OOP. but that's not OOP. OOP is how you organize data and OOP languages provide syntax for that concept. but C don't provide that syntax. you have to make up oop structure from the scratch and then you have to stick to that style. you can't expect other programmer to stick to that style. if you need to work with other people then you need a lot of document. that's why we don't call C OOP language.
I agree for most cases, but I will say containers especially lend to object oriented programming very well. Having lists, vectors, trees, etc. with methods to act on their own data is intuitive and very concise. list.size() and vector.size() have completely different code, but they have the same name and represent the same thing - the size of a container. It really makes it simple to abstract and encapsulate complex data structures with complex memory allocation and traversal into objects as simple and easy to use as arrays. OOP also lends well to GUI parts of applications, as polymorphism and inheritance represent GUI components and widget hierarchies very well.
@@lucass8119 I can agree about containers. I don't know about GUI programming since this is not my field. I guess the best approach is to use classes where they really fit and not everywhere. Having everything as class never works from my experience. Instead of focusing on the task that your program needs to do you focus on creating object hierarchies. So for example whether message should send itself or there should be a sender object that sends messages.
@@MrChester114 I agree 100%. The "everything is a class" sort of idea that we see in Java and others is dumb and needlessly complicated. Most things don't need classes and objects, and you should only use OOP if it makes sense for that specific problem.
And I think you've never tried writing anything longer than 100 lines of code. If you did you'd know that using objects and oop concepts like polymorphism and encapsulation is the way to go.
The video implies that the stack has to be managed manually, but this is not true. If you declare a variable int phone_number, the memory for it will be automatically deallocated when the function containing it returns. This does not require garbage collection.
It isn't deallocated. Such variables are created on the stack, when the function returns that variable is out of scope, but can still be accessed by evil people until something else overwrites it, assuming it wasn't optimized out of existence by the compiler. Accessing something deallocated would cause a segmentation fault on a modern OS, where as accessing an out of scope variable is just evil.
@@bur1t0 Fair point, and this can be the source of very subtle bugs if you're careless! It isn't deallocated in the sense that it's no longer mapped to the program's address space, but as you point out, it is no longer reserved for the data it contains and can be overwritten by the runtime, meaning it should probably for all intents and purposes be treated as strictly deallocated regardless.
People keep on mentioning OOP. Many languages 'support' Object Orientation. Emphasis on 'support'. If you have enough internal discipline, you can use the object abstraction in Assembler, C, Basic.... Adding the abstraction layers to a language may make it 'easier', but it does make for a very thick language spec, and book. The number and scope of interesting traps and errors also grow at the same time. A feature rich language may be an advantage. If you have an infinite time to learn it. Ada is a very rich language. To write mission critical 'real time' code you use a basic subset, same sort of feature set as a 'Small C'. The code generated can then be simple enough to hand inspect to establish function.
The hassle of dealing with "object oriented" magic makes me hate that stuff. Always some weird error with run time/compile time nonsense or casting or types of whatever. I have little experience but to me it often feels like using an airplane to go down the store for groceries, it's not worth it, I'd rather just walk.
Nice video! One small nitpick though: one should never write functions with completely empty argument lists in „C“, as the compiler takes this to mean that the functions in question can be called with a variable number of arguments (of nearly arbitrary type). Taking your example of a completely innocently looking “int main ()” function, this means that it would perfectly valid to recursively call this function-for example-like so: main(1, 'A', 3.7, "unintended consequences"); Thus, if one is not interested in doing any command line argument processing (at all), a programs entry point should always be written like so: int main (void) { ... } Admittedly, this may change with the next version of “C”… however, until the final standard for C2x is released, the above rule stands as described.
Languages with garbage colection: Don't worry about memory management, I got you covered and I will protect you. C (and c++) :You're on your own kid... Have fun.
OO is just one way of creating layer of abstraction in your code. OO languages not only enforce one way of abstraction layering, but also the OO in most of the languages is 'underpowered' compared to the true OO concept as it was executed in SmallTalk. Not having obligatory OO in your code and having access to memory granted by language means you can create your own abstraction solutions, tailor fit to your particular needs. It also means that if you write C code, it can always do something useful for your program, there's no noise of boilerplate and excessive naming. When you're reading a program, would you rather have to read one file with 5000 lines, or need to understand relations between 100 files 50 lines each? When you're programming, do you just want results, or do you need control?
This. People not familiar with C tend to misunderstand/overestimate garbage collection. Not EVERYTHING has to be malloced & managed; like 90% of stuff can use just normal stack variables (or for things like hardcoded char*, live in the data section). E.g. char* str = "hey"; printf("str is '%s' ", str); The only times you actually need to malloc the variable is if you want the variable to survive beyond the scope of the function, or (like you said) you don't know at compile-time how many instances you're going to use (e.g. if you were writing a browser in C, you'd malloc for every tab opened, cause you don't know at compile-time how many tabs the user is going to use; that's an indefinite/variable number).
I used to hate it in college, but now later I realise that it provides the perfect balance b/w high level and low level stuff. The pointers stuff and similar low level details provide a great gateway to get into OS, linux, Networking and Compiler stuff.
we're in 2021, there are plenty of languages that provide full control on low level stuff while still being considered high level level languages, and which are also infinitely less painful than C
@@mccarthymalonza6500 I'd say it depends on which part of the low level stuff you need to have access to, but I suppose high-level languages like Rust or even C# could satisfy most of the these needs. On the other hand, you could use C when you explicitly need a low-level language that's as close as possible to machine instructions (e.g. when programming for embedded system with very limited resources), but then there's nothing "high-level" about it in my opinion.
@@borgir6368 People do use Rust for low-level things. For example, Redox OS is an example of an operating system written in Rust, and the Linux kernel is adopting Rust for writing some drivers in. The point they were making is that most people don't actually need to use C; C# or Java would still give good performance while being much nicer to program with, because they offer garbage collection and built-in conveniences like inheritance and lambda expressions, and they hide pointers from you so you don't shoot yourself in the foot.
c is really really simple, and easier to get a hang of than you think. you dont have to be smart to write good c code, like you have to be to write good c++ or javascript or anything. c also compiles in the blink of an eye, which is my favorite part. if something turns out wrong, i can make and test small changes really quickly. c++ leaves me waiting at least a minute between changes. managing your own memory is really a blessing. c only ever does what you tell it to do, making it really easy to understand whats happening in c. word of advice: dont leave pointers dangling. if you havent allocated it yet, NULL it. if you just freed it, NULL it. this way, if you try to use it, you will always have a segfault instead of the application carrying on, on hallowed ground. also free does nothing if it is passed a null pointer, so you can leave a goto label at the end of a function to handle cleanup and return, and just pass everything to frees or whatever destroy function you may have. in fact, just get comfortable with the memory allocation functions. theres malloc and free, but also check out calloc, realloc, and reallocarray.
I started coding using python, javascript, etc. But I never fell in love with programming until I picked up C. Easily my favorite language together with Lisp. Sadly I never get to use them.
I love C. Sometimes I think it just needs a simple push in the right direction to make it better as a language, but its fundamental essence is that of a language that is just so very simple, fast, and beautifully expressive. It's also incredibly dangerous. I still love it.
@scum bag worse: as far as gcc goes, on ARM it's unsigned, on x86-family it's signed. yet, char remains different from signed char and unsigned char to the type system :) nothing in the standard actually prevents `char` from being 16 bits... some obscure platforms (e.g. DSPs) may do that, but it can be assumed to be 8-bit for all typical platforms
No, it's whatever is faster on the system you're targeting. If you're doing arithmetic on chars instead of just reading, writing, or testing for equality, you should explicitly use "signed char" or "unsigned char".
C will never die. Tons of "low-level" languages have come and gone over the years with the goal of being a C alternative, and most have faded into obscurity. The base libraries that pretty much everything is built atop relies on C in one way or another, and it is just too much to "rewrite it in Rust" or whatever other "hot" language is stylish to use that year.
At college (I was at electrical enigneering at the time) the first programming language they taught us was C. About 80% people after that hated programming profoundly. I really enjoyed it although it was a very difficult course to pass. I mean you give people with 0 programming experience to use pointers. We also worked with structures and even had to work with matrices for the final exam.
When I finally moved from C to Java, the fact that Java has a garbage collector and has memory safety has confused me, annoyed me, and is causing me severe discomfort over the fact that I am not managing my own memory, I feel like a vital limb has been torn away from me when coding in Java
Id recommend rust if you want a step higher without losing control. In it you dont have to manage memory manually, but it doesnt have a garbage collector either. It just deallocates everything as they go out of scope (which does come with a few minor limitations). Basically rust is an amazing language (in my opinion) that has memory safety guarantees without performance loss.
Id recommend C++ if you want a step higher without losing control. In it you dont have to manage memory manually, but it doesnt have a garbage collector either. It just deallocates everything as they go out of scope, (look up RAII and C++11 pointers). Basically C++ is an amazing language (in my opinion) that has memory safety guarantees without performance loss.
@@ilyastuurfedorov4057 uhhh youre talking about rust? C++ has manual memory management and 0 safety guarantees, while rust has every benefit you mentioned
@@theroboman727 That is incorrect, please, re-read my comment, try looking up stuff that I mentioned. There is literally no reason to use different language like Rust for feature-set like this, since C++ supports it out of the box since 11 years ago.
@@ilyastuurfedorov4057 smart pointers? Dont you have to wrap stuff in "unique_ptr" or "shared_ptr" every time you have type annotations? Also arent there many cases where smart pointers make no sense and you have to use regular pointers/references, and you lose the memory safety? Rust has all and much more of these safety guarantees at compile time whatever kind of pointers you use, and has better support for smart pointers because the language was designed for memory safety and these kinds of features. If you think rust and c++ are anywhere close in terms of safety/security, you havent done your research.
Fun Fact: The reference implementation for C is technically the Portable C Compiler developed by Bell Labs, more specifically by Mr. Ritchie and his colleague Mr. Thompson. It was extremely portable and could do all sorts of link time magic, shame that it never caught up to gcc and eventually stalled in 2014
The simplicity of 'C' makes it practical to port compilers easily from one computer/cpu to another. It is interesting to investigate 'Small C', covered in Byte at one time and in 'A Tool-book of C'. Once the compiler is ported, libraries and applications can then be recompiled for the new machine.
I just started cs50 earlier this week and hearing that c is one of the more simple languages is definitely not what i wanna hear but are well wish me luck.I feel like once i wrap my head around how to solve problems then everything will kind of be about practice and ill have a much easier time learning
My professor had a saying: "You will die but C will live."
that's a good saying
Yea C wont die out because of embedded systems.
so long as there is electricity, C will be forever
@@Afterburn7 c is used in kernels and stuff.
@@Afterburn7 C++ is C with AIDS
C is such a simple language that you could learn the syntax in a day, learn the important parts of the standard library in less than a week, and learn how to use pointers.... eventually.
You can learn pointers... you never master pointers
@@saeedbaig4249I loved c and hated all other languages when i studying
After my school, i had learned so many other langs, and I never even used c in my job or higher studies, but nothing same as c
@@vaisakh_km depends where u work
@@betterthanb4r yes, i ment that, eventhough i never used it.... i still love it....
...once you see the pointer, which points to the pointer of the pointers' pointer
Love how it ends with a seg fault, describes the C experience perfectly
What does it mean?
Seg fault almost cost me my last semester
@@diskyariajetmiko A seg fault is when you try to read or write to somewhere that you haven't allocated memory for.
@@loganhello not necessarily, it's read or write in memory that you are not allowed to read from/write to. You can read garbage from memory that you haven't allocated but that is undefined behaviour.
Ah, the memory
Becoming a programmer these days is like coming in to a movie theater halfway through the movie. Computer Science history weighs heavy over everything. After decades of messing around with computers and programming I understand the approach of learning things in the order they were invented- everything builds on what came before. Everyone should learn C first if they want everything that follows to be many time easier- because you will have context and won't need to constantly ask: "Why TF did they do it that way?" you'll already know.
I went from BASIC to assembly to C back in the late 80's and early 90's. By the time I got to C, I already knew what "i++" and "++i" meant, how they were different, and why that difference was significant. I had already programmed position independent and re-entrant systems, and understood the benefits and costs involved. I understood why Motorola CPU's were/are far better at it than Intel.
I agree with you that learning programming history is a huge, highly underrated benefit.
I studied the history of computing for 2 weeks straight before actually learning how to program.
It would be insane to teach 15 year olds C, which is when we start learning some programming where I'm from. We start with Python. As someone whose first language was Python, I haven't struggled that much with C honestly.
Seems similar to the way one learns studio or fine art at the professional level. Nothing exists in a vacuum and the best artists reference what came before. Ongoing dialogue
EXACTLY! I keep asking myself the same question: "WHY DID THEY DO IT THIS WAY". I can't focus due to this. Can somebody tell me how I can overcome this?
Being a CS student, transitioning from classes where C is the only language to now building modern apps with Kotlin/Swift and other various API's and languages. I now really appreciate the simplicity of C.
Agree 100%. But I'm also grateful we have languages like Java that have garbage collection. Haha.
I work as a software engineer writing C code on embedded linux, and while the C language may be simple, the bugs can be some of the worst ever. I'm finding I much prefer higher level languages. Also, the lack of a string type or decent string library (string.h is trash IMO) really hurts its usability. Also, legacy C code written by EEs who didnt know good programming practices is some of the nastiest stuff ever created lol
If you ever programmed anything more advanced than an introduction to programming "Hello World" program in C, you would appreciate the simplicity of higher level languages.
How long did it take you to turn from being heterosexual to kotlin developer?
@@unheilbar AHAHAHAHA! Oh that's good. About one college course that praised AGILE and OOP like they were made by the second-coming of Jesus himself dude.
"Use prinffffff" 🤣🤣🤣🤣
And those "C" puns at the end 👏🏾
Exactly what I wanted to comment 😁
You committed on behalf me.😁😁
;)
That's what "C" said...
damn you read my mind. Amazing work I love it @Fireship
Everybody: use print-F to print the value
Jeff: *Use printfffff*
It's not Jeff it's Jefffffff
@@evertonalmeida1165 Je-F
😂😂😂 lol
Ffffffff
@@_prothegee Ffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffff
Quick remark, gcc actually means GNU Compiler Collection and not GNU C Compiler, the name GNU C Compiler was dropped around 1999 (in April to be precise) because C was only one of gcc's frontend.
Good call, thanks for the correction
@@Fireship No problem! Keep up the good content!
@@Fireship You could pin him or something but everyone who searches for it will just type gcc anyways I guess
Yeah details like that aren't included in the first paragraph of a wiki page, so the youtuber missed it.
@@pearz420 😂
C is challenging to master because to truly use it well, one must understand fundamentally understand how computers work. This means having some ideas about registers and ASM syntax, while not required, greatly enhances interpreting why certain design choices were made in C.
C allows interfacing to hardware in straight-forward ways (custom registers, memory mapped peripherals) which don't really make sense in higher level languages. So C excels at low and mid level tasks where performance and resource allocation are the most important criteria. But it is difficult to be time efficient if you have to mix complex dynamic data, such JSON or web content.
Well, it's true about almost(?) every language. If you want to master it - you MUST understand, what happens under the hood. Without understanding you can get relatively far, but other people will point at you and laugh.
Yep. C is just a high level form of Assembly. The output from many C compilers is an Assembly program which is then fed to the Assembler to create object code and then to the Linker to create executable code.
@@bf945
That the case with many (all?) compiled languages, but the thing specific to C is that since C is simple (not necessarily easy), its easy to reason about generated Assembly, specially if you have habit of reading Assembly listing file of some compiler, you get a general idea what assembly instructions your compiler might generate from some C code.
This is what makes C so joyful to work with in Systems programming.
@@alexeynezhdanov2362 As a C# dev...no you don't. I routinely have trouble where I can't find anyone on my team who knows whether the performance of a LINQ statement will be better written one way over another, for example. In that environment, performance almost never comes up in consideration, it's all about how clear and maintainable the code is, and if it follows a set of arbitrary style guidelines that are defined vaguely enough no one can agree on (or even really cares) what they mean. Java's the same way...and when you add in front-end web development and consider a lot of backend stuff is written on the assumption that microservices or other tech magic will do all the optimization you need (I mean, to be fair most of the time is spent executing web calls anyway), that's a sizeable chunk of the industry that doesn't require any low-level knowledge for better or for worse.
@@traveller23e the discussions is about C not C# which is abstracted!
"Platform-dependent" in the context of compilers doesn't really imply dependence on a particular operating system; it means the native code is generated for a specified processor architecture, such as ARM or x86.
I'm theory you're right, in practice if you're going to use the standard library you'll likely be bound to the architecture AND the OS
Platform is OS + processor architecture. No?
I'm 99% sure this is wrong, since compiled code will use subroutines defined in ram by the OS, so an executable for linux won't work on Windows even if they are run by machines with the exact same specs
@@Unit_00 Well that's a different matter of binary formats and position dependency.
Well it depends, if what you want to do is emcompassed by the architecture, then the resulting assembly will be the same:
(adding two numbers from the stack together)
mov edx, DWORD PTR [rbp-4]
mov eax, DWORD PTR [rbp-8]
add eax, edx
But if you need anything provided by the OS (so most things done by the standard library), the compiler output will probably be different:
(exiting program, Linux vs Windows)
mov eax, 60 ; sys_exit system call
mov edi, 0 ; return code
syscall ; equivalent to int 0x80
mov ecx, 0 ; return code
call ExitProcess ; WinApi function
add rsp, XXX ; clean up stack
2:14 "Segmentation fault" - classic
@Z3U5 Different compilers have different behaviour, but GCC allows the implicit conversion, so (char*) isn't necessary. Plus, char on most systems is 1 byte, so though a good practice, not multiplying by sizeof(char) wasn't the cause of error either.
The only thing I can think of is that his malloc failed to allocate memory, and hence he tried to access null + 1 address, which failed.
@@comradepeter87 char is 1 byte on ALL systems, cause it's the "single byte"
it's in the standard :)
@Z3U5 casting malloc result is not canonic C
@ZRU There is no problem with the original code he wrote unless he change it somewhere and didn't included it in the video.
He never access any illegal memory address. void* can be safely cast to any type except a function pointer.
I left out the code that caused the seg fault, it was an intentional joke
Ah yes the classic functions, print-fff and scan-fff
And fprintf, and sprintf, and fnprintf, and snprintf, and vprintf, and vfprintf, and vsprintf, and....
@@mananasi_ananas i love these now im going to call it ffffprintfff
@@whythosenames sssnnprintfff
Useless functions. All my homies use write() and read()
Actually, as opposed to C's neighbours, one can really learn all C's syntax in 100 seconds... 😂
@@hi_arav 💯 percent agree.. I had my suffering from C, then I moved to JS. Thanks GOD 🎶🎶
@@hi_arav C is one of the few languages that require you to develop your own standard library to even use the language efficiently. Libc is just way too basic to be reasonable.
I think the *type* syntax of C is quite complicated and I've seen plenty of people get confused by it. I mostly mean declaration and use of anything that has to do with pointers.
Pointers aren't complicated but the C syntax without proper explanation can make people think that they are.
@@schniemand That is why I wrote "C's syntax"..
@@orco3847 js is worse when it comes to complex stuff :^)
Tip with malloc: if you cant remember how much bytes u need to store u can do malloc(sizeof([datatype here]) * [how much of those datatypes you want]) e.g:
char* word = malloc(sizeof(char)*4)
Now with chars its easy (1 char = 1 byte) however this technique works well when ur making struc arrays
If you need to initialize a string and know its length at compile-time just avoid the heap-allocation and do
char word[4];
i think in this case it's better to use calloc:
str* word = calloc(4, sizeof(char))
@@TheLoveMario that's cool until you have to access that variable outside of the function its initialised in.
@@yeahthebois3617 In a header file: char* word;
Inside the function: static char a[4]; word = &a;
@@TheLoveMario completely forgot abt that one there 😂😂😂
C was my second language I ever learned. I find it kind of funny that your explanation of C is mostly what happens in the machine for _every language_ but that most people take for granted since so many popular languages have an interpreter and/or sandboxed environment holding their hand.
12/10/2011: 10 years ago, Dennis Ritchie past away, leaving the world the legacy of the operational systems and the internet that we use today.
He is the true legend, not Steve Jobs.
and alot of insecure, messy legacy codebases. C is certainly a huge accomplishement, but couldn't we just have bounded buffers? real strings? were those couple of bytes really too much? even as an optional addon?
He passed on with steve jobs yet few acknowledged him as compared to jobs
@@cobaltno51 Yes, those couple of bytes were too much in many cases back then. Today we are spoiled by super fast and high memory computers, but people literally counted the bits back in the day. Even C was considered a "heavy" language at some point.
Steve jobs is even lesser know than iron man.
Point is, Having more than one legend is not a necessary for the world's existence.
@@cobaltno51 bro it's a 50 year old language
If we look at the algorithm, next would be C++ in 100 seconds, C# in 100 seconds, Assembly in 100 seconds and finally Binary in 100. Last time in the react native, flutter and kotlin series, I predicted correct
that would be "Binary in &1100100 seconds"
If C was 100 seconds, clearly C++ must be 101 seconds, or 5 in base 10.
What about Eiffel and Ada? Maybe, Pascal and Lisp even. Not to mention Algol and Smalltalk.
Quantum Mechanic inside Processor & Silicon Transistor in 100 Second
@@patrickmullot73 epicccc
I was brought up on C - I bloody love the language (doing realtime embedded software and device drivers with it is a rollercoaster until you know what you're doing).
The coolest thing I ever had to do in C was implementing sensory input on an Arduino microcontroller. It was a bizarre combination of bit manipulation, machine language, and regular C that you could only do in something as low-level as C itself.
2:19 I C what you did there.
Most programming languages: Detailed Exceptions / Errors with line numbers and error specifications
C: *SEGMENTATION FAULT*
I remember writing an employees management system using C during my first year of university. Learnt a lot, including the fact that I don't wanna write C code for work.
@@mattmurphy7030 I admire your kind. The guys who write C code and do embedded stuff. I love it, but I just haven't had a chance to do it extensively. I feel like that's where the real stuff is at.
@@justapugontheinternet It is. That's pretty much what I do for a living. It takes time to get used to. That being said, with embedded processors being a lot more powerful than they used to be (the Cortex A53 is an industry favorite right now) we're able to splurge a lot more and go C++ rather then straight C, since we aren't as performance constrained as the days when we used 286's and 68000k's as our embedded processors of choice.
@@gamerk316 Not sure about you but I loved the 68000 architecture. It was so elegant compared to the x86 stuff. I've been fortunate to have been working with C and C++ for 25 years but now being pushed into the Java world, I'm hating it.
@@toby9999 Go C# my friend.
@@toby9999 what's your job
It blows my mind how much of the technology we depend on everyday came from Bell Labs
Truly a place with so many of the smartest people in the world.
Indeed Bell Labs and Xerox Parc technologies and invented paradigms are still either directly or indirectly used today - these people were the knee's bees. Amazing visionaries and engineers all of them.
@@PhilipAlexanderHassialis What I saw at Xerox Parc's graphical user interface demonstration was a regression. Its no longer "normal" to use a command line to get exactly what you want done, fast. It's all point and grunt like cave men.
@@KookoCraft If you dont like just dont use it.
@@KookoCraft GUI is better though.
2:04 if needed, you can mimic an object-oriented programming language approach in C by having function pointers inside a structure, which then would be accessible through the arrow operator. An instance of this structure could be compared to an object with its getters, setters and methods.
Very nice video, I haven't used C lang for quite a few years and that's a great throwback with all those countless segfaults
iirc that's what they do in the Linux kernel
How would something like 'private' and 'public' work tho?
This is what SDL does, iirc.
struct some_sdl_interface { func_ptr method; };
struct some_sdl_interface* create() { ... st->method = method_ptr; ... }
create()->method();
I think SDL does that for a lot of its abstractions.
@@5cover opaque pointers can be used to hide details
Yep, do know that a lot of oop features are made using structures and function pointers
One of my all-time favourite things is Exercise 1-23 at the end of chapter 1 in “The C Programming Language” It says “Write a program to check a C program for rudimentary syntax errors like unbalanced parentheses brackets and braces.” This at the end of the first chapter!! We only learned the very basics, and now Kernighan/Ritchie expect us to do that! I think I actually went back and did this exercise after reading chapter 5. But it still remains with me 40 years later as something that just seemed so “Wow!” …
Same experience. Actually really like the book, by the time you read it you feel like you can actually write useful progtammes
If you have K&R at your fingertips, you dont need anything else. The man pages online will tell you everything else.
whats the book name@@neutral_positron
@@WalterDmllo just search k&r C book
@@WalterDmllo "the c programming language" by kernighan and ritchie
It should be said that malloc/calloc allocates memory on the heap and you should only use free on heap allocated memory. Stack allocated memory will be deleted automatically after it goes out of scope.
Immanuel's talking about 0:52 I assume
One could nitpick about your use of terminology.
Objects on the stack aren't "deleted" per se. The language manages the stack automatically, without the programmer having to write code to do so explicitly, and this management is essentially just updating the stack pointer register. In some kind of vague sense when a function is exited and the stack pointer is changed to what it was before the function was called, that stack space is kind of "deleted", but that's a bit of a misnomer.
(In C++ the use of that term would be more apt because in that case the language actually literally calls destructors for objects on the stack, and in this case "deleting" them would be an appropriate term to use, in a sense.)
That part about a string being an array of chars brought me wayyy back to early days in my university class learning C
But did you ever create an array of characters without remembering to terminate it with a '/0'??? A noob move that everyone learning C has to go through 🤣
@@JoolsParker '\0' not '/0' :D
@@raunak1147 🤣 That’s the noob move that comes after realising there’s no terminating character!
@@JoolsParker :o
I think C’s easier than C++. With C++, you feel like you never actually finish learning the syntax. Ever.
yeah, exactly
I really hate it when I get weird errors.
You do finish learning the syntax. I learned most of c++ when I was in middle school in a couple of months. You just need a good tutorial and a lot of determination.
@@sxs512 I'm glad to hear it's possible. I've only been learning since September, so I think I've still got a bit of time to go. Most online tutorials are anemic, though, so I think I'll need to memorize the class textbook or something.
@@mattmurphy7030
C++ gets freaking confusing when you learn about pointers. Segfault runtime errors are a pain on my arse
Everyone: print ef
Jeff: prin(t)ph
Honestly C is one of the most amazing language I've ever used. Once you get through the concept of pointers, you'll find out how powerful they actually are. It's one of the most easy-to-use language to do simple parallel tasks without blowing up your memory limit.
Don't forget how powerful memory leaks C does. I just did one yesterday.
One of the most easy-to-use languages to do simple parallel tasks?
Are you serious?
Pretty much _any_ other programming language that has support for threads is simpler to use for parallel tasks than C.
@@DjVortex-w
The nice thing about C is you can access the same memory chunks in multiple threads more easily. You don't have to call the mutex APIs if the threads are not going to work at the very same position in memory. I'm not sure about Golang though. The synchronization in Java is a bit convoluted but most of the APIs are nicer indeed. Running multi-thread in Python is nonetheless a nightmare. In many cases the GIL gets into the way and makes multi-threading meaningless (e.g. numpy arrays are managed by GIL in the version I used). In my project I had to get both temporal and spatial efficiency and C was the optimal solution to it.
@@mu11668B
That doesn't make C the easiest language to run multiple threads. It just makes it easier to access shared memory without mutexes.
If that's the measure of easiness, then C++ is better on both fronts: It's very easy to create multiple threads (with 100% standard code, no need for non-standard libraries) and you can likewise bypass mutexes if you want. (And, in fact, the C++ standard library also offers support for efficient atomic variables. I don't know how many other programming languages do. And yes, C does too, I know.)
I was in college studying computer science circa 1980. For programming we were studying FORTRAN and IBM OS360 assembly language. A friend of mine was interning at Bell Labs and he said he was learning a language that was so much better than FORTRAN called "C" and an operating system so much better than IBM OS 360 called "UNIX". After being a professional FORTRAN programmer for a number of years I started to use C and it was so much better than FORTRAN and the whole developer environment was miles ahead of anything else at the time.
With C you have to check your math (especially floating point division) on each processor type you're planning on working on. With FORTRAN (and COBOL) you don't. No matter the platform you get the same results. C uses 2's complement to store numbers. FORTRAN and COBOL - BCD. 2's complement is faster, but sometimes wrong. BCD is slower, but always correct. There's a reason R uses the ancient (and well debugged) FORTRAN statistical libraries and not the schlocky C equivalents. That isn't to say that C is inherently bad. But using C as a replacement for FORTRAN in many cases is. Now, C for integer based math? - like a rocket powered toboggin. To the point that I convert my floats to integers and only do integer math.
This makes me want to revisit C, didn’t like it when I learnt it at uni because struggled with pointers, but that all seems more interesting to me now!
th-cam.com/video/zuegQmMdy8M/w-d-xo.html I highly encourage you to watch it, I started learning C with this video. It isn't as hard as you think, you could maybe watch 30 minutes of this video everyday and then watch the Data Structures video from the selfsame youtuber. If you want to learn even more about it, then Jacob Sorber's videos are great too.
I started programming with C. It's not as hard as people say
what did you learn in this video that you haven't been taught at university? or is it just that you find it more interesting now?
C gives a lot of nostalgia hahahaha
C is actually really cool, just absurdly verbose, by design. And you can target just about anything.
There is even SDCC, a C compiler that can, for instance, compile for the GameBoy!
One note, return + any number means non-success exit, you can then check what code was returned (exited) to figure where you returned that number.
YES. I was looking for this. Essentially everything in C that is seen as "FALSE" (not always but often defined by pre-compiler) equates to 0 (0x0, etc, not '0' though). Anything NOT "FALSE" (or !0) is seen as "TRUE" or essentially equal to 1. However, anything "TRUE" == 1 is not always true. It's tricky but the only real definition of "FALSE" should always equal 0 and anything non-zero is true.
In extension to that, the only error code that means "no error" is 0 itself. Meaning your program ran perfectly with no issue/bug. Anything non-zero isn't necessarily an error, but any "non-flawless" execution. The nuance is necessary and this hasn't been stated enough in the comments (to my knowledge).
yeah for example 137 is SIGKILL
Being the mother, C deserves beyond 100 seconds
C is the big sister, assembly is the mother
It is beyond 100 seconds. It's 145 second to be exact.
@@salsamancer assembly is not even a high level language you can call it grandma
@@OriMoscovitz hehe
Been programming in C for 3 decades, so, at this point, I can't even use a language that doesn't allow me to work with pointers and manage my own memory. And I'm better off for it.
I learned python then C and now C++.
It genuinely feels so much better to deal with the memory directly and you can be really creative with it. You have to know what you're doing though.
@@ghosthunter0950 same, I really like C/C++, more than python, even though it is much easier
It depends completely on what you are doing.
If I'm writing a shell script, for example, I couldn't care one iota about memory management, and I'm completely fine with the shell script interpreter doing whatever it does, as long as it does it correctly. Manual memory management in this context wouldn't make an iota of sense.
I love that I can actually understand every bit of this video now! I watched this when I first started learning c++ about a year ago. I’ve now taken two computer science classes and a web authoring class for html, css, JavaScript. As well as a course in Microsoft power shell. Nice to see I’ve progressed. Lol
Can we get a "Lua in 100 seconds?🔥"
Of course! my favorite programming language persuaded me to learn programming properly.
Oh, the forbidden language technique of the Hidden Brazillian village. A man of culture, I see.
I was about to ask this
@@softwarelivre2389 lol
@@softwarelivre2389 While other languages were evolving by adding new keywords, the moon of tecgraf evolved by removing them
Literally began looking into C today for the first time. The timing of these videos has been oddly perfect.
Thank you so much for these!
Good fuckin luck fam
I had a great lecturer so he helped me (and a bunch of others) through the week or two it took for me to grasp pointers and arrow notation. After this, C became my favourite language to program in as I love how strict it is. And as it’s statically typed you know exactly how the data flows about in your program. And as long as you use them correctly, pointers make for very easy access to variables from anywhere in your program.
Erm.... I would not say C is a strict language at all - you can REALLY screw stuff up with it. I saw it almost has a higher-level assembly language in some respects. Try looking at Ada for a strict language.
@@JoolsParker true
@@JoolsParker By strict I meant mostly that it was statically typed and it was my first statically typed language (I started programming only in uni therefore I was very late to programming). Being able to shoot yourselves in the foot is true though, as we all definitely have. However C is far more rigid, compared to vanilla JS, which is what a lot of viewers of this channel (including myself) program in.
@@thedrunknmunky6571 If you enjoy the strictness of C, C++ is significantly more strict with types and casting. C++ has exceptions as well, so the shooting yourself in the foot stuff can be caught, like accessing outside of array bounds.
I *still* get confused by const and const ...
I am a C# programmer and I recently started learning C to program with arduino. My prior knowledge of C# really helped me understand basic programming fundamentals and syntax. Both are definitely great and useful languages!
C is by far one of the best language . Even after so many years and so many languages later it's still very useful .
That video, like always, was fantastic. I think it'd be quite interesting to see you cover one of the Lisp languages so Common Lisp, Clojure/ClojureScript, or even Scheme (in that case it'd most likely be Racket as many of the other Scheme implementations are too minimalist by themselves or are extension languages instead of general purpose programming languages).
Scheme is WONDERFUL! I'd watch a video if he made one
Clojure! Clojure!
Clojure is one of the BEST languages I've ever seen. It takes the good parts of functional programming & good parts of lisp while getting rid of worse parts. It works on JVM so you can use java libraries. Clojure repl and the tooling around it is amazing, the language really focused on the repl. You can even use the repl with clojurescript to run clojure code that then evaluates ON THE BROWSER IN REAL TIME LIKE BLACK MAGIC.
Gcc actually stands for gnu compiler collection, it can compile fortran, ada, obj C, go and more
That's crazy my dude but who asked
@@darkforst17 I did
Such an efficient overview, as someone with familiarity with languages built on top of C but no experience in C this sheds a lot of light for 100 seconds. Thank you Fireship!
C, c++ , and Go that is simple to learn and training your mindset and programming skills. Thanks for quick video but really useful!
I always say I love Go, but the elegance of C is something that I just can't get over. It's so beautiful
@pntg0n!kyuu Because they represent two completely different things. (.) is the "access" operator, i.e. access a field in this struct. the "->" operator isn't really unique, its actually a combination of two operators. "->" is equivalent to "(*).", as in "deference this pointer to get a concrete object and then access this element in this object". This is why they're two separate operators. "->" is used exclusively on POINTERS, i.e. references/addresses, while "." is used on CONCRETE objects. It allows you to look at code and quickly figure out if a variable is a reference (pointer) or concrete object. It also lets the compiler easily statically check the program, as it will detect if "->" is used with the wrong type and if "." is used with the wrong type. Languages that only use "." can't do that. Also, languages that only use "." generally don't have pointers. See: C# and Java.
@pntg0n!kyuu I didn't know C# has pointers, but I would say that "." and "->" are both used very often and they're definitely not interchangeable because pointers aren't like references in C++, you can arbitrarily access a method without first dereferencing a pointer. I think it makes sense from a technical view, i.e. pointers must be dereferenced before using a method/member variable, and from a user point of view, i.e. "->" clearly indicates a pointer, "." does not.
@pntg0n!kyuu The majority of people haven't actually done proper programming to know how problematic implicit type declaration is. It's very ambiguous and shouldn't be used when you have lots of incompatible variable types, especially when you want to have multiple *uninitialised* variables of different types.
"and i will c you in the next one"
Why did i knew you were going to say that as soon as i clicked on the video xDD
Fireship videos are like a knowledge pill for your brain. Simply awesome!
I enjoyed C so much. I recommend everyone to try C, you will just learn a whole lot about how computers work.
Edit: You can C how the computer works, ta dum tiss.
It's my favorite language, plain & simple but still very eloquent to write.
Anybody who wants to learn C should read the K&R book. It's brilliant and has everything you need.
I kinda want to, but it's so old. I got Effective C which covers c17 and possible C2x.
Please, don't recommnend it to beginners.
That printf was smooth
c is such a wonderful simple language, if you're not new to programming, this video has everything you need to get you started in c. awesome job man!
p.s.
i watch these videos even if i don't care or already know the programming language/tool because they are quite fun to watch. awesome job man!
but remember, simple != easy
@@cobaltno51 neither C is really as simple, numerous UB's and quirks are there in the standard, they are there for a reason though, and you don't usually encounter them
C will forever remain my favorite programming language. Of course other languages suit certain demands better. But for bare metal power in a high level programming language, c is simply unbeatable imvho.
I'm a greybeard architect now - over 30 years building code. I mentor, because everyone should. I try to get the younger devs to learn a little C (not C++). My suggestions to them are: learn a little C; understand how IO works in your OS (buffer sizes, how data is laid to storage, and IP packet construction); and how memory works in conjunction with the kernel. They are amazed how much performance can improve if you just align your data to the device, and minimize runs through the IO subsystem. Those that do eventually become much better designers and developers overall. Most 4th-generation languages - and nearly all OO types - hide a lot of details, which is good (Java is my goto). But knowing how it works under the covers saves a lot of problems.
"Segmentation fault" ah yes, the C language
Came here in seconds!
Same
Same
Gold! 🥇
@@Fireship your videos are always fire, this was just what I needed! You always exceed expectations!!
same bruhh
"C does not support OO" if you know how to use C you can make OO it is hacky but you can.
There is a book, "Compiler design in C" that all their examples are written in C, and explains how the code is OO.
It certainly is possible. When I was in my data structures course, I was the only crazy person who did the assignments in C (linked lists, binary trees, dynamic queues and stacks, etc.) Objects are just structs with function pointers.
that's mimic. not support. It is possible to write C code in style of OOP. but that's not OOP. OOP is how you organize data and OOP languages provide syntax for that concept. but C don't provide that syntax. you have to make up oop structure from the scratch and then you have to stick to that style. you can't expect other programmer to stick to that style. if you need to work with other people then you need a lot of document. that's why we don't call C OOP language.
@@MrjinZin0902 OOP is a philosophy, not a language. Back to noob class with you.
🤣🤣🤣🤣🤣🤣. That last part killed me! That caught me off guard. That was so good! Im subbing just for making me laugh so hard. Thank you.
C is the daddy of all. RIP To the legend Dennis Ritchie.
Would love a full course on C from you guys
I love that in C you have no classes. I think separating data from functions makes any program cleaner and easier to understand.
You can implement OOP in C but you should not
I agree for most cases, but I will say containers especially lend to object oriented programming very well. Having lists, vectors, trees, etc. with methods to act on their own data is intuitive and very concise. list.size() and vector.size() have completely different code, but they have the same name and represent the same thing - the size of a container. It really makes it simple to abstract and encapsulate complex data structures with complex memory allocation and traversal into objects as simple and easy to use as arrays. OOP also lends well to GUI parts of applications, as polymorphism and inheritance represent GUI components and widget hierarchies very well.
@@lucass8119 I can agree about containers. I don't know about GUI programming since this is not my field. I guess the best approach is to use classes where they really fit and not everywhere. Having everything as class never works from my experience. Instead of focusing on the task that your program needs to do you focus on creating object hierarchies. So for example whether message should send itself or there should be a sender object that sends messages.
@@MrChester114 I agree 100%. The "everything is a class" sort of idea that we see in Java and others is dumb and needlessly complicated. Most things don't need classes and objects, and you should only use OOP if it makes sense for that specific problem.
And I think you've never tried writing anything longer than 100 lines of code. If you did you'd know that using objects and oop concepts like polymorphism and encapsulation is the way to go.
The video implies that the stack has to be managed manually, but this is not true. If you declare a variable int phone_number, the memory for it will be automatically deallocated when the function containing it returns. This does not require garbage collection.
I was in doubt with that detail, thanks for explaining!
It isn't deallocated. Such variables are created on the stack, when the function returns that variable is out of scope, but can still be accessed by evil people until something else overwrites it, assuming it wasn't optimized out of existence by the compiler. Accessing something deallocated would cause a segmentation fault on a modern OS, where as accessing an out of scope variable is just evil.
@@bur1t0 Fair point, and this can be the source of very subtle bugs if you're careless! It isn't deallocated in the sense that it's no longer mapped to the program's address space, but as you point out, it is no longer reserved for the data it contains and can be overwritten by the runtime, meaning it should probably for all intents and purposes be treated as strictly deallocated regardless.
C language has its own type of warmth and love in it.
People keep on mentioning OOP. Many languages 'support' Object Orientation. Emphasis on 'support'. If you have enough internal discipline, you can use the object abstraction in Assembler, C, Basic.... Adding the abstraction layers to a language may make it 'easier', but it does make for a very thick language spec, and book. The number and scope of interesting traps and errors also grow at the same time. A feature rich language may be an advantage. If you have an infinite time to learn it. Ada is a very rich language. To write mission critical 'real time' code you use a basic subset, same sort of feature set as a 'Small C'. The code generated can then be simple enough to hand inspect to establish function.
The hassle of dealing with "object oriented" magic makes me hate that stuff. Always some weird error with run time/compile time nonsense or casting or types of whatever. I have little experience but to me it often feels like using an airplane to go down the store for groceries, it's not worth it, I'd rather just walk.
Nice video!
One small nitpick though: one should never write functions with completely empty argument lists in „C“, as the compiler takes this to mean that the functions in question can be called with a variable number of arguments (of nearly arbitrary type).
Taking your example of a completely innocently looking “int main ()” function, this means that it would perfectly valid to recursively call this function-for example-like so: main(1, 'A', 3.7, "unintended consequences");
Thus, if one is not interested in doing any command line argument processing (at all), a programs entry point should always be written like so:
int main (void) { ... }
Admittedly, this may change with the next version of “C”… however, until the final standard for C2x is released, the above rule stands as described.
Is It a problem to allow a variable Number of arguments to the main, if you won't use them after?
Languages with garbage colection: Don't worry about memory management, I got you covered and I will protect you.
C (and c++) :You're on your own kid... Have fun.
Honestly, managing memory yourself is kinda fun
Elm next please :-)
Such a great language deserves to be honoured by the best TH-camr ever :-)
Thx for you awesome work
And its successor Roc as well. Ty
"C in 100 Seconds"
_Laughs in 2x speed_
OO is just one way of creating layer of abstraction in your code. OO languages not only enforce one way of abstraction layering, but also the OO in most of the languages is 'underpowered' compared to the true OO concept as it was executed in SmallTalk.
Not having obligatory OO in your code and having access to memory granted by language means you can create your own abstraction solutions, tailor fit to your particular needs. It also means that if you write C code, it can always do something useful for your program, there's no noise of boilerplate and excessive naming.
When you're reading a program, would you rather have to read one file with 5000 lines, or need to understand relations between 100 files 50 lines each?
When you're programming, do you just want results, or do you need control?
You don't need to use malloc unless you're dealing with arrays of variable size at compile time. Seeing malloc(4) makes me feel awkward.
This. People not familiar with C tend to misunderstand/overestimate garbage collection. Not EVERYTHING has to be malloced & managed; like 90% of stuff can use just normal stack variables (or for things like hardcoded char*, live in the data section). E.g.
char* str = "hey";
printf("str is '%s'
", str);
The only times you actually need to malloc the variable is if you want the variable to survive beyond the scope of the function, or (like you said) you don't know at compile-time how many instances you're going to use (e.g. if you were writing a browser in C, you'd malloc for every tab opened, cause you don't know at compile-time how many tabs the user is going to use; that's an indefinite/variable number).
@pntg0n!kyuu Thanks for correction. Fixed.
Respect for C, the father of my favorite programming language.
"Do you have the slightest idea how little that narrows it down"
C++ it is
I just love how Fireship
instantly gets to the point and and stays true to the title.
0:50 I need that t-shirt
0:50 That is a bit misleading, because the manual freeing only applies to dynamically allocated variables
Yeah, primitive types exist on the stack and are effectively freed as the stack returns up. Fireship is not a C programmer
@@kevincahalan8118 Non-primitive types can too be allocated on the stack; on the other hand, primitive types can also be allocated in the heap
@@Supergeckos1000 Primitive variables*, in other words stack variables
@@kevincahalan8118 just call them auto variables
Oh god, the "printfff" killed me 🤣
I used to hate it in college, but now later I realise that it provides the perfect balance b/w high level and low level stuff. The pointers stuff and similar low level details provide a great gateway to get into OS, linux, Networking and Compiler stuff.
we're in 2021, there are plenty of languages that provide full control on low level stuff while still being considered high level level languages, and which are also infinitely less painful than C
@@loriswit which ones?
@@mccarthymalonza6500 I'd say it depends on which part of the low level stuff you need to have access to, but I suppose high-level languages like Rust or even C# could satisfy most of the these needs. On the other hand, you could use C when you explicitly need a low-level language that's as close as possible to machine instructions (e.g. when programming for embedded system with very limited resources), but then there's nothing "high-level" about it in my opinion.
@@loriswit literally no one use C# and rust for low level
@@borgir6368 People do use Rust for low-level things. For example, Redox OS is an example of an operating system written in Rust, and the Linux kernel is adopting Rust for writing some drivers in. The point they were making is that most people don't actually need to use C; C# or Java would still give good performance while being much nicer to program with, because they offer garbage collection and built-in conveniences like inheritance and lambda expressions, and they hide pointers from you so you don't shoot yourself in the foot.
Just spent all day trying to read a file into a text buffer. It was worth it.
I ❤ C
String in C:
char *str = “Hello”;
prtintf(“%s”,str);
1:28 use print *FFFFFFFFFFFFFFFFFF* to print the value...
c is really really simple, and easier to get a hang of than you think. you dont have to be smart to write good c code, like you have to be to write good c++ or javascript or anything. c also compiles in the blink of an eye, which is my favorite part. if something turns out wrong, i can make and test small changes really quickly. c++ leaves me waiting at least a minute between changes.
managing your own memory is really a blessing. c only ever does what you tell it to do, making it really easy to understand whats happening in c. word of advice: dont leave pointers dangling. if you havent allocated it yet, NULL it. if you just freed it, NULL it. this way, if you try to use it, you will always have a segfault instead of the application carrying on, on hallowed ground. also free does nothing if it is passed a null pointer, so you can leave a goto label at the end of a function to handle cleanup and return, and just pass everything to frees or whatever destroy function you may have.
in fact, just get comfortable with the memory allocation functions. theres malloc and free, but also check out calloc, realloc, and reallocarray.
You don't have to be smart to write in any language. Familiarize with it, and you're good.
C
I would love to see a video about CMS's (Headless CMS"s, Wordpress, Drupal, etc.)
Helpline📲📥⬆️
Questions can come in⬆️
I started coding using python, javascript, etc.
But I never fell in love with programming until I picked up C. Easily my favorite language together with Lisp. Sadly I never get to use them.
Go embedded systems, you will use C everyday.
SAME!!!
@@zawizarudo7295 recently coding some rust. Doesn't suck.
I think the 6 dislikes so far didn't take those "C" puns at the end too well.
Nah, those are just stray GCC errors :P
I love C. Sometimes I think it just needs a simple push in the right direction to make it better as a language, but its fundamental essence is that of a language that is just so very simple, fast, and beautifully expressive. It's also incredibly dangerous. I still love it.
1:28 use *printffff* exactly what I thought to myself when I was a firstyear at uni😂
0:34 The first 4 bytes in hex say cafe babe which are the magic bytes for the java class file. i hope i'm not the first one realizing this lol
Easter eggs in this videos:
Printfffffffff 💨
"If you want to C"
"I will C you in next"
`char` is always 0-255. If you want a -128-+127 type, you’ll need the rarely-used `signed char` or more common `int8_t` type.
@scum bag worse: as far as gcc goes, on ARM it's unsigned, on x86-family it's signed.
yet, char remains different from signed char and unsigned char to the type system :)
nothing in the standard actually prevents `char` from being 16 bits... some obscure platforms (e.g. DSPs) may do that, but it can be assumed to be 8-bit for all typical platforms
No, it's whatever is faster on the system you're targeting. If you're doing arithmetic on chars instead of just reading, writing, or testing for equality, you should explicitly use "signed char" or "unsigned char".
C will never die. Tons of "low-level" languages have come and gone over the years with the goal of being a C alternative, and most have faded into obscurity. The base libraries that pretty much everything is built atop relies on C in one way or another, and it is just too much to "rewrite it in Rust" or whatever other "hot" language is stylish to use that year.
At college (I was at electrical enigneering at the time) the first programming language they taught us was C. About 80% people after that hated programming profoundly. I really enjoyed it although it was a very difficult course to pass. I mean you give people with 0 programming experience to use pointers. We also worked with structures and even had to work with matrices for the final exam.
And on to Verilog....
Bro it passed only 3.5 months since I was not able to know what binary meen and now , we end with all what you have said i swear 😂
Thanks man, I knew nothing about C and I have my Semester Ending Examination for C in 5 minutes time. I hope this will help me pass the exam.
Thats it, I am now expert at C programming. Thank you for this presentation.
When I finally moved from C to Java, the fact that Java has a garbage collector and has memory safety has confused me, annoyed me, and is causing me severe discomfort over the fact that I am not managing my own memory, I feel like a vital limb has been torn away from me when coding in Java
Id recommend rust if you want a step higher without losing control. In it you dont have to manage memory manually, but it doesnt have a garbage collector either. It just deallocates everything as they go out of scope (which does come with a few minor limitations). Basically rust is an amazing language (in my opinion) that has memory safety guarantees without performance loss.
Id recommend C++ if you want a step higher without losing control. In it you dont have to manage memory manually, but it doesnt have a garbage collector either. It just deallocates everything as they go out of scope, (look up RAII and C++11 pointers). Basically C++ is an amazing language (in my opinion) that has memory safety guarantees without performance loss.
@@ilyastuurfedorov4057 uhhh youre talking about rust? C++ has manual memory management and 0 safety guarantees, while rust has every benefit you mentioned
@@theroboman727 That is incorrect, please, re-read my comment, try looking up stuff that I mentioned. There is literally no reason to use different language like Rust for feature-set like this, since C++ supports it out of the box since 11 years ago.
@@ilyastuurfedorov4057 smart pointers? Dont you have to wrap stuff in "unique_ptr" or "shared_ptr" every time you have type annotations? Also arent there many cases where smart pointers make no sense and you have to use regular pointers/references, and you lose the memory safety? Rust has all and much more of these safety guarantees at compile time whatever kind of pointers you use, and has better support for smart pointers because the language was designed for memory safety and these kinds of features. If you think rust and c++ are anywhere close in terms of safety/security, you havent done your research.
I just learned more about C in this video than I did a college class I took in it 20 years ago
that is a lie.
Fun Fact: The reference implementation for C is technically the Portable C Compiler developed by Bell Labs, more specifically by Mr. Ritchie and his colleague Mr. Thompson. It was extremely portable and could do all sorts of link time magic, shame that it never caught up to gcc and eventually stalled in 2014
The simplicity of 'C' makes it practical to port compilers easily from one computer/cpu to another. It is interesting to investigate 'Small C', covered in Byte at one time and in 'A Tool-book of C'. Once the compiler is ported, libraries and applications can then be recompiled for the new machine.
I just started cs50 earlier this week and hearing that c is one of the more simple languages is definitely not what i wanna hear but are well wish me luck.I feel like once i wrap my head around how to solve problems then everything will kind of be about practice and ill have a much easier time learning
pointers get me so confused, I'm glad you could explain then só easily and simple!
1:03 i recently learned that GCC stands for GNU compiler collection, and not GNU C compiler. Is that right?
It used to stand for “GNU C Compiler” but they added more compiler to the compiler and changed the name as a result
@@chri-k oh, ohkay. yeah, that makes sense. 👍👍