JSON is like the Python of data formats: slow but convenient to use
I think this is the perfect description
We want to know more about JSON alternatives
@@worstfellow Yes, but definitely not XML
@@tattipodapatti Yes, even non-programmers can understand JSON easily; the Government of India uses it on websites like the income tax portal, where users download their data as JSON
@@codyking9491 Yes, we already have too many JavaScript frameworks; we don't want too many data formats 😅
Great video. Thanks for introducing me to Deku.
I think there is a lot of information left out here. I'm gonna outline my opinions below even if they don't matter:
1. JSON is not for computers, it's for humans. Troubleshooting binary requests isn't easy.
2. GZIP compression can greatly reduce the bytes over the wire. JSON with lots of repeating keys can greatly benefit from gzip.
3. A lot of the time the database is your actual bottleneck and serialization doesn't matter for performance. In fact, 1,600 requests per second is more than most companies/services will ever see.
4. Having the document define the schema allows for flexible data. You can't just send arbitrary data with a binary protocol. This has pros/cons.
For a public-facing web service that only receives a couple thousand requests per second, I would pick JSON over any other structured binary protocol almost every time. Yes, there is a cost to simplicity, but there is also a cost to complexity. Make it easy for internal (engineers) and external (users) parties to troubleshoot. Just use JSON unless you have identified that serialization is the bottleneck in your endpoint... For most endpoints/services, I would wager serialization is not the bottleneck.
For things like scientific applications involving a high density of data collected in a small amount of time (rocket controls, explosion tests...), and things that are inherently binary (VNC), JSON is not a good choice.
For anything involving humans, man, it's good to use JSON
To your 3rd point: it was even greater, 1,600 requests per millisecond, which would be 1,600,000 requests per second
on point
Comments like this keep my hope in humanity alive
Why gzip when we can Brotli?
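To make point 2 above concrete, here is a minimal sketch, assuming Rust with the flate2 crate (the payload and counts are made up for illustration), of how well repeated JSON keys compress:

use flate2::{write::GzEncoder, Compression};
use std::io::Write;

fn main() -> std::io::Result<()> {
    // A list-style payload where every element repeats the same key names.
    let item = r#"{"userName":"alice","userEmail":"alice@example.com","userActive":true}"#;
    let json = format!("[{}]", vec![item; 1000].join(","));

    // Gzip the serialized payload, as a server would for Content-Encoding: gzip.
    let mut enc = GzEncoder::new(Vec::new(), Compression::default());
    enc.write_all(json.as_bytes())?;
    let gzipped = enc.finish()?;

    // The repeated key names are nearly free after compression.
    println!("raw: {} bytes, gzipped: {} bytes", json.len(), gzipped.len());
    Ok(())
}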
Hey, did you like the video? I really enjoy making this kind of content. That's fun. Do you think I should do something with promises? (MAKE A COMMENT, Do not respond, YouTube has the worst notifications)
I think you should learn french
These videos are getting better and better both in quality and entertainment factor :)
That's what I like to hear
@@ThePrimeagen too bad you cannot 'hear' that...
unless you made a speech to text app in RuSt... :)
edit: text to speech
@@vaisakh_km if he had then it would have been blazingly fast
BLAZINGLY SLOW. Thanks for another great technical video! I hadn't actually thought about it until now, but this explains perfectly why we use protobuf so much in embedded work
Absolutely. By the way, I haven't responded to your discord message. I'm on a flight to meet my boss right now, and then I'll be able to update you
@@ThePrimeagen Thanks a ton. Have a safe flight and hope the meeting goes well
@@ThePrimeagen you have a boss? thought ur the boss
blazingly slow is the new blazingly fast
@@dickheadrecs I just heard that sentence in Fireship's voice, thanks for making my day
Binary formats are difficult to maintain and scale at certain points. As ThePrimeagen notes, they're also not easy to version: you'll have to change your parser to read data correctly, or know where to reserve unused space to leave room for future improvements without breaking the protocol.
But they're really good for real-time systems that need to deliver a lot of data in a small amount of time. For example, game servers where feedback should be fast and responsive both on the client and across other clients with different network conditions at 60fps, while also dealing with other systems like rendering.
Protobuf then?
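On the versioning point above, a minimal sketch in plain Rust (the frame layout, field widths, and parser names are hypothetical) of reserving a leading version byte so an outdated parser fails loudly instead of misreading a new layout:

#[derive(Debug, PartialEq)]
struct Position { x: u16, y: u16 }

// v1 layout after the version byte: [x:u16][y:u16], big-endian fields.
fn parse_v1(b: &[u8]) -> Result<Position, &'static str> {
    if b.len() < 4 { return Err("short frame"); }
    Ok(Position {
        x: u16::from_be_bytes([b[0], b[1]]),
        y: u16::from_be_bytes([b[2], b[3]]),
    })
}

// v2 keeps the v1 prefix and appends fields this client simply ignores.
fn parse_v2(b: &[u8]) -> Result<Position, &'static str> {
    parse_v1(b)
}

fn parse_frame(buf: &[u8]) -> Result<Position, &'static str> {
    match buf.split_first() {
        Some((&1, rest)) => parse_v1(rest),
        Some((&2, rest)) => parse_v2(rest),
        Some(_) => Err("unknown version"), // fail loudly, don't parse garbage
        None => Err("empty frame"),
    }
}

fn main() {
    let frame = [0x01, 0x00, 0x0A, 0x00, 0x14]; // version 1, x = 10, y = 20
    assert_eq!(parse_frame(&frame), Ok(Position { x: 10, y: 20 }));
}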
It would be nice to see a video comparing sending JSON/XML/Apache Avro/Protobuf over the wire with Rust/Go/JS. Great video, as always.
Absolutely must happen
For sure. I am also curious why “deku” instead of, say, Avro or Protobuf. This approach doesn’t seem to take forward/backward compatibility into account, unless I missed something?
I can't stop loving this content. JSON is something I use daily but never really think about. A deeper than usual deep dive, yet still accessible. Thank you my good man
ty ty ty
It kinda landed right at the best moment, as I'm trying to improve the performance of our microservices, which are transferring millions of JSON messages, and can see the real cost of JSON serialization/deserialization growing super fast. I was studying gRPC and just learned about protobuf here, thanks again for sharing this kind of content!!!
gRPC is wonderful. And reading the proto files is so much better than working with Swagger
There is actually a successor to gRPC & Protobuf called Cap'n Proto.
Afaik it's made by the same person, but they are no longer working for Google. It's faster because Cap'n Proto's RPC can "travel back in time" (send an RPC using the output of another call before that call has even finished), and it does direct in-memory access rather than adding extra encoding & decoding steps.
You of course keep the benefit of having a separate protocol definition that can be used across multiple languages.
@@Ether_Void “Before the call has even finished” sounds cool, but also scary; this rings so many alarms in my head as a security analyst
I think JSON falls into the same pitfalls as document storage DBs. It is so easy to understand and implement, yet so goddamn hard to let go of even after you learn about better and BLAZINGLY FAST-er alternatives (like good ol' Postgres or protobufs in this case). This requires a change in mentality more than anything. I suppose old habits die hard.
Exactly. Simplicity is amazing. I cannot stress that enough. But it also costs a lot.
Recently I was comparing Deno and Rust. What surprised me was how much a simple JSON response weighs...
People just don't realize how expensive simplicity is
@@ThePrimeagen I'm a simple man; consequently, an expensive one as well.
@@enclave2k1 xd
I interned at an insurance company where they received new insurance data as a single string about 5,000 characters long and parsed it by slicing that string at fixed intervals. It almost always caused issues with encoding, because it used ASCII encoding while all the responses were in UTF-8, so it had a billion safeguards that would always break when someone entered a non-ASCII character.
I'd say it doesn't get more blazingly fast than this, but it's written in the most cursed and blazingly slow C# code I've ever seen in my limited experience. Best part is they moved to C# from Java like 7 years ago but just kept this system instead of rewriting it to use XML or "jizzle"
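A tiny sketch of the failure mode described above (the record layout and offsets are made up): byte-offset slicing only works while every character is one byte, so UTF-8 input shifts or splits fields:

fn main() {
    // Hypothetical fixed-width record: name[0..10], date[10..18], country[18..20].
    let ascii = "JOHN DOE  19840101US";
    assert_eq!(&ascii[0..10], "JOHN DOE  "); // fine while everything is ASCII

    let utf8 = "ANDRÉS GÓMEZ19840101US";
    // 'É' and 'Ó' each take two bytes in UTF-8, so byte 10 now lands in the
    // middle of 'Ó': &utf8[0..10] would panic, and every later field shifts.
    assert!(!utf8.is_char_boundary(10));
}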
Really enjoyed this. It just seems like, for better or for worse, the ease and convenience of JSON trumps everything else.
Very cool video 😎
Would be interesting to see JSON's position challenged. I wonder how gRPC would compare, as I have never used it in the browser. But before we hate on JSON, let's remember that it freed us from XML
It didn’t really.
All these horrible front end “frameworks” go through linguistic and development hoops to preserve XML.
UIs are defined in some bastard munged XML+JS format that can’t be merged.
UIs can be described in JSON.
But a bunch of HTML authors from 1994 still want to do
Yeah, it "freed" us from a more structured approach that also supports comments and validation....
@@ABaumstumpf you dont need to send comments over API requests, lol. XML is fine for a lot of cases if you like it, but it makes no sense to use it for network traffic.
@@lunafoxfire "you dont need to send comments over API requests"
Who said anything about API request??
"but it makes no sense to use it for network traffic."
Then that applies exactly the same to JSON.
@@ABaumstumpf ...the whole video was about network traffic... that's what the original comment was about...
As someone who tried writing this kind of stuff back before JSON and even before XML, the reason we went with this stuff was for debugging purposes. At the time it was so nice to just watch the stuff on the wire, and be able to have it be human readable. However, I would argue this was mostly because there was not a de-facto binary protocol well understood by common tooling. I think it's entirely possible for us to have a nice binary encoding that our tooling (things like wireshark, Charles proxy, Chrome dev tools) could read and understand without it being fat and bloated like JSON.
This
Insert obligatory xkcd standards cartoon here ;-)
I guess it would be really beneficial for the adoption of protobufs as well, if you could just upload a .proto to devtools and see all the underlying data
I'm just using "edn" (extensible data notation) by default since it's not only "blazingly" but also "extensibly" fast :D
Amazing video btw, like always!
Tytyty
Prime, these videos make us all better programmers. Thank you so much, I have been so much healthier programming and having fun doing it just because of your content, and that makes me a much better programmer
Love to hear that giga Chad
holy gigachad comment
i love that akame wallpaper behind your transparent terminal with nvim open. nice touch.
I'd just like to interject for a moment. What you're referring to as Json,
is in fact, JSML, or as I've recently taken to calling it, Tom is a genius.
Crazy timing. My frontend team at work is frequently wondering why we're not doing JSON payloads, so I can send this! Super informative, thanks!
hah! well if your backend is dictating it, they are smert
As a French dev, hearing you say "jason" makes me giggle in the weirdest way 😅
Your more technical videos are so great, kudos!
Will do, and I will never let JSON be said any other way
I'm loving the new editing style. not too much but just the right amount of sass
For the algo
For the thankfulness
Great video. Super interesting to see! The “XML is worse” part got me in the kidneys. I have to work with shitty XML every day as a messaging schema between our app and other companies' integrations (LEGACYYY). Slow and terrible to work with. The worst of both worlds.
I want an update of this video, this time featuring JDSL and Tom the genius
I think another benefit of JSON (though it may not be as relevant in this example as in, say, a web application) over any binary format is that it is immediately readable to any user. I can't count the number of scripts or alternative clients I've been able to write just by inspecting the JSON data a website sends. A universal, plain-text format like JSON frees the potential of how software is used from just the developer and places some choice in the hands of the user as well.
Another thing, I've found that I really like the TSV (tab-separated values) format for storing any tabular data. It's so simple, effortless to parse, and much more compact compared to JSON or even CSV (the schema is usually just the first line, or can even be omitted). I'm not sure how it would work as a data transfer format, and though I don't see it commonly used, I doubt there would be any significant problems.
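A minimal sketch of that TSV idea in Rust (the table contents are made up); the main caveat is that fields themselves must never contain tabs or newlines:

// First line is the schema, every following line is a record.
fn parse_tsv(input: &str) -> Vec<Vec<&str>> {
    input.lines().map(|line| line.split('\t').collect()).collect()
}

fn main() {
    let data = "id\tname\tscore\n1\talice\t42\n2\tbob\t17";
    let rows = parse_tsv(data);
    let (header, records) = rows.split_first().unwrap();
    assert_eq!(header, &vec!["id", "name", "score"]);
    assert_eq!(records.len(), 2);
}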
This is an algorithmic message to let you know that this type of content is really really appreciated !
These types of videos are super valuable and I really appreciate the way you go into the technicalities and explain these topics. Loving the format!
As an embedded C engineer, I love efficient code. I think JavaScript is evidence of the decline of our civilization.
Facts
Looking at the comments: seems most people consume messages without checking what they are ingesting (some even say that is the safe way), never heard of network endianness, didn't realize a few bits at the beginning of the frame can specify version, size, and any other desirable property, don't know what RPC is, think computers should communicate in human, and the list goes on... so I think you are right.
I'm with you man, I have a mature embedded ARM Linux project; I recently spent some time optimizing code that the GCC profiler marked as high usage.
There is some JSON in the project, and this video makes me want to rethink it so the device runs cooler. Other people don't find this beneficial, but I don't see the benefit of heating the atmosphere
@@andersoncunha7079 Who cares? 99% of devs are probably working on stuff like web dev. Yeah, we want the router and the actual webserver itself to be internally optimized to the femtosecond. But the website? Bro, people are loading sites on their phones. If it takes a couple seconds, who cares? What matters is if it's maintainable. JSON is maintainable. And before you say "but you can break it with version updates", yeah, and it just causes an image not to show on a website. It doesn't crash an airplane into the ground. Different applications bro.
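For the frame-header point a few comments up, a minimal sketch assuming a hypothetical layout of one version byte plus a two-byte payload length in network byte order:

// Read a 3-byte header: [version:u8][payload_len:u16 big-endian].
fn read_header(buf: &[u8]) -> Option<(u8, u16)> {
    if buf.len() < 3 {
        return None;
    }
    let version = buf[0];
    // Network endianness is big-endian, hence from_be_bytes.
    let len = u16::from_be_bytes([buf[1], buf[2]]);
    Some((version, len))
}

fn main() {
    let frame = [0x01, 0x00, 0x2A]; // version 1, 42 payload bytes follow
    assert_eq!(read_header(&frame), Some((1, 42)));
}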
The fact that there are a million different Javascript frameworks invented every day shows that the productivity boost and portability benefits that Javascript gives you are real.
Hey, I clicked that weird red button that says SUBSCRIBE on it and indeed, there were multiple AJAX calls that were using JSON for BOTH request parameters and response data. Crazy!
Weird. This Json thing might just catch on after all
but my eyes can read json blazingly fast
Absolutely loving these performance deep dives! Keep ‘em comin!
damn! that sub request was really well placed and timed! nice one!
Having just seen Mark Rober's last Olympiad for squirrels, I can appreciate the squirrel for science...
Science squirrels are incredibly important
It's run by business, not engineers; jsôn allows for a more commodity/fungible dev team. It takes a certain amount of scale before the tradeoff is worth it. Same for things like reactive programming, etc.
The quality of your videos, in both content and presentation, is strong enough that it's deterring me from taking on that kind of load for a S&Gs channel of my own! I'd love to see more stuff, 100% (I'd do the Twitch stuff, but it's a bit of a time commitment and I completely suck at multi-tasking, so I can't really do it in the background)
I like this format! Could be even a bit deeper / more technical and longer :). Anyway, super cool format!
I absolutely vote for more videos like this, but maybe some more involvement in the creation of the code. Loving it! SO much JSML o_O
As a front-end dev - I feel personally attacked by Prime's videos but I do appreciate them all the same!
Loving this series! This is great content covering topics that aren't too easy to find, most of YT is for junior devs.
The lab coat and theme when you went all sciency gave me serious Garand Thumb vibes. I love it. You guys even kind of look like each other.
Love your videos dude!
What's with the dancing squirrel 💃🐿️?
🤣🤣🤣
Good one!
Love this form of content Prime! 👍 You make harder concepts (for me anyway) as digestible as possible! So invaluable.
Deserialization of JSON from the requests and the database, plus serialization back to JSON for the response, is actually in the top 5 of our bottlenecks at the company I work at. 40ms of a 100ms request is just de-/serialization from/to JSON.
That is incredible. And if you're using a garbage collected language, don't forget the effects of garbage collection.
@@ThePrimeagen Well, we use Java so... yeah.. xD
Human readable, dynamic formats are not binary. I am shook. I agree that people often use JSON where they don't really need a human readable format, but they have their place. I would actually be interested if there are human readable formats that are much faster to parse/generate.
Also 10x difference is surprisingly low. Like, we are comparing a format that needs to be parsed to something that basically can get flushed straight to memory. JSON seems pretty amazingly fast for what it does
It was the smallest message I could send. As the message grew, the performance got closer to 40 to 100x. Remember, I'm also fully parsing the message. This isn't FlatBuffers, where message access is parsed lazily
At that point you're talking a thousand x
@@ThePrimeagen Fair point
For more information on that topic I recommend chapter 4 of Designing Data-Intensive Applications. Really good video :D
This was a really cool deep-dive, but also very entertaining and straight to the point. Looking forward to more stuff like this. :)
I think I could have done a better job and gone a bit slower on the binary representation. But it's such a hard balance to not be long-winded while still being entertaining.
So thank you for the note!
This is one of the best videos on programming I've ever seen. I feel enlightened and empowered
Your jokes in this type of developer/programming videos is funny and unique and I would love to see more of em keep up the good work.
Thank you for this one, was really informative to me.
Can't believe I wasn't taught this in University 😂
"This data transfer specification is slower than writing non-interchangeable data structures in my preferred language" yeah of course, and also of course it's less space-efficient (and slower to parse) than all the various binary formats (but see also gz). JSON is human-readable, available in every language, and good enough for most use-cases (except at FAANG scale); no one ever accused JSON parsers of being particularly speedy, but also most systems don't have JSON parsing as a bottleneck. I ran into this at a previous company, they wanted to switch to protobuf or something similar to save some bytes, and I had to call out that they had some terrible queries in common paths that took over 5 seconds to complete, and needed to focus on real problems. And about the last quip: if we got rid of JSON today, some idiot with a popular blog would say it's time for XML to make a comeback, and then we'd really be in deep shit.
Your production quality has gone up and the humor is on point. Today I decided to subscribe! I can officially say that I like this channel now :) great content my dude!
0:08 made me laugh way too hard, thank you Mr. Prime
A few extra bytes to be human readable is totally worth it. The fact that you inspect the JSON sent in the dev tools is testament to that. I would say the overhead of the encoding/decoding is imperceptible for most use cases.
The particular example you gave is just wrong. Dev tools could've just as easily transparently converted to/from an efficient binary format.
Surely a human-readable format is useful. But nowhere near as useful as people think it is. Images are not human-readable; neither are docx, PDFs, or mobi. Plus, if readability was a concern, a subset of YAML (which is less complicated to parse) is much more readable than JSON. How is unparsed, unindented JSON readable anyway? And if you have to parse JSON to display it in a "human readable" format anyway, why not use a binary format, or a subset of YAML?
@@gnikdroy because the tooling already exists for JSON
@@FunOrange42 Yes, and your answer is the correct one. JSON is not used because it is "human friendly". It is used because the tooling required for it was already there. Javascript always understood JSON. And that is all it took for it to become popular
I was gonna say, you're gonna get people arguing with you instantly in this thread.
I agree with you Michael but what can you do.
A day is coming when quantum computers will make it so that the difference is actually negligible.
@@gnikdroy by “subset of YAML”, I assume you mean basically just the keys + values?
Because ‘less complicated to parse’ is absolutely not true for full YAML vs JSON… if you look at the spec alone, this is quite clear (the YAML spec is even longer than XML's…, and ~10x longer than JSON's). Because of this complexity, different YAML parsers may also sometimes give entirely different interpretations & parsings for the same YAML string.
And I would argue that even the subset of the most basic functionality of YAML is still more complex to parse than JSON, since tracking indentation to denote “start object”/“end object” is already more complex than simple delimiter characters… and inevitably someone will use a tab when someone else used spaces. And someone may use 3 spaces to indent while another uses 4. And so on. It’s very easy to create broken YAML, and also hard to verify, because it’s very easy to create YAML which is semantically broken but syntactically correct. YAML can’t detect that you meant for a line to have different indentation, while JSON can detect very easily that you didn’t add a “}”.
Unindented JSON may be unreadable, but unindented YAML is semantically broken. Unindented + single-line JSON may be EXTREMELY unreadable, but unindented + single-line YAML is syntactically broken
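To illustrate that last point, a small sketch assuming the serde_json crate (the payload is made up): a forgotten } is a hard syntax error that any JSON parser reports, whereas the YAML analogue, a wrong indent, can still parse to something unintended:

fn main() {
    let ok = r#"{"a": 1, "b": [2, 3]}"#;
    let broken = r#"{"a": 1, "b": [2, 3]"#; // forgot the closing '}'

    // Delimiter-based syntax makes the breakage detectable.
    assert!(serde_json::from_str::<serde_json::Value>(ok).is_ok());
    assert!(serde_json::from_str::<serde_json::Value>(broken).is_err());
}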
Very informative and comedic video, thanks :) Keep up the good work TheScienceagen
I don't know how I got here guy, but I'm absolutely glad to be here - some slick vim keystrokes, nice editing and explanations. Keep it coming good sir.
been watching a lot of your videos. I feel like your newer 'hot take' videos on blogs are fine and all, but you're a pretty good teacher when you do this sort of content, and I'd like to see more of these informative/objective vids from you!
Anyone else currently looking at a JSON parser and your head constantly screaming "Jéson!"?
I really enjoy these comparison videos, especially with your humor.
Yes!!! Gold content!!! I have never heard any other dev talking about serialization encodings!!!!
Amazing video as always, please keep them up!
had to pause at the intro just to mention how smooth that subscriber plug was. ggwp
edit: what a great video
I would like to take a moment and say the videos you put out are excellent, both in educational and entertainment value. Very much appreciated!
✨BLAZINGLY AWESOME ✨
This was very interesting and something I hadn't really given second thought to when I'd use JSON
0:24 it's used by Tom The Genius' Jay-Diesel
There's a Handmade Seattle talk from 2021 called "context is everything" that shows you can get huge speedups if you can tighten your schema, and if you have full control over your tools that do the parsing (down to using native code).
JSON imposes a tax in comparison to raw bytes, but you can still get big gains if you can tighten down your schema.
Thinking about JSON/not-JSON in black and white will cost you orders of magnitude. You should consider whether you actually need the flexibility of wide-open generic constraints on either the parsing or the generation side.
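In that spirit, a hypothetical sketch (the field name and contract are made up) of what tightening the schema can look like: if you trust the producer and need one field, you can scan bytes directly instead of building a full JSON DOM:

// Extract the integer value of "count" without a generic JSON parser.
// Only valid under a tight contract: key present once, value a bare integer.
fn extract_count(payload: &[u8]) -> Option<u64> {
    let key = b"\"count\":";
    let start = payload.windows(key.len()).position(|w| w == key)? + key.len();
    let mut n: u64 = 0;
    let mut seen_digit = false;
    for &b in &payload[start..] {
        if b.is_ascii_digit() {
            n = n * 10 + u64::from(b - b'0');
            seen_digit = true;
        } else {
            break;
        }
    }
    seen_digit.then_some(n)
}

fn main() {
    assert_eq!(extract_count(br#"{"id":7,"count":1600}"#), Some(1600));
}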
insanely high quality video. thank you for all the information!
One of the easiest subscribing of my life, top notch quality
ty ty ty ty
I love this style of video! The live stuff is neat as well but I find these are much easier to focus on
W video, more more more. Too many videos out there aimed at beginners, very few aimed at intermediates (other than theo and fireship). Love the content. Love the streams. Good job dad.
I loved this idea, and this is the first time I've seen how slow JSON is; I never thought about it before
Thanks for the great ideas and videos. Please keep up the good work
I love watching your videos, it is useful, funny and interesting to me
yah yahh, prime is on a roll!
This is the best tech content on the internet if you ask me. Would be delighted if you pumped up even more of these.
Plus the streams where you write the code for these experiments are so much fun :)
What about gzip, brotli, etc? I don't worry too much about json size overhead as long as the client supports compression (especially brotli).
I think you fell asleep during the video
parsing json is slow
Would be interesting to see a comparison between Deku and MsgPack, since it sits somewhere in between binary and JSON
Dude this guy is on another level. Love your videos.
love it when the scienceagen comes in to do some blazingly fast research for us all
Great video, some interesting points mentioned here. Looking forward to seeing more
I just love your content, it's smart, useful, in dept and quite entertaining!
This kind of videos are just pure gold and a lot of educational content is in. Thanks for the education and entertainment at the same time. :)
MY GOD, this was so satisfying to watch and listen to
I'm saying it the french way from now on. Thank you for making my life richer!
Thank you primeagen now I occasionally giggle like an idiot while working with JSON because now I call it nothing but Gizmo
This is the best type of video. Make more plz.
JSML
I like this. Have my algorithmic signal. Thanks prime.
Exactly what I was looking for. Thank you!
I don't even work with Rust and NodeJS, but it's pretty fun your videos. Keep going!
I loved this video! Thank you Primeagen
JSON maximizes readability, whereas deku maximizes speed. Same tradeoff between Rust and Python
It's not apples to apples though: JSON is (generally) schema-less, while the simple binary format shown has to know the structure of the data before it can even parse it. If, say, fields are added later, an outdated client might just parse garbage values, not knowing how the new data is laid out. Yes, you can version data correctly yourself or use good tools like protobuf. And for the size complaints, JSON is usually served gzipped, so a lot of repeated " , {} [] won't really matter. I'd like a comparison with gzipped data; text compresses very well, so it will get closer to binary.
It really works! I clicked on the subscribe button, I saw the Json!
Yoda voice: Jason. You seek Jason… for some reason that’s what plays in my head.
This was great - we need more people to think about how efficient programs that run at scale are, as well as how efficient the data we send through the net is. There is this horrible trend where, as chips and network transmission get faster, we just send things using the least efficient method. I totally agree with your point, though, that being able to inspect the payload is helpful for debugging what's going on.
this was really beneficial and interesting. thanks for posting it. i learned a lot
Love these style videos. Keep it up!
Thank you. I feel like it's a fun break from the VIM content
It's like JSON is Recoome and binary is Goku.
Great video, fun to watch and good knowledge. 1Gs in the chat for that
Damn, when that wild Thomas Dolby appeared I couldn't help myself from laughing
algorithmic signal that I love the content -> SENT ( in JSON format :D )
{
"like": "true",
"subscribe": "true",
"rateContent": "5/5",
"remark": "god bless you"
}
Thank you for this video, Mr. Primeagen. I feel smarter now.
Great video, please do more subjects in depth. These nuggets of knowledge help us all! Thank you.
This one was a BANGER. Nice job prime