Jonathan Blow on Modern Graphics Programming

  • Published Nov 25, 2024

Comments • 234

  • @JonathanBlowClips
    @JonathanBlowClips  11 months ago +33

    Sorry about the audio on this video. It will be fixed in the next upload.

    • @HairyPixels
      @HairyPixels 11 months ago +1

      Can you post the dates of the videos? Are they months or years old?

    • @itsdanott
      @itsdanott 11 months ago +1

      Does that mean you'll re-upload the video? Because in this state it's hard to follow along at all :D

    • @Dex_1M
      @Dex_1M 11 months ago

      Great man, you cleanse my mind and my view of programming... I'm only a script-guy andy... I'm new to programming and I'm lucky to have encountered your channel before going into this journey... I love scripts because they are straightforward.

    • @HerrDoktorWeberMD
      @HerrDoktorWeberMD 11 months ago +2

      @@itsdanott I'm glad someone else pointed it out. I was gonna say, this feels like the most schizophrenic clip I've listened to in ages; the audio issues were actually making me frustrated with the cake and tea tangents.

  • @ssvenno
    @ssvenno 11 months ago +187

    As a graphics programmer I never considered any of this, but he's absolutely right. To my knowledge, Vulkan is the most Frankenstein of them all because it attempts to find a common abstraction for every vendor. In a perfect world where every vendor opened up their hardware, abstractions like Vulkan's notoriously awful descriptor sets would either not exist or be a lot simpler to implement.

    • @-Engineering01-
      @-Engineering01- 11 months ago +32

      You guys still write 1000 lines of code to draw a basic damn triangle? Man, you've got to be paid more than React devs at Netflix. That shit is the hardest thing I have ever seen in my life.
      When I saw this, I totally gave up and went to network programming. I'm so chill on the Linux terminal screen.

    • @Muskar2
      @Muskar2 11 months ago +7

      @@-Engineering01- Huh?

    • @BrandonFurtwangler
      @BrandonFurtwangler 11 months ago +25

      You act like those abstractions aren't there for a reason. In this hypothetical world where every vendor opened up their hardware, do you think graphics programmers are going to learn dozens of different ISAs and implement engine backends for dozens of GPU-specific APIs? Seems like you'd still just end up with some GPU-independent abstraction...

    • @ssvenno
      @ssvenno 11 months ago +12

      @@-Engineering01- You're lucky if it's just 1000 lines. If you learn it long enough, you'll develop Stockholm Syndrome for the API.

    • @liquidotacita7027
      @liquidotacita7027 11 months ago +7

      Maybe it is just me but concepts in Vulkan were very straightforward for me. Yes, it is a lot of boilerplate at the beginning but the way to abstract them away for a game felt very natural.

  • @darkwoodmovies
    @darkwoodmovies 11 months ago +128

    I started programming about 15 years ago, and I feel like I was right on the edge of "old school" and "modern" development. I kinda wish I had the opportunity to be a software engineer during the days of actually needing to implement stuff and not just being a glorified infra engineer stitching together a bunch of NPM libraries.

    • @phee3D
      @phee3D 11 months ago +14

      You're comparing old school programming to JavaScript or web based programming? That's not a great comparison for understanding how modern programming differs from the old days. A more reasonable comparison would be doing the same thing, for example making a game in 2005 vs 2020. Many people in 2020 will probably use hardware rendering, which comes with its own complexities and setup, and also have to create much more elaborate, generalized and complicated systems. Many people still do that even though we have things like unity, so if you're really interested and have the skills, you can still make things from scratch which comes with many benefits.

    • @gabereiser
      @gabereiser 10 months ago +5

      Jokes on you, we've always been digital plumbers.

    • @OpenGL4ever
      @OpenGL4ever 8 months ago

      @@gabereiser Not true.

  • @botbeamer
    @botbeamer 11 months ago +32

    As a C graphics programmer, 100% agree on this

  • @adammontgomery7980
    @adammontgomery7980 11 months ago +61

    It took me weeks to learn that OpenGL is basically a state machine, and you're just calling functions that turn features on and off. I wouldn't say it's simple, but I feel like that should have been the first line of every beginner tutorial I did.

    • @twistedshake5500
      @twistedshake5500 11 months ago

      Can you explain further?

    • @turolretar
      @turolretar 11 months ago +1

      The best resource I found for OpenGL is the Khronos Group wiki; I think they mention early on that OpenGL is a state machine. As a side note, every concept there is explained really well. Would recommend it over any tutorial or book.

    • @TheFlynCow
      @TheFlynCow 11 months ago +27

      It IS explained in the first chapter of most if not all popular OpenGL learning resources. Khronos' "What is OpenGL" page also explains this.

    • @SirusStarTV
      @SirusStarTV 11 months ago +1

      You really need to be good at low-level stuff (pointers, memory layout, structs, image formats, file I/O) to use OpenGL.

    • @bobbrysonn
      @bobbrysonn 5 months ago +2

      In OpenGL's defense, they do state this fact in the very first chapter of the learnopengl tutorial: that it's all basically a state machine.

  • @JrIcify
    @JrIcify 11 months ago +215

    This is why everybody ran for the hills and started using web browsers for everything. On a short term basis it's easier to deal with the insanity of web browsers than the rocket science of native applications.

    • @CianMcsweeney
      @CianMcsweeney 11 months ago +27

      100%. We give out about web devs, but there are valid reasons why things devolved into the current state of affairs; drivers and graphics APIs ruined native development. It's still doable if you have the know-how to avoid most of the complexity, but it's far from ideal.

    • @ex1tium
      @ex1tium 11 months ago +19

      Then somebody came up with React and thought it was the best thing invented in the history of the universe.

    • @JrIcify
      @JrIcify 11 months ago +4

      @@CianMcsweeney Yeah you can still do things if you understand the full "ecosystem" of garbage but even then it's not going to be very portable without abstracting the entire machine to death anyway.

    • @-Engineering01-
      @-Engineering01- 11 months ago +3

      And nearly 95% of applications, aside from games, can be done on the web without noticeable performance problems. Even a lot of 3D applications can be written thanks to WebGPU.
      I mean, I don't care about games at all (don't play them much), so the web is OK for me; I even play basic games in my browser only.

    • @CianMcsweeney
      @CianMcsweeney 11 months ago +7

      @@-Engineering01- that would be true if web apps were actually written well, most of them are not

  • @JonathanTheZombie
    @JonathanTheZombie 11 months ago +36

    I’m a software engineer who uses OpenGL for graphics in our company’s software. In my opinion, it is completely impossible to gather the graphics rings of power and bind them in darkness. Instead, I have resigned myself to the fact that I will need to just know each shading language and each use case, and just do the rework a bunch of times. There’s no better solution unless there’s only one GPU manufacturer.

    • @defeqel6537
      @defeqel6537 10 months ago +1

      You already use a bunch of shading languages, which compile to SPIR-V, with Vulkan.

  • @martin128
    @martin128 11 months ago +18

    Jonathan Blow on cake as palate cleanser

    • @nolamo1496
      @nolamo1496 6 months ago

      Lol

  • @ExpensivePizza
    @ExpensivePizza 11 months ago +97

    The worst part is that these hardware manufacturers don't even get the competitive advantage they seek. What actually happens is they just create significantly more work for engine developers, who have to support the multiple shader APIs. Basically nobody wins and everyone else loses.

    • @CianMcsweeney
      @CianMcsweeney 11 months ago +20

      Agree. A large reason why many games were never released for PC was how difficult it was (and still is) to ship bug-free PC releases, due to the driver and graphics API mess.

    • @doltBmB
      @doltBmB 11 months ago +6

      @@CianMcsweeney PC has the biggest game library out of any platform, you are talking shit.

    • @Salantor
      @Salantor 11 months ago +23

      @@doltBmB And yet there are still hundreds of console games that were - and probably will be - never ported to PC.

    • @CianMcsweeney
      @CianMcsweeney 11 months ago +2

      @@doltBmB I'm talking about the releases most average people want to play. Of course there's loads of great indie games on pc. My point was, that graphics vendors have made it unnecessarily difficult to develop for PC

    • @doltBmB
      @doltBmB 11 months ago +2

      @@CianMcsweeney you live in a consolified bubble if you think people only want to play AAA third person shooters

  • @zacharychristy8928
    @zacharychristy8928 7 months ago +7

    I totally understand what Jon is saying and largely agree, but I feel like his analysis tends to stop prematurely at the technical level.
    Are AAA games really hampered today primarily by old technical debt? I'm sure that's a major factor, but I would argue it's a similar problem with movies. When 100 million dollars are being thrown at a project, no investor is ever going to take the liability of giving artists the freedom they need to make anything but the most mass-appeal trough slop you've ever seen. All while trying to save money by using new developers desperate for work, or offshoring pieces of their development entirely.
    In this kind of environment, is it really a surprise nobody takes the time to learn the intricacies of their machines? It's not out of fear or ignorance that people don't know their stuff. I would argue it's almost entirely a form of market failure. We racked up too much technical debt and traded quality for development speed, and now in many ways doing things well has instead become a distinguishing quality trait. The pendulum eventually HAS to swing back that way.
    And no, I don't agree with Jon's "Bronze Age collapse" model. I think we'll be able to dust off the C manuals, establish better hardware standards or all switch to Rust before we just pack it in and decide nobody can use computers any more, lol.

  • @mikeyjohnson5888
    @mikeyjohnson5888 10 months ago +5

    My first introduction to "programming" was HTML and Visual Basic. I was on the cusp of the design-philosophy changeover you mention. When I first started, it seemed the only real way to get anything done was to target older machines, where you had direct access to hardware. Things had started to be abstracted so far that you needed to learn dozens of things just to render something on the screen. It was really disheartening. I ended up pushing towards Ruby and Python for a little bit. Everyone at the time was heralding them as the future of programming. Then I got stuck on Java for a little bit. Over the years I ended up building skills for DevOps rather than explicitly programming, so I pushed for a career in IT. I do wish I had been exposed to more bare-metal programming first. I would have fallen more into software development had that been the case.

  • @souper775
    @souper775 10 months ago +22

    Vulkan and DX12 are considered modern graphics APIs. DX11 (and previous), OpenGL, and Metal are considered more as legacy APIs. The main difference between modern and legacy is that with modern APIs the burden of syncing the CPU and GPU is put in the hands of the developer. Although this leads to more complex code bases and a steeper learning curve, the good thing is that we can have (almost) complete control over resources shared between the CPU and GPU. This is huge for performance possibilities! Keep in mind, having control like this is also a double-edged sword. Not everyone will know how to write code that manages resources correctly.
    Additionally, these changes help the embedded world once it catches on (embedded always seems to lag so far behind, so who knows how many years until OpenGL ES is phased out).
    If you are new to learning graphics APIs in general, start with the legacy ones first because they handle the hard parts of memory management and GPU sync. Starting with DX12/Vulkan will be much more challenging, and you will end up learning more about memory management than the graphics-related aspects. But if that's what you want, go for it!

    • @nolram
      @nolram 10 months ago +8

      Metal isn't a legacy API. It is also an explicit API, it's just not as verbose as Vulkan.

    • @pearljamrules1
      @pearljamrules1 3 months ago

      Metal isn't a legacy API. It's everything Vulkan should have been. Its only major issue is that it's only available on Apple platforms. There is a reason why macOS doesn't have the desktop-app problem that Windows/Linux have: it's extremely easy to make desktop apps on macOS.

  • @AndarManik
    @AndarManik 28 days ago +2

    The original purpose of computation was to reduce friction. You program a system so that you can solve a problem which previously had high friction. Bank transfers or company document storage were already solved with physical technology, but now we are able to do them frictionlessly. People saw how software made these physical solutions frictionless, and then abstracted the idea: what if the purpose of software isn't to make originally physical systems faster, but to make the process of creating software itself frictionless?
    Many of the tools we developed in the past were motivated by solving these physical problems, and thus needed to provide what was needed to make something frictionless. What has happened in recent years, say the last decade and a half, is that many of the tools we had in the past have been replaced by ones which are themselves "frictionless". The problem is that these new tools lack the ability to solve the real physical problem, which was to make those originally physical systems frictionless.
    Basically, we took the tools which enabled us to make frictionless systems, turned our attention to the tools themselves, and lost sight of actually solving the problem.

  • @MarkMark
    @MarkMark 11 months ago +22

    “Everything is too big and too horrible.” lol 2020s in a nutshell.

    • @JohnnyThund3r
      @JohnnyThund3r 2 months ago

      Yet open source keeps winning!
      Blender isn't bloated and horrible, Godot isn't bloated and horrible. The world is waking up to the fact that open-source software is just better due to its sole desire to give the end user the best experience over monetary considerations.

  • @interdimensionalsailboat
    @interdimensionalsailboat 11 months ago +11

    The cuts in the video are driving me nuts.

  • @gruntaxeman3740
    @gruntaxeman3740 5 months ago +3

    The biggest obstacle to fixing graphics APIs is that Apple, Sony, Microsoft, NVIDIA etc. don't want to fix them. Everyone wants to push their own library on the platform they control.
    So the options are:
    1. Write graphics for the browser
    2. Use some graphics API abstraction layer.

  • @numoru
    @numoru 11 months ago +18

    Man my nigga I dig this dog. Mf thought deep and shared some real shit ,bigsub

  • @delibellus
    @delibellus 11 months ago +5

    I was thinking of learning DOS because, as a hobby, I would like to develop games in C for a simple system that isn't subject to terrible change. I'm happy to go down that path, but the fact that I couldn't find a sweet spot like that on a relevant modern platform has made me feel very disappointed.

  • @Synky
    @Synky 10 months ago

    OMG this is so true, and even though I don't make video games, it applies to web dev and software development in general.

  • @CharlesVanNoland
    @CharlesVanNoland 11 months ago +6

    Gamedevs don't put up with all of the APIs and issues; they resort to using an existing game engine instead, which is a bummer for the reasons JB outlines here: all the complexity the undertaking entails.

  • @Olodus
    @Olodus 11 months ago +5

    I come here purely for the Jon Blow cake reviews, I don't know what all y'all are talking about here. Like, is Jon known for anything other than his amazing cake reviews?

  • @MikePaixao
    @MikePaixao 11 months ago +12

    I started writing a new kind of forward predictive renderer. I don't see a need for more than simple compute shaders to crunch numbers to utilize the full GPU, so I decided instead to leverage how AIs like NeRFs and Gaussian-splatting neural networks calculate graphics. I honestly think we are heading in this direction. It will be more about how many numbers you can crunch at the end of the day :)

    • @lithium
      @lithium 11 months ago +14

      Cram more buzzwords in next time.

    • @MikePaixao
      @MikePaixao 11 months ago +2

      @@lithium forgot my trusty circular quadratic algebra! 🤣

    • @lostsauce0
      @lostsauce0 11 months ago

      Cool idea! I'm curious how it develops

    • @CharlesVanNoland
      @CharlesVanNoland 11 months ago +3

      But GPUs have hardware dedicated to rasterizing triangles, and specialized cache organization and mechanisms for texture accesses, among other things, and that's always going to be faster than a shader core executing algorithms to do the same work.

    • @oraz.
      @oraz. 10 months ago +1

      What did you write? It sounds cool.

  • @HartleySan
    @HartleySan 11 months ago +4

    I miss the "old days" of programming.

    • @youtubesuresuckscock
      @youtubesuresuckscock 10 months ago

      Are you a psychopath? It was worse than it is today.

  • @edhalferty
    @edhalferty 10 months ago +2

    GPUs are actually very simple. Writing a direct driver for a particular GPU wouldn't be much harder than using Vulkan. Vulkan is basically the lack of a graphics library; it's just a thin abstraction layer providing the things that most GPUs have in common.

  • @robinpage2730
    @robinpage2730 11 months ago +6

    RISC-V is developing an open-source GPU, so the problem of shading-language access may be alleviated in the near future.

    • @Nightwulf1269
      @Nightwulf1269 11 months ago +4

      Even RISC-V cores are used in NVIDIA GPUs nowadays. But that doesn't necessarily mean that any of it is exposed to the user, aka the programmer. That's why I like embedded devices from small vendors... they include hardware documentation with every single bit in every single register described. The downside is: if I owned such a device and wanted to take full advantage of it, writing everything besides the Linux kernel and GNU tools myself would be a task for the next ten years. As mentioned in the video: things got WAY too complex.

    • @Roxor128
      @Roxor128 11 months ago +2

      There are already a couple out there. GPLGPU is an implementation of a 1990s-era GPU (so it's all fixed-function stuff), and Nyuzi and MIAOW are manycore compute cores (the latter using AMD's Southern Islands ISA).
      Oh, and of course, there are HDL re-implementations of the GPUs from classic games consoles for use in FPGA projects like MiSTer.

  • @salmaofinlandes6793
    @salmaofinlandes6793 11 months ago +41

    Modern Graphics Programming and its consequences have been a disaster for the human race

  • @smallbluemachine
    @smallbluemachine 11 months ago +15

    The only game in town is OpenGL, and it's also the one everyone is waiting to die. The API scene is so bad that OpenGL cannot die.

    • @4.0.4
      @4.0.4 10 months ago

      Apple stopped supporting it, right? That was what killed Adobe Flash.

    • @megamozgs9959
      @megamozgs9959 10 months ago +2

      It is already marked as a legacy API by almost everyone. Nobody uses OpenGL anymore.

    • @ZombieLincoln666
      @ZombieLincoln666 9 months ago

      Did you time travel here from 2004?

    • @smallbluemachine
      @smallbluemachine 9 months ago +4

      Vulkan has more problems than OpenGL does. I was sent from 2004 to save you all.

    • @ZombieLincoln666
      @ZombieLincoln666 9 months ago +1

      @@smallbluemachine What about DirectX?

  • @lucy-pero
    @lucy-pero 11 months ago +11

    It's true. It's sad, but it's the reality of the situation. Every GPU/system company wants control over the process, so all of them have their own GPU language. There is no incentive for them to support or care about a universal language. That's the only reason we have Direct3D, Vulkan, OpenGL, Metal, and all the other proprietary console graphics APIs, each with their own shading language.

    • @dealloc
      @dealloc 11 months ago +4

      I don't see how this is different from CPUs. There are already shader languages that compile for different GPUs, depending on your toolchain: GLSL, HLSL, WGSL and Futhark, just to name a few.

    • @CianMcsweeney
      @CianMcsweeney 11 months ago +3

      @@dealloc It is different from CPUs. AMD and Intel share the same instruction set, x64/x86; on desktop only Apple uses a different architecture/instruction set. Sure, microcode is different, but the programmer doesn't need to care about that. If there were some common hardware interface or instruction set shared between GPU vendors, we wouldn't be in this situation. The ideal futuristic option is to not need a GPU at all, and to rely on expanding the width of CPU vector registers to make GPU ops doable on CPUs at good speed.

    • @quaker5712
      @quaker5712 11 months ago

      ​@@CianMcsweeney That would be awesome and make draw calls far less expensive surely.

    • @CianMcsweeney
      @CianMcsweeney 11 months ago

      @@quaker5712 potentially yes, would be a massive change however so not sure how likely it is to happen

    • @dealloc
      @dealloc 11 months ago

      @@CianMcsweeney > on desktop only Apple use a different architecture/instruction set
      Apple Silicon CPUs are RISC-based SoCs, like other ARM chips, which are deployed on more than just Apple's devices.

  • @SuSh0xka
    @SuSh0xka 6 months ago

    Jonathan, I work for Imagination Technologies (PowerVR GPUs). We made the ISA public for our latest two generations (some people even managed to reverse engineer the SGX on the PS Vita).
    Unfortunately, even if we manage to expose ASM directly to programmers so they can run shaders the way they want, very few brave graphics developers will attempt it. Also keep in mind that, not only between different families of GPUs but even within the same family, you can write code on one PowerVR BXM core that will basically hang the GPU on a slightly different permutation (just because an instruction tries to access or do something that is not supported). This will only work on closed systems where you know the hardware is never going to change.

    • @boptillyouflop
      @boptillyouflop 2 months ago +1

      I'm reminded of the Nintendo 64's RSP (an 8x16-bit vector unit), mostly used for vertex processing, triangle setup and sound mixing. Games would normally not use custom RSP code but rather the pre-written "microcodes" supplied in the development kit... except Factor 5, whose Indiana Jones and the Infernal Machine showed their demoscene roots, used custom RSP code and rendered in high resolution! ...and it came out very late in the N64's life cycle, and was not one of the top 50 selling N64 games... It seems that for this kind of game system, it's more important to get good performance from middling code than great performance from exceptional code...

    • @SuSh0xka
      @SuSh0xka 2 months ago

      @@boptillyouflop I thought the N64 RSP microcode was not available (at least at the beginning).

    • @boptillyouflop
      @boptillyouflop 2 months ago +1

      @@SuSh0xka Oh... Well, I don't know, I guess. You have a point... They must have only been supplied the pre-compiled RSP .o object files (which the SDK code would copy into the RSP instruction RAM to do vertex and audio processing).

  • @kajanswat6206
    @kajanswat6206 11 months ago +20

    I'm a medical laboratory technician, but I don't know why: I really like everyone in the programming world and appreciate the contributions they make, by any means, to the world.

    • @StarContract
      @StarContract 11 months ago +10

      Along with our contributions, you should also count our negative impact. For example, Twitter is a massive shit stain on humanity; we made disinformation commonplace; we are ridiculously overpaid and bring prices up for everybody; we facilitated the mass distribution of digital pornography, which is an epidemic; we made scamming a viable career choice; and the list goes on.

    • @StarContract
      @StarContract 11 months ago +5

      While a medical laboratory technician's work might be more straightforward than solving engineering problems, a functioning society is fucked without your team, while it would do OK without my team.

    • @xking21
      @xking21 11 months ago +2

      @@StarContract Still more positives than negatives.

    • @-Engineering01-
      @-Engineering01- 11 months ago

      We created TikTok to lower the intelligence of human beings.
      We created WhatsApp, Google and Facebook to gather and sell all of your data, messages, pictures and videos to governments and companies.
      We created Microsoft Windows, Android and iOS to spy on you, collecting all of your private data and handing it to the CIA and FBI.
      Do you still respect us?

    • @Muskar2
      @Muskar2 11 months ago +1

      @@StarContract If the net benefit were so terrible, there would be no hesitation to combat the industry. Let's not pretend our industry doesn't bring any value.

  • @angelopesce2520
    @angelopesce2520 11 months ago +6

    Your analogy is flawed.
    It's not that when C was made, someone made a compiler for ALL architectures - far from it. C was made for a specific reason, a specific need, on a specific platform. Then, over the years, it prevailed because it filled a gap, and it prevailed with lots of vendors implementing it in their own way - gradually then becoming a true standard one could rely upon, and so on and so forth.
    Making the same for a GPU would be trivial; you could make your own toy language for GCN, for example, and call it G. But it probably would not serve a big need, and thus wouldn't spread and become a standard, etc.

  • @Dom-zy1qy
    @Dom-zy1qy 11 months ago +12

    Gaining a fundamental understanding of everything in your stack - all the tools and concepts you use to build things - is very time consuming. For example, if I wanted a fundamental understanding of what projection in 3D graphics actually does and what projection matrices are, I would need to study linear algebra.
    Studying linear algebra doesn't necessarily require trigonometric knowledge, but there's a fundamental relationship between trig, projection, inner products and magnitude, so you really should.
    Eventually, if you're like me, you forget what you were actually studying for in the first place.

    • @JohnnyThund3r
      @JohnnyThund3r 2 months ago

      This is a real issue... I mean I'm literally here because I was looking for Godot Tutorials, now I'm watching videos on graphics APIs... like what am I even doing here?! 😅

  • @Johan-rm6ec
    @Johan-rm6ec 10 months ago

    Gone are the days of bare-metal coding like on the C64 and Amiga. Oh, how glorious were those days ;)

  • @oraz.
    @oraz. 10 months ago

    There are compatibility libraries like bgfx; maybe that's the best way currently.

  • @dahahaka
    @dahahaka 11 months ago +12

    Isn't SPIR-V basically a universal shading language?

    • @nolram
      @nolram 11 months ago +1

      It's more of an assembly format.

    • @pokefreak2112
      @pokefreak2112 11 months ago

      It's not officially supported by macOS or the web (wgpu) AFAIK, so you still need to transpile to MSL and WGSL respectively.

    • @dahahaka
      @dahahaka 11 months ago +1

      @@nolram Sure, but since we have SPIR-V, what hinders us from coming up with a new language that compiles to SPIR-V and uses existing toolchains to then translate to specific implementations?
      I know it sounds a bit ridiculous, but we do much crazier stuff nowadays.

    • @nolram
      @nolram 11 months ago +3

      @@dahahaka Good news: that is, partially, what we are already doing! Although SPIR-V is mainly (and pretty much only) consumed by Khronos standards (OpenGL, Vulkan, OpenCL), it is very easy for shader compilers to target. Even the Microsoft DirectX Shader Compiler, which compiles for DirectX, can also output SPIR-V, which is why most Vulkan applications use HLSL, as that is the input language to DXC.

    • @Roxor128
      @Roxor128 11 months ago +4

      I'd say it's more like bytecode for the Java Virtual Machine. Compile your Java (or other language) program to bytecode, then load that up in the JVM to actually run it, and it need not know the specifics of the architecture it's actually running on. Same idea for SPIR-V, except with compilation rather than interpretation. You compile your GLSL shader to SPIR-V and save it to disk; then at load time the SPIR-V version gets handed to the graphics driver, which compiles it to whatever the GPU wants.

  • @Hoptronics
    @Hoptronics 5 months ago

    I kinda phased myself out. Back in the day I was in tight with OpenGL, DirectX and all that. Then I had some life changes and was out of anything programming- or even computer-related for about 15 years. Now the tools are the same but changed, and relearning all the tools and modern best practices is pretty daunting.

  • @zandernoriega
    @zandernoriega 10 months ago

    100% on point

  • @gabrielshansen
    @gabrielshansen 11 months ago +4

    @JonathanBlow So, basically, the reason software development is so abstracted and 'crappy' is the need to constantly - marketing-driven, profit-driven - evolve the underlying hardware and systems. I use the Amiga ecosystem as an example of what happens when you have a hardware base that stays more or less the same over a long time period. The creativity and amount of problem-solving that went into making extraordinary experiences with very constricted technology was astounding. And I question whether this could happen again IF hardware weren't so rapidly evolving. Software that actually caters to the hardware 100% takes far longer to develop than the hardware itself. Very interesting take there!

    • @lostsauce0
      @lostsauce0 11 months ago +1

      Constantly moving to the next shiny thing while never taking time to refine.
      This is why I LOVE stuff like PICO-8. I think the concept of virtual-only consoles is really clever. These types of self-imposed restrictions inspire creativity. I also love seeing new physical hardware like the Playdate (and its included dev environment).

  • @HairyPixels
    @HairyPixels 11 months ago +3

    Why isn't there a shader-language transpiler, if having so many shader languages is such a big problem?

    • @stephenlim7564
      @stephenlim7564 11 months ago +4

      Transpiling doesn't solve anything if the semantics of the shader languages are drastically different. Transpiling is only possible if there are semantic equivalents you can rearrange syntax to make.
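For example (a toy sketch, not a real tool): the purely syntactic renames are trivial, and it's the semantic mismatches that bite. GLSL's `vec4` maps cleanly to HLSL's `float4`, but GLSL's matrix-times-vector `*` has no one-token equivalent, because in HLSL `*` is component-wise and you need `mul()`:

```python
import re

# Toy GLSL -> HLSL type renaming: the easy, purely syntactic part.
RENAMES = {"vec2": "float2", "vec3": "float3", "vec4": "float4",
           "mat4": "float4x4"}

def toy_transpile(glsl: str) -> str:
    return re.sub(r"\b(vec[234]|mat4)\b",
                  lambda m: RENAMES[m.group(1)], glsl)

print(toy_transpile("vec4 p = m * vec4(pos, 1.0);"))
# -> "float4 p = m * float4(pos, 1.0);"
# ...which silently miscompiles if m is a matrix: in HLSL '*' is
# component-wise, so a real transpiler has to type-check the expression
# and rewrite it to mul(m, float4(pos, 1.0)).
```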

    • @nolram
      @nolram 11 months ago +9

      There is. Loads. Even the DirectX Shader Compiler, which is open source, supports SPIR-V output.

    • @HairyPixels
      @HairyPixels 11 months ago

      @@nolram Then what is Blow's complaint about? Seems like this problem is solved.

    • @nolram
      @nolram 11 months ago +4

      @@HairyPixels I DON'T KNOW! No idea what he is complaining about. I get his point about modern rendering being overly complex, that makes sense, but most of the rest of this rant seems pretty nonsensical to me... Vulkan can be used to counter most of his points about vendor lock-in.

    • @baki9191
      @baki9191 11 months ago +5

      @@HairyPixels What's the complaint? That's like saying the problem of game development is solved because engines exist. Game development requires low-level control; he doesn't want a transpiler, he wants to communicate with the GPU directly like he does with the CPU.

  • @anonymouscommentator
    @anonymouscommentator 11 months ago +3

    Can anybody here explain why we actually need graphics drivers? My CPU doesn't need any drivers. If I have a program that says to calculate 1+1 on the CPU, the CPU understands it directly. Why is it that when I tell my GPU to calculate 1+1, it suddenly needs a driver?

    • @TheOnlyJura
      @TheOnlyJura 11 months ago

      Because the GPU has its own instruction set, and you are sending the instructions via your CPU. So the bare minimum your CPU needs to know is how to send "some arbitrary" data to the GPU, which is what the driver does under the hood.

    • @metalstarver642
      @metalstarver642 11 months ago +3

      @@TheOnlyJura But the CPU also has its own instruction set; we just compile to a different target. We could very well do the same for GPUs, and even have different languages. We just need an open specification for the hardware; then anybody could make a compiler for it.

    • @anonymouscommentator
      @anonymouscommentator 11 months ago +1

      @@TheOnlyJura My CPU has its own ISA and my GPU has its own ISA. C programs are compiled to machine code, and shaders are compiled to run on the GPU's ISA. PCIe is a clearly defined standard where no driver help should be needed.

    • @turolretar
      @turolretar 11 months ago +2

      Because GPU instructions are purposely hidden away in the driver. But I'm not an expert; I'm sure there are other uses for it too.

    • @TheOnlyJura
      @TheOnlyJura 11 months ago +1

      @@anonymouscommentator Your network card needs a driver too.

  • @gsestream
    @gsestream 2 months ago

    You don't need a shader language; you just need a generic, pre-programmed PBR shading system. Yes, like the old generic texture raster systems, but with T&L.

  • @channel11121
    @channel11121 11 months ago +6

    Why does it skip every 10-or-so seconds?

    • @Saganist420
      @Saganist420 11 months ago +29

      because proprietary software is trash and things are too complicated nowadays

    • @JonathanBlowClips
      @JonathanBlowClips 11 months ago +8

      Bug when exporting.

    • @Gabemeister1201
      @Gabemeister1201 11 months ago +3

      Because he's eating cake.

  • @pikuma
    @pikuma 11 months ago

    Yes.

  • @mintx1720
    @mintx1720 11 months ago +3

    I don't think this is true; the real problem is that your shader fix would suck about as much as GLSL. Just look at how many tries it took people to get programming languages right. When we finally got Crablang, some people thought it was so good that they made WGSL with 'Vec4', which nobody uses.

    • @lostsauce0
      @lostsauce0 11 months ago

      Vec4 is the ultimate in optimizing for genericism over the most common case.
      Like, I get it. It makes perfect sense. But it doesn't feel good to use, so it doesn't fuckin matter.

  • @doltBmB
    @doltBmB 11 months ago +6

    Low level API's have been an absolute disaster.

    • @botbeamer
      @botbeamer 11 months ago +2

      High level you mean

    • @doltBmB
      @doltBmB 11 months ago

      @@botbeamer No.

    • @Muskar2
      @Muskar2 11 months ago +4

      Most APIs in general are bad. They're typically full of unnecessary abstractions, tie you into an intrusive mental model, and don't offer granular steps from quick prototyping down to controlling as much yourself as you need to get a production release done. That said, it takes many iterations to make a great API. Every time it takes a first-time user more than a few minutes to figure out how to use the API to achieve what they need, that's a symptom of bad API design in my book. Don't settle just because the norm is bad.

  • @brhvitor4
    @brhvitor4 11 months ago +8

    I honestly think it's not comparable: C had to deal with a few different data types/formats, versus the massive complexity of modern graphics shaders.

    • @Ipanienko
      @Ipanienko 11 months ago +1

      C runs on FPGAs. It's not just a few different data types.

    • @brhvitor4
      @brhvitor4 11 months ago +2

      @@Ipanienko Your point underscores mine. C alone doesn't cover all FPGA data types; it relies on HLS for translation. Yet HLS is only somewhat encompassing, leaving us developers to handle specific FPGA requirements manually. In the end, modern graphics shading is just more extensive.

    • @Ipanienko
      @Ipanienko 11 months ago +1

      @@brhvitor4 That's not my point. You are still vastly underselling the flexibility of C.

    • @megamozgs9959
      @megamozgs9959 10 months ago +1

      @@Ipanienko Well, there is already an intermediate format for shading languages, called SPIR-V. I don't know if macOS supports it, but you can just convert this format to MSL and you are good to go.

  • @puncherinokripperino2500
    @puncherinokripperino2500 11 months ago +1

    What's with the sound?

  • @rusmaakatupal4723
    @rusmaakatupal4723 8 months ago

    The electronics engineers and the embedded software dev community need to step up and put in place an open-source collaboration between independent graphics hardware makers, graphics driver programmers, and the graphics programmers on the "front end". It really sounds utopian, and I am so sorry for everyone who is impacted by this shit situation.

  • @kingemhyr
    @kingemhyr 11 months ago

    I wonder how I can apply this to LSPs

  • @0Camus0
    @0Camus0 11 months ago +21

    Honestly this is a poor take, IMO.
    What do you mean by "locking up shading languages"? You know that HLSL can be compiled for Vulkan, right? And GLSL runs on any vendor right now, except maybe Apple. So I don't understand this complaint.
    Secondly, do you know that OpenCL exists? It's the closest thing to C, and it's more of the general compute platform you are talking about. It also runs on any vendor (again, except Apple), so, again, what's the problem here?
    If you want a simple API for graphics, you can always use OpenGL or D3D11; they still work, you know. And as a developer, I am grateful that Dx12 and Vulkan exist, because now you can optimize your app or game to take full advantage of the hardware. In the past we had to depend on the vendor to optimize the driver for specific games; it was worse than now, so again I don't understand the rant.
    Yes, Dx12 and Vulkan ARE complicated, and there is a reason: there is no free lunch. GPUs are complicated, and not because of some Nvidia conspiracy to make them complicated; it's just progress. They are required to do thousands of things now that were not a thing in the past: video encoding/decoding, general compute, AI inference, raytracing, etc. You can write a basic interpreter for GLSL if you want; you can also go and write your own open-source driver for Intel if you want (like the i915).
    Finally, are you advocating for open-source drivers? Or open-source RTL, like the design of the GPU? Who will fund this? Who will validate this? It's not a Raspberry Pi, you know. Seems to me you are trying to reduce everything to: complicated bad, simple good.

    • @youtubesuresuckscock
      @youtubesuresuckscock 10 months ago

      Like almost everything this clown says, it's a non-issue that no one else is hung up on. This doesn't matter at all, and it's especially funny coming from someone working on pissant indie games that don't even NEED a modern 3D API to begin with.

  • @theaugur1373
    @theaugur1373 11 months ago

    Will AMD ROCm fit in here somewhere? It’s open source, but I’m unaware of its use outside of deep learning applications.

    • @4.0.4
      @4.0.4 10 months ago

      I'm unaware of its use even in the field of deep learning, lol. Everything is CUDA.

  • @magnuswootton6181
    @magnuswootton6181 11 months ago +2

    I want to see more 3D menu systems and title screens. Make the interface fully extruded 3D and rotating around; I would like to see that. It's not done that often!!!

    • @4.0.4
      @4.0.4 10 months ago

      You want that Navi computer from Lain, don't you?

    • @magnuswootton6181
      @magnuswootton6181 10 months ago

      @@4.0.4 Just saying that title screens and menus are always 2D; it would be cool to see a 3D one, or lots of them. :)

    • @defeqel6537
      @defeqel6537 10 months ago

      Might work in VR, but poorly on a flat screen.

  • @ShadowKestrel
    @ShadowKestrel 11 months ago +2

    This is something I am quite excited about the future of wgpu for: it compiles down (quite effectively, too; an actually good use of Rust's absolute overkill of 'zero-cost' abstractions) into basically whatever backend API you want. Vulkan, DX, Metal; if you want to support antique toasters it can do OpenGL no problem, and even WebGL and WebGPU (confusingly, wgpu is also the core of Firefox's WebGPU implementation; Mozilla moment, I guess). And it's a heck of a lot nicer to write than Vulkan, speaking from experience. While I've seen bindings for other languages out in the wild, it is still Rust-native and depends on Rust's features pretty deeply, so if you don't want to use Rust, you've still got a while to wait, most likely.
    Also, grrr, CUDA >:{ One of the big names at Intel (I don't remember exactly who) said the world should push towards open standards for how to interact with a GPU, specifically calling out CUDA. A rare hardware-manufacturer W? It will be interesting to see how much Intel follows up on that in their own product lines.

  • @LordOfCake
    @LordOfCake 11 months ago +4

    What's his take on WebGPU?

  • @derpysean1072
    @derpysean1072 11 months ago +1

    The thought of manufacturers designing their products for control. My naive mind can only think of one solution for this: anti-consumerism.
    But then, after buying second-hand laptops, I still contributed to the problem more or less.
    I don't fucking know what the solution is anymore, apart from DIY microchips and open-source projects.
    And trying to source decent equipment is a shitshow already, as is Android (Google).

  • @Eek_The_Cat
    @Eek_The_Cat 11 months ago +10

    I can understand and accept the sentiment, but hardware nowadays really is too complicated to be accessed directly.
    Still, open-sourcing specs, drivers and frameworks would make everyone's life so much easier.

    • @ArneChristianRosenfeldt
      @ArneChristianRosenfeldt 10 months ago

      Win 3.11 came with drivers. Only a few chosen ones could improve the driver on the N64 (microcode).

  • @Saganist420
    @Saganist420 11 months ago

    I agree.

  • @nolram
    @nolram 11 months ago +9

    But... the hardware manufacturers don't make the standards. Different consortiums make the standards, like Khronos or the DirectX group at Microsoft.
    So what is John's point, since the manufacturers don't make the evil he complains about?

    • @monkev1199
      @monkev1199 10 months ago +1

      Take a quick look at the khronos members list and come back to me about the hardware manufacturers not being involved

    • @nolram
      @nolram 10 months ago +2

      @@monkev1199 They are involved, certainly, but there is no single one of them that makes the standards, which is what John is complaining about. It's not some exclusive thing per vendor.

  • @syntaxed2
    @syntaxed2 5 months ago +1

    The poor state of modern APIs is kind of an indication that capitalism isn't quite giving us better products; competition seems to affect only the hardware.
    Same with modern entertainment: competition hasn't worked to give us better Alien, Terminator, or Predator movies. It's all garbage. Why isn't competition working?

  • @renanmonteirobarbosa8129
    @renanmonteirobarbosa8129 11 months ago

    GLaDOS gave you cake? 😂

  • @jak3legacy
    @jak3legacy 2 months ago

    Isn't Vulkan open source?

  • @doofmoney3954
    @doofmoney3954 a month ago

    Intellectual Property ruins everything per usual

  • @PixelOutlaw
    @PixelOutlaw 11 months ago +3

    I still think the most universal approach, model-wise, to graphics is writing directly to the screen pixels. Unfortunately it seems you just can't get any speed out of that. APIs have not favored the programmer; instead we're having to bend ourselves and bang our heads on the wall as we build cabins out of sawdust and glue. I don't think there's been a more hostile programming era for the new programmer. We have tools that attempt to do everything for us, yet we have horrid abstractions around graphics; we need entire engines just to manage everything, and nobody truly understands what's going on at the end of the day, because everybody's worried about learning the 10% of the glue they need to make everything work.
    A big part of it, too, is the expectations of the people who receive the games. If the graphics API were simply coloring screen pixels, your development kit consisted of a core language and some multimedia libraries, and the customer would settle for much simpler games with less content, we'd all be better off. You look at game programming books from the '90s and early 2000s and you see a core language, some supporting multimedia functionality, and that's it: you get a game in around 200 pages. Go back to the '80s and you basically get 4 pages of BASIC to make a game. So we've gone from a program listing of four pages, to a book of 200 pages, to five books covering various topics, or just blindly learning a game engine.
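The model being described, where the "API" is just the screen, fits in a dozen lines (a sketch with made-up dimensions, writing to a plain byte array rather than real video memory):

```python
W, H = 320, 200                      # classic 320x200-style mode
framebuffer = bytearray(W * H * 3)   # RGB24; stands in for video memory

def put_pixel(x, y, r, g, b):
    i = (y * W + x) * 3
    framebuffer[i:i + 3] = bytes((r, g, b))

# The whole "graphics API": poke colors into pixels.
for y in range(50, 100):
    for x in range(40, 120):
        put_pixel(x, y, 255, 0, 0)   # a red rectangle

print(framebuffer[(60 * W + 50) * 3])  # 255: red channel inside the rectangle
```

The catch, as noted above, is speed: without the GPU's parallelism and without a blessed fast path to the display, this model can't keep up with what customers now expect.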

    • @lostsauce0
      @lostsauce0 11 months ago +1

      Any good book recommendations?

    • @ArneChristianRosenfeldt
      @ArneChristianRosenfeldt 10 months ago

      OpenGL has fragment shaders. With unified shaders you can pull in the rasteriser, or just not use it. Nanite and Cycles use the GPU, but not its rasteriser. It would be cool if we could just use pure functions, forEach, and a hierarchical z-buffer. Cube maps as code.

    • @VivienBihl
      @VivienBihl 10 months ago +2

      You understand that in the '80s it was all software rendering, right?
      And that with modern GPU APIs, the responsibility is moved from the driver/API implementation to the application?
      Do you really think moving from DirectX 11 to 12 suddenly added tons of code? It just shifted who has to write the code needed to drive the hardware.

    • @0x1EGEN
      @0x1EGEN 21 days ago +1

      We kinda are going back to software rendering. Maybe in the future we will have a unified CPU/GPU architecture with massively parallel general-purpose compute units that just write data to the framebuffer. I think Sony originally planned something like this with their Cell processor on the PS3: apparently the PS3 wasn't going to have a GPU at all, and all of the graphics would be rendered using the SPE cores.

    • @PixelOutlaw
      @PixelOutlaw 21 days ago

      @@VivienBihl I'm saying you didn't really even need an API for software rendering. You poked the color directly into the pixels.

  • @sagielevy
    @sagielevy 10 months ago

    Metal is an amazing shading language; shame that Apple locked it to their proprietary hardware. On the other hand, part of the benefit is that Metal can be strongly coupled with their GPU hardware, which means they can make it less general and more efficient. That's pretty important when it comes to GPUs.

    • @youtubesuresuckscock
      @youtubesuresuckscock 10 months ago

      It has 100% efficiency because no one on earth uses it.

    • @nolram
      @nolram 10 months ago +2

      Metal is a graphics API, not a shading language.

    • @sagielevy
      @sagielevy 10 months ago

      @@nolram It's both; you write Metal shaders on iOS, not GLSL.

  • @Nick_fb
    @Nick_fb 11 months ago +2

    Ugh, this is the exact problem I'm in the process of solving, but the likelihood of piracy and/or no payment is kicking me in the gut so hard. /whinge

  • @perfectionbox
    @perfectionbox 11 months ago

    Good thing Unreal Engine has a material editor

  • @Maurdekye
    @Maurdekye 11 months ago

    We need to reimagine the GPU from the ground up using first principles. We can't just leave Nvidia, AMD, and Intel to keep using their bloated proprietary architectures.

    • @turolretar
      @turolretar 11 months ago +8

      What? But they make the chips, what are you going to do?

    • @lostsauce0
      @lostsauce0 11 months ago +1

      @@turolretar No, he said we need to reimagine, not actually *make* anything.

  • @wheatandtares-xk4lp
    @wheatandtares-xk4lp 11 months ago +8

    Did this dude just complain about shader languages? Of all the problems facing video game developers, is the inability to rewrite the shader language really something to focus on?
    There is literally no game that would become possible to make in reasonable time if you had a new shader language.
    I'm reminded of the "Steve Jobs Takes Criticism" video where the dude complains about Java and OpenDoc. There aren't any customers that care, dude.

    • @wheatandtares-xk4lp
      @wheatandtares-xk4lp 11 months ago +2

      @@Morimea You misunderstand, friend. I'm not saying there isn't any _technology_ you can't implement. I said there aren't any _games_ that would be unlocked by shader language innovation. C++ is far more "outdated" than GL, but there aren't any games that will become possible with a new programming language.

    • @lunabob-ie5qx
      @lunabob-ie5qx 10 months ago

      @@wheatandtares-xk4lp So there's no benefit to using C++? Everything can be written in Brainfuck.

  • @roc7880
    @roc7880 9 months ago

    Too big to fail, in video games?