@@vuri3798 Very stable for me so far in every case I have tested. To offer another perspective, I am playing Apex Legends at 4K 144fps locked (medium-low settings) and the 4090 (60% PT) pulls around 150W. To get the same performance with the 3070 Ti I had to overclock it, at which point it mostly stayed at 144 fps but consumed around 350W. Oh, and the 4090 is also at around 40% utilization in this scenario.
I did the same thing on my 3080 Ti: when undervolted and power limited it draws around 180 to 250W in game, most of the time for a very small to insignificant loss in performance. (Not to mention this nearly removes any coil whine in games that usually produce a lot of it in stock configuration.)
70A power stages cannot do 70A continuously (standard conditions). Those are rated at 20-30A continuous with 70A peak current capability (for at least a couple dozen switching cycles). 40-50A with waterblock cooling might or might not be possible.
@@Kr0noZ Buildzoid actually said practically the very same thing. He also mentioned the efficiency curve being at its highest point at approximately 20A for 60-70A power stages
@@volodumurkalunyak4651 Yes they can, they're literally specced that way by the manufacturer. Look at a SiC832 (which is a 70A SPS) datasheet for example: they list 70A as the "continuous current rating". Now should you run them at 70A? No, obviously not, because they'd run really inefficiently = create a lot of heat. You're completely missing the point though: if you'd done your math correctly you would've noticed that they never even come close to running at 70A with the 1000W BIOS.
@@-eMpTy- The manufacturer is so confident in the above-mentioned power stage working at 70A continuously that the datasheet efficiency curve doesn't even come close to it. I am even sure it would handle 70A continuous... under LN2 (approximately 13W of power dissipation per power stage is no joke). I didn't write "it will not run 70A at all", but "it will not run 70A at standard conditions".
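For readers following this thread, the disputed load is easy to estimate with back-of-envelope arithmetic. A minimal sketch, assuming a 24-phase setup, ~1.1 V core, and even current sharing (assumptions, not measurements from the video):

```python
# Quick sanity check for the power-stage debate above.
# Assumed numbers (not from the video): 24 phases, ~1.1 V core voltage.
def per_phase_current(total_watts, vcore=1.1, phases=24):
    """Amps each power stage carries if the load is shared evenly."""
    return total_watts / vcore / phases

print(round(per_phase_current(1000), 1))  # -> 37.9 A per stage at the full 1000 W
```

Even at the full 1000 W BIOS limit, each stage sits well under a 70 A rating, which is the point being argued above.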
My 980 Ti reaches about 320 watts in a stable overclock at 40 degrees Celsius. I still find it impressive that this card has been in my system for years now and is still running really well.
When I used to overclock my PC I bought a small chest freezer, filled it with water, and made a closed loop with it. It worked like a charm with my i7 4790K and GTX 1080 Ti.
Maybe you could try this with the 7900 XTX? Seems like it has much more potential performance to squeeze out with modding and that insane cooling system you have
@@slipknot1943 So? He did it for the 4090 too. LTT has a hundred times the views, and it's only fair that, having done it for team green, he does it for team red and, lol, team blue.
OMG Linus, your exact blank stare into the object, the way you had your hands behind your back leaning over the workbench, and your reaction to Alex when you said "Uh-huh." @7:46, that's exactly what my late techie uncle Paul would do when observing my cousin and me working on PCs. You made me have flashbacks dude! :D
Should I be worried about Alex’s well-being with how many jank set-ups keep happening under his watch? That’s a lot of bare wire coming off of the GPU.
Biggest takeaway for the regular user from this video is that Win + Ctrl + Shift + B resets the graphics driver. Never even knew this was a thing, much less that it was in stock Windows. Thanks!
The performance went down after you increased the memory clock because these chips have ECC memory. At some point it'll start putting out more and more errors which have to be corrected. You don't see artifacts like we've seen with older cards, but what you see instead is a performance decrease.
That's a really smart observation. Can we/you measure this somehow? Alternatively, if you understand virtual memory, page faults, and paging/thrashing, where our OSes can no longer advance the real task and spend almost all their time swapping pages between physical and virtual memory, a comparable thing can happen between a processor's cache and its main memory source of data. We could then argue that the problem is an I/O bandwidth problem (which is why we have the cache in the first place). I would not know how to measure all this, though.
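One hedged way to probe the error-correction theory above is a sweep: raise the memory clock in steps, benchmark each step, and look for the score peaking and then falling. The toy model below only illustrates the expected shape; the knee point and overhead rate are invented numbers, not anything measured on a real card:

```python
# Toy model (purely illustrative, made-up numbers): as memory clock rises,
# raw bandwidth grows linearly, but once errors appear, link-level
# retries/correction eat an increasing share of that bandwidth.
def effective_bandwidth(clock_mhz, error_knee=1400, severity=0.002):
    raw = clock_mhz                                        # raw bandwidth ~ clock
    overhead = max(0.0, (clock_mhz - error_knee) * severity)
    return raw * max(0.0, 1.0 - overhead)

for clk in range(1200, 1701, 100):
    print(clk, effective_bandwidth(clk))   # rises to the knee, then falls
```

The measurable signature is exactly what the comment describes: no artifacts, just a benchmark score that quietly peaks and then declines as the clock keeps climbing.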
Did they finally get that chilled water cooler working? My recommend solution is to use a water tower feed with an overflow bypass to stabilize the output. Simply put, you use a tall pressurized pipe as a reservoir. You have flow input in the middle, to fill it with cold water, the cooling loops take input through a manifold at the bottom, and the top has an adjustable overflow pressure valve that feeds to an unpressurized return reservoir that the loop return also feeds to. Still a lot of wasted pressure, but at least now what is going through the loop is stable and regulated.
I just want to say I have the Corsair ULP Air 100 that they are advertising in the video. It's got the speed of ULP and the tactile feedback of an old school typewriter. I absolutely love the keyboard!!! I returned an ASUS Falchion and got this keyboard and will never ever go back. It is worth every penny of the price with its 4 connection buttons, customizable RGB, and the volume roller is the most satisfying thing of all. It's sooooo smooth! Typing is effortless and dare I say a pleasure as this keyboard will give you an extra 20 wpm if you already are an expert typist. Great for gaming as the keys guide your fingers towards the center with the concave design. Definitely get it if you want premium feel.
7:10 "What are you going to measure with? Where's the oscilloscope?" I've walked that road when consulting experts on an expensive hobby I'm trying with my jank tools and abilities. Alex is like my spirit animal.
A voltmeter is actually the better tool here. They're much more accurate (say 0.02%), while an average oscilloscope is like 5%. A scope is great for watching trends and seeing data; it "can" measure raw voltages but it's not adapted for it.
This is an interesting difference compared to Ampere, where I flashed a 1000W EVGA Kingpin BIOS to a reference 3090 and it will just suck up as much power as you give it; I've tried up to 850W. The returns are very diminishing, but it does gain performance every step of the way from 350W stock to 850W. And that's without any voltage mods, on regular water cooling. Presumably a card with a better power design than reference could use even more power, but the one time I tried letting it eat the full 1000W it tripped some power protection in my power supply, as the card has only two 8-pin connectors. Even though it's a 1600W Platinum Super Flower.
Where can I find these custom BIOSes? I knew they existed, but my 3080 Ti FTW3 Ultra draws over 400W regularly and sits around 2000 on the clock with no tweaking, just a (very aggressive) custom fan curve. I have two FTW3 hybrid kits (long story on why two) that I'm finally thinking of installing, as it sits around 78-79C under full load and the summer heat is coming. EVGA isn't clear on the instructions regarding the BIOS change to register it as the hybrid model though. I'm not going to 1000, but I'd love to tinker with a higher-draw custom BIOS. TLDR: do you have the BIOS URL? :) thank you!
@@nainmain Nope, all you have to do is Google "reset graphics drivers windows shortcut" and you'll see plenty of articles. Not only do you not know what you're talking about, you're acting like you do.
Yo, guys, 17:30 - the problem is the GPU. Check the CUDA core load alongside overall GPU load/usage and you will see that you are limited by the CPU: the GPU isn't utilizing all of its CUDA cores, and raising frequencies only helped until you hit the CPU wall with the same CUDA core count, just pulsing higher with the clocks. An i9 9900K @ 5.3 GHz can keep 2x Titan RTX in SLI both above 95% utilization, while 2x RTX 2080 Ti in SLI sat at 97%. Ada has so many CUDA cores that there is no CPU capable of pushing an RTX 4090 to 100% utilization, and that's why Nvidia decided against an Ada Titan or Quadro in that form: the CPU can't handle more CUDA cores when using DX APIs.
Nvidia definitely tweaked something with the 4090 in the last update or two, because my power usage is waaayyy down. I barely even hit 340W unless I'm playing some unoptimized garbage (virtually every other game). Wicked episode. Always enjoy Alex and Linus shenanigans; together they always bring us the crazy frankensteined tech we want! They need to start a miniseries on LTT: "It'll be fine", or maybe "What's the worst that could happen?"
The Strix 4090 VRM is definitely capable of pushing 1000W+. Its GPU VRM setup is equipped with 24 70A phases, so theoretically, at about 80% efficiency, you could push 1300W easily (cooling needs to be sufficient though).
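Taking the comment's own figures at face value (24 phases, 70 A each, ~1.1 V core, 80% efficiency; all assumptions, not measurements), the arithmetic looks like this:

```python
# Back-of-envelope for the VRM claim above. Assumed: 24 phases rated 70 A each,
# ~1.1 V core. Real sustained limits are thermal, so treat this as an upper bound.
phases, amps_per_phase, vcore = 24, 70.0, 1.1
max_output_w = phases * amps_per_phase * vcore   # rating x phases x volts
print(round(max_output_w))                       # -> 1848 W theoretical ceiling

# Waste heat the VRM itself must shed while converting 1300 W at 80% efficiency
vrm_heat = 1300 * (1 / 0.8 - 1)
print(round(vrm_heat))                           # -> 325 W the VRM must dissipate
```

The 325 W of converter waste heat is why the "cooling needs to be sufficient" caveat is doing a lot of work in that claim.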
Did I miss the part where they hit 1000W on the GPU? Also, I'd imagine the lack of improvement at higher power is intentional. It's just the reserve they'll use when they release the 4090ti. I'd imagine the 4090 is already a 4090ti and just has some built in limiter.
They decided not to chase it, but the BIOS they flashed did support up to 1000W. They weren't getting any performance improvement past 600W and they didn't want to blow up the card.
And as they mentioned at the start of the video, this card was originally planned to have a much higher power limit, but Nvidia set it lower because it didn't go any faster at that power level
This 4090 is a damn beast of a card! I hope that sooner rather than later I (we) can have something similar in our towers. I'm still stuck with a 1080 Ti.
@@jeremybeaverson7167 I just want to jump to 4K... I have an LG E9 just sitting there not being used for gaming at its fullest, but goddamn, I'm not going to pay 2500€ for this card (I paid 630€ for my EVGA 1080 Ti FTW3 back then and thought I was insane). That's why I hope we can soon have this kind of power for a much cheaper price. They created something beautiful with this card for sure.
@@APunishedManNamed2 It seems to be the case. Well, quit gaming and use the spare time for something else entirely. New games aren't even worth it anymore anyway
3:27 Fun fact: that connector was designed for peaks of up to 1250W, so maybe not sustained, but it can theoretically manage 1kW. I would assume it can be done as long as you keep the temperature of the connector/cable under control
@@billy101456 I know you're joking, but liquid cooled cables are fairly common for high current applications. Eg DC fast charging for electric cars. For GPUs, it'd make more sense to increase the voltage to 48V or 60V like what you see on data center hardware.
@@NavinF sadly they had their chance. And they picked more thin wires and demand more power. As well as letting the gpu know exactly how much it CAN have. Instead of maybe up the voltage or fewer thicker cables with larger pins so the heat won’t build up with a bad connection.
@@billy101456 Heat isn't the problem, you can easily push 1kW through a 12VHPWR connector as the OP noted. The problem is voltage drop and wasted copper in today's connectors. This can (and likely will) be fixed in a backwards-compatible manner when GPUs support both 12V and 60V, with a higher power limit when paired with a 60V PSU.
@@NavinF Computer connectors don’t use copper, it’s stamped steel in a plastic housing. The high power connectors that were melting last year were from heat due to a tweaked connector making a less than perfect connection, causing them to overheat. Or they were only plugged in 90 some percent and weren’t making full contact, leading to overheating. Moreover, just throwing more power at a consumer product isn’t a good solution. Seeing the chip itself only runs at 1.3 to 1.7 volts, adding more to the supply voltage will only get you so far.
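For rough context on this thread: a 12VHPWR connector carries its load on six 12 V pin pairs, each commonly rated around 9.5 A. A sketch of the per-pin math (the pin rating is an assumption from public connector specs, not from the video):

```python
# Ballpark per-pin current for the 12VHPWR discussion above.
# Assumed: 6 power pin pairs at 12 V, ~9.5 A per-pin rating.
def amps_per_pin(watts, volts=12.0, power_pins=6):
    return watts / volts / power_pins

print(round(amps_per_pin(600), 2))   # -> 8.33 A per pin at the 600 W spec load
print(round(amps_per_pin(1000), 2))  # -> 13.89 A, above a ~9.5 A per-pin rating
```

Which is consistent with the OP's framing: 1 kW is plausible as a short peak, but sustaining it would push each pin past its usual rating, so contact quality and temperature are everything.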
I love these types of videos because it's like two best friends doing some incredibly janky shizzle; both know it's probably super stupid but still just go for it
I used to push 1.425V out of 700-series Titans and GTX 980 Classifieds; you'd be surprised how much voltage they'll take 😂. That was literally my gaming overclock voltage, on liquid cooling. One of the Titans did eventually die after about 2 years of that daily, I think; the 980s weren't fazed by it at all, but those were Classified cards built for abuse. I understand the architecture is not the same, and these gigantic dies can't take massive voltage without extreme cooling like the older cards could.
What's the theory behind increasing the chip voltage? I can't think of a way that increasing the voltage will increase the clock speed. The chip will draw more current at increased voltage and dissipate more heat, but it's not doing more calculations. Sounds like people who equate petrol octane to more power, or have tuned exhaust pipes thinking the louder the exhaust, the more power the engine is producing.
@@axelBr1 Transistor gates require a voltage difference to work properly and the faster the chip runs, the more noise there is that prevents the chip from recognizing where is one and where is zero, so the voltage has to be increased to keep everything stable
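The reply above can be put in first-order terms with the classic alpha-power delay model, where gate delay scales roughly as V/(V - Vth)^alpha, so raising V shrinks delay and raises the stable clock ceiling. The constants below are illustrative, not Ada-specific:

```python
# First-order alpha-power-law sketch of why voltage buys clock speed.
# vth (threshold voltage) and alpha are illustrative constants, not real
# values for any particular GPU.
def relative_fmax(v, vth=0.35, alpha=1.5):
    delay = v / (v - vth) ** alpha   # gate delay shrinks as V rises above Vth
    return 1.0 / delay               # max stable frequency ~ 1 / delay

ratio = relative_fmax(1.10) / relative_fmax(1.00)
print(round(ratio, 3))               # > 1: more voltage, higher frequency ceiling
```

So the extra voltage isn't doing more calculations by itself; it lets each transistor switch faster and overcome noise, which is what allows a higher clock to stay stable.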
Remember that episode where you built the cooling system that used two of those blower motors for filling bouncy castles? I think that would have helped in this episode.
Hope that K100 Air doesn't have the same issues hundreds of us have with the K100's optical switches... you know, the one where the keyboard records what you type and randomly decides to type it back at a random moment, sometimes a week later, crashing in the process and forcing you to disconnect and reconnect :) Hope you're fixing this crap soon, Corsair.
I love how Alex and Kyle are like the same coin, but two different sides. Alex on one side is completely unhinged, while Kyle runs multiple sanity checks over and over
16:13 Win + Ctrl + Shift + B has saved me mid Rainbow Six game multiple times where for some reason my GPU would "Disconnect" (is what Windows Event Viewer calls it) and it would switch to my CPU graphics.
I love how I barely understand what's going on, but Linus's enthusiasm gets me excited for whatever is going on. One day I'll have a computer worth talking about 😅
My favorite thing about LTT is the complete and total disregard for caution and paying attention. Like, how in the world is Linus able to drop so many things and still have sponsors?! It's insane! But then again, the dude is THAT guy. Right place, right time, right everything. Super lucked out!
If you guys wipe some soldering flux on the PCB where you're trying to solder those wires on, it'll make it a LOT easier and pretty much suck the solder into the joint for you; no more of those dry joints you're getting.
I got two MSI Gaming X Trios: one wasn't going above 2850, the other is accepting +230/+1500 and is stable in Portal RTX (which in my tests is the most aggressive on the card) in the 2985-3035MHz range. There is some silicon lottery for sure
“It will probably be fine” is the unifying theme of every Alex and Linus video.
what's up with this crazy upload time
Famous last words...
@@Charles_Bro-son 47 seconds ago!!
All the best ones
Alex and Linus vids are my 2nd most fav type of content, right after the "corporate update vlogs"
GPU Twerk is the only graphics card tuning software I want on my system.
Oh my innocence, I thought he said GPU Torque lol.
Oh yeah, now I can't unsee it 😀
@@TH3C001 Some twerk with a lot of torque, so it's better to stay back a safe distance!😅
At this moment, dozens of people are planning that very application.
This software needs to become real!
... and why is there not a proper butt emoji? We have a poop emoji, but no proper booty? Closest I could find is a peach. 🍑
@@filipsikora1973 Why does Linus do this to us?
Maybe this is the kind of performance that EA was expecting for the PC port of Jedi Survivor
Still wouldn’t get 60FPS with no stutter lmao
But.. higher-end hardware results in lower performance, for Jedi Survivor..... apparently.
yeah lmao
nah .. EA rather expects 20 fps and paid dlc for 1 fps extra
That's just the corporate machine demanding a product gets shoved out the door. The "we'll patch it later" mantra. EA is plenty big enough to have ZERO excuse for poor performance on PC. Hell, I'm willing to bet their dev rigs are not all that different from a high-end gaming rig. You don't need A6000s for game dev; a 3090 or 4090 can even be overkill.
14:11 "Let's just give this a tiny bit more" *increases voltage by almost 40%*
offset not voltage
@@A-BYTE64 Phhahah high school graduate got baited. LOL
16:14 Win+Ctrl+Shift+B Thank you!
Just wonder how you find more of these shortcuts.
Should be somewhere in documentation
I've never had luck with this. It doesn't seem to do much on most laptops. And on my desktop it will reset, but then set my monitor refresh rate down to 60Hz and not allow me to go over 120Hz until I restart
I tried pressing it before I saw that part... wondering what it would do (nothing happened, I am running Linux)
key mashing works :P (jk don't try) ..
Win+Ctrl+Shift+W for Word, X for Excel; Google it. But you might want a keyboard with macro keys so you can program stuff yourself.
Alex has the best job at LTT. He makes crazy projects, enjoys them, and plays with interesting tech
They built him a fuckin shop. He's a car dude too. Man's livin the dream.
He also makes the best videos
He cannot survive without a few modifications to perfectly working products.
You know it's gonna be a good video when he says "it's probably fine."
And reviews cars!
This entire experiment was like firing gunshots at a puddle of gasoline.
It'd be the same as shooting water, actually. Gas isn't as flammable as movies make it seem, and a spark isn't likely to ignite it. Now, an incendiary round or a tracer might be fun?
At least there are fewer flames involved. That's saying something with the 4090.
@@jeremybeaverson7167 a spark will definitely light off gasoline if it comes in contact with the fumes. I've lit gasoline just by hitting a striker a couple inches in the air plenty of times
Except nothing would happen in that situation unless you are also in the puddle. Real life isn't like video game physics bud.
@@jeremybeaverson7167 You're talking about diesel. Gasoline will definitely burn; I use that shit to light charcoal all the time, it's incredibly flammable. Diesel isn't
I love the videos where its just Alex and Linus.
Absolute chaos.
if you listen very carefully, you can hear Kyle crying in the background...
It will be fine.
a kid and his unhinged dad
1000th like
imagine if you could use liquid nitrogen cooling for 2000 fps
11:20 omg I love this directing, the camera movement is comedy perfection and feels so good!
I just love the composition
18:18 I have never seen a sponsor segway interrupted. Absolutely floored.
I love all the Linus and Alex videos where the whole idea is "we shouldn't do this. It's insane and kinda dumb. But let's do it anyway and take it to 1000%."
or, or, we could for once do projects the proper way and get usable results?
@@RandomUser2401 that's not really what this channel is for, LTT is much more for entertainment value than good and proper data
@@RandomUser2401 There's a million sources on the internet for raw data, there's nothing else out there like LTT.
@@RobbieEl JFC, it's not about raw data but about doing a proper experiment with meaningful results. And btw, how many raw data sources are there for 1000W GPUs?
@@alfiegordon9013 Yeah, fun and giggles, and so much wasted resources and potential. So if it's only about entertainment, why exactly are they sinking millions into LTT Labs?
Fun fact: it is said that if you try this mod at home, the warranty card for your GPU spontaneously combusts.
Your PC just transforms into a little robot that electrocutes you and then runs back to Nvidia
I mean, if I were to do it I'd use a glue gun instead of soldering wires to the GPU. You could also use a glue gun to shunt mod any GPU. Not that I condone warranty fraud, of course
Please keep more projects with Alex coming. Love these videos.
I'm revisiting this video 3 months later, and I have to say it is one of my all-time favorites. Do more crazy stuff like this.
Everything from the production quality to the ingenuity behind creating solutions as you go is why I'm subbed to LTT. I couldn't care much for a "new" laptop with X and Y now available, but waking up the techie at heart in me makes me think my 1080 Ti might still have more to offer with ideas like these
I love how they are building a high tech lab, yet they do sketch experiments still.
That's what makes him..linus
@@navdeepsingh3508 Well, that and the dropping of things.
Sketch experiments is exactly the kind of stuff you want to do with a high tech lab.
I mean the only difference between science and dicking about is writing it down
Beyond human-made comforts it's sketchiness all the way down; labs are maximum exposure to sketchiness.
It's good to know the stock cooler on the Strix is really really good and water cooling is not necessary to get really good performance out of this. Plug it in and forget. Just don't forget to plug it in properly.
I agree, the cooling on my Strix 4090 has been way better than what I first had on my 3090. I burned my hand on the 3090's backside, 105C BS lol. Had to add it to my custom loop.
"Omg that's your notes for this video?" Is such a strong start for an LTT video 🤣
It sure has to involve Alex in some way
You know what else is stronger than a start to an LTT video? The Segway to our sponsor 😂
Super mega like! You listened to my comment (and probably others) and went JBC with the soldering station! Regarding the discrepancy in voltage readings between what the card reports and the multimeter: measurement at specific test points is location dependent (the best location might be the small center SMD ceramic caps behind the GPU, closest to the load), and the difference comes from the GPU's high current draw and the voltage drop in distribution to the load (the GPU core). At 600W with 1.1V Vcore, the current draw would be a whopping 545A+ (0.75V -> 800A if that were Vcore)... Congrats! Always wanted to do that, amazing it held up so well! Keep 'em rollin' and much success with the LAB!
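The comment's numbers check out with simple Ohm's-law arithmetic; the milliohm figure for the power-delivery path below is an assumed illustrative value, not a measured one:

```python
# Core current and voltage droop, per the comment above.
def core_current(power_w, vcore):
    return power_w / vcore            # I = P / V

i = core_current(600, 1.1)
print(round(i))                       # -> 545 A, matching the comment

# Assumed ~0.1 milliohm of distribution resistance between the VRM and the die:
print(round(i * 0.0001 * 1000, 1))    # -> ~54.5 mV of droop (V = I * R)
```

Tens of millivolts of droop at these currents is exactly why where you probe (VRM output vs. the caps behind the die) changes the reading you get.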
7:37 Maybe I am weird but Linus' smile was so blissful at this point.
It made me smile.
More of Kyle please, watching him suffer trying to sanity check Alex is hysterical.
Agreed. More Zaffer representation! :D
@@mottamort I thought he had a Cape Townish accent...
He definitely sounds like Kobus from the pub, yep
Linus: KW/h cost is getting too expensive
Also Linus:
If LMG was based in Europe he'd need to sell the company to pay the energy bill from this video
It's kWh, not KW/h (kilowatt hour, not kilowatt per hour).
@Morgan Mitchell this matters less than what kim kardashian ate for lunch 20 years ago
@@jcspotter7322 A kWh is 3.6 million joules of energy; it's a fixed unit. Something doesn't draw power measured in kWh: you measure power in watts, and energy as power over a timespan. A "kW per hour" would be a rate of change of power, not an amount of energy, so saying this difference doesn't matter is kinda stupid
@@jcspotter7322 woww, such ignorance..
How can a simple correction bother you so much
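For anyone confused by this thread, the unit relationship is just energy = power x time; a sketch (the $0.30/kWh electricity price is an assumed example, not a quoted rate):

```python
# kWh is energy; kW (or W) is power. Energy = power x time.
def kwh(power_watts, hours):
    return power_watts * hours / 1000.0   # energy in kilowatt-hours

energy = kwh(1000, 1)                     # a 1 kW load (like this GPU) for one hour
print(energy, energy * 3.6e6)             # -> 1.0 3600000.0  (1 kWh = 3.6 MJ)
print(round(energy * 0.30, 2))            # assumed $0.30/kWh -> cost in dollars
```

So Linus's joke bill is measured in kWh of energy consumed, not "kW/h".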
Editor's must deserve props on this timely video.
but the black bars on a non 3840x1920 monitor....
@@lancerdeltarune Amazing res to watch from a phone though :)
Your English teacher from 2nd grade doesn't deserve props, though. Apostrophes don't make words plural.
@@PushyPawn Terribly boring and horribly unfunny.
Another one of these comments............
I love the Alex and Linus videos like this because they take all those weird late night/shower ideas and actually do them.
Alex is proof that the more highly trained you are, the more chaotic your work is.
The 4th thing you can be limited by is the critical path in the silicon itself. There has to be enough time between clock pulses (the interval you're decreasing by increasing clock speed) for a signal to propagate from one register to the next.
But that’s not what’s locked tho right?
@@enacku If I'm understanding the top comment right, then that would be "locked" in a sense. There is a minimum amount of time it takes electricity to travel along the traces in the chip. You can't adjust the length of that
In this case however, it’s a natural limit and not one artificially imposed by nVidia.
@@serina762 Yeah, that would come down to manufacturing nodes and actual chip design to make those routes shorter and more efficient;
You're gonna run into other troubles with that at some point though, if you go too close with transistor gates you might get electrons going where you don't want them to.
Once you get closer than 3 nm on your gate size you start getting tunneling effects, so there's a fundamental limit to how small you can make one.
@@serina762 in short: yes
in long: yes until they go fully insane
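The critical-path limit discussed in this thread reduces to one division: the clock period must be longer than the worst register-to-register delay. The picosecond figures below are made up for illustration, not real Ada numbers:

```python
# Critical-path timing, as arithmetic. Delay values are illustrative only.
def max_clock_ghz(critical_path_ps):
    # The clock period must exceed the critical-path delay; f = 1 / period.
    return 1000.0 / critical_path_ps      # 1000 ps = 1 ns, so the result is in GHz

print(max_clock_ghz(333))   # a 333 ps critical path caps you near 3 GHz
print(max_clock_ghz(250))   # a shorter path (better node/layout) raises the cap
```

This is why, past some point, no amount of voltage or cooling helps: you're against the physical propagation delay, which only a shorter critical path (new design or node) can move.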
Every time I see Alex in the thumbnail I know I'm in for some nervous giggling and questionable activities. I love it. I hope that the future of these projects include modifying parts even further. Basically that chiller didn't just unlock the full potential of the CPU and GPU, I feel it's also unlocked the full power of Alex.
Alex is not in the thumbnail
@@Vismajor01 they may have changed the thumbnail
1st time I've seen Linus not having a smooth Segway to his sponsor 19:18
12:45 "Man I remember when they were cool" See that's the thing Linus, they want to stay cool
Alex vids where he's just casually willing to risk destroying expensive things are my favorite videos.
Silly silly, you don't think they actually paid for it?
It’s a no lose situation, either it works and it’s great content-or it blows up and it’s great content. Linus always gets his money’s worth, the only thing that changes is the title of the video and if there needs to be a follow-up
I love how chaotic neutral Alex is.
Hah! Xkcd
well said
When Alex says "It might be fine, it might not be fine" at 14:00, it dawned on me that they are doing this on a GPU that costs almost four times as much as my computer
If you add up the stuff they have in storage it will probably be more than your house (or the house you would like to have) is worth 🤷🏼♀️
You know it's a good video when Alex is involved :)
Wouldn't it have been "easier" to use the Galax HoF 4090 for this? That card comes with an unlocked vBIOS, two 12VHPWR connectors, and the VRMs to back that up, since it's meant for hardcore OC and record breaking.
Yeah considering that thing got 3.6GHz it would’ve been the only way to truly make this video. Then again is that really a 4090 if it’s been modified so much? I think this video was a demonstration of rigging a typical consumer 4090.
A PNY Vertu 4090 would be a typical consumer card, but an Asus 4090 OC is not that!
Even 450W is way more than the card actually needs. I occasionally set mine to about 60% PT at which point it only loses about 5-8% performance but it draws about 250W. This thing is crazy efficient. As a reference, my previous 3070Ti would deliver way less than half of the performance while drawing about 330-350W
You can do the same thing to a 3070 Ti, though, undervolt it to consume 200W while performing just slightly worse.
@@deheavon6670 Right, but it would obviously not perform like a 4090 at -10%... While it makes me wonder about stability, it's what you get for it that's impressive in comparison.
While the 40 series is very underwhelming on so many levels, so far consumption isn't really something we can complain about; it pretty much makes RDNA3's efficiency marketing look stupid.
@@vuri3798 Very stable for me so far in any case I have tested. To offer another perspective, I am playing Apex Legends at 4K 144fps locked (medium-low settings) and the 4090 (60% PT) pulls around 150W. To get the same performance with the 3070 Ti I had to overclock it, at which point it mostly stayed at 144 fps but consumed around 350W. Oh, and the 4090 is also at around 40% utilization in this scenario.
I did the same thing on my 3080Ti, when undervolted and power limited it draws around 180 to 250W in game and most of the time for a very small to insignificant loss in performance. (not to mention this nearly removes any coil whine in games that usually produce a lot of it in stock configuration)
4090 costs 6 months of wages in my country, who cares if it draws 600 watts.
Alex is the embodiment of "It's not stupid, if it works".
You should look up maxim 43: "If it's stupid and it works, it's still stupid and you're lucky" :D
And I love it
@@MrGrimsmith Alex would make a great Schlock Mercenary character
@@pirateFinn He's already the living example of Maxim 14: "Mad Science" means never stopping to ask "what's the worst thing that could happen?" :D
@@MrGrimsmith and 25
3:33
Yes, yes they are. They're using 24* 70A power stages.
70A power stages cannot do 70A continuously (standard conditions). Those are rated at 20-30A continuous with 70A peak current capability (for at least a couple dozen switching cycles).
40-50A with waterblock cooling might or might not be possible.
@@volodumurkalunyak4651 sounds like a job for Buildzoid xD
@@Kr0noZ Buildzoid actually said practically the very same thing.
He also mentioned the efficiency curve being at its highest point at approximately 20A for 60-70A power stages
@@volodumurkalunyak4651
Yes they can, they're literally specced that way by the manufacturer. Look at a SiC832 (which is a 70A SPS) datasheet for example: they list 70A as the "continuous current rating". Now should you run them at 70A? No, obviously not, because they'd run really inefficiently = create a lot of heat. You're completely missing the point tho; if you'd done your math correctly you would've noticed that they'd never even come close to running at 70A with the 1000W BIOS.
@@-eMpTy- The manufacturer is so confident in the above-mentioned power stage working at 70A continuously that the datasheet efficiency curve doesn't even come close to it. I am even sure it will handle 70A continuous... under LN2 (approximately 13W of power dissipation per power stage is no joke).
I didn't write "it will not run 70A at all", but "it will not run 70A at standard conditions".
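For what it's worth, the per-phase current being argued about here is easy to sanity-check with arithmetic. A quick sketch, assuming a core voltage around 1.1 V and roughly 90% VRM efficiency (both assumptions for illustration, not measured values for this card):

```python
# Rough per-phase current for a 24-phase GPU VRM at 1000 W board power.
# Core voltage (~1.1 V) and conversion efficiency (~90%) are assumptions
# for illustration, not measured figures.

def amps_per_phase(board_watts: float, vcore: float = 1.1,
                   efficiency: float = 0.90, phases: int = 24) -> float:
    core_watts = board_watts * efficiency  # power actually delivered at Vcore
    total_amps = core_watts / vcore        # I = P / V
    return total_amps / phases             # shared evenly across phases

# ~34 A per stage at 1000 W, below the 70 A peak rating either way
print(round(amps_per_phase(1000), 1))
```

So whichever continuous rating you believe, even the 1000W BIOS leaves each stage well under 70 A; the argument is really about headroom and heat.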
My 980 Ti reaches about 320 watts in a stable overclock at 40 degrees Celsius. I find it still impressive that this card has been in my system for years now and is still running really well.
When I used to overclock my PC I bought a small chest freezer, filled it with water, and made a closed loop with it. It worked like a charm with my i7 4790K and GTX 1080 Ti
You definitely gave NVIDIA ideas for the RTX 5090.
Maybe you could try this with the 7900 XTX? Seems like it has much more potential performance to squeeze out with modding and that insane cooling system you have
I think derbauer already did that
@@slipknot1943 so? he did it for 4090 too, LTT has a hundred times the views and only fair if he does it for team green, to do it for team red and lol team blue.
@@jondonnelly3 I hope you get better soon
@@slipknot1943 okay but I want to see Linus and Alex do it
This. AMD GPUs tend to be way more open to messing with and modding one to do this sounds easier.
using Linus's afterthought to fill the awkward pause was genius lmao
OMG Linus your exact blank face stare into the object, and the way you had your hands behind your back leaning over the workbench, and reaction to Alex when you said "Uh-huh." @7:46, that's exactly what my late techie uncle Paul would do, when observing my cousin and I working on PC's. You made me have flashbacks dude! :D
Now we need to see Linus actually playing f1 with this and a sim racing rig
I'd like to see videos on these types of mods on Radeon and Arc GPUs. Let's see how far a 7900 XTX can be pushed and how far an A770 can be pushed.
Already done by other, more competent, youtubers
@@karserasl Whose videos would you suggest then?
@@karseraslby whom? Are there any?
@@karseraslthey are literally one of the most popular and watched tech youtubers on the site so they can't be a more 'competent youtuber' lol
Should I be worried about Alex’s well-being with how many jank set-ups keep happening under his watch? That’s a lot of bare wire coming off of the GPU.
Just don't touch them 5head
Nah, Alex is a professional. Man can jank it up all he wants. He knows what hes doing. It's Linus we should be worried about.
@@Dwykid1 considering how much stuff he drops. I don't want to see him drop dead from the setups Alex keeps implementing.
It's 12V...
Now this is what I've been missing in this generation - some excitement and pushing the envelope!
13:15 He said "GPU Twerk" again XD
You guys should try this with the 4070 since it seems to be strictly held back by power.
Biggest takeaway for the regular user from this video is that Win + Ctrl + Shift + B resets the graphics driver. Never even knew this was a thing, much less that it was in stock Windows. Thanks!
The performance went down after you increased the memory clock because these chips have ECC memory. At some point it'll start putting out more and more errors which have to be corrected. You don't see artifacts like we've seen with older cards, but what you see instead is a performance decrease.
That is coool!
That's a really smart observation. Can we/you measure this somehow? Alternatively, if you understand virtual memory, page faults, and paging/thrashing (where the OS can no longer advance the real task and spends almost all its time swapping pages between physical and virtual memory), a comparable thing can happen between a processor's cache and its main memory. We could then argue that the problem is an I/O bandwidth problem (which is why we have the cache in the first place). I wouldn't know how to measure all this, though.
@@jpdj2715 cpu bottleneck....
I have a 2kw electric heater in my room, I can't wait for Alex to match it with the 5090! (Alex + Linus Chaos-Danger-Jank videos are the best)
balz
Did they finally get that chilled water cooler working?
My recommended solution is to use a water-tower feed with an overflow bypass to stabilize the output.
Simply put, you use a tall pressurized pipe as a reservoir. You have flow input in the middle, to fill it with cold water, the cooling loops take input through a manifold at the bottom, and the top has an adjustable overflow pressure valve that feeds to an unpressurized return reservoir that the loop return also feeds to.
Still a lot of wasted pressure, but at least now what is going through the loop is stable and regulated.
I just want to say I have the Corsair ULP Air 100 that they are advertising in the video. It's got the speed of ULP and the tactile feedback of an old school typewriter. I absolutely love the keyboard!!! I returned an ASUS Falchion and got this keyboard and will never ever go back. It is worth every penny of the price with its 4 connection buttons, customizable RGB, and the volume roller is the most satisfying thing of all. It's sooooo smooth! Typing is effortless and dare I say a pleasure as this keyboard will give you an extra 20 wpm if you already are an expert typist. Great for gaming as the keys guide your fingers towards the center with the concave design. Definitely get it if you want premium feel.
You forgot the referral link lol.. nice ad
18:01 "Riding that razor's edge". Seems appropriate for F1 racecars driving in the rain...
I DEFINITELY thought that said “GPU TWERK” when I saw the window
Now I’m even more interested in why it was stuck at that performance cap.
0:20 I love how they picked Magnussen and Verstappen for the low and high power consumption.
Win + CTRL + SHIFT + B
That's a super useful command I did not know about before, thanks Alex!
7:10
"What are you going to measure with? Wheres the oscilloscope?"
Ive walked that road when consulting experts of an expensive hobbie im trying with my jank tools and abilities.
Alex is like my spirit animal.
A voltmeter is actually the better tool here. They're much more accurate (say 0.02%), while an average oscilloscope is more like 5%. A scope is great for watching trends and seeing data; it "can" measure raw voltages but it's not adapted for it.
This is an interesting difference compared to Ampere. I flashed a 1000W EVGA Kingpin BIOS onto a reference 3090 and it will just suck up as much power as you give it; up to 850W in my testing. The returns are very diminishing, but it does gain performance every step of the way from 350W stock to 850W, and that's without any voltage mods, on regular water cooling. Presumably a card with a better power design than reference could use even more, but the one time I tried letting it eat the full 1000W it tripped some power protection in my power supply, as the card has only two 8-pin connectors. Even though it's a 1600W Platinum Super Flower.
Where can I find these custom BIOSes? I knew they existed, but my 3080 Ti FTW3 Ultra draws over 400W regularly and sits around 2000 on the clock with no tweaking, just a (very aggressive) custom fan curve. I have two FTW3 hybrid kits (long story on why two) that I'm finally thinking of installing, as the card sits around 78-79C under full load and the summer heat is coming. EVGA isn't clear on the instructions for the BIOS change to register it as the hybrid model though. I'm not going to 1000, but I'd love to tinker with a higher-draw custom BIOS. TLDR: do you have the BIOS URL? :) thank you!
I am a huge fan of keyboard shortcuts, and I think Win + Ctrl + Shift + B is my new favorite one.
that shortcut does not reset the driver
@@nainmain yes it does.
@@raresmacovei8382 yeah also if I remember correctly all it did is restart the desktop environment
@@HyBlock so your source is: trust me bro.
@@nainmain nope all you have to do is Google "reset graphics drivers windows shortcut" and you'll see plenty articles. Not only do you not know what you're talking about, you're acting like you have any idea too.
Yo, guys, 17:30 - the problem is GPU utilization. Check the CUDA core load against the reported GPU load/usage and you will see that you are limited by the CPU, because the GPU is not utilizing all of its CUDA cores: improving frequencies kept helping until you reached the CPU wall with the same CUDA core count. An i9 9900K @ 5.3 GHz could keep 2x Titan RTX in SLI both above 95%, while 2x RTX 2080 Ti SLI sat at 97%. There are so many CUDA cores on Ada that I think this is why Nvidia skipped a Titan: there is no CPU capable of pushing an RTX 4090 to 100% utilization, and the CPU can't handle more CUDA cores when driving the DX APIs.
16:15 That shortcut is actually so useful. Gotta remember that.
Nvidia definitely tweaked something with the 4090 in the last update or two because my power usage is waaayyy down. I barely even hit 340w unless I'm playing some unoptimized garbage(virtually every other game). Wicked episode. Always enjoy Alex and Linus shenanigans, together they always bring us the crazy frankensteined tech we want! They need to start a miniseries on LTT "It'll be fine" or maybe "What's the worst that could happen?"
18:19 Alex canceled Linus’ Ult💀
The Strix 4090 VRM is definitely capable of pushing 1000W+. Its GPU VRM is equipped with 24 70A phases, so theoretically, at about 80% efficiency, you could push 1300W easily (cooling needs to be sufficient though).
Did I miss the part where they hit 1000W on the GPU?
Also, I'd imagine the lack of improvement at higher power is intentional. It's just the reserve they'll use when they release the 4090ti. I'd imagine the 4090 is already a 4090ti and just has some built in limiter.
Yeah that was terribly clickbaity. 1000W GPU? more like 600W GPU
They decided not to chase it, but the BIOS they flashed did support up to 1000W. They weren't getting any performance improvement past 600W and they didn't want to blow up the card.
And as they mentioned at the start of the video, this card was originally planned to have a much higher power limit, but Nvidia set it lower because it didn't go any faster at that power level
It has 15k cores enabled out of 18k, if I'm not mistaken. The Quadro variant has all 18k cores enabled, and the 4090 Ti probably will too
Never happened
The tip for soldering motherboards is to pre-heat the area you want to solder with a heat gun, then solder the wire on with a bit of flux
This 4090 is a damn beast of a card! I hope that sooner rather than later I (we) can have something similar in our towers. I'm still stuck with a 1080 Ti.
The 1080 Ti is still a beast tho, it's very similar in performance to the RTX 3060
I mean, if you don't like your 1080ti, I'll trade you for my 980ti
@@jeremybeaverson7167 I just want to jump to 4K... I have an LG E9 just sitting there not being used for gaming at its fullest, but goddamn, I'm not going to pay 2500€ for this card (I paid 630€ for my EVGA 1080 Ti FTW3 back then and thought I was insane), and that's why I hope we can soon have this kind of power for a much cheaper price. They created something beautiful with this card for sure.
@@pauloa.7609 you're going to be waiting for awhile lol
@@APunishedManNamed2 It seems to be the case. Well, I'll quit gaming and use the spare time for something else entirely. New games aren't even worth it anymore anyway
I added 20 watts to my 3080 Ti mobile via a bios update. Huge performance boost when used with a good cooling pad.
3:27 Fun fact: that connector was designed for peaks of up to 1250W, so maybe not sustained, but it can manage 1kW. I would assume it can be done as long as you keep the temperature of the connector/cable under control
So you're saying it's time to liquid cool the cables?
@@billy101456 I know you're joking, but liquid cooled cables are fairly common for high current applications. Eg DC fast charging for electric cars. For GPUs, it'd make more sense to increase the voltage to 48V or 60V like what you see on data center hardware.
@@NavinF Sadly they had their chance, and they picked more, thinner wires and demanded more power, as well as letting the GPU know exactly how much it CAN have. Instead they could have maybe upped the voltage, or used fewer, thicker cables with larger pins so heat won't build up with a bad connection.
@@billy101456 Heat isn't the problem, you can easily push 1kW through a 12VHPWR connector as the OP noted. The problem is voltage drop and wasted copper in today's connectors. This can (and likely will) be fixed in a backwards-compatible manner when GPUs support both 12V and 60V, with a higher power limit when paired with a 60V PSU.
@@NavinF Computer connectors don't use copper, it's stamped steel in a plastic housing. The high-power connectors that were melting last year overheated because a tweaked connector was making a less-than-perfect connection, or because they were only plugged in 90-some percent and weren't making full contact. Moreover, just throwing more power at a consumer product isn't a good solution. Seeing as the chip itself only runs at 1.3 to 1.7 volts, adding more to the supply voltage will only get you so far.
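The per-pin current behind this thread is easy to check. A quick sketch, assuming six 12 V supply pins per 12VHPWR connector and ignoring derating and contact condition (which is what the melting debate is actually about):

```python
# Per-pin current through a 12VHPWR connector, assuming six 12 V supply
# pins. Simple I = P / V / pins; ignores derating and contact resistance,
# which is where real-world melting problems come from.

def amps_per_pin(watts: float, volts: float = 12.0, pins: int = 6) -> float:
    return watts / volts / pins

for w in (450, 600, 1000):
    print(f"{w} W -> {amps_per_pin(w):.1f} A per pin")
```

At 1000 W that's roughly 14 A per pin on paper, which is why a single imperfect contact carrying extra share of the load heats up so quickly.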
Dude, every time, I can never guess when you'll do the segue to your sponsor. Always so random with it, good stuff
External GPUs used to be just for CAD machines doing 3D in the late 90s and early 2000s.
I love this in so many ways.
I love these types of videos because it's like two best friends doing some incredibly janky shizzle; both know it's probably super stupid but they still just go for it
Linus: Our sponsor
Alex: Wait, we didn't kill anything yet
Corsair/4090:
Love these mad-scientist videos with Alex!
I'm glad Linus is still making space heaters that you can game on at super high perf numbers
4:25 "...my vision's going...", I felt that, Linus.
Man I'm really in love with all the videos with the new cooler.
I used to push 1.425V out of 700-series Titans and GTX 980 Classifieds; you'd be surprised how much voltage they'll take 😂, that was literally my gaming overclock voltage. This was on liquid cooling. One of the Titans did eventually die after about 2 years of that daily, I think; the 980s weren't fazed by it at all, but those were Classified cards built for abuse
I understand the architecture is not the same and these gigantic dies can’t take massive voltage without extreme cooling like the older cards could.
larger process nodes also take more voltage. What was 700/900 series? 28nm? 32?
What's the theory behind increasing the chip voltage? I can't think of a way that increasing the voltage will increase the clock speed. The chip will draw more current at the increased voltage and dissipate more heat, but it's not doing more calculations. Sounds like people who equate petrol octane to more power, or who have tuned exhaust pipes thinking the louder the exhaust the more power the engine is producing.
@@axelBr1 it allows you more stability for a higher clock speed. You have to manually boost the clock speed.
@@calvindibartolo2686 28 nm I believe
@@axelBr1 Transistor gates require a voltage difference to switch properly, and the faster the chip runs, the more noise there is preventing the chip from recognizing where a one is and where a zero is, so the voltage has to be increased to keep everything stable
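The flip side of that stability tradeoff is cost: dynamic switching power scales roughly with voltage squared times frequency, which is why small voltage bumps get expensive fast. A rough sketch with made-up baseline numbers (not measured 4090 figures):

```python
# Dynamic power scales roughly as P ~ C * V^2 * f. Treating the switched
# capacitance C as constant, scale a baseline power figure to a new
# voltage/clock point. Baseline numbers below are illustrative only.

def scaled_power(p_base: float, v_base: float, f_base: float,
                 v_new: float, f_new: float) -> float:
    return p_base * (v_new / v_base) ** 2 * (f_new / f_base)

# A ~10% voltage bump plus ~11% clock bump takes 450 W to ~600 W
print(round(scaled_power(450, 1.05, 2.7, 1.15, 3.0)))
```

That quadratic voltage term is a big part of why the card soaks up hundreds of extra watts for only a few percent more clock.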
Remember that episode where you built the cooling system that used two of those blower motors for filling bouncy castles? I think that would have helped in this episode.
To be fair: It does say GPU Twerk
Yeah, greetings from Brazil!!! I love this kind of video!!! JGARONI OC here baby
The fascinating thing about all these crazy volt mods with ridiculous power draws is that if something goes wrong, it bangs big time
I really enjoy these videos. Thanks Linus and Luke (and the LTT team) for the great content.
Hope that K100 Air doesn't have the same issues hundreds of us have with the K100 optical switches... you know, the one where the keyboard records what you type and randomly decides to type it back at a random moment, sometimes a week later, crashing in the process and forcing you to disconnect and reconnect :)
Hope you're fixing this crap soon Corsair.
Well, the first mistake was choosing Corsair for a keyboard, especially a gross external laptop-style keyboard
I love how Alex and Kyle are like the same coin, but two different sides. Alex on one side is completely unhinged, while Kyle runs multiple sanity checks over and over
16:13 Win + Ctrl + Shift + B has saved me mid Rainbow Six game multiple times where for some reason my GPU would "Disconnect" (is what Windows Event Viewer calls it) and it would switch to my CPU graphics.
Loving the jamming music during the soldering, water-cooling, working on the card 👍 good job editors you got a good taste in music
I love how I barely understand what's going on, but Linus's enthusiasm gets me excited for whatever is going on.
One day I'll have a computer worth talking about 😅
Some poor developer is currently sifting through a dozen crash reports trying to figure out what happened to cause their game to push a GPU that hard
they have linus on speed dial
I wanna see the tech support staff staring at an RMA'd 4090 with a 1000w bios
Now this is the kind of video about power consumption I would Watch 🔥
0:41 My bad listening... I understood "affordable" at first, but while searching it on ama I realized: he said "portable"...
1:58
Nvidia: nammumamanamua
Protip for Alex: you can toggle between current, min, max and avg values in GPU-Z by clicking on the field which displays the value
My favorite thing about LTT is the complete and total disregard for caution and paying attention. Like how in the world is Linus able to drop so many things and still have sponsors?! It's insane! But, then again, the dude is THAT guy. Right place, right time, right everything. Super lucked out!
I appreciate that not only did you pick the Canadian GP you also went with Stroll. Peak Canada there.
If you guys wipe some soldering flux on the PCB where you're trying to solder those wires, it'll make it a LOT easier and pretty much suck the solder into the joint for you. No more of those dry joints you're getting.
Alex doesn't care to do anything right. If you care at all about seeing things done correctly you should just skip his videos.
Kinda surprised the strix wouldn't accept +200/+900 right away. I have the TUF model and +220/+1700 has been very stable for me.
I got two MSI Gaming Trio X; one wasn't going above 2850, the other accepts +230/+1500 and is stable in Portal RTX (which in my tests is the most aggressive on the card) in the 2985-3035MHz range. There is some silicon lottery for sure