Stay tuned to LTT for the full 4090 review coming soon! Would you consider buying a 4090? What GPU do you have right now?
AMD Radeon 6600... 😷
Ryzen 5600x
3070. I would like to upgrade to a 40 series so I can 4k game with less stress on my GPU.
I have a Razer Blade 14 with a 3070. Bought it in the summer of ‘21 for a solid PCVR experience to couple with my Quest 2 and I have no regrets :) everything is between 100-120fps in VR.
Sticking with AMD.
3070 3700x enough for now
With the size of these things it's starting to feel like installing a PC into your PC.
funnily enough pcie "PCs" (DPUs, SmartNICs, etc) are much, much smaller
Well, for all intents and purposes, a GPU IS a PC inside your PC. It has a processor, "chipset", memory, input and output IO, and power delivery circuitry.
@@Renuclous Nooooooo..... A PC is a PC not because of hardware alone... software is part of the equation.. Linux, Windows, macOS.
@@Todd_Manus firmware
yo dawg i heard you like PC's
Just in time to function as a space heater in winters.
😂
Yes but will anyone in Europe be able to afford to run it?
@@davidlazarus67 good thing not everyone lives in Europe then.
@@chitorunya Yes, but our electricity costs more than double what it does in North America. And we are a much bigger market than the USA.
Any 110/120V users running an actual space heater in their gaming room will have their breaker trip.
In the distant future, the RTX 9090 will be solid state and will be the size of an iphone 4. You heard it first from SupplyNinja. Don't forget it.
you’re not wrong lol
Every time he slams it on the table, I get a sharp pain up my spine, this is no way to treat your kidney!
Why not? I treat my liver like that every other weekend
Nvidia is a piece of turd, which makes their products turds. Not kidneys. He is slamming a turd on the table which should make you feel repulsed
@@nickevison391 By slamming it on the table?
@@stevenjoummaa4640 metaphorically with alcohol, yes
200th like
With how big GPUs are getting, I can't wait to see someone make an SFF PC out of the shell of one.
Right, I am turning my Star Wars Titan XP enclosure into an RPi build, but it is pretty small inside.
😂
At this point you're modding the GPU to put a PC inside it.
@@john-paulhunt6943 It gets to the point where the majority of people just don't need a better one. Games are made with the majority of people in mind; medium will just be treated as the old ultra setting, and the newest GPUs will play games above that.
what does any of this mean
I'm so glad he brought realism to the durability of power cables. I could tell when they mentioned the lifespan that most people had never heard of a rated lifespan for cables being plugged in and out, and freaked out over something they had no idea about.
10 times is nothing
Just think about that: on the 12V rail (modern GPUs don't really pull from 5V), 400W is about 33A of total current. Divided across the 4 connectors, that's around 8-10A per 8-pin (I don't remember how many of the pins actually carry power, it's been a while). That must be getting very close to the limit of how much those connectors can take; it's right on the limit, and even a bit of oxidation after a year or so suddenly gets you massive increases in temperature on those pins. They should have worked with PSU manufacturers to add new connectors.
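For anyone who wants to sanity-check that arithmetic, here it is as a quick script. The pin count and the all-12V assumption are mine for illustration; actual per-pin ratings depend on the connector, so check the Mini-Fit Jr datasheet rather than trusting this:

```python
# Back-of-the-envelope connector load check.
# Assumptions (mine, for illustration): all power on the 12V rail,
# 4x 8-pin connectors, 3x +12V supply pins per 8-pin.
POWER_W = 400.0
RAIL_V = 12.0
CONNECTORS = 4
PINS_PER_CONNECTOR = 3

total_a = POWER_W / RAIL_V                        # ~33.3 A
per_connector_a = total_a / CONNECTORS            # ~8.3 A
per_pin_a = per_connector_a / PINS_PER_CONNECTOR  # ~2.8 A

print(f"total {total_a:.1f} A | per 8-pin {per_connector_a:.1f} A | per pin {per_pin_a:.1f} A")
# Heat in a pin scales as P = I^2 * R, so oxidation that doubles the
# contact resistance doubles the heat dissipated in that pin.
```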
Only 1,600 before tax. Wow, that’s more than a lot of people’s whole build
Could very well get up to 1800 depending on which manufacturer you get.
It will be around 2200-2300 in my country which is like three times more than the average monthly salary 😬.
I really wonder how many people will buy even the 4080 12 GB also known as the new 4070. I bet even the true 4070 will be around 800-900 here which is awful.
RDNA 3 please save us with more reasonable pricing.
My house costs less than that
Yeah, tell me about it. Mine was $1,250 and I still felt like I was overpaying.
Brooooooooooo.... that would be about as much as my whole PC! I have a 2080 Super, but I am pretty sure that is only a fraction of my total rig's cost... I did get my PC on a sale though!
Can't wait for the day a manufacturer comes up with the idea of using 2 PCIe slots with one card. One for data, and the other for structural integrity.
By structural integrity you mean to prevent sagging? JayzTwoCents made a video on how to do it.
...what do you think the massive heatsink does?
Yeah that would be smart. Second PCI-E x16 slot is unused by the vast majority of gamers anyway since SLI is a thing of the past so this would be a good way to make use of it.
@@st0nedpenguin To transfer heat and make the card super heavy? I'm not sure what you're implying here.
@@nabawi7 The heatsink provides much of the structural rigidity.
The trick is to sell someone else's kidney instead of yours
Just imagine, the 5090 will use 2x PCIe x16 slots and it will come with an additional 600W PSU in the box😂
In the box? Not a chance.
The next one will ship with an ATX wallpower plug at the back. Just so it can take a whole breaker for itself. \sarcasm
Nah it will come with a dolly so you can move your appliances hooked up to the 240V outlets
Only 600W?
@@nielskoolstra that actually makes sense and id be fine w it
Yo Linus thanks man, when you smashed the gpu on the table and bent the cables aggressively, I was reminded why I still watch you after all these years.
Yea…Jesus Christ but thank god it’s only a zotac
Lmao! I was thinking about how his handling of tech, and GPUs specifically, hardly fazes me anymore 😂 it used to make me so uncomfortable lmao!
the fact they're trying to save dp 2.0 just for the hype of the 5000 series is so disgusting. over $1000 and still stuck on dp 1.4a
This GPU looks like it needs its own case to be held properly without ripping the PCIe connector off the motherboard
Seriously, how do they make sure the card's PCB doesn't just snap? I'm less concerned about the motherboard; it's loaded along its length and not perpendicular(ly?), and the PCIe connector is soldered in with so many pins, it should hold up fine.
That's what I said on another guy's video, you need a construction licence to build a pillar for this card
@@LuLeBe First off, it's "perpendicular to", since you seem to be wondering (just trying to be helpful, not sarcastic, I don't know how my comment comes off).
Now, if you've ever worked on a PCB, you'd know these things are tough as shit. They're just layers of metal on top of each other, really. I tried to drill through a CPU once (I killed it by mistake and chose to make a keychain out of it instead of making e-waste) and it broke my drill bit. I thought the PCB can't be that tough. Wrong. I was worried about the IHS, as I didn't have a proper drill bit on hand to drill through metal, but it chewed through the IHS easily and quickly. Then, I thought the PCB would be a piece of cake aaaaaaand it wouldn't drill through and I snapped my bit in half. I had to borrow another bit to finish the job (while keeping the same hole size).
450w, that's more than my entire PC. 65w ryzen 1700, 175w rtx2070. 65w monitor 1440p 144hz LG32GK650f.
Depends. If the stiffness is good it should stay straight and the weight distribution should be okay.
Not sure about other people, but I'd be really interested to see a "we made a computer more efficient than a console" revisit, where you take a 4090 and undervolt it to be about equal with a 3090, and see where it ends up power-wise. I'm actually super curious to see what happens when you don't go completely overkill with this generation's cards.
you can undervolt a 3080 to draw 100w less power and it only loses around 1 to 2% performance.
@@SolarisUK now this is awesome. Any video on that? Would love to watch something like that
@@mve_simracing th-cam.com/video/FqpfYTi43TE/w-d-xo.html
@@mve_simracing silicon lottery. Usually you can undervolt so it draws 80-100w less in afterburner without even touching the core or VRAM.
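Side note in case anyone wants to script this: undervolting proper is done with a voltage/frequency curve editor like MSI Afterburner, but the closest programmable knob is the power limit, which the driver exposes through NVML. A minimal sketch with the nvidia-ml-py bindings — this assumes your driver allows the change and that you're running with admin/root rights; dropping the limit ~100W roughly mimics what's described above, though a proper curve undervolt loses less performance:

```python
# Sketch: cap the GPU power limit via NVML (nvidia-ml-py package).
# Not a true undervolt, but a scriptable way to trade a few percent of
# performance for a lot of watts. Usually needs admin/root rights.
import pynvml

pynvml.nvmlInit()
try:
    gpu = pynvml.nvmlDeviceGetHandleByIndex(0)
    min_mw, max_mw = pynvml.nvmlDeviceGetPowerManagementLimitConstraints(gpu)
    current_mw = pynvml.nvmlDeviceGetPowerManagementLimit(gpu)
    print(f"limit range {min_mw // 1000}-{max_mw // 1000} W, current {current_mw // 1000} W")

    target_mw = max(min_mw, current_mw - 100_000)  # try ~100 W lower
    pynvml.nvmlDeviceSetPowerManagementLimit(gpu, target_mw)
    print(f"new limit: {target_mw // 1000} W")
finally:
    pynvml.nvmlShutdown()
```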
And don't underestimate the power of 2K resolution. I much prefer having 2K and higher settings to make the environment more epic 🙂👍
We went from GPU Sag to GPU Gangsta Lean
I like the part at 6:08 where he says that being able to change the game could be a game changer.
I'd hope it's a game changer.
Everyone 60 second in Africa, a minute passes.
The first ever computer was over 300kg, looks like we are moving in the wrong direction
299kg of that was not for cooling though
The chips keep getting smaller.
It's the cooling that is becoming bigger again.
@@RadioactiveBlueberry True but shhhhh
@@moortu are these chips actually smaller though? the problem is that transistor count keeps increasing while die size stays the same or shrinks
You're just considering weight, what about the sheer computational power 🤔
The most I'll ever pay for a GPU is 1000 dollars, provided that card meets or exceeds the price-to-performance ratio that was once seen in the GTX 1080 Ti.
2:26 yeah dude, I also love the Radeon 5900x, my favorite cpu by far, the arc 12900hk isn't bad either
LMFAOOOO
I had to replay that part myself just to make sure I heard Linus right
ryzen, radeon, rtx, ratatouille... a lot of r's.
Lmfao - I was like, did I miss a new CPU release or something? There wasn't even a * to correct it.
I'm just entertained that Nvidia is recommending you pair it with an AMD CPU, though the lower-than-Intel power consumption probably mattered more than Intel now competing with their GPUs.
I highly support EVGA's move to drop Nvidia.
And more people need to do that if they're going to listen and make changes.
It’s been over 10 years…and Nvidia still isn’t listening. They’re like the ‘Putin’ of the Graphics cards market.
lol lmao
Intel's Arc isn't quite good enough to drop Nvidia yet, especially since the main game I play is DX11 only...
@@edragyz8596 I mean, I don't even want Arc to drop Nvidia. What would be really interesting is if Intel basically matched older-generation GPU performance and offered it as an alternative to the latest, greatest, and most expensive GPUs.
I mean, in 3 years the 3070 is still gonna be great for most games, and Intel could simply aim lower and offer entry-level 1080p/1440p cards that don't compete with the best but offer solid performance for a lower price.
Like, I am not looking forward to my GPU dying in 2-5 years and having to upgrade to a brand-new brick that rips my wallet's contents to pieces. I just want an instant replacement priced in the £250-£429 bracket.
It's definitely going to be something that develops further as prices continue to go up at the top end.
@@TheArsenalgunner28 That's all fair and all, but it seems Intel is the only one who cares about beating Nvidia in the feature game. If Intel doesn't release an xx90 killer at some point, I'll still be stuck buying an xx90, because AMD just keeps MISSING.
The sizes of these things are getting to the point where Cloud Streaming can really become a more appealing option with a bit more advancement of the tech. I saw a video with the 4090 ROG Strix and it requires a full ATX tower due to how large it is. That's insane. The heat output could probably take care of an entire 3 bedroom house in winter!
A typical single-room space heater is upwards of 750 watts, some even up to 1500 watts. So no, this card could not provide the same heating as 3 of them.
400W < 2250-4500W (in case the math confused you)
@@Adaephonable wow no need to be condescending! Obviously my comment was hyperbole...yeesh!
Could someone explain to me what the hell this is? I'm lost, what is this device used for?
@@Lucas_Simoni Chubbyemu fan?
@@Lucas_Simoni That is poetic justice
That’s insane that a card that expensive doesn’t have DP 2.0
That's Nvidia for you, they know people with more money than sense will buy their bs GPUs no matter how much they hobble them...
If AMD 7000 series gets dp 2.0 then Nvidia will become peasant status!😜
Nvidia is long overdue for an ego correction!
Wouldn't that be on Zotac in this case? Pretty sure the manufacturer chooses which ports to use.
@@verakoo6187 no.
@@verakoo6187 no, Nvidia locked it down
How does a modern GPU not have DisplayPort 2.0?
It's new, but it's not modern
smh my head
Arc a770 has display port 2.0
buy intel gpu . 😂😂👌
New AMD CPUs support DP2.0
You've got almost 100 members on your team and one still fouled up the color grading on this video, especially on your hands.. :D The info, btw, is much respected! :)
I remember when a GPU needing an 8 pin was insane, but 12 is ridiculous
It has 32 pins for your power supply. I'm not sure what that other dongle is for; Linus says it's for communication with the power supply. Probably not needed. My 800-watt PSU has four 8-pins. It may run it, but not with a hot CPU.
@@XGalaxy4U Soon you'll need your own wind turbine to be eco-friendly, or a wind turbine subscription where a few people borrow one and split the bill.
I remember when 500€ for a GPU was insane
I remember top-tier GPUs costing $400. That's how much a brand new 2900XT cost that I bought with my university scholarship savings.
I mean, I remember when we didn't even need extra PSU connectors for the GPU, or better yet when we didn't have GPUs to begin with. Times change, technology moves forward.
I love the EVGA model. The RGB is sick, and those fans are… man they are quiet!
hah
Since EVGA won't be putting it into production it will be very quiet.
@@jondonnelly3 insert capt america meme
I've heard that the power consumption is incredibly low
It does have some insane technology, in addition to RGB channels it even has an Alpha channel!
Potential talking point: CARD WEIGHT. I cannot stress this enough. GPUs are getting massive, and there's only so much that a tiny PCIe x16 slot can hold before something gets damaged or bent. Perhaps Nvidia and their AIB partners should start adding struts and/or mounting points to the rear of the card to reduce sagging?
Linus carefully handling the Arc A770.
Also Linus desk slamming the RTX 4090.
A heatsink _that size_ is not just ridiculous, but bordering on impractical. Not just in the sense of space, but the weight of the card as well, which is getting to the point of requiring a support bracket to keep it from coming out of the socket.
It would make a hell of a lot more sense to turn it into a kind of AIO with the radiator/fan attached elsewhere, or redesigning it to fit on the card, if possible.
Achtually a ton of them will ship with support brackets. Go watch the Gamers Nexus video about AIB 4090 announcements. The marketing they came up with for their GPU support brackets is hilarious.
Right, when you implement a solution to fix a thermal problem, it introduces a new issue. I want to design a new GPU Bra. Make it all lacy and colorful. Or a Swedish-model pump-style novelty GPU riser.
They're making ones with AIO radiators, like the MSI Suprim Liquid 4090
@@gamingmarcus Yeah I saw. It's stupid to require a bracket for this.
@@BrentJohn My next card will be getting some support no matter what.
I have a 10-series card in my system, which is around 5 years old, and despite being a relatively small 2-slot card it has developed some noticeable sag over the years because it has no support at the back. So this isn't just an issue with chunky cards.
I don't know if this would ever create real problems down the line but there's hardly any effort in putting a 10cm piece of plastic under your GPU so I'll just do it.
NVIDIA needs to put a power connector on the GPU for the 50 series, because you can't buy a power supply powerful enough to run it
The PCI express connector needs to be redesigned as a cable and you just mount the module inside your case like a PSU.
They're only charging a kidney for these? Amazing, I thought they'd cost an arm and a leg.
(Studio audience laughter)
Bazinga type humor
the title is a joke in bad taste, for me
Next gen pricing will be your first born.
@@darylallen2485 🤣🤣🤣🤣🤣🤣🤣
Nvidia, take my two kidneys please!
I'll need that 4090 SLI setup!
That cooling system is freakin huge. I bet you can heat your entire house with that card.
Probably not the houses that people who get these live in
This thing consumes 450 watts.. the only card that uses that much power..
@@bstaznkid4lyfe392 The 3090 Ti is also 450 watts.
Space heaters are obsolete, just buy a modern computer.
I think we could use a generation where the focus is on making things smaller and more power efficient with only minimal performance increases.
No we dont
@@scotthadley92 We don't what?
Yes, but no one would buy it. Do you really want to pay more for efficiency when, even with the cost of electricity, no typical user will ever see the difference on their electricity bill?
@HackerMode If you want a cheaper, more efficient card, just go with a lower-tier card of the new generation (the 4070 is a bit like a more efficient and cheaper 3080-3090). I get the point of stopping the power-hungry cards, but I think with the current market you can already do that if you go with a smaller graphics card.
i think what you're trying to say is we should focus on making these cards draw less power overall, instead of just more performance per watt. ultimately when this card is pulling 400+ watts, that's a shit tonne more than the majority of other cards on the market, but the performance gains don't necessarily reflect the power draw. i totally understand!
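That distinction is worth making concrete: a card can get better in performance per watt and still be worse in absolute draw. A toy calculation — the fps and wattage numbers below are made up purely to illustrate, not benchmark results:

```python
# Toy perf-per-watt comparison. The fps/wattage figures are made up
# purely to illustrate the point above; they are not benchmark results.
def perf_per_watt(fps: float, watts: float) -> float:
    return fps / watts

old_card = perf_per_watt(fps=100, watts=350)  # hypothetical last-gen card
new_card = perf_per_watt(fps=160, watts=450)  # hypothetical new flagship

print(f"old {old_card:.3f} fps/W vs new {new_card:.3f} fps/W")
# The new card wins on efficiency (fps per watt) while still drawing
# 100 W more in absolute terms -- which is exactly the complaint here.
```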
Why the fuck is Nvidia trying to cook DLSS and RTX into the same package at 5:00? Raytracing (a proprietary lighting engine) has NOTHING to do with DLSS adding new frames to increase frame rates with AI. These two features are completely different from one another.
When I activate RTX it's for the lighting engine, not some stupid DLSS gimmick that degrades my image with AI, ffs.
I hope AMD will be laughing all the way to the bank in november...for our sake
Definitely skipping at least the 40 series card because this shits crazy
and intel, too. with prices 5 times lower that this absolute spit in the face.
🤣
@@512TheWolf512 and like 7x weaker performance tbf
@@ntdspades5971 I reckon at this rate we will be waiting for 60-series cards. I find it hard to trust AMD/ATI, as every GPU I've had from them has either failed or had driver issues. Yes, that may have been fixed, but I don't trust them in my work PC after so much lost data from random shutoffs/dead GPUs. I would never pay this much for a GPU either, though, so the waiting game begins.
I think these new cards are a prime example of why Peltier TECs should start being implemented. A solid copper plate >> TEC >> a thin-fin heatsink designed to have air forced through it as fast as possible, or even water cooling.
You are aware of how inefficient Peltier coolers are? Instead of a 400W card you could have a 600W card that does the exact same thing...
@@Adaephonable I'm more concerned with getting the heat off the card and into another more efficient medium as fast as possible. If we do that the wind tunnel might be able to do some real work. Hahahahaaa
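For anyone tempted by the TEC idea, the energy balance is the catch: the heatsink has to reject the GPU's heat plus the Peltier's own electrical input. A rough sketch — the COP value is purely illustrative, since real COP depends heavily on the temperature delta across the module:

```python
# Rough Peltier (TEC) energy balance.
# COP = heat pumped / electrical input. 1.0 is an illustrative value;
# real COP depends heavily on the temperature delta across the module.
gpu_heat_w = 400.0
cop = 1.0

tec_input_w = gpu_heat_w / cop               # power the TEC itself consumes
heatsink_load_w = gpu_heat_w + tec_input_w   # total heat the fins must shed

print(f"TEC draws {tec_input_w:.0f} W; heatsink must reject {heatsink_load_w:.0f} W")
# At COP 1.0, a 400 W GPU becomes an 800 W cooling problem, which is
# why TECs rarely make sense for cooling a whole card.
```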
Looking at those coolers, I'm actually expecting the fans to be oriented sideways at some point. If you're already 4 slots wide, you might as well exhaust out of the back of the case.
I've never been less excited for a gpu launch...
Because you can’t afford the 4080 and “4070”
Has nothing to do with this particular card
It’s pricing is in line with last gen
@@eliashabash7591 mate. That's not the issue. It's just a stupid buy any way you look at it. And Nvidia is literally worse than Apple, so it hurts to support them.
@@eliashabash7591 Yeah, the 4090 is in line with the 3090's pricing, but the 3000 series promised much bigger improvements over the 2000 series than this does over the 3000. So it made sense that the 3000 series was really expensive, because it was such a big step. That launch was 2 years ago; by now the prices would usually (without miners) sit at like $550-600 for a 3080 (MSRP was $699). Compared to that, the price has really gone up. And you know that actual prices will be even higher than MSRP.
I remember when a friend got a 580, it cost like $470 or so new. My 970 was 299. My 2070 was around $420. And the current 4070 costs $900 now???
@@LuLeBe It's ludicrous..
@@SpartanArmy117 Should have told Germany that when they relied on the Russians for their energy.
6:20 "could be an absolute game changer"
I see what you did there........
Anybody else getting old LTT nostalgia with this unboxing? Love the current main channel stuff, but it's nice to have that feeling of old comfort once in a while with Linus
From what I remember, that was the whole intention of creating this channel: to have a place where their old style could continue, since they wanted deeper dives and pieces that tell a story on the main channel.
I'm still amazed that Zotac is still around. Amazed and grateful, because they're one of the few companies that still get weird with their product designs. Need an overpowered and very short GPU? Check Zotac, they probably have it. Need a low-profile GPU? Check Zotac.
I probably just got unlucky but a Zotac GTX 260 card was the only GPU that ever randomly died on me and the warranty process was a nightmare. I don't think I even got a replacement in the end. I haven't trusted Zotac since then, but to be fair, that happened like 14 years ago.
I wasn't grateful when they cranked their MSRPs to scalper levels and were selling $700 3060s, $1200 3070s, and $2500 3090s on their own website during the shortage. Showed how much they cared about their customers then.
@@tibor29 Yeah, DoA products and other faulty products happen to every manufacturer. It's a real shame that EVGA got burnt out on the graphics card market because they legitimately had the best customer support for their stuff. I more appreciate Zotac for offering unusual products than for their overall quality.
I have a Zotac OC Mini RTX 2070 and it works great, I haven't had any problems with it. Zotac has become a better company than they used to be. Their customer support has gotten better after a lot of complaints. The quality of their GPUs has gotten massively better over the years because consumers complained about it. Zotac does have clean-looking cards and funky-looking ones, such as the one in the video.
Zotac has a history of doing low-profile cards, fanless cards and all kinds of stuff, but look at reviews before you buy; there has also been some really bad quality stuff from them from time to time.
So next gen, will we see the cards not being installed into the PC anymore? We'll need a separate box just to house these stupidly huge things.
All hail the 7 slot card, imagine the cooling fans that you could fit and the amount of cooling that you could produce...
whole case cooling xD
They could use the "Blowzooka" fans from Delta Electronics that they tested on LTT :D 7400 CFM or so!
I got so confused with the Radeon 5900X CPU (2:26), but it was just a mistake. I started questioning everything I knew.
Lol same! I was looking for this.
One kidney already went to iPhone
LOOOL
Soo true bro 😂
RTX 6090: Requires a separate 1000 watt power supply with a 96 pin pcie cable and a separate case that has 20 fans installed (For average performance of course. Full performance requires more extensive measures)
No like straight up this stupid shit is gonna happen if they keep going in this direction, it’s so lame for consumers
Pretty soon we are going to be installing motherboards onto the GPU instead of the other way around.
I like how they release new GPUs when I can't even get my hands on a 1660 from the stores....
With the introduction of these 40 series cards, I think I'll get a 20 or 30 series
Most people don't even need a 30 series gpu let alone a 40 series monstrosity like that.
The 2060 Super is highly recommended. It runs most games at 70+ fps at 1440p ultra (I wouldn't raytrace though), and worst-case scenario you drop the resolution to 1080p or reduce quality on distant shadows, because they're pretty taxing for something you'll never notice unless you go Batman detective mode on your monitor over something as trivial as far-away shadows. Also, after doing a little research, I found the 2060 Super is about on par with the 3060: the 3060 handles lighting better, but the 2060 Super runs calculations, physics and everything else slightly better, and they're around the same price. If you get a 20 series, get the 2060 Super; if you want better than that, go 3070-tier or better, otherwise you're wasting money.
2:32. Radeon 5900x is my favorite cpu! Much better than Ryzen!
With the price of electricity nowadays i am choosing between a Bentley and a 4090.....
As much as I kinda like the new power connector for the RTX 4000 Series, I kinda wish they had placed it on the back side of the card (Similar to the RTX 2060 FE and some Quadro cards) for a cleaner look on the inside in terms of cable management
Why can't they just add it to the mobo? It's not like this is a new issue, why not just add compatibility to the motherboard?
@@bthatguy1181 Because the PCI-E slot can’t handle the power needed either.
@@HVDynamo They could come up with a standard connector that fits right after it to handle the watts needed.
It's not like PCIe hasn't been changed again and again, either.
I think the issue there is the extra wiring required to extend the power connection to the board from the end, instead of surface-mounting the connector directly to the board.
I think GN had complaints about how the 2060 FE's connector made teardown a lot more difficult, as well.
@@bthatguy1181 Then you need to get PSU, MB, graphics card, and likely case designers in line for it.
And you probably can't be backwards compatible. You can adapt old PCIe power connectors to the new one. It may not be ideal, but it is viable.
Then you have to consider managing it for vertical mounts etc.
And MB makers need to include traces capable of carrying all that power from where it enters the MB to the PCIe slot.
All in all, it would be a LOT more complicated to implement for a small aesthetic benefit that a huge portion of their customers wouldn't care about.
"zotac recommends a 1000w power supply" , ahh finally i was looking for a nice space heater
Also you can do skateboarding on it and look like Marty McFly
I laughed too hard when he whipped out the Zotac Gaming box after explaining the cool ideas he had. I needed a good laugh today. 😂
I always thought the argument of NVIDIA Remix changing artistic intent was such a nonsensical argument. Modding is already a thing, ESPECIALLY in Bethesda games. This is just a new generation of graphics mods, which have existed for ages.
For example: I play Fallout 4 HEAVILY modded, to the point where I replaced all the dead plants with living plants (as would be the case 200 years after a nuke impact). Yes, it completely changes the environment, but that's the whole point.
You add visual mods specifically to change the whole vibe of the game, or to improve what was there.
And it's not like you're forced to use Remix, you always have the option of simply not using it. It's just a cool tool to make a new kind of mod.
Hello there.
That's true! I also play FFXIV with better ass and tits mods and that won't change anytime soon.
Especially since if *you* don't like the modernized/modded look, *then you don't need to install these mods!!!* 🤦♂️
That's no reason to want to keep others from making or installing them for _themselves,_ sheesh!
It's like those purist idiots who rant about remastered re-releases of classic music albums or movies. Of course they're different; there's no point in releasing a remaster if they don't change the sound/visuals to make it feel like something new again! But these fools act like all the copies of the original release will just disappear in a puff of smoke, never to be heard/seen again.
Sure, they might be pulled from streaming services and replaced by the new master, but if streaming is your only means to access a classic, then you probably don't care about it that much anyway (at least it's not a good idea to rely on streaming services to provide content you deeply care about in that one specific version indefinitely).
@@LRM12o8 I have actually seen people argue against modding support like that. It's a terrible view to take imo. Let modding support exist and let people who want to change their game change it x.x
I myself don't mind old games or graphics in general, but I know people who do, so mods that make the games look "newer" are good for them.
So the 3090 Ti and the 4090 both have the 12VHPWR plug. But with the 3090 Ti, the adapter was just a 12-pin adapter without the 4 extra pins (the same adapter as the normal 3090, minus one 8-pin), while the 4090 has the 12+4-pin adapter. So I'm wondering: if the 3090 Ti, which has the same plug as the 4090, worked without those 4 pins, shouldn't the 4090 do the same? Or do you NEED those 4 pins? The 3090 Ti and 4090 PCBs are not that different, so in theory what worked on the 3090 Ti should work on the 4090.
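For what it's worth, my understanding of the ATX 3.0 / PCIe CEM 5.0 spec is that the 4 extra pins are sideband signals, and two of them (SENSE0/SENSE1) tell the card how much power the cable supports; a card reading both as open is supposed to fall back to the lowest power tier rather than fail outright, which may be why the 3090 Ti (which seemingly didn't implement the handshake) got away without them. A sketch of that logic — treat the exact pin-to-wattage mapping as my reading of the spec, not gospel:

```python
# Sketch of the 12VHPWR sideband sense logic as I understand it.
# Key: (SENSE0 grounded?, SENSE1 grounded?) -> watts the card may draw.
# The exact mapping is my reading of the spec -- verify before relying on it.
SENSE_TO_WATTS = {
    (True,  True):  600,
    (False, True):  450,
    (True,  False): 300,
    (False, False): 150,  # both open, e.g. the 4 sense pins are absent
}

def allowed_power(sense0_grounded: bool, sense1_grounded: bool) -> int:
    return SENSE_TO_WATTS[(sense0_grounded, sense1_grounded)]

# An adapter without the extra 4 pins reads as (open, open):
print(allowed_power(False, False))  # 150 -> the card should self-limit
```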
I don't know why, but I like the design of the Zotac cooler. Still wouldn't even consider getting an RTX 4090 card though (my electric bill is bad enough as-is).
What do you get charged per kWh? Here in California the average is 21c per kWh, which at 750W for 4 hours a day is about 63c a day (who plays that much every day though, and even 63c a day is still not that bad).
That is hypothetical though, as you are not going to be consistently drawing that much power over your 4 hours of base screen time, even while gaming. You will be hitting much lower.
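Same math as a reusable snippet, so anyone can plug in their own rate and hours:

```python
# Daily/monthly gaming electricity cost from draw, hours, and rate.
def energy_cost(watts: float, hours_per_day: float, price_per_kwh: float):
    kwh_per_day = watts / 1000 * hours_per_day
    daily = kwh_per_day * price_per_kwh
    return daily, daily * 30

daily, monthly = energy_cost(watts=750, hours_per_day=4, price_per_kwh=0.21)
print(f"${daily:.2f}/day, ~${monthly:.2f}/month")  # $0.63/day, ~$18.90/month
```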
NVidia Remix in the hands of environment artists who know what they are doing and understand the level design of games can really improve a game's quality and playability, without worrying about those downsides you mentioned in the video like strategically placed lights or the vibe of an environment. That is what I'm looking forward to, not haphazardly upgrading everything for the sake of upgrading it.
I think that the artist intention argument isn't valid for players who've already experienced the game at least once as it was intended. That's why modding exists. We want to try the game again in a different way, and the modder is presenting his or her interpretation.
fun fact: it can consume way more than 450w. the specified power consumption is often only the chip itself; the rest of the board can easily take another 100 or 200 watts. My 2080 Ti has a TDP of 250w but will actually draw 340w max
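If you want to check what your own card actually pulls rather than trusting the spec sheet, the driver exposes a live power reading through NVML. A minimal polling sketch using the nvidia-ml-py bindings (exactly what the reading covers — chip vs. whole board — varies by GPU generation, so treat it as a sanity check, not a lab measurement):

```python
# Poll live GPU power draw via NVML (nvidia-ml-py package).
import time
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)
for _ in range(10):
    mw = pynvml.nvmlDeviceGetPowerUsage(gpu)  # reading is in milliwatts
    print(f"{mw / 1000:.1f} W")
    time.sleep(1)
pynvml.nvmlShutdown()
```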
Looking at the size of this, I think it's the perfect time for case manufacturers to start experimenting with different mounting positions for GPUs. I know Lian Li has 2.
I cant wait to build a PC inside my 4090 at the rate these gpus are increasing in size
"450 watts of power consumption" Dang bru i only have a 500 w power supply 💀💀
I've suggested before that graphics cards be redesigned as a module.
Just a block like the PSU somewhere in the PC where it has dedicated air in and out.
Perhaps a dedicated connector as well instead of PCI-E
Back in the day it was an expansion card, nowadays most Personal Desktops have a graphics card. (PC's not workstations)
There is a case and a patent for that already. It even got reviewed on LTT.
PCs*
Linus made my anxiety levels skyrocket every time he purposely dropped the card 😢😂
He just wants to demonstrate FPS drops.
5:34 … compared to which gen GPU?! Based on what hardware was that claim made?! I never saw that demo run on old-gen tech, so take those words with a grain of salt :/
Man, the zotac amp holo 4090 is insane looking
It's the same card but with RGB bling
it looks like a giant soap bar
EVGA's 4090 would have been enormous
Having a Zotac 3080, this made me happy; nice to see Linus considers that the 3rd best thing ;-)
Scored one a week after the 3080 launch at pretty much MSRP. Been serving me very well since then.
The problem is that the 4090 launch price in Sweden is 23,000 SEK up to 29,000 SEK (2,100-2,700 USD) after taxes. So... the prices are just way off the 1,600 USD the USA gets. It will probably still get bought, but I wouldn't even touch it at that price. I bought my whole PC for 2,800 USD with a 12700K and a 3080 Ti.
Denmark here, and I'm in your boat as well. I got a 3070 laptop for 11,000 Danish kroner, so 6-8,000 DKK less than just this card.
"THEN I was gonna unbox one from EVGA..."
...too soon, Linus. Too soon.
The new cards from Nvidia are amazing, but they are way too expensive for me. The most I would spend on a GPU is $1000 CAD, considering my 6950 XT cost me $1000 CAD. Although, I'm extremely excited for RDNA3. I might buy a 7900 XT if it nears the performance of a 4090 in gaming at a price similar to the "4070".
You probably don't need this amount of power; just buy a 3090 Ti for less than 1000 Canadian dollars and it will work 7-8 years for you.
Bro you got a 6950xt you got no reason to upgrade
@@eye776 eh my friend got a 6750xt for 450 dollars that was the best price I've seen in a while
@@arashmh7391 getting a 3090 ti is literally a sidegrade from his 6950xt
I never really liked Zotac and the size of this card is quite literally absurd... but the design of that card is absolutely sexy af. The curved look is really unique and I would love that on a 3000 series card
Sizewise it reminds me of the Soundblaster AWE32, which fitted in an ISA slot.
It was so long, that in our pc case we had it in the bottom slot and it was resting on a small rubber block to prevent it from hitting the bottom of the case.
LINUS! QUESTION: What is the difference between traditional modding using higher-poly meshes, HD textures, and light sources (like ENB, or better yet ENB and ReShade together — I don't think ReShade is considered lighting per se, but they can look amazing combined)? So my question is... WHAT is "Remix" on the new 4090 bringing to the table that could NOT have been done via traditional modding or even overhauls? Are you saying that this new tech allows programmers to do things on the fly with real physics, without programming their own physics-based mod engine or plugin from scratch? Hence, basically, making it easier and likely better? Help me understand, oh true Tech God! Sorry for the caps... I'm just really hoping to get your attention 🙂
The main problem with this modding approach is that it's a graphics-only approach to the modding scene. It's basically a wrapper: extracting the DirectX draw calls and replacing/injecting the assets with this software. That's it. There's no mention of gameplay overhauls, animation, bug fixes, quality-of-life improvements, or new-content types of mods. While yes, this is a good way to enjoy the game from a visual standpoint, you might still have issues running the game in the old, jankiest way possible. Like watching Halo Anniversary's Cortana still using her original old animation, but with better graphics — that level of jank lol
@@Deliveredmean42 But I just assumed that the ReMix approach could be added to anything you've already modded with the types of mods you have referenced (patches, quality of life, animations, DLCs, etc etc). Am I mistaken?
I think a new PC motherboard layout will be needed in the future, and internal PC case design too. GPUs are the dominant thing inside a case now!
Computers used to be the size of rooms and we were able to shrink them down. Now it's time to get back to where we started and come full circle.
Will be waiting for the second gen of Arc or a price drop on the 30 series before I upgrade from my 20-series GPU. I like having kidneys!
Not a fan of the 4000 series… but if you need heating in your room, you get it with this card
2:26 ah yes The Radeon 5900x is my favorite cpu
I'm scared of how such chips will be adapted and put into laptops. And how do you cool them, I ask?
I mean, they managed to cram the 30 series into laptops (though most laptops with GPUs have awful battery life), so they'll find a way, maybe cutting out some stuff that they judge someone with a laptop won't need
Neutered performance and zero battery life I fear
Linus may have mentioned the 7 slot wide card. But how about a card that plugs into 2 PCIE slots?
@1:03 Linus gives me hope. Linus is living proof that technical details do not matter.
Despite its detractors, I’m extremely excited for RTX Remix, especially because it’s available for public use and not just for game studios. Modders are often some of the most passionate members of their game’s community, so giving them the tools to more easily enhance their games to such an extent will be a GAME CHANGER.
If only it’s wasn’t just games with a fixed function pipeline.
And open source so not just locked on nvidea gpu
This graphics card probably draws more power than my entire PC
450 watt TDP is obscenely unacceptable for a gaming card @ any price point.
Nuff said.
i remember ZOTAC posting a picture of a 3090 mining on their twitter/instagram when last gen was released... so much love for gaming
That card looks like a Halo powerup
Linus: *smacks a 1600 USD graphics card on the table*
my heart: *skips a beat*
I'm very excited for RTX Remix. It should be encouraged for older single-player games like Morrowind, to bring more life into 'older' games. If you own the game, you should be able to do whatever you want with it without the consent of devs who don't approve. And maybe we'll buy more older games if we can mod/update them for a modern experience.
Deus Ex: Human Revolution, NOT the Director's Cut.
Can't wait to see a comparison between GTA retextured with RTX Remix and the Defective Edition. It would be like:
Who would win? A millionaire company trespassed by a kid leaking footage, or a less-than-$2K GPU? ( •_•)>⌐■-■ (⌐■_■)
shut up don't give Bethesda an excuse to sell Skyrim again for the 54th time with a $60 price tag but muh RTX edition stfu
A reminder that the top gaming SKU during the 10 series was $699. This is a grievous price hike for the same amount of silicon.
Except the 10 series is a joke compared to this card...
The temperature would be hotter than the sun
I'm not even excited for the 40 series; the 3080 12GB will do me fine for 4K for now.
What 4K display do u have? A 43-inch 2160p panel has 102 PPI, which is sharp. I hope you're not wasting all those pixels on a small 27-inch monitor.
@@fynkozari9271 LG C2 42”
Can I use a 850w power supply with this card?
If you haven't bought one already: 800W is sort of the minimum. I'd recommend a 1000W PSU just to be safe, to give yourself a little wiggle room and future-proofing.
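A rough way to arrive at a number like that yourself: tally estimated component draws, then add headroom for transient spikes. The component wattages below are illustrative guesses, not measurements:

```python
# Rough PSU sizing: sum estimated component draws, then add headroom
# for transient spikes. All wattages below are illustrative guesses.
components_w = {
    "GPU (4090 spec)": 450,
    "CPU under load": 200,
    "board/RAM/drives/fans": 100,
}
HEADROOM = 1.4  # ~40% margin for transients and PSU aging

steady_w = sum(components_w.values())  # 750 W
recommended_w = steady_w * HEADROOM    # 1050 W
print(f"steady ~{steady_w} W -> shop for a ~{recommended_w:.0f} W unit")
```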
Never trust Nvidia gimmicks.
I am an RTX 3080 user.
Whatever their claim was about the 3080, the 40% figure was wrong.
It's all just marketing.
In the future I won't buy any new GPU, I will stick with my 3080.
That thing looks like a small skateboard, just add the wheels!
Wow, Zotac does it again with the coolest design ever!