You have an i5-4950 and expect no bottlenecks? The CPU is a HUGE bottleneck. You would need an i9-11900K or i7-11700K, maybe one of the upcoming Alder Lake chips on Intel's end, or a 5900X or better, and even then there is a small bottleneck
@@Techandgaminggalore you don't need a 5900X. More cores =/= better performance, at least for gaming. You'd get the exact same gaming performance from a Ryzen 5 5600X. Granted, if you are multitasking a shit ton of things at once, or doing rendering in the background while gaming, then a 5900X or 5950X is in order. Also, depending on resolution and settings, the GPU has to spend more processing power on each frame, so a weaker CPU that doesn't send as much frame data will be less of a bottleneck at higher settings.
@@land2097 still using that statement in 2020, eh? You people are pathetic. Use the real English language. Also, in regards to 8K, go to Gamers Nexus and see the shots taken at Nvidia, Linus and others, and you will also get to hear how this is not really 8K.
The difference between the 3080 and the 3090 (generalized) is 10 fps, for seven. hundred. DOLLARS. If you're an editor, designer, or anything that's heavy on VRAM, then the 3090 is way more than enough. But if you're a regular gamer with money to spend, the 3080 is more than enough to run any game at ultra settings while still holding good, smooth performance.
More like, do you even wear a leather jacket. While you were playing chess, I was practicing 5D chess. While you were playing around with hardware, I was creating hardware. While you were creating videos, I was creating hype. Now I have all the nicest leather jackets that exist. And what do you have? Exactly, nothing.
@@donizettilorenzo just because the Titan has more tensor cores doesn't mean the performance difference in certain applications should be as big as it is. The numbers show a purely artificial limitation on the 3090; I don't expect it to outperform the Titan in every application, but it should at least get close.
Nvidia literally made hundreds of billions off the mining crap (so much that they spent 40 billion to buy an unrelated CPU spec) and still they burn gamers... they have all the money but still they hurt us with a 150% price increase for the high-end gaming model... GREED GREED GREED has ruined our future (hoping Intel's coming video card will eat their lunch)
@@mariushernes606 the high-end "GAMING" model now costs $1500 instead of the $1000 at launch like it did before. Take away all the buzz and those are the facts. Nvidia has been making TONS of money and they still squeeze gamers for A LOT OF MONEY
You know what's really funny? Linus could've included RX 5700 XT SPECviewperf benchmarks to show how it gets the same score in Solidworks and crushes the RTX 3090 in Siemens NX (the 5700 XT scores 74 in SNX).
@@ginxxxxx The RTX 3090 isn't really for gaming. For professionals there are a lot of features that make the price OK. The real gaming successor is the 3080, and honestly it's priced very low considering the power and the previous generation's price.
It’s a glowing review if you have an 8K TV, understand the implications of trying to game on it (Doom Eternal will apparently run natively, but other games will use upscaling), and you have a lot of money to spend on that. If you’re a 4K gamer, this review would probably steer you away, honestly. I was thinking about buying one because I game at 4K at 144 Hz and I want as high quality as I can get, but from the numbers I’ve seen it’s not worth it for me.
It's a Canadian review.. Canadians can't leave a bad review, no matter how bad the product was. I bet they even leave a decent tip with bad service! Eh?
Regardless of what you're watching it on, our eyes can't even see the difference between 4K and 8K; those who think they can are easily fooled. The biology of your eyes doesn't allow for it
Imagine your mom saying "8K equals 33 megapixels, and a frame at 8K 8-bit carries 128 megabytes of data, and that's without textures and all that stuff. Now go back to your 360p display"
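The arithmetic in that joke actually holds up. A quick sketch of the numbers, assuming 8K UHD (7680×4320) and 8 bits per channel across four RGBA channels per pixel (the channel count is my assumption, not stated in the comment):

```python
# 8K UHD frame size
width, height = 7680, 4320

megapixels = width * height / 1e6     # 33.1776, i.e. "8K equals 33 megapixels"
bytes_per_pixel = 4                   # 8 bits x 4 RGBA channels (assumed)
frame_mib = width * height * bytes_per_pixel / 2**20

print(f"{megapixels:.1f} MP, {frame_mib:.0f} MiB per uncompressed frame")
```

That lands at roughly 127 MiB per frame, close to the "128 megabytes" quoted; the exact figure depends on the assumed bit depth and channel count.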
@@shadowkiller5520 I think you misunderstood Lastav: The link Lastav mentioned is the one I was talking about, but the original comment of Kavita with that link is removed.
3D modeler here. I'm just gonna wait for the 3080 20 GB version. The only real thing the 3090 has over the 3080 is the VRAM, and if we're gonna be able to get that for a fraction of the price soon... what's the point?
Can you please watch my video on the Pokémon that Ash can catch! (Fun fact: I predicted about 6-7 months ago that Ash would catch a Mewtwo, and Goh will get an Aerodactyl)
I saved up a bunch of money to get a new computer, a 3080 build, but still no cigar :,(. My computer is way too slow to do anything; I'd love to see the 3080 in stock..
@@me_and_me_ dude, spamming your video that is completely unrelated to the conversation/channel/video topic is really cringe. Your channel will grow, but this is the wrong way to hustle, and it's really just spam given how you have done it. I wouldn't be surprised if all of these are removed, and you may be silenced from the channel comments.
A friend of mine wanted to buy the 3090 because he's in dire need of a WS GPU, but having those features artificially locked out makes it really hard for me to recommend him getting the card, despite how much we both want to see it in person
I literally can't see any difference in picture between a 1080p monitor and a 4k monitor. I would much rather have a very high frame rate 1080p than a 60fps 4k
Linus is the only YouTuber who asked for something I have been dying for in a GPU without paying 7000 dollars for it. PLEASE give us SR-IOV on the 3090; otherwise this GPU will be useless when the 20 GB 3080 comes out
So, this graphics card costs twice as much as my current rig, consumes three times the power of my current rig and is about the same physical size. I think I'll stick with what I have for now.
10:03 It's a really sleazy move if Nvidia's reason for not enabling CAD features on the RTX 3090 boils down to, "That's not what we made it for." If the card truly can't handle it, then that's one thing, but intentionally gimping a product with subpar drivers is disreputable.
Literally everyone does this lol. Radeon clipped the wings of their 5600 XT to sell 5700s, and Intel restricts hyperthreading in their low-end desktop parts. Sad reality, but one we live in
@@LmgWarThunder "Intel restricts hyperthreading in their low-end desktop parts" That's incredibly simplified. Yields are never close to perfect. Lower-tier chips are often simply higher-tier chips with defective cores/features. There is a reason every single company does this. An 8-core/16-thread chip has 2 non-functioning cores and broken hyperthreading? Disable those cores/hyperthreading and sell it as a 6-core/6-thread. That said, yes, everyone does this at basically every level of selling any product of any type ever. Getting butthurt over it is stupid, but being vocal and asking for realistic changes is fine.
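The binning process described here can be sketched as a simple decision rule; the SKU names and thresholds below are invented for illustration, not any vendor's actual scheme:

```python
def bin_die(working_cores: int, ht_works: bool, total_cores: int = 8) -> str:
    """Classify a tested die into a sellable SKU based on which parts passed."""
    if working_cores == total_cores and ht_works:
        return "8c/16t top SKU"
    if working_cores >= 6:
        # Fuse off the defective cores (and HT) and sell it down a tier.
        return "6c/6t lower SKU"
    return "scrap"

print(bin_die(8, True))    # fully working die
print(bin_die(6, False))   # two dead cores, broken HT -> still sellable
```

The point is that "restricting" a feature on a cheaper part is often just salvaging imperfect silicon rather than manufacturing a separate, deliberately worse chip.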
Tons of companies do this, and it's always sad to see. It's common with car manufacturers, where instead of putting [premium feature] in all of their cars, they go out of their way to make it an option you have to pay extra for, even if doing so increases their production costs since now they have to produce multiple SKUs. They will basically throw a million dollars down the drain if it means they get to charge customers an extra 1.1 million.
@@trapical "they go out of their way to make it an option you have to pay extra for, even if doing so costs them more production costs since now they have to produce multiple SKUs. They will basically throw a million dollars down the drain if it means they get to charge customers an extra 1.1 million." lol wtf is this even. "even if doing so costs them more production costs since now they have to produce multiple SKUs" --- That's not true at all. "They will basically throw a million dollars down the drain if it means they get to charge customers an extra 1.1 million." ------ Yes companies throw millions "down the drain" to get money. That's what having a big company and making money is. Although what's hilarious is that you seem to think they do that at a LOSS?!?!?! You're 100% R/ word dude
What you need is an open-source, crowd-owned computing company that can kick the shit out of these cabal companies like Nvidiot and AMDont. WE the people need to decide what's best for us, not the deep state.
@@gideon4942 You understand that DLSS isn't the same as ray tracing, right? The point of DLSS is that it makes the game run at higher fps. It literally boosts performance. It renders the game at a lower resolution, such as 1440p, then, using AI on a different part of the GPU, it upscales that to 4K (those obviously aren't the only resolutions it works with, just an example). While not using it technically gives a crisper image, in my opinion the FPS gain from it is well worth it.
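The fps gain described here is mostly about shading fewer pixels. A back-of-the-envelope sketch using the 1440p-to-4K example (real DLSS picks its internal resolution per quality mode; these numbers are just the example from the comment):

```python
def pixel_count(width: int, height: int) -> int:
    return width * height

native_4k = pixel_count(3840, 2160)       # what the GPU would shade without DLSS
internal_1440p = pixel_count(2560, 1440)  # what it shades before AI upscaling

ratio = internal_1440p / native_4k
print(f"internal render is {ratio:.0%} of the native 4K pixel count")
```

So the GPU shades roughly 44% of the pixels and the upscaler reconstructs the rest, which is where the performance headroom comes from.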
YouTube compression does no justice to 8K game footage. It looks worse than 1080p footage, even if you watch this video at max res. You unfortunately can't use YouTube to see 4K-to-8K differences, as even 4K looks like shit when it's a video game with lots of detail; YouTube compresses way too much. The darker the area, the more compression is used, so dark areas turn into blocky messes, and it makes the entire scene look like shit. Edit: watching this at max res, even the 4K footage looks FAR worse than what 4K actually looks like on my display. So 4K/8K gaming looks like absolute shit on YouTube imo. My 2nd monitor is a 1080p display and it even looks worse than native 1080p gameplay imo.
It's a pretty brick, gotta say. Got my Asus RTX 3080 TUF Gaming 3 days ago, and I'm happily gonna stick with it for the next 3-4 years. The 3090's 15-25% performance gain doesn't seem very compelling considering I would need to upgrade my 1-week-old PSU again (Fractal Design 760W Ion+ Platinum), plus my mobo/CPU to remove the bottlenecking from my current i7-7700K setup. That's over 2200€ here in Finland with the cheapest RTX 3090 available (Asus TUF Gaming).
I'd say go for it. The amount of money you save from not having any future kids due to being castrated by your wife is a lot more than 2200€ for the hardware
Amazing frame rate, sadly the human eye can only see "out of stock"
don't worry, the stock will normalize around 3089
@@Jairjax doubt, that just means scalper bots will be infinitely more advanced at buying stock.
My eyes only see "credit card declined"
I’m dead 💀
I won't be satisfied until TH-cam maximize-minimize transitions become smooth
Finally I can play Microsoft flight simulator at 23fps
In vr
1 fps short of cinematic
If it has Microsoft in the name, stay away from it
Hahaha!!!
Didn't he say it was CPU bound, rather?
Me watching the 4K native vs 8K DLSS vs 8K native comparison at 360p: yeah, I trust you on that one mate
I felt the same at 1080 on a 14" laptop!
I'm watching in portrait mode on my phone, so to me it seems like a small card. Idk why he says it's so big...
I watched it at 4K and still couldn't tell them apart. It is YouTube-compressed after all.
Omg guys... really.. are you that dumb 🤦
Enable 1440p on your phone... wait for buffering... wait until Linus zooms in. Take a screenshot with your phone... go to your gallery and zoom in on that screenshot. THE DIFFERENCE IS HUGE!
@@x32i77 At 1440p on my 27-inch monitor it does actually look very different; when he zooms in you can see quite large differences
"At launch there were only 7 in Canada and Linus got 5 of those"
~ Dave2D
That's probably true :(
@DEEJMASTER 333 yeah I hate that people do that.
What's wrong with miners?
@DEEJMASTER 333 eh not nearly as bad as other things so I don't see them as a real problem plus they don't hog that many cards and at least they are using them
@DEEJMASTER 333 >Crypto miners are completely destroying it
Now that's where you are COMPLETELY wrong. We undervolt our cards, giving them more life in the long term. You gamers/PC users don't want to buy a card from someone who has mined with it (and sells at a lower price), but you're still butthurt when you "have to" buy it from a scalper. That's where the shame would be on you guys, because miners don't dump cards onto the market for no reason. Maybe they are old or don't profit enough for them. I still think the way PC users and/or gamers behave toward crypto users is unacceptable, thinking we are selfish pricks, while crypto users think of you as human beings. Bottom line then... Crypto is not as bad as you think.
"Here are some shots at what you can expect"
Me in my 1080p monitor: *Beautiful.*
1st reply Justin
True
JUSTIIIIN!!
(Here with 1440p monitor. 2k gang, wazzup?)
Same
@@rhythmkhattar2738 Why would that even matter?
6:57 - Linus: “Here are some shots of what you can expect to see in 4K”
Me, watching in 360p: Ah yes, pixels.
Lol what are you watching this on 🤣🤣
Me, watching in 144p: Ah yes, nothing
It's great to be blind these days; I don't miss out on much, and my wallet is still quite happy and healthy.
I never saw any difference in those screen shots. Surely there are differences, but they're negligible to me.
Linus: "the difference is night and day"
me: ???
Ah yes I'm watching in that crystal clear 240p
*intro starts*
Linus: "NO! There's no time!"
*intro stops*
Guy who writes the jokes at the end of the intro: :(
Plot twist: He made the joke of skipping it
@@kavitatonk4744 no
but he has time for sponsors lol
@@lastavverma7631 don't try to advertise smh this is the second time I saw this video
Who else skipped 10 seconds there and needed to go back?
Actually laughed when he stressed the $1500... Here we are 5 months later and the 2000-series cards are going for over that
yeah rtx 3090 above $3500 usd rn
The 3090 is going for over $5k in the middle of May 2021 lmao
RTX 3090 now goes for somewhere between 3.5K€ to 5K€ depending on the brand.
I saw a guy selling a GTX 1660 for $3k. It’s even worse.
@@theknightikins9397 I got a new in-box 3090 from ebay for £1650 a few days ago. Looks like the market's returning to baseline (slowly). It really is a monster by the way. And if you use it to mine crypto while you're not gaming you can make your money back in under a year.
Me looking at benchmarks for these top cards: 'Hmm, interesting'
*continues to be poor*
But think about it: new cards push the prices of all cards down, like the 1080. I'm still rocking my 1080 and probably will for the next 2-3 years until the 30 series is a bit cheaper. Scalpers ruined the market.
You couldn't buy one now even if you weren't poor.
@@CorporateShill66 Every 60 seconds in africa, a minute passes
No, you're smarter than the rest of the morons who always fall into the overhype trap. Why pay for something designed to be downgraded to make the next Titan card look good?
Linus: From this footage, you can clearly see the difference between 8K DLSS and 4K.
Me, watching on my 1080p monitor not even in fullscreen: Oh, yeah, of course, totally.
Watching this on an 8-year-old, faded 720p laptop display, confused as fuck too :D
The video is in 720 anyway. Lol
Cyn Hicks it's 4k wdym
@@CynHicks it depends on you phone
@@ZY-vw6xl Its the highest option.
How do I put that in my Nokia 3310? Snake is a bit laggy
SlayCC feed the Nokia
Bruh, Nokia 3310 outperforms 3090. Snake runs at 110 FPS for me. Turn RTX settings to medium and render resolution to 4K. You'll be good to go 👍
@Buffy Foster I have two dedicated fans and liquid cooling for it.
Ha ha?
:D
I love how Blender, open source software that you can run pretty well on a $300 setup with no external GPU, is also the gold standard for benchmarking the most expensive consumer GPUs on the market.
well define "pretty well"
@@priyam352 render time within the average lifetime
20yrs ago, when it was fairly new, Blender was only a few megabytes big. Maybe less than 10MB & could still run on a potato.
When he addressed Jensen personally I could just imagine him sitting at home watching the video getting more and more pissed off
In his kitchen... breaking spatulas
he's only angry, because he left something in the oven
@@ekksoku It must've been the RTX 3090 SUPER TI 64GB Vram, was so big he couldn't get it out of the oven
LOL
Linus: *tries to make Nvidia mad*
Jensen: just for that, the Titan is now $3001
Actually makes only 5 cards available on launch date as he already saw that video. And 0 more to come at the same price.
Oh no 🤣🤣🤣
Thanks, Obama.
I mean, uh... Thanks, Linus.
haha I like the casual taunting of Linus to NVIDIAs CEO
Funny you mention that. In Australia the 3090 ranges from $2750-$3400, and the 3080 is $1300-$1850
Me: *Watches on phone screen*
Me: "Ugh yeah look at the difference between 4k and 8k"
yeah AMIRITEEE
Actually though lol
Yep. I'm watching on a potato. 😊
I watched this on a 55" 4K TV and didn't see any major difference; 8K is overrated
Lmao yeah, on my five year old phone
It's crazy that over a year later the same graphics card costs 3x more than the release price
Aaaaaaaand problem solved.
$950 on ebay now
Linus in 2022: Why haven't we received the reviewers graphics cards this time?!
I get it’s a joke but that would never happen with LTT.
@Gus Johnson uhh not really
@Gus Johnson guess who actually sounds like a prick?
Linus: "Making Nvidia’s CEO mad"
Linus: "9.5 out of 10, needs a software update."
That's the Canadian equivalent to being mad
That's a Canadian insult for "your product is lacking software optimization"
Too much water 💦
@@brokengames9020 yeaa wait what
He has a good connection and relationship with Nvidia and gets paid (sponsored) by them occasionally (recent video), so what did you expect? Linus probably added that title to show that, in a way, he is being objective when criticizing a product
So we're at the "Look how crappy this 4K image looks" stage...
No, we're at the "I can finally buy a GT 1030" stage
Soon elitists be like "you game at ugly 4K, chum?"
I am still using a 768p monitor and have no idea how to feel about this
@@Kocey_YT Don't worry, I haven't met a soul who complained about resolutions; anyone who looks down on others for having a lower native resolution is just a pillock.
@@Kocey_YT i play games at 640x480
The human eye can’t see more than Intel HD Graphics.
I think for most people their wallet/ bank account is more limiting than their eyes.
My human wallet can't see more than gt 1030
It can see up to 16k
i was gonna like it but i want it to stay at 69 likes
@@nickborlas5282 it’s a joke
3090: *exists*
My GTX 960: "you still love me, right?"
My integrated graphics: How long again until you get the money to get a dedicated card
@@DhirC35 same (I still have Intel's 𝙞𝙣𝙩𝙚𝙡grated graphics)
on my laptop
My 930M: forget about it, you're 12. You can't get money.
My GTX 555 doesn't even know what it is and it's still scared of the 3090.
My 3gig gtx 1060: was I good enough
“You can clearly see 8K DLSS gives better quality than 4K native”
Bruh, I’m on my iPad at 360p. I’m just lucky I can read what it says on screen
Same lol
I was awkwardly staring at my Android's 1080p IPS LCD screen, trying to find any pixel difference if I could, lmao.
😂😂😂
Even at 1080p I couldn't tell; YouTube's compression doesn't help.
4K was the best; 8K had all sorts of errors
14:36 Imagining Jensen giving his full attention and then quickly clicking away as soon as he realised it was sponsor time made my day
Lol
“But mom!! It’s the only one that works for my zoom classes!”
Linus: What's the Ampere Titan gonna cost? $3000?
Jensen: 👀
*writes it down*
Probably 4
I think the 3090 IS the "Titan" this go around.
@@mousemickey5870 Nvidia doesn't WANT to release a Titan, but... they might have to if AMD brings the heat
mouse mickey It sorta is; Linus pointed out there are Titan-only features and optimizations missing from the 3090.
Jensen buys Linus's whole block and ruins his neighborhood.
Don't say that bro, I live down the street.
@@mollygrubber maybe he will give you a good deal
Pov you are jensen
minus the jacket
Why are you locked in the bathroom
You talking to me?
If you’re Jensen skip to 13:30
Thank me later (with a 3080 perhaps)
(a 3090 also works)
Whomst are thee so wise in the ways of business
Lol that was funny
I'm feeling incredibly called out right now.
@@jensonaltmann478 lol
@@jensonaltmann478 lol
Now all I have to do is wait for 5 years for the price to be in my price range
*15 years
When that happens, the "5060 Ti" will be 2x faster than a 3090 with a $400 price tag.
I literally have to do that
@@dhruvavikas1632 nah bro the 980 came out only 7 years ago and 1060 released in 2017.
@@jsgv7935 Bruh, that literally means nothing. Do you realize how soon the other graphics cards came out after those? It's speeding up
"If you are one of the privileged 8K display owners, here's the difference"
Me with a 1368x768 desktop screen:
Yeah you can really see the pixels pixelling more than the older pixels
Me with bad internet, watching in 144p with interruptions and loading
YouTube on mobile doesn't allow more than 480p in India
im really thinking this joke is too overused
@@pedrol5004 DAMN RIGHT
@@pedrol5004 At this point, I see them as competition in which who has worst display 😂
We live in an age where Minecraft is being used as a benchmark test
Don't forget the texture update for the rehashed optimization disaster of a game known as Crysis
I guess tech reviewers think bad code design is a solid benchmark for nEXT GeNErAtIoN lol
@@ryanerickson764 Crysis is just a joke, but Minecraft ray tracing does put a lot of stress on PCs, especially using shaders
@@ryanerickson764 I mean, as long as it stress tests your components, it should be good enough as a benchmark, right? Don't know enough about benchmarking, or those specific benchmarks, to say for sure whether or not it does just that, but I'm sure there's a reason why people use Minecraft and/or Crysis. In fact, if the bad coding puts more stress on components than necessary and/or expected, good results in those tests would say even more about the performance of said components, wouldn't it?
I remember when performance in the first couple of Crysis games was considered the peak performance test for gaming PCs for a 'normal' audience (aka regular gamers, rather than enthusiasts), and if the remaster is anything like that, it should work just fine as a test. As long as performance scales with better components, that is.
Side note: My current desktop came with Crysis 3, and the game was pretty much brand new at the time. I think maybe it's time for a new rig...! Now if only I could get a hold of a 3080...
@@me_and_me_ bruh
Linus: Just look at the difference in detail.
Me who wears glasses and watching the video in 240p on my phone: ...wow.
Now imagine not wearing them glasses. The money we'd save, bro.
Well, that's what I was wondering... look at the difference of 8K vs 4K on your 1080p monitor...
Kay.
lol same here, I didn't notice anything on my 144p display but I agree with him hahah
The difference in quality is amazing! Especially considering that the highest resolution offered by YouTube is 4K. That means that we are comparing a 4K native image to a 4K super-sampled to 8K and then down-scaled back to 4K. If we ignore the horrible compression that YouTube slaps on top of it all, of course.
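For anyone curious what "super-sampled then down-scaled" means mechanically, here is a minimal sketch of a 2×2 box downscale, the simplest stand-in for the 8K-to-4K step (real video pipelines use fancier filters):

```python
import numpy as np

def box_downscale_2x(img: np.ndarray) -> np.ndarray:
    """Average each 2x2 pixel block, halving both dimensions."""
    h, w = img.shape[:2]
    return img.reshape(h // 2, 2, w // 2, 2, -1).mean(axis=(1, 3))

# Tiny stand-in frame; a real 8K frame would be 4320x7680.
frame = np.random.rand(8, 8, 3)
print(box_downscale_2x(frame).shape)  # (4, 4, 3)
```

Because each output pixel averages four rendered samples, the downscaled image keeps edge detail that a native render at the lower resolution never had, which is why the comparison can still show a difference.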
Exactly... 👏👏👏
GPU: *costs 1.5K*
Minecraft (a game nearly 10 Years old): Best I can do is 40 FPS
It's a very CPU-bound game. I have a 3080 and I was confused about the low frames (I was using RTX on). I noticed that my card was only at 13% usage, then checked my CPU (a Ryzen 9 5900X, which in heavy games runs at about 10% usage) and realized it was running at a stable 40%, so it's really just the game being weird.
@@nautilus7025 r9 5900xt
It's actually more RAM dependent than CPU dependent, at least with mods, but yeah, overall Minecraft is just a weird beast
@@nautilus7025 the main reason is that Minecraft doesn't use multicore systems, so a really good single core might run better than a multicore chip with lower-quality cores
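The single-core argument can be framed with Amdahl's law: if most of a frame's work is serial, adding cores barely helps, while a faster single core speeds up everything. A toy sketch (the 10% parallel fraction is made up for illustration, not a measured Minecraft figure):

```python
def amdahl_speedup(parallel_fraction: float, cores: int) -> float:
    """Overall speedup when only part of the workload scales across cores."""
    return 1 / ((1 - parallel_fraction) + parallel_fraction / cores)

# If only 10% of the frame time parallelizes, 12 cores gain about 10%...
print(f"{amdahl_speedup(0.10, 12):.2f}x from 12 cores")
# ...while a core that's 20% faster speeds up the serial 90% as well.
print(f"{1.2:.2f}x from a 20% faster single core")
```

Under that assumption, a strong single core beats a pile of mediocre ones, matching the comment's point.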
@@nautilus7025 R9 5900 *XT*
Back in the day we called these things "cards" for obvious reasons. Nowadays let's just call them bricks. I ain't even mad tho.
Damn u old BOAYay
We used to call external power supplies _'power bricks'_ but I guess NVIDIA changed the game. The *3090* is indeed a _power brick_ itself.
I still don't see people gaming in 8K. Heck, I'm still on 1080p, wishing to upgrade at least to 1440p, which is still the sweet spot for most people, as far as I can see.
But money must end somewhere, and there are enough people driving around in their Lambos, Ferraris and Bugattis, so...
What happens when your card gets bricked tho lol
@@Starfals then it becomes a "bricked graphics brick"
Back when I had a GTX 570, me and my friend affectionately named it a space heater lol
yes of course I can afford the 3090 that's why I'm watching this like the rest of you guys
Yeah yeah totally #meetoo
Just get 3 1030s
@@mr.banana4178 Good idea
Lol
Lol
Linus:"You can see clearly"
Me: 1080p* "Not really"
True
Ha a better one
In my smartphone with 720 display
I was going to say that xD haha. But hey i can see the text differences
me: 240p, not really
@@LEV1ATAN yup...
"We don't have time for the intro"
*Proceeds to plug the sponsor*
Well uh it’s his job... he kinda has to
Dude it would make sense if all of his sponsors went with the actual video, but no they are all just random and they suck
"You can see clearly that 8K DLSS Utra Performance delivers significantly more detail than 4K does"
No, I don't believe I can.
TRUE
Defo makes me want an 8K TV with HDR and HDMI 2.1 and a gaming PC with an RTX 3080, but I wonder what a PS5 will look like at 8K. I'm no good playing games with mouse and keyboard, but now all I need is an Xbox controller and I can play every game with it, so I don't know why I'm still on consoles
Yup kinda lost with the youtube compression.
Very true, can't tell any difference at all! At least while watching on youtube.
@@SmallPaul. Don't waste your money. Higher pixel resolution is not always better. There is a limit to how much detail you can see, determined by your viewing distance and display size. Here's a graph that breaks it down: images.app.goo.gl/K6jamsk19wpXZ3kv6 Keep in mind everyone's eyes are different, so this isn't perfect, but it's a good guideline. I measured the distance my head sits from my display at roughly 2.5 to 3 ft. Let's go with 2.5 because it's closer and easier to see on the chart. To see the benefit of a 2160p display I would need a 40-inch diagonal display. I've got a 27-inch display and it fills up my entire vision at that distance. Based on that chart I should see some benefit from something a bit over 1080p, like 1440p, but 2160p is pointless and 4320p is literally doing nothing but driving up my electric bill. This is all pretty much in line with Linus' own real-world testing: th-cam.com/video/ehvz3iN8pp4/w-d-xo.html They found the 1440p displays were the better ones, and that's with the benefit of the placebo effect and without perfect 1:1 comparisons of identical monitors differing only in resolution. Granted, eventually you'll probably have an 8K monitor anyway, if only because as display technology gets cheaper it's hard NOT to get a higher-resolution display, but there's no benefit to the higher resolution at that size in and of itself. What makes a difference in picture quality is other advancements, which aren't as easy to quantify for the consumer, such as black levels, contrast ratio, HDR, lower display lag, higher frame rates, etc. Personally I do most of my gaming on a 65-inch 4K OLED TV and I still usually keep it at 1080p. Why? Because with how far away I am from it, I can't see the difference.
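The distance/size tradeoff in the comment above can be roughly checked with a pixels-per-degree calculation. A back-of-the-envelope sketch, assuming a 16:9 panel viewed head-on and the common rule of thumb that ~60 px/degree is about the limit of 20/20 vision (the function name and sample numbers are illustrative, not taken from the linked chart):

```python
import math

def pixels_per_degree(diag_in: float, res_w: int, distance_in: float) -> float:
    """Horizontal pixels packed into one degree of vision for a 16:9 display."""
    width_in = diag_in * 16 / math.hypot(16, 9)   # physical panel width
    px_per_inch = res_w / width_in
    # inches of screen subtended by one degree at this viewing distance
    inch_per_degree = 2 * distance_in * math.tan(math.radians(0.5))
    return px_per_inch * inch_per_degree

# 27" panel at a 30" viewing distance:
print(round(pixels_per_degree(27, 2560, 30)))  # 57 -> 1440p is near the limit
print(round(pixels_per_degree(27, 3840, 30)))  # 85 -> extra 4K detail is wasted
```

Anything above ~60 px/degree is detail a typical eye can't resolve, which matches the comment's conclusion that 1440p is the sweet spot at this size and distance.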
"Look at the difference between 4K, 8K DLSS, 8K Native, you can clearly see..."
I continue to smile and nod pretending I see the difference.
Had to switch video quality to 4k and then stare closely at the background to see the difference...
I was going to admit to the same , in fear of being called boomer....lol
@Midas' Touch if you look at text its obvious but that's about all I noticed
@Shim Bro Not if you have say a 55 inch or 65 inch screen and sit so close you can reach out and touch it.
@Shim Bro I've heard that about 1080p back in the day, but the difference between 1080p and 4K native is night and day.
"Dug out our RTX 3080"
Wow, anyone remember that thing? Geez, how long's it been since that was relevant?
Never relevant, since it never existed xD
Literally 2 weeks.
The scalpers ruining the launch made me not even want one anymore
@King Marco Louis III more like 2 minutes
@@GitSumGaming I blame Nvidia they knew about the scalpers and they still make the founders edition Limited
just bought one of these, hope it doesn't bottleneck my i5 4590
i think you at least need a 2-core 2.9 GHz 6-year-old cpu like me if you dont want to bottleneck it (:-[])
Don't worry, your gpu bottlenecks itself by games not being designed to fully utilize it
You have an i5-4590 and expect no bottlenecks? The CPU is a HUGE bottleneck. You would need an i9-11900K or i7-11700K, maybe one of the upcoming Alder Lake chips on Intel's end, or a 5900X or better, and even then there is a small bottleneck
@@Techandgaminggalore you don't need a 5900x. More cores =/= better performance, at least for gaming. You'd get the exact same performance when gaming from a Ryzen 5 5600x. Granted, if you are multitasking a shit ton of things at once, or doing rendering in the background while gaming, then a 5900x or 5950x is in order. Also, depending on resolution and settings, a gpu has to spend more processing power on each frame, so a weaker cpu that doesn't send as much framedata will have less of a bottleneck at higher settings.
@@Techandgaminggalore Sarcasm is hard to understand apparently.
Linus: we don't have time for the intro
also Linus: just like our sponsor
Intros don't pay the bills. LOL
Chase that bag
Linus: I’m about to make Jensen mad.
Steve from GN: Hold my cat...
more like "cat from GN: Hold my Steve..."
Lol that was worth looking past the "wtf linus, I'm watching this on a toaster you rich kid" comments xD
@Voyd boy Steve from cat: hold my GN
it's Jen-Hsun, lol
Steve should see this comment lol
People with 8K display:
"mm yes the floor is made of floor"
salty?
Lol, that is good
Ok then, how am i supposed to see Lara's nipple hair if not with 8k?
Did any of y'all feel like the 4K native image looked better than the 8K DLSS in motion?
@@land2097 Still using that statement in 2020, eh? You people are pathetic. Use real English. Also, regarding 8K: go to Gamers Nexus and see the shots taken at Nvidia, Linus and others, and you'll also get to hear how this is not really 8K.
The difference between the 3080 and the 3090 (generalized) is 10 fps, for seven. hundred. DOLLARS. If you're an editor, designer, or anything that's heavy on VRAM, then the 3090 is way more than enough. But if you're a regular gamer with money to spend, the 3080 is more than enough to run any game at ultra settings while still holding good, smooth performance.
except RUST🤣
@@pepin8277 and ARK, I swear that open-world game makes your eyes bleed at max settings and is still unoptimized
Lol 3080 at 700usd...
That card wasn't even 700 when it was released
Crypto mining is pain..
What I’m looking at and really wanting is a 3060ti. That’s more than enough to run anything at 1080p above 30fps.
Even a 1080ti has been able to run all of my games, very well.
Linus: "Don't you think, Jen-Hsun?"
Jensen: "stop picking on me"
More like, do you even wear a leather jacket.
While you were playing chess,
I was practicing 5D chess.
While you were playing around with hardware,
I was creating hardware.
While you were creating videos,
I was creating hype.
Now I have all the nicest leather jackets that exist,
And what do you have?
Exactly, nothing.
More like ... "Hey Linus .. Do you want to try your hand at designing massively scaled integrated circuits?"
"I'm never asked for this".
@@brokengames9020 Sounds like broke. Name checks out
Me watching on a 1080p monitor: Yeah! I totally see the difference.
Me on my iphone 5
They seriously should stop with the needless resolution bump ups and get to comfortable 120+ fps on titles.
But no, all hail the fancy slideshows!
@@Nefarious_Fish something has to be clever to be stolen
@@branchvine 💀💀💀🤣🤣
"do you think that's fair, Anthony?"
"... Anthony?"
"uuhh, yeah!"
lmao!!!!
Let the man take a little nap. 😄
Read while he said
imagine your mom opens your channel
7:32 Me watching the video in 480p : Ahh yes the detail.
Minecraft went from "any pc can run minecraft" to being used as a benchmark for the RTX 3090, which only got 40 fps
40 fps at 8K though. My 2080S runs the game maxed out with RTX at 1440p at around 90-100 fps
"Can it run Crysis?" is dead. Now it's: "Can it run Minecraft?" (with RTX at 4K)
@@Cakemagic1 but Crysis Remastered is coming out, so "can it run Crysis" is coming back
@@brokengames9020 Yeah yeah I wear tin foil now, thank you.
Cyberpunk 2077: Hold my frickin Keanu "breathtaking" Reeves pixels
“What’s the ampere titan going to cost? $3 grand?”
*Nvidia:* uhhhh..... YES?
so the normal price...
Jensen taps his nose
www.tweaktown.com/news/75147/nvidias-next-gen-titan-rtx-specs-teased-48gb-gddr6x-and-over-3000/index.html
nVidia: Keep walking if you have to ask.
@@gcharb2d that's the quadro.
Last video: Sponsored by Nvidia
This video: "Making Nvidia's CEO mad"
Proof that Linus will shill any old crap if you pay him enough.
@@CaptainKenway wdm nvidia paid him to do this.
The reason those drivers aren't unlocked is to justify the titan existing until the stock is gone.
More like
This video: Still sponsored by nvidia
If you want to check a real review it's better to check gamer nexus or jayz2cents.
He was literally shilling this card last video as an 8K gaming card. Nvidia's CEO isn't mad at him lol.
thank you for this, u made my rx 580 cry and it won't stop shaking now
How badass would it be if Jensen commented here: "Challenge accepted Linus"
I know right? Where's the "atta boy" button?
what is linus complaining about ? price ? performance ?
Love the addressing of Jensen at the end. Good job Linus, let's hope they listen and this card isn't artificially weakened for no good reason :)
Less RT cores than the Titan. So there's a good *hard* reason. Edit: *TENSOR* cores
@@donizettilorenzo tensor cores*
@@tikket10 Pardon, exactly
@@donizettilorenzo just because the titan has more tensor cores doesn't mean the performance difference in certain applications should be as much as it is. The numbers show a purely artificial limitation on the 3090, as I don't expect it to outperform the titan in every application but at least get close.
If I was Jensen I'd be pretty surprised by Linus calling me by name from the video.
Petition to change the term "graphics card" to "graphics brick"
Signed
2022: "New RTX 4090 graphics block!"
@@juanacosta4351 2028 - New X-RTX 16k 6090 graphics tower
It's already bigger than a brick
and here I was thinking the 4K footage can't even pass for 720p, it looks so horrible
Me watching the 8K bit on YouTube streamed at 2000p: Ahh yes of course! I do notice the massive changes in resolution!
Me at 6:57 watching at 480p: *"They are the same picture"*
Me at 6:57 watching at 1080p: "They are the same picture"
@@marcoalves9029 Me watching at 4K with my laptop about to break, "they are the same picture"
Me: Dreaming about watching it in 8k
My wallet: nope.
watching in 4k , still don't see much difference
144p on phone to save data, "yeah looks good'
*Linus:* There’s no time for the intro.
*also Linus:* *Tells a sponsor*
*me:* There’s NO TIME FOR A SPONSOR.
If only the editor ignored Linus and kept the video going.
Broken Games can you elaborate/prove your claims?
@@gujju_cousins8550 ^^^^
Linus: WE DONT HAVE TIME FOR A 3 SECOND INTRO
Also Linus: time for a 1 minute word from our sponsor
Gotta make time to chase the bread 💪💪
Who needs intros in 2020?
Conclusion: somebody should sponsor the intro.
theres always time for money
Who earns money from playing their own intro?
Linus:8k gaming
YouTube subtitles: AK gaming
Steve from GN: "I'm about to destroy this card".
Linus : we dont have time for the intro
Me : nice no sponsored time so
Linus : AND THIS IS OUR SEGWAY TO OUR SPONSOR
*segue
@@FalconSpirit beat me by 2 mins
Yi
Nvidia: "We could have optimized the Geforce cards to be faster than the Titan in rendering workloads, but, uh, we didn't"
Nvidia literally made billions off the mining crap (so much that they spent 40 billion to buy an unrelated CPU company), and still they burn gamers... they have all the money, but still they hit us with a 150% price increase for the high-end gaming model... GREED GREED GREED has ruined our future (here's hoping Intel's coming video card eats their lunch)
@@ginxxxxx what? 150% increase?
@@mariushernes606 the high-end "GAMING" model now costs $1500 instead of the $1000 at launch like before. Take away all the buzz and those are the facts. Nvidia has been making TONS of money and they still squeeze gamers for A LOT OF MONEY
You know what's really funny? Linus could've included RX 5700 XT SPECviewperf benchmarks to show how it gets the same score in Solidworks and crushes the RTX 3090 in Siemens NX (the 5700 XT scores 74 in SNX).
@@ginxxxxx The RTX 3090 isn't really for gaming. For professionals there are a lot of features that make the price OK.
The real gaming successor is the 3080, and honestly it's priced very low considering the power and the previous generation's price.
"Here are some shots at what you can expect"
Me on my 720p phone "Beautiful"
"We even dug out our 3080"
Yeah, I'm sure that thing was already buried in the back somewhere... Probably took quite a while to find...
I took that as he dug or removed it out of the test set up
@@EmpireOfNerds Or more likely his personal set up lol
Nvidia: don't leak our ampere titan pricing linus... SMH!
I don't see why Nvidia's CEO would be mad. This is a pretty glowing review.
Because he paid Linus $30k for it
It's a glowing review if you have an 8K TV, understand the implications of trying to game on that (Doom Eternal will apparently run natively, but others will use upscaling), and have a lot of money to spend.
If you’re a 4K gamer this review would probably steer you away honestly.
I was thinking about buying one because I game at 4K 144 Hz and want the highest quality I can get, but from the numbers I've seen it's not worth it for me.
Yeah, awesome piece of hardware. But not worth its money for 98% of users.
I bet Jensen himself liked the video 😂
It's a Canadian review... Canadians can't leave a bad review, no matter how bad the product is. I bet they even leave a decent tip with bad service! Eh?
My mom said if I keep watching expensive stuff she’ll smash my head against the keybsssbbbbbbbbbshhhhhhhhhhhhh
Homedog what is she gonna smash your head against you fell asleep
@@Gobbledygoober I think he meant jhukgrbiofnthjuk
Me: Tries to see the difference between 4k and 8k native
Also me: I am watching this on a phone
Regardless of what you're watching it on, our eyes can't even see the difference between 4K and 8K; those who think they can are easily fooled. The biology of your eyes doesn't allow for it
I have a 55 inch 4K screen and didn't see a difference to be honest.
I'm watching this on a 720p display so yeah not much to see
Bruh it doesn't matter human eyes can't even tell the difference between 4k and 8k no matter what you're watching on
the human eye can't see more than 720p anyway
"Yes mom I do need this for online school"
“Zoom can’t run without it mom”
Imagine your mom saying "8K equals 33 megapixels, and a frame at 8K 8-bit carries 128 megabytes of data, and that's before textures and all that stuff. Now go back to your 360p display"
@@jimitSoni comment of the century lol
NO TIME FOR THE INTRO, roll the in vid advertisement tho
@@kavitatonk4744
Please remove your comment, nobody wants to see your channel.
You disabled comments and like/dislike: That is a bad sign!
but didn't you hear Linus' voice was a lot faster compared to other intros
yeah the intro doesn't give them money
@@shadowkiller5520
I think you misunderstood Lastav:
The link Lastav mentioned is the one I was talking about, but Kavita's original comment with that link was removed.
@@shadowkiller5520 :-/ Jeez.
Just bought a 3090 for $999. What a time to be alive.
Wheree?
i saw an msi one on amazon right now for $1060 :p
i got mine for free because my friends dad makes them
Just got one for $800 plus tax and shipping
Friend just sold me his lightly used 3090 for $800.
6:57 "Here are some shots"
*Me, watching in 240p on my phone because my bandwidth quota ran out* Ah yes, I see clearly
Same here man😂😂
3D modelers are going to love it. Everyone else is lying to themselves.
I'm a 3D modeler and the 3090 is still pretty garbage at its price. Might just be that I'm poor tho
Rich people and enthusiasts are going to love it.
I love 3d design, but I'll stick to my 2060 super for now lol
For render/lighting artist yep, for modelers not really
3D modeler here. I'm just gonna wait for the 3080 20 GB version. The only real thing the 3090 has over the 3080 is the VRAM, and if we're gonna be able to get that for a fraction of the price soon... what's the point?
... and *THICKER*
( ͡° ͜ʖ ͡°)
Hey I know you! What're you doing here? =o
Dave2D
thiccer*
now that everyones into the 3000 series i can finally get my hands on a gtx 1060!
Scalpers: no
When the test bench has less RAM than the GPU has VRAM.
4:16: "Dug out our RTX 3080" like it was thrown to the side :(
* Sad non-bot buyer noises *
I built up a bunch of money to get a new computer, 3080-built but still no cigar :,(.
My computer is way too slow to do anything, I'd love to see the 3080 in stock..
Nighthorde26 give it a few months when the other manufacturers get around to selling their own versions
Yes, hi Linus, where will you be burying it so I can dig it up?
Just buy in store at Best Buy. Bots can't automate brick-and-mortar purchases.
Let them fix the problems that are popping up first.
Linus: "Don't you think, Jen-Hsun?"
Jensen: "I never asked for this."
This pun tho
Lol called him Butt hurt too 😂😂 Linus Squaring up with those billionaires.
Linus: what is the titan going to cost 3k?!
3090: hold my vapor cooling.
Lol, in the Netherlands the 3090 costs €3,500 on average, it's insane.
This description says all:
Buy an RTX 3080:
On Amazon (PAID LINK): sorry
On Newegg (PAID LINK): none left
On B&H (PAID LINK): RIP
That's why I downloaded it instead
"you can see clearly"
me with my 1080p display: yup clearer than water
me with my 768p tv: hmm yes the window here is made of brick
"you can clearly see the difference in 8K native."
Me: No, no I don't think I can. 4k be just fine.
well to be fair, you CAN see it on an 8K screen, but on 4K screens it's not really noticeable.
@@TransPaladin Probably on an 80" screen when you're about 2-3 meters away.
You can clearly see the difference in 8k native on you 5" phone screen. /s
@@me_and_me_ dude, spamming your video that is completely unrelated to the conversation/channel/video topic is really cringe
your channel will grow, but this is the wrong way to hustle; done like this it's really just spam. I wouldn't be surprised if all of these get removed and you maybe get silenced from the channel comments.
7:20 linus: check out the differences in these here pictures!
me: they are the same picture.
We need SR-IOV, JENSEN!!!!!! Do it or we'll sic Linus on you!
noice
A friend of mine wanted to buy the 3090 because he's in dire need of a workstation GPU, but having those features artificially locked out makes it really hard for me to recommend getting the card, despite how much we both want to see it in person
@@iaial0 Well its good to debate about buying one so that in about four months when you can actually get one you can make a decision....lol
I'd even be ok with mediated VFIO without some janky licensing.
Linus: Here's 4K. 8K DLSS, and native 8K for comparison
Me with my 1080p monitor: Hmm yes I see
i can see it at 1080p on a 1440p monitor (which makes things look worse), so you should be able to as well lol
I literally can't see any difference in picture between a 1080p monitor and a 4k monitor. I would much rather have a very high frame rate 1080p than a 60fps 4k
Nothing wrong with 1080. Very, very few people are using 8K.
@@Shadow0fd3ath24 that's because 1080p does not upscale to 1440p evenly and correctly, since it's not a direct integer upscale
+ youtube compression
Linus is the only YouTuber who asked for something I've been dying for in a GPU without paying 7000 dollars for it. PLEASE give us SR-IOV on the 3090, otherwise this GPU will be useless when the 20 GB 3080 comes out
ITGC wait, a 20 GB 3080?
@@press_580 yeah load of leaks out about it
so more waiting? -_-
this sucks.
Broken Games k man we get it, you woke asf no need to go on and spread your negativity
@@brokengames9020 Cool, so did you preorder your 3080 yet?
7:08 - Me with a 1080p monitor: They're the same image
I can finally play FTL at 8K, yay
I'm super excited to drop a 3090 into my rig and play Slay the Spire and Hades.
stardew valley in 8K! this will be glorious!
Now you can see your ghost shell in 8k!
Dust fpl?.. 😂 😂
LOL
Undertale at 8k all the way.
lmao these bots posting comments when the video ain't even been out a minute
Facts
Isn't that what you have just done?
@@lastavverma7631 STFU
👍🏿👍🏻
Hahaha
Wow I'm watching an 8k video at 360p.
And 60hz lmfao
50hz
30hz
240p here boi
@@Broat_o_o but the maximum for youtube is 60 fps, which = 60 Hz
So, this graphics card costs twice as much as my current rig, consumes three times the power of my current rig and is about the same physical size. I think I'll stick with what I have for now.
10:03 It's a really sleazy move if Nvidia's reason for not enabling CAD features on the RTX 3090 boils down to "that's not what we made it for". If the card truly can't handle it, that's one thing, but intentionally gimping a product with subpar drivers is disreputable.
Literally everyone does this lol. Radeon clipped the wings of their 5600XT to sell 5700's and Intel restricts hyperthreading in their low-end desktop parts. Sad reality but one we live in
@@LmgWarThunder "Intel restricts hyperthreading in their low-end desktop parts" is incredibly simplified. Yields are never close to perfect, and lower-tier chips are often simply higher-tier chips with defective cores/features. There is a reason every single company does this. An 8-core/16-thread chip has 2 non-functioning cores and broken hyperthreading? Disable those cores and hyperthreading and sell it as a 6-core/6-thread part.
That said, yes, everyone does this at basically every level of selling any product of any type ever. Getting butthurt over it is stupid, but being vocal and asking for realistic changes is fine.
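The die-harvesting logic described above can be sketched as a toy decision function (the SKU names and thresholds are made up for illustration; real binning also considers clock speeds, leakage, cache defects, etc.):

```python
def bin_chip(good_cores: int, smt_works: bool) -> str:
    """Toy die harvesting: map the test result of an 8-core die to a sellable SKU."""
    if good_cores >= 8 and smt_works:
        return "8c/16t flagship"
    if good_cores >= 6:
        # fuse off the dead cores (and SMT if broken), sell one tier down
        return "6c/12t mid-range" if smt_works else "6c/6t budget"
    return "scrap"

print(bin_chip(8, True))   # 8c/16t flagship
print(bin_chip(6, False))  # 6c/6t budget
```

The point is that the "restriction" often isn't artificial at all: a partially defective die is worth more as a lower SKU than as scrap.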
Tons of companies do this, and it's always sad to see.
It's common to see it in car manufacturers, where instead of putting [premium feature] in all of their cars, they go out of their way to make it an option you have to pay extra for, even if doing so costs them more production costs since now they have to produce multiple SKUs. They will basically throw a million dollars down the drain if it means they get to charge customers an extra 1.1 million.
@@trapical "they go out of their way to make it an option you have to pay extra for, even if doing so costs them more production costs since now they have to produce multiple SKUs. They will basically throw a million dollars down the drain if it means they get to charge customers an extra 1.1 million."
lol wtf is this even.
"even if doing so costs them more production costs since now they have to produce multiple SKUs" ... that's not true at all.
"They will basically throw a million dollars down the drain if it means they get to charge customers an extra 1.1 million." ... Yes, companies spend millions to make more money; that's what running a big company is. What's hilarious is that you seem to think they do it at a LOSS?!?!?! You're 100% wrong, dude
What you need is an open source crowd owned computing company that can kick the shit out of these cabal companies like Nvidiot and AMDont. WE the people need to decide what's best for us, not the deepstate.
Linus: "8K DLSS ends up much closer to 8K native here"
Me on my 360p wifi: "wow yeah it does"
Same.
360p? What are you on, a CRT from 30 years ago?
@@evans32 "360p wifi"
I watch videos on 144p on my phone while on mobile data.
@@evans32 I mostly watch YouTube at 360p on my phone. Sometimes 480p and rarely 720p.
So is DLSS really worth using? I constantly feel like I'm missing out on fine detail and crisper images if I don't do native 4K
It’s better with dlss
Yes I would use it if you can. It usually gives a significant boost in performance while keeping everything maxed.
@@gideon4942 wtf are you talking about
@@gideon4942 You understand that DLSS isn't the same as ray tracing, right? The point of DLSS is that it makes the game run at higher fps; it literally boosts performance. It renders the game at a lower resolution, such as 1440p, then, using AI on a different part of the GPU, upscales that to 4K (those obviously aren't the only resolutions it works with, just an example). While not using it technically gives a crisper image, in my opinion the FPS gain is well worth it.
@@salvatoremazzola4199 oh fuck I'm dumb lmao. Ok yeah forget I said anything.
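The tradeoff described in this thread can be sketched with a toy frame-time model: shading cost scales roughly with pixel count, so rendering at 1440p and paying a small fixed upscaling cost beats shading 4K natively (all numbers are made up for illustration, not NVIDIA's actual DLSS costs):

```python
def frame_time_ms(width: int, height: int,
                  ms_per_mpix: float = 4.0,
                  upscale_overhead_ms: float = 0.0) -> float:
    """Toy model: shading time proportional to megapixels, plus upscale cost."""
    return width * height / 1e6 * ms_per_mpix + upscale_overhead_ms

native_4k = frame_time_ms(3840, 2160)                           # shade every 4K pixel
dlss_like = frame_time_ms(2560, 1440, upscale_overhead_ms=1.5)  # shade 1440p, upscale
print(round(native_4k, 1), round(dlss_like, 1))  # 33.2 16.2
```

Halving the frame time roughly doubles the fps, which is why DLSS can make "8K" playable at all; the open question is only how much image quality the upscale gives back.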
"We dont have time for intros!" Also linus "segway blah blah blah"
“you can clearly see the difference”
Me at 144p: Hmmm
lol same here
I couldnt on my phone lol. I felt bad cause i couldnt tell
YouTube compression does no justice to 8K game footage; it looks worse than 1080p footage even if you watch this video at max res. You unfortunately can't use YouTube to see 4K-to-8K differences, as even 4K looks like shit when it's a video game with lots of detail; YouTube compresses way too much. The darker the area, the more compression is applied, so dark areas turn into blocky messes and make the entire scene look worse.
edit: watching this at max res, even the 4K footage looks FAR worse than what 4K actually looks like on my display. So 4K/8K gaming looks like absolute shit on YouTube imo; my 2nd monitor is a 1080p display and even that looks worse than native 1080p gameplay imo.
It's a pretty brick, gotta say. Got my Asus RTX 3080 TUF Gaming 3 days ago and will happily stick with it for the next 3-4 years. The 3090's 15-25% performance gain doesn't seem very compelling considering I would need to upgrade my 1-week-old PSU again (Fractal Design 760W Ion+ Platinum) and my mobo/CPU to remove the bottleneck of my current i7-7700K setup. That's over 2200€ here in Finland with the cheapest RTX 3090 available (Asus TUF Gaming).
Cool
Ok
I'd say go for it. The amount of money you save from not having any future kids due to being castrated by your wife is a lot more than 2200€ for the hardware
Holy hell, man, where did you get that RTX 3080?
Your 3080 is not top of the line. Stop trying to feel better about that fact.
"Yes mom, this will run zoom perfectly!"
Linus: "look at the details "
Me on my 1080p screen:👓
You mean 144p right?
👁👄👁
Yeah... yeah... sad times...
YouTube actually upscales videos and increases bitrate when you watch at a higher resolution than your display is set to.
@@mycelia_ow i dont get it