But does it really tarnish their reputation? It's pretty scandalous. Most people eagerly anticipate the release of the X3D chips and the Windows 24H2 update. Other than that, I think Zen5 will gradually gain traction.
Once the 7000 series sells out, and with consumers fearful of buying a degrading Intel CPU... jeez, it's pretty faint praise. Yeah, sure, given that all the other options are crap, I'd buy Zen 5... I guess. Or wait for the next gen and hope it's decent.
@@StingyGeek, it's not AMD's fault that Intel's CPUs have performed poorly for the last five or six years. Don't we always want to buy "the best" product? You sound like you believe there is no other option, so you discredit AMD in the process. They may not always be at the top, but that's what competition is about! Without competition, companies like Nvidia can get away with nonsense.
It doesn't tarnish anything. Nothing really happened to Intel for keeping customers stuck in the 14nm process node loop for almost a decade. These YouTubers are a bunch of yappers with the memory of a fish.
@@vicveco If "most eagerly await the 9000X3D" were true, then Intel still wouldn't hold 80%+ market share. As for tarnishing a company's reputation... that hasn't happened. It's not like the exploding Ryzen 7000 series tarnished AMD's reputation. Nor did rebranding the 7000 series as the 8000 series by adding an APU. Let's also not forget that Ryzen 9000 flopping hard hasn't changed AMD's reputation. Lastly, Ryzen 7000-9000 having such weak sales that AMD still has to keep releasing Ryzen 5000 parts to shore up financial losses hasn't messed with AMD's reputation either. So cut the fanboy bullsh!t. These are PC parts, not sports teams.
AMD may have stood on a rake with the launch of Zen 5 but at least they didn't pour gas at their feet and then set themselves on fire like Intel has been doing with their 13th and 14th generation failures. Zen 5 is a mediocre uplift sure but at least there aren't any reliability concerns.
AMD's GPU division's product failures and their CPU marketing debacle probably make them look worse overall compared to Intel and NVIDIA. In the grand scheme of GPU and CPU market share they're still behind both their contemporaries: behind NVIDIA outright in the Steam survey GPU shares, and behind Intel in long-term server CPU share despite winning in most metrics.
@@Therapistwatermirrur Servers are a hard market, but Intel having their asses handed to them by AMD in that market is a big surprise. Epyc is a powerful and efficient lineup, after all.
@@concinnus Why would you expect that, because of Intel's 13th and 14th gen, all processors are now suspect of such failures? AMD's track record is good. Intel's, not so much: remember the Pentium FDIV bug?
Ya, I honestly don't even know why this is a discussion. Intel's communication on the failing CPUs was abysmal, and they weren't even really owning up to the issues at first. The CPUs are literally failing and no BIOS is going to fix that; it only slows the inevitable.
Intel also stepped up with a five-year warranty... I don't think AMD would ever do that. And they should: my Threadripper 6950X died six months past the puny three-year warranty. I'm not so sure AMD is much better.
AMD screwed up by setting expectations too high for launch gaming performance. However, it's still possible that Zen 5 ends up living up to the figures AMD was showing, as games and other software (such as Windows itself) become better optimized for its new architectural features. Zen 5 has more new features and architectural changes versus the previous generation than any earlier Zen architecture had; the move to chiplets was a big deal for manufacturing, but the substantive architectural changes, especially from a programming standpoint, were not actually that significant. I'm not suggesting you should buy Zen 5 over Zen 4 for a gaming system today, especially because a 7600 is already MORE than powerful enough for anybody who isn't planning to spend 900 USD or more on a graphics card. It's hard to say how long it could take for games to be well optimized for Zen 5, or how much performance will improve versus Zen 4. If you know the history of CPU performance in games and other applications, you'll know that SOME architectural changes see big improvements from day one, while others require far more software optimization and can take years before they're well exploited across a wide range of games; plenty of games and applications, especially lower-budget ones from smaller, less experienced teams, never take advantage of many modern features at all. Sometimes it's not until one or two CPU generations later, sometimes even longer, before a wide range of software becomes well optimized for a new architectural feature. It took ages for games and many other applications to take advantage of dual-core CPUs, and even longer for games to benefit significantly from four or more cores. Zen 5 already offers a 12% performance improvement over Zen 4 in games on Linux, believe it or not, and already offers a significant improvement over Zen 4 on Windows in a couple of games, which at least suggests that more games could see a bigger improvement over Zen 4 in the future.
@@ericmatthews8497 Good for them to do that. If any company has an actual systemic issue that's degrading their products, then obviously they need to step up somehow. But your example is just a one-off. Sometimes shit happens and you get a bad sample, and that can happen with any company; it could just as well have been Intel with their 3-year warranty at the time. Also, the discussion is about stuff the companies have done this year. I'm not just hating on them because I like competition, but Intel's fuckup is on another level this time.
You're really downplaying how shit AMD has been this year. First saying their 5800XT was as fast as, and in some cases faster than, a 13700K. Then going off saying Zen 5 would be better than 14th gen, another lie. Then blaming reviewers by saying their testing was flawed, while also admitting they purposely nerfed Intel CPU performance to make Zen 5 look better. Yeah, Intel fucked up by keeping quiet on the whole thing, but at least they're not straight up lying and spitting in your face like AMD is.
I work as an animator and mostly just touch on the texture side of making a realistic material these days. There are multiple layers in a texture that make up a shader: you have the texture itself, a normal map to give it depth, a bump map to add surface detail, and these days you also have subsurface scattering and multiple other maps, and it all needs to load into VRAM. It's a lot.
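To put rough numbers on the layer stack described above, here's a minimal sketch; the map list, resolutions, and the uncompressed-RGBA8 assumption are illustrative, not taken from the comment:

```python
# Rough VRAM estimate for one material's texture maps (illustrative values only).

def material_vram_mib(maps, bytes_per_texel=4, mip_overhead=1.33):
    """Sum the footprint of all maps, assuming uncompressed RGBA8 plus a full mip chain."""
    total = sum(w * h * bytes_per_texel * mip_overhead for w, h in maps.values())
    return total / (1024 ** 2)

# The kind of stack the comment describes: base colour, normal, bump/height,
# plus extra masks such as subsurface scattering and roughness.
example_material = {
    "albedo":    (4096, 4096),
    "normal":    (2048, 2048),
    "height":    (2048, 2048),
    "sss_mask":  (1024, 1024),
    "roughness": (2048, 2048),
}

print(f"~{material_vram_mib(example_material):.0f} MiB for a single material")
```

Around 150 MiB for one uncompressed material, and a scene has many materials, which is why real engines rely on block compression and streaming.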
@@Chrissy717 If you cherry-pick the artistically best-looking games from 10 years ago, they still hold up super well. I think that helps explain what they're thinking. The graphics may be less detailed and less sophisticated in many ways, but art is pretty subjective, and it's clear that an older, less detailed-looking game can still "look better" to someone. Of course, generalizing that "games looked better 10 years ago" is still pretty dubious. There were definitely lots of bad-looking and artistically poorly designed games back then; the two worst artistically designed games I've ever played are both over 10 years old now. While much of artistic and graphical design is subjective, there's a lot people are likely to agree on about which art styles work better, especially from a gaming standpoint, where making different types of things easily recognizable and easy to tell apart can be very important for making the gameplay itself better. For me, one of the worst offenses is when unimportant features, with no relevance to gameplay and minimal to no relevance to the narrative, are more attention-grabbing than information that's important to the game's objectives or story.
@@Chrissy717 There is a technical aspect and an artistic aspect to these things. What you just said is pretty silly and ignores what the guy just explained.
People are acting like Zen 5 is the worst thing ever when it's just mid, marketing mishandling aside. Intel did the same for years after Sandy Bridge, and one underwhelming generation isn't a reputation destroyer. Intel literally mishandling the 13th and 14th gen disaster is so much worse and completely destroyed any goodwill from customers. That's a much bigger deal imo.
@@brolo7234 Yeah, and people still bought Intel anyway. It's not really something that damages your reputation. I'm not going to upgrade from my 5800X3D anytime soon, but I'm sure as hell not jumping ship because of Zen 5, especially when Intel is so much more unreliable and anti-consumer.
@@brolo7234 The thing is that Zen 5 performs well outside of gaming. People pretend gaming is the centre of the universe, and that goes for Intel systems too. AMD has recently released pretty good workstation and server chips; Zen 5 was simply optimised for datacenter and workstation workloads, and it shows. I don't know why AMD bothers with non-X3D chips for consumers, since X3D is exactly what those customers want.
@@roccociccone597 That's normal: if you want to see praise for Zen 5 being good in productivity, you have to go to channels or communities that focus on productivity specifically (I forget the English word for it). It's a bias; people only care about what matters to them, not the overall picture. But AMD's marketing really is the main problem.
They already did, unless they drastically changed the design and pull off a miracle. The 9800 part is barely an Intel-style bump, like the decade where they screwed everyone by holding back performance. It was not a terrible product, just not the gains they had in every launch prior, and they overpromised, making the numbers look much worse. Look at the difference between the 5800/7800 and their X3D parts and you can see where it will line up; it is not going to magically save this launch unless they underprice it, which I don't see happening. Even then, compared to prior launches it is disappointing. They still have not done the simple price drop that would explode sales. Remember, the launch wasn't bad because the part was bad; it was just a shit value.
They will price it stupidly high because they have shipping containers full of Zen 5 that no one wants, which will just create another yard full of 3D chips no one wants. AMD is smart like that.
Zen 4 is selling better than ever and no one buys Intel anymore. Zen 5 was underwhelming, but all they have to do is lower prices. Intel has to prove they can offer CPUs that actually work for 12+ months before I'd consider them again.
@@Bound4Earth Yes, keep believing that stupidity, when Zen 4 X3D is literally 20% faster in gaming than the normal Zen 4 models, while also having 10% lower clocks and lower power consumption.
I think people will be disappointed in the 9800X3D. The areas where Zen 5 DOES actually perform better than Zen 4 are all places where 3D V-Cache is useless. And we've already seen that overclocking does squat for X3D, so I wouldn't hope for a lot of gains there either. Unless X3D is somehow NOT simply Zen 5 with V-Cache added, I wouldn't expect any remarkable uplift over the 7800X3D. No one will call Arrow Lake a failure if it doesn't get a win over X3D, but _everyone_ will call AMD a failure if it _does._ Let's hope I'm wrong. Or maybe not. I'd be happy to see X3D get out of the $400s, because that's WAY too much for an 8-core in 2024, let alone 2025.
@@JagsP95 That's NOT 'true 4k' - please don't short change your self. Upscaling has it's traditional 'good' uses, of course, but when Nvidia markets upscaling and fake frames as a xx90 card 'feature', you know they're f#ng with us i.e. those high end cards are not so high end without that fake sh#t.
@@ChrisM541 I really don't care how they put those pixels on my screen. As long as it looks good, that's all I care about. It's all computer generated anyway. DLSS Quality on my 4K TV looks indistinguishable from native 4K. DLSS upscaling alone makes Nvidia worth it. I'm happy with my 4070 Super.
I doubt you will find many games without any upscaling. Even if you set the output at 4K, there’ll be upscaling for all sorts of graphical effects, including reflections, global illumination, textures, light shafts. These can always introduce aliasing. We even talk about “high resolution shadow maps” and stuff independently of the game’s stated render res. When you use DLSS, they don’t decrease the resolution of all graphical effects either. Some might still be running at 4K, like motion blur and depth of field. And the UI of course. And you know that native resolution isn’t the holy grail. There are benefits from downscaling. Ray tracing sometimes needs to be sampled at much higher than the game’s stated render resolution. 4K ain’t 4K, basically. It’s more complicated. To make good-looking games, there are compromises.
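As a concrete illustration of "4K ain't 4K", here's a small sketch; the per-axis scale factors are the commonly cited DLSS preset defaults, which individual games can and do override:

```python
# Internal render resolution for a 3840x2160 output under common upscaler presets.
# Scale factors are per axis; some effects (UI, motion blur, etc.) may still run at output res.

PRESETS = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.5}

def internal_res(out_w, out_h, scale):
    return round(out_w * scale), round(out_h * scale)

out_w, out_h = 3840, 2160
for name, scale in PRESETS.items():
    w, h = internal_res(out_w, out_h, scale)
    print(f"{name:11s}: {w}x{h} (~{w * h / (out_w * out_h):.0%} of the output pixels)")
```

So "Quality" mode shades fewer than half the output pixels each frame, while other passes in the same frame may be sampled above or below that resolution, which is the point being made above.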
I've messed with Linux every so often, in dual boot or VM scenarios, for 33 years, but I'm closer now to dumping Windows entirely than at any point before. I don't care about the genre of games that make use of anti-cheat. With that in mind, I'm really not sure why I'm still using Windows today when I could be using SteamOS or something else for gaming.
@@photonboy999 You've obviously not been keeping up with the news. Since Valve has been normalising games on Linux, it's an entirely different experience these days.
What is there to repair? It's not like these processors don't work, or rapidly degrade. Yeah, it's a small upgrade for gaming, but there are many more uses for a CPU. Intel did 5% performance increases across all uses from Sandy Bridge to the 8700, when they finally gave us more than 4c/8t on mainstream desktop, and nobody seemed to mind.
No worries, it's just HUB and their clickbaity video titles. Of course AMD is doing fine despite Zen 5 not being much of an improvement. They are not forcing Zen 5 on us and we can happily live with Zen 4. Now if the next gen gets such a small improvement too, then I'll start to worry.
@@rtt4163 As if other tech companies don't do that on the regular. That is why people come to watch channels like this in the first place, to compare claims with actual tests
Yes, in Brazil Intel denies warranty if you run memory above 5600 MT/s. It's the first question they ask when you apply for warranty. And then you're blacklisted: you can't reapply for warranty for the same S/N of the CPU.
What does it do to the stats if I watch the whole video, leave, then remember I forgot to like it, start it again, like it and then leave? I do that so often.
The VRAM problem is that new games put MORE types of data on the GPU than just textures and meshes, like distance fields, virtual shadow maps, or more G-buffers to make better materials and effects.
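A back-of-the-envelope sketch of that point; the render-target layout below is a hypothetical deferred G-buffer, not taken from any specific engine:

```python
# VRAM used by a deferred G-buffer at 4K, before textures, shadow maps,
# distance fields or TAA/upscaler history buffers are even counted.

def gbuffer_mib(width, height, bytes_per_pixel_per_target):
    return width * height * sum(bytes_per_pixel_per_target) / (1024 ** 2)

# Assumed layout: albedo RGBA8 (4 B), normals RGBA16F (8 B),
# material params RGBA8 (4 B), motion vectors RG16F (4 B), depth D32 (4 B).
layout = [4, 8, 4, 4, 4]
print(f"~{gbuffer_mib(3840, 2160, layout):.0f} MiB just for one 4K G-buffer")
```

Roughly 190 MiB for a single set of full-resolution buffers, and modern renderers keep several such sets (shadows, history, intermediate passes) alive at once.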
I don't think AMD's reputation is as damaged as this video title suggests. I have a 7800X3D system and a 7700X system. Zen 5 isn't an upgrade for existing Zen 4 owners, but for new builders it's great. AGESA 12.2.0.2 and the Windows 11 scheduler fixes add quite a bit of performance versus launch. HUB will hopefully test the new AGESA in the coming weeks, possibly during the 9800X3D launch.
AGESA changes don't affect the 9700X/9600X though, as these are single-CCD processors. If those didn't see big generational changes, the dual-CCD parts won't either (and the benchmarks done so far suggest it's a 1% difference on dual-CCD CPUs).
The title doesn't suggest AMD is damaged as much as you say it does... It's not the product that's bad, it's the way they tried to sell us the product. Stop white knighting AMD.
It was always going to be hard for Zen 5 to be impressive because Zen 4 is so darn good. If Zen 5 is a bust, it's because of where our expectations got set. Pushing a datacenter-focused CPU out to gaming reviewers is just dumb. But I've said it before: look who's in charge of AMD's marketing department.
@@keldon1137 AGESA 12.2.0.2 has already been tested and it indeed improves performance. I'm not making this up. My comment is based on facts. HUB and GN will hopefully get around to testing it but it just released and is still in beta UEFI form. They, rightly, don't typically test betas.
@@AshtonCoolman 12.2.0.2 aims to reduce cross-CCD latency, meaning it only affects dual-CCD CPUs (not the 9600/9700). In the case of dual-CCD CPUs (9950/9900), the benchmarks I've seen show it's within a 1% improvement.
Zen 5 is rather impressive for most of the things that matter (scientific/HPC computing, technical computing, data center infrastructure... all of which primarily run on Linux). Reviewers on YouTube tend to focus on Windows/gaming performance, which is one of the most boring areas of computing at the moment, IMO.
Which is exactly why IDGAF about X3D. I still care about gaming, but gaming isn't CPU intensive in the real world, and the i7 absolutely curb stomps it in production for less $$. AMD needs to give mixed use users a lot more love.
Zen 5 was not advertised by AMD at CES for scientific/HPC computing and that other stuff; they showed benchmarks for Windows tasks and games, and of course they lied a lot!! The main buyers of desktops are not people doing the stuff you mention, but gamers and people with Windows productivity tasks.
I guess only Intel's reputation took a very harsh hit. AMD coming up short of expectations is not a big deal, and their handling of the PR was not great, but far from the worst that can happen. Nvidia using misleading product stack names is not news, so I guess it's business as usual, but the 13th and 14th gen issues and the poor PR handling have been a total failure.
Linux gaming seems to be able to play something like 60-70% of the games on Steam, mostly at a similar level, sometimes higher if we look at Wendell comparing Shadow of the Tomb Raider. But yeah, you have to keep in mind that there's a 30% chance your next game might not work, so Linux gaming can be good if you dual boot and play the games you can't run on Linux on Windows.
in my experience it's only games that literally want to run malware (kernel level anticheat) on your system. I cannot comprehend how compromising your entire system just to play some online game is worth it.
@@roccociccone597 Well... people do approve of this, and others use kernel-level cheats to get around "weaker" anti-cheats, so these anti-cheats are a reply to what cheaters were already doing. And still, the two "best" cheating methods now are AI and two-PC setups forwarding inputs, so yeah, I kind of can't blame developers for putting those anti-cheats in.
@@roccociccone597 Aren't there still plenty of features missing, like frame gen, DLDSR, and HDR? I've heard that even basic stuff like VRR often breaks on Wayland, so it's definitely not just anti-cheat.
@@garrett3117 Zen 5 seemed to show way bigger differences on Linux, more or less a 10-12% uplift; that's why a big part of the audience points to Linux testing so much. As for the built-in Shadow of the Tomb Raider benchmark, it goes from 270 to roughly 320 fps between Windows and Linux, but in other games it can be the same as or lower than Windows. So yeah, it depends; you can mostly say it's very comparable if the game can run at all.
AMD will be just fine unless Arrow Lake gets a significant performance increase and runs somewhat cool. On the other hand, I strongly believe AMD did not change the architecture as much as they claimed; they just increased IPC, like the Raptor Lake refresh, and wanted to sell you a CPU while everybody has a negative opinion of Intel. Both companies are bad; you just choose the lesser evil.
How would that happen if people generally liked Intel up until this year? Zen 5 was being designed when Intel was on the comeback and doing better. You know CPUs typically take 3-5 years to make, right? There are also usually separate leapfrogging teams working on Zen: for example, one team works on Zen 2 while another works on Zen 3, then the Zen 2 team switches to Zen 4 when they're done and the Zen 3 team works on Zen 5. At least that's how AMD was doing it in the past, when they wanted to make a completely new architecture every other generation. I've heard they will stop doing this because it's getting increasingly difficult. Zen 6 really should be their next great architecture if things work out, due to the big latency decrease from stacking the IOD and CCDs together, according to rumors and leaks.
I am seriously considering ditching Windows when W10 becomes obsolete, as I no longer trust Windows as a product. I will be installing a few different distros on my secondary computer to see what works for me.
Already did. Linux has been great so far. Just a learning curve because it's a new animal. The only problem is that some kernel-level anti-cheats won't work, because Linux won't allow that level of fuckery with the OS.
I've been talking with a friend who dualboots Windows 10 and Linux Mint, and a lot of the things I need from my daily driver seem to be covered on Linux thanks to the Steam store and emulation. It's really just day 1 new game releases I'm a bit concerned with, but Linux seems to have come a long way since last time I looked around 2016.
It's not that Linux doesn't let anti-cheat run in the kernel; it's that it's so open you could see what the anti-cheat actually does and what info it collects.
AMD had 8-core chips at the time. Well, actually they weren't really 8 cores, because the cores shared resources with each other. A sly move by AMD. Unfortunately they got outperformed by dual-core i3s from Intel. Honestly, were you even alive during the AMD FX vs Core era?
@@General_Li_Shin Could also be on Microsoft, helping their buds at Intel by not optimizing for AMD. Bulldozer CPUs worked much better by the time the first Ryzen CPUs hit shelves.
@@nimrodery You're completely out to lunch. This conspiracy that Windows purposely nerfs AMD is so tired. Bulldozer shared FPUs between compute units. It was destined to be lackluster from the get go.
A small suggestion re Linux benchmarking: many games on Steam are windows games that will run on Linux, perhaps with adjustments. But there are also games that, on release, have both windows-native and Linux-native versions. Viewers might find it interesting to see a performance comparison of those two native versions run on identical hardware, you would just need to swap out boot drives to do the testing. You would, of course, keep things like video drivers up to date. And you aren't going to test multiple Linux distributions, just one distro as a kind of standard candle for comparison. Results might be interesting.
Would be interesting. I know from personal experience the War Thunder and Cities: Skylines (1) Linux clients are dogshit, and they usually run better through Proton anyway.
Can we talk about how bad lighting has gotten over the last 5-7ish years, or how much model detail has suffered? It's like with film or TV CGI. It feels like executives are pushing dev time to the limit with open-world BS. I'd rather have a game look like something from 10 or even 20 years ago with better physics and lighting than what we have now... Moore's law has caught up with the executive class, but they are in denial...
I've tried Linux several times over the years but it's just not a realistic gaming platform. Every time I get a diehard Linux fan trying to convert me all I have to do is ask them if they would recommend Linux over Windows to their mom or grandma. Until that answer is an honest yes (which it never is) then don't try to convince people they're better off spending hours on GitHub trying to get the shit to even run rather than actually _playing_ your games on Windows at 5% lower framerates. Even Wendell, the champion for Linux says don't use Linux if your primary use is gaming.
On the VRAM question, I recently upgraded to a 6800xt and I noticed FF16 was putting me at nearly 15GB of VRAM usage because I was running 4k, high textures, with frame gen. Without frame gen it dropped to like 12GB. If companies (read:NVidia) want people to be using this tech then people need to have the VRAM to do it. So glad I went with a 16gb card.
Ray tracing isn't an important or must-have feature. Nvidia has convinced a lot of people that ray tracing performance is more important than standard rasterized performance, but the fact remains that the performance cost of ray tracing even on Nvidia graphics cards is extremely high relative to the actual improvement in visual quality that you can get from it. Suddenly a lot of Nvidia fanboys have gone from saying that having the highest FPS is important, to arguing that having half the frame-rate with ray tracing enabled is better. Some scenes can even look WORSE with ray tracing enabled, and often they just look different but not necessarily objectively better. Ray tracing does seem like a promising technique, and it seems like it's likely to become very common in the future as graphics cards become increasingly better at it, but games still need to be designed to run with ray tracing turned off, or at least with the option to turn ray-tracing settings down low enough so that the performance hit is less than 20% rather than closer to 50%. I have played Cyberpunk 2077 and Control (among a few other games) with an RTX 3080, and did a lot of testing with ray tracing turned on as well as turned off, so I have a lot of personal experience with ray tracing in games where it's been well implemented.
@@syncmonism While the 3080 is a solid card, no doubt, it's far from ideal when testing ray tracing/DLSS in Cyberpunk. The fps hit you get from ultra settings with ray tracing is definitely too much in that game if you're using a card without frame gen, but with frame gen, ray reconstruction + DLSS Quality is pretty spectacular in my opinion. I agree that Nvidia shouldn't advertise fps numbers based on DLSS or frame gen, but we can make that statement without downplaying the tech these newer cards have. I'm no Nvidia fanboy either, I love my 7800X3D, but my experience with ray tracing, path tracing, and DLSS has been pretty great in single-player games. We both agree that Nvidia has a ways to go to make upscaling quite on par with rasterization, but they've gotten pretty close in the past few years, and the doors that DLSS can open up are very cool.
I remember when I first played MGS2 on PS2 and thought the game looked so realistic. I remember that black chick with the gatling gun, Fortune, and thinking back then how state-of-the-art that game was.
Watching this on Nobara Linux (KDE) while watching YouTube and gaming; that part is exactly the same. I actually think alt-tabbing goes a bit better. I don't play anti-cheat games that are kernel-invasive; certain FPS titles won't run (maybe they'd require a VM or a separate Windows dual-boot setup). I'd suggest dual-booting for those looking to try it. My main game ran on Linux, so I've been clean for a few months (I still have a Windows 10 LTSC install on another machine in case I want to sort through my NTFS drives quickly, bench, or playtest a machine).
Also Nobara here, but the GNOME flavor (even though I'm considering switching to KDE). Most of the problems I had were because, with a Microsoftified brain, I was banging my head to find a solution that was just 2 or 3 clicks away in the settings. Even the terminal is a lot more accessible now, but it's not required. Of all the things he could have said about Linux, he chose to complain about installing apps, which is as easy as using an app store. This is a serious case of bad reputation. People are afraid of what they don't know.
@@lolilolplix Wendell also said you can't just compare Windows against Linux in that regard because there are fundamental differences in the way they process things.
@@jamegumb7298 You clearly didn't watch the Level1Techs video, as some of the games did run better on Linux. Also, Wine literally stands for "Wine Is Not an Emulator"; there is no emulation when running games on Linux. All Wine and Proton do is translate the Spanish of Windows calls into the French that Linux understands. It's like the 3DS "emulating" the DS and GBA, where it's not emulating anything at all and is just running the code natively because the hardware is similar enough. Linux and Windows run on the same hardware, so Linux doesn't need to emulate anything. And translating Windows calls to Linux isn't like having to take time to look things up in a dictionary. It's like being bilingual, where you already know the language well enough to hold a conversation but you haven't learned every single word yet. And every update to Proton and Wine is just learning more words, so fewer mistakes get made.
Can you add Minecraft: java into your game testing? I know it's not a "new" game, and it isn't very demanding, but it's still popular and if you crank up the render distance and add mods/shaders, it does become quite demanding.
Of course "great textures and low VRAM usage" is a reference to Skyrim, it's so obvious. But really, I think it's most probably about the Arkham trilogy, be it City or Knight, idk.
I think Tim should stay away from Linux questions until he's made a decent effort to use it in a modern setting. :) The biggest issue for Windows users is that Linux is NOT a singular experience, and as such it can be confusing to get into. Valve needs to release SteamOS so that the average gamer has a singular introduction to gaming on Linux. Otherwise there are so many distributions to choose from, with different desktop environments, software, window compositors, kernel release schedules etc. (LTS vs rolling releases, impacting driver release speed). I use Manjaro ("Arch", but made for people). It has a rolling release schedule, so you have access to the latest kernel and hence the latest drivers. My experience has been that it is:
- WAY easier than Windows (no need to manually download/install drivers, applications etc.); you just update the OS, or use an "app store"-like interface for software packages.
- Less resource intensive (0.5-1 GB of RAM).
- Faster on AMD APUs, where games tend to reach higher FPS due to more efficient memory handling and hence better bandwidth.
- Better than Windows at supporting "legacy games", since Proton is more robust than Windows' own compatibility layer.
- Able to play 98%+ of my 350+ single-player collection; I even have some older games that don't work on Windows without manual mods that do work on Linux.
- Able to play about 50% of all anti-cheat games (total).
- Supported by the majority of anti-cheat solutions, but developers often don't enable it due to the increased demand on their support teams.
- Simply better with AMD right now; as an RX 6800 owner, it has been a flawless experience. I know a lot of people swear by Nvidia even on Linux, but my experience has been that driver support lags behind AMD, especially for recent features like HDR and the Wayland compositor, and especially on older GPUs like the GTX series.
My thinking is that Linux is more interesting for HUB to do an "early look at" once it reaches 5% market share and there's a "go to" gaming-oriented distribution that the majority of gamers can try out for themselves. And instead of calling it "Linux", it might be better to call it by the distribution's name, like "SteamOS". It's not like we say "NT Core" instead of "Windows 11" or "Windows 10", or "Unix" instead of "macOS". For normal people, the OS is the GUI, the user experience, and how the general OS is assembled; Linux is just the core of a very diverse array of OSes. So calling it "Linux" in itself highlights a communication issue for "Linux gaming" in general, and that's also why you see such a varied range of experiences on the "same OS".
@@Hardwareunboxed Well, I've been a Windows user for 30 years. I just gave Linux a fair shake recently and read up on it... Once I did, I realized that there is a lot of miscommunication online, since people treat Linux like a "singular OS experience" when it is not. Linux is just the core of the OS, similar to Unix. But you don't say "Unix" when you are referring to macOS. You don't say "NT Core" when you are referring to Windows XP or Windows 11... To the end user, UI/UX is what defines the OS... So the first step for Linux to gain popularity is for someone to create that singular OS experience for gamers. Valve will probably be the ones to do it. Otherwise we are stuck in this endless loop of people "trying Linux, and it sucks", when in fact what sucked about their experience (through no fault of their own, except not reading up) was that they chose a Linux distribution/OS that didn't cater to their needs. Many people choose LTS distributions, but most of the time they would be better off with rolling releases, especially if they dabble with new, modern hardware frequently.
@@Hardwareunboxed I totally understand why you don't want to benchmark games on Linux, but I really appreciate it when CPU/GPU reviews include some Linux benchmarks for Blender. As far as games go, Windows comparisons are a fine rule of thumb for relative performance, but Blender on Linux makes such a huge difference. I know it's never been so complex to benchmark components, and that your audience is mainly looking for gaming performance.
You make a really good point here. Whenever someone tells me they "tried Linux", my first thought is "which Linux?" Some distros are harder to learn and use. Some distros are marketed as easier to maintain than they actually are. Some distros suffer from being too small to be easily supported. Some distros are just plain bad. A lot of people bounce off because they don't expect to have to (re)learn so much for basic usage, and they picked a distro that doesn't make that easy. Having a "go to" gaming distro, say a desktop version of SteamOS (more specifically, one maintained and promoted by Valve in an official capacity), would really help with solidifying things. Linux users tend to be really opinionated about their distro (and DE) of choice, but newcomers might not be. Having an OS for Valve to say _"I hear you're sick of Windows, but worried about gaming compatibility, well here's the one we use ourselves for testing; this is the most approachable and compatible distro you can choose for gaming"_ could really consolidate those newcomers unsure where to start. And in turn that makes it easier and more appealing for devs to support compatibility with that distro, which in turn helps the rest of the Linux ecosystem. The immutable model Valve is using for SteamOS on the Deck puts them in a really good position to do this, making it far easier to target than even other "stable" distros.
As opposed to who? Intel who had two generations of high end CPUs cook themselves? Does reputation even matter? Consumers only care about what you can do for them today. Listen to comments online and everyone hates NVidia. What does everyone buy? NVidia.
Reputation matters, but not in a mono-/duopoly. A while ago, I was explaining how AMD is probably better value-wise at midrange to a younger relative only to be met with "Who's this Sapphire? Never heard of them, I'd rather just buy a brand I trust in, like ASUS". So while they were rather accepting of the notion of driver issues being in the past for AMD and willing to give them a try, the issue of choosing between many board partners can be a dealbreaker. And this notion can go extremely strong; people get Samsung AC units which are meh at best just because Samsung's done right by them with other appliances in their house. If the price difference is 30%+ for more or less the same product, consumers start looking the other way, before that? "I'd rather overpay but be sure it works well".
Yes it does, when AMD is already looking like Intel by gouging prices right as Intel fucked up. They didn't even wait to drive up prices while underdelivering and overpromising. This is Intel behavior of old and should be shit on. Fanboys like you should stop excusing shit behavior, especially when AMD is being anti-consumer. What if they repeat it next year, would you finally say this should be criticized and try to fix it then? Because if AMD cared about the consumer, the price would have dropped and sales would have exploded. But they value money more. Even with the slower sales, I think they are making much more money than they did last year because Intel fucked up so badly. Hence no price drop, and why you should be critical of AMD.
You act like AMD has been amazing for decades. Yeah, I switch when they are good, but if they continue this, Intel might get me back if they deliver a CPU in a few years that blows AMD away and just works. I stick with companies that aren't trying to f me over. AMD is no longer that company to me. They stopped being that with their last GPU launch.
@@Bound4Earth Unhinged much? I'm not really in the market for a new system right now, so I don't put that much stock into it either way. 10/10 way to start with baseless personal attacks. The last time I had an AMD GPU was over 10 years ago, and probably 80%+ of my and my relatives' home systems have Intel CPUs, yet idiots like you are quick to point fingers. Overpriced products do little reputational damage. Under-delivering is another thing; I must've missed the hype AMD allegedly did for this gen (there's been a single promotional piece I'm aware of, and that's not exactly hyping it up). I'm also aware of the bad benchmarks; not sure if incompetence or malice, probably both. To me personally, neither of the big hardware companies suffered a substantial reputational loss because of the way they've handled it. I don't see myself avoiding Intel purely because they've run into a bad streak in manufacturing, or AMD because they released overpriced products. Systemic issues, such as with Gigabyte PSUs, are a big deal. You can bash Intel/AMD/Nvidia all you like, but they have, by necessity, been somewhat competitive.
@@Bound4Earth Well, aside from being locked to a given chipset, there's not much to switching sides. The GPU side is more capricious; Nvidia had me for a long time because of work applications. But then having great work and game performance on consumer hardware was no longer a thing, and AMD has improved... I do happen to care about power efficiency, VRAM, and Linux support, so there's also that. So I guess it's the exact opposite: my viewpoint isn't that of someone who's considered AMD awesome for two decades, but rather of someone who's mostly used Intel/Nvidia with the occasional dip into AMD. And at least Nvidia has been failing to meet my expectations for the past few years. If Nvidia straps enough VRAM to a somewhat affordable GPU, I'd probably still go with that. If not, the next upgrade is going to be AMD. Brand loyalty is largely just accumulated experience of various pain points (and the expectation of such). If it just works, might as well grab whatever goes on a good sale first. But back to the original point: even at its worst, AMD somehow survived the ordeal, being locked in a duopoly. In part, the government won't let it die entirely because they'd rather not have a monopoly. Someone like WD may spin up a flash memory division or wind it down. And it is in this market that brand loyalty matters most.
On the Linux gaming craze, which has blown out of proportion imho: I'm a Linux user. I don't care about benchmarking games on Linux. There is almost no way of getting clear data like 1% and 0.1% lows etc. It's not worth the immense effort to try and run benchmarks for a low single-digit userbase. Modern games USUALLY run a bit worse than on Windows. One huge plus (at least for me) is that I can get older games running which newer Windows versions won't even try to install in most cases. Another point: there are a lot of factors that impact performance. Your kernel version, your Wine (Proton) version and their interaction. Your driver choice (whether it's "official/unofficial", yes that's a thing on Linux). If you choose to benchmark one specific combination of all of this, you will be benchmarking a use case for, let's say, 10% of Linux users, which is 10% of 2% of the userbase. In other words, practically useless. /rant out, ignore grammar mistakes please.
I am over 30 years old. I have been rooting for Linux for over 16 years or so. Unfortunately, I have yet to see Linux dominate in any significant way (besides servers). While it has become much better to use, 'normal' people are just going to use Windows or macOS. I use Windows ONLY for gaming and macOS ONLY for work. I have a Raspberry Pi as a home server. So, you know, torrenting for easily sharing free manuals for washing machines and so on, local file sharing, DNS caching. Unfortunately, most newer games have their own launchers, and some of them have heavier anti-cheat software that just cannot work in a Linux environment, so many e-sports games just can't run on Linux.
Has there ever been a poll of the HUB audience about how they use their computers in the first place? I'd really want to see it. Ideally, as a multiple choice of: - Gaming - Listening to music - Browsing the internet [more than/comparable to mobile] - Watching movies - Lightweight work [thin client/remote desktop, editing office docs etc.] - Computationally heavy work [rendering, engineering simulations, data science...] - Data storage. Every time I hear "gaming only" it seems like I have a fundamental cultural misunderstanding; some parts that are important to me are clearly omitted, and there is sometimes a laser focus on details I find minor or even irrelevant.
Textures and texturemaps. Early games introduced bump mapping. A normal map (a texturemap) is used to store information about the surface irregularities (the bumps). When light hits the flat surface from a certain direction (and reaches the eye/camera), you can render the light reflections using the normal map's bumps. It makes the surface seem to have more detail, so it no longer looks flat. You can use texturemaps for all kinds of purposes: reflectivity, transparency/translucency, you name it. All the properties of a surface are defined in texturemaps. The colors of the pixels are usually what people think of as "the texture"; texturemaps are basically extra data values for every pixel of that texture. The data can be more abstract, like timing info. An example is a window hit by raindrops, where the drops seem to appear at random locations and even drip down (not all at the same moment); such a texturemap holds the differences in time at which drops appear on the glass. If you want to render a scene more and more realistically, you need more and more information about that scene => more texturemaps.
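A toy sketch of the bump/normal-mapping idea above (the vectors are made up; real shaders do this per texel on the GPU): the map stores a direction per texel, and the lighting term uses that stored direction instead of the flat surface normal, which is what makes a flat polygon look detailed.

```python
# Lambertian diffuse term using a stored per-texel normal instead of the face normal.

def normalize(v):
    length = sum(c * c for c in v) ** 0.5
    return tuple(c / length for c in v)

def diffuse(normal, light_dir):
    n, l = normalize(normal), normalize(light_dir)
    return max(sum(a * b for a, b in zip(n, l)), 0.0)  # brighter where the normal faces the light

flat_normal  = (0.0, 0.0, 1.0)    # what a flat surface would use everywhere
bumpy_normal = (0.3, -0.2, 0.93)  # decoded from one normal-map texel
light        = (0.5, 0.5, 1.0)

print(diffuse(flat_normal, light), diffuse(bumpy_normal, light))
```

The two texels get different brightness under the same light, so the surface reads as bumpy even though the geometry underneath is a single flat polygon.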
Even with Zen 5 not being spectacular, their support for AM4 still has me holding AMD CPUs in good repute. I hope they actually try to do better price-wise with the GPUs, but thanks to AI, I doubt it.
Great, honest video... As an old guy in this space, it really does pain me that people have issues with Windows... Those engineers do a magnificent job considering how many different use cases there are out there! I've never had a major issue with Windows unless my build was shit! And I certainly wouldn't want to go to an Apple-style ecosystem which is locked down, or a Linux one that is too open!
CPU/GPU tests on Linux would be interesting because they could demonstrate whether the OS is fully utilizing the hardware's potential in a mutual comparison.
11:30 - PS5 GPU entry level? Are we forgetting Xbox Series S? :) Its GPU compute power is something equivalent to a 5500 XT (with ray accelerators). It is the lowest common denominator this gen that devs have to target.
There's too much elitism around this topic, and I'm a PC gamer myself and still annoyed about it. Is the power of an RX 6700 (XT) really entry level nowadays? The RX 6600 is still a valid card! I'm never willing to pay nearly 1000 € or anything close just for a graphics card, AND I can afford it! 4K60 or even higher? Sure, just bring more cash into the game! The PS5 costs below 400-500 € nowadays, Slim SKU included! It becomes more complicated when you factor game prices into it, but still. Who in the PC space (with a brain) would really make the same arguments? There are real arguments to make: more multiplayer options, no general subscription needed for online play, and more choice to go for more CPU or GPU performance at the same budget, depending on your game preferences. Or even the argument that you can use the PC for work as well!
Yes, GPU prices are insane, as are CPU prices. Also, how long has AMD been stuck on 6-core Ryzen 5 and 8-core Ryzen 7 now? I get that they're actually better than the previous generation, but still, the laptop market is a complete shit show. Intel is just a shit show.
@@sMv-Afjal There was a time when an i5 was not sufficient for gaming but an R5 was, back when it was the i5-9600K vs the R5 3600, because the Intel part doesn't have HT but the AMD one has SMT. So it was 6T vs 12T, and some games had stuttering problems because of that. That was the motivation behind the i5-10600K getting HT (6C/12T). For lower-end CPUs, you will only see AMD respond with more cores if there is a critical need for it; beggars can't be choosers. For high-end CPUs, there are rumours saying the Zen 6 R9 will have 32 cores.
@@sMv-Afjal To be fair to AMD in this case, 6C/12T is still sufficient even with Zen 3; there the difference between 6C and 8C is bigger than with Zen 4. I don't know what price point you expect CPUs to drop to. You can buy perfectly sufficient ones at 100 € ($), and just over 200 there's the 5700X3D or the 7700 (tray version). Do you expect a 7600X3D to go below 300 $/€ that soon? We could criticise application performance between AMD and Intel; depending on the price range there are differences, with the 7700(X) and 14600K(F) in the same range and the latter around 10% or more faster. Below that it's similar, but not as drastic. I'm really more annoyed that the 7800X3D became so expensive again! I'd prefer the 7700X boxed to be below 250 €, but at least the tray version is cheaper.
Marketing material: In my job as an electronics engineer, many years ago, I returned a power supply to a manufacturer as faulty because their data sheet on their website ( which was readily available) said load regulation should be 0.01% and it measured at 0.05%. Their response was, "Load regulation can be up to 0.1%, that is what the manual says", to get the manual, you had to register with their website. My argument was, "People won't make a purchase based on the manual, If they have to sign up to get it, they will merely look on the readily available data sheet, if they buy the power supply based on that data and find it doesn't do what they want, they would return it as faulty." Eventually, the company corrected their data sheet.
I appreciate Tim at least considering the possibility of thinking about Linux seriously. As a reviewer and tester with such mainline products, it does make sense that Linux not be considered. I’m always reminded of Arrested Development - “There are dozens of us! DOZENS!!!”
I would say for most Linux operating systems, installing basic apps is much, much easier than on Windows. Linux software stores are commonly used in the way that the Microsoft Store is trying to be used and work flawlessly most of the time.
On the whole Linux situation: *TL;DR* - Game compatibility isn't comprehensive or consistent enough, has a couple major hangups (anticheat, launchers), is too much of a melting pot for devs to support, and takes a lot of effort to learn when coming from Windows. Worth learning if you're an enthusiast, but performance-wise not worth benchmarking. I moved to Linux a couple years ago. Most of the problems Tim mentioned are no longer very applicable in mainstream distros. Anticheat is still a big one though, probably the biggest hangup for most (and probably will be for a long time). But compatibility seems to be plug-and-play for around 85-90% of games I play, in my experience. And there's almost never any noteworthy performance difference (which is why it doesn't bother me that HWUB doesn't test Linux). There are still quite a few games that run like trash and have terrible bugs or break at the slightest breeze, but these days those are the exception rather than the rule. Linux gaming is viable and fairly approachable, but it's still got some consistency and reliability issues, and I agree on it being still only "for power users". Another thing is launchers. Big AAA publishers like Ubi and EA like to force players to go through their launcher to run the game, even when the game is sold through Steam. And plenty of games these days aren't sold through Steam, not until much later. Epic's still doing timed exclusivity. These launchers tend to be a nightmare for Proton - and none of them run natively on Linux. Bottles can help, Lutris can help, Winetricks can help, alternate launchers (Heroic) can help - but in my experience it's never as reliable or consistent as a game that goes straight from Steam to launching the game itself on Proton. The main reason for the lack of compatibility and support from devs on Linux is the very "composite" nature of it. Any distro is a whole bunch of moving parts with each update, even on stable point-release distros, there's so much variation across so many different projects that make the core of your system work. These have different update cadences and combinations (sometimes unpredictably so) that pair different core components at different versions. There's hundreds of different distros out there. There's only one Windows 11. It's not a surprise that so few games make any attempt at compatibility for Linux, especially when they can't even keep up with bugs and performance on Windows. What Tim didn't mention is having to learn a wholly different OS. I had no idea just how much I would have to re-learn to understand basic usage and maintenance. "Power user" level functionality even more so. Linux is not like Windows. Concepts like "DE" and "packages" and "mount point" were alien concepts to me at first. People give troubleshooting/advice for Linux in the form of command-line because the system UI varies depending on your DE of choice, which is also interchangeable. Packages are installed with a package manager (which may have a GUI frontend), not downloading and running an installer. All storage on your system is mapped as a tree, rather than distinct separate "drives". Paths are case-sensitive. It's a learning process. These things aren't worse (I'd argue they're better), they're just not what Windows users are used to. I've seen mention that if Valve were to put forth a desktop version of SteamOS _in an official capacity_ and promote and support it as such, it could really help solidify the Linux gaming userbase. 
Valve is honestly killing it with SteamOS 3 using an immutable model on the Deck. Making a push to have devs target not some arbitrary third-party distro, but their own SteamOS, with immutable versioned system components, and user-installed software separated on an entirely different packaging layer, could do wonders for compatibility and support.
I think that second question is confusing "good art direction" with "texture quality". The limitations of older games forced developers to put effort into designs, whereas nowadays they can just stuff whatever muddy indistinct image they want into a texture file and call it "high-resolution".
@Hardware Unboxed: In your charts I still see you using the WRONG title, Microsoft Flight Simulator 2024, instead of 2020. The 2024 version will be released on the 19th of November. Please correct it. And DX11 is still faster than DX12 in MSFS 2020.
Unless you're really examining every item in a scene under a microscope, textures with a really high resolution won't change the visuals as much as they'll require larger VRAM buffers. One thing that's quite funny is that the textures that actually give the detail aren't the "main" ones, which are typically higher res (the diffuse, or albedo as it's also called), but instead the normal, bump, and displacement maps, which are normally fairly small compared to their diffuse counterparts. Say you have a 4096x4096 diffuse texture (which is overkill for the most part, even for some close-up stuff): the normal and bump map textures will likely be less than half the resolution of the diffuse (there are obviously exceptions, but still). Textures can only get you so far in improving the visuals versus the toll they take (bigger install sizes, larger VRAM buffer requirements and so forth). A lot of the time, what will make your in-game scene look off will most likely be the anti-aliasing method being used or, because we are in the age of image reconstruction and upscaling, the initial rendering technique before the frame is "reconstructed". You get way more detail with denser geometry than with bigger textures, but textures require less testing, so that's usually the main choice.
16:50 - I think there is a sizable population of gamers willing to leave Windows if games ran better on Linux. I am one of them, and I've been arguing for a while that there will be a huge shift among gamers toward an OS that consumes fewer resources and performs better for game-only use.
Is there going to be 7800X3D testing amidst the testing of the 5000 and 3000 series CPUs? I really want to see how 24H2 vs 23H2 affects the 7800X3D before I do the update myself.
Re Linux, it'll be interesting to see if the recent announcement from Microsoft about moving away from kernel-level security solutions will make DRM and anti-cheat clients portable to Linux.
Ahhh! Texture quality in games! My problem with modern games is not really the texture quality as much as the use of shadows or recently raytracing used to create an image where I can't see the textures much less the player character or the opponents.
But you don't know how good it is, and you don't know what the price will be... Personally I expect it to launch at a higher price than is justified, as is almost always the case with AMD CPU launches. Say 5% faster than 7800X3D for gaming, and 30-35% higher cost.
Developers spend most of their time doing more of the things you can't see while actually playing the game. Manufacturers have built absurdly inefficient and expensive hardware to implement them. People pay through the nose for things that make no gameplay difference. Tell me, where is the sanity? Silliness of the week: the AMD Ryzen 9 9950X is selling out in places, and I don't think it's gamers buying it.
Well, it's a rendering beast. I have it on my radar for when the new generation of GPUs drops. Those nice, fast Zen 5 cores will eat up any 8K renders I throw at them without issue.
@@mondodimotori The problem was never the gaming performance. The problem was AMD's claims about gaming performance. If they had just touted efficiency gains (even if not exactly true) and better productivity performance, we would be looking at Zen 5 in a much better light. Instead, AMD acts like reviewers are lying or confused when we see only a 3% uplift in games. It's not a bad CPU at all, but expectations ran wild.
@@christophermullins7163 So something people have been doing since the beginning of time is suddenly a big problem with AMD? I invite you to go back and look at the pre-launch hype from YouTubers. Some of the most over-the-top hype came from the very YouTubers trashing the 9000 series post-launch.
Very good explanation of the texture/VRAM question by Tim. Excellent even. Many people think that the GPU only places textures on the screen, while ignoring the fact that so much computation takes place on the GPU, and feeding that from VRAM is muuuuch quicker than using system memory.
I just bought the Ryzen 9 7950X3D, and it took me a full day just to get it running properly. Core parking was a must, and I had to go through a special process. Now it's a beast.
The car example from GTA was an interesting one, though it doesn't come down to the VRAM amount per se, but rather the number of different things in the scene: different models (or, in more modern games, models composed of more separate parts) mean more draw calls, so more things to be calculated and more work for the CPU and GPU. The same applies to "texture layering", which is done by having multiple UV channels on models, so again more parts being drawn (the main model in its default state plus the same model with the second UV channel used to display the "conditional" texture or textures), which may reduce the polygon count but doesn't reduce draw calls, because models end up with more materials. Also, most textures are heavily compressed, and the standard container is DDS. You're very unlikely to see large textures shipped as PNGs, for example; the same 4096x4096 texture in PNG can easily be hundreds of megabytes in size but be 100 times smaller as a compressed DDS file. Just imagine having those huge textures without compression. Not even a 4090 would have enough VRAM if that were the case.
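To put a rough number on the compression point (note: the PNG-vs-DDS ratio above is about on-disk file size; the sketch below, with assumed formats, is about the in-VRAM footprint, where GPUs keep textures block-compressed):

```python
# VRAM footprint of a single 4096x4096 texture: uncompressed RGBA8 vs BC7 block compression.

def rgba8_mib(w, h):
    return w * h * 4 / (1024 ** 2)                 # 4 bytes per texel

def bc7_mib(w, h):
    return (w // 4) * (h // 4) * 16 / (1024 ** 2)  # 16 bytes per 4x4 block

w = h = 4096
print(f"RGBA8: {rgba8_mib(w, h):.0f} MiB, BC7: {bc7_mib(w, h):.0f} MiB (before mips)")
```

That's 64 MiB versus 16 MiB for one map, and a scene uses thousands of maps, which is why block compression (and the DDS files that carry it) is non-negotiable.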
One thing the Steam Deck has taught me is that there is no floor for how low I'm willing to crank the graphics down, as long as it hits 40 FPS. I feel that 30 FPS is an unplayable framerate.
Reputation suggests they've burned their customers, which hasn't happened. People are taking a pause and waiting for better products, most of them because of how great their current Zen 3 and Zen 4 products are treating them. Zen 4 will probably be a product that even 8 years from now, casual gamers (non-enthusiasts) will still be perfectly satisfied having in their systems running e-sports and previous gen games - This segment won't be interested in buying $90 AAA titles anyhow.
Regarding the VRAM discussion, textures have actually become a lower and lower proportion of overall VRAM usage over time. It's still substantial, but most memory gets used up by the seemingly endless number of buffers required by modern graphical rendering techniques. This is why it's so difficult for developers to adapt their games to run on the Series S and why modern games can still require 6 GB of VRAM to run in potato graphics mode. I saw a nice chart a while back showing all the different things using VRAM in a particular game, wish I could find it again.
Agree 100% that the time is not yet right for ROUTINE coverage of Linux gaming. But a one-off status review of using eg Linux Mint (as a widely-used, pseudo-Windows distro) as a general desktop environment and then for gaming would be great to see. My sense is that more and more people are unhappy with the direction that Microsoft is taking with Windows and that, unless you use very specialised software, Mint is generally on a par with Windows for general home/office work. Gaming is coming on apace but still a work in progress.
Perceived texture quality is difficult to separate from the quality of lighting, shadows, bump mapping etc. In older games, particularly ones with extra sharp textures, those might paradoxically have stood out more and led to a wow factor that is no longer there when the visuals in total look amazing. Back then, the textures often were the only thing that looked amazing, with subpar lighting, blobby, flickering shadows and far fewer polygons. Also, when you compare a slow moving game to one with fast movement, the slow game will always win out in terms of "texture-wow-factor" (provided the textures are high res and good). Conversely, some modern fast-moving games might actually get away with lower res textures?
Games have become more VRAM-hungry because shaders have become more flexible and complex. Older titles have a lot fewer individual maps making up a "texture"; most of the effects used to be baked into the diffuse maps. Tessellation/bump maps, material maps, reflection maps, specular maps, occlusion maps, pre-rendered shadow maps for low presets, etc. Games used to have all that stuff baked into no more than two or three image files; now it's all loaded separately, calculated and composed in real time. That takes memory.
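As a rough illustration of how those separate maps add up, a small sketch; the map list, resolutions and formats here are hypothetical but typical of a single PBR material:

```python
# Back-of-the-envelope sum of VRAM for one material made of several maps.
# Formats follow common practice: BC7 (8 bpp) for colour/normal data,
# BC4 (4 bpp) for single-channel masks. Values are illustrative only.

MiB = 1024 * 1024

def map_bytes(res, bits_per_pixel):
    """Memory for one square map of the given resolution, before mipmaps."""
    return res * res * bits_per_pixel // 8

material_maps = [
    ("albedo",            4096, 8),
    ("normal",            4096, 8),
    ("roughness",         2048, 4),
    ("metalness",         2048, 4),
    ("ambient occlusion", 2048, 4),
    ("emissive",          1024, 8),
]

total = sum(map_bytes(res, bpp) for _, res, bpp in material_maps)
for name, res, bpp in material_maps:
    print(f"{name:18s} {res}x{res} ~{map_bytes(res, bpp) / MiB:5.1f} MiB")
print(f"{'total':18s}           ~{total / MiB:5.1f} MiB (before mipmaps)")
```

Even this modest hypothetical material lands around 39 MiB before mipmaps, and a scene holds hundreds of materials, which is why it adds up so quickly.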
"It is little details that you probably wouldn't notice if you were playing the game normally." Exactly. Most of the whiz-bang details gobbling tons of the VRAM do not matter much to normal gameplay, it mainly matters to pixel-peepers. While actually playing, most of the fine details get blurred out by screen and eye persistence.
What is your opinion on the fact that AMD was not offering a fix for the security issue found in Zen 2, even though they are still selling those chips and even releasing them as new alongside the 7000 series, so they should be supported for as long as Zen 4?
Zen 6 is going to have some _interesting_ marketing considering Zen 6 will most likely have NPU cores (Both Intel and AMD are starting to use them and it'll be a Win11/12 thing) So it'll probably be a slight improvement to perf overall, but nothing spectacular and probably no core increases.
On VRAM: it also depends on which game mode you play within the same game. MW3 uses considerably less VRAM than Warzone/Resurgence because the maps are small. In an MW3 map you might hit 10 GB usage on extreme settings, while in WZ/Resurgence you WILL use all available VRAM for "open world caching".
I daily drive Linux, but I definitely see where you guys are coming from with that segment. I will have to push back a tiny bit, Tim. You mention installing software is harder on Linux, but if it works, I think installing through a software center is easier and better than opening a web browser, searching up the name of the thing, avoiding the malicious links that pretend to offer that software, downloading an installer and so on. It is just that you are not used to installing things that way, so the easier thing seems harder, but this is arguable. For my use case, I mostly do software development and occasionally play some single-player games, and getting the tools I use set up on Windows would be harder than on Linux for me, so in my case I find it easier to use Linux. For content creators, they are gonna try it and go "my Adobe apps don't work, this sucks", and yeah, sure, they are right and that's a problem. I just don't use any of those apps. But that's why you get a lot of negative press on YT when people try Linux, because, spoilers, content creators are not gonna be happy when content creator things don't work.
It's arguably easier to install and play a game on Linux than it is on Windows currently... that is, some games are actually far simpler to get up and going, while others require a tiny tweak here or there.
My understanding of XMP is that the memory modules have 3 or 4 profiles stored on them (in the SPD). The baseline profiles are JEDEC standard, so those at least are technically in spec; the XMP profiles themselves go beyond the JEDEC spec.
As someone who still plays a lot of older games, and a mixture of new games, with a top-tier PC on a 4K monitor, I can say that plenty of older games still look just as good if not better than many of today's titles. I still play Battlefield 1, Watch Dogs 2, Modern Warfare 2019 and COD Black Ops 3, and the texture quality in those games is significantly better than in most current games while using 6 GB at 4K max settings. What's dated in some of those older games is the lack of ray tracing, or pixelated shadows, or maybe just simplistic or flat foliage. The only recent games with textures that genuinely looked good, with great visuals all around, were The Last of Us Part 1 and Hogwarts Legacy, and they use 10+ GB at 4K Ultra. Sons of the Forest looks insane as well and I think uses about 8 GB. Resident Evil 8 looks damn near photorealistic and runs better at 8K than most games do at 4K with a 4090. There are some great-looking games, but texture quality is definitely worse now than it was 6-8 years ago in the average game.
I've been gaming primarily on Linux since November/December of 2017, and I think the contrast between back then (pre-DXVK, pre-Proton, pre-Lutris, etc.) and now really has people overestimating just how good Linux gaming is. If I was on an NVIDIA GPU I would go back to Windows today, and if I wasn't a power user I wouldn't use Linux at all, really. Even the distros that require little thought, like PopOS, eventually ask you to go to the command line (not necessarily in any highly complex or hard-to-understand way, but opening a terminal at all is a step too far for the "average user"), and games are often finicky enough that if I hadn't built up what are by now almost automatic tweaks, I'd run into games that just "don't work" significantly often. I think it'd be worthwhile to maintain a Linux test OS just to keep an eye on issues like shader stutter if that ever becomes a problem again, because for a long period Elden Ring was an objectively better gaming experience via Proton/DXVK (you can technically use DXVK on Windows but it's extremely cursed), since DXVK managed shaders properly on behalf of the game while the game itself failed to, causing the stutters on all systems. The main problem is that there isn't really a "stable target" for Linux either, so benchmarking is extremely difficult to do in a truly consistent manner. Phoronix is the obvious standout there.
If the Hardware Unboxed team decides to try Linux again in the future, I would recommend using KDE as your desktop environment, such as Kubuntu or the KDE spin of Fedora (not KDE Neon, that distro is more of a testing ground). It supports VRR, high refresh rates, proper 4K monitor scaling, HDR, and 10-bit colour. I would also recommend using Fedora as it's usually very up to date and very easy to install. I also believe there are gaming-specific user tweaks for Linux for optimal performance. But yes, I would not say Linux is for the average user yet. You still need a decent amount of troubleshooting skills to get all your apps running correctly.
About the VRAM usage question: Different games load different amounts of the game world at once. Optimise for less VRAM, and you may have more loading screens (or need to hide them with transitions). Also, you could reuse textures, if you have fewer different objects in a scene. Therefore, the individual texture could be of higher quality, when there are fewer of them.
18:00 - I have heard of warranties being voided by XMP when it came to Intel chips, although this was a long, long time ago, around 8th gen (Coffee Lake??), when Intel introduced their *'Performance Tuning Protection Plan'* (PTPP) extended warranty programme and were trying to strong-arm the enthusiast community into buying into it. I think they charged an extra $30-80 if you wanted to enable XMP or overclock their chips. But for most people it was easy to bypass if they had an IQ over 100... Customer: _"CPU isn't working, I've tested it in 20 other machines and nothing boots."_ Intel: *"Were you running XMP when the CPU stopped working in the original machine??"* Customer: _"XMP??? What's that?? I've never heard of it. I don't even know how to change a BIOS battery."_ Intel: **Authorises RMA**
Installing a program on Linux is not hard, you don't need to be an advanced user. You press the 'Windows' key. Click on the software icon. A window opens. Point your mouse at the icon of the thing you want to install and click install. Job done. I don't know why people spread outright lies about Linux. If you can click a mouse you can use Linux. If you feel adventurous you can install stuff by other means, but you don't have to. Just click on an app, choose what you want, click on an icon to install. Go to the app menu (Windows key), click on the icon to run the app. Yeah, it is rocket science. Keep spreading lies.
I think the X3D chips could be the saviour of Zen 5, given the efficiency gains of Zen 5 over Zen 4 there could be significant overheads for the X3D chips to take advantage of. Given that Zen 3 and 4 X3D operated on reduced frequency and TDP due to thermal sensitivity of the 3D V cache, if the Zen 5 parts can take advantage of higher clock speeds due to the increased efficiency of the architecture it could provide a notable performance boost for gamers and give Intel even more headaches when Arrow Lake launches.
Zen 5 will get traction. If Zen 6 fails to perform up to the hype, that is when we should really start to worry.
There weren't any reliability concerns with 13th gen less than two months into its lifecycle, either.
Zen 5 on Windows 10 is on average 10% faster than on Windows 11 24H2; Intel sees a small decline on Win 10.
In my opinion, AMD's new processors being "meh" is nowhere NEAR as disappointing as the Intel shitshow this year.
@@ericmatthews8497 Good for them to do that. If any company has an actual systemic issue that's degrading their products then obviously they need to step up somehow.
But your example is just a one-off. Sometimes shit happens and you get a bad sample and that can happen with any company. Could have been Intel just as well with their 3 year warranty at the time.
Also, the discussion is about stuff the companies have done this year. I'm not just hating on them because I like competition, but Intel's fuckup is on another level this time.
You're really downplaying how shit AMD has been this year. First saying their 5800XT was as fast as, in some cases faster than, a 13700K. Then going off saying Zen 5 will be better than 14th gen, another lie. Then blaming reviewers by saying their testing was flawed, while also admitting they purposely nerfed Intel CPU performance to make Zen 5 look better. Yeah, Intel fucked up by keeping quiet on the whole thing, but at least they're not straight up lying and spitting in your face like AMD is.
I work as an animator, so just to touch on the texture stuff: to make a realistic texture these days there are multiple layers in a texture that make up a shader. You have the texture itself, a normal map to give it depth, a bump map to make it more texturised; these days you also have subsurface scattering and multiple other things, and it all needs to load into VRAM. It's a lot.
Tell that to the people who are filled with nostalgia and try to tell you 10 year old games "still look modern".
@@Chrissy717 they need to take the beer goggles off..
@@Chrissy717 If you cherry pick the artistically best looking games from 10 years ago, they still hold up super well. I think that helps to explain what they're thinking. The graphics may be less detailed and less sophisticated in many ways, but art is pretty subjective, and it's clear that it's possible for an older and less detailed looking game to still "look better" to someone. Of course, to generalize that "games looked better 10 years ago" is still a pretty dubious generalization. Definitely there were lots of bad looking and artistically poorly designed games back then. The two worst artistically designed games I've ever played are both over 10 years old now.
While many things in artistic design and graphical design are subjective, there are a lot of things which people are likely to agree upon when it comes to what types of art styles are better, especially from a gaming standpoint, where making different types of things easily recognizable, and easy to tell apart from each other, can be very important and valuable for making the gameplay itself better, for example. For me, one of the worst offenses is to have unimportant features with no relevance to gameplay and minimal to no relevance to the narrative being more attention grabbing than information that's important to the game's objectives or to the game's narrative.
@@Chrissy717 There is a technical aspect and an artistic aspect to things. What you just said is silly and ignores what the guy actually said.
People are acting like Zen 5 is the worst thing ever when it's just mid, despite the marketing mishandling. Intel did it for years after Sandy Bridge. One underwhelming generation isn't such a reputation destroyer. Intel literally mishandling the 13th and 14th gen disaster is so much worse and completely destroyed any kind of goodwill from customers. That's a much bigger deal imo.
I mean we criticized Intel back then as well. Only fair to criticize AMD for it.
@@brolo7234 Yeah, and people still bought Intel anyway. It's not really something that damages your reputation. I'm not going to upgrade from my 5800X3D anytime soon, but sure as hell I'm not jumping ship because of Zen 5, especially when Intel is so much more unreliable and anti-consumer.
@@brolo7234 The thing is that Zen 5 performs well outside of gaming. People pretend like gaming is the centre of the universe, and that goes for Intel systems too. Recently they've released pretty good workstation and server chips. Zen 5 was simply optimised for datacenter and workstation workloads, and it shows. I don't know why AMD bothers with non-X3D chips for consumers, since the X3D parts are exactly what those customers want.
@brolo7234 If we don't have the power of criticism, they will never improve or change, and peers wouldn't know any better for all camps 🫳🎤!
@@roccociccone597 Normal, because if you want to see praise of Zen 5 being good in productivity, you have to go to a channel or community that does productivity specifically (forgot the word for it in English). But it's a bias, because people only care about what they want, not the overall picture.
But their marketing is really a main problem.
I think people forgot this is a Q&A.
I'd like to see the 5900x or 5950x tested on 24H2 since they're two ccd chips.
Hardware Canucks have tested the older Ryzen generations wrt the Windows patch/update. Go check it out.
TL;DR: The X3D parts tend to benefit the most.
Win 10 vs Win 11 24H2, specifically, as many of us haven't infested our systems with that OS yet.
Honestly, while Zen 5 was a bust of a launch, AMD would need to completely drop the ball with Zen 5 X3D to lose ground to Intel.
They already did, unless they drastically changed the design and pull off a miracle. The 9800 part is barely an Intel-style bump, like the decade where they fucked everyone by holding back performance. It was not a terrible product, just not gains like they had in every launch prior, and they overpromised, making the numbers look much worse. Look at the difference between the 5800X/7800X and their X3D parts and you can see where it will line up, and it is not going to magically save this launch unless they underprice it, which I don't see happening. Even then, compared to prior launches it is disappointing. They still have not done the simple price drop that would explode sales. Remember, the launch wasn't bad because the part was bad, it was just a shit value.
They will price it stupidly high because they have shipping containers full of Zen 5 that no one wants.
which will just create another yard full of 3d chips no one wants.
AMD is smart like that.
Zen 4 is selling better than ever and no one buys Intel anymore. Zen 5 was underwhelming, but all they have to do is lower prices.
Intel has to prove they can offer actually working CPUs for 12+ months before I'd consider them again.
@@Bound4Earth Yes, keep believing that stupidity, when Zen 4 X3D is literally 20% faster in gaming than its own normal Zen 4 model, while also having 10% lower clocks and lower power consumption.
I think people will be disappointed in the 9800X3D. The areas where Zen 5 DOES actually perform better than Zen 4 are all places where 3D V-Cache is useless. And we've already seen that overclocking does squat for X3D, so I wouldn't hope for a whole lot of gains there either. Unless X3D is somehow NOT simply Zen 5 with V-Cache added, I wouldn't expect any remarkable uplift over the 7800X3D.
No one will call Arrow Lake a failure if it doesn't get a win over X3D, but _everyone_ will call AMD a failure if it _does._
Let's hope I'm wrong. Or maybe not. I'd be happy to see X3D get out of the $400s because that's WAY too much for an 8 core in 2024, let alone 2025.
Any upscaling = not 4k. Don't care what flavor of upscaling it is.
True but using dlss quality on a 4k LG C2 is getting hard to distinguish at a normal sitting distance.
@@JagsP95 still not 4K
@@JagsP95 That's NOT 'true 4K' - please don't short-change yourself. Upscaling has its traditional 'good' uses, of course, but when Nvidia markets upscaling and fake frames as an xx90 card 'feature', you know they're f#ng with us, i.e. those high-end cards are not so high end without that fake sh#t.
@@ChrisM541 I really don't care how they put those pixels on my screen. As long as it looks good then that's all I care about. It's all computer generated anyways. DLSS Quality on my 4K tv looks indistinguishable from native 4K. DLSS upscaling alone makes Nvidia worth it. I'm happy with my 4070 Super.
I doubt you will find many games without any upscaling. Even if you set the output at 4K, there’ll be upscaling for all sorts of graphical effects, including reflections, global illumination, textures, light shafts. These can always introduce aliasing. We even talk about “high resolution shadow maps” and stuff independently of the game’s stated render res.
When you use DLSS, they don’t decrease the resolution of all graphical effects either. Some might still be running at 4K, like motion blur and depth of field. And the UI of course.
And you know that native resolution isn’t the holy grail. There are benefits from downscaling. Ray tracing sometimes needs to be sampled at much higher than the game’s stated render resolution.
4K ain’t 4K, basically. It’s more complicated. To make good-looking games, there are compromises.
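For a sense of scale on the internal render resolution side, here's a quick sketch using the commonly cited per-axis scale factors for the DLSS presets; the exact ratios can vary by game and version, so treat them as approximate:

```python
# Internal render resolution for a 3840x2160 output under common DLSS presets.
# Scale factors are the commonly cited per-axis ratios; UI and some post
# effects typically still run at the full output resolution.

output = (3840, 2160)
presets = {
    "Quality":           2 / 3,
    "Balanced":          0.58,
    "Performance":       0.50,
    "Ultra Performance": 1 / 3,
}

for name, scale in presets.items():
    w, h = (round(d * scale) for d in output)
    pixel_share = (w * h) / (output[0] * output[1])
    print(f"{name:17s} {w}x{h}  ({pixel_share:.0%} of output pixels shaded)")
```

So "4K with DLSS Quality" shades roughly 44% of the output pixels each frame, and Performance mode only 25%, which is exactly why the "is it really 4K" argument keeps coming up.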
I've messed with Linux every so often, in dual boot or VM scenarios, for 33 years, but I'm closer now to dumping Windows entirely than at any point before. I don't care about the genre of games that make use of anti-cheat. With that in mind, I'm really not sure why I'm still using Windows today when I could be using SteamOS or something else for gaming.
Maybe because on AVERAGE gaming performance is lower, and there are far more issues?
I think Nobara is worth a try. Fedora needs a bit of fiddling, which is alright for me, but Nobara is supposed to do it out of the box.
100% with you, I’m now dual booting Nobara KDE just for gaming and Plex and really liking it. Just steer clear of any Arch distros
@@photonboy999 Obviously you haven't been keeping up with the news. Since Valve has been normalising games on Linux, it's an entirely different experience these days.
I'd have switched already if the office suites available on Linux were better. That's literally the only thing holding me back
What is there to repair? It's not like these processors don't work, or rapidly degrade. Yeah, it's a small upgrade for gaming, but there are many more uses for a CPU. Intel did 5% performance increases for all uses from Sandy Bridge to the 8700, when they finally gave us more than 4 cores/8 threads on mainstream desktop, and nobody seemed to mind.
No worries, it's just HUB and their clickbaitey vid titles. Ofc AMD is doing fine despite Zen 5 not being that much of an improvement. They are not forcing Zen 5 on us and we can happily live with Zen 4. Now if another gen gets such a small improvement too, then I start to worry.
The main issue was not the actual processors was it?
maybe for lying constantly???
More like they need to "repair" their marketing with the unrealistic performance claims they're making.
@@rtt4163 As if other tech companies don't do that on the regular. That is why people come to watch channels like this in the first place, to compare claims with actual tests
Yes, in Brazil Intel denies warranty if you use mem frequency above 5600MT. It's the first question they ask you when applying for warranty. And then you're blacklisted, you can't reapply for warranty for the same S/N of the cpu.
Sounds like a scam
Sounds like a lack of consumer protection laws.
What does it do to the stats if I watch the whole video, leave, then remember I forgot to like it, start it again, like it and then leave? I do that so often.
me, every single time i watch from the tv app
Don't worry about it
@@tanmaypanadi1414 because...
The VRAM problem is that new games put MORE types of data in GPU memory than just textures and meshes, like distance fields and virtual shadow maps, or more G-buffers to make better materials and effects.
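A back-of-the-envelope sketch of what those extra buffers alone can cost at 4K; the target list and per-pixel sizes here are hypothetical but typical of a deferred renderer, and this ignores textures, meshes and other persistent data:

```python
# Rough per-frame VRAM taken by render targets alone at 3840x2160.
# Bytes-per-pixel values correspond to common formats (RGBA8, RGBA16F, etc.);
# the exact set varies per engine, so treat this as illustrative.

MiB = 1024 * 1024
W, H = 3840, 2160

render_targets = [
    ("G-buffer albedo",   4),  # RGBA8
    ("G-buffer normals",  8),  # RGBA16F
    ("G-buffer material", 4),  # roughness/metalness/AO packed
    ("depth/stencil",     4),  # D32 + S8, approximated
    ("HDR scene colour",  8),  # RGBA16F
    ("motion vectors",    4),  # RG16F
    ("bloom/post chain",  8),  # several smaller buffers, approximated
]

total = sum(W * H * bpp for _, bpp in render_targets)
for name, bpp in render_targets:
    print(f"{name:20s} ~{W * H * bpp / MiB:6.1f} MiB")
print(f"{'total':20s} ~{total / MiB:6.1f} MiB")
```

Even this short hypothetical list lands around 300 MiB before a single texture is loaded, and techniques like virtual shadow maps or distance fields add their own pools on top.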
Did you listen to the video?
I don't think AMD's is as damaged as this video title suggests. I have a 7800X3D system and a 7700X system. Zen 5 isn't an update for existing Zen 4 owners but for new builders it's great. AGESA 12.2.0.2 and the Windows 11 scheduler fixes add quite a bit of performance versus their launch. HUB will hopefully test the new AGESA in the upcoming weeks, possibly during the 9800X3D launch.
AGESA changes don't affect the 9700X/9600X though, as those are single-CCD processors. If those didn't see big generational changes, the dual-CCD ones won't either (and the benchmarks done so far suggest it's a 1% difference on dual-CCD CPUs).
The title doesn't suggest AMD is damaged as much as you say it does... it's not the product that is bad, it's the way they tried to sell us the product. Stop white knighting AMD.
It was always going to be hard for Zen 5 to be impressive because Zen 4 is so darn good. If Zen 5 is a bust, it's because of where our expectations got set. Pushing a datacenter-focused CPU out to gaming reviewers is just dumb. But I've said it before -- look who's in charge of AMD's marketing department.
@@keldon1137 AGESA 12.2.0.2 has already been tested and it indeed improves performance. I'm not making this up. My comment is based on facts. HUB and GN will hopefully get around to testing it but it just released and is still in beta UEFI form. They, rightly, don't typically test betas.
@@AshtonCoolman 12.2.0.2 aims to reduce cross-CCD latency, meaning it only affects dual-CCD CPUs (not the 9600/9700). In the case of the dual-CCD CPUs (9950/9900), from the benchmarks I've seen it's within a 1% improvement.
Zen 5 is rather impressive for most of the things that matter (scientific / HPC computing, technical computing, data center infrastructure.... all of which primarily run on Linux). Reviewers on TH-cam tend to focus on Windows / gaming performance, which is one of the most boring areas of computing at the moment, IMO.
Which is exactly why IDGAF about X3D. I still care about gaming, but gaming isn't CPU intensive in the real world, and the i7 absolutely curb stomps it in production for less $$. AMD needs to give mixed use users a lot more love.
Zen 5 was not advertised by AMD at CES for scientific/HPC computing and that other stuff; they showed benchmarks for Windows tasks and games, and of course they lied a lot!! The main buyers of desktops are not people doing the stuff you mention, but gamers and people with Windows productivity workloads.
I guess only Intel's reputation took a very harsh hit. AMD coming up short of expectations is not a big deal, and their handling of the PR was not great, but far from the worst that can happen. Nvidia using misleading product stack names is not news, so I guess it's business as usual, but the 13th and 14th gen issues and the poor PR handling have been a total failure.
9:31 "right amount of vram"
basically will only work on 5090, other peasants with 16 gb or less won't get anything
Linux gaming seems to be able to play like 60-70% of the games on Steam, mostly at a similar level, sometimes higher if we look at Wendell comparing Shadow of the Tomb Raider. But yeah, you have to keep in mind that there's a 30% chance your next game might not work, so Linux gaming can be good if you just dual boot and play the games you can't run on Linux on Windows instead.
in my experience it's only games that literally want to run malware (kernel level anticheat) on your system. I cannot comprehend how compromising your entire system just to play some online game is worth it.
@@roccociccone597 Well... people do approve of this, and others use kernel-level cheats to get around "weaker" anti-cheats, so these are a response to what cheaters were doing. And still, the two "best" cheating methods now are AI and two-PC setups forwarding inputs, so yeah, I kind of can't blame developers for putting those anti-cheats in.
@@1Grainer1 Any perf differences? Not just AM5, but I'm guessing "it depends".
@@roccociccone597 Aren't there still plenty of features missing, like frame gen, DLDSR, HDR? I've heard that even basic stuff like VRR gets broken often on Wayland, so it's definitely not just anti-cheat.
@@garrett3117 Zen 5 seemed to have way bigger differences on Linux, more or less a 10-12% uplift; that's why a big part of the audience points out Linux testing so much. As for the built-in Shadow of the Tomb Raider benchmark, it goes from 270 to more or less 320 fps between Windows and Linux, but in other games it can be the same as or lower than Windows. So yeah, it depends; you can mostly say it's very comparable if the game can run.
AMD will be just fine unless Arrow Lake gets a significant performance increase and runs somewhat cool. On the other hand, I strongly believe AMD did not change the architecture as much as they claimed, just increased IPC like the Raptor Lake refresh, and wanted to sell you a CPU while everybody has a negative opinion of Intel. Both companies are bad; you just choose the lesser evil.
How would that happen if people generally liked Intel up until this year? Zen 5 was being created when Intel was on the comeback and was doing better. You know CPUs typically take 3-5 years to make right? You also realize that there usually are separate leap frogging teams working on Zen so for example, one team works on Zen 2 while another is working on Zen 3 and then the Zen 2 team switches to Zen 4 when they are done and the Zen 3 team works on Zen 5. At least that's how AMD was doing it in the past when they wanted to make a completely new architecture every other generation. I've heard they will stop doing this because it's getting increasingly difficult to do. Zen 6 really should be their next great architecture if things work out due to the big latency decrease from stacking the IOD and CCDs together according to rumors and leaks.
I am seriously considering ditching windows when W10 becomes obsolete as I no longer trust windows as a product. I will be installing a few different distros on my secondary computer to see what works for me.
Already did. Linux has been great so far. Just a learning curve because it's a new animal. The only problem is that some anti-cheats that are kernel level won't work, because Linux won't allow that level of fuckery with the OS.
I've been talking with a friend who dualboots Windows 10 and Linux Mint, and a lot of the things I need from my daily driver seem to be covered on Linux thanks to the Steam store and emulation. It's really just day 1 new game releases I'm a bit concerned with, but Linux seems to have come a long way since last time I looked around 2016.
It’s not that Linux doesn’t let anti cheat runs on kernel, it’s that is so open that you could see what actually does that anti cheat and what info it gets
W10 is 10% faster for Zen5
A good starter distro is nobara. It seems fairly good.
AMD makes 1 mistake and we're all over them. Intel made 2/4-core chips with not much innovation for over a decade and we were like "yeah, ok".
AMD had 8 core chips at the time. Well actually they weren't really 8 cores, because the cores shared resources with each other. A sly move by AMD. Unfortunately they got outperformed by dual core i3's from Intel.
Honestly, were you even alive during the AMD FX vs Core era?
yea cause Intel was still better, that's completely on AMD being shit
@@General_Li_Shin Could also be on Microsoft, helping their buds at Intel by not optimizing for AMD. Bulldozer CPUs worked much better by the time the first Ryzen CPUs hit shelves.
@@General_Li_Shin I think the only processor they made that was better for a short time was the Athlon 64 chip?
@@nimrodery You're completely out to lunch. This conspiracy that Windows purposely nerfs AMD is so tired.
Bulldozer shared FPUs between compute units. It was destined to be lackluster from the get go.
A small suggestion re Linux benchmarking: many games on Steam are windows games that will run on Linux, perhaps with adjustments. But there are also games that, on release, have both windows-native and Linux-native versions. Viewers might find it interesting to see a performance comparison of those two native versions run on identical hardware, you would just need to swap out boot drives to do the testing. You would, of course, keep things like video drivers up to date. And you aren't going to test multiple Linux distributions, just one distro as a kind of standard candle for comparison. Results might be interesting.
Would be interesting. I know from personal experience the War Thunder and Cities Skylines (1) linux clients are dogshit, and usually run better through proton anyways.
Can we talk about how bad lighting has gotten over the last 5-7ish years, or how much model detail has suffered? It's like with film or TV CGI. Feels like executives are pushing the dev time to the limit with open-world BS. I'd rather have a game look like something from 10 or even 20 years ago with better physics and lighting than what we have now... Moore's law has caught up with the executive class, but they are in denial...
I got the 9950X, upgraded from 5800x3D, and gotta say, I'm absolutely impressed 😎
For $650 you must be, but if you bought the 9950X over a 7950X or a 14700K/14900K you would not be impressed, more like depressed.
@@11ESSE111 not really. I'm really liking it and I don't care for a few percentage points difference between them.
@@myhnea92 you must care for bigger difference in pricing while there is no some difference in performance so
@@11ESSE111 again, not really. I just went for what I liked and that's about it 🤷🏻♂️
@@myhnea92 You could then go for some $10,000 server CPU and just say you liked it.
X3D gives them basically a cheat code for gamers. Unless Intel can figure out a way to counter the extra cache, AMD is always going to be on top.
@@RobloxianX well well well... be in for a surprise
The solution is having faster RAM with lower latency. We will see Zen5 with DDR5 6000 going against Arrow Lake with DDR5 10000.
Maybe they should first figure out a way to not have their chips nuke themselves in a couple of years.
The latest Xeons have more cache; they're now halfway between AMD's normal parts and AMD's X3D parts. That indicates they're probably going in that direction.
Intel is gonna put more cache… problem solved.
I've tried Linux several times over the years but it's just not a realistic gaming platform. Every time I get a diehard Linux fan trying to convert me all I have to do is ask them if they would recommend Linux over Windows to their mom or grandma. Until that answer is an honest yes (which it never is) then don't try to convince people they're better off spending hours on GitHub trying to get the shit to even run rather than actually _playing_ your games on Windows at 5% lower framerates.
Even Wendell, the champion for Linux says don't use Linux if your primary use is gaming.
Zen5 is better than Zen4, be it that it's slight, but 11th gen actually went backwards compared to 10th gen.
Yes, by 2%, but it costs 30-50% more and that is shit!!
On the VRAM question, I recently upgraded to a 6800xt and I noticed FF16 was putting me at nearly 15GB of VRAM usage because I was running 4k, high textures, with frame gen. Without frame gen it dropped to like 12GB. If companies (read:NVidia) want people to be using this tech then people need to have the VRAM to do it. So glad I went with a 16gb card.
Ray tracing isn't an important or must-have feature. Nvidia has convinced a lot of people that ray tracing performance is more important than standard rasterized performance, but the fact remains that the performance cost of ray tracing even on Nvidia graphics cards is extremely high relative to the actual improvement in visual quality that you can get from it. Suddenly a lot of Nvidia fanboys have gone from saying that having the highest FPS is important, to arguing that having half the frame-rate with ray tracing enabled is better.
Some scenes can even look WORSE with ray tracing enabled, and often they just look different but not necessarily objectively better. Ray tracing does seem like a promising technique, and it seems like it's likely to become very common in the future as graphics cards become increasingly better at it, but games still need to be designed to run with ray tracing turned off, or at least with the option to turn ray-tracing settings down low enough so that the performance hit is less than 20% rather than closer to 50%.
I have played Cyberpunk 2077 and Control (among a few other games) with an RTX 3080, and did a lot of testing with ray tracing turned on as well as turned off, so I have a lot of personal experience with ray tracing in games where it's been well implemented.
@@syncmonism While the 3080 is a solid card, no doubt, it's far from ideal for testing ray tracing/DLSS in Cyberpunk. The fps hit you get from ultra settings with ray tracing is definitely too much in that game if you're using a card without frame gen, but with frame gen, ray reconstruction + DLSS Quality is pretty spectacular in my opinion.
I agree that nvidia shouldn’t advertise fps numbers based off of dlss or frame gen, but we can make that statement without downplaying the tech that these newer cards have. I’m no nvidia fanboy either, I love my 7800x3d, but my experience with ray tracing, path tracing, and dlss has been pretty great in single player games. We both agree that nvidia has a bit of a ways to go to make upscaling quite on par with rasterization, but they’re getting pretty close in the past few years, and the doors that dlss can open up are very cool
I remember when I first played MGS2 on PS2 and thought the game looked so realistic. I remember that black chick with the gatling gun, Fortune, and I remember thinking back then how state of the art that game was.
Watching this on Nobara Linux (KDE) while watching TH-cam & gaming; that part is exactly the same. Actually, I think the alt-tabbing goes a bit better. I don't play anti-cheat games that are kernel invasive; certain FPS titles won't run (maybe they require a VM or a separate Windows dual-boot setup). I'd suggest dual booting for those looking to try it. My main game runs on Linux, so I've been clean for a few months (I still have a Windows 10 LTSC install on another machine in case I want to stab at/sort through my NTFS drives quickly, or bench/playtest a machine).
Also Nobara here, but the GNOME flavour (even though I'm considering switching to KDE), and most of the problems I had were because, with a Microsoftified brain, I was banging my head trying to find a solution that was just 2 or 3 clicks away in the settings. Even the terminal is a lot more accessible now, but not required. Of all the things he could have said about Linux, he chose to go after installing apps, which is as easy as using an app store. This is a serious case of bad fame. People are afraid of what they don't know.
Wendell recently tested Linux vs Windows 11, some games are faster and some are slower.
@@lolilolplix Wendell also said you can't just compare Windows against Linux in that regard because there are fundamental differences in the way they process things.
@@lolilolplix No they are all slower, the problem is there is no linux native software, everything is emulated. This is why it is so bad.
@@jamegumb7298 You clearly didn't watch the Level1Techs video, as some of the games did better on Linux. Also, Wine literally stands for "Wine Is Not an Emulator"; there is no emulation when running games on Linux. All Wine and Proton do is translate the Spanish of Windows commands to the French that Linux understands. It's like the 3DS "emulating" the DS and GBA, where it's not emulating anything at all and just running the code natively because the hardware is similar enough. Linux and Windows run on the same hardware, so Linux doesn't need to emulate anything. And translating Windows commands to Linux isn't like having to stop and look everything up in a dictionary; it's like being bilingual, where you already know the language well enough to hold a conversation but you haven't learned every single word yet. Every update to Proton and Wine is just learning more words, so fewer mistakes get made.
Can you add Minecraft: java into your game testing? I know it's not a "new" game, and it isn't very demanding, but it's still popular and if you crank up the render distance and add mods/shaders, it does become quite demanding.
It's demanding mainly because Java is dogshit though.
I think “time to load” some suitably large mod pack could make a decent benchmark.
ofc great textures and low VRAM usage is reference to Skyrim, it's so obvious
but really, i think it's most probably going for Arkham Trilogy, be it City or Knight, idk
I think Tim should stay away from Linux questions until he's made a decent effort to use it in a modern setting. :) The biggest issue for Windows users is that Linux is NOT a singular experience, and as such it can be confusing to get into. Valve needs to release SteamOS so that the average gamer has a singular introduction to gaming on Linux. Otherwise there are so many distributions to choose from, with different desktop environments, software, window compositors, kernel release schedules etc. (LTS vs rolling releases, impacting driver release speed). I have used Manjaro ("Arch", but made for people). It has a rolling release schedule, so you have access to the latest kernel, hence the latest drivers. My experience has been that it is:
- WAY easier than Windows (no need to manually download/install drivers, applications etc), you just update the OS, or use an "app store"-like interface for software packages.
- Less resource intensive (0.5-1GB of RAM)
- On AMD APUs games tend to reach higher FPS due to more efficient memory handling hence better bandwidth.
- Has better support than Windows for "legacy games" since Proton is more robust than Windows own compatibility layer.
- Plays 98%+ of my 350+ single player collection, I even have some older games that don't work in Windows without manual mods that do work in Linux.
- Plays 50% of all anti-cheat games (total).
- Has support for the majority anti-cheat solutions. But developers often don't enable it due to increased demand on their support team.
- AMD just works better with Linux as of right now, as an RX6800-owner, it has been a flawless experience. I know a lot of people swear by Nvidia even in Linux, but my experience has been that driver support is lagging behind AMD, especially in terms of recent features like HDR, the Wayland compositor etc. Especially on older GPUs like the GTX-series.
My thinking is that Linux is more interesting for HUB to do an "early look at", once it reaches 5% market share and there's a "go to" gaming-oriented distribution for the majority of gamers can try out for themselves. And instead of calling it "Linux", it might be better to call it out by the distribution's name, like "SteamOS". Because It's not like we are saying "NT Core" instead of "Windows 11" or "Windows 10", or how we don't say "Unix" instead of "MacOS". For normal people the OS is the GUI, the user experience, and how the general OS is assembled. Linux is just the core to a very diverse array of OSes. So calling it "Linux" in itself highlight a communication issue for "Linux gaming" in general, and that's also why you can see such a varied amount of different experiences using the "same OS".
Average "Linux" User.
@@Hardwareunboxed Well, I've been a Windows user for 30 years. I just gave Linux a fair shake recently and read up on it... Once I did, I realised that there is a lot of miscommunication online, since people treat Linux like a "singular OS experience" when it is not. Linux is a terminal OS, similar to Unix. But you don't say "Unix" when you are referring to macOS. You don't say "NT Core" when you are referring to Windows XP or Windows 11... To the end user, UI/UX is what defines the OS... So the first step for Linux to gain popularity is for someone to create that singular OS experience for gamers. Valve will probably be the ones to do it. Otherwise we are stuck with this endless loop of people "trying Linux, and it sucks", when in fact what sucked about their experience (through no fault of their own, except not reading up) was that they chose a Linux distribution/OS that didn't cater to their needs. Many people choose LTS distributions, but most of the time they would be better off with rolling releases, especially if they dabble with new modern hardware frequently.
@@Hardwareunboxed I totally understand why you don't want to benchmark games in Linux, but I really appreciate it when CPU/GPU reviews include some Linux benchmarks for Blender. As far as games go, Windows comparisons are a fine rule of thumb to understand relative power but Linux in Blender is such a huge difference. I know it's never been so complex to benchmark components, and that your audience is mainly looking for games performance.
You make a really good point here. Whenever someone tells me they "tried Linux" my first thought is "which Linux?" Some distros are harder to learn and use. Some distros are marketed as easier to maintain than they actually are. Some distros suffer from being too small to be easily supported. Some distros are just plain bad. A lot of people bounce off because they don't expect to have to (re)learn so much for basic usage, and picked a distro that doesn't make that easy.
On having a "go to" gaming distro - say a desktop version of SteamOS (more specifically, one maintained and promoted by Valve in an official capacity) - would really help with solidifying. Linux users tend to be really opinionated on their distro (and DE) of choice, but newcomers might not be. Having an OS for Valve to say _"I hear you're sick of Windows, but worried about gaming compatibility, well here's the one we use ourselves for testing; this is the most approachable and compatible distro you can choose for gaming."_ - could really consolidate those newcomers unsure where to start. And in turn that makes it easier and more appealing for devs to support compatibility with that distro, which in turn helps the rest of the Linux ecosystem. The immutable model Valve is using for SteamOS on the Deck puts them in a really good position to do this, making it far easier to target than even other "stable" distros.
As opposed to who? Intel who had two generations of high end CPUs cook themselves? Does reputation even matter?
Consumers only care about what you can do for them today. Listen to comments online and everyone hates NVidia. What does everyone buy? NVidia.
Reputation matters, but not in a mono-/duopoly. A while ago, I was explaining how AMD is probably better value-wise at midrange to a younger relative only to be met with "Who's this Sapphire? Never heard of them, I'd rather just buy a brand I trust in, like ASUS".
So while they were rather accepting of the notion of driver issues being in the past for AMD and willing to give them a try, the issue of choosing between many board partners can be a dealbreaker. And this notion can go extremely strong; people get Samsung AC units which are meh at best just because Samsung's done right by them with other appliances in their house. If the price difference is 30%+ for more or less the same product, consumers start looking the other way, before that? "I'd rather overpay but be sure it works well".
Yes it does when AMD is already looking like Intel by gouging prices when Intel just fucked up. They didn't even wait to drive up prices while underdelivering and overpromising. This is Intel behavior of old and should be shit on. Fanboys like you should stop excusing shit behavior especially when AMD is being anti-consumer. What if they repeat next year, would you finally say this should be criticized and try to fix it then? Because if AMD cared about the consumer the price would have dropped and sales would have exploded. But they value money more. Even with the slower sales I think they are making much more money then they did last year because Intel fucked up so badly. Hence no price drop and why you should be critical of AMD.
You act like AMD has been amazing for decades. Yeah I switch when they are good but if they continue this, Intel might get me back if they deliver a CPU that blows AMD away in a few years and just works. I stick to companies not trying to f me over. AMD is no longer that company to me. They stopped being that last GPU launch.
@@Bound4Earth Unhinged much? I'm not really on the market for a new system right now, so I don't put that much stock into it either way. 10/10 way to start with baseless personal attacks. Last I had an AMD GPU was over 10 years ago, probably 80%+ of my/relatives home systems have Intel CPUs, yet idiots like you are quick to point fingers.
Overpriced products do little reputational damage. Under-delivering is another thing; I must've missed the hype AMD allegedly did for this gen (there's been a singular promotional piece I'm aware of, and that's not exactly hyping it up). I'm also aware of bad benchmarks - not sure if incompetence or malice, probably both. To me personally, neither of big hardware companies suffered a substantial reputational loss because of the way they've handled it. Don't see myself avoiding Intel purely because they've ran into some bad streak in manufacturing or AMD because they released overpriced products.
Systemic issues, such as with Gigabyte PSUs, are a big deal. You can bash Intel/AMD/NVidia all you like, but they by necessity have been somewhat competitive.
@@Bound4Earth Well aside of being locked to a given chipset there's not much to switching sides. GPU side is more capricious, NVidia had me for a long time because of the work applications. But then having great work and game performance on consumer hardware was no longer a thing, AMD has improved... I do happen to care about power efficiency, VRAM, Linux support, so there's also that. So I guess it's the exact opposite: my viewpoint is not of someone who's considered AMD awesome for two decades, but rather been mostly using Intel/NVidia with the occasional dip into AMD. And at least NVidia has been failing to meet my expectations for the past few years.
If NVidia straps enough VRAM to a somewhat affordable GPU, I'd probably still go with that. If not, the next upgrade is going to be AMD.
Brand loyalty is largely just accumulated experience of various pain points (and expectation of such). If it just works, might as well grab whatever goes on a good sale first.
But back to the original point - even at its worst, AMD has somehow survived the ordeal, being locked in a duopoly. In part, the government won't let it die entirely because they'd rather not have a monopoly. Someone like WD may spin up a flash memory division or wind it down. And it is this market in which the brand loyalty matters most.
On the linux gaming craze which has blown out of proportion imho: I'm a linux user. I don't care about benchmarking games on linux. There is almost no way of getting clear data like 1% and 0.1% lows etc. It's not worth the immense effort to try and run benchmarks for the low single digit userbase. Modern games USUALLY run a bit worse than on windows. One huge plus (at least for me) is that I can get older games running, which newer windows versions won't even try to install in most cases.
Another point, there are a lot of factors that impact the performance. Your kernel version, your wine (proton) version and their interaction. Your driver choice (whether it's "official/unofficial", yes that's a thing on linux). If you choose to benchmark one specific combination of all of this, you will be benchmarking a use case for let's say 10% of linux users, which is 10% of 2% of userbase. In other words, practically useless. /rant out, ignore grammar mistakes please.
I am over 30 years old. I have been rooting for Linux for over 16 years or so. Unfortunately, I have yet to see Linux dominate in any significant way (besides servers). While it has become much better to use, 'normal' people are just going to use Windows or macOS. I use Windows ONLY for gaming and macOS ONLY for work stuff. I have a Raspberry Pi as a home server. So, you know, torrenting for easily sharing free manuals for washing machines and so on, local file sharing, DNS caching. Unfortunately, most newer games have their own launchers, and some of them have heavier anti-cheat software that just cannot work in tandem with a Linux environment, so many e-sports games just can't run on Linux.
I'm really confused by this whole 24h2 thing. If I'm on Windows 10 should I download Windows 11 or not?
Has there ever been a poll of the HUB audience about how they use their computers in the first place? I'd really want to see it.
Ideally, as a multiple choice of:
- Gaming
- Listening to music
- Browsing the internet [more than/comparable to mobile]
- Watching movies
- Lightweight work [thin client/remote desktop, editing office docs etc]
- Computationally heavy work [rendering, engineering simulations, data science...]
- Data storage.
Every time I hear "gaming only" it seems like I have a fundamental cultural misunderstanding, some parts that keep being important to me are clearly omitted, and there sometimes is a laser focus on details I keep finding minor or even irrelevant.
Textures and texture maps.
Early games introduced bump mapping. A normal map (a texture map) is used to store information about the surface irregularities (the bumps). When light hits the flat surface from a direction that reflects toward the eye/camera, you can render the light reflections using the normal map's bumps. It makes the surface seem to have more detail, so it no longer looks flat.
You can use texture maps for all kinds of purposes: reflectivity, transparency/translucency, you name it. All the properties of a surface are defined in texture maps. What people usually consider "the texture" is just the colour data.
Texture maps are basically extra data values for every pixel in that texture. The data can be more abstract, like timing info. An example is a window hit by raindrops, where the drops seem to appear at random locations and even drip down (not all at the same moment). Such a texture map holds the differences in time at which drops appear on the glass.
If you want to render a scene more and more realistically, you need more and more information about that scene => more texture maps.
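To make the normal-map idea concrete, here's a minimal sketch of the maths (my own illustration, not from the comment above; the function and array names are made up): per-texel normals stored in a texture get dotted with the light direction to produce "bumpy" shading on an otherwise flat surface.
```python
import numpy as np

def shade_with_normal_map(normal_map_rgb, light_dir):
    """normal_map_rgb: (H, W, 3) values in [0, 1]; light_dir: 3-vector."""
    normals = normal_map_rgb * 2.0 - 1.0                       # decode RGB to [-1, 1]
    normals = normals / np.linalg.norm(normals, axis=-1, keepdims=True)
    light = np.asarray(light_dir, dtype=float)
    light = light / np.linalg.norm(light)
    intensity = np.clip(normals @ light, 0.0, 1.0)             # Lambert term per texel
    return intensity                                            # (H, W) brightness

# A flat 4x4 normal map: every texel encodes a normal pointing straight out of the surface.
flat = np.tile(np.array([0.5, 0.5, 1.0]), (4, 4, 1))
print(shade_with_normal_map(flat, light_dir=(0.0, 0.0, 1.0)))  # all 1.0 -> fully lit
```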
Even with Zen 5 not being spectacular, their support for AM4 still has me holding AMD CPUs in good regard. I hope they actually try to do better price-wise with the GPUs, but thanks to AI, I doubt it.
Great, honest video... As an old guy in this space it really does pain me that people have issues with Windows... Those engineers do a magnificent job considering how many different use cases are out there! I've never had a major issue with Windows unless my build was shit! And I certainly wouldn't want to go to an Apple-style ecosystem which is locked down, or a Linux one that is too open!
CPU/GPU tests on Linux would be interesting because they could demonstrate whether the OS is fully utilizing the hardware's potential in a mutual comparison.
11:30 - PS5 GPU entry level? Are we forgetting Xbox Series S? :)
Its GPU compute power is something equivalent to a 5500 XT (with ray accelerators). It is the lowest common denominator this gen that devs have to target.
There's too much elitism around this topic, and I'm a PC gamer myself and still annoyed about it.
Is the power of an RX 6700 (XT) really entry-level nowadays? The RX 6600 is still a valid card! I'm never willing to pay nearly 1000 € or anything close just for a graphics card, AND I can afford it!
4K60 or even higher? Sure, just bring more cash into the game! The PS5 costs just 400-500 € nowadays, Slim SKU included! It becomes more complicated when you factor game prices into it, but still.
Who in the PC space (with a brain) would really make the same arguments? There are real arguments to make: more multiplayer options, no general subscription needed for that either, more choice to put your budget toward CPU or GPU performance, depending on your game preferences.
Or even the argument that you can use the PC for work as well!
Yes, GPU prices are insane, as are CPU prices. Also, how long has AMD been stuck on 6-core Ryzen 5 and 8-core Ryzen 7 now? I get that it's actually better than the previous generation, but the laptop market is still a complete shit show.
Intel is just a shit show.
@@sMv-Afjal There was a time when an i5 was not sufficient for gaming but an R5 was: the i5-9600K vs R5 3600 era, because the Intel part didn't have HT while the AMD one had SMT. So it was 6t vs 12t, and some games had stuttering problems because of that. That was the motivation behind the i5-10600K getting HT (6c12t). For lower-end CPUs, you will only see AMD respond with more cores if there is a critical need for it; beggars can't be choosers. For high-end CPUs, there are rumours saying the Zen 6 R9 will have 32 cores.
@@sMv-Afjal To be fair to AMD in this case, 6C12T still is sufficient even with Zen3. There the difference between 6C and 8C is bigger than with Zen4.
I don't know what price point you expect CPUs to come down to. You can buy perfectly sufficient ones at 100 € ($), and at just over 200 there's the 5700X3D or the 7700 (tray version). Do you expect a 7600X3D to go below 300 $/€ that soon?
We could criticise application performance between AMD and Intel. Depending on the price range there are differences; the 7700(X) and 14600K(F) are in the same price range and the latter is around 10% or more faster. Below that it's similar, but not as drastic.
I'm rather really annoyed, that the 7800X3D became so expensive again! I'd prefer the 7700X boxed to be below 250 €, but at least the tray is cheaper.
ZEN5 launch sounds similar to ZEN+ launch in 2018.
It was a minor upgrade over the previous generation.
Marketing material: In my job as an electronics engineer, many years ago, I returned a power supply to a manufacturer as faulty because their data sheet on their website (which was readily available) said load regulation should be 0.01%, and it measured at 0.05%.
Their response was, "Load regulation can be up to 0.1%, that is what the manual says" - and to get the manual, you had to register with their website. My argument was, "People won't make a purchase based on the manual. If they have to sign up to get it, they will merely look at the readily available data sheet; if they buy the power supply based on that data and find it doesn't do what they want, they will return it as faulty."
Eventually, the company corrected their data sheet.
I appreciate Tim at least considering the possibility of thinking about Linux seriously. As a reviewer and tester with such mainline products, it does make sense that Linux not be considered.
I’m always reminded of Arrested Development - “There are dozens of us! DOZENS!!!”
I would say for most Linux operating systems, installing basic apps is much, much easier than on Windows. Linux software stores are commonly used in the way that the Microsoft Store is trying to be used and work flawlessly most of the time.
Linux has gotten a lot more user friendly in the last couple of years, maybe they should give it another try?
No
no
What happens if i watch some of the video and come back later to finish it, does it hurt your stats?
On the whole Linux situation:
*TL;DR* - Game compatibility isn't comprehensive or consistent enough, has a couple major hangups (anticheat, launchers), is too much of a melting pot for devs to support, and takes a lot of effort to learn when coming from Windows. Worth learning if you're an enthusiast, but performance-wise not worth benchmarking.
I moved to Linux a couple years ago. Most of the problems Tim mentioned are no longer very applicable in mainstream distros. Anticheat is still a big one though, probably the biggest hangup for most (and probably will be for a long time). But compatibility seems to be plug-and-play for around 85-90% of games I play, in my experience. And there's almost never any noteworthy performance difference (which is why it doesn't bother me that HWUB doesn't test Linux). There are still quite a few games that run like trash and have terrible bugs or break at the slightest breeze, but these days those are the exception rather than the rule. Linux gaming is viable and fairly approachable, but it's still got some consistency and reliability issues, and I agree on it being still only "for power users".
Another thing is launchers. Big AAA publishers like Ubi and EA like to force players to go through their launcher to run the game, even when the game is sold through Steam. And plenty of games these days aren't sold through Steam, not until much later. Epic's still doing timed exclusivity. These launchers tend to be a nightmare for Proton - and none of them run natively on Linux. Bottles can help, Lutris can help, Winetricks can help, alternate launchers (Heroic) can help - but in my experience it's never as reliable or consistent as a game that goes straight from Steam to launching the game itself on Proton.
The main reason for the lack of compatibility and support from devs on Linux is the very "composite" nature of it. Any distro is a whole bunch of moving parts with each update, even on stable point-release distros, there's so much variation across so many different projects that make the core of your system work. These have different update cadences and combinations (sometimes unpredictably so) that pair different core components at different versions. There's hundreds of different distros out there. There's only one Windows 11. It's not a surprise that so few games make any attempt at compatibility for Linux, especially when they can't even keep up with bugs and performance on Windows.
What Tim didn't mention is having to learn a wholly different OS. I had no idea just how much I would have to re-learn to understand basic usage and maintenance - "power user" level functionality even more so. Linux is not like Windows. Things like "DE", "packages" and "mount point" were alien concepts to me at first. People give troubleshooting advice for Linux in command-line form because the system UI varies depending on your DE of choice, which is also interchangeable. Packages are installed with a package manager (which may have a GUI frontend), not by downloading and running an installer. All storage on your system is mapped as one tree, rather than as distinct separate "drives". Paths are case-sensitive. It's a learning process. These things aren't worse (I'd argue they're better), they're just not what Windows users are used to.
I've seen mention that if Valve were to put forth a desktop version of SteamOS _in an official capacity_ and promote and support it as such, it could really help solidify the Linux gaming userbase. Valve is honestly killing it with SteamOS 3 using an immutable model on the Deck. Making a push to have devs target not some arbitrary third-party distro, but their own SteamOS, with immutable versioned system components, and user-installed software separated on an entirely different packaging layer, could do wonders for compatibility and support.
19:11 About that, there was a famous case in Brazil 4 years ago where Intel was denying CPU warranty claims because users had XMP enabled.
14:17 A one-off Linux benchmark of like 10-15 games would be an excellent video...
I think that second question is confusing "good art direction" with "texture quality". The limitations of older games forced developers to put effort into designs, whereas nowadays they can just stuff whatever muddy indistinct image they want into a texture file and call it "high-resolution".
I'm making the move to Linux from Win 11. Recall sealed that deal.
@Hardware Unboxed: In your charts I still see you using the WRONG title, Microsoft Flight Simulator 2024 instead of 2020. The 2024 version won't be released until the 19th of November. Please correct it. And DX11 is still faster than DX12 in MSFS 2020.
Unless you're really looking at every item in a scene under a microscope, textures with a really high resolution won't change the visuals as much as they will require larger VRAM buffers. One thing that's quite funny is that the textures that actually give the detail aren't the "main" ones, which are typically higher res (the diffuse, or albedo as it's also called), but the normal, bump and displacement maps, which are usually a fair bit smaller than their diffuse counterparts. Say you have a 4096x4096 diffuse texture (which is overkill for the most part, even for some close-up type stuff): the normal and bump map textures will likely be less than half the resolution of the diffuse (there are obviously exceptions, but still). Textures can only get you so far in terms of improving the visuals versus the toll they take (bigger install sizes, larger VRAM buffer requirements and so forth). A lot of the time, what makes an in-game scene look off is the type of anti-aliasing being used or - because we are in the age of image reconstruction and upscaling - the initial rendering technique before the frame is "reconstructed". You get way more detail from denser geometry than from bigger textures, but textures require less testing, so that's usually the main choice.
16:50 - I think there is a sizable population of gamers willing to leave Windows if games ran better on Linux. I am one of them, and I've been arguing for a while that there would be a huge shift among gamers toward an OS that consumes fewer resources and performs better for pure gaming use.
Happy with my purchase of 9600X. I use it for RPCS3, now I can play GOW3 at 60FPS.
Is there going to be 7800X3D testing amidst the testing of the 5000 & 3000 series CPUs? I really want to see how 24H2 vs 23H2 affects the 7800X3D before I do the update myself.
Re Linux, it'll be interesting to see if the recent announcement from Microsoft about moving away from kernel-level security solutions will make DRM and anti-cheat clients portable to Linux.
Ahhh! Texture quality in games! My problem with modern games is not really the texture quality as much as the use of shadows or recently raytracing used to create an image where I can't see the textures much less the player character or the opponents.
Yes. 9800x3d good enough for the price. the end
But you don't know how good it is, and you don't know what the price will be...
Personally I expect it to launch at a higher price than is justified, as is almost always the case with AMD CPU launches. Say 5% faster than 7800X3D for gaming, and 30-35% higher cost.
@@exscape The video was named "can AMD repair their reputation after Zen5" when i made the comment
Great content Tim & Steve! I guess do another batch of Zen5 3D with improved architecture design and people will eventually forgive AMD's mistake.
Developers spend most of their time doing more of the things you can't see while actually playing the game. Manufacturers have built absurdly inefficient and expensive hardware to implement it. People pay through the nose for things that make no gameplay difference. Tell me, where is the sanity? Silliness of the week: the AMD Ryzen 9 9950X is selling out in places, and I don't think it's gamers buying it.
Well, it's a rendering beast. I have it on my radar for when the new generation of GPUs drops. Those nice and fast Zen 5 cores will eat up any 8K renders I throw at them without issue.
@@mondodimotori The problem was never the gaming performance. The problem was AMD's claims about gaming performance. If they had just touted efficiency gains (even if not exactly true) and better productivity performance, we would be looking at Zen 5 in a much better light. Instead, AMD acts like reviewers are lying or confused when we see only a 3% uplift in games. It's not a bad CPU at all, but expectations ran wild.
@@christophermullins7163 So something people have been doing since the beginning of time is suddenly a big problem with AMD? I invite you to go back and look at the pre-launch hype from YouTubers. Some of the most over-the-top hype came from the very YouTubers trashing the 9000 series post-launch.
@@mondodimotori I think people forget gaming is only a fraction of the CPU market. Gaming isn't very demanding on CPUs to begin with.
Very good explanation of the texture/VRAM question by Tim. Excellent even. Many people think that the GPU only places textures on the screen, while ignoring the fact that so much computation takes place on the GPU, and feeding that from VRAM is muuuuch quicker than using system memory.
Black Myth Wukong really needs an update that makes some of the characters cuter, especially the Wolf scouts, and Tiger Vanguard.
I just bought the Ryzen 9 7950X3D and it took me a full day just to get it running properly. The core parking was a must, and I had to go through a special process. Now it's a beast.
The GTA car example was interesting, though it doesn't have to do with the VRAM amount per se, but rather with the number of different things in the scene: different models (or, in more modern games, models composed of more separate parts) mean more draw calls, so more things to be calculated, so more work for the CPU and GPU.
The same applies to "texture layering", which is done by having multiple UV channels on models, so again more parts being drawn (the main model in its default state, plus the same model with the second UV channel used to display the "conditional" texture or textures). This may reduce the polygon count, but it doesn't reduce draw calls, because the models have more materials.
Also, most textures are heavily compressed, and the standard format is generally DDS.
You're very unlikely to see large textures in PNG format, for example.
A 4096x4096 texture stored uncompressed is around 64 MB, while a block-compressed DDS (BC1/DXT1) version of it is roughly 8x smaller - and, unlike a PNG, it stays compressed in VRAM.
Just imagine having those huge textures without compression.
Not even a 4090 would have enough VRAM if that was the case.
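A back-of-the-envelope sketch of those sizes (my own numbers and function names, not from the comment above): approximate VRAM footprint of a 4096x4096 texture in common GPU formats. Block-compressed formats (BC1/DXT1, BC7) stay compressed in VRAM, whereas a PNG is only compressed on disk and has to be decoded to a raw format before upload.
```python
BYTES_PER_PIXEL = {
    "RGBA8 (uncompressed)": 4.0,
    "BC1 / DXT1": 0.5,   # 8 bytes per 4x4 block
    "BC7": 1.0,          # 16 bytes per 4x4 block
}

def texture_mib(width, height, bytes_per_pixel, mipmaps=True):
    # A full mip chain adds roughly one third on top of the base level.
    base = width * height * bytes_per_pixel
    total = base * (4 / 3 if mipmaps else 1)
    return total / (1024 ** 2)

for fmt, bpp in BYTES_PER_PIXEL.items():
    print(f"{fmt:>22}: ~{texture_mib(4096, 4096, bpp):.0f} MiB with mips")
```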
One thing the Steam Deck has taught me is that there is no floor for how low I'm willing to crank the graphics down, as long as it hits 40 FPS.
I feel that 30 FPS is an unplayable framerate.
Reputation suggests they've burned their customers, which hasn't happened. People are taking a pause and waiting for better products, most of them because of how great their current Zen 3 and Zen 4 products are treating them. Zen 4 will probably be a product that even 8 years from now, casual gamers (non-enthusiasts) will still be perfectly satisfied having in their systems running e-sports and previous gen games - This segment won't be interested in buying $90 AAA titles anyhow.
Regarding the VRAM discussion, textures have actually become a lower and lower proportion of overall VRAM usage over time. It's still substantial, but most memory gets used up by the seemingly endless number of buffers required by modern graphical rendering techniques. This is why it's so difficult for developers to adapt their games to run on the Series S and why modern games can still require 6 GB of VRAM to run in potato graphics mode. I saw a nice chart a while back showing all the different things using VRAM in a particular game, wish I could find it again.
Agree 100% that the time is not yet right for ROUTINE coverage of Linux gaming. But a one-off status review of using eg Linux Mint (as a widely-used, pseudo-Windows distro) as a general desktop environment and then for gaming would be great to see. My sense is that more and more people are unhappy with the direction that Microsoft is taking with Windows and that, unless you use very specialised software, Mint is generally on a par with Windows for general home/office work. Gaming is coming on apace but still a work in progress.
Perceived texture quality is difficult to separate from the quality of lighting, shadows, bump mapping etc. In older games, particularly ones with extra sharp textures, those might paradoxically have stood out more and led to a wow factor that is no longer there when the visuals in total look amazing. Back then, the textures often were the only thing that looked amazing, with subpar lighting, blobby, flickering shadows and far fewer polygons. Also, when you compare a slow moving game to one with fast movement, the slow game will always win out in terms of "texture-wow-factor" (provided the textures are high res and good). Conversely, some modern fast-moving games might actually get away with lower res textures?
Companies competing on who's the worst instead of who's the best
Games have become more VRAM-hungry because shaders have become more flexible and complex.
Older titles have far fewer individual maps making up a "texture". Most of the effects used to be baked into the diffuse maps.
Tessellation/bump maps, material maps, reflection maps, specular maps, occlusion maps, pre-baked shadow maps for low presets, etc., etc.
Games used to have all of that baked into no more than two or three image files; now it's all loaded separately, calculated and composed in real time. That takes memory.
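A rough illustration of how those maps multiply memory use (entirely my own made-up numbers, assuming 2048x2048 BC7 maps at roughly 1 byte per pixel, no mips):
```python
MIB_PER_2K_BC7_MAP = 2048 * 2048 * 1 / (1024 ** 2)   # = 4 MiB per map

# A modern PBR material is several separate maps, all resident in VRAM...
modern_material = ["albedo", "normal", "roughness", "metalness", "ambient_occlusion", "emissive"]
# ...versus the old approach of baking most effects into a single image.
old_school = ["diffuse (everything baked in)"]

print(f"modern material: {len(modern_material) * MIB_PER_2K_BC7_MAP:.0f} MiB "
      f"({', '.join(modern_material)})")
print(f"old-school material: {len(old_school) * MIB_PER_2K_BC7_MAP:.0f} MiB")
```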
You should do at least one episode testing Linux gaming to show how badly Microsoft is nerfing performance on Windows. People would watch that.
I think that the "XMP is overclocking and not covered under warranty" is more about they don't want to offer customer support at all for XMP/EXPO.
"It is little details that you probably wouldn't notice if you were playing the game normally." Exactly. Most of the whiz-bang details gobbling tons of the VRAM do not matter much to normal gameplay, it mainly matters to pixel-peepers. While actually playing, most of the fine details get blurred out by screen and eye persistence.
What is your opinion on the fact that AMD was not offering a fix for the security issue found in Zen 2, while they are still selling those chips and even releasing them as new alongside the 7000 series, so they should be supported for as long as Zen 4?
Zen 6 is going to have some _interesting_ marketing considering Zen 6 will most likely have NPU cores (Both Intel and AMD are starting to use them and it'll be a Win11/12 thing)
So it'll probably be a slight improvement to perf overall, but nothing spectacular and probably no core increases.
On VRAM: it also depends on which game mode you play within the same game. MW3 uses considerably less VRAM than Warzone/Resurgence because the maps are small. On an MW3 map you might hit 10 GB usage on Extreme, while in WZ/Resurgence you WILL use all available VRAM for "open world caching".
Could you run a survey here about how many of your followers would like to see Linux benchmarks? I think that counts for more than generic Steam users...
I daily-drive Linux, but I definitely see where you guys are coming from with that segment. I will have to push back a tiny bit, Tim. You mention installing software is harder on Linux, but when it works, I think installing through a software centre is easier and better than opening a web browser, searching for the name of the thing, avoiding the malicious links that claim to offer that software, downloading an installer and so on. It's just that you are not used to installing things that way, so the easier thing seems harder - but this is arguable.
For my use case, I mostly do software development and occasionally play some single-player games, and getting the tools I use set up on Windows would be harder than on Linux, so in my case I find Linux easier. For content creators, they are gonna try it and go "my Adobe apps don't work, this sucks", and yeah, sure, they're right, and that's a problem. I just don't use any of those apps. But that's why you get a lot of negative press on YT when people try Linux, because - spoilers - content creators are not gonna be happy when content creator things don't work.
Intel is the most disappointing company… They’ve been holding the crown since Skylake..
It's arguably easier to install and play a game on Linux than it is on Windows currently... that is, some games are actually far simpler to get up and running, while others require a tiny tweak here or there.
I RMAed a 13900K in 2023. I used XMP and even manually tweaked the timings and overclocked the memory. Intel honored the warranty, no questions asked.
Tim, respect for saying what needed to be said about the console FPS situation.
My understanding of XMP is that the memory modules have three or four profiles stored on them, and the profiles are JEDEC-listed, so technically they are in spec.
As someone who still plays a lot of older games, and a mixture of new games, on a top-tier PC with a 4K monitor, I can say that plenty of older games still look just as good as, if not better than, many of today's titles. I still play Battlefield 1, Watch Dogs 2, Modern Warfare 2019 and COD Black Ops 3, and the texture quality in those games is significantly better than in most current games, while using 6 GB at 4K max settings. What's dated in some of those older games is the lack of ray tracing, or pixelated shadows, or maybe just simplistic or flat foliage.
The only recent games with textures that genuinely looked good, with great visuals all around, were The Last of Us Part 1 and Hogwarts Legacy, and they use 10+ GB at 4K Ultra. Sons of the Forest looks insane as well and I think uses about 8 GB. Resident Evil 8 looks damn near photorealistic and runs better at 8K than most games do at 4K with a 4090. There are some great-looking games, but texture quality is definitely worse now than it was 6-8 years ago in the average game.
I've been gaming primarily on Linux since November/December of 2017, and I think the contrast between back then (pre-DXVK, pre-Proton, pre-Lutris, etc.) and now really has people overestimating just how good Linux gaming is. If I were on an NVIDIA GPU I would go back to Windows today, and if I weren't a power user I wouldn't use Linux at all. Even the low-effort distros like PopOS eventually ask you to go to the command line (not necessarily in any highly complex or hard-to-understand way, but opening a terminal at all is a step too far for the "average user"), and games are often finicky enough that, if I hadn't built up tweaks that are almost automatic at this point, I'd fairly often run into games that just "don't work" for me.
I think it'd be worthwhile to maintain a Linux test OS just to keep an eye on issues like shader stutter if that ever becomes a problem again. DXVK led to a long period where Elden Ring was an objectively better experience via Proton/DXVK (you can technically use DXVK on Windows, but it's extremely cursed), because DXVK managed shaders properly on the game's behalf while the game itself failed to, causing the stutters on all systems. The main problem is that there isn't really a "stable target" for Linux, so benchmarking is extremely difficult to do in a truly consistent manner. Phoronix is the obvious standout there.
If the Hardware Unboxed team decides to try Linux again in the future, I would recommend using KDE as your desktop environment, such as Kubuntu or Fedora KDE (not KDE Neon, that distro is more of a testing ground). It supports VRR, high refresh rates, proper 4K monitor scaling, HDR and 10-bit colour. I would also recommend Fedora, as it's usually very up to date and very easy to install. I also believe there are gaming-specific user tweaks for Linux for optimal performance. But yes, I would not say Linux is for the average user yet. You still need a decent amount of troubleshooting skill to get all your apps running correctly.
I'm still on 23H2 - when do we get this update? I have the 7700X.
24H2 is on the Windows Insider Program, Release Preview branch.
About the VRAM usage question:
Different games load different amounts of the game world at once. Optimise for less VRAM, and you may have more loading screens (or need to hide them with transitions).
Also, you can reuse textures if you have fewer different objects in a scene; with fewer unique textures, each individual one can be of higher quality.
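A toy sketch of that loading/VRAM trade-off (entirely my own illustration; zone names and numbers are made up): with a smaller VRAM budget, fewer chunks of the world stay resident, so moving around triggers more loads.
```python
from collections import OrderedDict

def count_loads(path, budget_zones):
    """How many loads/hitches a walk through the world causes, given how many
    zones' worth of assets fit in VRAM at once (simple LRU eviction)."""
    resident = OrderedDict()                 # zone -> True, in LRU order
    loads = 0
    for zone in path:
        if zone in resident:
            resident.move_to_end(zone)       # already in VRAM, no load needed
        else:
            loads += 1                       # not resident: loading screen / hitch
            if len(resident) >= budget_zones:
                resident.popitem(last=False) # evict the least recently used zone
            resident[zone] = True
    return loads

walk = ["town", "forest", "town", "cave", "town", "forest"]
print(count_loads(walk, budget_zones=3))     # larger budget -> fewer loads (3)
print(count_loads(walk, budget_zones=1))     # tight budget  -> a load every transition (6)
```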
18:00 - I have heard of warranties being voided by XMP when it came to Intel chips, although this was a long, long time ago, around 8th gen (Coffee Lake??), when Intel introduced their *'Performance Tuning Protection Plan'* (PTPP) extended warranty programme and were trying to strong-arm the enthusiast community into buying into it.
I think they charged an extra $30-80 if you wanted to enable XMP or overclock their chips. But for most people it was easy to bypass if they had an IQ over 100...
Customer: _"CPU isnt working, Ive tested in 20 other machines and nothing boots."_
Intel: *"Were you running XMP when the CPU stopped working in the original machine??"*
Customer: _"XMP??? Whats that?? Ive never heard of it. I dont even know how to change a BIOS battery"_
Intel: **Authorises RMA**
Where did the 600p info come from?
5:18 Me also imagining all my favorite old games in HD.
To install a program in Linux is not hard, you don't need to be an advanced user. You press the 'windows' key. Click on the software icon.
A window opens. Point your mouse on the icon of the thing you want to install and click install. Job done. I don't know why people spread outright lies about Linux. If you can click a mouse you can use Linux. If you feel adventurous you can install stuff by other means but you don't have to. Just click on an app, choose what you want, click on an icon to install. Go to app menu (windows key) click on icon to run app. Yeah, it is rocket science. Keep spreading lies.
In my case the 9950X either matches or beats the 13900K/14900K - you can see it on my channel.
No disappointment honestly, solid CPU
4:05 What happened to texture compression? Games used to have something called DXT.
I think the X3D chips could be the saviour of Zen 5, given the efficiency gains of Zen 5 over Zen 4 there could be significant overheads for the X3D chips to take advantage of. Given that Zen 3 and 4 X3D operated on reduced frequency and TDP due to thermal sensitivity of the 3D V cache, if the Zen 5 parts can take advantage of higher clock speeds due to the increased efficiency of the architecture it could provide a notable performance boost for gamers and give Intel even more headaches when Arrow Lake launches.