I have assumed for years that AMD and Nvidia work together to keep overall market prices as high as possible instead of competing on price with each other. You KNOW this happens. If you don't think so you probably also think the president runs the country. Things are never as they seem.
Zen 5 is already so fast that it's being held back by slow memory access, which is why it benefits so much from the extra 3D cache. So if AMD fixes the memory controller issues with Zen 6, it will be a beast! And I predict they will (even if it means a delay), simply because they have to if they want to make 12-core CCDs happen, or else they'll be bottlenecked on memory.
Tom loves to hate on the B580. He says it is going to be a paper launch. I am getting my B580 from my B&H order today, January 2nd. I didn't really have to jump through hoops to get it. What I don't get is why he gets very excited about Qualcomm making GPUs. I mess around with some Chinese Domestic Market GPUs, the Innosilicon Fantasy II and Zhaoxin C960. While they are very solid for desktop use, they still very much struggle in games. The Innosilicon Fantasy II is PowerVR based and that technology has been around for quite a while. The MTT S80 is an excellent example of an on-paper good card with drivers that struggle in modern gaming. Also, I feel the Nvidia ARM APU will probably work best on Linux. Microsoft couldn't be bothered to help Qualcomm get Windows on ARM to function properly. TLDR: Qualcomm GPUs will be meaningless to the gaming market. Games are so specifically tuned to GPUs nowadays with day-one drivers.
He gets very gloomy on Xbox too, which is weird imo, because it's been gaining revenue big time YoY since like 2016. I haven't adjusted for inflation, but even adjusted they are still growing. They did lose a lot on hardware though, but I've heard rumors of a handheld with Series S tech. I really don't understand this SEGA theory; Microsoft has money to burn.
I would rather have a higher-spec PS6 for $699 than a corner-cutting lower-spec console for $499. This is why I think MS should make a $999 'Xbox Surface' which can dual-boot into Xbox or into Windows so you can use it like a normal PC, with the Xbox part securely separated on a different drive or whatever.
@@Mervinion Maybe, I was just thinking that from a game security pov it might be better to keep the 2 separate. Plus I am not sure how much stuff is needed to ensure Windows works well that would still slow down a console OS.
XboxOS is a WSL based OS iirc, so it basically is a pared-down Windows OS. What you're talking about is possible with just drivers for the hardware. Fwiw, Surface hardware is absolute garbage; at $1000 what you're getting is basically a Series S anyway. MS hardware needs a good, hard look.
@kaeota I was just using the Surface name to be amusing, but also didn't I hear that the Xbox Team was taken over by the Surface Team? The thing is, if you make it a Windows PC then people would just buy the game from Steam or Epic so Xbox still wouldn't get game sales.
AMD already messed up the naming, and removed the RX 7900 GRE from the market, which would have been a close competitor to the RX 9070 XT, so as to not leave us with another option. So it's a safe bet AMD will f*ck up the pricing at least.
Why do you act like you already know what the performance of the 9070 XT is? All you have to go off is a rumour, yet you're saying "it's a safe bet"? Brain dead activities.
@Will-zi6fn ... it is a safe bet for the performance to be around the RX 7900 GRE, since the published preliminary tests, the official announcement where they cut down the performance from where AMD originally wanted to go, the fact that AMD pulled the RX 7900 GRE from the market close to the launch, and the leaks reported by MLiD all point in that direction.
@@Javier64691 That probably holds true for the raytracing performance, since it's said to have improved by 40%+, yes. Also, FSR 4.0 will make a significant difference as well. As for raw performance ... well, let's just hope it indeed sits at 4080 level and the GRE was pulled only so it has no competition in regard to price.
You're right, but I don't really agree with the argument that the 7900 GRE was purposefully removed; very likely it was simply supply constraints. They used cut-down dies and most likely AMD just ran out of enough of them. Either way, if all the rumors are true the launch might be bad, but the cards are still going to be compelling once the prices drop, as they always do.
I wonder if the reason we didn't get UDNA sooner was actually because Sony (and Microsoft) were such big GPU clients. It has to be easier to optimize the silicon for gaming specifically, as opposed to dual-use gaming/datacenter. Consoles could have been an excuse to postpone this merger (or maybe were the catalyst for splitting RDNA and CDNA to begin with).
Actually, UDNA was originally planned to come out later. The timeline was accelerated due to competition and, I believe, Microsoft needing a new GPU architecture for their next console, which, surprise surprise, is coming out in 2026, around the same time UDNA is supposed to debut.
@@ZeroZingo MCM isn't a failure at all, or else CDNA wouldn't exist, and that is all MCM. MCM is a lot more than just cache chiplets. UDNA is bringing parts of CDNA, GCN & RDNA all into one, which is why it's called "unified". RDNA4 is a short 6-to-8-month stopgap. There is no reason to actually release it, other than having something out there for a short time.
I think FSR 4 needs to be good enough, but it's adoption that matters! BTW: I don't care about RT for lighting at all. I like the RT shadows but I like my 240hz monitor even more :D
FSR4 does not need to beat DLSS, it just needs to noticeably shrink the gap in quality. Most people think FSR at 1080p is unusable, and this is specifically where FSR needs the most help. It could be huge for AMD but it is also so late. I am more interested in having all reflection artifacts removed. I hate screen space artifacts! 🤮
We may not care about things like ray reconstruction right now since ray tracing is still stupidly expensive but in the future technologies like that will be very important as ray tracing becomes more prominent. Without it a lot of ray tracing features introduce a ridiculous amount of noise and that can make ray tracing less attractive than rasterization in the worst case.
I disagree with PSSR being the best "FSR" version; imo FSR1 is still fine when you can run above 100 fps. Above that threshold even driver-based FSR1, aka RSR, can look native.
Granite Rapids D is adding AMX complex operations. Diamond Rapids is adding FP8 and TF32 data type support, in-flight data type conversions, and transpositions. The per-core AMX acceleration is becoming full featured. Granite Rapids D is also integrating dual on-chip 100GbE.
On consumer chips, Intel is perpetually out in front on IO such as WiFi 7, TB5, PCIe 5, and CUDIMM. Now Intel has apparently reduced the cost of 3D advanced packaging so that they can use it on consumer chips. Intel 18A, BSPD, and GAA are on the roadmap for Panther Lake in 2025, along with a new Celestial GPU and a next-gen NPU.
Regarding Switch 2, I believe the CPU will not be any real limit. The ARM A78C cores are supposed to be quite similar in performance to Zen 2, and I know that since it's a handheld it'll be less powerful than consoles like the Xbox Series or PS5, but it will definitely be better than the Steam Deck's. But here is something interesting: the chip is supposed to have a module for file decompression, like the PS5 I guess, a feature that can offload stress from the CPU compared to the Xbox Series or Steam Deck. Last but not least, the Steam Deck's CPU is x86, and honestly there is no way for an x86 architecture to run on battery at similar performance to an ARM CPU.
Thanks for bringing some info to the table. Very interesting about that decompression block. Then there are also work graphs and other next-gen functionality that'll massively reduce CPU load. There are plenty of levers for Nintendo to pull if they want to lower CPU use.
I could easily see Microsoft raising the prices of Game Pass and also adding an ad-supported tier. And I just think, because of how COD is performing, that they will drop day & date.
If Microsoft can get a lot of people to play just COD for about 6 to 8 months then it's a winner, but if people drop off after a month or 2 they may lose millions versus selling the game outright, since with an outright sale, even if people drop off after a month or 2 they still got about $60+ per user. I know COD was meant more as a gateway to Game Pass, to raise the user counts so people keep paying for it and don't drop Game Pass overall.
@@HeathStevens COD or Game Pass? I know Microsoft wants to make Game Pass like Netflix and have people very dependent on them for games; they want to be like Valve, make a lot of profit and do little work.
Whether the 5090 is priced at $2,500 or $2,000, it is not worth paying such a premium unless you have the financial flexibility (money to burn) to afford such high prices.
I wonder if Nintendo and Nvidia could employ a SmartShift-type technology for the Switch 2? This would mainly be for handheld mode, where CPU limits would hit hardest. If the DLSS-upscaled image can get to a good resolve on the 1080p screen with even a little overhead left for the CPU, it could make the difference in hitting the target fps. If Nintendo is that worried about power consumption they could even integrate this feature into a perf mode.
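For anyone unfamiliar with the idea, here is a toy sketch of shifting a shared power budget toward whichever block is busier. This is only an illustration of the concept, not AMD's actual SmartShift algorithm or anything Nvidia/Nintendo have announced, and every wattage number is made up:

```python
# Toy sketch of a SmartShift-style shared power budget (illustration only;
# NOT AMD's real algorithm, and all wattage numbers are hypothetical).
TOTAL_BUDGET_W = 15.0                 # assumed handheld SoC power budget
CPU_FLOOR_W, GPU_FLOOR_W = 3.0, 5.0   # hypothetical minimum allocations

def allocate_power(cpu_util: float, gpu_util: float) -> tuple[float, float]:
    """Shift the spare budget toward whichever block is busier."""
    spare = TOTAL_BUDGET_W - CPU_FLOOR_W - GPU_FLOOR_W
    total_util = (cpu_util + gpu_util) or 1.0   # avoid division by zero
    cpu_w = CPU_FLOOR_W + spare * (cpu_util / total_util)
    gpu_w = GPU_FLOOR_W + spare * (gpu_util / total_util)
    return cpu_w, gpu_w

# CPU-bound scene: most of the spare 7 W flows to the CPU.
print(allocate_power(cpu_util=0.9, gpu_util=0.4))   # (~7.85 W, ~7.15 W)
```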
FSR is good if you're in Quality mode going from 1440p to 4K or 1080p to 1440p. Otherwise, it falls apart pretty quickly when you go from lower resolutions. DLSS handles this better, along with less fizzle & noise. But hopefully the AI in FSR 4 helps out a lot.
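To make that concrete, here's a quick sketch of the internal render resolutions behind FSR 2's quality modes; the per-axis scale factors below are AMD's documented values:

```python
# Internal render resolutions for FSR 2's published quality modes.
FSR_MODES = {
    "Quality": 1.5,
    "Balanced": 1.7,
    "Performance": 2.0,
    "Ultra Performance": 3.0,
}

def render_resolution(out_w: int, out_h: int, scale: float) -> tuple[int, int]:
    """Internal resolution FSR renders at before upscaling to the output."""
    return round(out_w / scale), round(out_h / scale)

for mode, scale in FSR_MODES.items():
    print(mode, render_resolution(3840, 2160, scale))   # 4K output

# Quality at 4K renders at 2560x1440, which is why it holds up; Quality at
# 1080p renders at just 1280x720, where artifacts are far more visible.
```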
The day the classic Wintel platform successfully moves to RISC in such a way that we can run old stuff over a compatibility layer with only minor downsides will be a joyous day. So I do not understand why Nvidia would want to build a CISC chip. We are DESPERATELY trying to move away from this, and have been for nearly 30 years. Microsoft has been trying for the past 15 years. They seem close. There is an enthusiast community with compatibility lists which echo what the ProtonDB people are doing. If I were Nvidia I would be working with MS on something similar to what the Apple folks did. MS is nearly there.
44:03 When the 'Neural Rendering' leaks came out, I looked it up. Nvidia has a number of papers about it, and Inno3d did a talk in '23 about 'The Future of Neural Rendering'. What they showed running then truly blew my mind. This is a legitimate paradigm shift in rendering that allows for materials of arbitrary complexity to be fully path-traced at extremely high framerates, *without* frame generation. They showed it running on a 4090. I highly encourage you to look into it- I haven't been this excited about a new tech since programmable shaders first came about!
100% agree. Just saw it, what a great highlight and explanation of the most profound papers. I think it was by Marco Salvi at the I3D Symposium and not Inno3D. This tech has been blowing my mind for a while. I recommend Two Minute Papers for paper highlights if you're interested; the stuff on AI physics simulation there is just mind boggling. The rate of progress in neural rendering has been truly amazing. NeRFs and the AI filters for older games are even crazier than what was shown here, even if they are in their infancy and easily 5+ years away. This is clearly early days for the technology, but I just can't wait to see where it'll be in 5 years from now. Remember where DLSS and RT were 5 years ago? Look at them now. Wow! Because AI on another level is involved here, I suspect the progress will be far more rapid. No wonder Cerny, like NVIDIA, is doubling down on RT and ML for the PS6. Oh, and for sure we'll be getting a wow demo at CES where they'll go all out. This stuff is from 2023, and I'm sure they have a lot more cooking behind the scenes. Getting asynchronous reprojection working without its flaws (NVIDIA can pull this off) + Neural Crisp (superior TAA without all the drawbacks of TAA; this one will be tricky) + frame extrapolation should convince anyone that the future is neural rendering. Imagine running a game with a 3x frame multiplier (on top of upscaling) with input decoupled from framerate (ultrafast and low latency) + a sharp, antialiased image. This is not going to happen right away, but I think it'll happen a lot sooner than most people think.
@@christophermullins7163 the only reason they did it is because it already runs badly on the 40 series. The speedup is nowhere near the 100% you would expect; frame gen is tens of times more demanding than DLSS upscaling. Now imagine how it runs on a comparable last-gen card which has a 2-3x weaker optical flow accelerator + Tensor cores. It'll run like SHIT. I doubt the games would even scale on the 20 series. 100% bet they'll open it up to earlier gens like they did with ray tracing, because people kept complaining, and people will conclude it runs like shit and not use it. There's no reason why NVIDIA would lock any of this stuff behind the 50 series when it's already running on the 40 and 30 series (the cards the AI researchers used). When NVIDIA locks something to a newer tier it's usually because the old tier just can't run it.
Regarding Bartlett Lake ... remember the Haswell-era 4790K CPU. It was late but FANTASTIC. Many people are still on it today. I hope we will see an LGA 1700 repeat here!!!
We know the 5080 specs, so if they clock it equal to a 4090, there's no way it's faster. But if the frequency is high enough and the GDDR7 increases bandwidth and data transfer speeds, there's a chance. But I doubt it will be much better than the 4090; probably we'll see a 4070 vs 3080 situation, where they are practically tied, but technologies like RT, FG and VRAM make the difference in some scenarios.
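The GDDR7 point is easy to put numbers on. A back-of-envelope sketch; the 4090 figures are public, while the 5080 numbers (256-bit GDDR7 at 28-32 Gbps) are only rumored specs at the time of writing, so treat them as assumptions:

```python
# Peak memory bandwidth: (bus width / 8 bits per byte) * per-pin data rate.
def mem_bandwidth_gbps(bus_width_bits: int, data_rate_gbps: float) -> float:
    return bus_width_bits / 8 * data_rate_gbps

print(mem_bandwidth_gbps(384, 21.0))   # RTX 4090, GDDR6X -> 1008 GB/s
print(mem_bandwidth_gbps(256, 28.0))   # rumored 5080 @ 28 Gbps -> 896 GB/s
print(mem_bandwidth_gbps(256, 32.0))   # rumored 5080 @ 32 Gbps -> 1024 GB/s
```

So even with GDDR7, a 256-bit 5080 only catches the 4090's bandwidth at the top of the rumored data-rate range.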
"Of course it will be powerful, its Nvidia" That is what everyone said about intel until it wasn't the case anymore. So hopefully nvidia takes an L sooner rather than later.
Regarding DLSS vs FSR, I think it has reached the point where it's not truly about how wide the peak gap is, but rather how many people are satisfied with the common baseline, whatever that may be. And I think FSR has come very close to reaching that baseline where a lot of people may start thinking "yeah sure, DLSS is theoretically better, but FSR is already good enough". The same way people know a 240Hz monitor is technically better than a 144Hz monitor, but it's a neat bonus at that point rather than a main selling point; most people are already satisfied with 144Hz as it is. It isn't really as contested a metric anymore compared to the early days, where each new iteration of DLSS and FSR was a major leap of improvement, with the mindshare associated with that improvement. I think it is more important than ever that Nvidia keeps stacking NEW features onto DLSS, rather than just improving it overall. Improvement is important, but new selling points start to matter increasingly more to keep that "good enough" floor higher than what AMD can easily reach. Neural processing could be something to help this, where potentially you start seeing the "Performance" DLSS mode getting better and better quality without dropping the fps benefits. That would really push hard on the key metric that FSR can sort of hang onto right now. At the top end DLSS is ofc superior, and people who care about the top end will care more about general improvement over new novel features. It's at the base floor where the real fight will happen. At least that's my presumption until we see what each company unveils with their upcoming GPU launches.
@@Dempig Which exactly proves my point, the vast majority of those who will see the high-end benefits of DLSS are likely demanding more from the baseline to begin with, such as upscaling to 4K. The overwhelming majority of people play at 1440p and 1080p, and will not upscale beyond that point, so their demand by nature is 4x lower than yours, so the peak difference isn't as big of a factor to them, as it is to you. I am totally with your experience, I play at 3K ultrawide with my 4080 and I tried FSR too in some titles and the quality difference is noticeable. Had I been on a bit lower end hardware at regular resolution, I would ofc appreciate DLSS still but I probably wouldn't consider it as much of a selling point for a budgeted build that isn't intended to play at 4K high quality anyways.
Regarding the PS price: while yes, enough people buy the PS5 Pro at $700 and Sony probably ended up making more money from the PS5 Pro vs the PS4 Pro, this doesn't mean a PS6 can be successful at $700. Remember that Sony's business model, the traditional console business model, is to put the console in every household and then make money from the software. If they sell the PS6 at $700, sure, it probably will make more money from the hardware vs the PS5, but the ecosystem probably will be dead rather soon due to a small user base, and thus not attractive from a 3rd party pov.
Nah probably "B750 16GB" for $449-$499 this gen tho boosting profit margins on a "B770 24GB" for $549 on the extra VRAM would be smart. AMD is dropping us a 7900XT (7900XTX when UV-OCed) for $579 so I don't see Intel charging that much for something barely beating the RX 6800XT in raster and 7900 GRE in ray-tracing.
Hopefully PlayStation 6 will be fully backwards compatible with PlayStation 3, with full 8K Optical Blu-ray Drive support. Just like the original PlayStation 3, it would be good if Sony designed PlayStation 6 to be the best modern-day retro machine with expansion ports and add-ons. I think PlayStation 6 and the new handheld should be an updated modern version of PlayStation 3 with increased RAM, improved hardware graphics, artificial intelligence, and an AMD x86 processor implementation in FPGA. (Also PlayStation 3, or another processor architecture, could be implemented in FPGA at the same time.) One design for the PlayStation 6 architecture could be two or one physical Cell processors with three FPGAs to hold any other architecture. This way developers could target the Cell processor, an AMD x86 processor, or any other architecture depending on experience.
RDNA 4 gonna be goated. I can feel it. The mid range tides will turn. 5080 and 5090 will still sell, but the mid range is gonna run through AMD this year
You're delusional, something will disappoint, either the price or the performance... probably both lol. If the 9070 XT were actually within 5% of the 4080 (I doubt it) and they priced it at $500 it would be an awesome card. But the performance will probably be like the 7900 XT imo. The price needs to stay at $500 unless the performance is better than we think, and even then... at $650 that's a very easy pass. Better off getting a 4080 second hand when they drop in price in a few weeks.
@InternetListener well the ray tracing performance increase was definitely overhyped lol, the raster increase was only slightly behind where they said it would be. And the price increase wasn't worth the uplift.
If only Sony made money just selling consoles… but they make most of their money selling games, so the PS6 won't be $700 unless they want game sales to drop to half of where they are now. This success will likely just accelerate PS6 Pro plans.
Anyone hear anything about an update for the Ryzen 8700G? Its 780M GPU part is already thoroughly aged, and the Ryzen AI laptop CPUs have the newer 890M or upcoming 8060S. Basically the 8700G is a laptop CPU in a desktop package. So, any plans from AMD to put the newer gen with the 890M or even 8060S on desktop?
@@bakakafka4428 Nah, I saw a test with an 8700G against 8700G+RTX3050 6GB, the APU system was both slower and had higher power draw than CPU+GPU combined. If you go with a cheap CPU+used GPU (like i5 12400+RX6600) you can end up with a much better system for cheaper than an APU.
The only reason to keep a 3090 is if you bought it used. If you bought it new, you missed the best residual selling price, which ran from the 4090 launch until the launch of the 4070 Ti Super... Whatever it costs, just buy it and sell it before the 6090 comes out... You could've got a 4090 for free or even at a profit... It still holds above 1000€ at used stores to sell, and asking prices to buy it used go to 1450€... Only a maximum of 700€ of lost value for two years of use, or whatever difference you really would've got.
X86S is one of the many proposed extensions to that ISA over the next few years. An equally large change would be APX, which expands the GPR count from 16 to 32. Then there is AVX10, which furthers the SIMD capabilities, though hopefully the worst of AVX10 will also be killed. AMX is supposed to be enhanced further. Intel is probably tackling too much at once. Since X86S impacts backwards compatibility to a degree, shelving it made sense. I do think an X86S-like idea will resurface in a few years, as that ISA is in dire need of some simplification and clean up. In particular the boot-up process is very messy, which x86S aimed to fix. (AMD also has a commitment to open source firmware which throws a wrench into this and would need a rewrite to support x86S.)

Both Zen 5 and Arrow Lake got nerfed hard by Windows 11 upon their launch. Peeking over to the Linux side of things, both Zen 5 and Arrow Lake were much closer to their advertised IPC increases. Zen 6 being 10% higher IPC does seem feasible if you look at where the instruction regressions were between Zen 4 and Zen 5: there were common instructions that went from single-cycle to two-cycle execution. If Zen 6 restores them to single-cycle execution, that'll be a good performance jump right there. The bigger things for Zen 6 are going to be platform improvements with new packaging technologies and a new IO die. The Zen 6 X3D parts should really be something else.

Zen 5 Threadripper would be impressive with V-cache. However, if you look at AMD's Epyc lineup, there is a super expensive 16-core part with 512 MB of L3 cache, which means each die only has a single core enabled. This is the closest you can get to V-cache performance without going 3D. And yeah, a real V-cache part like that with 16 cores would have 1536 MB of L3 cache, which I don't see AMD formally releasing (a 32 or 64 core variant like that, perhaps). Threadripper will just follow in Epyc's footsteps. Following AMD's release cadence, we'll see V-cache Epyc chips in February (ISSCC) and hopefully new Threadripper a month later for GDC.

Bartlett Lake should launch to DIY if only to be the replacement part for the lineage of failed Raptor Lake cores. I think it'd be popular just on the merit of not requiring an entirely new platform. Motherboards are expensive, and if tariffs kick in as proclaimed, full system upgrades will be priced out of the reach of many. If it can clock the same as Raptor Lake, it'd end up outperforming Arrow Lake in many tasks.
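The cache math in that Epyc paragraph checks out. A quick sanity check, assuming Zen 4's public figures of 32 MB L3 per CCD and 64 MB per stacked V-cache die:

```python
# Sanity-checking the 16-core / 512 MB Epyc arithmetic from the comment above.
L3_PER_CCD_MB = 32        # Zen 4 CCD L3 (public figure)
VCACHE_PER_DIE_MB = 64    # per stacked V-cache die (public figure)

total_l3 = 512                      # the 16-core, 512 MB part mentioned above
ccds = total_l3 // L3_PER_CCD_MB    # -> 16 CCDs
cores_per_ccd = 16 // ccds          # -> 1 core enabled per CCD
with_vcache = ccds * (L3_PER_CCD_MB + VCACHE_PER_DIE_MB)  # -> 1536 MB

print(ccds, cores_per_ccd, with_vcache)  # 16, 1, 1536
```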
Qualcomm's next Elite single-core score may be so high that it's on par with the M4 or Ryzen 9000 CPUs, and when they launch desktop CPUs they may post a higher multicore score than the 9950X.
27:12 Liked how you used "perceived". T239 will come later & with more competition than the Tegra X1 in 2017. People are missing this. What will aid the Switch 2 in comparison to the original is that companies will support it from day zero. And development is stagnating; we still see "AAA" stuff being released for the X1/PS4 after all.
Genuine question - what aaa titles have launched on the ps4 / xbone in the last couple of years? From where I sit, modern aa games struggle on the series S, let alone last gen HW
@@kaeota Gran Turismo, God of War Ragnarok, Horizon: Forbidden West, Elden Ring, Hogwarts Legacy, RE4: Remake, Overwatch 2, Diablo IV, Yakuza: Infinite Wealth, Shadow of the Erdtree, Metaphor: ReFantazio, Black Ops 6, all the AAA sports franchises, etc. Now look at it this way: Shadow of the Erdtree, ReFantazio & Balatro (not AAA) are 3 of the most important releases this year, and all got a PS4 release.
Doesn't Nvidia already have some experience here with the Jetson line of products? ARM cores and GPU cores integrated with unified memory? And out of genuine curiosity, how is a CPU+GPU chip like that different from an APU?
I came up with a scenario where Nvidia would charge $249 for the 5060. ...So, Nvidia takes the already planned 5060 8gb card and turns it into a limited release to be sold directly from their store at $249. They do this about a month before AIB partners come out with their own 12gb variants that will have an msrp ~$299 to $339. Doing this would drum up excitement for the 5060, and cause some people to think Nvidia are being benevolent in their pricing. In reality it's just a glorified paper launch that's used to gauge interest in a future re-release of an 8gb 5060 that would mirror the 3060 8gb in respective marketing and pricing. ...I'm not promoting this. Just saying that it's something Nvidia *might* do. Thoughts...?
4K 30fps is a complete waste of time for the Switch 2. They need to be able to run all their games at 60fps, and if that means dropping down to 1440p to do so, then I think that would be the better option. The only trouble with that, though, is that some TVs only do 1080p and 4K; some TVs don't have 1440p in their EDID. If the Switch 2 were handheld-only then 1080p 60fps would be more than enough. But it's not. It can be docked with a modern 4K TV, so it NEEDS to be able to perform adequately at that resolution. Nintendo knows this.
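For what it's worth, the raw pixel math actually favors that trade. A rough first-order sketch that ignores CPU cost and fixed per-frame work, so it's a sanity check rather than a performance prediction:

```python
# Rough pixel-throughput comparison for the 4K30 vs 1440p60 argument above.
def pixels_per_second(w: int, h: int, fps: int) -> float:
    return w * h * fps

print(pixels_per_second(3840, 2160, 30) / 1e6)  # 4K30    -> ~248.8 Mpx/s
print(pixels_per_second(2560, 1440, 60) / 1e6)  # 1440p60 -> ~221.2 Mpx/s
# 1440p60 pushes slightly FEWER pixels per second than 4K30.
```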
Nintendo didn't want to launch the Switch 2 into the mess of the post covid global supply chain. Now they can launch it at the same price of the Switch or with a slight increase like $50. They want people to be able to buy their hardware.
I'm relieved to hear a more realistic guess for the 5090. $2200 would be too much, but I wouldn't be surprised; there's a reason why Nvidia artificially kept 4090 prices high... and it's not to sell you a 5090 for $1200.
Currently £40 price cut on ps5 pro in a few of the popular stores in the UK. I think only particular regions are selling well. Europe seems to not be onboard with £700
Better, more nuanced DDR5 memory? Interesting, can't wait for some data showing how much better; I splurged on a 64GB kit at 6000 CAS 30 when building my newest system. I so hope SteamOS replaces Windows! Already adding Linux Mint Cinnamon to my newest build, and will add a Win10 install on a second drive til I'm more comfortable in Linux.
You could get that now from either an Intel or AMD workstation. No reason to wait. The 16-core Threadripper Pro might be OEM-only, but I recall Intel goes down to 12 or 16 cores for DIY.
Tortoise vs Hare. Nvidia had more monitors with G-Sync, and then AMD smashed them with open-source licensing for FreeSync; now all monitors are FreeSync, which Nvidia had to bow down and support in their drivers.
Apple is reportedly indicating TSM N2 is not yielding well enough for the 2025 iPhone business. Intel reportedly has an integrated Celestial GPU coming on 18A Panther Lake with 12 cores (vs 8 on Lunar Lake's Battlemage).
This implies a year slip for TSM N2. Apple generally gobbles up all the first year capacity for new process nodes at TSM, so AMD might be looking at 2027 for TSM N2 capacity.
Regarding there being more competition in the console market because of SteamOS: I don't agree with this, because we are talking about off-the-shelf components, which are much more expensive compared to what Sony pays and sells at. Even if the PS6 is $699, a SteamOS machine with that performance would cost $1200+ for the first few years.
Yes they did, I know...and yet as usual they did not capitalize on it to take gaming market share...Nvidia did first. Same thing with Freesync btw - it wasn't specific to AMD, but that tech was around for years before NVIDIA did something with it for gamers...
I think Nintendo is waiting for Jensen to present the console himself; he will hold it, call it "the greatest handheld ever created blah blah blah" and make Nintendo print money. At the end of the day, the Nvidia team is the one that created this "unknown" chip.
AMD lacks both software support (ROCm is not supported on Windows) and dedicated hardware (tensor cores), so, as an ML engineer, I don't see a way they can compete with NVIDIA on deep learning features, at least not on Windows. The PS5 Pro can make PSSR work because it runs on a Unix-based OS, where ROCm is supported. Intel on the other hand is gaining massive ground on both software and hardware support, so, given the current state, it is more probable for Intel to compete with NVIDIA than AMD. Intel is not 100% there yet, but they are definitely on the right path, especially when it comes to machine learning.
What's interesting is that the NPU (XDNA?) that shipped with the 8845 processors is only supported on Windows at the moment. I'm a Linux user who was looking to experiment with its performance on some custom models.
@@Thesaltymaker You can technically run models on AMD hardware (including the NPU) on Windows, but currently only with DirectX 12 as a provider (DirectML) through ONNX. This works, but it is slower than running a model bare-metal, as ROCm would allow.
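For anyone wanting to try that path, a minimal sketch of DirectML inference through ONNX Runtime on Windows (requires the onnxruntime-directml package; the model path and input shape below are placeholders, not anything specific from this thread):

```python
# Minimal sketch: ONNX Runtime with the DirectML execution provider.
# pip install onnxruntime-directml numpy
import numpy as np
import onnxruntime as ort

session = ort.InferenceSession(
    "model.onnx",  # placeholder path to your exported ONNX model
    providers=["DmlExecutionProvider", "CPUExecutionProvider"],  # DirectML first, CPU fallback
)

input_name = session.get_inputs()[0].name
dummy = np.random.rand(1, 3, 224, 224).astype(np.float32)  # assumes an image-style input
outputs = session.run(None, {input_name: dummy})
print(outputs[0].shape)
```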
Hey Tom and MLID team, do you guys have any info to support that AMD will change the name to 9070 XT? I ask because I would rather it be 8800 XT, but I fear that their marketing team is digging past the bottom of the barrel.
For me it doesn't really matter how much AI stuff Nvidia (or AMD for that matter) tries to shove into their cards if, at the end of the day, the price-to-performance and experience we get is worse than it would've been with just actual good generational improvements and the price-to-performance ratios we had in the past, instead of underpowered and overpriced cards with a lot of bells and whistles.
I just want Nvidia's GPUs to be at MSRP. It's crazy that from the 4090 launch until now, no one sold it at MSRP, or even around it. AIB cards are now like $700 above MSRP, it's going crazy. Hopefully it won't be this insane with the 50 series.
Pretty much Nvidia dropping what I would have guessed. A slightly improved architecture with higher power draw and GDDR7 for the same price. Probably some new features, or updated ones that may or may not be backwards compatible with older RTX cards. Outside of the 5080/5090 I don't expect vast improvements outside of RT.
It would be different if it were 80% as die-efficient as Nvidia, but it's more like 50%. Sure, Nvidia makes a massive profit on the 4070 Ti, but even a 60% profit margin for Nvidia means the B580 costs more to make than it is sold for: $250 vs $600+. Good for non-gaming though.
Examples of how the audience continues to get the console market wrong, whether it is the Switch's massive success, the PS5 Pro's massive success, or the higher-than-expected demand for the PS Portal:
MexZaibatsu - "give it 4 months when enough devs have decided to optimize their games for it & maybe price drops kick in when the Pro undersells due to its MSRP"
just reality - "ps5 pro is gimmicky bs, massively overpriced and will hurt Sony when more and more people are disgruntled by it"
All expectedly wrong. Turns out the successful gaming companies, while they do make obvious mistakes at times, sometimes painful ones, get a lot right too.
I don't see the PS6 at $700; maybe $500 with no disc drive (a 25% increase gen-on-gen). Maybe they release the PS6 and PS6 Pro at once, the PS6 at a loss but making money on the Pro 🤔 (the Pro could be overpriced)
There is no PS5 with a disc drive sold anymore (and the disc drive is crazy expensive because of tiny supply), so I do not expect the PS6 to have one either. You will be lucky if there is even an option for a disc drive on the PS6. I fully expect Sony will try to force the "digital only" ecosystem for next gen. But maybe I'm too cynical. BTW, PS5 + disc drive costs 600 EUR here, 900 EUR for PS5 Pro + disc drive.
[SPON: Use "brokensilicon" at CDKeyOffer's Christmas Sale to get Win 11 Pro for $23: www.cdkeyoffer.com/cko/Moore11 ]
[SPON: Save BIG on the MinisForum BD795i & BD790i SE Motherboards: shrsl.com/4sgrf ]
Lies of Phil
Don't think RDNA 5 is coming out any sooner than June 2026
I love how this pinned comment is a week old but the video feels so fresh 🤣🤣🤣 Anyway, two quick questions about Zen 6: is it going to require a new socket for mainstream desktop, and will it be capable of running CUDIMMs or whatever is new by then? These are kind of the same question I suppose, and they determine whether I get into AM5 now or not. Keep up the excellent content, and try to keep the stress levels lower in 2025.
Let's face it, there is never going to be a GPU price correction, because real competition simply doesn't exist. AMD and Nvidia literally work together in secret to price gouge. AMD is not going to fix it, nor is Nvidia. Profit rules all.
I have been following MLID's RDNA 4 coverage for ages and the arc has moved from "This is going to be a great midrange card " to "LOL AMD is gonna screw this up again."
That's just the standard Radeon cycle for the last decade. Somehow they always manage to screw it up (again) and then aggressively have to cut prices not long after launch until the pricing is closer to what people want to pay for it.
As soon as everyone saw the name change to 9070 XT, and how it's between the 7900 GRE and XT, we knew it was gonna be a disappointment
@@bb5307 The worst part is sometimes they don't even bother lowering prices, 7700XT was a pointless card day 1 and it hasn't moved down one cent ever since.
I'm assuming they want to completely run out of 6700/6750 XTs before they do anything with it, but who knows, they might just let it die in a corner, completely obsoleted by RDNA4 Navi 44 instead.
@@bryan_350_5 the performance leaks are contradictory to one another; a more recent leak says that it's within 5 percent of a 4080, so atm nothing is reliable
@@CalgarGTX Some cards just exist in limited numbers just to make another card look more enticing. 7700xt to 7800xt and 7900xt to 7900xtx come to mind.
Sony buying Kadokawa is not about FromSoftware and gaming at all. It's about thousands of their mostly anime-related IPs across anime, novels, manga, games, and so on. Kadokawa owns an absolutely giant share of anime IPs, and Sony owns a dominant share in its international distribution (as well as anime and anime music). From Software is just something gamers know, but it's tiny in comparison to the real value of Kadokawa.
Yup this is correct
Man every time they make the sony acquisition of Kadokawa about fromSoftware I die a little on the inside
I thought about this, too. FromSoft is just a minor bonus. Kadokawa is all over anime.
Doesn't Sony also own Crunchyroll?
@@jeremymerry7967 yep. Kinda brings up that "m" word, dunnit?
Upcoming gen GPUs from both companies will be overpriced...
Both have 0 motivation to reduce prices.
Beautifully stated 💯
Yes sir, it's pretty much e-waste; two years from now the 6000 series will be here. It's all A.I. blah blah right now.
I really think that if AMD is serious about their intention to gain market share with RDNA4, like Jack Huynh said in the Tom's Hardware interview, they will price RDNA4 aggressively. Especially if they don't want to lose market share to Intel.
@@justhomas83 2027 RTX 6090 will be $4000 😂
@@DailyThingsInLife Might be. I'll have the money by then if I decide to go through with it. I have an RTX 4090 so I'm going to chill until then.
One motivation: B580 12GB for $250 with LL, FG, XeSS2 💙💙💙💯🤩🤣 It will have 90 FPS @4K (RT, LL, FG, XeSS2 on) in Sony's best PS5 game, the upcoming Spider-Man 2 PC port on Feb 1st 2025, and that is all that counts 😁😄
I want to see SteamOS become a full Windows replacement for desktop PCs.
No.
Bazzite and Pop! OS hybrid would be siiiic 🤘
@@Hitthegasmedia Windows is really awful. I only have it on my stationary gaming PC. It would be nice not to be forced to pay Microsoft for the OS just to play games.
Linux needs to get its act together in general
only if you game and nothing else
PS5 Pro sales have no bearing on PS6 pricing. You guys seem to be glossing over the fact that the end goal is to sell software and services. Sony is not going to want to sell fewer overall consoles with higher revenue if that impacts the total revenue that could be generated with a larger userbase. If anything, I think the Pro's success suggests that it would behoove Sony to launch with two SKUs, and have the more powerful model somewhat subsidize the price of the base model.
I agree the Pro won't affect PS6 pricing. Just because 1/5 of the user base is willing to pay $700 for now doesn't mean that a majority of the market would be willing to do it 3-4 years from now. They are gonna want quicker adoption when next gen comes; they've seen success with the PS5 at around $500 and would probably try to keep it between $500 and $600, because the vast majority will be shopping in that price range, whether they have competition from Microsoft or not. I still don't know if they want to do the two-SKU approach, but that might be the play with the handheld.
I don't believe it. It's good at launch, but this way is better. The PS5 came, and 3 years later we were looking at each other like "yeah, what now? This generation is nothing of what it promised." Then the PS5 Pro came out to refresh things and make a new push with new tech on a better manufacturing process. It's way more viable than if they had launched a normal PS5 (RX 6700 non-XT) and a PS5 Pro with an RX 6800 XT-class GPU at day 1.
GPUs went up 2x+ in price for no real reason (mining and crypto bubbles + Covid don't equal actual value of a GPU). Why won't consoles do the same?
People are trained to play games and consume media content. They are now held hostage. Why won't Sony try to exploit it? They must have their shares grow.
@@initialfd-3716 it's not $700, it's 800 euros, while the base PS5 cost 400 euros on release.
@@stanislavkimov2779 I'm in the US and it is $700 here. As far as why Sony won't charge that much: it's because they make their money on software sales, and they need mass adoption of the consoles to make more money. They probably won't take a loss on them anymore, but they won't overcharge for the hardware and push customers to other platforms for no reason when they don't have to. If they can make it for $500-600 and break even on it, then that's exactly what they will do. The Pro console is different; it's for consumers willing to spend more, so they can price it at whatever the 1/5 of PlayStation players are willing to spend.
They couldn't sell a 4080 for 1200 so I don't see how they can sell a 5080 for more than that.
I've gone from skipping 1 gen for the first time in my life to possibly skipping 2 gens. Or I can't believe I'm saying this, maybe buying an AMD card
@@HeliumFreak exactly. Everyone seems to be convinced the 5080 will be $1600 or something.. performance might be there but the vram is not enough. Consumers that are willing to pay $2k+ for a gaming PC are also in the position to know how bad 16gb for over $1200 really is. Money is tight today even for people with plenty of money. Everyone has less than they used to so we are doing research. At the same price, a 4090 would be a better buy unless all you want is to run path tracing in cyberpunk in 2024.. new games will be hampered by 16gb Guaranteed.
@@HeliumFreak hopefully the new AMD GPU is close to a 4080 in raster and Ray tracing with FSR4 being at least halfway between dlss and fsr3. 🤞
I skipped two gens and got an AMD 7900 XT. It worked great but had two flaws. Driver issues for VR (which now seem fixed, but I didn't want to wait) and my stupid mistake of buying the largest sized card out there because Asus gave a rebate. 😅 Of course then I wanted to go mini ITX... I sold it and got a discounted 4080s so I could just enjoy VR without hassles. I think I can sit out a few gens now if things keep going like this. If I need another system, I see no issues in getting another AMD card.
These influencers make it sound like AMD cards are "worse", but honestly the 7900XTX has been faster than the 4080 since day one and it's still faster than the 4080 Super. The card is stable, the drivers are okay, and Adrenalin is better than the shitty Nvidia panel.
RT is the big elephant in the room, because RT requires upscaling (and FSR is worse than DLSS) and because Path Tracing is an Nvidia-only experiment so far, so how much you care about that is the needle mover.
I personally sold my 3080Ti out of frustration with how much I had to rely on DLSS... the 7900XTX removed that need for me (I play at 3440x1440) and I've been happy to not use upscaling at all, except in a few games and in Quality mode. When SteamOS comes out I will be ready to drop Windows for a better experience that doesn't hamper my CPU or GPU... with Nvidia I couldn't. I change my hardware every 4-5 years, so I took a bet prematurely swapping my 3080Ti for a 7900XTX; so far it is paying off well. Be wary of the amount of unwarranted praise that Nvidia and Intel are getting despite their faux pas (which for some reason are never as bad as AMD's, funny innit?).
@@or1on89 AMD's RDNA3 perf is trash in comparison though. Daniel Owen did a big 4080 vs 7900XTX comparison... raster was up to 7% favouring the 7900XTX, but normalising across the larger set of games tested brought that to an even more negligible difference... BUT RT was up to 50% faster on Nvidia. Paying $1000 for a GPU with trash RT is not great. RDNA4 looks way, way better... but AMD doesn't appear to have a 7900XTX-raster-perf card in the RDNA4 lineup. IF AMD manages 7900XT raster and 4070 Ti RT perf for $600-$650, it'll probably sell pretty well. But 7900XTX raster plus 35% to 40% better RT perf at that price? That would take market share.
I can't wait to see how much Nvidia and AMD try to over-charge us for their new generation of cards!
AMD has to stop the MSRP bullshit. Nvidia can get away with it because they have objectively better products than AMD's competition at the same price. AMD can't get away with undercutting Nvidia by a few percent; they're shooting themselves in the foot 🤦♂️
lucky you only have to wait a few more weeks
@@themarketgardener They're not better products when they only last 2 years each generation.
If the cards move product, they aren't overpriced, you're just poor.
@@terpfen if Nvidia started at $2000 for 5060 they would still 'move products' just not that many.
Tom made a new video and took more shots at the B580. This is becoming his signature trademark.
LMAO. Remember this is prerecorded more than a week ago. Can't wait for his indirect response to HUBs latest video. Gonna be another deranged Intel ARC take.
They called him full of it without mentioning him by name. His response is sure to be entertaining.
@@pctechloonie the funniest part is that I think HUB were "friends" of MLID, or at least they respected him as a content creator and were willing to make videos with him.
I think they still are to some degree, but they have acknowledged that he was wrong about Arc. It's OK tho, everyone can be wrong, it's human
@lucazani2730 sssh. Don't tell Tom that. He needs to stay in character. Can't allow him to break with the role.
@@richardyao9012 Probably something like bla bla bla praising ARC is corporate cheerleading, bla bla bla real adult analysis only from me bla bla bla Tom Peterson is a liar bla bla bla My sources are more reliable bla bla bla
LMAO!!!
If you want comedy watch his video from early 2022 where he praises the 6500XT, announces the death of sub 200$ GPUs and says people on RX 580s and 970 better sell their GPUs and get the 6500XT for 250-270. Also uses BOM clown math to prove the 6500XT is not making AMD enough money.
44:33 I doubt Blackwell will actually be better at RT relative to raster performance; they have been sneakily stagnant on this, and Adored TV did a breakdown of it. They seem to be incapable of actually doing this because the regular non-RT hardware on the card is still a hard bottleneck to RT performance somewhere. They straight up can't reduce the impact of RT in games right now and haven't for 6 years. All they can do is make the cards faster overall and improve the RT hardware to keep up with that performance uplift.
Do you know what that video was called? I'd like to watch it.
@@Vagabundo74 I think it was "Investigating Nvidia's Raytracing Performance" from November 2022
@@adeptuspotatocus6451 That's exactly the one.
Sure they can: it's called decoupling the RT cores from the shaders' ROPs and giving the RT cores their own ROPs. But hey, that would mean Nvidia couldn't use its software like DLSS to look superior. Right now rasterization is the bottleneck limiting RT performance, since the RT cores and rasterization rendering all run through the same ROPs.
You should be more doubtful about RDNA 4 raytracing performance; it is lagging behind even compared to Intel. Blackwell raytracing should be about as much as people claim, like a 20 percent increment. Or they could blow everyone away with the neural rendering techniques they've been working on for the past 4 years. They might be able to bring path tracing to a 60-class GPU with neural radiance caching, and might even surprise us with better VRAM utilization through neural texture compression (Nvidia cards are already efficient with VRAM utilization compared to AMD and Intel; who knows what they have up their sleeve, leather jacket man is a sneaky guy).
If names like RX 9070 are true, then AMD has already blundered RDNA4.
RDNA 4 really looks, by the leaks, to be the worst RDNA generation. If the 9070 really performs worse than a 7900 GRE it will be a card worse than Vega 56 (in terms of overall perf, price, and power consumption at the time of release), and close to COVID scalper-era price/perf.
A card which costs as much as a 7800 XT and performs like a 7800 XT, which by itself was already kind of bad, 3 years later. (Before you judge me: the 7800 XT is objectively bad, it's just that other Radeon and GeForce cards at the moment are so much worse that the 7800 XT looks fine by comparison. But the 7800 XT is still a terrible card. This whole generation has been one of the worst GPU generations in history.)
I wanted an 8800 XT that was 7900 XT perf with 4070 Ti RT, not a 9070 XT regardless of the perf, but it seems like a flop on that front too now.
@lucazani2730 No they don't. You're an idiot if you base your performance expectations on early leaked benchmarks of an unspecified SKU, at an unspecified power limit, with unspecified drivers.
GTA 6 will sell so many Playstations Sony just had to test how high they can push the price. Looks like they were successful.
Consumers have no power anymore. Like, what do you mean the price is going up 5 years after release...
GTA6 will be sanitized and woke, having lost many old Rockstar devs. I have a PS5 Pro, and am neutral on its release.
Maybe many normies (who buy MTs in Fifa, Cod and Fortnite) will buy it. But for me GTA6 is really not as big as it could have been.
Western AAA is dead and crashing now.
@@nabieladrian I don't buy games for more than 40$ with rare exceptions (like Fromsoft releases). Publishers can set whatever prices they want. It's up to people if they want to buy it. Wait for discounts, or just pirate it completely. Or abandon AAAs, and play AA/indie or old games. I buy mostly Japanese games now anyways.
GTA6 will be a flop compared to 5
7:03 Sony wants Kadokawa because of the anime, light novel, and manga IP; FromSoftware and the other games are just a bonus if Sony goes for a total buyout.
Sony currently monopolizes the Western anime market with Crunchyroll (Funimation & AnimeLab) and with Aniplex.
Which is actually really smart of them, their only problem is not getting out of the way
I hate Crunchyroll
@@kevinerbs2778 Yep, it is bad, but since I'm in the SEA region I can use other platforms like YouTube (Muse Asia, Ani-One/Medialink) and bilibili/bstation; it's cheap or even free.
Happy New Year gentlemen, I look forward to your content in '25.
An interesting and engaging episode. Hoping for an interesting and eventful CES.
I would not be surprised if the CEOs of AMD and Nvidia were related or something, the way they fix the market together.
They are distant cousins
They are.
Yeah,about that...
They are cousins
I have assumed for years that AMD and Nvidia work together to keep the total market as high as possible instead of competing on price with each other. You KNOW this happens. If you don't think so you probably also think the president runs the country. Things are never as they seem.
Zen 5 is already so fast now, it's being held back by slow memory access, hence why it benefits so much from the extra 3D cache.
So if AMD fixes the memory controller issues with Zen 6, it will be a beast! And I predict they will (even if it means a delay), simply because they have to if they want to make 12-core CCDs happen, or else it's going to be bottlenecked on memory.
Yeah, I'm looking forward to Zen 6c or Zen 7 for my new build.
Tom loves to hate on the B580. He says it is going to be a paper launch. I am getting my B580 from my B&H order today, January 2nd. I didn't really have to jump through hoops to get it.
What I don't get is why he gets very excited about Qualcomm making GPUs. I mess around with some Chinese domestic market GPUs, the Innosilicon Fantasy II and Zhaoxin C960. While they are very solid for desktop use, they still very much struggle in games. The Innosilicon Fantasy II is PowerVR based, and that technology has been around for quite a while. The MTT S80 is an excellent example of an on-paper good card with drivers that struggle in modern gaming.
Also, I feel, the Nvidia ARM APU will probably work best on Linux. Microsoft couldn't be bothered to help Qualcomm get Windows on ARM to function properly.
TLDR: Qualcomm GPUs will be meaningless to the gaming market. Games are so specifically tuned to GPUs nowadays with game day drivers.
He gets very gloomy on Xbox too, which is weird imo, because its revenue has been growing big time YoY since like 2016. I haven't adjusted for inflation, but even adjusted they are still growing. They did lose a lot on hardware though, but I've heard rumors of a handheld with Series S tech. I really don't understand this SEGA theory; Microsoft has money to burn.
My 2025 fever dream: Nvidia's upcoming APU powering Shield 2 running Steam OS.
Do you guys think the Switch 2 will have a fully backwards-compatible library?
Lots of speculation around the lore of Strix Halo, but I'm sure Strix Halo: Infinite will clear everything up
I would rather a higher spec PS6 for $699 than a corner cutting lower spec console for $499.
This is why I think MS should make a $999 'Xbox Surface' which can dual-boot into Xbox or into Windows to be used like a normal PC, with the Xbox part securely separated on a different drive or whatever.
I think that a better solution is to create XboxOS, basically a stripped down version of Windows, that can run most Windows apps.
@@Mervinion Maybe. I was just thinking that from a game-security POV it might be better to keep the two separate. Plus I am not sure how much of the stuff needed to ensure Windows works well would still slow down a console OS.
@@Speak_Out_and_Remove_All_Doubt As I understand it games on Xbox run practically in virtual machines. Security should not be much of an issue.
XboxOS is a WSL based OS iirc, so it basically is a pared down Windows os. What you're talking about is possible with just drivers for the hardware.
Fwiw, Surface hardware is absolute garbage; at $1000 what you're getting is basically a Series S anyway. MS hardware needs a good, hard look.
@kaeota I was just using the Surface name to be amusing, but also didn't I hear that the Xbox Team was taken over by the Surface Team?
The thing is, if you make it a Windows PC then people would just buy the game from Steam or Epic so Xbox still wouldn't get game sales.
AMD already messed up the naming, and removed the RX 7900 GRE from the market, which would have been a close competitor to the RX 9070 XT, so as to not leave us with another option. So it's a safe bet AMD will f*ck up the pricing at least.
Why do you act like you already know what the performance of the 9070 XT is? All you have to go off is a rumour, yet you're saying "it's a safe bet"? Brain-dead activities.
It was already disproven by a recent leak claiming it's within 5% of an RTX 4080, not at 7900 GRE performance.
@Will-zi6fn ... it is a safe bet for the performance to be around the RX 7900 GRE, since the preliminary tests published, the official announcement where they cut down the performance targets from where AMD originally wanted to go, the fact that AMD pulled the RX 7900 GRE from the market close to the launch, and the leaks reported by MLID all point in that direction.
@@Javier64691 That probably holds true for the raytracing performance, since it's said to have improved by 40%+, yes. Also, FSR 4.0 will make a significant difference as well. As for raw performance... well, let's just hope it indeed sits at 4080 level and the GRE was pulled only so there'd be no competition in regards to the price.
You're right, but I don't really agree with the argument that the 7900 GRE was purposefully removed; very likely it was simply supply constraints. It used cut-down dies, and most likely AMD just ran out of enough of them. Either way, if all the rumors are true the launch might be bad, but the cards are still going to be compelling once the prices drop, as they always do.
"Perfect blue" caused some issues, because sony didn't use this colour to train ML.
Thank you for some of the best tech information on YouTube. Wish you all the best in 2025.
The same day this video came out, the 5700X3D went on sale for $179. So they were really close and on point.
I wonder if the reason we didn't get UDNA sooner was actually because Sony (and Microsoft) were such big GPU clients.
It has to be easier to optimize the silicon for gaming specifically, as opposed to dual-use gaming/datacenter.
Consoles could have been an excuse to postpone this merger (or maybe were the catalyst for splitting RDNA and CDNA to begin with).
RDNA is at dead end, MCM is a failure. AMD didn't have a choice.
Actually, UDNA was supposed to come out later than originally planned. However, the timeline was accelerated due to competition, and what I believe was Microsoft needing a new GPU architecture for their next console, which, surprise surprise, is coming out in 2026, around the same time UDNA is supposed to debut.
@@ZeroZingo MCM isn't a failure at all, or else CDNA wouldn't exist, which is all MCM and a lot more than just cache chiplets. UDNA is bringing parts of CDNA, GCN & RDNA all into one, which is why it's called "unified". RDNA4 is a 6-to-8-month stopgap; there is no reason to actually release it other than having something out there for a short time.
I think FSR 4 needs to be good enough, but it's adoption that matters! BTW: I don't care about RT for lighting at all. I like the RT shadows but I like my 240hz monitor even more :D
FSR4 does not need to beat DLSS, it just needs to noticeably shrink the gap in quality. Most people think FSR at 1080p is unusable, and this is specifically where FSR needs the most help. It could be huge for AMD, but it is also so late.
I am more interested in having all reflection artifacts removed. I hate screen space artifacts! 🤮
@@christophermullins7163 They should stop being poor 😅
We may not care about things like ray reconstruction right now since ray tracing is still stupidly expensive but in the future technologies like that will be very important as ray tracing becomes more prominent. Without it a lot of ray tracing features introduce a ridiculous amount of noise and that can make ray tracing less attractive than rasterization in the worst case.
@@TheGuruStud AMD should stop being poor?
@@christophermullins7163 Upscaling at 1080p, period, looks terrible with both DLSS and FSR.
For the competition wrap-up, manufacturing competition is the concern. Having enough designs isn't a problem.
I disagree with PSSR being the best "FSR" version; imo it's still FSR1 when you can run above 100 fps. Above that threshold even driver-based FSR1, aka RSR, can look native.
Granite Rapids D is adding AMX complex operations. Diamond Rapids is adding FP8 and TF32 data type support, in-flight data type conversions, and transpositions. The per-core AMX acceleration is becoming full featured. Granite Rapids D is also integrating dual on-chip 100GbE.
On consumer chips, Intel is perpetually out in front on IO such as WiFi 7, TB5, PCIe 5, and CUDIMM. Now Intel has apparently reduced the cost of 3D advanced packaging so that they can use it on consumer chips. Intel 18A, backside power delivery, and gate-all-around are on the roadmap for Panther Lake in 2025, along with a new Celestial GPU and a next-gen NPU.
Thanks for all those interesting gaming tech chats! I'll keep following you guys in 2025! Have a happy new year!
Regarding Switch 2, I believe the CPU will not be any real limit. The ARM A78C cores are supposed to be quite similar in performance to Zen 2, and I know that since it's a handheld it'll be less powerful than consoles like the Xbox Series or PS5, but it will definitely be better than the Steam Deck's. But here is something interesting: the chip is supposed to have a dedicated file-decompression block, like the PS5 I guess, a feature that can offload stress from the CPU compared to the Xbox Series or Steam Deck. Last but not least, the Steam Deck's CPU is x86, so honestly, there is no way for an x86 architecture to run on battery at performance similar to an ARM CPU.
Thanks for bringing some info on the table. Very interesting about that decompression block. Then there's also work graphs and other next gen functionality that'll massively reduce CPU load.
There's plenty of levers for Nintendo to pull if they want to lower CPU use.
I'm surprised that AMD doesn't make an X3D CPU for the PlayStation 😊
Sony probably didn't ask for it. Maybe the PS6 could have a 3D cache shared between CPU and GPU.
@ElPibePlay1000 ya just wonder how it would do.
Because they don't use dedicated CPUs
You are surprised only because you don't know how it works and why it's not needed.
SoIC-X/Hybrid Bonding is a very expensive and slow process, no way TSMC can produce that with console volumes.
I could easily see Microsoft raise the prices of Game Pass and also add an ad-supported tier. And I just think, because of how COD is performing, that they will drop day-and-date.
If Microsoft can get a lot of people to play just COD for about 6 to 8 months then it's a winner, but if people drop off after a month or two they may lose millions versus selling the game outright, since even if people drop off after a month or two they'd still have made $60+ per user. I know COD was meant more as a gateway to Game Pass, to raise the user count so people keep paying for it and don't drop Game Pass overall.
@ZaberfangX you do know they lost half of their player base in Dec?
@@HeathStevens COD or Game Pass? I know Microsoft wants to make Game Pass like Netflix, with people very dependent on them for games; they want to be like Valve: make a lot of profit and do little work.
Whether the 5090 is priced at $2,500 or $2,000, it is not worth paying a premium unless you have the financial flexibility (burn money) to afford such high prices.
Ho ho for the New Year! 🍺🙂
My question is: will we get a desktop APU on Zen 5, or will we need to wait for Zen 6 for them to release an APU for desktop again?
I wonder if Nintendo and Nvidia could employ a smartshift type of technology for the Switch 2? This would mainly be for handheld mode where cpu limits would hit hardest.
If the dlss upscaled image can get to a good resolve on the 1080p screen with even a little overhead being left for cpu it could make the difference in getting the target fps.
If Nintendo is that worried about power consumption they could even integrate this feature into a perf mode.
Unlikely but who knows what Nintendo will want to do.
FSR is good if you're in Quality mode going from 1440p to 4K or 1080p to 1440p. Otherwise, it falls apart pretty quickly when you go from lower resolutions. DLSS handles this better, along with less fizzle & noise. But hopefully the AI in FSR 4 helps out a lot.
The day the classic Wintel platform successfully moves to RISC in such a way that we can run old stuff over a compatibility layer with only minor downsides will be a joyous day. So I do not understand why Nvidia would want to build a CISC chip. We have been DESPERATELY trying to move away from this for nearly 30 years. Microsoft has been trying these past 15 years, and they seem close. There is an enthusiast community with compatibility lists that echo what the ProtonDB people are doing. If I were Nvidia I would be working with MS on something similar to what the Apple folks did. MS is nearly there.
44:03 When the 'Neural Rendering' leaks came out, I looked it up. Nvidia has a number of papers about it, and Inno3d did a talk in '23 about 'The Future of Neural Rendering'. What they showed running then truly blew my mind. This is a legitimate paradigm shift in rendering that allows for materials of arbitrary complexity to be fully path-traced at extremely high framerates, *without* frame generation. They showed it running on a 4090. I highly encourage you to look into it- I haven't been this excited about a new tech since programmable shaders first came about!
So it’ll be supported on 40 series?
100% agree. Just saw it, what a great highlight and explanation of the most profound papers. I think it was by Marco Salvi at I3D Symposium and not Inno3D.
This tech has been blowing my mind for a while. I recommend Two Minute Papers for paper highlights if you're interested. The stuff on AI physics simulation there is just mind boggling. The rate of progress in neural rendering has been truly amazing. NeRFs and the AI filters for older games are even more crazy than what was shown here even if they are in their infancy and easily +5 years away.
This is clearly early days for the technology, but I just can't wait to see where it'll be 5 years from now. Remember where DLSS and RT were 5 years ago? Look at them now. Wow! Because AI on another level is involved here, I suspect the progress will be far more rapid. No wonder Cerny, like NVIDIA, is doubling down on RT and ML for the PS6.
Oh, and for sure we'll be getting a wow demo at CES where they'll go all out. This stuff is from 2023, and I'm sure they have a lot cooking behind the scenes. Getting asynchronous reprojection working without its flaws (NVIDIA can pull this off) + Neural Crisp (superior TAA without all the drawbacks of TAA; this one will be tricky) + frame extrapolation should convince anyone that the future is neural rendering. Imagine running a game with a 3x frame multiplier (on top of upscaling) with input decoupled from framerate (ultrafast and low latency) + a sharp, antialiased image. This is not going to happen right away, but I think it'll happen a lot sooner than most people think.
@@mikeramos91 It'll be software-locked to the 50 series just like framegen. It's Nvidia... you know this.
@@christophermullins7163 The only reason they did that is because it already runs badly on the 40 series. The speedup is nowhere near the 100% you would expect; framegen is ten times more demanding than DLSS upscaling. Now imagine how it runs on a comparable last-gen card which has a 2-3x weaker optical flow accelerator + Tensor cores. It'll run like SHIT. I doubt the games would even scale on the 20 series. 100% bet they'll open it up to earlier gens like they did with ray tracing, because people kept complaining, and people will conclude it runs like shit and not use it.
There's no reason why NVIDIA would lock any of this stuff behind 50 series when it's already running on 40 and 30 series (the cards the AI researchers used). When NVIDIA locks something to a newer tier it's usually because the old tier just can't run it.
@@pctechloonie You would be a terrible CEO. You have no clue, really.
Regarding Bartlett Lake, ...remember the Haswell-refresh 4790K CPU. It was late but a FANTASTIC chip. Many people are still on it today. I hope we will see an LGA 1700 repeat here!!!
With all the talk of pie at one point, I thought it was turning into an Attitude Era The Rock promo 😂
What I want to know is, will the 5080 beat a 4090? Gen on Gen, it *should* but, I donno about this time around...
We know the 5080 specs, so if they clock it equal to a 4090, there's no way it's faster. But if the frequency is high enough and the GDDR7 increases bandwidth and data transfer speeds, there's a chance.
But I doubt it will be much better than the 4090. Probably we'll see a 4070 vs 3080 situation, where they are practically tied, but technologies like RT, FG and VRAM make the difference in some scenarios.
@@Just_An_Ignacio the 5080 will have less memory bandwidth than the 4090
"Of course it will be powerful, its Nvidia"
That is what everyone said about intel until it wasn't the case anymore. So hopefully nvidia takes an L sooner rather than later.
Regarding DLSS vs FSR, I think it has reached the point where it's not truly about how wide the peak gap is, but rather how many people are satisfied with the common baseline, whatever that may be. And I think FSR has come very close to reaching that baseline where a lot of people may start thinking "yeah sure, DLSS is theoretically better, but FSR is already good enough". Same way people know a 240hz monitor is technically better than a 144hz monitor, but it's a neat bonus at that point, rather than a main selling point. Most people are already satisfied with 144hz as it is. It isn't really as contested a metric anymore compared to the early days, where each new iteration of DLSS and FSR was a major leap in both improvement and the mindshare associated with that improvement.
I think it is more evident than ever that Nvidia needs to keep stacking NEW features onto DLSS, rather than just improving it overall. Improvement is important, but new selling points start to mean increasingly more to keep that "good enough" floor higher than what AMD can easily reach. Neural processing could be something to help this, where potentially you start seeing the "Performance" DLSS mode getting better and better quality without dropping the fps benefits. That would really push hard on the key metric that FSR can sort of hang onto right now. At the top end DLSS is of course superior, and people who care about the top end will care more about general improvement over new novel features. It's at the base floor where the real fight will happen.
At least that's my presumption until we see what each company unveils with their upcoming GPU launches.
I play at 4K and FSR is the reason I'm switching back to Nvidia. FSR looks horrible even at 4K Quality.
@@Dempig Which exactly proves my point, the vast majority of those who will see the high-end benefits of DLSS are likely demanding more from the baseline to begin with, such as upscaling to 4K. The overwhelming majority of people play at 1440p and 1080p, and will not upscale beyond that point, so their demand by nature is 4x lower than yours, so the peak difference isn't as big of a factor to them, as it is to you.
I totally share your experience. I play at 3K ultrawide with my 4080 and I tried FSR in some titles too, and the quality difference is noticeable. Had I been on somewhat lower-end hardware at a regular resolution, I would of course still appreciate DLSS, but I probably wouldn't consider it as much of a selling point for a budget build that isn't intended to play at 4K high quality anyway.
Regarding the PS price: while yes, enough people buy the PS5 Pro at $700, and Sony probably ended up making more money from the PS5 Pro vs the PS4 Pro, this doesn't mean a PS6 can be successful at $700. Remember that Sony's business model, the traditional console business model, is to put the console in every household and then make money from the software. If they sell the PS6 at $700, sure, it probably will make more money from the hardware vs the PS5, but the ecosystem probably will die rather soon due to the small user base, and thus won't be attractive from a 3rd-party POV.
When are we going to see a new CCD with more threads?
Panther Leg Tom has a nice ring to it. I think I'm going to add that to my pickup arsenal.
Intel gpu with 24gb?
Just an underclocked B580 for workstation use.
Nvidia also has a 70W "RTX4070" with 20GB
@@lharsay for $3,000+
Nah probably "B750 16GB" for $449-$499 this gen tho boosting profit margins on a "B770 24GB" for $549 on the extra VRAM would be smart.
AMD is dropping us a 7900XT (7900XTX when UV-OCed) for $579 so I don't see Intel charging that much for something barely beating the RX 6800XT in raster and 7900 GRE in ray-tracing.
@@lharsay wait… nah… source?
Only reason I got a pro was because I had a work discount. I got it for 525, 585 including the drive.
@@leax_Flame 🔥. At that price it actually makes more sense than a PC. Otherwise...
Hopefully PlayStation 6 will be fully backwards compatible with PlayStation 3, with full 8K optical Blu-ray drive support.
Just like the original PlayStation 3, it would be good if Sony designs PlayStation 6 to be the best modern day retro machine with expansion ports and add-ons.
I think PlayStation 6 and new Handheld should be an updated modern version of PlayStation 3 with increased RAM, improved hardware Graphics, Artificial Intelligence, with AMD x86 processor implementation in FPGA.
(Also PlayStation 3, or another processor architecture, could be implemented in FPGA at the same time)
One design for PlayStation 6 architecture could be, two or one physical Cell processors with three FPGAs to hold any other architecture.
This way developers could target Cell Processor, AMD x86 processor, or any other architecture depending on experience.
RDNA 4 gonna be goated. I can feel it. The mid range tides will turn. 5080 and 5090 will still sell, but the mid range is gonna run through AMD this year
You're delusional; something will disappoint, either the price or the performance... probably both lol. If the 9070 XT were actually within 5% of the 4080 (I doubt it) and they priced it at $500, it would be an awesome card. But the performance will probably be like the 7900 XT imo. The price needs to stay at $500 unless the performance is better than we think, and even then... at $650 that's a very easy pass. Better off getting a second-hand 4080 when they drop in price in a few weeks.
@@PCgamingbenchmarktime RDNA 3.5 iGPUs and the RDNA 2.x of the PS5 Pro are pointing to disappointment for sure...
@InternetListener Well, the ray tracing performance increase was definitely overhyped lol, and the raster increase was only slightly behind where they said it would be. And the price increase wasn't worth the uplift.
I'm most excited about Valve's stuff. Steam Deck is awesome, hope they cut the price on it and release a bonkers SD2
If only Sony made money just selling consoles... but most of the money they make is from selling games, so the PS6 won't be $700 unless they want game sales to drop to half of where they are now. This success will likely just accelerate PS6 Pro plans.
Anyone hear anything about an update for the Ryzen 8700G? Its 780M GPU part is now already thoroughly aged, and the AI Ryzen laptop CPUs have the newer 890M or the upcoming 8060S. Basically the 8700G is a laptop CPU in a desktop package. So, any plans from AMD to put the newer gen with the 890M or even the 8060S on desktop?
The AM5 socket has shittier memory bandwidth than a GTX1650 G5 or an RX6500XT.....
@@lharsay Yeah, but for a low end cheap gaming PC with very frugal power consumption, these APUs are pretty good.
@@bakakafka4428 Nah, I saw a test with an 8700G against 8700G+RTX3050 6GB, the APU system was both slower and had higher power draw than CPU+GPU combined. If you go with a cheap CPU+used GPU (like i5 12400+RX6600) you can end up with a much better system for cheaper than an APU.
Happy hoho and a merry new year!
If the 5090 is 2500 I'll sit on my 3090 for an extra generation and see what AMD releases for the high end. If it's 2k I'll upgrade.
same...
I refuse to buy a new card with less vram
$1999 would be a W for the 5090 IMO
@@TheRealDlo are you insane?
It’s a strategic plan. They are purposely putting false rumors of $2500 so when it’s only $2000 it doesn’t seem that bad of a deal 🤷🏻♂️
The only reason to keep a 3090 is if you bought it used. If you bought it new, you missed the best residual selling price, which ran from the 4090 launch until the launch of the 4070 Ti Super... Whatever it costs, just buy it and sell it before the 6090 comes out... You could've gotten a 4090 for free or even at a profit... It still holds above 1000€ at used stores to sell; asking prices to buy it used go up to 1450€...
That's only a maximum of 700€ of lost value for two years of use, or whatever difference you really would've gotten.
X86S is one of the many proposed extensions to that ISA over the next few years. An equally large change would be APX which expands the GPR count from 16 to 32. Then there is AVX10 which furthers the SIMD capabilities, though hopefully the worst of AVX10 will also be killed. AMX is supposed to be enhanced further. Intel is probably tackling too much at once. Since X86S impacts backwards compatibility to a degree, nerfing this made sense. I do think that x86S-like idea will resurface in a few years as that ISA is in dire need of some simplification and clean up. In particular the boot up process is very messy with x86S aiming to fix that. (AMD also has a commitment to open source firmware which throws a wrench into this and would need to be rewritten to support x86S.)
Both Zen 5 and Arrow Lake got nerfed hard by Windows 11 upon their launch. Peeking over to the Linux side of things, both Zen 5 and Arrow Lake were much closer to their advertised IPC increases. Zen 6 being 10% higher IPC does seem feasible if you look at where the instruction regressions were between Zen 4 and Zen 5: there were common instructions that went from single-cycle to two-cycle execution. If Zen 6 restores them to single-cycle execution, that'll be a good performance jump right there. The bigger things for Zen 6 are going to be platform improvements with new packaging technologies and a new IO die. The Zen 6 X3D parts should really be something else.
Zen 5 Threadripper would be impressive with V-cache. However, if you look at AMD's Epyc lineup, there is a super-expensive 16-core part with 512 MB of L3 cache, which means each die only has a single core enabled. This is the closest you can get to V-cache performance without going 3D. And yeah, a real V-cache part like that with 16 cores would have 1536 MB of L3 cache, which I don't see AMD formally releasing (a 32 or 64 core variant like that, perhaps). Threadripper will just follow in Epyc's footsteps. Following AMD's release cadence, we'll see V-cache Epyc chips in February (ISSCC) and hopefully new Threadripper a month later for GDC.
Bartlett Lake should launch to DIY if only to be the replacement parts for the lineage of failed Raptor Lake cores. I think it'd be popular just on the merit of not requiring an entirely new platform. Motherboards are expensive and if tariffs kick in as proclaimed, full system upgrades will be priced out of the reach of many. If it can clock the same as Raptor Lake, it'd end up out performing Arrow Lake in many tasks.
Qualcomm's next Elite single-core score will be so high it may be on par with the M4 or Ryzen 9000 CPUs, and when they launch desktop CPUs they will have a higher multi-core score than the 9950X.
hopium
Hardware doesn't matter without software or decent emulation performance.
27:12 I liked how you used "perceived". T239 will come later & with more competition than the Tegra X1 had in 2017. People are missing this.
What will aid Switch 2 in comparison to the original is that companies will support it from day zero. And development is stagnating; we still see "AAA" stuff being released for X1/PS4 after all.
Genuine question - what aaa titles have launched on the ps4 / xbone in the last couple of years?
From where I sit, modern aa games struggle on the series S, let alone last gen HW
@@kaeota Gran Turismo, God of War Ragnarok, Horizon: Forbidden West, Elden Ring, Hogwarts Legacy, RE4 Remake, Overwatch 2, Diablo IV, Yakuza: Infinite Wealth, Shadow of the Erdtree, Metaphor: ReFantazio, Black Ops 6, all the AAA sports franchises, etc.
Now look at it this way: Shadow of the Erdtree, Refantazio & Balatro (not AAA) are 3 of the most important releases this year and all got a ps4 release.
@Refreshment01 thanks for enlightening me! Very surprised many of these run at all on ps4, let alone in a playable state, impressive.
@@Refreshment01 Also Jedi: Fallen Order
Does Nvidia already have some experience with the Jetson line of products? ARM cores and GPU cores integrated, with unified memory? And out of genuine curiosity, how is a CPU+APU different?
I came up with a scenario where Nvidia would charge $249 for the 5060. ...So, Nvidia takes the already-planned 5060 8GB card and turns it into a limited release to be sold directly from their store at $249. They do this about a month before AIB partners come out with their own 12GB variants that will have an MSRP of ~$299 to $339. Doing this would drum up excitement for the 5060, and cause some people to think Nvidia is being benevolent in their pricing. In reality it's just a glorified paper launch that's used to gauge interest in a future re-release of an 8GB 5060 that would mirror the 3060 8GB in marketing and pricing. ...I'm not promoting this. Just saying that it's something Nvidia *might* do. Thoughts...?
Wish you had talked about the rumoured "9070 XT"
4K 30fps is a complete waste of time for the Switch 2. They need to be able to run all their games at 60fps, and if that means dropping down to 1440p to do so, then I think that would be the better option. The only trouble with that, though, is that some TVs only do 1080p and 4K; some TVs don't have 1440p in their EDID. If the Switch 2 were handheld-only then 1080p 60fps would be more than enough. But it's not. It can be docked with a modern 4K TV, so it NEEDS to be able to perform adequately at that resolution. Nintendo knows this.
Nintendo didn't want to launch the Switch 2 into the mess of the post covid global supply chain. Now they can launch it at the same price of the Switch or with a slight increase like $50. They want people to be able to buy their hardware.
I'm relieved to hear a more realistic guess for the 5090. $2200 would be too much, but I wouldn't be surprised; there's a reason Nvidia artificially kept 4090 prices high... and it's not to sell you a 5090 for $1200.
The 5090 is going to be $2499+, I fear. The demand from AI for this card is gonna be unbelievably high.
Currently £40 price cut on ps5 pro in a few of the popular stores in the UK. I think only particular regions are selling well. Europe seems to not be onboard with £700
Better, more nuanced DDR5 memory? Interesting; can't wait for some data showing how much better. I splurged on a 64GB kit at 6000 CL30 building my newest system. So hope SteamOS replaces Windows! Already adding Linux Mint Cinnamon to my newest build; will add a Win10 install on a second drive til I'm more comfortable in Linux.
What I would love to see is Nvidia bringing out their own console using their own SOC, now that would be competition for Sony!
Switch 2 handheld resolution 1080p or 900p
Its the "people dont want to buy a 100$ jacket, they want 200$ jacket for 100$" effect. So AMD should put msrp high, but sell it to retailers low.
I would even take an 8- or 10/12-core Threadripper; I just want the PCIe lanes and better support for large amounts of RAM.
You could get that now from either Intel or AMD workstation. No reason to wait. The 16 core Threadripper Pro might be OEM only but I recall Intel goes down to 12 or 16 cores for DIY.
41:34 Oh, Dan, you sweet summer child...
Did he really not leak everything about the RTX 5080 and RTX 5090...
$999 for the RTX 5080, only a few hundred dollars off the wrongly leaked mark.
DLSS2 is available on 600+ games, FSR2 is only ~150 games.
DLSS3 FG is available on 130+ games, FSR3 FG is only in about 50 games.
Tortoise vs Hare. Nvidia had more monitors with G-Sync, and then AMD smashed them with open-source licensing for FreeSync, and now all monitors are FreeSync, which Nvidia had to bow down and support in their drivers.
@Loundsify completely irrelevant comparison ... besides high end G Sync monitors are still nvidia exclusive to this day
Apple reportedly indicating TSM N2 is not yielding well enough for the 2025 iPhone business. Intel reportedly has an integrated Celestial GPU coming on 18a Panther Lake with 12 cores (vs 8 on Lunar Lake Battlemage).
This implies a year slip for TSM N2. Apple generally gobbles up all the first year capacity for new process nodes at TSM, so AMD might be looking at 2027 for TSM N2 capacity.
Oh yeah lets listen to the guy who said PS5 Pro was rdna 4 😂😂😂😂 JOKE!
Regarding there being more competition in the console market because of SteamOS: I don't agree with this, because we are talking about off-the-shelf components, which are much more expensive compared to what Sony pays and sells at. Even if the PS6 is $699, to make a SteamOS machine with that performance for the first few years would be $1200+.
42:20 AMD has had fluid-frames tech for video for ten years.
Yes they did, I know... and yet as usual they did not capitalize on it to take gaming market share... Nvidia did first. Same thing with FreeSync btw: it wasn't specific to AMD, but that tech was around for years before Nvidia did something with it for gamers...
The new RTX 50 series is what im hyped for.
Eww. Hell nuh. Way overpriced and underoptimised garbage
I think Nintendo is waiting for Jensen to present the console himself; he will hold it, call it "the greatest handheld ever created blah blah blah" and make Nintendo print money. At the end of the day, the Nvidia team is the one that created this "unknown" chip.
29:20 Why will more games be CPU limited? You're forgetting about UE5...
AMD lacks both software support (ROCm is not supported on Windows) and dedicated hardware (tensor cores), so, as an ML engineer, I don't see a way they can compete with NVIDIA on deep learning features, at least not on Windows. The PS5 Pro can make PSSR work because it runs on a Unix-based OS, where ROCm is supported. Intel, on the other hand, is gaining massive ground on both software and hardware support, so, given the current state, it is more probable for Intel to compete with NVIDIA than for AMD to. Intel is not 100% there yet, but they are definitely on the right path, especially when it comes to machine learning.
What's interesting is that the NPU (XDNA?) that shipped with the 8845 processors is only supported on Windows at the moment. I'm a Linux user who was looking to experiment with the performance of some custom models.
@@Thesaltymaker You can technically run models on AMD hardware (including the NPU) on Windows, but currently only with DirectX 12 as a provider (DirectML) through ONNX. This works, but it is slower than running a model bare-metal, as ROCm would allow.
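To make that concrete, here is a minimal sketch of the DirectML path being described, assuming the onnxruntime-directml package is installed and a hypothetical model.onnx with a single float32 input; the provider list and session API are standard ONNX Runtime, everything else is illustrative:

# Minimal sketch: running an ONNX model on AMD hardware via DirectML on Windows.
# Assumes `pip install onnxruntime-directml` and a hypothetical model.onnx.
import numpy as np
import onnxruntime as ort

session = ort.InferenceSession(
    "model.onnx",  # hypothetical model file
    providers=["DmlExecutionProvider", "CPUExecutionProvider"],  # DirectML first, CPU fallback
)

# Build a dummy input matching the model's first input, resolving dynamic dims to 1.
inp = session.get_inputs()[0]
shape = [d if isinstance(d, int) else 1 for d in inp.shape]
x = np.random.rand(*shape).astype(np.float32)

outputs = session.run(None, {inp.name: x})
print("Active providers:", session.get_providers())  # confirms DirectML was picked up
print("Output shape:", outputs[0].shape)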
Hey Tom and MLID team, do you guys have any info to support that AMD will change to the 9070 XT? I ask because I would rather it be 8800 XT, but I fear that their marketing team is digging past the bottom of the barrel.
This shooting the breeze session is 10 days out of date lol
best channel on youtube intro music is siiiiiiiick
I thought we were done with the swearing. It wasn't as cringe this time though, so an improvement I guess.
For me it doesn't really matter how much AI stuff Nvidia (or AMD for that matter) tries to shove into their cards if, at the end of the day, the price-to-performance and the experience we get are worse than they would've been with just actual good generational improvements and the price-to-performance ratios we had in the past, instead of underpowered and overpriced cards with a lot of bells and whistles.
Perfect time for a Tom and Dan podcast. Love the bros.
I just want Nvidia's GPUs to be at MSRP. There's no way that from the 4090 launch until now, no one sold it at MSRP, or even around it. AIB cards are now like $700 above MSRP; it's going crazy. Hopefully it won't be this insane with the 50 series.
Pretty much Nvidia dropping what I would have guessed: a slightly improved architecture with higher power draw and GDDR7 for the same price. Probably some new features or updated ones that may or may not be backwards compatible with RTX. Outside of the 5080/5090 I don't expect vast improvements outside of RT.
I'm waiting for Quantum computing to be the next buzzword :)
It would be different if it were 80% as die-efficient as Nvidia, but it's more like 50%. Sure, Nvidia makes a massive profit on the 4070 Ti, but even a 60% profit margin for Nvidia implies the B580 costs about as much to build as it is sold for: $250 vs $600+. Good for non-gaming uses though.
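Spelling out that margin math (every number below is an assumption taken from the comment above, not a known BOM):

# Hypothetical sketch of the implied build cost, assuming a ~$600 card and a 60% margin.
nvidia_price = 600.0                              # ~4070 Ti class selling price (assumed)
nvidia_margin = 0.60                              # assumed profit fraction
build_cost = nvidia_price * (1 - nvidia_margin)   # => $240 to build a die of that class

b580_price = 250.0
print(f"Implied build cost: ${build_cost:.0f}")
print(f"B580 headroom: ${b580_price - build_cost:.0f}")  # ~$10 before board, VRAM, cooler, channel

If a similar-sized die really costs around $240 to produce, a $250 selling price leaves essentially nothing, which is the commenter's point.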
Examples of how the audience continues to get the console market wrong, whether it is Switch massive success, PS5 Pro massive success, or the higher than expected demand for PS Portal
MexZaibatsu - give it 4 months when enough devs have decided to optimize their games for it & maybe price drops kick in when the Pro undersells due to its MSRP
just reality - ps5 pro is gimmicky bs, massively overpriced and will hurt Sony when more and more people are disgruntled by it.
All expectedly wrong. It turns out the successful gaming companies, while they do make obvious mistakes at times, sometimes painful ones, get a lot right too.
I want FromSoftware to make a new Otogi game.
I don't see the PS6 at $700; maybe $500 with no disc drive (a 25% increase gen-on-gen).
Maybe they release the PS6 and PS6 Pro at once, the PS6 at a loss but making money on the Pro 🤔 (the Pro could be overpriced).
There is no PS5 with disc drive sold anymore (and disc drive is crazy expensive because tiny supply), so I do not expect PS6 to have one either. You will be lucky if there will even be an option for a disc drive in PS6. I fully expect Sony will try to force the "digital only" ecosystem for next gen. But maybe I'm too cynical.
BTW, PS5 + disc drive costs 600 EUR here. 900 EUR for PS5 Pro + disc drive.
@Astrotripper2000 The PS5 Pro is 1,250 dollars here; if it were 900 I would've bought it.
Dang that intro MB looks awesome