It does seem silly to be able to save during a benchmark tbf. I feel like the benchmark(s) should just be accessible on console. What's the harm? People would be able to see what's different about say perf and quality mode without necessarily having to do the whole intro in the mode they don't prefer.
@@Ayoul Maybe the console manufacturers have a policy against allowing it? Even if they don't, there really isn't much point to having them on console. Just us nerds geeking out over it
Honestly, the most intensive parts of Dogtown are actually the entrance to Dogtown and the main club there. So maybe a benchmark sequence where you enter Dogtown and then continue into and around the club's main floor could work better for benchmarking Phantom Liberty?
I have a theory about why some games perform better on PS5 relative to XSX: those games are fill-rate limited on the XSX. Both machines' GPUs have the same number of geometry units and ROPs, so while the XSX has a compute advantage over the PS5, the PS5's higher-clocked GPU actually delivers around 22% higher theoretical pixel and geometry fill rate than the XSX's lower-clocked but wider design. That could explain the performance difference between some games on both platforms, as well as why some games were able to render at a higher resolution on PS5 (The Touryst maybe?). Or games on XSX are just not as optimized as the PS5 code of the same game. Or a combination of both. Great job DF! Keep up with your good stuff!
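For what it's worth, the fill-rate argument above can be sanity-checked with quick arithmetic. This is a rough sketch assuming the commonly quoted specs (64 ROPs on both GPUs, 2.23 GHz PS5 max boost, 1.825 GHz Series X fixed clock); sustained clocks and real-world rates will differ.

```python
# Rough fill-rate sanity check. Assumed figures: 64 ROPs on both GPUs,
# PS5 max boost 2.23 GHz, Series X fixed 1.825 GHz (publicly quoted specs).
ps5_clock_ghz = 2.23
xsx_clock_ghz = 1.825
rops = 64  # assumed identical on both machines

ps5_pixel_fill = ps5_clock_ghz * rops  # theoretical Gpixels/s
xsx_pixel_fill = xsx_clock_ghz * rops  # theoretical Gpixels/s

# Fixed-function throughput scales with clock alone, so the advantage
# is just the clock ratio.
advantage = (ps5_pixel_fill / xsx_pixel_fill - 1) * 100
print(f"PS5 theoretical pixel-fill advantage: {advantage:.1f}%")  # -> 22.2%
```

Since ROP and primitive-unit counts match, the gap is the clock ratio (roughly 22%) whatever unit count you plug in.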
Well, not really. It's simpler: pure optimization. PS5 uses the same API as PS4, while Xbox uses DX12, and most games coming out actually perform better on PS5. DF have wondered why the Series X advantage in GPU and CPU hasn't shown up, for example in BG3: even with a better CPU on paper, both Series S and X perform worse than PS5 in the heavily CPU-stressed area.
Yeah, the Xbox Series X clocks are terrible, even compared to a midrange PC part. They basically traded clock speed for more CUs (and indirectly ROPs) due to heat concerns. The thing is, it shouldn't run much differently from the PS5, since the CU count is considerably high, so some bad optimization is definitely part of the cause here.
It's so crazy to think I've been watching you guys for 13 years. It's gotten to the point where I genuinely get concerned for your health sometimes as we all grow older. You guys have inspired me more than you could possibly imagine. I'd just like to say thank you for all your dedication. You've saved me from a lot of bad investments, and more than that, you've really influenced my love of video games and technical performance analysis.
@@rishav_566 because most monitors start at 48 Hz and go up to 180 Hz or whatever their maximum is. 30 Hz is way below the monitor's lowest possible refresh rate, i.e. 48 Hz.
Really need to pixel count at specific moments when the XSX is losing performance and the PS5 isn't, just to verify the wider dynamic resolution is being used. That alone would verify or invalidate the theory, at least to an extent. I personally find it unlikely both consoles are locked to their minimum resolution for the entire benchmark. I think it might come down to APIs and their optimization.
There's a whole structure starting at around 09:00 that displays on Xbox Series X and not PS5. Not sure why this wasn't discussed at all in the video, but I'm curious if Xbox runs worse because more is loaded even if it can't always be seen. Would explain why equivalent PC parts run the game better at Xbox Series X settings.
wow very interesting results. curious that a minor difference in native resolution could create such a gap in performance! also, welcome back hope your new year was great!
The difference in resolutions does not justify the huge drop in performance. Richard shows that at 12:22. The most plausible theory is the extra pixel rate and cache speeds that both PS5 and RX 6700 have in comparison to Series X GPU.
@JaimeIsJaime i did see that as well; nice to have some insight into the consoles, seeing as there aren't standard benchmarks on them.
The difference in resolution and framerate is a similar ratio but using VRR completely eliminates framedrops on series X resulting in smooth performance at higher fidelity. It's consistent with many games but the resolution bump on xsx varies by game.
From 9:05 onwards, some of the background is missing on PS5; maybe not rendering that stuff is helping with performance. It's evident in that particular scene, but it may be happening throughout the whole benchmark 😳.
Not surprising. Dropped frames have been a common theme of this generation. As a Series X owner, it's frustrating, but Sony clearly focused on the correct things when building its console and dev tools.
@@TheRealGeorgeGibson yeah it's pretty bad in some areas, like outside V's apartment and other places. There's also some screen tearing at the top when it happens. VRR doesn't help either. Is it the same for you too?
I'm thinking the API situation is more favorable to PS5, probably easier to work with. Xbox requires more work with its custom DX12U, and since it's not the preferred console, the GPU performance is often (not all the time) worse, despite its GPU being more powerful.
@@ThibautMahringer Both consoles have custom lower-level APIs that are more efficient than DX12 and Vulkan. Though there are still shared aspects between the Xbox and regular DX, don’t get me wrong :)
@@thanksbetotap Interesting, I've heard that the PS5 API is still easier to get a grasp on though, so I'll modify my original comment to include this correction. Thanks!
@@ThibautMahringer Oh, thanks. I think when devs complain, two things to keep in mind: I think it’s true the PS API is a little bit easier, but oftentimes devs are talking about the actual tooling and not just the API. And also, the Xbox dev tools were more rough at launch but did improve a fair amount over the next year or so. That’s because Sony’s hardware dev timelines were earlier than Xbox’s, but that was a tradeoff because Xbox wanted some newer hardware features and were willing to wait a bit longer. Everything is tradeoffs, of course. Sony’s dev tools are definitely quite nice by all accounts!
This is what engineering is about: how a 36 CU chip can beat a 52 CU chip while being much smaller, thanks to not incorporating memory segmentation (and adding a GPU cache scrubber, higher pixel fill rate...).
PS5's CUs are clocked significantly higher than the Series X's. The Series X may have 52 CUs, but their clock speeds are much lower: around 1.8 GHz, while the PS5's CUs are clocked at 2.23 GHz. Across the board this gives the PS5 a significant bump. Then there are the cache scrubbers that help clear out stale data for the GPU. The PS5 may be closer to an off-the-shelf PC, but it's still a console through and through, with a lot of customizations to help it along. Meanwhile, MS consoles have always been geared toward being PC-like but cheaper, with minimal customizations.
I think Sony said it's true that 36 CUs can be better than 52 CUs at a lower clock. I think Sony focused more on easy optimization options for devs, and that's the main thing. MS just went for higher power on paper so they could market the most powerful console, but once they tested the PS5, they quickly stopped saying "most powerful". They must have seen they couldn't prove that power advantage in games, so they dropped that marketing.
This was very interesting. I wonder if CDPR will patch the Xbox Series version now that this is public information. Surely changing the DRS range is nothing more than a single value edit, and let's be honest, no one is going to notice the difference in resolution, but they will notice the difference in performance.
You can see that there are strange flashes of light on Xbox SX; they are more visible at night (FSR problem?). I'm not sure, but I didn't have this before. During the mission "Pisces" (Judy), the game started to freeze up, and since then I've been having problems with flashes, shadows and stability. We see the same thing in this video, and seriously, I was playing comfortably until this mission, but now I don't want to play because the flashes are so disturbing.
@@gabd.5299 "not gonna bother with the dlc until they fix it" then u wont ever play it since they dont care about fsr2, only nvidia tech so dlss and rt lol
I recently had a breakthrough with this game... I just purchased an ROG Strix 17 laptop with an AMD Ryzen 9 7945HX3D and an RTX 4090 mobile GPU, and it's able to run Cyberpunk at Ultra settings with path tracing, using DLSS frame generation, at 80+ FPS... what CDPR has done with the graphics in this game is mind-blowing.
The RX 6700 has a bandwidth of 320 GB/s, while the PS5 boasts 448 GB/s. I'm not sure how much the Infinity Cache helps, but I don't think it's enough to offset a 128 GB/s difference in bandwidth. Moreover, this APU uses PCI Express 2.0 x4, leading to serious data transfer issues and performance loss. The PS5, on the other hand, doesn't face data transfer problems. In summary, the 6700 in a PC with PCIe 3.0 x16 would perform significantly better.
This would be true and it is a factor, but people forget the PS5's 448 GB/s is SHARED bandwidth, so a decent portion goes to the CPU, lowering the real-world number available to the GPU.
@@DavidSmith-bv8mv but as far as I know, 10 GB of the PS5's memory is dedicated to the GPU by default, which can be changed by the developer if needed. That would set the amount of VRAM between these two at the same level.
@emmyjaeger22 they are talking about memory bandwidth not the total memory amount (despite using GB, a unit for the amount of memory instead of GB/s). The bandwidth is shared between cpu and gpu on the consoles. On PC the gpu gets 320 GB/s from its onboard VRAM and the CPU would get its own 48GB/s when using some standard DDR5 6000 MT/s RAM. that's 96 GB/s when running two RAM modules in dual channel mode as most PCs are configured. 320 + 96 GB/s is more or less equal to the 448 GB/s available on PS5
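The figures in this thread can be checked with back-of-envelope arithmetic. This sketch assumes dual-channel DDR5-6000 (64-bit per channel) on the PC side and the commonly quoted 320 GB/s (RX 6700) and 448 GB/s (PS5) numbers; note the CPU and GPU pools on PC are separate, so the sum is not directly comparable to one unified pool.

```python
# Back-of-envelope bandwidth check (assumed figures, see note above).
ddr5_mts = 6000         # mega-transfers/s per channel (DDR5-6000)
bytes_per_transfer = 8  # 64-bit channel width
channels = 2            # dual-channel, as in most PCs

cpu_bw = ddr5_mts * bytes_per_transfer * channels / 1000  # GB/s
gpu_bw = 320.0  # RX 6700 GDDR6, GB/s
ps5_bw = 448.0  # PS5 unified GDDR6, GB/s (shared between CPU and GPU)

print(f"PC CPU RAM: {cpu_bw:.0f} GB/s, PC total (separate pools): {cpu_bw + gpu_bw:.0f} GB/s")
print(f"PS5 unified pool: {ps5_bw:.0f} GB/s")
```

So the totals land in the same ballpark (416 vs 448 GB/s), though on PS5 any CPU traffic eats into the same 448 GB/s pool, while on PC the two pools are independent.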
@@DavidSmith-bv8mv actually, CPUs aren't as dependent on bandwidth as GPUs are. So I'm guessing Sony gave the CPU cluster exactly what it needs, while giving the majority of the mentioned bandwidth to the GPU cluster.
@@emmyjaeger22 not really; Sony only sets aside a small amount for OS system functions, while the rest can be allocated however the developer wants. Plus, all of it runs at the exact same bandwidth.
The most bizarre thing is that Microsoft probably paid a lot for those chips because they are way bigger than PS5 SOC. Still Sony was smart enough to compensate in thermals and clock while paying way less for their APU. Not to mention that Sony sells more consoles which probably helps with cost.
Exactly... I've seen it jump between both consoles with vastly different games. I have both a PS5 and a Series X, and my gaming PC with a 3060 Ti and Ryzen 5 5600X. It's still crazy to me that my PC runs great at 1440p in every game, but the PS5 still kicks my PC's ass at 4K (yes, I know my 3060 Ti does much better at 1080p and 1440p), especially games that use 4K, 120 Hz, and VRR for 40-plus fps.
Why would it be an API issue when the game and all of its RT tech were built with DX12 in mind going forward? It is a PC game first and foremost, after all. If anything it should favor the Xbox pretty well. So I would personally discount this as one of the reasons, but memory segmentation is definitely a big possibility.
These tests highlight the deficiency of your coverage very well. The horrorshow of the lighting LODs is very apparent here. How can you look at these benchmarks and give a pass to those graphics? There are entire buildings that get darker as you get closer; it's just bizarre. We have an entire generation of games now where the speed of light seems to be about walking pace and the world gets built around you slowly as you move. If you didn't keep praising this, maybe we could have visually stable games?
Just speaking to resolution alone: 1152p with VRS is clearly deliberately chosen by CDPR as better than 1008p. That is, after all, the entire point of VRS: to be able to render at a higher resolution, at the same or lower hardware utilization, to improve overall image quality.
Yet the image quality is still worse than PS5 (as shown in the full DF review of Phantom Liberty), and the performance is worse too. Looks like Sony knew what they were doing when they kept that trash out of the PS5's pipeline.
Rich, have you ever shared how to actually initiate the city streaming test that DF has used in their videos? I've wanted to try it out for a long time but Google always has me come up short! I can't figure out how to run anything other than the standard benchmark.
I've always found it rather interesting that despite the Series X having a GPU compute unit advantage over the PS5, it runs at a locked 1.825 GHz compared to the PS5's variable 2.23 GHz boost, which mirrors the clock speed behavior of RDNA 2 GPUs on PC. I wonder why Microsoft went with such a low clock speed on the Xbox this time around and how it affects relative performance, because on paper the Series X should always have the advantage. We have seen that advantage with ray tracing in the test you guys did with Control, which makes sense with how RDNA 2 does ray tracing. Speaking of which, I feel Sony has done a far better job of showcasing ray tracing capabilities on console despite having the inferior hardware for it. This was a very interesting video; I love these investigative pieces into gaming hardware.
They did think about heat and power consumption. The problem is that higher clocks often perform better than a wider, lower-clocked design, especially CPU-wise. Ever wonder why Nvidia cards with lower specs on paper have performed better than some top AMD cards for years?
Well, their SoC is massive, and the extra cost coupled with 56 CUs (4 are disabled) and the heat stability of a part that size in 2020, in a console box, at over 2 GHz was too risky and would have been super expensive. So they went with the wide approach versus the PS5's narrow 36 CU approach at 2.23 GHz (which still needs liquid metal to help keep it cool). Plus, higher-capacity GDDR6 modules wouldn't have been cheap either.
The combination of hardware decompression and superior throughput allows the PS5 to get more performance out of its GPU. The difference is that the PS5 memory architecture was designed for games. I'm surprised Rich didn't come to the conclusion that the PS5 is able to feed its GPU faster.
I don't think that tracks. I mean, I suppose it could in this game, but we have other instances where we don't see this issue. I highly doubt the decompression block of the Series X at 6 GB/s would be the issue; it was designed by Microsoft so that decompression would never be a bottleneck. Now, if you are saying this game exceeds 4.8 GB/s, the top potential of the SSD in the Series hardware while using compression, then you might be right. To me this looks more like the slower clock speed of the GPU in the XSX is the issue. I say that because even a 6650 XT can get pretty close to the performance of the Series X at medium settings, all while on a 128-bit memory bus and having only about 62% of the CUs; it's that substantial clock speed advantage that makes it comparable.
It will be very interesting to see how Cyberpunk runs on the inevitable PS5 Pro. I'm hoping CDPR will update or optimise it to take advantage of whatever proprietary image upscaler Sony is supposedly using on the Pro. Could it improve performance with a more 'native'-looking 4K image? This is obviously excluding the ray tracing aspect.
@@CommodoreMudkip Can't see them not taking advantage of a new system. They say that now because I'm sure CDPR doesn't have access to the new PS5(if it actually exists).
As a gamer with a Series X, PS5, and a solid PC (i7-12700K + 4070 Ti), that consistency shown here on the PS5 is the experience that has been sticking in my head, even when playing on my much more powerful PC. On PC, achieving a truly satisfying level of performance in many games requires SpecialK or other methods of framerate locking or performance tweaking. PS5 is living up to its hype in that most games I've played on there (Spider-Man 2, The Last of Us, the God of War reboots, Ratchet & Clank, etc.) just run so beautifully right out of the box, as it were. I really wish we could get to a timeline where PC gaming gets that same ease of consistency. Yes, we have flexibility, but something at the GPU manufacturer level from Nvidia/AMD to easily and consistently lock framerates and frametimes, along with data similar to SpecialK's to help understand what your PC and game are being limited by, would go a long way. SpecialK is AWESOME, but being an independent project has a few downsides (lots of great upsides obviously). Mainly, developers don't really recognize it even exists, and there are many games that simply do not work correctly with it and even claim that SpecialK is a cheating tool. It'd be nice to get something a bit more "official", so these big companies would be forced to recognize a standardized frametime-locking tool, along with perhaps some other ease-of-use graphical preset features (some games, like Forza, are just completely out of hand with how much time it takes to find a performant set of settings).
The resolution is dynamic, which means most of the time both machines run similarly, trying to achieve 1440p. And even when we hit the lowest values, this 144p difference is too small to justify that difference in performance.
@@TheBean87 they said in the video that it's pretty much the same on both consoles. I checked back through the video and they both look the same and are drawing the same distances.
Rich, great video! I was eating, drinking and making merry - but to each his own 😂 Bad news really. I have both consoles but was going to get the Series X ultimate edition as it is fully on disc - phantom liberty is a PS5 download 🤦🏻♂️
The Series X may have a fixed-function throughput bottleneck compared to the PS5, resulting from the lower GPU clockspeed. Combine that with the higher resolution target and you have a possible explanation for all observations. I also believe that Sony's AGC graphics API may also have less overhead on the GPU side, much like modern Vulkan tends to do with RDNA2 on PC. D3D12 is weird when it comes to descriptor management and execute indirect, which seems to map to the hardware in a suboptimal way.
9:04 why does the xbox in this case render a different background structure than the ps5? Seems to me that maybe xbox has a slightly higher visibility setting?
Could be, but it looks more like it's completely different versions of the game running on the two consoles. The space port seen at 9:04 was added (on all platforms) in the 2.0 update (not just for Phantom Liberty - it's still visible from outside without the expansion). All platforms also saw a dip in performance with that update across the entire city due to other changes... Lots of discussion about hardware architecture and DirectX vs Vulkan here - for something that's easily explained by the two console ports having different render settings (in addition to higher minimum resolution on the Series X) - or even, as it looks like, benchmarking completely different versions of the game on each platform.
We can clearly see the efficiency of a PS5 hardware in action here. I believe the SSD speeds and cache scrubber on PS5 GPU both help with assets streaming and maintaining the GPU to work at its near peak theoretical performance.
Exactly. It’s unfortunate Rich doesn’t bring up the point of more efficient hardware doing its job. If it was only gpu related the gap on the performance graph would stay consistent. The inconsistent results suggests there’s more to it like you said.
It's efficiency, but it's software and not hardware. They optimized the PS5 version more, if the both machines had similar resolution scaling, the Xbox would out perform it.
the gap in resolution is really small, like 100-150 pixels of width/height, but the performance difference is huge, more like 10 fps and more in some places, so it's not that. @@watershipdown
@@divanshu5039 you will never convince some people it's the hardware. After watching the Road to PS5 talk by Mark Cerny, I knew the PS5 was going to be good even if it was less powerful than the Series X on paper. Sony addressed the bottlenecks; Microsoft threw power at the problem. It will be interesting to see what happens next generation and how Sony improves on what they've already done.
Rich, there's a couple of simple reasons why the PS5 is faster. 1) It has a much higher Gpixel throughput than the Series X. Look it up. 2) It has a very fast GPU clock speed, and as Cerny himself said, throughput is raised across the entire chipset design on PS5. 3) The PS5 has the fastest decompression and data in/out, compared even to PC. It's custom built and designed to be extremely efficient. I'm looking forward to seeing what Cerny does with PS6 tbh. He knows his stuff. The Xbox isn't a bad console; Quick Resume is good, and it plays most games very well. But ppl need to stop the tflop rubbish. Look at no. 1 on my list.
None of what you said is factual. The difference you are seeing is nothing more than lead platform benefits. If Devs focused on developing games on the Xbox, you will see the benefits on Xbox over PS.
@@XAV-117 Go look up my last post. The only clueless one is you. The Xbox was never more powerful. If it was, it wouldn't have been behind in games from the get-go.
@@Crashed131963 you forget that MS had the marketing rights with CDPR for CP2077, so your assumption makes no sense whatsoever. Some games run better on PS5, some on Series X; that's just the way it is.
@@pandagonerogue.140 Why would marketing rights mean the developer automatically spends more time optimising for a particular console? Realistically it's more likely to come down to the PS5 OS being much less of a hardware burden than the Series X OS, but more optimization for the more popular console makes sense too.
Good to see the old myth of console performance being so much better than equivalent PC hardware is dead and buried, as it has been for over ten years now. Equivalent PC hardware is essentially just as fast, and you do not need a massively more powerful PC than a console to get equivalent or better performance in most titles.
@@Omar-kl3xp The PC kit used is an unusual OEM only integrated system released for China, with a disabled GPU. The discrete 6700 GPU is chosen to try and approximate console performance as close (but still imperfectly as mentioned with the smaller memory bus) as possible. Within a margin of 1.2 percent it's barely worth mentioning. Certainly not even remotely as dramatic as people that might imagine a console advantage would be 30 percent or even 10. It's next to nothing on this test.
@@pgr3290 I think a bigger performance gap will only happen when you develop exclusively for the console. There are still differences between the two; if a game is fully optimised for one particular console, we'll most likely see a bigger performance gap. But it's true that consoles are becoming more and more like PCs.
For the cost it is "better". If the equivalent components cost more to get roughly the same performance, what exactly is being busted? The "old myth" is still there.
Super cool stuff. Would be interesting to see how the downclocked RX 6700 with console settings performs in a standard PC (something modest, like an i5 12400 for example) and see if it drops frames like it did with the desktop kit.
In terms of pixel count the PS5 has a considerable performance advantage, if it can drop 552960 pixels that's rendering 26.5% fewer than Series X on the lower bounds. If future tests allow for a matched DRS range I would expect them to perform more closely, though boost clocks on the PS5 GPU may still keep it slightly ahead in some scenarios.
Think your percentage difference is slightly off, but yeah I agree... PS5: 1,792×1,008 = 1,806,336. XSX: 2,048×1,152 = 2,359,296. Difference: 2,359,296 - 1,806,336 = 552,960. PS5 is rendering 23.4% fewer pixels; XSX is rendering 30.6% more pixels. This all obviously assumes the PS5 is also bottoming out in this scene. People undervalue just how many pixels this difference is... 1008 and 1152 sound pretty similar, but they really aren't in terms of resolution.
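The arithmetic in this thread checks out; a quick script over the stated DRS lower bounds:

```python
# Pixel counts at the stated DRS lower bounds (from the thread above).
ps5 = 1792 * 1008   # 1,806,336 pixels
xsx = 2048 * 1152   # 2,359,296 pixels

diff = xsx - ps5                     # 552,960
ps5_fewer = (1 - ps5 / xsx) * 100    # PS5 renders ~23.4% fewer pixels
xsx_more = (xsx / ps5 - 1) * 100     # XSX renders ~30.6% more pixels

print(f"diff={diff}, PS5 {ps5_fewer:.1f}% fewer, XSX {xsx_more:.1f}% more")
```

As the replies note, these percentages only apply at moments when both consoles actually sit at their lower bounds, which pixel counting would have to confirm.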
@@brewski535 Those are very negligible differences for there to be a 10-15 fps gap. Also, those are limits; it doesn't mean the PS5 is rendering at the lowest resolution.
@@adrianhosein7698 I think it's safe to assume the PS5 will be rendering at or near its lowest DRS. XSX dropped 8% more frames but if it's running at 30% higher resolution at times that would account for much of the performance difference. Without pixel counting those moments we don't know for sure but it's likely resolution is at least part of the difference here. A 30% higher resolution would cause a massive difference in terms of performance, even if it's only for part of the benchmark.
@@brewski535 that makes no sense sir. The point of dynamic resolution is to help with performance. If Xbox is dropping frames more frequently, it means the Xbox version is at its lowest bounds more often. PS5 holds 60 consistently, suggesting there's more headroom, which means the PS5 most likely isn't at its lowest bounds as much as the Series X is. Remember, dynamic res kicks in when frames drop; if the PS5 is dropping fewer frames, then logically it's at its lowest bounds less often.
Curious to understand why the beefier Series X GPU is "limiting" performance compared with the PS5's smaller GPU, even when the XSX is using VRS. That's weird and interesting!
It's simple: the Xbox is engineered like a mass-produced custom PC build, while the PS5 is very precisely engineered and designed in all aspects, especially software, where Microsoft uses a modified Windows running clunky DirectX.
What? The most powerful console ever made is losing to the PS5? Wasn't the new SDK supposed to fix this and make the Xbox more powerful again? Just goes to show how numbers like core counts and teraflops don't tell the whole story. PS: It amazes me that Rich says the Xbox is at the same level as the PS5 in these tests. Wow, I know DF is Microsoft friendly, but that was a bit too much.
Why can't it be streaming bandwidth, which is higher on PS5 than on the Xbox Series X? The Series X seems to perform worse generally in games despite a higher CPU clock speed, including in Baldur's Gate 3.
I've seen it jump between both consoles with vastly different games. I have both a PS5 and a Series X, and my gaming PC with a 3060 Ti and Ryzen 5 5600X. It's still crazy to me that my PC runs great at 1440p in every game, but the PS5 still kicks my PC's ass at 4K (yes, I know my 3060 Ti does much better at 1080p and 1440p). Especially games that use 4K, 120 Hz and VRR for 40-plus fps on PS5; it's amazing.
Are those games actually running at native 4k for your consoles or via dynamic res, checkerboard, or some other kind of dlss rendering? And have you tried doing the same with your pc? Because I don't think much of any AAA games on consoles use native Res and keep the same kind of quality graphics settings. Unless your card struggles overall at 4k regardless of game?
@@ruekurei88 8gb vram is not enough for 4k upscaled on modern pc games, even with dlss, fsr... you are going to load 4k assets and vfx, even my 10 gb 3080 struggles with it, you will experience stutters and performance degradation unless i disable ray tracing entirely (which is very vram hungry) or drop down the resolution a notch. Good example is spider-man. 1440p is also a challenge, but if you keep ray tracing on high its ok, very high is out of question.
I also have both PS5 and PC with the same specs. It's amazing how PS5 can be similar or better than my pc. But I only have 1080p monitor so I can't really see PS5 quality on it since it's based on 4K in most games and most games run only 60 fps while my PC can be at 120 at 1080p. Not sure where to play when I get a 4K TV tho.
Modern consoles don't run 4K native. With the same settings and scaling tested here, my 3-year-old 3070 8 GB does nearly double the FPS, without even enabling frame generation.
Hi Richard, You mentioned it but does the Series consoles use Hardware Accelerated Variable Rate Shading (VRS) in Cyberpunk? If so, it seems like their implementation isn’t as performant as the Gears tier 2 implementation. I’ve requested this CDPR’s forums for cyberpunk but no response. Can you ask the developer? Thanks for all your efforts and Happy New Year!
@@thenobleyouthofmadness55 Tier 2? It's not about whether it's implemented, it's about how, because Xbox has some capabilities beyond PC DirectX. Also, devs can use 8 cores at a higher clock speed or 16 threads at a slightly lower clock speed; there's a lot to know.
People are always surprised when the PS5 performs better since the Series X has a higher teraflop count, but teraflops don't mean anything when games can't be designed and optimized perfectly. I own both an Xbox and a PS5, and the Xbox screen tearing problem still hasn't been fixed for most games. I mainly use the PS5 to game and the Series X just to play some 360 games nowadays.
@@E-087 yeah that's the strange part: for some reason the Xbox (a more powerful console on paper) with FreeSync always performs worse than the PS5, a "less powerful console on paper". Makes you really think. I personally think the GPU clock on the Series X is terribly designed or optimised, and that causes this. Also, PS5 has VRR now.
I wonder when CDPR is going to fix the shit shadow and lighting render distance (1:04, 2:41, 2:57, and a lot more), with them popping in just meters in front of you when not using ray tracing or path tracing. All I can do is believe they did it just to make RT/PT look better, because there is no way they haven't noticed this when it has been in the game, on all platforms, since launch.
can someone explain why the physical ultimate edition is 70$ (CAD) but on gog and xbox store its 105$ for digital? Did they forget to update the price?
I didn't even realize that was the sale price on GOG; even on Steam the Ultimate costs $110. Wth CDPR. Tbh I've seen this before, where the Ultimate costs just a tad less than the base game and its DLC bought separately; that appears to be the case on GOG right now, and I've seen it with the Witcher 3 Complete Edition on Steam and GOG too. It's really just imitating bundles, where it's cheaper than buying things individually. Even at full price, the $110 is still cheaper than the $120 it would cost to buy everything individually, which is nice, but you could just make that the price of the normal product? For the consoles I can see the appeal of a new version that's actually playable from the disc itself, but it's really stupid that CDPR didn't do parity by putting the DLC on disc like the Xbox gets. The stuttering can be fixed on Xbox, so those buyers are better off than PS5 users, who will never get the DLC if they buy it used.
The performance on Series X compared to PS5 is purely down to the developer's bad optimisation. This is CDPR we are talking about; they seem more interested in adding new tech preview features than fixing long-standing issues.
I really wonder what Microsoft is doing differently than Sony with the in theory more capable hardware. It's such a letdown every time. PS5 almost always gives you a better performance. And of course the better content.
It just shows the importance of time. Basically, developers had significantly less development time with the Series X than with the PS5. Some developers spent more time optimising for Series X, its performance advantage was put to use, and it beat the PS5, while others said "it's good enough" and stopped there. If you think about it in terms of cost, it makes sense: developers won't care about a 3-4 fps drop somewhere in the game after it releases on a specific console. If users complain, then yeah, but if they don't, why change?
People forget that the PS5 was built around developer opinions and what developers actually wanted in it. When you listen to developers and build the console around what they want, you end up with better performance.
I wish they would mention the over sharpening FSR flicker issue on Xbox. Otherwise it looks like the developers are content not putting the work in to fix it.
At first I was quite excited for VRS, but every day that passes it seems it was all very overpromised. Seems that it doesn't mix very well with upscaling or DRS, pillars of modern game rendering. And the fact that PS5 doesn't support it natively (Although foveated rendering works fine?) won't give the feature any more wings.
Mate, it will get wings when it comes to VR games. When you move your eyes in a VR game, a dev could use it for a kind of pixel culling: after all, the engine could drop small fine details in whatever you're not looking at, helping with performance.
@@DavidSmith-bv8mv Yeah! Idk why I don't mentally lump foveated rendering with VRS but at the same time mentioned it hahaha. But yes, foveated rendering is and has always been the real deal. I'm talking more about tier 2 VRS.
@@thelawyer95 Well, Switch got Witcher 3, so it's not impossible for them to give Nintendo more attention. The likely Switch 2 will probably have some form of DLSS 3 to help it along as well.
I've definitely noticed more performance issues on Series X since the 2.1 patch and Phantom Liberty which is why I clicked on this video. I've experienced it in the other areas of Night City as well, so I don't think it's just Dogtown. I also own Cyberpunk on PS5 but I would rather play it on my Xbox. Pretty disappointing tbh.
I've noticed that DF seems to use a benchmark sequence that is different from the built-in one. I wonder how they created this sequence and got it to play out by itself?
Will DF be covering the FSR3 mod for Cyberpunk 2077? I've tested it on a 3090 and it works a treat - pathtracing overdrive went from 45fps to 71fps at 4k dlss performance!
@@Radek494 console wars were super cool when I was like… 12 years old and it was 2006. It’s 2024. Pick your console and move on with it. No need to be so aggressively rude towards someone’s personal choice in which gaming device they chose.
@@Radek494 Probably because the PS5 version is the weaker one. Lower resolution floor, missing building in the distance at 9:06; there are so many cutbacks in that version, no wonder it's almost locked.
Oh, its Darth Richard!
He really looks like Dark Helmet. It's great!
😂😂😂😂
Richard Palpatine would be more like it.
Is it bc he wears black hoodie?
Let the Bespoke flow through you
Genuinely interesting! Clever idea Richard. It's a shame we can't achieve this on more cross-platform games.
It does seem silly to be able to save during a benchmark tbf. I feel like the benchmark(s) should just be accessible on console. What's the harm?
People would be able to see what's different about say perf and quality mode without necessarily having to do the whole intro in the mode they don't prefer.
@@Ayoul Maybe the console manufacturers have a policy to not allow it? Even if they don't there really isn't much point to have them on console. Just us nerds geeking over it
Honestly, the most intensive part of Dogtown is actually entering Dogtown, plus its main club. So maybe a benchmark sequence where you enter Dogtown, then head into the club and walk around its main floor, would work better for benchmarking Phantom Liberty?
my RTX 4080: "say what?" lol
PS5 is not stable here.
@@iPh1l1pp
The question is: 'Is Xbox Series X or S performing better than PS5 in the same scene?'
Yeah, the 4070 Ti is struggling; not enough VRAM, which sucks.
Dogtown
I loved the Cyberpunk-style glitchy transitions on the overlay text!
You know a video is gonna be good when Richard pulls out the Frankenstein PC
A Columbo reference in 2024? I'm not mad, we just got done watching the entire series on Blu-ray last month and it was great!
The original 70’s run of Columbo is the GOAT.
@@Ryotsu2112^This.
Check out Gianni, brother.
Needs a reboot
@@jasonsmith530 Few things ever need a reboot. How would you ever do it better?
Cyberpunk is going to beat Rise of the Tomb Raider record of the most covered game by Digital Foundry
Actually it probably beat it already
@@znubionekyeah definitely. The overdrive coverage alone is already a ton.
I have a theory for why some games perform better on PS5 relative to the XSX: those games are fill-rate limited on the XSX. Both machines' GPUs have the same number of geometry units and ROPs, so while the XSX has a compute advantage over the PS5, the PS5's higher-clocked GPU actually delivers around 22% higher theoretical pixel and geometry fill rate than the XSX's lower-clocked but wider design. That could explain the performance difference in some games on both platforms, as well as why some games render at a higher resolution on PS5 (The Touryst maybe?). Or the XSX versions are just not as optimized as the PS5 code of the same game. Or a combination of both.
Great job DF! Keep up with your good stuff!
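The fill-rate theory in this thread is easy to sanity-check with back-of-the-envelope arithmetic. This is a rough sketch using the commonly cited public specs (64 ROPs on both GPUs, 2.23GHz vs 1.825GHz clocks), not official documentation:

```python
# Both GPUs are widely reported to have 64 ROPs, so theoretical pixel
# fill rate scales with clock speed alone.
PS5_CLOCK_GHZ = 2.23   # PS5 variable "up to" GPU clock
XSX_CLOCK_GHZ = 1.825  # Series X fixed GPU clock
ROPS = 64              # same ROP count on both consoles

ps5_fill = PS5_CLOCK_GHZ * ROPS   # Gpixels/s, theoretical peak
xsx_fill = XSX_CLOCK_GHZ * ROPS

print(f"PS5 {ps5_fill:.1f} Gpix/s vs XSX {xsx_fill:.1f} Gpix/s "
      f"({ps5_fill / xsx_fill - 1:.0%} PS5 advantage)")
# → PS5 142.7 Gpix/s vs XSX 116.8 Gpix/s (22% PS5 advantage)
```

With matching ROP counts the fill-rate gap is just the clock ratio, roughly 22 percent in the PS5's favor.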
What about the faster ssd on the ps5, does it help to get better performance?
@@KT-83 I think it can, but across a wide range of games, I think the GPU setup is the main reason here
The PS5 has a higher GPU clock and is more stable in stress situations than the Xbox. PC GPUs work the same way.
Well, not really; it's simpler than that: pure optimization. The PS5 uses the same API family as the PS4, while Xbox uses DX12, and most games coming out actually perform better on PS5. DF have wondered why the Series X advantage in GPU and CPU hasn't shown up, for example in BG3: even with a better CPU on paper, both Series S and X perform worse than the PS5 in the heavily CPU-stressed area.
Yeah, the Xbox Series X clocks are terrible, even compared to a midrange PC part. They basically traded clock speed for more CUs due to heat concerns.
The thing is, it shouldn't run much differently than the PS5, since the CU count is considerably higher, so some bad optimization is definitely part of the cause here.
The benchmarks are really interesting, but what I really want is better handling and tire grip for the vehicles.
Yeah, I played it again last night and the car handles like dog crap. Don't know what they did, but the car skates around for no reason.
@@randomcro24 fr
That's one hell of a rabbit hole Richard. Thank you so much for this amazing piece of content.
It's so crazy to think I've been watching you guys for 13 years. It's gotten to the point that I'm genuinely concerned for your health sometimes as we all grow older. You guys have inspired me more than you could possibly imagine. I'd just like to say thank you for all your dedication. You guys have saved me from a lot of bad investments, and beyond that, really shaped my love of video game tech and performance analysis.
bruh....
Lmao
The best time to delete this comment was immediately after posting it. The second best time is now.
@@FidolorousWhat’s wrong with being concerned for someone?
@@pyrox6585 People on the internet fill their kicks by dunking on others like that. Maybe they'll grow up one day.
Playing this game on ps5 + oled really is amazing. Can’t stop playing
you play on performance or quality?
You find empty games fun? 😂 You'll love death stranding.
bro there's screen tearing on oled with ps5 performance mode, vrr doesn't work
@@rishav_566because most monitors start from 48 hz to 180 hz or whatever their maximum number is. 30 hz is way below the monitor's lowest possible refresh rate i.e. 48 hz
Really need to pixel count at specific moments when the XSX is losing performance and the PS5 isn't, just to verify the wider dynamic resolution window is being used. That alone would verify or invalidate the theory, at least to a certain extent. I personally find it unlikely both consoles are locked to their minimum resolution for the entire benchmark. I think it might come down to APIs and their optimization.
Exactly its simpler than what most people seem to think
AFAIK DirectX has more overhead than what Sony uses on PlayStation; I vaguely remember a developer saying so, if I'm not mistaken.
Oliver did a comparison before, and Xbox uses VRS as well.
@@machinefannatic99 dang so xbox performance is really disappointing
Lowest 1080p on PS5 and lowest 1152p on Series X and highest is 1440p for both according to the accompanying article on their site.
There's a whole structure starting at around 09:00 that displays on Xbox Series X and not on PS5. Not sure why this wasn't discussed at all in the video, but I'm curious if Xbox runs worse because more is loaded even when it can't be seen. That would explain why equivalent PC parts run the game better at Xbox Series X settings.
wow very interesting results. curious that a minor difference in native resolution could create such a gap in performance! also, welcome back hope your new year was great!
The difference in resolutions does not justify the huge drop in performance. Richard shows that at 12:22. The most plausible theory is the extra pixel rate and cache speeds that both PS5 and RX 6700 have in comparison to Series X GPU.
@JaimeIsJaime I did see that as well. Still, it's nice to have some insight into the consoles, seeing as there aren't standard benchmarks on them.
The difference in resolution and framerate is a similar ratio but using VRR completely eliminates framedrops on series X resulting in smooth performance at higher fidelity. It's consistent with many games but the resolution bump on xsx varies by game.
@@seanskatesalotps5 also has vrr.
Even though resolution can drop lower on PS5 than on Xbox in this game, it doesn't mean the resolution is actually lower during this comparison.
From 9:05 onwards, some of the background is missing on PS5. Maybe not rendering that stuff is helping with performance; it's evident in that particular scene, but it may be happening throughout the whole benchmark 😳.
Not surprising. Dropped frames have been a common theme of this generation. As a Series X owner it's frustrating, but Sony clearly focused on the correct things when building its console and dev tools.
It's literally the same on both consoles, going by my experience of having Cyberpunk on both.
The framerate drops are ridiculous.
@@TheRealGeorgeGibson Yeah, it's pretty bad in some areas, like outside V's apartment. There's also some screen tearing at the top when it happens, and VRR doesn't help either. Is it the same for you too?
May not be possible, but would be fascinating to see current PS4/Xbox One versions against whatever PC version matches!
xbox one and ps4 "current" what year is this?? 2012?
@@nick13b I think they are referring to the latest patched versions for those consoles, not calling them current gen.
Dashing in that hoodie rich, you could almost say you have a bespoke look about you today, in the here and now.
Richard, this is absolutely brilliant! Fascinating information you’ve unearthed. Thank you so much for sharing this with all of us!
I'm thinking the API situation is more favorable on PS5; probably easier to work with.
Xbox requires more work with its custom DX12U, and since it's not the preferred console, the GPU performance is often (not all the time) worse, despite its GPU being more powerful.
The main API on the Xbox is actually not DX12 :)
@@thanksbetotap DX12 Ultimate ?
@@ThibautMahringer Both consoles have custom lower-level APIs that are more efficient than DX12 and Vulkan. Though there are still shared aspects between the Xbox and regular DX, don’t get me wrong :)
@@thanksbetotap Interesting. I've heard the PS5 API is still easier to get a grasp on, though, so I'll modify my original comment to include this correction.
Thanks !
@@ThibautMahringer Oh, thanks.
I think when devs complain, two things to keep in mind: I think it’s true the PS API is a little bit easier, but oftentimes devs are talking about the actual tooling and not just the API. And also, the Xbox dev tools were more rough at launch but did improve a fair amount over the next year or so. That’s because Sony’s hardware dev timelines were earlier than Xbox’s, but that was a tradeoff because Xbox wanted some newer hardware features and were willing to wait a bit longer. Everything is tradeoffs, of course.
Sony’s dev tools are definitely quite nice by all accounts!
This is what engineering is about: how a 36 CU chip can beat a 52 CU chip while being much smaller, thanks to skipping memory segmentation (and adding the GPU cache scrubbers, higher pixel fill rate...).
Indeed
The PS5's CUs are clocked significantly higher than the Series X's. The Series X may have 52 CUs, but they run much slower: around 1.8GHz, versus 2.2GHz on the PS5. Across the board this gives the PS5 a significant bump. Then there are the cache scrubbers that help keep the GPU caches clean. The PS5 may be closer to an off-the-shelf PC, but it's still a console through and through, with a lot of customizations to help it along. Meanwhile, MS consoles have always been geared toward being PC-like but cheaper, with minimal customizations.
Which is even funnier if you think about the silly teraflops battle back in 2020.
I think Sony was right that 36 CUs at a higher clock can beat 52 at a lower clock. Sony also focused more on making optimisation easy for devs, and that's the main thing. MS just chased higher on-paper power so they could market the "most powerful console", but once they tested the PS5 they quietly stopped saying it, so they must have seen they couldn't prove the power advantage in actual games.
Because it's clocked much higher than the 52 CU chip: a 400MHz difference, in fact.
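The 36-vs-52 CU debate above comes down to how the headline teraflop figures are derived. A rough sketch, assuming the usual RDNA 2 rule of thumb (64 shader cores per CU, 2 FP32 ops per clock via FMA) and the commonly cited clocks:

```python
# Derive the oft-quoted FP32 TFLOPS numbers for each console.
def tflops(cus: int, clock_ghz: float) -> float:
    return cus * 64 * 2 * clock_ghz / 1000

ps5 = tflops(36, 2.23)    # narrow-and-fast design
xsx = tflops(52, 1.825)   # wide-and-slow design

print(f"PS5 ~{ps5:.2f} TF, XSX ~{xsx:.2f} TF")
# → PS5 ~10.28 TF, XSX ~12.15 TF
```

Compute favors the Series X, but fixed-function hardware (rasterizers, ROPs, command processor) runs at the clock, so anything that scales with frequency rather than CU count favors the PS5. That's why TFLOPS alone don't settle the argument.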
This was very interesting. I wonder if CDPR will patch the Series X version now that this is public information. Surely changing the DRS range is nothing more than a single value edit, and let's be honest: no one is going to notice the difference in resolution, but they will notice the difference in performance.
You can see strange flashes of light on Xbox Series X; they're more visible at night (an FSR problem?). I'm not sure, but I didn't have this before. It was during the "Pisces" mission (Judy) that the game started to freeze up, and since then I've had problems with flashes, shadows and stability. We see the same thing in this video. Seriously, I was playing comfortably until that mission, but now I don't want to play because the flashes are so distracting.
It's bc of VRS; it was really buggy afaik.
I remember that problem
It was miles better before the FSR2 update... it's a shit show right now. Not gonna bother with the DLC until they fix it.
@@gabd.5299 "not gonna bother with the dlc until they fix it" then u wont ever play it since they dont care about fsr2, only nvidia tech so dlss and rt lol
@@krspy1337 On console, only FSR2 is available.
I recently had a breakthrough with this game... I just purchased a ROG Strix 17 laptop with an AMD Ryzen 9 7945HX3D and an RTX 4090 mobile GPU, and it's able to run Cyberpunk at ultra settings with path tracing, using DLSS frame generation, at 80+ FPS. What CDPR has done with the graphics in this game is mind-blowing.
I just recently got into this game and it's technically impressive. Running it on a vanilla 3080.
Rich looking like Emperor Palpatine in that chair.
The RX 6700 has a memory bandwidth of 320 GB/s, while the PS5 boasts 448 GB/s. I'm not sure how much Infinity Cache helps, but I don't think it's enough to offset a 128 GB/s difference in bandwidth.
Moreover, this APU kit uses PCI Express 2.0 x4, leading to serious data transfer issues and performance loss. The PS5, on the other hand, doesn't face data transfer problems. In summary, the 6700 in a PC with PCIe 3.0 x16 would perform significantly better.
That's true and it is a factor, but people forget the PS5's 448 GB/s is SHARED bandwidth, so a decent portion goes to the CPU, lowering the real-world number available to the GPU.
@@DavidSmith-bv8mv but as far as I know, 10 GB of ps5's memory are dedicated to the GPU by standard, which can be changed by the developer if needed. This would set the amount of VRAM between these two to the same level.
@emmyjaeger22 They're talking about memory bandwidth, not the total memory amount (despite writing GB, a unit of capacity, instead of GB/s). The bandwidth is shared between CPU and GPU on the consoles. On PC, the GPU gets 320 GB/s from its onboard VRAM, while the CPU gets its own ~48 GB/s per module of standard DDR5-6000 RAM, i.e. 96 GB/s with two modules in dual channel as most PCs are configured. 320 + 96 GB/s is more or less equal to the 448 GB/s available on the PS5.
@@DavidSmith-bv8mv Actually, CPUs aren't nearly as bandwidth-dependent as GPUs. So I'm guessing Sony gave the CPU cluster exactly what it needs, while leaving the majority of the bandwidth to the GPU cluster.
@@emmyjaeger22 Not really. Sony only sets aside a small amount for OS functions, while the rest can be split however the developer wants. Plus, all of it runs at the exact same bandwidth.
The most bizarre thing is that Microsoft probably paid a lot for those chips, because they are way bigger than the PS5 SoC. Still, Sony was smart enough to compensate with thermals and clocks while paying way less for their APU. Not to mention Sony sells more consoles, which probably helps with cost.
Other games perform better on Series X, running at higher resolutions and frame rates. Guess it depends on what the game is doing with the GPU.
I would suggest not getting too much of a conclusion on just one game.
@@alvarosanchezleache936 Most games run better on PS5 though, even though spec-wise they shouldn't. Something is off with Xbox.
Exactly. I've seen it jump between both consoles with vastly different games. I have both a PS5 and a Series X, plus my gaming PC with a 3060 Ti and Ryzen 5 5600X. It's still crazy to me that my PC runs every game great at 1440p, but the PS5 still kicks my PC's ass at 4K (yes, I know my 3060 Ti does much better at 1080p and 1440p), especially in games that use 4K, 120Hz and VRR for 40-plus fps.
@@alvarosanchezleache936 Of course they wouldn't admit to that.
Most likely a mixture of fill rate + memory segmentation with a little bit of API issues with Xbox.
Why would it be an API issue, when the game and all of its RT tech were built with DX12 in mind going forward? It's a PC game first and foremost, after all; if anything it should favor the Xbox pretty well. So I would personally discount that as one of the reasons, but memory segmentation is definitely a big possibility.
Xbox uses custom windows on top while PS uses a lighter linux kernel. The OS overhead is much lesser on PS5.
Well, or it's the PS5's Kraken decompression chip that gives it enough headroom.
PS5 has a much higher fill rate than Series X.
@@Sand_1995 PS5 uses BSD, not Linux.
These tests highlight the deficiency of your coverage very well. The horror show of the lighting LODs is very apparent here. How can you look at these benchmarks and give those graphics a pass? There are entire buildings that get darker as you approach them; it's just bizarre. We have an entire generation of games now where the speed of light seems to be about walking pace and the world gets built up slowly around you as you move. If you didn't keep praising this, maybe we could have visually stable games.
This was pretty awesome I appreciated this!!
Just speaking to resolution alone: 1152p with VRS was clearly deliberately chosen by CDPR as better than 1008p. That is, after all, the entire point of VRS: to render at a higher resolution, at the same or lower hardware utilization, to improve overall image quality.
Yet the image quality is still worse than PS5 (as shown in DF's full review of Phantom Liberty), and so is the performance. Looks like Sony knew what they were doing when they kept that trash out of the PS5's pipeline.
Rich, have you ever shared how to actually initiate the city streaming test that DF has used in their videos? I've wanted to try it out for a long time but Google always has me come up short! I can't figure out how to run anything other than the standard benchmark.
I believe it's a mod.
@@WickedRibbon It runs on consoles so not a mod. Don't know why they are so cagey about it.
Apparently it's run through a hidden dev debug menu
I've always found it rather interesting that despite the Series X having a GPU compute-unit advantage over the PS5, it runs at a locked 1.825GHz, compared to the PS5's variable 2.23GHz boost, which mirrors the clock behavior of RDNA 2 GPUs on PC. I wonder why Microsoft went with such a low clock speed this time around, and how it affects relative performance, because on paper the Series X should always have the advantage. We saw that advantage with ray tracing in the Control test you guys did, which makes sense given how RDNA 2 handles ray tracing. Speaking of which, I feel Sony has done a far better job of showcasing ray tracing capabilities on console despite having the inferior hardware for it.
This was a very interesting video, I love these investigative pieces into gaming hardware.
Maybe the Heat.
Heat is what they were thinking about. The problem is that a higher-clocked design often performs better in practice, especially CPU-wise. Ever wonder why some lower-spec Nvidia cards have outperformed top AMD cards for years?
The PS5 uses Liquid metal. There's your answer.
Well, their SoC is massive. The extra cost, coupled with 56 CUs (4 are disabled) and the heat stability of a part that size in 2020 running at over 2GHz in a console box, was too risky and would have been super expensive. So they went with the wide approach versus the PS5's narrow 36 CU approach at 2.23GHz (which still needs liquid metal to keep it cool). Plus, higher-capacity GDDR6 modules wouldn't have been cheap either.
they use it for their Cloud Gaming Servers. Keeping them cool is a priority.
The combination of hardware decompression and superior throughput lets the PS5 get more performance out of its GPU. The difference is that the PS5's memory architecture was designed for games. I'm surprised Rich didn't come to the conclusion that the PS5 is simply able to feed its GPU faster.
Yall so called hardware engineers aka Sony lapdogs have no clue wtf u talking about.
@@Ouail98 You must own the Series L, losing to a console that's 2 teraflops less.
@@Ouail98 Well explain it then, since you seem to be an expert and helped develop both consoles.
I don't think that tracks, I mean I suppose it could in this game, but we have other instances where we don't see this issue. I highly doubt the decompression block of the Series X at 6GB/s would be the issue. It was designed by Microsoft so that decompression would never be an issue. Now if you are saying this game exceeds 4.8GB/s the top potential of the SSD in the Series hardware while using compression, then you might be right.
To me this looks more like the slower GPU clock speed of the XSX being the issue. I say that because even a 6650 XT can get pretty close to Series X performance at medium settings, all while on a 128-bit memory bus and with roughly 38% fewer CUs; it's that substantial clock-speed advantage that makes it comparable.
Nice high detail textures on Richard's face. Very normal mappy.
It will be very interesting to see how Cyberpunk runs on the inevitable PS5 Pro. I'm hoping CDPR will update or optimise it to take advantage of whatever proprietary image upscaler Sony is supposedly using on the Pro. Could it improve performance with a more "native"-looking 4K image? This is obviously excluding the ray tracing aspect.
They've already said there'll be no more updates to Cyberpunk 2077 as they move into full production on the sequel.
@@CommodoreMudkip Can't see them not taking advantage of a new system. They say that now because I'm sure CDPR doesn't have access to the new PS5 (if it actually exists).
There is no PS5 pro
@luckyrockmore2796 ... leaks have more or less confirmed a ps5 pro dude 😂
@luckyrockmore2796 Not out but definitely will happen.
As a gamer with a Series X, PS5, and a solid PC (i7-12700K + 4070 Ti), the consistency shown here on the PS5 is the experience that has been sticking in my head, even when playing on my much more powerful PC. On PC, achieving a truly satisfying level of performance in many games requires SpecialK or other methods of framerate locking or performance tweaking. The PS5 is living up to its hype in that most games I've played on there (Spider-Man 2, The Last of Us, the God of War reboots, Ratchet and Clank, etc.) just run so beautifully right out of the box, as it were.
I really wish we could get to a timeline where PC gaming gets that same level of ease to consistency. Yes, we have flexibility, but something on the GPU manufacturer level from Nvidia/AMD to easily and consistently lock framerates and frametimes, along with data similar to SpecialK to help understand what your PC and game are being limited by and causing performance issues. SpecialK is AWESOME, but being an independent project has a few downsides (lots of great upsides obviously). Mainly, developers don't really recognize it even exists and there are many games that simply do not work correctly with it and even claim that SpecialK is a cheating tool. It'd be nice to get something a bit more "official", so these big companies would be sort of forced to recognize a standardized frametime locking software along with perhaps some other, ease-of-use graphical preset features (some games, like Forza, are just completely out of hand with how much time it takes to find a performant set of settings).
This is your first PC? Lol
The resolution is dynamic, which means most of the time both machines run similar trying to achieve 1440p.
And even when we hit the lowest values, this 144p difference is too small to justify that difference in performance.
There’s more than just the resolution difference
@@TheBean87 like what?
@@adrianhosein7698 draw distance 9:06, amount of enemies on screen(the entire chase sequence) etc.
@@TheBean87 They said in the video that it's pretty much the same on both consoles. I checked the video again, and they both look the same and are drawing the same distances.
@@adrianhosein7698 so you just ignore the fact there’s an entire building missing on the ps5?
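The resolution numbers being argued about in this thread can be put in raw pixel terms. A rough sketch, assuming 16:9 internal resolutions for the DRS floors reported in the accompanying article (1080p PS5, 1152p Series X, 1440p shared ceiling):

```python
# Compare total pixel counts at the reported DRS floors and ceiling.
def pixels_16x9(height: int) -> int:
    width = height * 16 // 9
    return width * height

ps5_floor = pixels_16x9(1080)   # reported PS5 minimum
xsx_floor = pixels_16x9(1152)   # reported Series X minimum
target    = pixels_16x9(1440)   # shared DRS ceiling

print(f"XSX floor renders {xsx_floor / ps5_floor - 1:.1%} more pixels "
      f"than the PS5 floor")
print(f"Floors vs ceiling: PS5 {ps5_floor / target:.0%}, "
      f"XSX {xsx_floor / target:.0%}")
# → XSX floor renders 13.8% more pixels than the PS5 floor
# → Floors vs ceiling: PS5 56%, XSX 64%
```

About 14% more pixels at the floor: a real difference, but small enough that, on its own, it doesn't obviously account for the size of the frame-rate gap shown in the video.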
"It's dangerous to rely on Teraflops as an absolute indicator of performance"
-Mark Cerny, PS5 lead system architect
Rich, great video! I was eating, drinking and making merry, but to each his own 😂 Bad news really: I have both consoles and was going to get the Series X Ultimate Edition since it's fully on disc, while Phantom Liberty is a PS5 download 🤦🏻♂️
The Series X may have a fixed-function throughput bottleneck compared to the PS5, resulting from the lower GPU clockspeed. Combine that with the higher resolution target and you have a possible explanation for all observations. I also believe that Sony's AGC graphics API may also have less overhead on the GPU side, much like modern Vulkan tends to do with RDNA2 on PC. D3D12 is weird when it comes to descriptor management and execute indirect, which seems to map to the hardware in a suboptimal way.
9:04 why does the xbox in this case render a different background structure than the ps5? Seems to me that maybe xbox has a slightly higher visibility setting?
Good catch!
Could be, but it looks more like it's completely different versions of the game running on the two consoles. The space port seen at 9:04 was added (on all platforms) in the 2.0 update (not just for Phantom Liberty - it's still visible from outside without the expansion). All platforms also saw a dip in performance with that update across the entire city due to other changes... Lots of discussion about hardware architecture and DirectX vs Vulkan here - for something that's easily explained by the two console ports having different render settings (in addition to higher minimum resolution on the Series X) - or even, as it looks like, benchmarking completely different versions of the game on each platform.
It's about weather effects, nothing to do with visibility. CP2077 has dynamic weather, and it's hard to get the same conditions for a comparison. That's all.
wow, didn't know the PS5 version was sub-1080p sometimes
And the XSX is almost 1080p... the difference is so small it may as well not exist.
This, ladies and gentlemen, is how you make and present a good benchmark: short, simple, clean methodology, reproducible and well explained.
Always a pleasure to see a Richard review
Happy New Year 2024! Gamers DF
We can clearly see the efficiency of the PS5 hardware in action here. I believe the SSD speed and the cache scrubbers on the PS5's GPU both help with asset streaming and keep the GPU working near its peak theoretical performance.
Exactly. It's unfortunate Rich doesn't bring up the point of more efficient hardware doing its job. If it were only GPU-related, the gap on the performance graph would stay consistent; the inconsistent results suggest there's more to it, like you said.
It's efficiency, but it's software, not hardware. They optimized the PS5 version more; if both machines had similar resolution scaling, the Xbox would outperform it.
@@watershipdown The gap in resolution is really small, like 100-150 pixels of width/height, but the performance difference is huge, more like 10fps and up in some places. So it's not that.
@@divanshu5039 You will never convince some people it's the hardware. After watching Mark Cerny's Road to PS5 talk, I knew the PS5 was going to be good even if it was less powerful than the Series X on paper.
Sony addressed the bottlenecks, MIcrosoft threw power at the problem. It will be interesting to see what happens with the next generation and how Sony will improve on what they've already done.
@@watershipdown They target the same resolution (1440p). A floor of 1080p vs 1152p doesn't mean they sit there constantly.
You didn't need to show the Xbox or PS screens, but it's appreciated; it shows your honesty.
After all these years, the PS5 continues to surprise with its performance compared to the Series X lmao
Rich, there are a couple of simple reasons why the PS5 is faster.
1) It has a much higher pixel fill rate (Gpixels/s) than the Series X. Look it up.
2) It has a very fast GPU clock speed, and as Cerny himself said, throughput is raised across the entire chip design on PS5.
3) The PS5 has the fastest decompression and data in/out, compared even to PC. It's custom-built and designed to be extremely efficient.
I'm looking forward to seeing what Cerny does with the PS6 tbh. He knows his stuff.
The Xbox isn't a bad console; Quick Resume is good, and it plays most games very well.
But people need to stop with the tflop rubbish. Look at no. 1 on my list.
None of what you said is factual. The difference you are seeing is nothing more than lead platform benefits. If Devs focused on developing games on the Xbox, you will see the benefits on Xbox over PS.
@@azzaprime7257 That's the same BS that guy said, just in a different format.
@@XAV-117 Go look up my last post. The only clueless one is you. The Xbox was never more powerful. If it was it wouldn't be behind in games from the getgo.
When Rich says both versions look relatively similar at 9:02, the PS5 version is missing half the stuff the Xbox version shows. LOL!
Thank you. Great work on the information gathering and hard scientific research on this topic. 10/10 I enjoyed this video very much.
That's really helpful and informative!
absolutely lovely video
Richard you're looking very Sith Lord like today.
Interesting testing by Richard and the DF crew 👍
The performance disparity is just the fact that the devs spent more time optimizing PS5. Most devs do. Especially European devs.
It's business: you cater to the majority, not the minority. PS5 sells 3 to 1 over Xbox.
Casual gamers buy more $70 games than high-end PC gamers do.
@@Crashed131963 You forget that MS had the marketing rights with CDPR for CP2077, so your assumption makes no sense whatsoever. Some games run better on PS5, some on Series X; that's just the way it is.
@@Crashed131963 Bruh, wtf are you going on about? Microsoft had marketing rights… it wouldn't make sense lol
@@pandagonerogue.140 Why does marketing rights mean the developer automatically spends more time optimising for a particular console?
Realistically it's more likely to come down to the PS5 OS being much less of a hardware burden than the Series X OS, but more optimization for the more popular console makes sense too.
It's a common thing for PS5 to have better performance in this gen.
So no, it does not have to do with "European devs."...
Rich, going down a rabbit hole... no, never, say it ain't so! ;) Happy New Year, Rich.
Good to see the old myth of console performance being so much better than equivalent PC hardware is dead and buried, as it has been for over ten years now. Equivalent PC hardware is essentially just as fast, and you do not need a massively more powerful PC than a console to get equivalent or better performance in most titles.
To be fair, the PS5 was performing better than the equivalent PC.
@@Omar-kl3xp The PC kit used is an unusual OEM-only integrated system released for China, with a disabled GPU. The discrete 6700 GPU is chosen to try to approximate console performance as closely as possible (but still imperfectly, as mentioned, given the smaller memory bus). Within a margin of 1.2 percent it's barely worth mentioning. Certainly not even remotely as dramatic as people who imagine a console advantage of 30 percent, or even 10, might expect. It's next to nothing in this test.
@@pgr3290 I think a bigger performance gap will only happen when you develop exclusively for the console; there are still differences between the two. If the game is fully optimised and fully made for that particular console, then we will most likely see a bigger performance gap. But it is true that consoles are becoming more and more like PCs.
This stopped being true in the PS360 era.
For the cost it is "better": if the equivalent components cost more to get roughly the same performance, what is being busted? The "old myth" is still there.
Well that was clever of you, nicely done! I don't think I would ever consider trying to abuse cross-play save games to run benchmarks. =P
Oh yeah, that’s what I mean by holiday content.
The real question this video raises is: why does Rich suddenly get a hero light (except it's not fake)? Guerilla would be proud.
so cool that this is even possible
Super cool stuff. Would be interesting to see how the downclocked RX 6700 with console settings performs in a standard PC (something modest, like an i5-12400 for example) and see if it drops frames like it did with the desktop kit.
Amazing stuff once again! :)
Been a year since you uploaded. Really miss those videos :(
@@sarthaksharma9152 - yeah, I'll start doing it - hopefully starting this year. I've been working for Digital Foundry in the meantime. :)
Amazing video
In terms of pixel count the PS5 has a considerable performance advantage, if it can drop 552960 pixels that's rendering 26.5% fewer than Series X on the lower bounds. If future tests allow for a matched DRS range I would expect them to perform more closely, though boost clocks on the PS5 GPU may still keep it slightly ahead in some scenarios.
Think your percentage difference is slightly off, but yeah I agree...
PS5 1,792×1,008 = 1,806,336
XSX 2,048×1,152 = 2,359,296
2,359,296 - 1,806,336= 552,960 difference
PS5 is rendering 23.4% fewer pixels.
XSX is rendering 30.6% more pixels.
This all obviously assumes PS5 is also bottoming out in this scene. People undervalue just how many pixels this difference is... 1008 and 1152 sound pretty similar but they really aren't in terms of resolution.
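As a sanity check, the arithmetic in this thread can be reproduced in a few lines of Python (the two resolutions are the DRS lower-bound figures quoted above; the script is just an illustration of the percentage math, not anything from the video itself):

```python
# Pixel-count comparison between the quoted PS5 and Series X DRS lower bounds.
ps5 = 1792 * 1008      # 1,806,336 pixels
xsx = 2048 * 1152      # 2,359,296 pixels
diff = xsx - ps5       # 552,960 pixels

# The percentage depends on which console you take as the baseline:
ps5_fewer = 100 * diff / xsx   # PS5 renders ~23.4% fewer pixels than XSX
xsx_more = 100 * diff / ps5    # XSX renders ~30.6% more pixels than PS5

print(f"{diff:,} pixel difference")
print(f"PS5 renders {ps5_fewer:.1f}% fewer pixels")
print(f"XSX renders {xsx_more:.1f}% more pixels")
```

The two figures differ only in the denominator, which is why "difference" (23.4%) and "change" (30.6%) give different numbers for the same 552,960-pixel gap.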
@@brewski535 23.4% is indeed correct. I hastily calculated difference rather than change - appreciate the correction!
@@brewski535 Thats very negligible differences for there to be a 10-15 fps difference, also those are limits, it's doesn't mean PS5 is rendering at the lowest resolution.
@@adrianhosein7698 I think it's safe to assume the PS5 will be rendering at or near its lowest DRS. XSX dropped 8% more frames but if it's running at 30% higher resolution at times that would account for much of the performance difference. Without pixel counting those moments we don't know for sure but it's likely resolution is at least part of the difference here. A 30% higher resolution would cause a massive difference in terms of performance, even if it's only for part of the benchmark.
@@brewski535 that makes no sense, sir. The point of dynamic resolution is to help with performance: if Xbox is dropping frames more frequently, it means the Xbox version is at its lowest bounds more often. PS5 holding 60 consistently suggests there's more headroom, which means the PS5 most likely isn't at its lowest bounds as much as the Series X is. Remember, dynamic res kicks in when frames drop, so if the PS5 is dropping fewer frames, then logically it's at its lowest bounds less often.
Curious to understand why the beefier Series X GPU is "limiting" performance compared with the PS5's smaller GPU, even when XSX is using VRS.
That's weird and interesting!
PS5's GPU runs at a 400MHz higher clock speed than Series X. Might help some.
@nightowl3582 Yeah, but it still doesn't explain it. The Xbox GPU has 2TF more plus additional features, so this scenario is interesting to say the least.
PS5 has higher clocks and an OS that uses far fewer system resources.
@@rjanuari1978 TF means nothing in the real world. It's just a dumb number.
It's simple: the Xbox is engineered like a mass-produced custom PC build, while the PS5 is much more precisely engineered and designed in all aspects, especially software, where Microsoft uses a modified Windows running clunky, goofy DirectX.
Good video, thank you for the content :)
awesome video Rich, appreciate this type of content!
Ow man, just imagine what will happen once the PS5 PRO comes out!💀💀💀
What? The most powerful console ever made is losing to the PS5? Wasn't the new SDK supposed to fix this and make the Xbox more powerful again?
Just goes to show how numbers like core counts and teraflops don't tell the whole story.
PS: It amazes me that Rich says the Xbox is at the same level as the PS5 in those tests. Wow, I know DF is Microsoft-friendly, but that was a bit too much.
Why can't it be streaming bandwidth, which is higher on PS5 than Xbox Series X? The Series X generally seems to perform worse in games despite a higher CPU clock speed, including in Baldur's Gate 3.
I've seen it jump between both consoles with vastly different games. I have both a PS5 and a Series X, plus my gaming PC with a 3060 Ti and a Ryzen 5 5600X. It's still crazy to me that my PC runs every game great at 1440p, but the PS5 still kicks my PC's ass at 4K (yes, I know my 3060 Ti does much better at 1080p and 1440p); especially games that use 4K, 120Hz, and VRR for 40-plus fps on PS5. It's amazing.
Are those games actually running at native 4K on your consoles, or via dynamic res, checkerboard, or some other kind of upscaled rendering like DLSS? And have you tried doing the same on your PC? Because I don't think many AAA games on consoles use native res while keeping the same quality of graphics settings.
Unless your card struggles at 4K overall, regardless of game?
@@ruekurei88 8GB of VRAM is not enough for 4K upscaled in modern PC games, even with DLSS or FSR: you're still loading 4K assets and VFX. Even my 10GB 3080 struggles with it; I get stutters and performance degradation unless I disable ray tracing entirely (which is very VRAM hungry) or drop the resolution down a notch. A good example is Spider-Man. 1440p is also a challenge, but if you keep ray tracing on High it's okay; Very High is out of the question.
I also have both a PS5 and a PC with the same specs. It's amazing how the PS5 can be similar to or better than my PC. But I only have a 1080p monitor, so I can't really see the PS5's quality on it, since it's based on 4K in most games and most games run at only 60fps, while my PC can be at 120 at 1080p. Not sure where to play when I get a 4K TV though.
Modern consoles don't run native 4K. With the same settings and scaling tested here, my 3-year-old 3070 8GB does nearly double the FPS, without even enabling frame generation.
@@ruekurei88 I agree it's rarely native res
It would be sooo fun to see Cyberpunk with RT reflections + GI medium and the FSR3 mod.
What are you gonna do during your holiday brake?!
Richard: "Work"
What does Richard know about "brakes " . He's on the gas pedal all the time !
Hi Richard,
You mentioned it, but do the Series consoles use hardware-accelerated Variable Rate Shading (VRS) in Cyberpunk? If so, it seems like their implementation isn't as performant as the Gears Tier 2 implementation.
I've requested this on CDPR's forums for Cyberpunk but got no response. Can you ask the developer? Thanks for all your efforts, and Happy New Year!
Yes, in the Phantom Liberty expansion they confirmed the use of VRS on Series X and S.
I see it in patch 1.5 for PC. Which patch number and notes say this?
@@thenobleyouthofmadness55 Tier 2? It's not about whether it's implemented, it's about how, because Xbox has some capabilities beyond PC DirectX. Also, devs can use 8 cores at a higher clock speed or 16 threads at a slightly lower clock speed; there's a lot to know.
Thanks for your work, this was most interesting!
People are always surprised when the PS5 performs better even though the Series X has a higher teraflop count; teraflops don't mean anything when games can't be designed and optimized perfectly. I own both an Xbox and a PS5, and the Xbox screen tearing problem still hasn't been fixed for most games.
I mainly use the PS5 to game, and the Series X just to play some 360 games nowadays.
Strange, as far as I know Xbox has FreeSync and PS doesn't.
@@E-087 yeah, that's the strange part: for some reason the Xbox (the more powerful console on paper), with FreeSync, always performs worse than the PS5, the "less powerful console on paper". Makes you really think. I personally think the GPU clock on the Series X is terribly designed or optimised, and that causes this. Also, the PS5 has VRR.
@@E-087 PS5 has had variable refresh rate for some time now.
@@E-087 PS5 has VRR tho.....
I wonder when CDPR is going to fix the shit shadow and lighting render distance (1:04, 2:41, 2:57, and a lot more), with them popping in just meters in front of you when not using ray tracing or path tracing. All I can do is believe they did it just to make RT/PT look better, because there is no way they haven't noticed this when it has been in the game, on all platforms, since launch.
Can someone explain why the physical Ultimate Edition is $70 (CAD) but on GOG and the Xbox store it's $105 for digital? Did they forget to update the price?
I didn't even realize that was the sale price on GOG; even on Steam the Ultimate Edition costs $110. Wth, CDPR. Tbh I've seen this before, where the ultimate edition costs just a tad less than the base game and its DLC bought separately; that appears to be the case on GOG right now, and I've seen it with the Witcher 3 Complete Edition on Steam and GOG too. It's really just imitating bundles, where the bundle is cheaper than buying things individually: even at full price, the $110 is still cheaper than the $120 of buying separately, which is nice, but you could just make that the price of the normal product. For the consoles I can see the appeal of a new version that's actually playable from the disc itself, but it's really stupid that CDPR didn't keep parity by putting the DLC on disc like the Xbox gets. I mean, the stuttering can be fixed on Xbox, so Xbox users are golden, unlike PS5 users, who will never get that DLC if they buy it used.
The performance on Series X compared to PS5 is purely down to the developers' bad optimisation. This is CDPR we are talking about; they seem more interested in adding new tech-preview features than fixing long-standing issues.
I really wonder what Microsoft is doing differently than Sony with the in-theory more capable hardware. It's such a letdown every time. The PS5 almost always gives you better performance, and of course the better content.
Yep it’s been a dissatisfying experience
It just shows the importance of "time"
Basically, developers had significantly less development time with the Series X than with the PS5. The issue is that some developers spent more time optimising for Series X, where the performance advantage was made use of and beat the PS5, while others said "it's enough" and didn't develop further.
If you think about it in terms of cost, it makes sense: developers won't care about a 3-4fps drop somewhere in the game after it releases for a specific console.
If users complain, then yeah, but if they don't, why change?
They aren't doing anything; it's all up to the devs.
Could be DX12 being a pain. Look at all the games we have on PC that run terribly.
People forget that the PS5 was built based on developer opinions and what developers actually wanted in the PS5. When you listen to developers and build the console around what they actually want, you end up with better performance.
Amazing work!
Once again PS5 beats the Series X
🤣🤣🤣🤣
xbox is weak af
Fascinating results.
I wish they would mention the over sharpening FSR flicker issue on Xbox. Otherwise it looks like the developers are content not putting the work in to fix it.
Did not expect this to be as interesting as it turned out to be
Very interesting. Glad I got this on PS5. Very good game. In my personal top 25
At first I was quite excited for VRS, but with every day that passes it seems it was all very overpromised. It doesn't seem to mix very well with upscaling or DRS, pillars of modern game rendering. And the fact that the PS5 doesn't support it natively (although foveated rendering works fine?) won't give the feature any more wings.
Mate, it will get wings when it comes to VR games. When you move your eyes in a VR game, think what a dev could do to implement pixel culling with it: after all, for whatever you're not looking at, the engine code could remove small fine details, helping with performance.
@@DavidSmith-bv8mv Yeah! Idk why I don't mentally lump foveated rendering in with VRS, even though I mentioned it at the same time hahaha. But yes, foveated rendering is and has always been the real deal. I'm talking more about Tier 2 VRS.
Honestly i can't wait for y'alls analysis of the inevitable Switch 2 version 😂
The Switch 2 will probably be in the PS4-to-Pro range when it comes to performance. I don't think there will be a Switch version.
@@thelawyer95 Well, the Switch got The Witcher 3, so it's not impossible for them to give more attention to Nintendo.
Likely the Switch 2 will have some form of DLSS 3 to help it along as well.
@@thelawyer95 Closer to PS4, according to the video they did recently.
Leave it to Rich to drop one of these out of nowhere, thanks bro.
Once again, better performance on PS5, it really has become the norm in this gen.
Rich's skin looks so crisp in this video
I've definitely noticed more performance issues on Series X since the 2.1 patch and Phantom Liberty which is why I clicked on this video. I've experienced it in the other areas of Night City as well, so I don't think it's just Dogtown. I also own Cyberpunk on PS5 but I would rather play it on my Xbox. Pretty disappointing tbh.
just don't play on that crappy machine brah
@@luisrendon5869”crappy machine” that overall outperforms the competition.
Thanks for this video and your hard work you put into it, happy new year
I've noticed that DF seems to use a certain benchmark sequence that is different from the built-in one. I wonder how they created this sequence and made it play out by itself?
Will DF be covering the FSR3 mod for Cyberpunk 2077? I've tested it on a 3090 and it works a treat: path tracing Overdrive went from 45fps to 71fps at 4K DLSS Performance!
It wouldn’t be a Digital Foundry video without a comment section full of childish hate for the Xbox. So tired of this console war BS.
Maybe it's not hate but criticism of Microsoft? The most powerful console in the world keeps underperforming against the "weak" PS5 😊
@@Radek494 see that’s exactly what I’m talking about. Why such animosity? At no point did I even mention the PS5…
@@Radek494 console wars were super cool when I was like… 12 years old and it was 2006. It’s 2024. Pick your console and move on with it. No need to be so aggressively rude towards someone’s personal choice in which gaming device they chose.
@@Radek494 probably because the PS5 version is the weaker one: lower res limit, missing buildings in the distance at 9:06. There are so many cuts to that version, no wonder it's almost locked.