Ultrawide gaming is such an underrated way to experience games.
Said no one ever
@@Ineedcoffee-n4j I literally just said it.
@@rakennavarro2557 Ultrawides aren't very welcome in the competitive scene of many games. If you only play casually, having an ultrawide is better, but for competitive games it's more of an issue, because most of those games don't support ultrawide (21:9): the much wider field of view gives an advantage over normal monitors, so you end up playing at a stretched 16:9 resolution instead. Examples: Valorant, Overwatch, etc.
@@rakennavarro2557 Can confirm this. I can see through walls with my 32:9 monitor in Apex :D
@@strangefigure4892 32:9 is too wide lol
Hands down the best ultrawide benchmark I've ever watched.
Ultrawide over 4K. Better frame rates, wider field of view, and close enough to 4K in detail.
Someone that speaks the same language 👍
Yep, I sold my 32" 4K for a 34" 3440x1440 UW.
@@edwardjam9832 Do you like it better? I'm trying to decide between 4K and ultrawide 3440x1440.
@@nxfkry1874 It is way better than 4K. I can't imagine ever going back to 16:9 again... it seems like 4:3 to me.
@@edwardjam9832 That's cool. Thanks man
Thanks for this 21:9 benchmark. :-) Many testers forget 3440x1440.
@Jeremias 97,284 people think you are wrong! Have you ever played on an ultrawide monitor?! It fits the human FOV much better than 16:9. If you've played with one, you won't change back.
@Jeremias lost
21:9 changed my world
@Jeremias boy...... you won't go back once you try it.
@@null643 Yeah you have to mod games for 21:9 support.
Shame this card doesn't exist outside YouTube...
Damn, so true. Half of the stock was sent to YouTubers.
Had mine for two weeks now (MSI Gaming X)
It exists in the hands of scalpers
Lol
Microcenter for the win!
To be honest, I expected the RTX 3080 to push more FPS at this resolution.
@@wuhsabi I think it's preference. Personally, my first priority is textures, so always on ultra with 16x filtering. Then after that I like lighting and shadows, so I pretty much always max those out.
I have the Samsung G9 super ultrawide (5120x1440) and a Strix OC RTX 3090, and the performance is not great at all :/
@@knerkilajnen That seems odd, since it's about 1M pixels less than 4K
@@wuhsabi You're not wrong about gamers understanding the difference between visual settings, but benchmarks and comparisons at ultra settings are still useful to people who run ultra settings, especially on ultrawide monitors, and want to push the limits of a GPU :) I can't tell you how many videos I've seen of 3080 performance at 2560x1440 or 1920x1080. There is a lack of 21:9/3440x1440 3080-specific results out there for people who run such niche displays at max settings, such as myself. Let us have the few videos out there like this one with 21:9 ratios and max settings without complaining lol. Granted, I don't run max settings for every game. For FPS games I will maximise my frame rate and drop down to medium settings if I need to, but for RPG-type games, like Tomb Raider, I will crank to max settings because, to me, that's the most enjoyable way to play. The info in this video is invaluable to people like me looking to make a GPU purchase decision. I don't know why you're describing your frustration on a video that you find useless. I understand you're just venting a bit, but if you find it useless, don't watch or comment on it. Click away, no one is forcing you to watch. It's simple.
Also increasing temp and power limits would go a long way
Personally I'd take ultrawide 21:9 3440x1440 over 16:9 3840x2160 any day of the week. Much more immersive, higher fps, and when watching movies at a 2.35:1 aspect ratio it fills the entire screen. I'm using the popular AOC curved 34" G2 and love it! Just patiently waiting for my 3080 to arrive. 😊
But what about an LG 48 CX OLED? I was all set to go for an LG 34GN850 until a friend got the OLED for his PC setup. The colors and picture quality simply make any IPS look washed out in comparison, not to mention the black levels ;) The problem would be getting a good viewing distance for a 48" monster at your desk, and you'd also have to deal with 4K resolution. But maybe DLSS 2.0 or simple upscaling would do the trick. Some also speculate about running a custom resolution for that ultrawide experience ;)
@@laggmonstret We need 34 inch display with OLED 10 bits, Gsync, 120 hz, 100% DCI p3 :)
@@romainprovost7164 Yeah, we're almost there, but the size isn't. I would have forgotten about ultrawide in a heartbeat if there were a 32" OLED gaming monitor with 120 Hz. But I've decided to restrain myself from upgrading for another year, so who knows what's out by the end of 2021? :P
@@laggmonstret Until they solve the image burn-in on OLED (if they even can), it's not a great choice for games, especially with the static HUDs present in almost every game. The monitor will have bad burn-in after 1-2 years regardless of what you do to prevent it.
I'll be excited for 4K ultrawide if that ever becomes a thing. That'll be the only thing that replaces my Alienware QD-OLED 3440x1440 experience.
DOOM is such a well-optimized game. What gods.
It has ray tracing and it looks so nice; it's even playable at 8K 60 fps. It seems that when Microsoft bought Bethesda, it did it for good.
To be fair, 90% of the environments are baked in, and there isn't much happening in real time compared to some of these other games. It does a really good job of looking amazing, but half the time it reads more like a static picture than a live scene. In my opinion.
@@rafaelvallen1145 Doom doesn't have ray tracing... and Microsoft buying Bethesda has nothing to do with the optimization
id has always been at the forefront of game optimization. The secret is in making sure your occlusion culling is on point and the level design lends itself to it. They have mastered both aspects. It's not just optimizing the engine itself, it's making sure that the level designers are aware of its limitations. In short: good old manual optimization and performance profiling. You don't get it anymore because studios are too reliant on automatic optimization, and project managers usually end up saying "it's good enough, let's move on".
@@randomplayer6461 it does have raytracing
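The occlusion-culling idea credited to id a few comments up can be sketched as a toy precomputed-visibility lookup. Everything here (the `Room` class, the `visible_from` field) is a hypothetical illustration of the general technique, not id Tech code:

```python
# Toy "potentially visible set" culling: only geometry in the current room
# and the rooms a designer marked as visible from it is ever submitted to
# the GPU; everything else is skipped before rendering even starts.

class Room:
    def __init__(self, name, objects, visible_from=None):
        self.name = name
        self.objects = objects
        # Precomputed offline by tools / the level designer.
        self.visible_from = visible_from or []

def objects_to_draw(current_room):
    rooms = [current_room] + current_room.visible_from
    return [obj for room in rooms for obj in room.objects]

hall = Room("hall", ["statue", "chandelier"])
lobby = Room("lobby", ["desk"], visible_from=[hall])
cellar = Room("cellar", ["crates"])  # never visible from the lobby

print(objects_to_draw(lobby))  # ['desk', 'statue', 'chandelier']
```

The point of the comment is that this only works when level design cooperates: if the designer builds a vista where every room can see every other room, no culling scheme saves you.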
21:9+rdr2
perfect match
Yeah, but the game is still damn heavy. My 2070 Super only manages medium settings with a good framerate.
It looks like playing a character in a movie, it looks awesome!
I switched to an UW monitor a couple of weeks ago.
Honestly I never expected it to be such a difference, it's not something you can know before you try it.
you can just force a 21:9 resolution on any monitor and try for yourself though
@@nightcorexxx666 not really since that just crops it and makes it smaller
@@MrXaniss There are different sizes of ultrawide monitors, so yeah, it's really the same thing except with screen borders
Thank you for providing the benchmarks everyone misses. 21:9 is the best in my opinion.
I don't understand how anyone dislikes 21:9??? For competitive games it's also such an advantage with the increased FOV (though some games lack 21:9 support because of that), but in singleplayer games 21:9 feels so cinematic
The only people who hate on 21:9 have never owned a good 21:9 setup. They are the kind of people who read about the cons of something on the internet that they don't have and then run around parroting what they read.
thank you finally the resolution I've been waiting to see benchmarked!!!
After this video, i can die happily.
So much quality in 21:9
thank you! ive been waiting for a video like this :)
You knowing how to play makes the video 10x better.
exactly
2:29 The blood screen effect in Mafia only fills the 16:9 space then stops abruptly.
Bruh
wtf
That's kinda how it is with Ultrawide. Many games don't fully support it, especially with effects like that. There are often community made patches that fix these issues, but not always.
literally unplayable
Thank you for this 3440x1440 testing. It seems like the 3090 is maybe the best choice for this resolution; I wanna keep the same setup for at least 3 or 4 years.
That's what I've been saying from the start, and people kept saying it was overkill. According to Nvidia's stats it might be overkill, but looking at this real-world experience, I don't think even the 3090 is going to get me a constant 144 fps in most modern games maxed out, but it will get me as close as I possibly can with a single-card solution.
I just go for a new card every time one comes out and sell my old one. So far: 980 -> 1080 -> 2080 -> 3080. That feels cheaper than aiming for the 2080 Ti, or this time the 3090, while still giving you good performance. But this time around things look a bit different, with Nvidia's botched release and Big Navi looking like it might beat Nvidia. I have to reconsider my pre-order for the Strix RTX 3080, that's for sure.
In Serious Sam you can start a benchmark, which is practically a demo playing on the map you choose.
Also, it is ridiculous how much framerate you can gain if you lower the size of the shadow map...
Crysis sits at ~60% GPU utilization and 10 GB of VRAM, and the visuals are not that good. Super poor development; one of the most inefficient games in history, alongside Assassin's Creed.
Dude, the architecture of the game is almost 15 years old...
DirectX 9 is really bad compared to 12; even The Witcher 2 uses a lot of resources because of DX9
@@whathappenedtothedon True, it is pretty old, but take into consideration the time when it was released: other games had pretty good visuals and did not use as many resources. It brought the best machines of the time to their knees, which does not mean it's a good piece of software, but a bad one. Ironically, the remastered version still does not run well on the best machines of 2020.
@@Format090 good point, but don't forget that Crysis was a lot more than just visuals...a lot of system resources go to the physics, terrain and sandbox environment.
Not easy to do, I was running it back in the day on a 2 Gb card if I remember correctly
It's literally the only game with physics in the bench. I loved playing with the editor, making tree-sucking tornados and nuking up the island :D The good old days... Also, it was the first engine to make this kind of leap into the future. Technically, CryEngine 1 was more advanced than CryEngine 2 and 3, where they massively cut back on simulation and went back to pre-scripted animations just like the rest of them. For the time it was absolutely stunning. Also, as many others said here, the drivers of the new cards are optimized for DX11/12; the old engine, even in the remaster, can't utilize tensor or ray-tracing cores, so the 3080 is literally ~40% inactive, which matches the ~60% usage pretty exactly.
1:34 I come home after school and check your channel every day. I love you bro
I love 21:9. Best size for movie, for game, for multitasking, for everything. Better than 4K @ 16:9.
I just wish there was a 3440x1440, 144 Hz OLED monitor. Mine is a 3440x1440, 144 Hz VA panel and while it looks absolutely gorgeous, watching this video in an AMOLED cellphone display makes me want to watch everything with this richness.
Doom is the only game on the market which is ACTUALLY optimized... so basically, to run comfortably on my 120 Hz monitor I would need a 3080 Ti.
I hope the CP2077 delay is to make the game as optimized as possible.
So optimized that it's the only game that crashes on both my desktop and my notebook
Thx man, that was an amazing benchmark. I enjoyed the whole video on my 3440x1440 monitor.
Just bought a 1440p 144hz monitor. I was a little worried about not being able to take full advantage with my 2070 super but this makes me feel better since I’ve essentially futureproofed my monitor for a long damn time
I'm running a 3080 too. Got my 34" LG UltraGear GN850-B a year ago, and it was SUCH a jump over a regular 2560x1440 Acer Predator that it was actually insane. Never did I think I'd like it so much more, but holy shit, after more than a year with ultrawide I'd never even think of going back.
Finally...something I can watch full screen
was about to write same thing ! amazing right
It's so immersive I forgot it's a benchmark video lol
Awesome to see more people using this resolution for benchmarks! Did a video myself comparing the 3080 to the 1080Ti in this resolution :)
I see a decent number of people saying "you don't need more than 10 GB of VRAM for gaming." After seeing some of these results, idk, I think sometimes gamers do need more than 10 GB.
Next-gen games will use more than 10 GB of VRAM at high-ultra settings; some developers have already confirmed this. That's why I'm waiting for the 20 GB version of the 3080.
Well, you have to understand that's just the amount allocated, not the amount used. In most cases, games only use 70% of what is allocated. For example, MSFS 2020 allocates 11 GB of a 2080 Ti's VRAM at 3440x1440, but only actually uses 6-7 GB at any given time. The idea that next-gen games will use more than 10 GB of VRAM, especially at resolutions lower than 4K, is cap
@@MsST3F4N finally someone who understands..
@@rakennavarro2557 I'd suggest waiting for the 3080 Ti/Super instead.
@@tensorwolf There is no room for a 3080 Ti/Super in the Ampere lineup; the most they can do is use the little headroom they have left to increase the VRAM of the 3070 and the 3080, which they are doing already. The Ti version of the 3080 this generation is basically the 3090.
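The allocated-vs-used distinction in this thread is easy to sanity-check with the commenter's own numbers (11 GB allocated, roughly 70% actually touched; the 70% figure is the commenter's rule of thumb, not a measured constant):

```python
# Allocated VRAM vs. the ~70% "actually used" rule of thumb from the comment.
allocated_gb = 11        # what MSFS 2020 reportedly reserves on a 2080 Ti at 3440x1440
used_fraction = 0.70     # commenter's rough estimate, not a measured constant
used_gb = allocated_gb * used_fraction
print(round(used_gb, 1))  # 7.7, in the same ballpark as the reported 6-7 GB
```

Which is why a 10 GB card can look "full" in an overlay while the game never actually touches more than 7-8 GB of it.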
sweet bench mate! keep on going
Control looks and plays exceptionally with max settings and DLSS 2.0
It's always funny to see how good Doom Eternal's optimization is. I wish every dev put that much effort into optimizing their games.
You're a legend for making ultrawide-master-race GPU benchmarks!
doom eternal best game best optimization best experience
Thanks for the benchmark!!!! very useful
Very nice info using 21:9
Bro, I appreciate this. As far as Red Dead Redemption 2's settings go, according to Digital Foundry's extensive testing, you're gonna wanna run DX12 mode instead of Vulkan for better performance.
Also, turn off motion blur, as it was left on; you only need that if you're running below 40 fps.
Thanks for the video, greatly appreciated
high resolution, high framerate & high settings 😍
The ray tracing in Ghostrunner looks great and all, but holy crap, the performance hit is massive. It's the kind of game where higher fps is much more enjoyable.
Thank you for this I can never find benchmarks for 3440x1440p
Ghostrunner gets decent sales & Cyberpunk blows up the bank -> Next year's Call of Duty -> Wallrunning IS BACK in Call of Duty: Cyberninja-punk Warfare
also thanks for all the 21:9 benchmarks. I love my 3440x1440 gaming monitor
Nice to see these games in 21:9. I use it too, same res, and I **love** 21:9!
1:12 You are the best youtuber on the platform without a doubt
1:15 This is the funniest thing I've ever seen
This is actually my dream setup
Letting it auto-boost to 5.3 will give more performance than manually overclocking all cores to 5.1, mainly because games won't use 10 cores, and letting a single core boost to high-af clock speeds is much more performant.
Thanks, mate!
I've an ASUS ROG curved 34" 3440x1440 I got a few years back, and that's the exact CPU & GPU I'm lining up with 64 GB DDR, but I was thinking my monitor wouldn't hack it. After watching this, I'm keeping it. Thank you for showing; that's beautiful. I'll sleep well tonight 😌
Awesome video!
Ah, just found it!! Thx man for these tests, as this resolution is actually the future.
I love UW, could never go back. I have it paired with a 3090; kind of looking into one of these to get some cash back if there's not that much of a difference.
This just confirms the patient waiting I'm doing to game at 3440x1440 with a 3080 Ti is going to be WELL worth it lol
And I thought I was the only one waiting for the 3080ti
@@theothersideofthecoin3125 lol no sir, I'll let these folks clamor over the 3070 and 3080. I'll pounce when the 3080 Ti drops and upgrade my whole rig. I'll be gaming comfortably for at least 5 years to come lol!
Phone with 21:9 ratio : Is this cinema?
Considering 4K is 8 million pixels and 3440x1440 is 5 million, shouldn't framerates be higher than this?
Most of those games are poorly optimised for PC.
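The pixel counts in the question above check out, and make the "ultrawide as a middle ground" argument concrete:

```python
# Pixel counts behind the "8 million vs 5 million" comparison above.
uhd = 3840 * 2160      # 4K UHD
uwqhd = 3440 * 1440    # 21:9 ultrawide
print(uhd)             # 8294400  (~8.3 million)
print(uwqhd)           # 4953600  (~5.0 million)
print(round(uwqhd / uhd, 2))  # 0.6: the ultrawide renders ~60% of 4K's pixels
```

So all else being equal you'd expect roughly 4K-and-two-thirds performance at 3440x1440, which is why framerates much lower than that suggest a bottleneck other than raw pixel count.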
I don't have a GPU to max out even 1080p; watching these makes me happy.
I think that's the best size for gaming!
For RDR2, would be nice to see how it does in Saint Denis at night
Ultrawide will become the true edge of PC gaming vs. console. Consoles have the benefit of the big TV and a big couch. PC has always bragged about how much better the graphics are, but the price of a good PC is 3 to 4 times that of a console. With ultrawide, even at 1080p, the experience is transformed.
1:31 I really love your content bro it makes my day better
I can't believe we have to ask in 2020, "But can it run Crysis?"
Crysis and Ubisoft suck at optimizing games lmao, it's a fact. Witcher 3 and Red Dead look 3x better and run 3-4x better as well.
@@PR1NCETD0T No. Fact is, Crysis was developed assuming CPUs would reach MUCH higher clocks (upwards of 5 GHz stable) instead of going multi-core, which ended up being the direction new chips went.
The question has changed; it's "can it run Cyberpunk 2077?"
This video makes me so happy!! I've postponed playing ALL of these games cuz I want to play them (except Ark) on ultrawide with my 3080 once I get it! Thank you so much!
Great video thank you!
Nice to see Crysis is back as a game that's hard to play at a high frame rate :P
Keep up all the great content :)
Why do these benchmarks have such low results? I ordered a 3080 to pair with my UW 1440 monitor but these results are kinda underwhelming.
These results are with the highest settings enabled for everything (ultra). Putting some settings at high and lowering anti-aliasing will give you many more frames with no discernible quality hit.
@@hectamus_ Still, most of these games are already around for quite some time. I would expect much better results tbh. Unless I'm missing something here.
this deserves more likes smh
The RTX 3080 is probably the best bet for 3440x1440 without breaking the bank. You can run at or close to 60 fps in the most demanding games, and everything else is 100+ fps. If you want high fps in everything, you'll need an RTX 3080 Ti or above. The RTX 3080 is already about 700-800 dollars, but you'd pay almost double for the 3080 Ti or RTX 3090.
I play this resolution fine with an RTX 3060.
You sir are doing god's work.
Ultrawide is underrated; I hope more games support this res.
I have a curved 21:9 144 Hz monitor and can't go back to 16:9.
The experience is really immersive and the image is really better.
I'm on 2560x1080 and it's somewhat playable competitively.
1280x700 is a good resolution for CS:GO btw.
Keep the UW benchmarks coming; how about some Kingdom Come?
Useful video, thank you!
glorious 21:9
That card is pretty solid!
Intro/Outro Music: Aakash Gandhi - Eyes of Glory
The performance is pretty underwhelming. I hit an average of 55-60 fps in Horizon and Mafia at 3440x1440 with settings on medium/high and some tweaks and overclocking here and there, on an i5 6600K paired with a GTX 1070...
you're disappointed in a 30-40 fps difference playing on medium with an overclocked i5 and 1070 compared to no overclock on ultra settings?
lmao. ok
@@kronillix2735 140-150% performance for 300-400% of the price? Yes, I call that pretty underwhelming lmao
Your specs aren't good enough for 1440p ultrawide gaming. What did you expect? Just play 16:9 instead and use the extra screen for easy to run games and multitasking workloads.
Looking at this on my 13" MacBook Air; I have a 24" I game with, and it's decent. I think 4K demands the best hardware to reap the benefits, but that's just expensive. I've heard a lot of people say 1440p is clear enough for FPS players to hit their targets, and I agree. 3440x1440, I think, is the sweet spot for better frame rates and immersive gameplay. Will be getting one soon, paired with an RTX 3080.
Ty ty ty!
You're the best! Keep it up.
Got an ultrawide coming later today. Going to use my 1070 Ti. I'm used to playing games on low/medium settings, so I'll be fine; I'll lower settings more if I have to. Can't wait to upgrade the PC later this year. Hoping to get a 3080 in a prebuilt, maybe a 3080 Ti if the price is right.
The intro music is just top notch; I'll take it any day of the week over disco-bar NCS.
Finally! A card that can run RDR2 on 4k 60+ fps
Yeah, but this is 3440x1440, not 4K.
Did you remove CoD?
I guess my 75 Hz ultrawide and the 1080 Ti are still fine. I was thinking of an upgrade (144 Hz UW + new graphics card) but I'll wait for one more generation.
What game is that at 0:47?
You guys don't even know how much I want a 21:9, 3440x1440 monitor with a 40" diagonal to be a thing.
You could go LG OLED and run a custom res on it. It'll do 120 Hz, and OLED is nuts! I have a 55". They make a 48" this year too. Super good for gaming.
@@damonm3 Yeah but who will bend it for me, hmm.
@@zperdek it doesn’t need to be curved
@@damonm3 OLED is nice and everything, but for a small room a "smaller" 38"-40" curved monitor is better. At least for me.
@@zperdek sometimes you have to settle. Would you use it as a tv also? Or just desk only?
Thank you for this video, very useful for me!
Would be nice to see performance in Cyberpunk 2077 at max settings with RTX on.
Can you put the advanced options on ultra and tell me, please?
Thank you very much
Ghostrunner must be amazing on ultrawide monitor.
Are there any 16:10 monitors out there?
I own one
More benchmarks on 3440x1440p please
Funny how Crysis doesn't even try to hit 60 fps, just goes straight to 30 lol.
Ultrawide + 3080 masterrace
what is the software you're using for the info on the left side of the screen?
It's MSI:AB
@@vincentabel3825 it's technically riva statistics tuner that comes included in the install package of MSI After Burner but also other similar programs like EVGA Precision and probably gigabyte aorus engine
What capture card did you use?
I expected higher fps from this 3080 and its setup.
You can use 1920x822 to replicate the 21:9 experience on a Full HD monitor.
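The 1920x822 figure comes from keeping the 16:9 panel's width and letterboxing the height down to a 21:9 shape. A quick way to compute it (the helper name is mine, just for illustration):

```python
# Height of a letterboxed "ultrawide" custom resolution on a 16:9 panel:
# keep the panel's width, shrink the height to match the target aspect ratio.
def ultrawide_height(width, aspect_w=21, aspect_h=9):
    return round(width * aspect_h / aspect_w)

print(ultrawide_height(1920))              # 823, matching the ~822 suggested above
print(ultrawide_height(2560))              # 1097 for a 2560-wide panel
print(ultrawide_height(1920, 3440, 1440))  # 804 if you target the 3440:1440 panel ratio instead
```

Note that 3440x1440 panels are actually slightly narrower than true 21:9 (43:18 vs 21:9), which is why the two targets give different heights.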