1440p is just so much more affordable. It's the sweet spot. I'm also never giving up the framerate advantage 1440p has over 4K. The only upgrades I'll consider are HDR and *maybe* going over to 21:9 ultrawide.
@@pozytywniezakrecony151 OLED? If so, understandable, what with OLED monitors only being a recent occurrence. VR does sound like a cool upgrade too. I, personally, just don't want to hop in until it's more refined. In particular, haptic gloves sound really cool, but it's either too much time or money for me to invest in.
At some point you don’t need more FPS. I stayed at 1080p 60 until I saved enough to move to 4K 120. It doesn’t cost that much more than 1440p 120, unless you play the most demanding games.
@@JGComments A decent 1440p 144Hz monitor at the moment is somewhere around $180-250. A decent 4K 120 or 144Hz monitor is somewhere around $400-450. Maybe "doesn't cost that much more" in regards to a whole setup but an extra $200 to have a minimal sharpness boost just doesn't make sense. I would much rather put that extra $200 into a Mini-LED 1440p monitor that would have proper HDR capability. The perceived improvement via contrast and color would be far better than the diminishing returns of being sharper. As a cherry on top, it'll look better all while still being able to natively run higher refresh rates.
The problem with playing in 4K is that when you do it for the first time with a large enough display, there is no going back. I'm playing on a 4K OLED TV I use as a monitor and it is insane.
The fast games I play I wouldn't even notice. My computer can run ultra settings and I still put it on high. I'd rather have the performance bump than something I won't notice while actually playing the game. I can see it being really cool for flight simulators.
4K gaming is not stupid, it's just not affordable. Most people don't upgrade their system every generation. I have a friend who only just upgraded after 10 years.
Not only is 4K not affordable, you also can't physically get competitive framerates (100-200+ fps with good 1% lows) on ultra settings in modern games without frame generation/DLSS, which kinda dilutes your experience.
@@kotekutalia What modern competitive game on ultra can't get over 100 fps on a 4090? Most competitive shooters are not hard to run, and honestly ultra isn't really required if you plan on being competitive anyway.
@@alvarg I didn't say competitive game, I said competitive fps. Aka, if you are a comp gamer you will be dissatisfied when you pay $1800 for a GPU and it sometimes still stutters when path tracing in Cyberpunk at 4K.
@kotekutalia GPUs are not ready for path tracing yet, and you would be pretty dumb to buy a card for that purpose. You could use normal ray tracing on a 4090 at 4K with great results. Competitive fps is a vague term; competitive against what? It wasn't too long ago that 60 fps average at 1080p in the latest games at max settings on the top-of-the-line GPU was considered good. You can easily get that on a 4090 in pretty much every game, except for some UE5 titles, which makes sense as that engine was designed to push the limits of hardware.
Not really. To me the only difference that matters is framerate and the hertz the monitor can pull; that's a huge difference for fluid gameplay. Very few monitors exist that can do 240Hz at 4K, and those are near impossible to find, if they're findable at all. If it's a TV, that's perfectly fine for watching stuff only, but for gaming, no; that's more for YouTubers or whatever they're called. I'll stick to my 1080p gaming monitor from 2021/2022 till Christmas, when I'll hopefully get my dream gaming desktop and a new 2024 1080p-to-1440p 280Hz gaming monitor from Alienware.
A soft image does not really bother me, but on a 27" monitor the difference is quite large. So as long as I get 60fps at 1440p, I surely would not go down to 1080p for higher smoothness; but in itself, even the lower res still delivers a good experience for me.
@@TheT0nedude No one is concerned about you pitiful console peasants, and other console peasants' opinions don't count either; you guys legit don't have proper context to appreciate truly good graphics. If you are playing on a large TV, you will absolutely notice a difference between 1080p and 4K. The more you stretch pixels, the more important the quantity of them becomes. Edit: Also, the higher the resolution, the more beneficial upscaling becomes, and the more wiggle room you have in how far you can push the upscaling before it looks bad.
@@rlsjunior797 Even single-player shooters feel horrible at 60fps. I only max my settings in controller-friendly games like Elden Ring, Spider-Man, Ghost of Tsushima, etc.
@@agusr32 Agreed, and I can't notice any difference in clarity from 1440p to 4K on a 27" monitor. Maybe on larger screens. But I still use two 1080p monitors and one 1440p for my triple-monitor setup. The 1080p ones still look amazing.
Resolution isn't everything. I went from a 1440p 27" VA monitor to a 1440p 27" WOLED monitor. The upgrade was massive! Especially contrast and color saturation are soooo much better. Smoothness is also much better - IF - the framerate is over 80fps; otherwise you'll get judder/stutter because of the superfast response times.
Indeed. People saying they won't go back from a 4K OLED setup are more like saying "I won't go back to a TN/VA/IPS panel" rather than it being about the resolution. A 2K OLED looks as good as a 4K OLED.
A lot of console games are running lower than 1080p this generation, although it's somewhat impressive what the PS5 does with an underclocked RX 6700 non-XT.
@@evaone4286 15-year-old games, maybe yes. Even a 4090 almost can't run games like Alan Wake 2 at 4K WITH DLSS AND frame generation. How would a 3000 series GPU manage that in new games, let alone the 2000 or 1000 series?
@@williehrmann Some games like Alan Wake require the absolute best card on the market, so I think it's an outlier in this case. As I said, you've been able to run games at 4K since the 10 series; it just depends on the settings and fps. But you can get playable fps even on ultra with some cards.
Yeah, I've had a 4K display for around 5 years now. Started using it with a GTX 1080. I did downgrade from 165Hz to 120Hz for it though, which did slightly suck. I plan to upgrade soon; I've noticed my display has slowly been gaining an odd vignette and has a dead pixel on one side. Not easy to notice, but it is there.
There are midrange 4K monitors now though, and 1440p has plenty of flagships occupying the peak-refresh-rate OLEDs, while 4K sidegrades to lower refresh rates. For value in the upper mid-range right now it's 144Hz 4K and 240Hz 1440p.
@@Frozoken 4060=1080p, 4070=1440p, 4080=4K, and 4090 = all of the above... He said don't expect big performance changes from new GPUs... so 5060=1080p, 5070=1440p, 5080=4K, etc. Welcome to the new world of GPUs 😂 😂
As someone who works in video and animation, I can honestly say that if you're not viewing content on a screen larger than 50 inches, 4K is not necessary. I have a 130-inch theater screen with a 4K projector. That was the first time I actually found 4K useful and the only place I care about resolution. Anywhere else, 1080p is fine. 8K is just ridiculous. So yes, 4K has four times the pixels of 1080p, but the difference doesn't become noticeable until the screen size gets larger. I do medical animation and rarely need to make a 4K video unless some client needs to show the content on a large screen at a trade show.
Moved to 4K 2 years ago and it's a blast really. I'm having great fun with it in games, videos, and reading. Fantastic clarity, no visible dot pitch or black grid; it feels as smooth as a smartphone screen. Sure, it needs more compute power, but it's worth it really. I'm glad I didn't stay on 1440p too long. The perceived difference from 1440p to 4K was greater than the move from 1080p to 1440p. Also have to say 4K Performance DLSS looks so much better than 1440p Quality DLSS, because it has a higher pixel input so it resolves details much better on a vastly higher pixel density screen. In actual use, DLSS Quality on my 1440p screen playing RDR2 was borderline acceptable quality, while DLSS Performance on my 4K screen looks way better than 1440p native.
So, I actually went from 1080p > 4k > 1440p in my monitor journey. 4k looked incredibly crisp and clear, but the sacrifice in performance just wasn't worth it. However, when I dropped down to 1440p, I still got a much crisper experience than I did at 1080p, but I didn't have to sacrifice nearly as much in terms of performance. Was I getting as high FPS? No, but I'm not looking for 160+ FPS in all my games; if I'm getting 60 FPS minimum, preferably above 90+, then I'm completely fine with it.
Since I started to play at 4K at 27'', everything else literally feels like 720p; even 1440p feels like complete garbage. It's crazy, and with upscaling and frame gen you don't even need an insane PC anymore.
I suppose people choose high resolutions for high image quality, right? Every time I tried upscaling and frame gen, my games became garbage with ghosting and artifacts. I know that many people don't care because they get higher fps, but what is the point of a higher resolution like 4K with a worse image (full of ghosting and artifacts) and lower fps than if you were using Full HD?
@@catgirllover666rockstargodmonk Because it's not the case? haha. At least in the games I played at 4K high settings where I needed upscaling (CP77, Ghost of Tsushima, Jedi Survivor, Ghostwire Tokyo), I could not sense a single difference with quality/ultra quality upscaling. I switched multiple times in a row, and even when I was looking for differences for minutes, concentrating only on the image, I could NEVER tell a single diff. So in my experience, with quality upscaling at 4K high/ultra settings in the games I've played, I could never tell a single difference. Frame gen I only tried with Ghost of Tsushima, where I couldn't tell a diff either, because I play it on controller, so you don't notice any input lag difference.
@@sensei_... Same, I'm using a budget LG 4k 27" and also having a blast. I don't even need Quality DLSS. Even Performance DLSS looks great, better than FSR Quality.
I run a 65" LG C1 so yeah- 4k or die. My 7600x/7900xt has been plenty to drive it with a little upscaling here and there as required. Honestly, anything that is 7800xt/4070 Super or higher can run a solid 4k image with upscaling.
4070 Ti Super and recent 4K upgrader here. I've been able to run most games at 4K native Ultra, the more demanding ones with DLSS. Upscaling is subjectively much better at 4K than it was at 1440p, and I only have fluidity problems when I get into areas where frames drop to the 60-80 fps range; once up around 90 or more, motion is pretty decent. I'm running an LG 32" IPS panel; my TV is older and I'd be locked to 60 fps on it.
@@dkindig I play mostly older titles due to the limitation of my hardware but being able to play 4k locked at 60fps on a 42" 4k monitor is amazing. The only game that spanks my system is Red Dead Redemption 2 which uses more than the 8GB of video memory on my lowly RX 6600. What is crazy is that when locked at 60fps the CPU doesn't need to work as hard when gaming in 4k as opposed to 1440p or 1080p at higher refresh rates. I can get 60fps in 4k with the GTA5 built in benchmark with max settings aside from grass being set to high instead of ultra and that is with a 3rd gen Intel 3770k. The one thing that has kept this system going for so long is a bios update allowing me to add an NVME drive on a pci-e slot which I was able to install the OS and boot from. Having 32GB of RAM also helped quite a bit. Of course newer games would most likely crush my system but I am content with playing games that have aged well. New games are a gamble if it is complete on release or if you are just a beta tester until patches come out to rectify the issues.
I still play at 1080p. I don't care about upscaling gimmicks; most if not all of the games I play are online competitive multiplayer, so upscaling is a no-go. I only play native at max settings, and even a 4090 couldn't provide enough fps (140+) at 4K native for most of the games I play. 2K is a no for me, so I will wait for better hardware to go 4K.
I just upgraded from 1440p ips panel to an OLED 1440p and wow it looks incredible! I think this is the best route if you want better picture quality but don’t necessarily have the hardware or don’t want to run 4K. 7900XT + 7800X3D btw, pushes an easy 165+ fps in any modern game with high settings
Recently dabbled in 4k on my tv. Tried Horizon last night + HDR. It's just absolutely insane. Only way I want to play story-based games now. 7800xt does incredibly well on a lot of these PS4/5 ports at 4k. Getting ~90 fps on Horizon right now with max settings.
Same here with an RX 6700 XT. Playing GoW, Ghost of Tsushima, Cyberpunk, and the list is increasing. (It's fantastic.) I'll stop playing on my 1080p monitor for shooters... 😂
Last Black Friday I built my rig with a rx 7900 xtx ($930) and ryzen 7800x3d (part of a bundle got great price) and I run most games at 4k max settings with 100-120 fps. I was also able to get a 4k 120hz hdr monitor for $500. Depending on the game I can mess around with ray tracing and still get very playable frames, I never play below 60. The reason I sprung for all that was because I thought good 4k gaming would cost so much more, but that actually all shook out to be reasonable for me and I’m really happy with my rig. With time, ray tracing will become much more optimized so I think my build will hold up well over the years
At 4K, DLSS is indistinguishable from native, for me at least. Even Performance, which scales from 1080p, is surprisingly much sharper than 1440p. If you have the money, definitely go for 4K.
@@albert2006xp A monitor showing you 1440p, even if the render resolution is 4K, is still going to look worse than a proper 4K screen that's upscaled from 1080p.
@@alvarg DLDSR is pretty good at making things sharp. As long as the pixel density is good and you're not literally in the screen with the magnifying glass, it shouldn't be that huge a difference. Don't underestimate how much sharper DLDSR can make the image.
On my 27" monitor I don't need 4K, but with my projector on a 110" screen it's an absolute must-have. I've wanted to game on a projector since I was a child. But I didn't wait so long only because of the money; I waited for usable 4K projectors, because 4K at this screen size is a heeeaaavyyy jump in picture quality. And it was definitely worth it. I'm so happy with it! 😍😍😍
One thing worth mentioning is that with a 4K display you can get 32-inch screens with massively higher pixel density than a 1440p 27-inch screen. Pixel density has a huge impact on how people perceive images. I remember someone seeing my 15-inch laptop screen show a 1080p image and thinking it was 4K, because it looked "crisp and clear", and "crisp and clear" really meant a very dense pixel makeup. Same reason 720p can look good on a Switch. Also worth mentioning that 4K Performance mode DLSS has a higher internal resolution than 1440p Quality mode: 1080p vs 960p. AND 1080p divides evenly into a 4K resolution (you can use 4 pixels in 4K to represent one 1080p pixel), so most of the algorithms seem to do a better job getting it to the target resolution, in my experience. Still need a beefy boy to jump up to 4K Performance over 1440p Quality.
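To put numbers on the arithmetic in the comment above, here's a minimal Python sketch. It assumes the commonly cited DLSS per-axis scale factors (Quality ≈ 0.667, Balanced ≈ 0.58, Performance = 0.5); individual games can override these, so treat the outputs as illustrative rather than exact.

```python
# Internal render resolutions for common DLSS modes, using the commonly
# cited per-axis scale factors. Games can override these, so this is a
# rule-of-thumb sketch, not an exact spec.
MODES = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.5}

def internal_res(width, height, mode):
    f = MODES[mode]
    return round(width * f), round(height * f)

print(internal_res(3840, 2160, "Performance"))  # (1920, 1080): 4K Performance
print(internal_res(2560, 1440, "Quality"))      # (1707, 960): 1440p Quality

# 4K Performance renders from 1080p, which maps to a 2160p panel at an
# exact 2x per axis (4 panel pixels per source pixel); 1440p Quality
# renders from 960p, a fractional 1.5x step up to 1440p.
```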
How about the 32" Dell UP3218K .. 280 ppi?? .. 8K/60 FPS wouldn't be ideal for racing or fast paced first person shooters, but would be a dream for RPG/grand strategy or third person action/adventure games like The Last of Us or Red Dead Redemption 1/2 -- i.e. the type of games I love most. I use a 65" LG C1/4090 for most of my 4k gaming but I'm in the market for a smaller monitor. I've had a high appreciation for pixel density, ever since getting a recent Macbook Pro and iPad, so looking for something as good or better than Apple's displays.
In the last year I went from a 1440p UW 165Hz OLED to a Neo G8 4K 240Hz and now a 1440p Samsung G6 OLED 360Hz. What I learned is that 4K is awesome for sure, and for 32" or bigger it's needed. What I also learned was that I missed being able to hit those higher frame rates. For example, in Warzone at native 4K on the lowest settings I could hit 160fps average. To most people this is more than enough... but once you see 240-300fps at 1440p, especially on an OLED, that sharp 4K image starts to take the back burner compared to the smoothness of 1440p at a much higher refresh rate.
I think you are correct. 1440p UW is still the sweet spot. The jump to 4k isn't worth the framerate hit and I like the ultrawide aspect ratio better anyway.
Without irony I say you must be an exceptional person. Most people who aren't playing really competitive FPS games constantly wouldn't notice the difference above 120fps. The difference between 60 and 120 is massive. The difference between 120 and 240? Hardly anyone could, never mind would, notice.
1080p on a 1440p monitor looks garbage because the pixels don't line up (1 source pixel represented by 1.3333 panel pixels). Get a 1080p or a 4K monitor, which can actually combine 4 pixels into 1.
You can letterbox as well if your monitor is large enough. My monitor is 32 inches and 2K, but displayed 1:1, 1080p content effectively makes it a 24-inch 1080p monitor. That isn't too small, and you can sit closer as well.
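A quick sketch of the scaling math behind the two comments above. An integer per-axis ratio lets the panel show each source pixel as a whole block of physical pixels; a fractional ratio forces interpolation, which reads as blur. The panel sizes and resolutions here are just the examples from the thread:

```python
# Does a 1080p frame map cleanly onto a given native panel resolution?
for native_h, name in [(1080, "1080p"), (1440, "1440p"), (2160, "4K")]:
    ratio = native_h / 1080
    verdict = "integer -> sharp" if ratio.is_integer() else "fractional -> blurry"
    print(f"1080p on a {name} panel: {ratio:.4g}x per axis ({verdict})")

# A 1:1 (unscaled) 1080p region on a 32" 1440p panel spans a 24" diagonal,
# which is the letterboxing trick described above:
print(32 * 1080 / 1440)  # 24.0
```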
It’s not stupid if you are happy with the performance in the games you play. I’ll receive my MSI MPG 321urx tomorrow and I can’t wait to play my old games in glorious 4K.
@@jonathansaindon788 I was more making the joke that I was so stupid as a 4K gamer that I was talking back to the video. 4K gaming is glorious! I'm waiting to upgrade my 12GB 3080 with a 5080 or 5090; then my PC will be perfect!
@@kush2023 womp womp, let him do whatever he wants with his money. Your money your choice, his money his choice, lots of people should try to understand this.
I've been saying 4K gaming is stupid all along. The reason I say that is because people buy a 4K TV and then think that's what they'll use to game on, without realizing it cuts your frame rate massively. Let me try to reword that to be clear: 4K gaming exists because people buy a 4K TV. That's it. That's all there is to it. No real thought goes into it; it's just what people have, so people think it's normal somehow. Consoles and TVs have tricked people into thinking that 4K is normal. If you want good frame rates and a clear image, you want 1440p. Also, I have a 4090 with a 27" 170Hz 1440p monitor (Gigabyte M27Q). If your first reaction to that last sentence is "A 4090 is overkill for 1440p," you are completely missing the point. It means you don't like high frame rates and like low frame rates, because you bought a 4K monitor or TV to game with since the number 4K is bigger than 1440p, so it has to be better, right? No, it's not.
I play single player RPGs. The RTX 4080 does a very good job on those at 4K. High refresh rate just does not seem that important for these kinds of games while 4K HDR looks very nice and makes the worlds feel more immersive.
Yeah, and there's A LOT of stupid 4K marketing out there. In every promo, trailer... It reminds me of the times when smartphone manufacturers were measuring d**ks over who has the most megapixels in their phones 😂
@@pf100andahalf I would agree. For most people it is just not viable at all. But when you can use it the results are very nice. I would not go back but it was expensive.
@@Immudzen I'm a similar kind of gamer, with a 4080 Super and a 4K 144Hz 32" monitor which I bought earlier this year. High refresh rate is very nice in some fast-paced action games, but there are plenty where I'd be happy running at 60-80fps with high details.
@@PeopleDoingStuff huh? Do you have a point to make? Are you implying that Sony is the ‘MacBook’? Because reviews with proper measurements say otherwise.
A lot of people go from one end of the spectrum to the other and never actually understand why one is better than the other. Like going to OLED and thinking it's great just because it's 4K. Or changing to a high-end CPU and thinking it's just because it's a Mac.
Using decent quality IPS panels at 1080p, 1440p and 4k, the difference is very noticeable. 1080p to 1440p is probably a bigger jump, but both are notable.
I own a 4090 and a 4K 160Hz monitor. I made a huge mistake. Once you watch 4K, you don't want to watch a lower resolution anymore. I hate playing games at 60-80 fps, I just hate it. 100+ fps is the only acceptable fps, at least for me... And there are a LOT of games where my 4090 with DLSS and my i9 13900K can't get above 80 fps, A LOT. Keep 1440p till the 5090 or maybe 6090... Once you watch 4K, there is no turning back.
I'm on a 4070 Ti Super and run 4K fine at 80+. Are you playing Cyberpunk exclusively or what? I'm playing flight sims, Arma Reforger, fighting games, Subnautica, etc.
@@Leo-yn5fx You are seeing significantly more data per second, which makes the game look, play, and feel so much more fluid and smooth. I'd been playing on 1440p 144Hz monitors for over 10 years, until last week when I went to a 4K 160Hz monitor. Before I got my new card I was around the 60fps mark in the games I was used to playing at 100+, and it looked like a slideshow. You can never go back after having 100+ fps and G-Sync/VRR, but if you've never had it, you don't know what you're missing.
@@Runk3lsmcdougal It really depends on the game for me. If a game is well optimized with good frame times, 60 FPS is super smooth. But as most games nowadays are optimized like sh*t, you need at least 80 FPS to get the same level of smoothness.
Do you want to see the complete view, or do you need to focus on small portions? That makes the difference in required resolution. Unfortunately, customers tend to pay a lot for newer technology they don't even need...
I have a 2080 and it depends on the game for me. Anything that was made before 2018, my GPU can run at 4K/60 or 4K/120. But anything newer, it struggles. I was only getting 30-40 fps in Alan Wake 2 at the lowest settings with DLSS Performance mode. So I definitely want to upgrade when the new 5000 series comes out.
Doesn't this literally depend on your graphics settings? Sure, if you're playing at ultra, you'll struggle. But at medium-high, it should still run fine at 60+. Also, upgrading right when a card comes out is a terrible choice, since cards always drop significantly 1-2 years later. I picked up a $500 card that was over $1k at retail just 2 years ago lmfao. IDK why people pay those prices when cards first drop.
@thelonercoder5816 Yes, it depends on the game engine used, the settings, and the resolution you want to play at. If you have an older GPU, you will have to see what settings and resolution get you at least 60 fps or more. People are ready to pay all at once, or use a payment service to pay for their PC build, because they have saved for years to upgrade. Yes, the cards will cost the most when they release and will be hard to find for a few or several months. People will buy what they want no matter how long it takes, especially if their PC is struggling relative to how they want to play their games.
@thelonercoder5816 for example my pc is in my profile picture. I went all out with my first all white high end build with the gigabyte aero oc 4090/ 7800x3d cpu/ 64 gigs of ddr5 ram in the white phanteks nv7 case. I paid off the pc using amazon's affirm and it took me 9 months because i paid monthly. I have been upgrading the cosmetics of the pc over time and just recently have bought the gen 5 crucial t705 4 terabyte ssd and the gen 4 Samsung 990 pro 4 terabyte ssd. I am paying monthly on those as well. I bought the phanteks screen, the phanteks vertical gpu mount, the correct fans which I also paid monthly as well. I am looking forward to the 3rd party tests of all of the next gen parts to see if I want to upgrade or not.
Once you go to 4K you don't go back. I have 4K/1440p/1080p monitors in my setup, and I never use the lower-res monitors for gaming. Way too blurry/aliased to my eyes.
You can use DLDSR+DLSS to get rid of the blur/aliasing. Whether you run 4K DLSS Performance (which basically any card would have to, at least in some games) on a 4K monitor or on a 1440p monitor with DLDSR, you're getting basically the same result.
I had a 27-inch 4K 60Hz in 2016, and this year I went to a 32-inch 1440p 165Hz. Happy with the step down; it gives me a better/more modern/bigger screen and saves money for the GPU. I don't play games as much as I used to, so overspending on a modern 4K screen doesn't make sense to me. The lower pixels per inch doesn't bother me.
@@MrAnimescrazy Not sure yet as it's far out years from now. For now the PC I'm getting next year(AM5 build with at least RDNA 4 gpu) is meant to handle my current res of 3440x1440 better. All I know it's going to be an AMD build and most likely with an AMD gpu for the 4k ultrawide.
Yeah, I do as well, and I can literally run max settings and RT and still have a smooth experience with high refresh rates. 1440p is literally the sweet spot.
@@White-nl8td I thought the same thing, until I tried it. At least for me, 60Hz is enough in games where I want high visual fidelity. It is noticeable at first, but after a few hours it becomes normal again, while the higher res keeps being impressive. The difference between 1440p and 4K is much more noticeable than the difference between 60Hz and 144Hz.
@@lyxsm After gaming at 165hz for 8-9 years I literally can't even play at 60 fps anymore. I will probably eventually upgrade to 4k when this current 1440p/165 monitor I have dies, but it will be 4k/144+ hz and I will need a GPU to run it. I have a 4080 but I consider that kinda bare minimum for 4k/144hz currently.
4080 user here as well. I made the switch from a 27-inch LG 1440p 165Hz (180 OC) to a 27-inch RedMagic 4K 160Hz monitor (I lucked out on a used one in mint condition). At that size the resolution is incredible, but in a handful of games I play the GPU needs some help from DLSS to hit that high refresh rate. I couldn't be happier nonetheless. 4K is a lot more viable than it used to be.
The 3440x1440 resolution is where it's at for value gaming right now. Good screens can be found for less than $300, and most mid-range video cards can handle 3440x1440 fine.
@@GrainGrown "P" stands for "progressive scan," which pretty much everything does now, but there was a time when a lot of monitors and TVs used interlaced scan (every other line in each pass), which introduced a lot of flicker. Probably pointless now to bother with the P.
Progressive scan. Every row of pixels refreshes in a single pass, top to bottom. It's practically ubiquitous now. In the past, rows of pixels would interlace, refreshing alternating lines on each pass. Hence 720i or 1080i. @@GrainGrown
I've watched a couple of your videos and they are good! After a 22-year hiatus, I returned to PC gaming less than a year ago with a 7800X3D, 4070, 32GB setup leveraging a 1440p display and couldn't be happier. Well, actually I could considering that the 40-series Super was launched like 3 weeks after I bought the PC but I've gotten over it now. 😁
I got 4K monitors because I look at text all day. In games I run 4K with DLSS/FSR set to Performance or Quality, depending on the game. I wouldn't want to go with 1440p because of scaling issues on MacOS (yes, I do run both Windows and MacOS).
As someone who upgraded from a 1440p 165Hz to a 4K 144Hz monitor last year, I can honestly say: if you're a budget-conscious gamer running at 1440p and wishing you could afford to game at 4K, don't worry. The visual uplift isn't like going from 1080p to 1440p. Yes, it's noticeable at first, and it's a "nice to have", but it doesn't offer the same level of beneficial fidelity uplift. I'm using an RTX 4080 at 4K; I think that's the minimum I would consider for a good experience at 4K (either with or without upscaling). One of the benefits of 4K is that upscaling is much more practical, so you can get a 'better than 1440p' image while still keeping excellent framerates.
Playing only at 4K since 2016, when I got a GTX 1080; now using an RTX 4080 and that's still great. Even DLSS and medium settings at 4K look better than native ultra 1440p to me.
Yeah I just transitioned from 1440p to 4k and the most obvious difference was that the upscaling looked MUCH better in 4k than in 1440p. Upscaled 1440p had noticeably reduced quality, upscaled 4k still looked very nice. At 1440p I always preferred native. I still prefer native at 4k but DLSS gives me very good results for demanding games.
@@dkindig Yeah, I had the same experience. Upscaling at 1080p was so bad that I thought it was useless, but at 4K it makes so much sense. Looks better than native 1080p.
Why are we insistent on ultra settings? 4K high settings would probably do the trick for many games, and then turn on upscaling and go from there. I can play Cyberpunk on my 7900 XT at 4K without upscaling if I play at high. I don't even turn on ultra settings for most games. Too expensive compared to high.
The problem with 1440p is that many people use their PC and monitor for multiple purposes. I have a 4K monitor mainly for productivity with my Windows and Mac laptops. I also connect my gaming PC to the monitor. If I run games, I can only choose 1080p or 4K to get one-to-one pixel mapping on the monitor. Running 1440p on a 4K monitor causes pixel shifts.
Honestly if I was ONLY playing video games on my monitor I would do 1440p 100% and save money for top of the line system. However, since I also use my monitor for videos and homework the extra pixels make a Huge difference.
When I first got my 4K screen I just couldn't believe my eyes. It was like a literal window into the game; it didn't feel like I was watching something on a screen. Samsung Neo G7, 32 inch.
According to Samsung's website, that's a curved display. How is your experience with fast motion, regarding black smearing? Does it suffer from backlight glow?
@@anitaremenarova6662 HDR ON is a different story. The PPI increase from 1440p to 4K is monumental. To my eyes, 1440p is borderline clarity while 4K is just worlds apart. Think of it as no anti-aliasing vs 16x MSAA.
@@ghost085 It took a lot of time getting used to a curved screen. I hated curved screens, but after using this one I wouldn't want to go back to a flat screen. It does smear, but not a lot; white text on a black background is a hodgepodge (while scrolling only). Backlight glow is unnoticeable even when ambient light is dim/off, but there's a very slight halo around objects which, again, is unnoticeable unless you're looking for it; sometimes it's in your face though and you just can't ignore it. My advice is to get an OLED and skip this one, unless you're getting it for cheap or are afraid of OLED burn-in.
83" LG G4 4K gaming is awesome at 2.3 meter distance!! RTX4090 and 5800X3D are a nice pair with it as well, mostly 100fps-144fps in all games when using reasonable settings.
It's down to screen size for me. 1440p is probably the sweet spot for a 27-34" monitor, and you get damn good visuals and better performance. Consoles do look nice on a 4K OLED, but you are not getting true 4K; it still looks amazing though.
You've been able to do so since the 10 series. Just depends on the game and settings. Still on a 1080 Ti; I can still play most games at 4K maxed. 4K is not just turn everything up and forget it and expect 200 fps. You gotta tinker around.
Well, okay, so if I wanted to play at 4K, I'd need at least a 4070 Ti Super to achieve a semblance of a somewhat stable 60 fps _in most games_, and that would set me back 800€. But then I'd need a 4K monitor as well, which would set me back _at least_ 400€ for image quality that doesn't absolutely suck, AND I'd be stuck with a 60Hz monitor even if I were to downscale to 1440p. Not only that, but a 4K monitor with at least 120Hz (I searched for 100Hz first, found none in my immediate vicinity), so I'd be allowed to downscale to 1440p in case I wanted to play a fast action game and needed the extra fps, would set me back almost 1100€, and I didn't even look at the specs, so it's probably god-awful in terms of image quality. Is gaming at 4K stupid in 2024? Unless you can throw money at the problem, yes, it is stupid.
I have that exact GPU and I'm still at 1440p. 4K only makes sense if you have the space for a TV-screen on your desk and have money to waste sustaining that resolution by upgrading every other year.
If all you play are games that punish your GPU like AW2 and LotF, then sure. But personally I play plenty of less graphically intensive games and older games that run well into the triple digits of fps at 4K without DLSS on a 4090 and 7800X3D PC. It's to the point that my monitor is a bottleneck, because the refresh rate is only 144Hz.
The solution is simple: if you have a 90-series card (30 and 40 generation), go for 4K; if you have a 70/80-series card, go for 1440p; and if you have a 60-series card, stick to 1080p.
@@Just_Be_The_Ball DLSS Performance/Ultra Performance with medium settings and sub-60fps while disabling ray/path tracing isn't playing 4K optimized. Also, before giving me an example of some 5-year-old game, give me your settings and fps at 4K in any of these games: Senua's Saga: Hellblade 2, Horizon Forbidden West, or Cyberpunk with ray tracing/path tracing.
4K with my 4080 Super on a 28" screen is the sweet spot for PPI. No problem lowering settings from ultra to medium, like shadows or reflections or grass detail, for 100-120fps, super clear. Next upgrade will be a 4K 27-28" OLED. I cannot go back to 1440p; it's too blurry and distracting.
There's a flaw trying to compare 4K DLSS Performance to native 1440p on the same 4K panel. The scaling of the 1440p native image won't be great. Ideally you'd want very similar monitors at the same size (one 1440p and one 4K) for this purpose. Maybe you could get away with a larger 4K panel moved far enough away to look the same size.
PC: (3080, 10600K, 32GB RAM, 3TB storage, 850W PSU), $2600, about a 3-year-old build. TV: 4K 55" LG C2 OLED with G-Sync, $1800 on sale. Not cheap, but I can't think of any better gaming experience than 4K 60fps+ max/high settings on an OLED/G-Sync display. It seems the more you spend on a PC past about $2k, the less payoff it gives you, at least in a gaming aspect.
Same 55" C2 here; I liked it so much I bought a 2nd 😂 With a 4090 I play mostly at 5000 or 6000 pixels across, 80-120fps (capped), sometimes without DLSS. Add frame gen in there and it's always very playable. That TV is now around 900 bucks and still better than any monitor I've had.
I've been using an RTX 3090 for 4K gaming on my LG CX 55" OLED 120hz/VRR display in the living room. There are only 2 games where it has struggled with 4K/60. Those games were Starfield and Cyberpunk 2077. You lower a couple of settings and use DLSS and it's fine. Most games easily run 4K/120Hz without issues.
Contrast > Resolution That's not even talking about other image quality advantages. If both displays are calibrated, I can guarantee that a 1440p OLED will look better than 4K IPS both with static images and in motion while providing more FPS. You can even lower graphics settings with an OLED, let's say, from high to medium and it still won't look worse, if not better, while providing you with even better performance.
Have not seen 4K IPS. But I just got a UW 1440p OLED, and it is a whole new experience coming from IPS. I'm replaying games from the start just to experience them on an OLED monitor.
The best way to reduce performance demands for gaming is to simply sit further away and use a bigger panel. If you prefer to game at about 3 feet away, which isn't all that far, you can get away with 1080p and 90 PPI easily; you can also do 32" 1440p or 30" 2560x1080 at this distance and panel size. 90 PPI at 3 feet away offers the same value as 110 PPI (which is very common today with QHD displays). 4K is great for sitting very close to your monitor, meaning you can benefit closer than 3 feet, and it also increases the feasibility of larger panels (like 32", up to 48" reasonably), however at the cost of needing minimum a $700+ GPU. 32" 1440p and 30" 2560x1080 are the sweet spots for larger-display gaming imho. You can get away with a $500 GPU in this performance category easily. 4K would require a GPU 150-200% more expensive to get a good linear frame rate and native image above 60 FPS.
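To put the distance argument above in rough numbers, here's a small pixels-per-degree (PPD) calculator. It leans on the common ~60 PPD "retina" rule of thumb as the visibility threshold; that threshold and the example panels/distances are assumptions for illustration, not measurements:

```python
import math

def ppd(width_px, height_px, diagonal_in, distance_in):
    """Approximate pixels per degree of visual angle at a viewing distance."""
    ppi = math.hypot(width_px, height_px) / diagonal_in   # pixel density
    one_degree_in = 2 * distance_in * math.tan(math.radians(0.5))  # arc width of 1 degree
    return ppi * one_degree_in

# 27" 1440p and 32" 4K, both at roughly arm's length (28")
print(f"{ppd(2560, 1440, 27, 28):.0f} PPD")  # ~53: below the ~60 PPD threshold
print(f"{ppd(3840, 2160, 32, 28):.0f} PPD")  # ~67: at/above the threshold

# The same 32" 4K panel viewed from 6 feet away
print(f"{ppd(3840, 2160, 32, 72):.0f} PPD")  # ~173: far past what eyes resolve
```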
I paid 1200 bucks to build a pc that games excellently at 1440p with proper ray tracing. I can’t really justify anymore money unless I’m using it for work. 1440p is definitely the sweet spot.
Just get a 4K monitor when on sale/Black Friday. You can enjoy your new card. It also won't be such a big expense all at once. You can lower the resolution to 1440p and increase the sharpening in the meantime if you can't run 4K smoothly presently. It'll look just as good as native 1440p.
@@lsik231l Counterpoint: monitor refresh rates for 4K are spiking while prices are dropping. Probably best to wait until you buy the GPU for your monitor update. An unexpected problem with high-end 4K units is that the cable isn't rated for most lengths. Look into the issues with Gigabyte's most recent high-end monitor: mainly a cable length limited to 1.2 meters and no Dolby Vision, all for $1,300. Gigabyte Aorus FO32U2P.
@@jackmills7758 nah. Which 30 series card is good for 4K in the next five years? None tbh. I'd rather stick with 2k. 4090 is the only card worth getting a 4K monitor for and it's best to wait for the 5090 at this point.
I have an RTX 3070 and 32GB RAM running with a 1440p 144Hz monitor. With that setup I'm reaching 40-60 FPS on custom/high settings in the new Warhammer 40K title "Space Marine 2". The game is incredible; I just wish it was optimized.
I just went from a 27-inch 1440p monitor to a 32-inch 4K monitor. So far my drop in performance isn't terrible. I was probably a little CPU-bound at 1440p, so I think that's why the jump to 4K didn't ding me as bad. Most games I'm at 80-100 fps at max settings...
Many people tell me 4K sharpness is simply the best, but I'm still on 3440x1440 ultrawide, primarily because I want the best performance-to-visuals ratio. I have a 4090/13900K rig, and even though I'm sure it's capable of gaming at 4K, I value the slightly higher framerates I get at a slightly lower-than-4K resolution, especially for very demanding new AAA titles like Black Myth: Wukong. More importantly, I'm so used to 21:9 ultrawide that I don't think I can go back to 16:9.
I'm at 139 fps max with a 3090/14900K on Wukong, at 1080p. 1080p has been great for me for so many years and is absolutely enough for the best performance, but now with more detailed games like Wukong, I wonder if it's worth trying 1440p without losing too much framerate. I just DON'T WANT to play under 100 fps; it's impossible for me anymore, because all my games have been above 100 for 5 years, and I CLEARLY see the difference between 60fps and 144fps. So how much framerate do you lose from 1080p to 1440p? Should I finally buy a 240Hz 1440p in 2024? (Ghost of Tsushima was 180 fps and absolutely breathtaking visually, even at 1080p.)
@@williamschlass6371 I'd prefer to be at a slightly lower resolution than 4K (if I'm not on 3440x1440 21:9, I'd be at 2560x1440 16:9) as I value the slightly higher framerates. It's not about being rich its a matter of priorities. I have a friend who wouldn't pay more than $500 for a graphics card but would upgrade his car every 2 years lol.
@@Faz-_ How come you're playing at 1080p on a 3090? But I fully understand why you chose to remain at 1080p for the best framerates. I used to think that way too, but I would say 2560x1440 is the 'sweet spot' for best performance with the best visuals. The reason I upgraded to 3440x1440 was that my 24" 1080p monitor seemed too small after some time. My original plan was an upgrade to a 27" 2560x1440 monitor, but after I saw a 34" 21:9 ultrawide monitor, it was too huge a leap forward and I was enticed, with the added bonus that its 3440x1440 resolution is less demanding to drive than a 4K monitor. I think you meant how much framerate you'll lose going from 1080p to 1440p. Have a look at TechPowerUp's graphics card reviews; they show performance (fps) differences when playing various games at 1080p/1440p/2160p. I don't think it will eat up too much framerate. I fully understand that you value 100+ fps; it's the same reason I used DLSS Balanced rather than DLSS Quality in Black Myth: Wukong (a whopping difference of 125fps vs 105fps). I'm not into high-refresh-rate monitors, so I can't comment there.
4K today is pretty expensive to run well, but it's perfectly doable. I'm the type of person who gets top-end hardware but spends a lot of time with it. At the end of 2022 I went from playing at 1080p with a GTX 1080 to playing at 4K with an RTX 4090. It was a pretty nice upgrade, and the framerates even improved quite a bit despite the bump in resolution, thanks in part to DLSS. It was a massive upgrade, and now when I play at 1080p again on my laptop connected to a TV it feels horrible, but on the laptop screen itself it's still fine.
The 90 series cards are going to sell less, the average consumer can't afford them. Nvidia also is taking advantage of the Biden administration failures and jacking up those prices.
I made the mistake of getting a 32” 4K60 LCD. I can’t go back, not just for games but video and photo editing as well. I just got a 4070 Super and 4K60 is fine in most games and some just need a little DLSS.
4K DLSS/Performance looks good, and all new games should support this. AMD should go for an upscaler tailored to their own cards to get it on par with DLSS.
@@dkindig FSR has the goal to work on everything from Nvidia hardware to Playstations. Nvidia, Intel and soon Sony have there own upscalers and don't need FSR. FSR can't compete on quality because it can't utilize the hardware properly since it has to work on everything. Just drop that requirement, and develop a new modern upscaler that works only on new AMD hardware. FSR as it is is good enough for legacy cards.
@@ZeroZingo All that aside, everything that I can find indicates that FSR is an AMD product and AMD is putting their eggs in that basket. So I would imagine that it is tailored to their cards (and poorly done). If they internally decided to use a less than optimal product then that was a poor business decision by them. They shot themselves in the foot instead.
@@dkindig FSR at one point in time set out to take over the upscaler business, essentially killing DLSS and other upscalers by being the only one developers want to implement. But it is not working out; AMD needs to rethink. Hopefully they can do that on existing hardware; otherwise they perhaps need to add more AI capability to RDNA5, if they choose to go the AI route. I would bet that upscaling will be part of the driver set in the near future, and developers will only implement one API from Microsoft for upscaling, called DirectSR. All cards will be able to upscale all games using their own algorithms in an efficient manner. But it will not look exactly the same, and AMD needs to be competitive. Having a jack-of-all-trades upscaler is not the way to do that.
Been playing at 3840x1600 since the 1800X came out. It really is a sweet spot for higher-than-1440p gaming vs 4K gaming: great image quality without suffering as much fps loss. There are always tricks to get the most fps by turning down a few things that have very little effect on visual quality, even without using upscaling.
If you're moving from a 4090 to a 5090, then 4K all day. If not, 1440p is the sweet spot if you want to save money and still not be able to tell much difference between 4K and 1440p. I consider myself a PC enthusiast, so my main rig has gotta be in the top 10% for me to feel I can do anything I want to do without hesitation, e.g. coding, graphic design, etc.
I'd say 4K is wasted resources that could be used for more path tracing or other visual improvements. I already didn't see a huge improvement from 1080p to 1440p, and I'd be happy to go back any time if that's needed to run a game at max settings. Thanks to my 4070 Ti Super I don't have that problem at the moment. But I literally can't play games without ray tracing anymore. I just started playing Starfield and quit after 2 hours; the really bad shadows and lighting just throw me off constantly and destroy my immersion and mood. I got spoiled by CP2077, Alan Wake 2, Dying Light 2, Fortnite with Lumen and RT, and so on. I just have to wait for UE5 Lumen titles to properly enjoy playing again. It kinda frustrates me that there are still games coming out in 2024 without a proper ray tracing implementation. Imagine, that tech is like 6 years old now. Back then with tessellation, adoption was a lot faster. In my opinion there shouldn't even be an RT on/off button in any game anymore, just a plain low-to-high slider for RT. It is the same standard now as screen space reflections and so on were back then. Also, RT should move on to plain path tracing in every game, like it is in CP2077 and Alan Wake 2. That is soooo much better than "normal" RT. Devs need to start listening to the community and bring more of that stuff.
tbh aliasing and TAA blur bug me more than lower-res shadows. Switching to 4K from 1440p was definitely an improvement on my 6800 XT (since I'm not gonna be ray tracing much with it anyway), but I will say much of the visual improvement came from going from IPS to OLED.
@@NinjaWeedle Yeah, OLED makes a lot of difference. I see it on my smartphone. I've been waiting for gaming OLED monitors for like 10 years. Now they're finally out, but far too expensive for me. I'm kinda strange in that department: I have no problem spending 1k on a graphics card or 2.5k on a gaming tower, but my monitors have always been like 80-150€ish, with my last 1440p monitor being an outlier at 250€ (reduced from 450€). I used my 1600x1200 monitor literally for 13 years till 2023, then it broke, or else I'd still be rocking that, I think. For me, every monitor above 300€ is price-inflated, so I keep waiting for a cheaper model, but then I can never decide which to get. But OLEDs start at like 700€. That is insane.
I have excellent eyesight and do most of my gaming from my couch. I can barely see a difference between 4K and 1440p at that distance, so 4K is completely unnecessary, in my case. I'd much rather have the extra horsepower to boost FPS or play around with RT or various overindulgent Nvidia effects, haha.
I somewhat agree... at gaming distance I can see just a hint of blurriness if I run my 55-inch at 1440p instead of 4K. I'd rather just run 4K DLSS Performance; it looks as good as native at a distance and runs like 1440p.
Eh, I don't think my eyesight is that great and I can tell pretty easily, but it's also easy to ignore, and higher fps is much more noticeable. You don't have to be able to see huge gaps between individual pixels to tell it's not as sharp.
It all depends on the size of your screen. For most PC displays anything above 1440p has severe diminishing returns due to screen size, not worth the extra cost. If you are gaming on a 65" TV then 4k makes more sense.
Went from 24" 1080p to 27" 1440p. During gameplay I wouldn't say I see much of a difference. It all depends on the angle of view, i.e. how close the screen is. The bigger the screen, the farther off you sit, generally. At some point higher resolution gives you diminishing returns. It's the same with photography and printing your photo: you can print it really big and look up close, but you can't see the full picture that way; you have to stand back. There are of course exceptions, like if you want more immersion in the game world: a wide screen with a curved display, or multiple monitors or projectors to create a wider view, etc. But I wouldn't want that as standard; it would be too overwhelming with fast-paced action, and I'd get nauseous and headaches.
Gaming at 4K isn't the stupid thing. Games being insanely unoptimised while still trying to aim to be highly demanding graphical showcases, that's what's stupid. But everyone pretends that's not an issue because upscaling is here to the rescue so everyone can lean on it as a crutch.
I'm dreaming of playing at 5K; goddamn, things are sharp at that resolution, especially in VR. Hoping the next generation of GPUs will allow for 5K seamlessly.
Issue is monitors: there aren't any gaming 5K monitors, just professional and work-targeted monitors, thus 60Hz and whatnot, which you don't want. I do hope 5K becomes adopted in the future though, as something beyond 4K for when high-end hardware has no trouble doing 4K but 8K is still too ridiculous to feasibly do on anything new.
It's not GPUs holding back stupid resolutions, it's the fact devs and most players would rather they use that processing power to make games look better rather than have a worse looking game at a higher resolution. So new GPUs will come and demanding games will match them and those resolutions will still not make sense.
@@math3capanema 1800 is okay but 4k is still better. I think it really depends on eyesight. My friends with glasses are fine with 2k but myself and other friends with 20/20 or better vision, all prefer 4k (or better i guess)
Agreed, I upgraded to 1440p from a 1080p monitor and almost didn't notice any improvement. Returned the monitor and got a 4k display and yeah I would never go back to 1440p too.
It comes down to preference. DLSS/FSR should be used in either way. IMO the target should be either 1440p or 4K high/ultra settings with RT 60+ fps in single player games. Of course if you are playing some kind of competitive shooter you can just drop settings to medium-low, resolution to 1440p/1080p, disable RT and go for that 90/120fps.
I've tried upscaling multiple times in PC games and in emulators, and it was always the same: a medium jump. Even videos show it's not a big jump, and pics show this too. I watched videos on my 4K TV, and the jump isn't big; like I said, it's medium.
I recently purchased a 27" monitor and it feels so big compared to my old 24". I doubt most of the gamers need it unless you are a fan of rapidly moving your eyes, especially if you have multiple screens.
@@kotekutalia Imagine people using a 32” monitor. I moved from 27” to 32” just for the sake of Semi glossy OLED 240hz 4K. But the size is too much for me 😅
It works well, there IS an improvement in image quality and I did this for a while at 1440p. I'm running a 4070 Ti Super and I was using DLDSR (nvidia 3d settings) and running 4k scaled to 1440p. The quality was improved but of course fps was limited by the render resolution (4k). I used it for a little while and the biggest bonus was that I was able to see first-hand if my GPU could run 4k well enough for me to move to a 4k monitor.
@@alvarg Yes, I've moved to 4k and have one really demanding title, DLSS has been exceptional, running at 67% scaling and image quality is fantastic. FPS has increased by 30%-40% in the most demanding areas. Next will be looking at 240Hz monitors. Running native in most titles though and still getting good frame rates from the 4070 Ti Super. My monitor is pegged at 140 frames most of the time.
@@vijayla21 Yes, but not using ray tracing or path tracing in anything. System was built primarily for Starfield and Star Citizen. Starfield at 4k native I get 55-65 in Mast District, DLSS Quality puts me at 75-90 with excellent image quality, once outside of cities fps picks up quite a bit. Star Citizen I haven't benched in a while because of Master Modes and digging my HOSAS out is a pain in the butt. Anyway, TLDR, 4070 Ti S seems to be a great gateway 4k card for the price if you're okay with everything else about 4k.
@@nickwu4384 Good to know. I am still waiting for a 3840x1600 OLED at 38"+. And I don't believe a 5120x2160 OLED at 38"+ will come that soon, and I can't imagine how powerful a GPU would have to be to power it.
@@alexander3025 Considering I'm using a 2060 to run 4k I think it will be just fine. Playing all the Yakuza games, and jrpg with dlss performance and it stays at 60 99% of the time.
@@nickwu4384 Yeah I'm looking forward to 4k ultrawide as well. But 4k ultrawide will be a bit more demanding than standard 4k, just like how 3440x1440 is a bit more demanding than standard 1440p so there's that.
Been gaming at 4k for the last 3 years when I got an LG CX OLED 48” TV, and it’s been absolutely fine for the most part, especially with DLSS. And I can also run games in 1440p UW resolution on this screen if I need an additional performance boost. With a 4070 Ti Super most games in 4k push the 120fps limit of my panel. I have to turn up the settings to fully engage the GPU and drop the frame rate under the panel refresh.
I will be honest. When I jumped from 1080p to 1440p I saw a pretty sizeable increase in quality. It was nice, and I gamed like that for a good 10+ years. I upgraded to a 32-inch 4K monitor last year. The increase in pixel density would make you expect a nice jump from 1440p, but honestly I didn't really see much of a difference. Don't get me wrong, it is still a very nice monitor, but I just felt a bit let down. Even with the added benefit of HDR (when it works), it still feels somewhat lacking. To be honest, if I didn't have a 4090 I would have stuck with 1440p at a larger size. I feel 4K doesn't really start to shine till you get upwards of maybe 55 inches or higher.
Here is my order of importance in gaming. Framerate -> Ray Tracing (reflections only) -> Graphics settings -> Resolution. I play at 1080p, 1440p and 4k and barely care about it.
He's not trying to test how good a 1440p native monitor looks compared to a 4K native one. He wants to see, if you already bought a 4K monitor and have to lower the resolution, what the impact is.
Sitting in the back, picking their noses, eating their boogers, and turning red in jealousy at the 4K Chads. That's where. Don't be sad, though- unless you're also a console peasant.
10:00 Look at the back entrance in this freeze frame, the beams on top. So much clearer and more detailed, and it's 4K DLSS Performance with an internal render at 1080p, and it beats 1440p native by a lot. Performance is the same.
It's so much easier to run games at medium and high settings instead of anything above that, especially at 4K, without too much of a hit on visuals. I'm using a 6900 XT and it does pretty well at 4K after tweaking settings and messing around with upscaling, or Fluid Motion Frames if you need it. Personally, I think native res with FG looks better than using upscaling on top. Just recently picked up a Hellhound 7900 XTX to game a little better at 4K, but I sent it back because it couldn't detect that my Samsung OLED was a 10-bit HDR 144Hz screen. It was reporting it as an 8-bit non-HDR 120Hz screen for some reason. Also, part of the bracket that you screw into the case was bent, which was odd as the box wasn't damaged. The upgrade excitement is gone now, so I'm waiting for next gen.
@@Z3t487 That's optimistic. 4K is becoming more and more distant with all the new game engine improvements; by 2030 nothing will change, because GPUs will only just be catching up to photorealistic graphics plus full path-traced lighting tech.
Hard to imagine when GPU makers are still giving us 8GB of VRAM on the most mainstream product. There are a lot of people still gaming at 1080p, and most of them not by choice.
The entire point of the video is that 4K can be dumb in some situations and fine in others. If you only have money for something like a 4060 Ti, then you probably should not be trying to play in 4K; you're probably better off spending your performance on frames than on additional pixels. 4K is definitely way more viable than ever before though, between some of the absurdly powerful GPUs on the market and how good upscaling is now.
@@dauntae24 No it is not, if you are sitting at arm's length from your monitor (which is recommended btw). Even at 27 inches, 1440p is noticeably less sharp than 4K. The further you sit away, the less noticeable the differences become.
I have a 6800 non-XT and I play most games at 4K, some while using FSR. My TV is 4K and I sit 2 meters away, but tbh I tested 1440p too and I just can't see the difference from that distance, and I doubt people sit close to their 4K TVs... Only by taking screenshots and going near my TV can I see more detail. So 4K is good if you have tons of money and can buy a 4080/4090/7900 XTX and use a 4K monitor while sitting close; then it's worth it. If you game on a big TV sitting far away, 1440p is more than OK, and mid-range GPUs can upscale to 4K almost perfectly these days if needed.
Check out Jawa for the best price to performance builds on the Internet! jawa.link/OwenJuly24 Use code OWEN10 for $10 off your first purchase!
Be careful buying gear from Jawas... while it's normally a good deal, it's hard to trust them little cloaked gremlins.
Buying from Jawa seems much safer. Still a good deal and no sand crawling tech bandits.
Minecraft players: allow us to introduce ourselves
@@gasracing5000 yeah, seems like another eBay with no 100% guarantee
What a click bait thumbnail! The host is not so impeccably groomed in the actual video ...
Yea, they are buying goods at less than market value to put into new builds as low-cost solutions for the consumer. Their 100% new stuff is the same price as others, really. Been there, done that. You can build your own for cheaper and get better parts in the process. Too many prebuilts out there, just as cheap or expensive as one wants, Costco and Microcenter for example. They only wanted to pay 350 for a new-in-box 4070 Ti. Who wouldn't? lol
The best gaming cost saver for me, is to never update the prescription for my glasses. That little bit of “Eyeball AA” makes 720P and 4K look the same.
Same lol. I only care about refresh rate as it’s the only thing I can tell a difference in
@@Korixon. As an aside, watching Cyberpunk 2077 fly at 120+ fps on my laptop (even if we're talking frame gen here) is pretty mind blowing. Back in my college days with my laptop then (Radeon 5470), it was often a struggle to get just 30 fps at all. Forced myself to play Battlefield 3 at 800 x 450, all Low, and upper 20s-low 30s fps, simply because my parents gifted me the game.
And now I have a conniption if a game drops below 60.
WTH, you don't list what GPU you are using? That makes a huge difference, why did you leave that out?
I'd rather go ultrawide 3440x1440 for that immersive experience rather than 4k which is the same regular 16:9, just mostly small improvement in picture quality.
😂
FSR and DLSS are great technologies, but it feels like game devs just use them as an excuse to ship unoptimized products sometimes
As someone learning unreal engine: yes
Oh 100%. It's crazy how many big titles look and run like shit without it...
More like they make bloated games to pump up GPU sales.
I’m suffering with these issues right now! I have an RTX 3070 running on a 1440 144hz monitor.
With that setup I’m reaching 40-60 FPS on custom/high settings on the new Warhammer 40K title “Space Marines 2”
..the game is incredible. I just wish it was optimized.
@NeoRimeOnline They "look" fine without it, they just run low fps unless they are enabled.
1440p is just so much more affordable. It's the sweet spot.
I also will never give up the framerate difference between 1440p and 4K. The only upgrades I'll consider are HDR and *maybe* going over to 21:9 ultrawide.
Here I am, having upgraded to 4K long ago: 50-inch TV 1m away, HDR on, considering VR goggles with 4K per eye..
@@pozytywniezakrecony151 OLED? If so, understandable, what with OLED monitors only being a recent occurrence.
VR does sound like a cool upgrade too. I, personally, just don't want to hop in until it's more refined. In particular, haptic gloves sounds really cool but it's either too much time or money for me to invest in.
@Egementhegamerr A 4080 is enough for 4K if I can run 4K on a 3060 Ti.. ofc not super ultra, but some settings are useless
At some point you don’t need more FPS. I stayed at 1080p 60 until I saved enough to move to 4K 120. It doesn’t cost that much more than 1440p 120, unless you play the most demanding games.
@@JGComments A decent 1440p 144Hz monitor at the moment is somewhere around $180-250.
A decent 4K 120 or 144Hz monitor is somewhere around $400-450.
Maybe "doesn't cost that much more" in regards to a whole setup but an extra $200 to have a minimal sharpness boost just doesn't make sense.
I would much rather put that extra $200 into a Mini-LED 1440p monitor that would have proper HDR capability. The perceived improvement via contrast and color would be far better than the diminishing returns of being sharper.
As a cherry on top, it'll look better all while still being able to natively run higher refresh rates.
The problem with playing in 4K is that when you do it for the first time with a large enough display, there is no going back. I'm playing on an OLED 4K TV I use as a monitor and it is insane.
Yup me too. I have an lg c1 65 inch 4k oled tv and I have a 4090/ 7800x3d/ 64 gig ddr5 build.
LG C1 TV with a 1080 Ti. That monster still does a good job on it.
The fast games I play I wouldn't even notice. My computer can run ultra settings and I still put it on high. I'd rather have the performance bump than something I won't notice while actually playing the game.
I can see it being really cool for flight simulators.
@@DingleBerryschnapps Bingo
This, once you see "real" 4k rendering it's a revelation and you can see the differences with upscaling too.
4K gaming is not stupid it's just not affordable. Most people don't upgrade their system every generation. I have a friend who upgraded now after 10 years.
Not only is 4K not affordable, you can't physically get competitive framerates (100-200+ 1% low fps) on ultra settings in modern games without frame generation/DLSS, which kinda dilutes your experience.
@@kotekutalia 100% agree, we are not there yet with high frames 4K gaming.
@@kotekutalia what modern competitive game on ultra can't get over 100 fps on a 4090, most competitive shooters are not hard to run and honestly ultra isn't really required if you plan on being competitive anyway.
@@alvarg I didn't say competitive game, I said competitive fps. Aka if you are a comp gamer you will be dissatisfied when you pay $1800 on a GPU and it sometimes still stutters when pathtracing in Cyberpunk on 4k.
@kotekutalia gpus are not ready for path tracing yet, and you would be pretty dumb to buy a card for that purpose. You could use normal ray tracing on a 4090 at 4k with great results.
Competitive fps is a vague term; competitive against what? It wasn't too long ago that 60 fps average at 1080p in the latest games at max settings on the top-of-the-line GPU was considered good. You can easily get that on a 4090 in pretty much every game except for some UE5 titles, which makes sense as that engine was designed to push the limits of hardware.
1440p from 1080p is a huge difference
Not really, to me. The only difference that matters is the framerate and hertz the monitor can pull; that's a huge difference for fluid gameplay. Very few monitors exist that can do 240Hz at 4K, and those are near impossible to find, if findable at all. If it's a TV, that's perfectly fine for watching stuff only, but not for gaming. I'll stick to my 1080p gaming monitor from 2021-2022 till Christmas, when I'll hopefully get my dream gaming desktop and a new 2024 1440p 280Hz gaming monitor from Alienware.
Yep. Huge difference.
A soft image does not really bother me, but on a 27" the difference is quite large. So as long as I get 60fps at 1440p I surely would not go down to 1080p for higher smoothness, but even the lower res still delivers a good experience for me.
@@Cole_Zulkowski "not really to me" then your dumbass goes on to say you're getting a 280hz monitor lmao
Barely any difference
144 to 240hz is much more noticeable
7600x and 7900XTX with a 4K 32" 144Hz monitor play everything at High or Ultra and it's great.
Do You play native or upscaled?
@@homerlol9058 it's not relevant, consoles always use upscalers and not very good ones. If dlss, it can be virtually as good as native.
Mw3 4k extreme native 128fps rtx4080
@@TheT0nedude No one is concerned about you pitiful console peasants; other console peasants' opinions don't count either. You guys legit don't have the proper context to appreciate truly good graphics. If you are playing on a large TV you will absolutely notice a difference between 1080p and 4K. The more you stretch pixels, the more important the quantity of them becomes. Edit: Also, the higher the resolution, the more beneficial upscaling becomes and the more wiggle room you have in how far you can push the upscaling before it looks bad.
@@TheT0nedude it's pretty relevant, native is better than upscaling
4K is such eye candy
I gave up on steady fps and performance
4K scenery just blows your mind
For single player games yeah. Multiplayer I’ll take the frames.
@@rlsjunior797 Even single-player shooters feel horrible at 60fps. I only max my settings in controller-friendly games like Elden Ring, Spider-Man, Ghost of Tsushima etc.
Idk, I'd prefer games at 1440p + path tracing, vs 4K no path tracing.
@@agusr32 Agreed and I can't notice any difference in clarity from 1440P to 4K at 27" monitor. Maybe larger screens. But I still use two 1080P monitors and one 1440P for my triple monitor setup. The 1080P ones still look amazing.
@MrArrmageddon 4K looks completely different on a 65-inch TV. 1440p looks slightly blurry and 1080p looks pretty blurry.
Resolution isn't everything.
I went from a 1440p 27" VA monitor to a 1440p 27" WOLED monitor.
The upgrade was massive!
Especially contrast and color saturation is soooo much better.
Smoothness is also much better - IF - the framerate is over 80fps, otherwise you'll get judder/stutter because of the superfast response times.
Indeed. People saying they won't go back from a 4K OLED setup are really saying they won't go back to a TN/VA/IPS panel, rather than it being about the resolution. A 2K OLED looks as good as a 4K OLED.
@@sspiegel1 Depends on how sensitive you are to color fringing. 4K OLED kind of allows you to "brute force" your way past the subpixel layout issues.
Don't be mistaken. Very, very few console games run at NATIVE 4K. Most of them use upscaling.
A lot of console games are running lower than 1080p this generation, although it's somewhat impressive what the PS5 does with an underclocked RX 6700 non-XT
No one really plays at native 4k, the performance impact is massive. Even on 1080p going from DLSS to native is huge
@@Ligmaballin What kind of gpu do you have that needs upscaling at 1080p?
@@Ligmaballin The perfomance difference between 1080p and 1440p is surprisingly minimal on a solid gpu.
@@heatnup RTX 2080 S, i play Cyberpunk with raytracing. Other games run well native, but i use DLSS still for more frames.
4K native? You just need money, that's all.
You don't need native anymore. Upscaling is too good nowadays.
Most cards beginning with 10 series can run 4k now
@@evaone4286 15-year-old games, maybe. Even a 4090 almost can't run games like Alan Wake 2 in 4K WITH DLSS AND frame generation. How would a 3000 series GPU manage that in new games, let alone 2000 or 1000 series?
@@williehrmann Some games like Alan Wake require the absolute best card on the market, so I think it's an outlier in this case. As I said, you've been able to run games at 4K since the 10 series; it just depends on the settings and fps. But you can get playable fps even on ultra with some cards.
@@mikem2253 Yeah, I meant native and the fact that you need money. Upscaling is quite good and it will be better, there is no other way.
I have been gaming for a few years at 4k on a 120hz tv, with almost no problems. Looks great, would be hard to go back.
Yeah I've had a 4K display for around 5 years now. Started using it with a GTX 1080. Did downgrade from 165Hz to 120Hz for it though, which did slightly suck. Plan to upgrade soon; I've noticed my display has slowly been gaining an odd vignette and has a dead pixel on one side. Not easy to notice, but it is there.
PC or Console?
@@sirnodchalot PC
I went to 4K a while back. I kinda wish I didn't and should have stayed at 1440p, just because I think you get more value from mid-high hardware.
There are midrange 4K monitors now though, and 1440p has plenty of flagships occupying the peak-refresh-rate OLEDs, while 4K sidegrades to lower refresh rates. For value in the upper mid-range it's now 144Hz 4K and 240Hz 1440p.
spent that money on a monitor and didn't get the GPU to run it. Noob
@@pem3480 do u read? i have a 4k monitor runnin 4k gaming what dont you understand? noober
@@Frozoken 4060=1080p 4070=1440p 4080=4k and 4090= all above... He said don't expect big performance changes from new gpus... so the 5060=1080p 5070=1440p 5080=4k ect. Welcome to the new world of gpu 😂 😂
@@azjeep26 yea. Ur run on rant sounds like crap. Get better hardware. broke scrub
As someone who works in video and animation, I can honestly say that if you're not viewing content on a screen larger than 50 inches, 4K is not necessary. I have a 130-inch theater screen with a 4K projector. That was the first time I actually found 4K useful and the only place I care about resolution. Anywhere else, 1080p is fine. 8K is just ridiculous. So yes, 4K has four times the pixels of 1080p, but the difference doesn't become noticeable until the screen size gets larger. I do medical animation and rarely need to make a 4K video unless some client has a need to show the content on a large screen at a trade show
Moved to 4K 2 years ago and it's a blast really. I'm having great fun with it in games, videos or reading. Fantastic clarity, no visible dot pitch black grid, feels so smooth like a smartphone screen. Sure it needs more compute power but it's worth it really. I'm glad I didn't stay on 1440p too long. The felt difference from 1440p to 4K was greater than the move from 1080p to 1440p.
Also have to say 4K performance DLSS looks so much better than 1440p quality DLSS because it has higher pixel input so resolves details much better on a vastly higher pixel density screen. In actual use, DLSSQ on my 1440p screen playing RDR2 was borderline acceptable quality while DLSSP on my 4K screen looks way better than 1440p native.
That is my exact experience. Perfectly stated. Cheers
not all games have dlss
4k dlss performance is amazing with high graphics settings!
Exactly. I made a jump from 1080p to 4k and, oh man, what a difference that was.
@@0rda not all games are playable on PC
So, I actually went from 1080p > 4k > 1440p in my monitor journey. 4k looked incredibly crisp and clear, but the sacrifice in performance just wasn't worth it. However, when I dropped down to 1440p, I still got a much crisper experience than I did at 1080p, but I didn't have to sacrifice nearly as much in terms of performance. Was I getting as high FPS? No, but I'm not looking for 160+ FPS in all my games; if I'm getting 60 FPS minimum, preferably above 90+, then I'm completely fine with it.
Since I started playing at 4K at 27'', everything else literally feels like 720p; even 1440p feels like complete garbage. It's crazy, and with upscaling and frame gen you don't even need an insane PC anymore
I suppose people choose high resolutions for high image quality, right? Every time I tried upscaling and frame gen, my games became garbage with ghosting and artifacts. I know that many people don't care because they get higher fps, but what is the point of a higher resolution like 4K with a worse image (full of ghosting and artifacts) and less fps than if you were using full HD?
@@catgirllover666rockstargodmonk Because it's not the case? haha. At least in the games I played at 4K high settings where I needed upscaling (CP77, Ghost of Tsushima, Jedi Survivor, Ghostwire Tokyo), I could not sense a single difference with quality/ultra quality upscaling. I switched back and forth multiple times in a row, and even when I was looking for differences for minutes, concentrating only on the image, I could NEVER tell a single diff. So in my experience, with quality upscaling at 4K high/ultra settings in the games I have played, I could never tell a single difference. Frame gen I only tried with Ghost of Tsushima, where I couldn't tell a diff either, because I play it on controller so you don't notice any input lag difference
@@sensei_... Same, I'm using a budget LG 4k 27" and also having a blast. I don't even need Quality DLSS. Even Performance DLSS looks great, better than FSR Quality.
@@sensei_... you can't tell the difference because your eyesight isn't good enough. Mine's not either, which isn't a bad thing
Bro, playing a game on ultra at 1440p and 4K is almost the same
I run a 65" LG C1 so yeah- 4k or die. My 7600x/7900xt has been plenty to drive it with a little upscaling here and there as required. Honestly, anything that is 7800xt/4070 Super or higher can run a solid 4k image with upscaling.
4070 Ti Super and recent 4K upgrader here; I have been able to run most games at 4K native Ultra, the more demanding ones with DLSS. The upscaling is subjectively much better at 4K than it was at 1440p, and I only have fluidity problems when I get into areas where frames drop to the 60-80 fps range; once up around 90 or more, motion is pretty decent. I'm running an LG 32" IPS panel; my TV is older and I'd be locked to 60 fps on it.
@@dkindig Upscaling to 4K is really what I think they designed upscaling to work with; the pixel density is just so high.
@@dkindig I play mostly older titles due to the limitation of my hardware but being able to play 4k locked at 60fps on a 42" 4k monitor is amazing. The only game that spanks my system is Red Dead Redemption 2 which uses more than the 8GB of video memory on my lowly RX 6600. What is crazy is that when locked at 60fps the CPU doesn't need to work as hard when gaming in 4k as opposed to 1440p or 1080p at higher refresh rates. I can get 60fps in 4k with the GTA5 built in benchmark with max settings aside from grass being set to high instead of ultra and that is with a 3rd gen Intel 3770k. The one thing that has kept this system going for so long is a bios update allowing me to add an NVME drive on a pci-e slot which I was able to install the OS and boot from. Having 32GB of RAM also helped quite a bit. Of course newer games would most likely crush my system but I am content with playing games that have aged well. New games are a gamble if it is complete on release or if you are just a beta tester until patches come out to rectify the issues.
Same, 55" 4K TV. I had to upgrade to a 7800X3D to run a 4090 tho.
@@dkindig I have a Ti Super and was wondering if it was worth going 4K or not.
"1440p Native TAA" does not fit together, when TAA is enabled you're basically playing in 1080p with 2160p edges.
TAA looks horrible I agree. I use DLAA wherever I can because I cannot stand TAA.
@@AaronWOfficial Or TSR in Unreal Engine games. Technically it is a TAA v2.0 and better.
@@AaronWOfficial you can use DLDSR on nvidia cards and it's even better than DLAA
I still play at 1080p. I don't care about upscaling gimmicks; most if not all of the games I play are online competitive multiplayer, so upscaling is a no-go. I only play native at max settings, so even a 4090 couldn't provide enough fps (140+) at 4K native for most of the games I play. 2K is a no for me, so I will wait for better hardware to go 4K.
I just upgraded from 1440p ips panel to an OLED 1440p and wow it looks incredible! I think this is the best route if you want better picture quality but don’t necessarily have the hardware or don’t want to run 4K. 7900XT + 7800X3D btw, pushes an easy 165+ fps in any modern game with high settings
no, ips is better than oled
@@lizardx6504 wtf hahaha. IPS is so bad compared to OLED, stop coping
oled is better than ips only if you won't use it for productivity, it has a lot of cons in that regard
@@rmet255 i use oled for productivity and won’t be going back to ips
@@rmet255 true that brother
Recently dabbled in 4k on my tv. Tried Horizon last night + HDR. It's just absolutely insane. Only way I want to play story-based games now.
7800xt does incredibly well on a lot of these PS4/5 ports at 4k. Getting ~90 fps on Horizon right now with max settings.
Same here with an RX 6700 XT. Playing GoW, Ghost of Tsushima, Cyberpunk, and the list is growing. (It's fantastic.) I'll stop playing on my 1080p monitor for shooters.... 😂
Last Black Friday I built my rig with a rx 7900 xtx ($930) and ryzen 7800x3d (part of a bundle got great price) and I run most games at 4k max settings with 100-120 fps. I was also able to get a 4k 120hz hdr monitor for $500. Depending on the game I can mess around with ray tracing and still get very playable frames, I never play below 60. The reason I sprung for all that was because I thought good 4k gaming would cost so much more, but that actually all shook out to be reasonable for me and I’m really happy with my rig. With time, ray tracing will become much more optimized so I think my build will hold up well over the years
At 4K, DLSS is indistinguishable from native, for me at least. Even Performance, which scales from 1080p, is surprisingly much sharper than 1440p. If you have money, definitely go for 4K
Just DLDSR your 1440p up in combination with DLSS P.
@@albert2006xp A monitor showing you 1440p, even if the render resolution is 4K, is still going to look worse than a proper 4K screen that's upscaled from 1080p
@@alvarg DLDSR is pretty good at making things sharp. As long as the pixel density is good and you're not literally in the screen with the magnifying glass, it shouldn't be that huge a difference.
Don't underestimate how much sharper DLDSR can make the image.
@@albert2006xp source, have both 1440p and 4k monitors
1080p to 4k upscale is much sharper and cleaner than 4k downscaled to 1440p
a reason to play at 1440p (with or without DLSS) rather than 4K:
less GPU Power Consumption and lower Temps.
and cheaper monitors, don't forget that
Upscaling reduces both, too
On my 27" monitor I don't need 4K, but with my projector on a 110" screen it's an absolute must-have. I've wanted to game on a projector since I was a child, but I didn't wait so long only because of the money. I waited for usable 4K projectors, because 4K at this screen size is a heeeaaavyyy jump in picture quality. And it was definitely worth it. I'm so happy with it! 😍😍😍
One thing worth mentioning is that with a 4K display you can get 32-inch screens at a massively higher pixel density than a 1440p 27-inch screen. Pixel density has a huge impact on how people perceive images. I remember someone seeing my 15-inch laptop screen show a 1080p image and thinking it was 4K because it looked "crisp and clear", when "crisp and clear" really meant a very dense pixel makeup. Same reason 720p can look good on a Switch.
Also worth mentioning that 4K Performance mode DLSS has a higher internal resolution than 1440p Quality mode: 1080p vs 960p. AND 1080p divides evenly into a 4K resolution (you can use 4 pixels at 4K to represent one 1080p pixel), so most of the algorithms seem to do a better job getting it to the target resolution, in my experience. Still need a beefy boy to jump up to 4K Performance over 1440p Quality.
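If you want to sanity-check those numbers, here's a minimal sketch of the arithmetic (my own illustration; the per-axis DLSS scale factors, roughly 2/3 for Quality and 1/2 for Performance, are the commonly cited values, not something from an official API):

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch for a given resolution and diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

def internal_res(width_px: int, height_px: int, scale: float) -> tuple[int, int]:
    """Internal render resolution for a per-axis upscaler scale factor."""
    return round(width_px * scale), round(height_px * scale)

print(f'27" 1440p: {ppi(2560, 1440, 27):.0f} ppi')  # ~109 ppi
print(f'32" 4K:    {ppi(3840, 2160, 32):.0f} ppi')  # ~138 ppi

# 4K Performance renders more pixels internally than 1440p Quality:
print(internal_res(3840, 2160, 0.5))    # (1920, 1080)
print(internal_res(2560, 1440, 2 / 3))  # (1707, 960)
```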
How about the 32" Dell UP3218K .. 280 ppi?? .. 8K/60 FPS wouldn't be ideal for racing or fast paced first person shooters, but would be a dream for RPG/grand strategy or third person action/adventure games like The Last of Us or Red Dead Redemption 1/2 -- i.e. the type of games I love most.
I use a 65" LG C1/4090 for most of my 4k gaming but I'm in the market for a smaller monitor. I've had a high appreciation for pixel density, ever since getting a recent Macbook Pro and iPad, so looking for something as good or better than Apple's displays.
@@AdamsOlympia 8k is way too taxing. But a 32 inch 4k looks great
In the last year I went from a 1440p UW 165Hz OLED to a Neo G8 4K 240Hz and now a 1440p Samsung G6 OLED 360Hz. What I learned is that 4K is awesome for sure, and for 32" or bigger it's needed. What I also learned was that I missed being able to hit those higher frame rates. For example, in Warzone at native 4K on the lowest settings I could hit 160fps avg. To most people this is more than enough... but once you see 240-300fps at 1440p, especially on an OLED... that sharp 4K image starts to take the back burner compared to the smoothness of 1440p at a much higher refresh rate.
I think you are correct. 1440p UW is still the sweet spot. The jump to 4k isn't worth the framerate hit and I like the ultrawide aspect ratio better anyway.
Without irony I say you must be an exceptional person.
Most people who aren't playing really competitive FPS games constantly wouldn't notice the difference above 120fps.
The difference between 60 and 120 is massive. The difference between 120 and 240? Hardly anyone could, never mind would, notice.
1080p on a 1440p monitor looks garbage because the pixels don't line up (1 pixel represented by 1.3333 pixels)
Get a 1080p or a 4K monitor, which can actually combine 4 pixels into 1
You can letterbox as well if your monitor is large enough. My monitor is 32 inches and 2K, but it's also effectively a 24-inch 1080p monitor. Which isn't too small; you can sit closer as well.
What about an 8K display, so it can combine:
8K: 1 pixel into 1
4K: 4 pixels into 1 (2x per axis)
1440p: 9 pixels into 1 (3x per axis)
1080p: 16 pixels into 1 (4x per axis)
720p: 36 pixels into 1 (6x per axis)
480p: 81 pixels into 1 (9x per axis)
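A small sketch of the scaling arithmetic in this subthread (my own illustration, not from the video). A source resolution maps cleanly onto a panel only when the per-axis scale factor is a whole number; the pixels-per-pixel figures in the list above are that factor squared:

```python
RESOLUTIONS = {
    "480p": (854, 480),    # horizontal factor is only approximate for 480p
    "720p": (1280, 720),
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K": (3840, 2160),
    "8K": (7680, 4320),
}

def per_axis_scale(src: str, panel: str) -> float:
    """Vertical scale factor; horizontal matches for exact 16:9 sources."""
    return RESOLUTIONS[panel][1] / RESOLUTIONS[src][1]

print(per_axis_scale("1080p", "1440p"))  # 1.333... -> non-integer, looks soft
print(per_axis_scale("1080p", "4K"))     # 2.0 -> one source pixel = clean 2x2 block

for src in RESOLUTIONS:
    s = per_axis_scale(src, "8K")
    print(f"{src} -> 8K: {s:g}x per axis, {s * s:g} pixels per pixel")
```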
I said "are you calling me stupid!?" Out loud. I'm sitting at my PC by myself.
Average 4k owner
It’s not stupid if you are happy with the performance in the games you play. I’ll receive my MSI MPG 321urx tomorrow and I can’t wait to play my old games in glorious 4K.
@@jonathansaindon788 I was more making the joke that I was so stupid as a 4K gamer that I was talking back to the video. 4K gaming is glorious! I'm waiting to upgrade my 12GB 3080 with a 5080 or 5090, then my PC will be perfect!
average 4k money waster
@@kush2023 womp womp, let him do whatever he wants with his money.
Your money your choice, his money his choice, lots of people should try to understand this.
I've been saying 4K gaming is stupid all along. The reason I say that is because people buy a 4K TV and then think that's what they'll use to game on, without realizing it cuts your frame rate massively. Let me try to reword that to be clear: 4K gaming exists because people buy 4K TVs. That's it. That's all there is to it. No real thought goes into it; it's just what people have, so people think it's normal somehow. Consoles and TVs have tricked people into thinking that 4K is normal. If you want good frame rates and a clear image, you want 1440p. Also, I have a 4090 with a 27" 170Hz 1440p monitor (Gigabyte M27Q). If your first reaction to that last sentence is "A 4090 is overkill for 1440p", you are completely missing the point. It means you don't like high frame rates and like low frame rates, because you bought a 4K monitor or TV to game with, because the number 4K is bigger than 1440p so it has to be better, right? No, it's not.
I play single player RPGs. The RTX 4080 does a very good job on those at 4K. High refresh rate just does not seem that important for these kinds of games while 4K HDR looks very nice and makes the worlds feel more immersive.
Yeah, and there's A LOT of stupid 4K marketing out there. In every promo, trailers...
It reminded me of the times when smartphone manufacturers were measuring d**ks over who had the most megapixels in their phones 😂
@@Immudzen That's a good point. If you have a highest end gpu and cpu 4k works rather well. Most people don't have a highest end gpu though.
@@pf100andahalf I would agree. For most people it is just not viable at all. But when you can use it the results are very nice. I would not go back but it was expensive.
@@Immudzen I'm a similar kind of gamer with a 4080 Super and a 4K 144Hz 32" monitor which I bought earlier this year. High refresh rate is very nice in some fast-paced action games, but there are plenty where I'd be happy running at 60-80fps with high details.
People go from a garbage LCD to some crazy featureful OLED then claim it's the resolution that makes the difference. Every. Single. Time.
Resolution makes a difference too. Are you playing at 720p?
Reminds me of everyone with an old $500 Dell laptop who upgrades to a $2k MacBook and then proceeds to tell you why Mac is the best.
@@PeopleDoingStuff huh? Do you have a point to make? Are you implying that Sony is the ‘MacBook’? Because reviews with proper measurements say otherwise.
A lot of people go from one end of the spectrum to the other and never actually understand why one is better than the other. Like going to OLED and thinking it's great just because it's 4K. Or changing to a high-end CPU and thinking it's just because it's a Mac.
Using decent quality IPS panels at 1080p, 1440p and 4k, the difference is very noticeable. 1080p to 1440p is probably a bigger jump, but both are notable.
I own a 4090 and a 4K 160Hz monitor. I made a huge mistake. Once you watch 4K, you don't want to watch a lower resolution anymore. I hate playing games at 60-80 fps, I just hate it. 100+ fps is the only acceptable fps, at least for me... And there are a LOT of games where my 4090 with DLSS and my i9 13900K can't go above 80 fps, A LOT. Keep 1440p till the 5090 or maybe 6090... Once you watch 4K, there is no turning back.
What do you mean? Just use Lossless Scaling frame gen, you triple the fps
I'm on a 4070 Ti Super and run 4K fine at 80+. You playing Cyberpunk exclusively or what? I'm playing flight sims, Arma Reforger, fighting games, Subnautica etc.
Why is 100fps that much preferable to a steady 60?
@@Leo-yn5fx You are seeing significantly more data per second, which makes the game look, play, and feel so much more fluid and smooth. I've been playing on 1440p 144Hz monitors for over 10 years, until last week when I went to a 4K 160Hz monitor, and before I got my new card I was around the 60fps mark in the games I was used to playing at 100+, and it looked like a slideshow. You can never go back after having 100+ fps and G-Sync/VRR, but if you've never had it you don't know what you're missing
@@Runk3lsmcdougal It really depends on the game for me. If a game is well optimized with good frame times, 60 FPS is super smooth. But as most games nowadays are optimized like sh*t, you need at least 80 FPS to get the same level of smoothness.
Do you want to see the complete view, or do you need to focus on small portions?
That makes the difference for the required resolution.
Unfortunately, customers tend to pay a lot for newer technology they do not even need...
I have a 2080 and it depends on the game for me. Anything that was made before 2018, my GPU can run at 4K/60 or 4K/120. But anything newer, it struggles. Like, I was only getting 30-40 fps in Alan Wake 2 at the lowest settings with DLSS Performance mode. So I definitely want to upgrade when the new 5000 series comes out
Which GPU will you buy from the 5000 series?
Doesn't this literally depend on your graphical settings? Sure, if you're playing at ultra, you'll struggle. But at med-high, it should still run fine at 60+. Also, upgrading right when a card comes out is a terrible choice since cards always drop significantly like 1-2 years later. I picked up a $500 card that was over $1k at retail just 2 years ago lmfao. IDK why people pay those prices when cards first drop.
@thelonercoder5816 yes it depends on the game engine used, settings, resolution you want to play. If you have an older gpu you will have to see what settings and resolution will get you at least 60 fps or more. People are ready to pay either all at once or use a payment service to pay for their pc build because they have saved for years to upgrade. Yes the cards will cost the most when they release and will be hard to find either for a few or several months. People will buy what they want no matter how long it takes especially if their pc is struggling to how they want to play their games.
@thelonercoder5816 for example my pc is in my profile picture. I went all out with my first all white high end build with the gigabyte aero oc 4090/ 7800x3d cpu/ 64 gigs of ddr5 ram in the white phanteks nv7 case. I paid off the pc using amazon's affirm and it took me 9 months because i paid monthly. I have been upgrading the cosmetics of the pc over time and just recently have bought the gen 5 crucial t705 4 terabyte ssd and the gen 4 Samsung 990 pro 4 terabyte ssd. I am paying monthly on those as well. I bought the phanteks screen, the phanteks vertical gpu mount, the correct fans which I also paid monthly as well. I am looking forward to the 3rd party tests of all of the next gen parts to see if I want to upgrade or not.
Once you go to 4K you don't go back.
I have 4K/1440p/1080p monitors on my setup, and I never use the lower-res monitors for gaming.
Way too blurry/aliased to my eyes
Yeah, that's why I'm waiting until I can get a PC strong enough to do 4k ultrawide in the future.
You can DLDSR+DLSS to get rid of the blur/alias. Whether you run 4k DLSS P (which basically any card would have to at least in some games) on a 4k monitor or on a 1440p monitor with DLDSR, you're getting basically the same result.
I had a 27-inch 4K 60Hz in 2016 and this year I went to a 32-inch 1440p 165Hz. Happy with the stepdown; it gives me a better/modern/bigger screen and saves money for the GPU. I don't play games as much as I used to, so overspending on a modern 4K screen doesn't make sense to me. The lower pixels per inch doesn't bother me.
@@masterlee1988 What PC specs are you going to buy for the 4K setup?
@@MrAnimescrazy Not sure yet as it's far out years from now. For now the PC I'm getting next year(AM5 build with at least RDNA 4 gpu) is meant to handle my current res of 3440x1440 better. All I know it's going to be an AMD build and most likely with an AMD gpu for the 4k ultrawide.
I play at 1440p 165hz with a 4080 and it’s amazing
Same here. 2k@120 is better than 4k@60 gaming. The smoothness is more important than slight visual upgrade.
Yeah I do as well and can literally run max settings and rt and still have smooth experience with high refresh rates. 1440 is literally the sweet spot.
@@White-nl8td I thought the same thing, until I tried it.
At least for me, 60Hz is enough in games where I want high visual fidelity. It is noticeable at first, but after a few hours it becomes normal again, while the higher res keeps being impressive. The difference between 1440p and 4K is much more noticeable than the difference between 60Hz and 144Hz.
@@lyxsm After gaming at 165hz for 8-9 years I literally can't even play at 60 fps anymore. I will probably eventually upgrade to 4k when this current 1440p/165 monitor I have dies, but it will be 4k/144+ hz and I will need a GPU to run it. I have a 4080 but I consider that kinda bare minimum for 4k/144hz currently.
4080 user here as well. I made the switch from a 27-inch LG 1440p 165Hz (180 OC) to a 27-inch Redmagic 4K 160Hz monitor (I lucked out on a used one in mint condition). At that size the resolution is incredible, but in a handful of games I play the GPU needs some help from DLSS to hit that high refresh rate. I couldn't be happier nonetheless. 4K is a lot more viable than it used to be.
The 3440x1440P resolution is where it is at for value gaming right now. Good screens can be found for less than $300 and most mid-range video cards can handle 3440x1440p fine.
What do you think the p in resolutions means?
@@GrainGrown "P" stands for "progressive scan", which pretty much everything does now, but there was a time when a lot of monitors and TVs only used interlaced scan (every other line in each pass), which introduced a lot of flicker. Probably pointless now to bother with the P.
@@GrainGrown Progressive scan: every row of pixels refreshes in a single pass. It's practically ubiquitous now. In the past, displays would interlace, refreshing every other row per pass. Hence 720i or 1080i.
yup, switched to ultrawide when oled monitors came out and dont regret it one bit
Ultrawide looks terrible; why not just get a 65"+ 4K display? What's the point in having a screen that's basically cut in half horizontally?
I've watched a couple of your videos and they are good! After a 22-year hiatus, I returned to PC gaming less than a year ago with a 7800X3D, 4070, 32GB setup leveraging a 1440p display and couldn't be happier. Well, actually I could considering that the 40-series Super was launched like 3 weeks after I bought the PC but I've gotten over it now. 😁
2:17 video starts
I got 4K monitors because I look at text all day. In games I run 4K with DLSS/FSR set to Performance or Quality, depending on the game. I wouldn't want to go with 1440p because of scaling issues on MacOS (yes, I do run both Windows and MacOS).
1440p is fine on Mac OS. Been running it for years on multi monitor setup for work.
As someone who upgraded from a 1440p 165Hz to a 4K 144Hz monitor last year, I can honestly say: if you're a budget-conscious gamer running at 1440p and wishing you could afford to game at 4K, don't worry. The visual uplift isn't like going from 1080p to 1440p. Yes, it's noticeable at first, and it's a "nice to have", but it doesn't offer the same level of beneficial fidelity uplift. I'm using an RTX 4080 at 4K; I think that's the minimum I would consider for a good experience at 4K (either with or without upscaling). One of the benefits of 4K is that upscaling is much more practical, so you can still get a 'better than 1440p' image with excellent framerates.
Playing only at 4K since 2016, when I got a GTX 1080; now using an RTX 4080 and that's still great. Even DLSS and medium settings at 4K look better than native ultra 1440p to me
I still run a 42" 4K with a GTX 1070 and enjoy it very much.
Yeah I just transitioned from 1440p to 4k and the most obvious difference was that the upscaling looked MUCH better in 4k than in 1440p. Upscaled 1440p had noticeably reduced quality, upscaled 4k still looked very nice. At 1440p I always preferred native. I still prefer native at 4k but DLSS gives me very good results for demanding games.
@@dkindig Yeah, I had the same expirience. Upscaling on 1080p was so bad that I thought it was useless, but on 4k it makes so much sense. Looks better than native 1080p.
@@golimonkey Exactly. In Metro Exodus and Witcher 3, DLSS is much better than native at 4K
@@golimonkey better than native? riiiight
Why are we insistent on ultra settings? 4k high settings would probably do the trick for many games and then turn on upscaling and go from there. I can play Cyberpunk on my 7900 XT at 4k without upscaling if I play at high. I don't even turn on ultra settings for most games. Too expensive vs high.
Yeah I play all of my games on the second highest settings.
Yeah I rely on high settings mostly(and do ultra either later on for stronger hardware or on non demanding games).
The problem with 1440p is that many people use their PC and monitor for multiple purposes. I have a 4K monitor mainly for productivity with my Windows and Mac laptops. I also connect my gaming PC to the monitor. If I run games, I can only choose 1080p or 4K for one-to-one pixel mapping of the monitor. Running 1440p on a 4K monitor causes pixel shifts.
Honestly if I was ONLY playing video games on my monitor I would do 1440p 100% and save money for top of the line system. However, since I also use my monitor for videos and homework the extra pixels make a Huge difference.
"Mom I need an rtx 4090 for School"
When I first got my 4K screen I just couldn't believe my eyes. It was like a literal window into the game; it didn't feel like I was watching something on a screen. Samsung Neo G7, 32-inch.
That's the effect of a proper HDR screen. 1440p would give the same experience.
According to Samsung's website, that's a curved display. How's your experience with fast motion, regarding black smearing? Does it suffer from backlight glow?
@@ghost085 It has neither but it's an old model with different issues such as scanlines. Nowadays there's better mini-LED screens to go for.
@@anitaremenarova6662 HDR ON is a different story. The PPI increase from 1440p to 4K is monumental. To my eyes, 1440p is borderline clarity while 4K is just worlds apart. Think of it as no anti-aliasing vs 16x MSAA.
@@ghost085 It took a lot of time getting used to a curved screen. I hated curved screens, but after using this one I wouldn't want to go back to a flat screen. It does smear, but not a lot; white text on a black background is a hodgepodge (while scrolling only). Backlight glow is unnoticeable even when ambient light is dim/off, but there's a very slight halo around objects which, again, is unnoticeable unless you're looking for it, though sometimes it's in your face and you just can't ignore it.
my advice is to get an OLED and skip this one unless you're getting it for cheap or afraid of the OLED burn in.
Well, your last video did convince me to upgrade to 1440p lol. Like it a lot, image does look crispy.
83" LG G4 4K gaming is awesome at 2.3 meter distance!! RTX4090 and 5800X3D are a nice pair with it as well, mostly 100fps-144fps in all games when using reasonable settings.
It's down to screen size for me. 1440p is probably the sweet spot for a 27-34" monitor, and you get damn good visuals and better performance. Consoles do look nice on a 4K OLED, but you are not getting true 4K; it still looks amazing though.
If you can afford to play then sure go for it... if you can't you don't really have a choice.
You've been able to do that since the 10 series. Just depends on the game and settings. Still on a 1080 Ti; I can still play most games at 4K maxed. 4K is not just turning everything up, forgetting about it, and expecting 200 fps. Gotta tinker around
Well, okay, so if I wanted to play at 4k, I'd need at least a 4070 Ti Super to achieve a semblance of a somewhat stable 60 fps _in most games_ and that would set me back 800€.
But then I'd need a 4K monitor as well, which would set me back _at least_ 400€ for an image quality that doesn't absolutely suck, AND I'd be stuck with a 60Hz monitor even if I were to downscale to 1440p. Not only that, but a 4K monitor with at least 120Hz (I searched for 100Hz first, found none in my immediate vicinity), so I'd be able to downscale to 1440p in case I wanted to play a fast action game and needed the extra fps, would set me back almost 1100€, and I didn't even look at the specs, so it's probably god-awful in terms of image quality.
Is gaming at 4k stupid in 2024? Unless you can throw money at the problem, yes, it is stupid.
I have that exact GPU and I'm still at 1440p. 4K only makes sense if you have the space for a TV-screen on your desk and have money to waste sustaining that resolution by upgrading every other year.
I mean it's always been like this. You want the most premium experience? Expect to pay top dollar.
If all you play are games that punish your GPU like AW2 and LotF, then sure. But personally I play plenty of less graphically intensive games and older games that run well into the triple digits of fps at 4K without DLSS on a 4090 and 7800X3D PC, to the point that my monitor is a bottleneck because the refresh rate is only 144Hz.
The solution is simple, if u have 90 series cards (30 and 40 generation) go for 4k, if u have 70/80 series cards go for 1440p, and if u have 60 series card stick to 1080p
I agree. I have an RTX4080 Super. I regard it as a 1440p card and only the RTX4090 as a true 4K card.
RTX 2080 super, i use it on Cyberpunk and play in 1080p locked to 60fps with ray tracing on high
Rx6700xt playing everything high / ultra most games... Chill 60-90fps 4k
Still running 2080ti at 4k, you guys need to optimise.
@@Just_Be_The_Ball DLSS Performance/Ultra Performance with medium settings and sub-60fps while disabling ray/path tracing isn't playing 4K optimised.
Also, before giving me an example of some 5-year-old game, give me your settings and fps @4K in any of these games:
Senua's Saga: Hellblade II / Horizon Forbidden West / Cyberpunk with ray tracing/path tracing
4K with my 4080 Super on a 28" screen is the sweet spot for PPI. No problem lowering settings from ultra to medium, like shadows or reflections or grass detail, for 100-120fps, super clear. Next upgrade will be a 4K 27-28" OLED. I cannot go back to 1440p; it's too blurry and distracting
There's a flaw trying to compare 4K DLSS Performance to native 1440p on the same 4K panel. The scaling of the 1440p native image won't be great.
Ideally you'd want very similar monitors at the same size (one 1440p and one 4K) for this purpose. Maybe you could get away with a larger 4K panel moved far enough away to look the same size.
PC: (3080, 10600K, 32GB RAM, 3TB storage, 850W PSU) $2600, about a 3-year-old build
TV: 4K 55" LG C2 OLED with G-Sync, $1800 on sale
Not cheap, but I can't think of any better gaming experience than 4K 60fps+ max/high settings on an OLED/G-Sync display. It seems the more you spend on a PC above about $2k, the less payoff it gives you, at least in a gaming aspect
Same 55" C2 here, I liked it so much I bought a 2nd 😂 With a 4090 I play mostly at 5000 or 6000 pixels across, 80-120fps (capped), sometimes without DLSS. Add frame gen in there and it's always very playable. That TV is now around 900 bucks and still better than any monitor I've had.
I've been using an RTX 3090 for 4K gaming on my LG CX 55" OLED 120hz/VRR display in the living room. There are only 2 games where it has struggled with 4K/60. Those games were Starfield and Cyberpunk 2077. You lower a couple of settings and use DLSS and it's fine. Most games easily run 4K/120Hz without issues.
4k ips vs 1440p oled? Which is better for gaming? 🤔
OLED all the way. The color on that sht is top tier.
if you put it this way, then OLED for sure 🤔
Get OLED all the way
Contrast > Resolution
That's not even talking about other image quality advantages.
If both displays are calibrated, I can guarantee that a 1440p OLED will look better than 4K IPS both with static images and in motion while providing more FPS. You can even lower graphics settings with an OLED, let's say, from high to medium and it still won't look worse, if not better, while providing you with even better performance.
Have not seen 4K IPS. But I just got a UW 1440p OLED, and it is a whole new experience coming from IPS. Playing games again from the start, to experience them on an OLED monitor.
The best way to reduce performance demands for gaming is to simply sit further away and use a bigger panel. If you prefer to game at about 3 feet away, which isn't all that far, you can get away with 1080p easily, as well as 90 PPI; you can also do 32" 1440p or 30" 2560 x 1080 at this distance and panel size. 90 PPI at 3 feet away offers the same value as 110 PPI (which is very common today with QHD displays). 4K is great for sitting very close to your monitor, meaning you can benefit closer than 3 feet, but it also increases the feasibility of larger panels (like 32") up to 48" reasonably; however, at the cost of needing minimum a $700+ GPU.
32" 1440p and 30" 2560 x 1080 are the sweet spots for larger-display gaming imho. You can get away with a $500 GPU in this performance category easily.
4K would require a GPU 150-200% more expensive to get a good linear frame rate and native image above 60 FPS.
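The distance claim holds up if you think in pixels per degree of visual angle rather than raw PPI, since apparent sharpness scales with both density and distance. A minimal sketch using the standard small-angle approximation (the ~29.5-inch figure for the 110 PPI case is my assumption of a typical desk distance, not from the comment):

```python
import math

def pixels_per_degree(ppi: float, distance_in: float) -> float:
    """Approximate pixels per degree of visual angle at a given distance."""
    return math.radians(1) * ppi * distance_in

print(f'90 ppi @ 36 in:    {pixels_per_degree(90, 36):.1f} ppd')     # ~56.5
print(f'110 ppi @ 29.5 in: {pixels_per_degree(110, 29.5):.1f} ppd')  # ~56.6, about the same
print(f'138 ppi @ 24 in:   {pixels_per_degree(138, 24):.1f} ppd')    # 32" 4K up close: ~57.8
```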
Using dldsr in 4k mode on a 1440p monitor improves the image quite well, but will probably reduce performance more than on a 4k monitor
I paid 1200 bucks to build a pc that games excellently at 1440p with proper ray tracing. I can’t really justify anymore money unless I’m using it for work. 1440p is definitely the sweet spot.
Agreed, most people don't have money to throw away just to keep maintaining that hungry res
for you..
Agreed, for the 'most-bang-for-your-buck' approach 1440p IS the sweet spot. Having said that, I just moved to 4k.
"sweet spot" is such a dumb phrase. It literally just means "i cant afford what I want so I will settle for this"
@@Dempig So… 95 percent of the planet? You may have endless cash to blow on wants but the rest of us have other responsibilities.
I always wondered why 1800p is not a thing. 4k is like 3 times more pixels than 1440p. We need a middle ground
4k is closer to double the amount of pixels.
1440 is perfect spot
4k has -2.15- 2.25 times the number of pixels as 1440p
Edit: 4k actually has 2.25 times the number of pixels, not 2.15
@@superpork1superboy771 *2.25 times the number of pixels.
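For the record, the arithmetic behind this thread works out as follows, taking 3200x1800 as the usual definition of 1800p (my assumption for the middle ground the original comment asks about):

\[
\frac{3840 \times 2160}{2560 \times 1440} = \frac{8{,}294{,}400}{3{,}686{,}400} = 2.25,
\qquad
\frac{3200 \times 1800}{2560 \times 1440} = 1.5625,
\qquad
\frac{3840 \times 2160}{3200 \times 1800} = 1.44
\]

So a hypothetical 1800p really would sit roughly halfway between the two in pixel cost.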
I'll get a 4K monitor when the RTX5090 comes out
Just get a 4k monitor when on sale/Black Friday. Can enjoy your new card. It also won't be such a big expense at once. You can lower the resolution to 1440p and increase the sharpening in the meantime if you cant run 4k smoothly presently. It'll look just as good as native 1440p.
4K is a motherfucker of a resolution
@@lsik231l Counterpoint: monitor refresh rates for 4K are spiking while prices are dropping. Probably best to wait until you buy the GPU for your monitor update. An unexpected problem with high-end 4K units is that the cable isn't rated for most lengths. Look into the issues with Gigabyte's most recent high-end monitor: mainly cable length limited to 1.2 meters and no Dolby Vision, all for $1,300. Gigabyte Aorus FO32U2P
no need to when you have the 30 and 40 series.
@@jackmills7758 nah. Which 30 series card is good for 4K in the next five years? None tbh. I'd rather stick with 2k.
4090 is the only card worth getting a 4K monitor for and it's best to wait for the 5090 at this point.
I just went from a 27-inch 1440p monitor to a 32-inch 4K monitor. So far my drop in performance isn't terrible. I was probably a little CPU-bound at 1440p, so I think that's why the jump to 4K didn't ding me as badly. Most games I'm at 80-100 fps at max settings...
Many people tell me 4K sharpness is simply the best, but I'm still on 3440x1440 ultrawide, primarily because I want the best performance-to-visuals ratio. I have a 4090/13900K rig, and even though I'm sure it's capable of gaming at 4K, I value the slightly higher framerates I get at a slightly lower-than-4K resolution, especially for very demanding new AAA titles like Black Myth: Wukong. More importantly, I'm so used to 21:9 ultrawide that I don't think I can go back to 16:9.
If you're as rich as you seem to be, Samsung made a 4K ultrawide OLED recently; look it up. It's as demanding as 8K, but it is the first actual 4K ultrawide
@@williamschlass6371 I almost bought this…but I'd need to upgrade from a 4080 to a 4090 or even 5090 lol. Stayed with my OLED CX for now lol
I'm at 139max framerates with 3090/14900K on Wukong, on 1080p
1080p has been great for so many years for me, and absolutely enough for the best performance.
But now, with more detailed games like Wukong, I wonder if it's worth it for me to try 1440p without losing too much framerate, because I just DON'T WANT to play under 100 fps. It's impossible for me anymore, because all my games have been above 100 for 5 years, and I CLEARLY see the difference between 60fps and 144fps.
So how much framerate do you lose from 1080p to 1440p? Should I finally buy a 240Hz 1440p in 2024? (Ghost of Tsushima was 180 fps and absolutely breathtaking visuals even at 1080p)
@@williamschlass6371 I'd prefer to be at a slightly lower resolution than 4K (if I'm not on 3440x1440 21:9, I'd be at 2560x1440 16:9) as I value the slightly higher framerates. It's not about being rich its a matter of priorities. I have a friend who wouldn't pay more than $500 for a graphics card but would upgrade his car every 2 years lol.
@@Faz-_ How come you're playing at 1080p on a 3090? But I fully understand why you chose to remain at 1080p for the best framerates. I used to think that way too, but I would say 2560x1440 is the 'sweet spot' for best performance with the best visuals. The reason I upgraded to 3440x1440 was that my 24" 1080p monitor seemed too small after some time. My original plan was an upgrade to a 27" 2560x1440 monitor, but when I saw a 34" 21:9 ultrawide monitor it was such a huge leap forward that I was enticed, with the added bonus that its 3440x1440 resolution was less demanding to drive than a 4K monitor.
I think you meant how much framerate you'll lose going from 1080p to 1440p. Have a look at TechPowerUp's graphics card reviews; they show performance (fps) differences when playing various games at 1080p/1440p/2160p resolutions. I don't think it will eat up too much framerate. I fully understand that you value >100fps framerates; it's the same reason I used DLSS Balanced rather than DLSS Quality in Black Myth: Wukong (a whopping difference of 125fps vs 105fps). I'm not into high-refresh-rate monitors, so I can't comment.
I went from a 60Hz 1080p 27-inch to a 185Hz 1440p 31.5-inch.
Massive difference.
Hell, I had to move stuff around to get some distance.
4k today is pretty expensive to run well, but it's perfectly doable. I'm the type of person who gets top-end hardware but spends a lot of time with it. At the end of 2022 I went from playing at 1080p with a GTX 1080 to playing at 4k with an RTX 4090, it was a pretty nice upgrade, and the frame-rates even improved quite a bit, even with the bump in resolution, but also thanks to DLSS. It was a massive upgrade and now when I play at 1080p again in my laptop connected to TV it feels horrible, but on the laptop screen itself its still fine.
The 90 series cards are going to sell less, the average consumer can't afford them. Nvidia also is taking advantage of the Biden administration failures and jacking up those prices.
1440p is the Golden middle. Excellent framerates and playable with Path tracing On.
I made the mistake of getting a 32” 4K60 LCD. I can’t go back, not just for games but video and photo editing as well. I just got a 4070 Super and 4K60 is fine in most games and some just need a little DLSS.
4K DLSS/Performance looks good, and all new games should support this. AMD should go for an upscaler tailored to their own cards to get it on par with DLSS.
🤣🤣🤣 Ummm... FSR? Hello?
@@dkindig FSR has the goal of working on everything from Nvidia hardware to PlayStations. Nvidia, Intel, and soon Sony have their own upscalers and don't need FSR. FSR can't compete on quality because it can't utilize the hardware properly, since it has to work on everything. Just drop that requirement and develop a new modern upscaler that works only on new AMD hardware. FSR as it is is good enough for legacy cards.
@@ZeroZingo All that aside, everything that I can find indicates that FSR is an AMD product and AMD is putting their eggs in that basket. So I would imagine that it is tailored to their cards (and poorly done). If they internally decided to use a less than optimal product then that was a poor business decision by them. They shot themselves in the foot instead.
@@dkindig FSR at one point in time set out to take over the upscaler business, essentially killing DLSS and other upscalers by being the only one developers want to implement. But it is not working out; AMD needs to rethink. Hopefully they can do that on existing hardware, otherwise they perhaps need to add more AI capability to RDNA5, if they choose to go the AI route. I would bet that upscaling will be part of the driver set in the near future, and developers will only implement one API from Microsoft for upscaling; it's called DirectSR. All cards will be able to upscale all games using their own algorithms in an efficient manner. But it will not look exactly the same, and AMD needs to be competitive. Having a jack-of-all-trades upscaler is not the way to do that.
Once you game at 4K, you'll never unsee it. I tried 1440p gaming after 4K and I just can't do it.
If it halves your frame rate, you for sure go back to 1440p. At least if it's an action game.
I'll take more fps over 4K any day. Feel is more important than visuals that I have to squint at.
try again
@zZiL341yRj736 no, I need both. No restrictions; it's 2024, 4K gaming should be the standard
Idk, I have a 4K with a 7900 XT and an ultrawide 1440p for my 6900 XT. Decent difference, but it's not as mind-blowing as you are describing.
I just bought a 4k monitor 🫠
What are your specs?
@@MrAnimescrazy 6700 XT :D 5800X, playing mostly Elden Ring, FH5 and Dota 2 :D
Same here. It's pretty awesome, and now I can actually test all the fancy upscaling.
Been playing at 3840x1600 since the 1800X came out. It really is a sweet spot between 1440p and 4K gaming: great image quality without suffering as much fps loss. There are always tricks to get the most fps by turning down a few things that have very little effect on visual quality, even without using upscaling.
If you're moving from a 4090 to a 5090, then 4K all day. If not, 1440p is the sweet spot if you want to save money and still not be able to tell much difference between 4K and 1440p. I consider myself a PC enthusiast, so my main rig has got to be in the top 10% for me to feel I can do anything I want without hesitation, e.g. coding, graphic design, etc.
Yup I will be upgrading from my 4090/7800x3d/b650e motherboard to a 5090/9800x3d or 9950x3d/x870e motherboard etc.
I'd say 4K is wasted resources that could go towards more path tracing or other visual improvements. I already didn't see a huge improvement from 1080p to 1440p, and I'd happily go back any time if that's what it takes to run a game at max settings. Thanks to my 4070 Ti Super I don't have that problem at the moment. But I literally can't play games without ray tracing anymore. Just started playing Starfield and quit after 2 hours; the really bad shadows and lighting just throw me off constantly and destroy my immersion and mood. I got spoiled by CP2077, Alan Wake 2, Dying Light 2, Fortnite with Lumen and RT, and so on. I just have to wait for UE5 Lumen titles to properly enjoy playing again. It kind of frustrates me that there are still games coming out in 2024 without a proper ray tracing implementation. Imagine, that tech is like 6 years old now. Back then, tessellation adoption was a lot faster. In my opinion there shouldn't even be an RT on/off toggle in any game anymore, just a plain low-to-high slider for RT. It should be as standard now as screen-space reflections were back then. Also, RT should move on to plain path tracing in every game, like in CP2077 and Alan Wake 2. That is so much better than "normal" RT. Devs need to start listening to the community and bring more of that stuff.
I've opted for 3440x1440p - at least I'm getting extra FOV for all that performance impact...
Tbh aliasing and TAA blur bug me more than lower-res shadows. Switching to 4K from 1440p was definitely an improvement on my 6800 XT (since I'm not gonna be ray tracing much with it anyway), but I will say much of the visual improvement came from going from IPS to OLED.
I have a lot of hours in Starfield and can tell you that in general the lighting in Starfield is broken all over the place.
@@NinjaWeedle Yeah, OLED makes a lot of difference. I see it on my smartphone. I've been waiting for gaming OLED monitors for like 10 years; now they're finally out, but far too expensive for me. I'm kind of strange in that department. I have no problem spending 1k on a graphics card or 2.5k on a gaming tower, but my monitors have always been like 80-150€, with my last 1440p monitor being an outlier at 250€ (reduced from 450€). I used my 1600x1200 monitor for literally 13 years until it broke in 2023, or else I think I'd still be rocking it. For me, every monitor above 300€ is price-inflated, and I keep waiting for a cheaper model, but then I can never decide which to get. But OLEDs start at like 700€. That is insane.
@@Micromation But 4k gets you almost the same horizontal FOV without chopping the top and bottom off...
Question: what about ultrawide 3440x1440?
I have excellent eyesight and do most of my gaming from my couch. I can barely see a difference between 4K and 1440p at that distance, so 4K is completely unnecessary, in my case. I'd much rather have the extra horsepower to boost FPS or play around with RT or various overindulgent Nvidia effects, haha.
This I can agree with. If you don't sit up close, you're not gonna notice the difference.
I somewhat agree... at gaming distance I can see just a hint of blurriness if I run my 55 inch at 1440p instead of 4K. I'd rather just run 4K DLSS Performance; it looks as good as native at a distance and runs like 1440p.
Eh, I don't think my eyesight is that great and I can tell pretty easily, but it's also easy to ignore, and higher fps is much more noticeable.
You don't have to be able to see huge gaps between individual pixels to tell it's not as sharp.
@@Frozoken It's not that it looks bad; you can just tell there's a haze over the image.
It all depends on the size of your screen. For most PC displays anything above 1440p has severe diminishing returns due to screen size, not worth the extra cost. If you are gaming on a 65" TV then 4k makes more sense.
Went from 24" 1080p to 27" 1440p. During gameplay I wouldn't say I see much of a difference. It all depends on the angle of view ie how close the screen is. The bigger the screen the farther off you sit generally. At some point the higher the resolution is, it gives you diminishing returns. It's the same with photography and printing your photo. You can print it really big and look up close but you can't see the full picture that way, you have to stand back.
There are of course exceptions like if you want more emersion in the game world, a wide screen with a curved display or multiple monitors or projectors to create a wider view etc. But I wouldn't want that as standard, it would be too overwhelming with fast paced action, I'd get nauseous and headaches.
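To put rough numbers on the distance argument: pixels per degree (PPD) is a crude but common model of this, with ~60 PPD often cited as the limit of 20/20 vision. A quick Python sketch, with the sizes and distances below being purely illustrative assumptions:

import math

# Pixels per degree for a 16:9 screen, given diagonal (inches),
# horizontal resolution, and viewing distance (inches).
def ppd(diagonal_in, horizontal_px, distance_in, aspect=16 / 9):
    width_in = diagonal_in * aspect / math.sqrt(aspect**2 + 1)
    fov_deg = math.degrees(2 * math.atan(width_in / (2 * distance_in)))
    return horizontal_px / fov_deg

print(f"27in 1440p at 28in (desk):  {ppd(27, 2560, 28):.0f} PPD")  # ~56
print(f"27in 4K at 28in (desk):     {ppd(27, 3840, 28):.0f} PPD")  # ~84
print(f"65in 1440p at 96in (couch): {ppd(65, 2560, 96):.0f} PPD")  # ~78
print(f"65in 4K at 96in (couch):    {ppd(65, 3840, 96):.0f} PPD")  # ~117

By this model, both resolutions already clear ~60 PPD from the couch, which lines up with people upthread barely seeing a difference at TV distances, while at desk distance 1440p sits right at the threshold.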
Gaming at 4K isn't the stupid thing.
Games being insanely unoptimised while still aiming to be highly demanding graphical showcases, that's what's stupid. But everyone pretends that's not an issue because upscaling is here to the rescue, so everyone can lean on it as a crutch.
I'm dreaming of playing at 5K; goddamn, things are sharp at that resolution, especially in VR. Hoping the next generation of GPUs will allow for 5K seamlessly.
The issue is monitors: there aren't any gaming 5K monitors, just professional/work-targeted ones, thus 60Hz and whatnot, which you don't want.
I do hope 5K becomes adopted in the future though, as something beyond 4K for when high-end hardware has no trouble doing 4K but 8K is still too ridiculous to feasibly do on anything new.
It's not GPUs holding back stupid resolutions; it's the fact that devs and most players would rather use that processing power to make games look better than have a worse-looking game at a higher resolution. So new GPUs will come, demanding games will match them, and those resolutions will still not make sense.
4k looks so much better that I would never go back to 1440.
how about 1800p
@@math3capanema 1800p is okay, but 4K is still better. I think it really depends on eyesight. My friends with glasses are fine with 2K, but I and other friends with 20/20 or better vision all prefer 4K (or better, I guess).
@@MacVerick Especially on an OLED TV!
Agreed. I upgraded from a 1080p monitor to 1440p and almost didn't notice any improvement. Returned the monitor, got a 4K display, and yeah, I would never go back to 1440p either.
I went from 1080p to 4K overnight when I went from my RX 5700 to an RX 7900 XT. It's so much clearer and noticeably better.
It comes down to preference. DLSS/FSR should be used in either way. IMO the target should be either 1440p or 4K high/ultra settings with RT 60+ fps in single player games. Of course if you are playing some kind of competitive shooter you can just drop settings to medium-low, resolution to 1440p/1080p, disable RT and go for that 90/120fps.
I've done upscaling multiple times in PC games and in emulators, and it was always a medium-sized jump at most. Even videos and screenshots show it's not a big jump. I watched comparisons on my 4K TV, and like I said, the jump is medium.
1440p is low res compared to 4K. The visual difference in terms of clarity in 27” between the two is night and day.
27" doubtful
I recently purchased a 27" monitor and it feels so big compared to my old 24". I doubt most gamers need it unless you're a fan of rapidly moving your eyes, especially if you have multiple screens.
@@kotekutalia Imagine people using a 32" monitor. I moved from 27" to 32" just for the sake of a semi-glossy OLED 240Hz 4K panel, but the size is too much for me 😅
@kotekutalia 24" is too small, and your eyes shouldn't be moving much unless your face is attached to the screen.
Unless you're getting over 96fps at 4K... it's a slideshow.
High refresh rate trumps resolution any day.
1440p@144Hz FTW !!!
How does downscaling from 4K to 1440p work on PC?
It works well; there IS an improvement in image quality, and I did this for a while at 1440p. I'm running a 4070 Ti Super and was using DLDSR (Nvidia 3D settings), rendering at 4K scaled down to 1440p. The quality was improved, but of course fps was limited by the render resolution (4K). I used it for a little while, and the biggest bonus was being able to see first-hand whether my GPU could run 4K well enough for me to move to a 4K monitor.
@@dkindig best part is 1440p upscaled to 4k looks even better than 4k downscaled to 1440p and will net you more fps
@@alvarg Yes, I've moved to 4K and have one really demanding title; DLSS has been exceptional, running at 67% scaling with fantastic image quality. FPS has increased by 30-40% in the most demanding areas. Next I'll be looking at 240Hz monitors. I'm running native in most titles though, and still getting good frame rates from the 4070 Ti Super; my monitor is pegged at 140 frames most of the time.
@@dkindig Are you using max settings?
@@vijayla21 Yes, but not using ray tracing or path tracing in anything. The system was built primarily for Starfield and Star Citizen. In Starfield at 4K native I get 55-65 in the Mast District; DLSS Quality puts me at 75-90 with excellent image quality, and once outside of cities fps picks up quite a bit. Star Citizen I haven't benchmarked in a while because of Master Modes, and digging my HOSAS out is a pain in the butt. Anyway, TL;DR: the 4070 Ti Super seems to be a great gateway 4K card for the price, if you're okay with everything else about 4K.
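For context on those DLSS numbers: the standard presets use fixed per-axis render scales (Quality ≈ 0.667, Balanced ≈ 0.58, Performance = 0.5, per Nvidia's published factors), so the internal render resolution at 4K output works out as in this quick sketch:

# Internal render resolution for the standard DLSS presets at 4K output,
# using Nvidia's published per-axis scale factors.
presets = {"Quality": 0.667, "Balanced": 0.58, "Performance": 0.50}
out_w, out_h = 3840, 2160
for name, s in presets.items():
    print(f"{name}: {round(out_w * s)}x{round(out_h * s)}")

# Quality:     2561x1441  (~1440p internal)
# Balanced:    2227x1253
# Performance: 1920x1080  (1080p internal)

That's why 67% scaling at 4K is basically rendering at 1440p, and why 4K DLSS Performance renders internally at 1080p, as mentioned elsewhere in the thread.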
Would prefer ultrawide for gaming over 4k 16:9 any day.
Same; it's why I switched to an ultrawide last year.
UW 4K is coming next year and should be widely available in 2026.
@@nickwu4384 Good to know. I'm still waiting for a 3840x1600 OLED at 38" or larger.
And I don't believe a 5120x2160 OLED at 38"+ will come that soon, and I can't imagine how powerful a GPU would have to be to drive it.
@@alexander3025 Considering I'm using a 2060 to run 4K, I think it will be just fine. I'm playing all the Yakuza games and JRPGs with DLSS Performance, and it stays at 60 99% of the time.
@@nickwu4384 Yeah, I'm looking forward to 4K ultrawide as well. But 4K ultrawide will be a bit more demanding than standard 4K, just like 3440x1440 is a bit more demanding than standard 1440p, so there's that.
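The raw pixel counts back this up; the 21:9 version of each resolution carries roughly a third more pixels (same height, wider panel). A quick check, assuming the usual panel resolutions:

# 21:9 ultrawide vs 16:9 pixel counts at the same vertical resolution.
pairs = [("1440p", (2560, 1440), (3440, 1440)),
         ("4K",    (3840, 2160), (5120, 2160))]
for name, (w16, h16), (w21, h21) in pairs:
    print(f"{name} UW: {(w21 * h21) / (w16 * h16):.2f}x the pixels of 16:9 {name}")

# 1440p UW: 1.34x, 4K UW: 1.33x - roughly a third more pixels to push.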
Been gaming at 4k for the last 3 years when I got an LG CX OLED 48” TV, and it’s been absolutely fine for the most part, especially with DLSS. And I can also run games in 1440p UW resolution on this screen if I need an additional performance boost. With a 4070 Ti Super most games in 4k push the 120fps limit of my panel. I have to turn up the settings to fully engage the GPU and drop the frame rate under the panel refresh.
I'll be honest: when I jumped from 1080p to 1440p I saw a pretty sizeable increase in quality. It was nice, and I gamed like that for a good 10+ years. I upgraded to a 32-inch 4K monitor last year. The increase in pixel density would make you think it'd be a nice jump from 1440p, but honestly I didn't really see much of a difference. Don't get me wrong, it's still a very nice monitor, but I just felt a bit let down. Even with the added benefit of HDR (when it works), it still feels somewhat lacking. To be honest, if I didn't have a 4090 I would have stuck with 1440p at a larger size.
I feel 4k doesn't really start to shine till you get upwards of maybe 55 inches or higher?
Here is my order of importance in gaming.
Framerate -> Ray Tracing (reflections only) -> Graphics settings -> Resolution.
I play at 1080p, 1440p and 4k and barely care about it.
Bro, you compared 1440p and 4K on a 4K monitor? Seriously? I'm beyond disappointed. I feel like I need a doctor after this trauma. Smh...
He's not trying to test how good a native 1440p monitor looks compared to a native 4K one. He wants to see the impact if you've already bought a 4K monitor and have to lower the resolution.
Where's my 1440 UW homies at??
Here.
Ran it for a while. Loved it. I want to go 4K UW now.
@@wadeere Yeah, I can't wait to get 4K UW someday (like years from now).
Sitting in the back, picking their noses, eating their boogers, and turning red in jealousy at the 4K Chads. That's where. Don't be sad, though- unless you're also a console peasant.
10:00 look at the back entrance in this freeze frame, the beams on top. So much clearer and more detailed, and that's 4K DLSS Performance with an internal render at 1080p - and it beats 1440p native by a lot.
Performance is the same.
It's so much easier to run games at medium and high settings instead of anything above that, especially at 4K, without too much of a hit on visuals. I'm using a 6900 XT and it does pretty well at 4K after tweaking settings and messing around with upscaling, or Fluid Motion Frames if you need it. Personally I think native res with FG looks better than FG with upscaling on top. I recently picked up a Hellhound 7900 XTX to game a little better at 4K, but sent it back because it couldn't detect that my Samsung OLED is a 10-bit HDR 144Hz screen; it was reporting it as an 8-bit non-HDR 120Hz screen for some reason. Also, part of the bracket that you screw into the case was bent, which was odd as the box wasn't damaged. The upgrade excitement is gone now, so I'm waiting for next gen.
Waiting for 2030, when Daniel will have a video: "Throw your 4K monitor in the trash, 8K is the only way to go..." :)
When 4K becomes the 720p of its time
Nah, in 2030 he will say: 4K is for people on a budget 😂
@@Z3t487 That's optimistic. 4K is becoming more and more distant with all the new game-engine improvements; by 2030 nothing will change, because GPUs will only just be catching up to photo-realistic graphics + full path-traced lighting tech.
@@Z3t487 Believe it or not, in the early 90's SVGA (480p) was pretty damn high res. And we all thought it looked amazing next to 240p lol.
Hard to imagine when GPU makers are still throwing 8GB of VRAM at us on the most mainstream products. There are a lot of people still gaming at 1080p, and most of them not by choice.
4K isn't dumb lol
It is on any monitor less than 32”.
@@dauntae24 Yeah 32 inch should be the minimum for 4k. I'm waiting to get a 4k ultrawide in the future though.
The entire point of the video is that 4K can be dumb in some situations and not in others.
If you only have money for something like a 4060 Ti, then you probably shouldn't be trying to play at 4K. You're probably better off spending your performance budget on frames rather than on additional pixels.
4K is definitely way more viable than ever before though between some of the absurdly powerful GPUs on the market and how good upscaling is now.
@@dauntae24 No it is not, if you're sitting at arm's length from your monitor (which is recommended, btw). Even at 27 inches, 1440p is noticeably less sharp than 4K. The further away you sit, the less noticeable the difference becomes.
@@dauntae24 At that size it's still dumb; 4K is the standard TV resolution for a reason.
If you can afford it and you prefer it, there is nothing stupid about it. I'll never game on less; I've had 4K for years now.
I have a 6800 non-XT and I play most games at 4K, some with FSR. My TV is 4K and I sit 2 meters away, but tbh I tested 1440p too and I just can't see the difference from that distance, and I doubt people sit close to their 4K TVs... Only by taking screenshots and walking right up to the TV can I see more detail. So 4K is good if you have tons of money and can buy a 4080/4090/7900 XTX and use a 4K monitor while sitting close; then it's worth it. If you game on a big TV sitting far away, 1440p is more than OK, and mid-range GPUs can upscale to 4K almost perfectly these days if needed.