The question remains... Is the 32% uplift going to translate when Cyberpunk releases its Phantom Liberty expansion/overhaul? It's 32% now... but will it be more or less come the update?
My PC keeps crashing with 8GB of VRAM on a 6600 XT in D4. High settings on a 21:9 3440x1440 160Hz display. 90fps, but the VRAM is a big problem. It fills up from 7.2GB of usage to 8GB+ and then boom, hard crash. Is it the game's poor design, or is 8GB of VRAM the problem? Make your own decision...
I can include that, or the RX 6700, though the graphs just start getting congested at a certain point. The RX 6700 will definitely get a mention in the final review though. Great card.
I'll be blunt: you're either really bad at analyzing or you're intentionally misleading customers because you got some pay from Nvidia. The reviews are out; the embargo has been lifted. Touting the Cyberpunk best-case scenario even though you knew, or could have known, the post-embargo results is just embarrassing. Let me give you one word of advice: people aren't stupid. If you shill, lose your credibility, and people lose trust in you, that will significantly damage your future earnings potential. Short term vs long term. Choose wisely.
Back on topic: NO, 8GB of VRAM isn't enough; there are current titles where 1080p ultra settings need more than 8GB of VRAM or you get texture flicker (load-in of low-res textures). Yes, on average the 4060 is slightly better than the 3060. But ON AVERAGE still means that in VRAM-heavy games the 3060 performs better than the 4060, which is more than disappointing. The 4060 is really a rebranded and overpriced 4050 if you just look at the silicon. There is no reason to choose a 4060 over a 6700 XT for gaming. Not only is the lack of VRAM a problem, but also the small bus. And no, inflation does not justify the pricing.
I don't think you lack the intelligence to know these things. I do think you're misleading people for financial gain; I can't think of another explanation. Either way, not a good look. I'm curious to see if you delete this post or not.
Good video. Test idea: Max out VRAM and see if faster mainboard RAM helps against stutters while data is transferring between VRAM and RAM (installed on mainboard). Thanks for the video.
IMO this highlights just how bad the 3060 12GB was. It barely beat the 2060 Super and the only reason it's not a completely forgettable card was the additional VRAM that haunts the more powerful 8GB cards in specific games and settings. Does make me wonder if Nvidia is leaving the door open for an Ada round 2 akin to the 700 series or 20x0 Super refreshes. The arch overall seems incredibly potent, but every last card outside of the 4090 has been kneecapped pretty severely, there's a ton of room to slot Super style cards in the gaps without making the existing stock obsolete. The arch clearly has the headroom to deliver way more than it is right now.
You need more samples to compare the two cards, imho. Just watch, for example, the Hardware Unboxed video reviewing the 4060 Ti: on a 15-game average at 1080p, the 3060 performs as well as the 6600 XT, and both are faster than the 2060S and 2070 overall.
My 3060 laptop with 6GB of VRAM can run 2077 at 1080p ultra + RT Psycho with a 75 fps average (32 fps minimum) in the benchmark. So I really wonder what settings he used to get a 30 fps average from a desktop 3060; no DLSS 2 at all?? Extremely misleading tests, I must say...
I am impatient: I waited for the RX 7600 / RTX 4060 Ti to be released and grabbed an RX 6700 XT instead 😅 I wish AMD had released a 7600 XT instead. Oh well, it's done now; the card is on the way to my home, and I'm still happy with my purchase.
I feel like more memory won't help with this narrow memory bus (128-bit for both models). It will be an excellent card for 3D artists though, for OptiX/CUDA rendering...
I am perfectly happy with my 3060 Ti in my Alienware. My upgrade path is when a video card at the middle price point offers 2x the performance of what I have now; then it will be a meaningful upgrade. Maybe that means a 7060 Ti in a few years, but that is fine with me.
@@NoodlesTBograt Laptops are unironically good. In the GTX 10x0 era (also the time when mobile Intel CPUs could fry bacon), Alienware was actually the only laptop maker that implemented an appropriate cooling solution. Their laptops were the only ones that weren't throttling or shipping with lessened performance out of the box from lower power limits. But their desktops are indeed garbage; it's truly baffling how they can get some things right and others sooooo wrong. 😂
I agree with you 100%, Brian: to have a better GPU you do not need a whole lot of cheesecake. Hell, I am on a GTX 1060 3GB, playing the titles I like at 1080p on my 32" 60Hz TV monitor. Also, what hurts a lot of game enthusiasts is that they do not pay attention to the details of 1080p GPUs. The 4060 8GB, the 4060 Ti 8GB, and the 4060 Ti 16GB due in July are only for 1080p gaming, yet a lot of enthusiasts assume these 1080p GPUs can feed the high refresh rates of their monitors, then get poor FPS. And one thing to point out: Brian tested the 4060 and 4060 Ti (PCIe 4.0 x8) at 1080p, and his information is spot on; he gave gamers who have upgraded to high-refresh-rate monitors a taste of what their FPS will actually be. Overall, Mr. Brian, thank you, sir, for the correct information on the RTX 4060, RTX 4060 Ti, and the low-powered 3060 12GB. I can see myself buying this GPU, or any GPU aimed only at 1080p, because I am not upgrading anytime soon. Great and informative video, Brian.
Jay did a sponsored video of the sh*t RTX 4060 to change the perception that it's a good GPU. I am very, very disappointed in Jay; he destroyed his own credibility with his own hands.
Would be great if you could configure GPU memory like you do on a mobo: add/swap, etc. Not sure if it would add a speed bottleneck, but man, that would be great.
Thx for sharing this^^ I thought you could only drop numbers with those ray tracing settings and weren't allowed to compare... that's what I thought XD, but nice to see this graph. Especially the A750 looks awesome in that chart, and the RX 7600 too. I think this will shift a little with the DLSS, ray tracing, and frame generation stuff; with those, I guess it will place directly under the 4060 Ti, followed by the A750, man. Thx man, and cheers from Germany.
It's a piece of garbage, tech going backwards. I am upgrading a PC for a pal and he has 300 quid for a gfx card, so I'm putting a 6700 XT in it; it's just better value/performance.
FG is an early technology that will only be implemented in a few games until next-gen GPUs. And the main problem is that FG reduces image quality, just like DLSS. BUT! Nvidia forces everyone who's interested in ray tracing to buy it, because in some titles FG is required to tame crazy stutters: The Witcher 3, A Plague Tale, Hitman 3, Hogwarts Legacy. It's done intentionally, in partnership with Nvidia, to sell RTX 4000 cards. You can tell because some FG games are not CPU-limited with RT enabled (including CP2077, if you have at least a 12-thread CPU, which is today's gaming minimum).
Personally, it'd be interesting to see the 4060 compared against the 1660 alongside the 1060, 2060 and 3060 so owners of that half-step card have an idea where to put their cash for a new card. Personal experience is a 1660Ti is pretty much on-par with the older 980Ti in terms of 1440p100+Hz gameplay in some games like ME Andromeda on Medium settings (trying to squeeze out as many frames for smoother feeling gameplay by sacrificing graphics quality), without using things like upscaling to make a 1080p120 High output fit a native 1440p120 FreeSync monitor.
I just got the 1660 Ti to start blacking out in Borderlands 3 with max settings, V-sync OFF, uncapped framerate, and whatever other "smoothing" settings you can use. Otherwise, at 5760 x 1080 on 60 Hz, it's not a bad card.
I do not play any of the games reviewers use for benchmark comparisons. I played WoW for 12 years, quit when the Shadowlands pre-patch blew away a number of alts, and for the last 3 I've been playing GW2. Along the way, the various releases of Elder Scrolls (positive on story and scenery, big minus on anything besides looks differing in how your avatars play). My GPU was an EVGA 1080 Ti FTW3. I've used CRT, 1080p and 1440p TN/IPS monitors, Sony, LG and Samsung 4K TVs, and my latest is a Hisense 55U8G. Yes, eye-candy settings got turned down as graphics demands increased (and playing at 4K). I play what I play because I find it fun, NOT because of how pretty it is. Paying $300-500 for a GPU to "only" play games at 1080p seems extravagant. Nvidia would have us believe frame generation is the same as raw performance, snort. I can say my 2000 van can do 300mph***** (dropped off a cliff), so obviously it's faster than a Formula 1 car and therefore "better". It really feels like Nvidia is making the same "better than" kind of comparison using frame gen in their graphs. I personally am no longer in the GPU market as I bought a PowerColor 7900 XTX, but I do enjoy seeing what AMD, Nvidia and Intel are up to.
Wouldn't touch any of the RTX 4xxx cards, as they ALL have cut-down memory bandwidth (except the 4090, with its high price), and at least here in Canada they are still overpriced. Nvidia has given gamers the middle finger. You'd be better off buying an older RTX 3xxx or an RX 6800 XT or newer. I am still on a GTX 1080 Ti and have been thinking of upgrading, but just don't have the money atm. Yeah, I agree the RTX 3060 12GB is better value imo. RTX 4xxx = garbage unless you can buy a 4090. The 192-bit memory bus vs the 128-bit memory bus does make a difference, along with, as you said, more VRAM. 8GB is NOT enough for 2023 and beyond; 12GB will be the bare minimum imo. It just sucks that the 3060 12GB here is going for over $400+ Canadian.
I like cheesecake
Costco cheesecake
Ikea cheesecake 🧀
Woolworths cheesecake is the best. Waiting for it to defrost is a pain though 😅
Real late at night, cheesecake. Sensational.
@@cybersamiches4028 To be fair, Costco cheesecake is actually really good, especially for the amount you get. The only mainstream one I have tried in Japan that tops it... is the IKEA cheesecake lol. However, you don't get much for the money at IKEA.
Reviews weren't just limited to Cyberpunk at 1080p, but Cyberpunk at 1080p with RT Ultra and DLSS 3 Balanced with Frame Gen. Nvidia totally controlled the narrative to paint the 4060 in the best light possible
I'm still trying to figure out if this video is satire, or if Tech Yes just didn't know about the 4060 "preview".
Literally every company tries to paint their product in the best light possible...kinda silly to expect otherwise
They probably got better results than they expected...
What's the price difference in cheesecakes?
@@samgoff5289 That's like saying a new car review can only be done driving down certain roads, at a certain speed, and with no other people in the car.
1:44 Not "just" less VRAM, but less memory bus width (128-bit instead of 192-bit), less CUDA cores, less PCI-Express lanes (8x instead of 16x). It's overall just less, everything was downgraded. This should have been a 4030 and sold for under $99. At $300 Nvidia can go duck themselves.
It uses a 107-series GPU, which is what has normally been used in 50-class cards. GeForce 60-class cards have used 106-series GPUs since the GTX 660 (except the GTX 760, which used a bigger 104 GPU).
Pretty sure Jay couldn't do much with the preview and it was controlled heavily by Nvidia, similar to the Digital Foundry early 4090 showing. The early preview stuff is not the full picture and pretty much a waste of time.
The point of that is to manage expectations by preemptively setting them. Used properly, this business tactic can deflate hype when needed, or start it when the product needs it to sell. As with all things, it's a balancing act.
Well, Daniel Owen compared some Nvidia cards to the 4060 despite those limitations Nvidia set.
@@ae-qw5xi Yeah I think he mentioned that he used the exact settings Nvidia said to use for the 4060 on the comparison cards. His video seems like the most up front presentation about the limits placed on the preview from what I have seen so far.
@@kamikaze00007 It is basically a paid-for ad.
Any tech "journalist" who allowed Nvidia to control them in this way is not a journalist. They're a shill. Allowing Nvidia to control every single aspect of the "preview" is no different than just doing a paid commercial and reading a script written by Nvidia. Unsubscribe from any channel who accepted Nvidia's deal.
How do you get 32% uplift dividing 82 by 69? It is 1.188, meaning the uplift is 18.8%, which is far from 32...
Less cheesecake, and having it reduced to sucking it through a straw instead of being delivered on a wide fork, isn't the way I like my cheesecake. Also, a single slice of cheesecake costs as much as a whole cheesecake used to. I'll wait until cheesecake prices come down.
You make tasty points.
If Ray Tracing or power efficiency is important to you then the 4060 is superior to the 3060.
If RT or efficiency are not important to you, then you're better off buying an AMD card.
the only place the 4060 really wins is efficiency. In sheer performance the RX6700XT wins. In content creation, the 3060 12GB wins. In price, both options currently slap the 4060 senseless.
Maybe my math is off, but 69 is 16% less than 82.
82 is 19% more than 69.
Anyway, just can't wait to see benchmarks with less favorable settings and games for 8GB of VRAM.
And the state of 4060 8GB owners in a couple years :D
Also, who games at 1080p in this day and age? Decent 2K 144Hz monitors are going for $200. WTH are people using?
That's 19% uplift, not 30%. How is everyone missing this?
Exactly, that's a major f88kup.
It's more like 3-9% if you check Dracarys Gaming's videos, which makes the 4060 a competitor to a 6650 XT 😂. Why would the uplift be higher than from a 3060 Ti to a 4060 Ti? It gets worse the lower in the tiers you go; the 4050 will be at 4030 level. This should have been a 4050 for $200.
I don't know what calculator he was using, but indeed 82/69 = 1.188, i.e. an 18.8% uplift, not 32%!
Noticed this immediately lol
Was thinking the same thing...
Jay's earned two cents.
Two Cents from Jensen
Nvidia just sent him the script to read
@@kasmidjan Jay, Cents, Jensen, Jencents.
I smell a conspiracy. lmfao
Or thirty pieces of silver
@@how2pick4name 🤣🤣🤣👍👍👍
The most relevant NVIDIA card he could have compared it against would have been the 3070, because 20% to 30% is nothing to talk about when comparing it to the card it's meant to replace. Previous gens would see a 20% uplift, give or take, with a new card versus the previous-gen card from the tier up. Not to mention, it was just one game; not enough information to call this good or bad. Granted, this was obviously an NVIDIA cherry-pick to make the 4060 not look like a turd, like what happened with the 4060 Ti.
Would love to see 3070 comparison myself as well.
@@nomenicuss2091 The 3070 will beat this card!
If Nvidia only allows 1080p and Cyberpunk, then there is a reason for that!
100%...
I guess most other games will not perform like this.
I'm sure Jay wasn't allowed to show it against an RX 7600 like Brian did, and show a $260 (and falling) AMD card beating it.
@@Knebebelmeyer Exactly.
@@jurgengalke8127 Yes, he was 100% restricted to CP2077. NVIDIA was cherry-picking something that would show the 4060 in the best light. I just feel that Jay could have pointed that out in the video, but he seemed rather... prematurely positive.
@5:00 82 fps versus 69 fps is a "32%" uplift? In what universe is that true? Try an 18% uplift... Yikes. That thumbnail with the imaginary "30%" written on it is an abomination. Even Nvidia themselves couldn't have hoped for such favorable propaganda, with them only (misleadingly) claiming a "20%" uplift in their cherry-picked results (when in reality the uplift is below 15%, which we have known for months based on the effectively identical mobile 4060 and how underwhelming it is). Anyone with half a brain knows that Nvidia "allowed" the "preview" of the 4060 in Cyberpunk a couple days early since it shows it in the most favorable light (as VideoCardz wrote in an article yesterday, Cyberpunk is an "Nvidia tech demo"). Even then it lost by 9% against the middling RX 7600, according to Tech Yes City's benchmarks.
he tries to teach you about economics and inflation but can't do 5th grade math
A $300 card running a 2.5 year old game at 1080p. What a time to be alive.
Sure, if we ignore a lot about said game, the modes it's running under, and its updates. Look, there are easy, valid reasons to crap on these cards; we don't need stupid BS.
It is true that 300 USD is a hard sell if you happen to have a 3060, BUT JTC actually talked about the cheesecake when he showed us that the 4060 offers valid ray tracing performance, which the 3060 does not offer.
And J talked about the efficiency, which is part of the cheesecake's taste argument we get here.
If JTC's video somehow wasn't informative, then I don't know what this video is, sorry.
Nvidia should have given the RTX 4060 12GB of VRAM, a 192-bit bus, and PCIe 4.0 x16, and it would have been great at $300. I have a strong feeling that with its current specs the 4060 is going to be bandwidth-starved, and that will cut down on the longevity and life of the card. Personally, I would buy a 3060 or even a 3060 Ti (that one beats the 4060) before I would buy the gimped 4060. To each their own, and great video, Bryan.
The 4060 you have described is what Nvidia renamed to the 4070 12GB.
The 4050 was renamed by Nvidia to 4060 to jack up the price 😄
They did, they called it the 4070 and 4070ti.
I like your opinion...just bought an MSI 3060 OC 3-fan.
@@levlevinski596 please stop spreading stupid misinformation like this
@@FreddyRHernandez
So IYO, would it be fine for newer cards to just drop x16 altogether and use x8 like the 4060? Not a rhetorical question, I'm genuinely curious. If there are no performance differences, then PCIe 4.0 x8, which is basically PCIe 3.0 x16, would be enough even for high-end cards, no? So if that's the case, paying for a GPU with PCIe 4.0 x16 would be a waste of money, because you'd be paying for something you're only ever gonna use half of.
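For reference, here is a minimal sketch of the link-bandwidth math behind the "PCIe 4.0 x8 is basically PCIe 3.0 x16" claim. It assumes the standard 128b/130b encoding used by Gen3 and Gen4; real-world throughput lands a bit lower:

```python
# Approximate usable PCIe link bandwidth: per-lane transfer rate (GT/s)
# scaled by 128b/130b encoding overhead, divided by 8 bits per byte.
def pcie_bandwidth_gb_s(gen: int, lanes: int) -> float:
    transfer_rate = {3: 8.0, 4: 16.0, 5: 32.0}[gen]  # GT/s per lane
    per_lane = transfer_rate * (128 / 130) / 8       # GB/s per lane
    return per_lane * lanes

print(f"PCIe 3.0 x16: {pcie_bandwidth_gb_s(3, 16):.1f} GB/s")  # ~15.8
print(f"PCIe 4.0 x8:  {pcie_bandwidth_gb_s(4, 8):.1f} GB/s")   # ~15.8
print(f"PCIe 4.0 x16: {pcie_bandwidth_gb_s(4, 16):.1f} GB/s")  # ~31.5
```

Same ceiling for Gen4 x8 and Gen3 x16; the catch is that an x8 card dropped into an older Gen3 board only gets roughly half that, which is where the 4060's cut lanes can actually hurt.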
It wasn't just Jay. This was a "preview". Daniel Owen also did a video on it. It was approved by Nvidia.
Any tech "journalist" who allowed Nvidia to control them in this way is not a journalist. They're a shill. Allowing Nvidia to control every single aspect of the "preview" is no different than just doing a paid commercial and reading a script written by Nvidia. Unsubscribe from any channel who accepted Nvidia's deal.
Yeah but that doesn't change anything
One more reason why I don't watch Jay2cents anymore: he's just making a commercial for Nvidia to sell their under-spec card. People with a 3060 have no reason to upgrade to a 4060.
Frame generation will actually make you lose in games like Diablo 4 in PvP, at least according to those I know who have tried it (not with a 4060, but a 4090). It's good for singleplayer and horror games, but that's it; for competitive use it's useless, imo. Thank you for making this review so people can see how much of a scam Jay is. He's not making decent reviews anymore, but rather taking the money and making a commercial, borderline illegal and misleading, imo.
This 4060 with 8GB will age like a cheesecake.
😅
I'll keep happily waiting with my undervolted GTX1080Ti. Runs like a champ. Granted, I am not breaking any benchmark records with this card coupled to an X99 rig (w/turbo-unlocked E5-2696v3). However, it does everything I need it to do and it does it with aplomb.
Haha, I still use a GTX980. Lol
I wasn't gonna, but criminal scalping prices mean I'm patient... or really stingy!😂👍
1070 Ti for me. Definitely not buying these overpriced crappy cards. The 10 series was by far the best generation since its release. So many 20- and 30-series cards are dying while the 10 series is still running like a champ. People like us are the only ones helping keep GPU pricing under control.
That's funny because I use both a 1080 Ti and a 980. :D Likewise undervolted (well actually power limited, achieves the same result).
jb, it's interesting you mention X99 because I have a different 1080 Ti on X99 with an oc'd 6850K (4.4GHz, using a Rampage V Extreme). It's mainly an editing rig; testing it for games I found the CPU was indeed holding it back, but it was perhaps very specific to the games I play (a lot of Subnautica, which is single-thread-physics heavy, also GTA V and a few others). The gaming PC with the other 1080 Ti had been a 5GHz i7 2700K, which showed even more of a CPU bottleneck; it's now a 5600X/B450 (drives a 48" 1080p TV, i.e. sofa gaming), a change which doubled the performance I was getting before with Subnautica, with large speedups elsewhere as well. As always though, it depends so much on the game(s), settings, and definitely the resolution. The only game I play which does actually strain the 1080 Ti is RDR2; it's the first time since buying the 1080 Ti used back in 2018 that any title has made me contemplate something newer, but given current absurd pricing and product gimping, I can wait, and if I did get something newer that was much faster, I'd want the power draw to be lower by default, for which there's no viable option atm. I have CP2077 and various other newer games on the 5600X setup, not tried them yet.
The other gaming PC drives a 24" 1920x1200 IPS, it's used more for ED and various older games (Oblivion, Stalker COP and FC2). It has a GTX 980 on a Rampage IV Extreme and oc'd 4930K, still runs fine.
I'm still on X58 with an X5670 @ 4.4GHz and a GTX 1080.
@@DonzLockz The 6700 XT second hand is seeing some excellent prices, Rin. I got mine for 240 for the living room PC :D
HUB did reply in their Q&A video that this 'early preview' was essentially a curated ad using tech tubers to run it. It's fairly obviously a PR move, or perhaps some early damage control, because everyone expects the card to be underwhelming vs the 3060 and any other card you can buy for $300 USD right now. Like the rest of the 4000-series stack: good power consumption, nice-to-have features such as frame gen, and overpriced for the VRAM it has.
"GDDR6 VRAM Prices Plummet: 8GB of Memory Now Costs $27" (Tom's Hardware). These should be well south of $300 in all regions.
82/69 = 1.19, so the 4060 is 19% faster than the 3060. Or am I missing something? Where does that 32% come from?
If the card was $200 it would be fantastic.
$100 would be great 😊😮
If it's free it would be great
If everything is free then it's great
if they pay me to get the card it would be fantastic
You’re living in your own reality then 🤦♂️
It's worse than the 7600 in a very Nvidia-optimized title; that's DOA af.
You did forget the fake-frame DLSS 3.0 that makes it "great"
😂
@@haukikannel Frame gen is mediocre at best, but it works best if you're already extrapolating from a high framerate (70+). Frame gen from 30fps feels like absolute ass. It's the same with DLSS: it works great at higher resolutions, while at 1080p it's dog poop (like every other upscaler). Measuring performance with these features is like measuring your cock and incorporating the length of your spine in the measurement...
How is 69 to 82 a 32% increase?? Am I completely wrong here?
To calculate the percentage increase from 69 to 82, you can use the following formula:
Percentage Increase = [(New Value - Original Value) / Original Value] * 100
Plugging in the values:
Percentage Increase = [(82 - 69) / 69] * 100
Calculating:
Percentage Increase = (13 / 69) * 100 ≈ 18.84%
Therefore, the percentage increase from 69 to 82 is approximately 18.84%.
@4:50
@@edgarzakarian1649 he tries to teach you about economics and inflation but can't do 5th grade math
@@tyre1337 Anyone can have a brainfart. It's ok. Just mentioning it so it's corrected.
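For anyone who wants to verify it themselves, a minimal check of the arithmetic above, using the 82 fps and 69 fps figures quoted from the video:

```python
# Sanity check of the uplift math: 82 fps (4060) vs 69 fps (3060).
new_fps, old_fps = 82.0, 69.0
uplift = (new_fps - old_fps) / old_fps * 100   # percentage increase over old
print(f"Uplift: {uplift:.1f}%")                # -> Uplift: 18.8%, not 32%
```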
An RX 6700 XT used is around $300; pretty sure it would destroy the 4060, and Cyberpunk is probably the best result they showed.
Well, I got a 6800 for $215, and it's pretty much gonna beat the 4060 in every aspect.
@@Sambathgame Great deal. And it works, even better.
@@Sambathgame Dang, where'd u get that?!
@@rcvillapando In my country, but now the seller has raised it back to $300.
Well, it won't destroy Nvidia in Cyberpunk; AMD has absolutely terrible performance in that game. But in almost everything else, yes, it will beat the 4060. In AI/machine learning applications, though, the 4060 will be better.
Reviewers like Jay forget most of us are not interested in how it fares against a card from 6 months ago. Only a tiny number of people swap cards every gen now. We want to know how it fares against cards from 3 or more years ago. Especially important when dealing with the low/mid-range cards that most people buy. I rarely bother looking at reviews for the $1000 cards. No point.
They want Jay and others to make us forget that there is 1440p and that these cards should be easily able to run it by now. And that they are specifically hobbled to be shit at 1440p. And Jay did it. Which does leave a bit of a foul taste...
Nvidia is only allowing the early Cyberpunk data. Wonder how the other games stack up. Nvidia is going to need a lot more to show to make me part ways with my 1060 6GB. I'm seriously leaning more towards the Radeon 7600 when the price is right.
A much better use of your money, bro. The 7600 will outperform the Nvidia card in 8 out of 10 games and will do it at a much lower price point.
The 4060 doesn't only have less VRAM, but also a narrower bus. It's a highly controlled scenario for Nvidiapunk 2077.
The limitations they placed on Jay, with Nvidia only letting him talk to their advantage and not tell the entire story, were BS.
7:30 Are you seriously advocating for less VRAM? 🤨 Bruh.
There are games TODAY that have issues running at 1080p high with 8GB. Imagine games in 6 months, or a year. Any 8GB card in 2023 should be avoided like the plague! I stopped watching the video after that. 16GB VRAM should have been standard LAST gen, and it's not even THIS gen. Nvidia and AMD are actively pushing back the PC gaming market. Shame on them! And stop following Jayz2braincells. He's severely out of touch with the PC market and an Nvidia shill. Cheesecake is a BAD analogy that's only serving these companies' bottom lines. Do better.
Ok crazy guy.
82/69 = 1.188, an 18.8% uplift, not 32%? @4:50
Frame generation in its current state is something i see as useless. If they can somehow make it use your inputs in generated frames instead of delaying frames, i can see it becoming useful.
I say skip this gen if you are able to. If you can't or won't, then I think RX 7600 is a better deal at 250 (ideally at 200) than the 300-329 RTX 405... Sorry, I mean RTX 4060 (custom AIB models).
Unless you like RT; then AMD is out of the question. Then again, it's not like 60-class cards run RT well at all, but it's still way better with DLSS 2 and 3.
@@Jinny-Wa Yes, sadly we haven't reached the stage where the 60 class can hold 60+ fps in RT games... And I personally don't like fake frames because of the input lag. It's gotta be more noticeable with the 60-class GPUs...
@@Jinny-Wa No
You should skip all generations you can skip!
If you can skip the 4000 series -> skip; if you can skip the 5000 series -> skip, and so on.
Some people just cannot skip, and they need these 4060 and 7600 GPUs we have now!
The RTX 4070 is the real RTX 4060. This is an RTX 4050 renamed as the RTX 4060.
8:16
You are not entirely correct that the RTX 4060 has "faster" memory. Technically the 4060 uses faster GDDR6 (17 Gbps vs the 3060's 15 Gbps), but...
The RTX 3060 has a total memory bandwidth of 360GB/s (192-bit bus) vs the 4060 with 272GB/s (128-bit bus).
The faster memory in this case doesn't make up for the difference in bus size. Not only do you get more VRAM with the 3060, but also more bandwidth.
You don't have to take my word for it; you can compare the memory specs of both cards on TechPowerUp.
DLSS 2/3 is nice, but few games support it, so my prediction is the 4060 is DOA like the rest of the 4000 series.
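Those bandwidth figures follow directly from the published specs. A quick sketch (assuming the TechPowerUp-listed effective data rates: 15 Gbps for the 3060's GDDR6, 17 Gbps for the 4060's):

```python
# Peak memory bandwidth = bus width in bytes x effective per-pin data rate.
def mem_bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    return (bus_width_bits / 8) * data_rate_gbps

print(f"RTX 3060: {mem_bandwidth_gb_s(192, 15.0):.0f} GB/s")  # 360 GB/s
print(f"RTX 4060: {mem_bandwidth_gb_s(128, 17.0):.0f} GB/s")  # 272 GB/s
```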
BRUH.... they DECREASED the bus size???? WTF Nvidia?? 🤦🤦🤦
Are you blind? The 4060 still handily beats the 3060 even with everything being smaller. Your argument is absolutely pointless
@@iequalsnoob I haven't mentioned and care little about the pre-release benchmarks. I was talking specifically about memory specs and included a timestamp in my comment.
The real benchmarks remain to be seen once NDA is lifted.
Well, the general public voted with their money and bought the 4060 in large numbers.
Not sure if you didn't watch his entire video but, he said a couple of times that this was just an introduction and would have a full video with comparisons and more numbers.
Hmm, in his video Jay said he was "given permission" by Nvidia to give an "exclusive first look"; I don't think that negates any embargo any other YT'er has for the 4060...
Geeze. 6700 XT beats this up and costs less.
The Cyberpunk preview is the best-case scenario for performance. It is a highly regarded game that is tuned perfectly to run on Nvidia cards and has all the bells and whistles the card can use; of course it will run fast with DLSS 3. But I did not hear how it feels to play. From what I've heard, frame generation increases input lag, and while the game looks smooth, it doesn't feel like you are playing at those high frame rates.
Another thing to point out is that the 4060 will probably not do as well at 1440p, being on par with the 3060 12GB.
I feel Nvidia started cutting corners since they thought they could get away with it. But it feels like they left a giant gap between their top-tier cards (4090, 4080, 4070 Ti) and their more mainstream cards (4070, 4060 Ti, 4060).
We need competition, not only on price but also on performance; the new gen of cards feels like a refresh, at least in raster performance. DLSS is great and all, but to me it's a non-factor unless it is implemented at a hardware/driver level rather than game by game. AMD needs to step up their game; they completely dropped the ball this gen so far. Intel is the only hope we have, but they will release their newest cards next year, so we still have at least half a year until we get decent mainstream cards.
To be honest, the better option is either to wait or, if you need an upgrade right now, get last-gen AMD or Intel. The 6700 XT and A750 feel like the best cards for mid-range and entry-level gaming respectively, and nothing released recently comes even close to their price/performance. While I understand that reviewers generally need to play ball with the companies, I hope this gets bashed once again so Nvidia lowers the price of the 4060, the 4060 Ti, and the 4070. Man, it sure feels bad to be a PC gamer right now, at least if you need a new graphics card...
Ummmm, is it just me, or is your maths wrong? 20% of the RTX 3060's frame rate of 69 is 13.8 fps, which would put it at 82.8 fps, which is basically what the 4060 is getting. That makes it 20% better performance at most, not 32% lol. How do you muck up maths that simple?
Nvidia is getting desperate, trying to manipulate public opinion with these restrictive early reviews that show the card in the best light.
I will say this: the 4060 is not a bad card if you are looking to upgrade from a very old card. That said, I am happy with my 3060, and I do not support anything with less than 12GB of VRAM.
Damn right.
In a hunger crisis, take the larger cake; in a VRAM crisis, take the 12GB card
Yesterday I was playing Ghostwire: Tokyo at 1080p ultra settings, RT off. It used 9GB of VRAM on my 6750 XT. Haha, keep buying Nvidia cards if you believe their crap.
Pretty sure there were some restrictions applied, like no comparative testing, etc.; similarly, he was only allowed to talk about 1080p performance. It was also not a review, it was a "preview/teaser", as stated by J2C.
Apparently it also had to be using RT and DLSS 3 with frame generation, which the 3060 can't do. Just the frame gen and forced high RT settings probably account for nearly all of its improvement. The 3060 isn't an RT Ultra-capable card, so they kneecapped it as hard as they possibly could to make the 4060 look as good as possible vs it.
I think it’s healthy to listen to those whom you disagree with
NVidia: the way Jayz2Cents was meant to be played.
How does one get an *RTX 2060 Super* to pull 206W in a game? Its TDP is 180W. My RTX 2070 Super pulls 205W *in Kombustor* @ 0.95V / 1950MHz (probably c. 225W at stock settings), and peaks at c. 170W (averaging less than that) in games.
I've been rocking an EVGA FTW3 3060 Ti at 1080p 280Hz for almost a couple of years now, and it still blows my mind every day. Not switching it up anytime soon.
Had a 3070 and had the best past year with it... really a great card... finished a lot of titles, but in Hogwarts Legacy it made me mad that at 1440p ultra the card gave me stutters and blurry textures.
So I exchanged this GPU and added 50 for an RX 6800, a beast for the price. Still miss my 3070 though...
YouTube tech review channels are a mess right now.
Alongside the graph of benchmark results with the 7800X3D, there should be a graph of results using a more price-appropriate CPU.
Daniel Owen already did what JayZ didn't.
A GPU for 300 USD that needs upscaling out of the box... It's a bad GPU. It won't last 2 years minimum... The more you buy, the more they (Nvidia) save.
Think I'll hang on to my 3060 12 GB for now - At least until I can get an idea of how both cards handle the upcoming Unreal Engine 5 titles. I play at 1080p 60fps, so maybe a CPU or RAM upgrade might be a better option.
Mine just arrived today, bro. The 12GB Rev 2.0 Gaming OC. So excited, but all I can do for now is look at it since my mobo is yet to arrive. Gonna Stable Diffusion the heck out of it once it arrives.
@@Feelix420 I like the idea of the 4060 drawing less power, especially as UK power prices are ridiculously high atm, but some games I play now use more than 8GB of VRAM. At 1080p, with 16GB RAM and a Ryzen 5500, most games play flawlessly at extreme settings. With Unreal Engine 5 games in development, I'll wait to see if it's worth an upgrade when the embargo lifts.
@@bigj1454 Bro, UK power bills are absolutely whacked atm, it's not even funny, all thanks to green energy. Maybe, gaming aside, it's time to overthrow some government, who knows... btw, typing this from my brand new keyboard and PC setup; feels nice to finally have a decent setup.
These results are going to change with the next big Cyberpunk update.
Did you forget that Nvidia only ALLOWED the comparison with the 2060 and 3060?!
Emulation would be interesting to see, PS2 and up. Apparently the Yuzu team chewed out Nvidia for the bandwidth cuts, and older cards with higher bandwidth would be better, at least for Switch emulation. I think a 25% uplift should be the bare minimum for a 60-class card, considering the 3060 was not an upgrade over a 2060, let alone on par with a 2060 Super... I imagine this will age like butter with that bus width though.
The 3060, while still not worth the price, performs a little better than a 2070
I've been waiting for the 4060 since the first 40-series launch. Can you please test War Thunder at 4K native without DLSS? Currently I'm on a 2060 6GB, and textures keep going missing and popping back in because of the 6GB of VRAM
Out of curiosity, why are you trying to run it in 4K on a 60-series card? Would you not prefer to run it at 1440p with more frames? Or simply get a card more capable of 4K?
@@TheSwayzeTrain Why not? War Thunder is an easy game to run; why go out of your way to buy a more expensive GPU when you're planning to mostly play older but still popular titles?
This is why I watch Yes man and TechYEScity 👌
I'm building a white, budget-oriented AM5 PC. I have an ASRock A620M PRO RS. I'm wondering if I'll lose more and more FPS in the future running the cheapest Ryzen 5 7600 versus the R7 7700. I already have a tower cooler + fan for the CPU, in white.
I really like your finance-with-tech analogy. Would also be nice if you provided links to the charts in the description. :D
But he seemed to talk about deflation - I think he's just confusing a reducing rate of inflation with deflation. So prices are still going up, just not as quickly as they were before.
At this point, Cyberpunk has already become Nvidia's playground... Oh, and I like cheesecake too
Please compare the 4060 to the 6700 XT. The 6700 XT can often be had for $310 or $320, only $10 or $20 over the 4060
What is there to compare? The 4060 will stutter in games like The Last of Us, Star Wars Jedi, and Hogwarts... because it doesn't have enough VRAM, unlike the 6700 XT
@@Ladioz ahh yes, *badly optimized games*
@@naipigidi When consoles have 12GB of VRAM available, yet the PC master race blames the game while running an 8GB VRAM GPU
@@saialexander1873 Exactly. People need to start building their PCs using the same or similar specs as the PS5 to enjoy games better
@@Ladioz they already do
PLEASE DO A 4060 VS 7600 VIDEO, I SEE NONE. People keep comparing the 7600 to the 4060 Ti and keep saying it's bad value. When the 3060 Ti came out it was $400, the same as the 4060 Ti. The 7600 launched at $280 and is $250 right now, and you get the performance of a slightly better 6650 XT, which cost $400, while the 6600 XT was $380. Please make it make sense. The 7600 XT/7650 XT aren't even out yet, and the 7600 is so close in performance to the 4060 Ti. DLSS 3 is the only thing saving the 4000 series right now; a lot of games don't have it, and maybe one or two of them are multiplayer games. In the end AMD always wins. FSR 3 is going to carry all the cards, like FSR 1/2 did for GTX users.
Please don't lie to people. There has been no deflation. Deflation requires negative inflation, not lower inflation. Any positive inflation number is additive on top of the same time the previous year. For deflation to happen, it would take negative 12% or more for an entire year to get back to where money has the same value it had in 2020.
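To make the compounding point concrete, here's a rough Python sketch; the annual rates below are placeholder figures for illustration only, not official CPI data:

```python
# Hypothetical annual inflation rates for 2021-2023 (illustrative only;
# substitute real CPI figures).
rates = [0.047, 0.080, 0.041]

cumulative = 1.0
for r in rates:
    cumulative *= 1 + r  # each year's inflation compounds on the last

# One-year deflation needed to bring prices back to their 2020 level
required = 1 / cumulative - 1
print(f"Cumulative price increase: {cumulative - 1:+.1%}")  # +17.7%
print(f"Deflation needed in one year: {required:+.1%}")     # -15.0%
```

Even a single month of slightly negative inflation wouldn't undo that; the whole cumulative increase has to be reversed.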
You scared the shit out of me with the filter on the Jayz part, it looked like artifacting 😱
I unironically think about grabbing a 3060 12GB now that they go for $200 second hand and using it for OptiX/CUDA rendering in Blender Cycles.
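For anyone tempted by the same idea, pointing Cycles at the card is only a few lines in Blender's Python console; a minimal sketch, assuming a recent Blender build (preference property names can shift between versions):

```python
# Minimal sketch: enable OptiX rendering for Cycles from Blender's
# Python console (use "CUDA" instead for pre-RTX cards).
import bpy

prefs = bpy.context.preferences.addons["cycles"].preferences
prefs.compute_device_type = "OPTIX"
prefs.get_devices()  # refresh the detected device list

for device in prefs.devices:
    device.use = (device.type == "OPTIX")  # enable only OptiX devices

bpy.context.scene.cycles.device = "GPU"
```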
Inflation is stabilizing, but prices are not going to come down on everything, at least not in America. The 7600 is a better card, except in ray tracing!
If you play at 1080p, go ahead and get an 8GB VRAM card. If you play 1440p or 4K and got an 8GB card, save money for the next 2 years, because you will have to buy another card to replace that 8GB one. As the guy said, it's good for "a couple of years". 2 years max and it's dead.
1060->4060 🤔
The 4060 is looking to cost what I paid for my 1060 when it first launched. That's ridiculous, as all other 60-series cards have been almost double that in my region.
Just wait for RTX 4060 12GB for a few months.
The question remains... is the 32% uplift going to hold when Cyberpunk releases its Phantom Liberty expansion/overhaul?
It's 32% now... but will it be more or less come the update?
My PC keeps crashing with 8GB of VRAM on a 6600 XT in D4. High settings on a 21:9 3440x1440 160Hz display. 90 FPS, but the VRAM is a big problem. It fills up from 7.2GB of usage to 8GB+, and then boom, hard crash. Is it the game's poor design, or is 8GB of VRAM the problem? Make your own decision...
Find someone with a 3060 to swap with
@@d9zirable This problem is just in D4 for me. I would rather sell my Blizzard account than go to team greed again...
@@MCGERHADL lmao
No 6700XT 12GB? 😢
Can include that or the RX 6700, though the graphs just start getting congested at a certain point. The RX 6700 will definitely get a mention in the final review though. Great card.
Out of curiosity, why 10th gen instead of an 11th gen i9? Are more cores better than PCIe 4.0?
I'll be blunt:
You're either really bad at analyzing or you're intentionally misleading customers because you got paid by Nvidia. The reviews are out; the embargo has been lifted.
Touting the Cyberpunk best-case scenario even though you knew, or could have known, the post-embargo results is just embarrassing.
Let me give you a word of advice:
People aren't stupid.
If you lose your credibility because you shill and people lose trust in you, that will significantly damage your future earnings potential. Short term vs long term. Choose wisely.
Back on topic:
NO, 8GB of VRAM isn't enough; there are current titles where 1080p ultra settings need more than 8GB of VRAM, or you get texture flicker (low-res textures loading in).
Yes, on average the 4060 is slightly better than the 3060. But ON AVERAGE still means that in VRAM-heavy games the 3060 performs better than the 4060, which is more than disappointing.
The 4060 is really a rebranded and overpriced 4050 if you just look at the silicon.
There is no reason to choose a 4060 over a 6700 XT for gaming.
Not only is the lack of VRAM a problem, but so is the small memory bus.
And no - inflation does not justify the pricing.
I don't think you lack the intelligence to know these things.
I do think you're misleading people for financial gain.
I can't think of another explanation.
Either way, not a good look.
I'm curious to see if you delete this post or not.
Will it play Command and Conquer Generals?
Where are the reviews of this card at 1440p and 4K? As I understand it, Nvidia has put an embargo on reviews at anything except 1080p.
Good video. Test idea: Max out VRAM and see if faster mainboard RAM helps against stutters while data is transferring between VRAM and RAM (installed on mainboard). Thanks for the video.
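One way to stage that test without hunting for the perfect game: pin down most of the VRAM with dummy allocations so anything else running is forced to spill into system RAM over PCIe. A hypothetical sketch, assuming PyTorch built with CUDA support:

```python
# Hypothetical VRAM "ballast": allocate chunks until the GPU is full,
# then hold them while a game runs, forcing it to spill to system RAM.
import torch

ballast = []
try:
    while True:
        # grab 512 MB chunks until the allocator refuses
        ballast.append(
            torch.empty(512 * 1024 ** 2, dtype=torch.uint8, device="cuda")
        )
        held = torch.cuda.memory_allocated() / 1024 ** 3
        print(f"Holding {held:.1f} GB of VRAM")
except torch.cuda.OutOfMemoryError:
    print("VRAM full; start the game and compare stutter across RAM speeds.")

input("Press Enter to release the VRAM...")  # keeps the ballast alive
```

Repeat the run at different mainboard RAM speeds, and any stutter difference should show up in the frametime graphs.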
Your ray tracing numbers for the 7600 look totally wrong, might want to check that.
IMO this highlights just how bad the 3060 12GB was. It barely beat the 2060 Super, and the only reason it's not a completely forgettable card is the additional VRAM, which haunts the more powerful 8GB cards in specific games and settings.
Does make me wonder if Nvidia is leaving the door open for an Ada round 2, akin to the 700 series or 20x0 Super refreshes. The arch overall seems incredibly potent, but every last card outside of the 4090 has been kneecapped pretty severely; there's a ton of room to slot Super-style cards into the gaps without making the existing stock obsolete. The arch clearly has the headroom to deliver way more than it is right now.
You need more samples to compare the two cards imho. Just watch, for example, the Hardware Unboxed 4060 Ti review: across the 15-game average at 1080p, the 3060 performs as well as the 6600 XT, and both are faster than the 2060S and 2070 overall.
My 3060 laptop with 6GB of VRAM can run 2077 at 1080p ultra + RT Psycho with a 75 fps average (min 32) in the benchmark. So I really wonder what settings he used to get a 30 fps average from a desktop 3060. No DLSS 2 at all?? Extremely misleading tests, I must say...
I am impatient. I waited for the RX 7600 / RTX 4060 Ti to be released and grabbed an RX 6700 XT instead 😅
I wish AMD had released a 7600 XT instead, but oh well, it's done now and the card is on its way to my home. Still happy with my purchase.
The 6700xt is a great choice
Waiting for a comparison of the 4060 Ti 16GB model against the 3060 Ti 8GB. Is there a big difference between GDDR6X and GDDR6?
I feel like more memory won't help with this narrow memory bus (128-bit for both models). It will be an excellent card for 3D artists though, for OptiX/CUDA rendering...
2:00 You forgot to mention the memory bus and bandwidth.
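For anyone curious, peak memory bandwidth is simple arithmetic: bus width in bytes times the per-pin data rate. A quick Python sketch (spec numbers quoted from memory, so double-check the official sheets):

```python
# Peak bandwidth in GB/s = (bus width / 8 bits per byte) * Gbps per pin.
def bandwidth_gbs(bus_bits: int, gbps_per_pin: float) -> float:
    return bus_bits / 8 * gbps_per_pin

print(bandwidth_gbs(128, 17.0))  # RTX 4060: 128-bit GDDR6 -> 272.0 GB/s
print(bandwidth_gbs(192, 15.0))  # RTX 3060: 192-bit GDDR6 -> 360.0 GB/s
```

That's how the newer card ends up with roughly a quarter less raw bandwidth than the one it replaces, before any L2-cache mitigation.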
I am perfectly happy with the 3060 Ti in my Alienware. My upgrade path is when a video card at the middle price point offers 2x the performance of what I have now; then it will be a meaningful upgrade. Maybe that means a 7060 Ti in a few years, but that is fine with me.
The fact you have an Alienware means you aren't qualified to give advice.
@@dangerous8333 People with alienware should be sectioned under the mental health act
@@NoodlesTBograt Laptops are unironically good. In the GTX 10x0 era (also the time when mobile Intel CPUs could fry bacon), Alienware was actually the only laptop maker that implemented an appropriate cooling solution. Their laptops were the only ones that weren't throttling or shipping with lowered power limits out of the box. But the desktops are indeed garbage; it's truly baffling how they can get some things right and others sooooo wrong. 😂
there are no upgrade paths on an alienware
I agree with you 100%, Brian; to have a better GPU you do not need a whole lot of cheesecake. Hell, I am on a GTX 1060 3GB playing the titles I like at 1080p on my 32" 60Hz TV monitor. Also, what hurts a lot of game enthusiasts is that they do not pay attention to the details of 1080p GPUs. The 4060 8GB, 4060 Ti 8GB, and the 4060 Ti 16GB releasing in July are only for 1080p gaming, yet a lot of enthusiasts assume these 1080p GPUs can handle high-refresh-rate monitors, and then they get poor FPS. One thing to point out is that Brian tested the 4060 and the 4060 Ti (PCIe 4.0 x8) at 1080p, his information is spot on, and he gave gamers who have upgraded to high-refresh-rate monitors a taste of what their FPS will be on those high-quality displays. Overall, Mr. Brian, thank you, sir, for the correct information on the RTX 4060, RTX 4060 Ti, and the low-powered 3060 12GB. Yes, I can see myself buying this GPU, or any GPU aimed only at 1080p, because I am not upgrading anytime soon. Great and informative video, Brian.
Jay did a sponsored video of the sh*t RTX 4060 to create the perception that it's a good GPU. I was very, very disappointed in Jay; he destroyed his own credibility with his own hands.
What about an RTX 4070 TUF OC 12GB graphics card?
Would be great if you could configure GPU memory like you do on a mobo: add/swap, etc. Not sure if it would add a speed bottleneck, but man, that would be great.
Another awesome video, always appreciate the financials in the mix. Keep banging out the content
I got a 3060 12GB from a buddy two months ago for $280. I am not disappointed, especially since I'm coming from a GTX 970.
Seeing the CP 2077 benchmark I will buy one!
*Q:* Bryan, what's going on with the yen?
Thx for sharing this ^^ I thought you could only drop numbers with those ray tracing settings and were not allowed to compare... that's what I thought XD But nice to see this graph. Especially the A750 looks awesome in that chart, and the RX 7600 too. I think this will shift a little with the DLSS, ray tracing, and frame generation stuff; with that, I guess it will place directly under the 4060 Ti, followed by the A750, man. Thx, and cheers from Germany
It's a piece of garbage, tech going backwards.
I am upgrading a PC for a pal and he has 300 quid for a graphics card, so I'm putting a 6700 XT in it; it's just better value/performance.
If you have a 2070 Super can you throw in results plz?
If I had known that DLSS decreases VRAM usage, I would not have bought the 3060 12GB
Not by that much
It's not that big of a difference and the 3060 8GB performs worse than the 12GB model
@@andrexskin Yeah, but the 3060 8GB specs are also lower, not just less memory.
@@Groovy-Train Yes, that's what I meant by "performs worse"
@@gagec6390 good to know
FG is early technology that will only be implemented in a few games until next-gen GPUs. And the main problem is that FG reduces image quality, just like DLSS. BUT! Nvidia forces everyone who's interested in ray tracing to buy it, because in some titles FG is required to smooth out crazy stutters: The Witcher 3, A Plague Tale, Hitman 3, Hogwarts Legacy. It's done intentionally, in partnership with Nvidia, to sell RTX 4000 cards. That's evident because some FG games are not CPU limited with RT enabled (including CP2077, if you have at least a 12-thread CPU, which is today's gaming minimum).
Personally, it'd be interesting to see the 4060 compared against the 1660 alongside the 1060, 2060 and 3060 so owners of that half-step card have an idea where to put their cash for a new card.
From personal experience, a 1660 Ti is pretty much on par with the older 980 Ti for 1440p 100+ Hz gameplay in some games like ME Andromeda on medium settings (trying to squeeze out as many frames as possible for smoother-feeling gameplay by sacrificing graphics quality), without using things like upscaling to stretch a 1080p120 High output to a native 1440p120 FreeSync monitor.
I just got the 1660 Ti to start blacking out in Borderlands 3 with max settings, V-sync OFF, uncapped framerate, and whatever other "smoothing" settings you can use. Otherwise, at 5760 x 1080 on 60 Hz, it's not a bad card.
I do not play any of the games reviewers use for benchmark comparisons. I played WoW for 12 years, quit when the Shadowlands pre-patch blew away a number of alts and for the last 3, I've been playing GW2. Along the way, the various releases of Elder Scrolls (positive on story and scenery, big minus on any difference besides looks with how your avatars play). My GPU was EVGA 1080Ti FTW3. I've used CRT, 1080P and 1440P TN/IPS monitors, Sony, LG and Samsung 4K TVs and my latest is Hisense 55U8G. Yes, eye candy settings got turned down as graphics demands increased (and playing at 4K). I play what I play because I find them fun, NOT because how pretty they are. Paying $300-500 for a GPU to "only" play games at 1080P seems extravagant.
Nvidia would have us believe frame generation is the same as raw performance, snort. I could say my 2000 van can do 300 mph***** (dropped off a cliff), so obviously it's faster than a Formula 1 car and therefore "better". It really feels like Nvidia is making the same kind of "better than" comparison using frame gen in their graphs. I personally am no longer in the GPU market, as I bought a PowerColor 7900 XTX, but I do enjoy seeing what AMD, Nvidia and Intel are up to.
Wouldn't touch any of the RTX 4xxx, as they ALL have cut-down memory bandwidth (except the 4090, at its high price), and at least here in Canada they're still overpriced. Nvidia has given gamers the middle finger. You'd be better off buying an older RTX 3xxx or an RX 6800 XT or newer... I am still on a GTX 1080 Ti and have been thinking of upgrading, but I just don't have the money atm.
Yeah, I agree, the RTX 3060 12GB is better value imo. RTX 4xxx = garbage unless you can buy a 4090.
The 192-bit vs 128-bit memory bus does make a difference, along with, as you said, more VRAM. 8GB is NOT enough for 2023 and beyond; 12GB will be the bare minimum imo. It just sucks that the 3060 12GB here is going for over $400 Canadian.