Then release gpus that is actually good then? I had sold 5700 xt for an 2070s and couldn't be happier, driver working like an ass on AMD, perfwise? Not so much. Working software working like shit on AMD
I think this is probably good for the industry as a whole. We shouldn't want only a single company being able to produce the best chips in TSMC. They already have a lot going on between everything they are producing for Apple, AMD, Qualcomm etc. Samsung is likely to be heavily investing in EUV production at the moment, which is good. Some serious demand through Nvidia may help support this effort.
Samsung's been pumping out 7nm EUV products since Feb. In fact, Samsung is on their own 7nm++ Samsung's problem is that they're unreliable...in the sense that they'll shift production capacity around with priority for their own products.
AMD are gonna break a lot of hearts if they repeat their showing with GCN, Vega and Navi once again. Everytime people are convinced that this is the time AMD has an ace up its sleeve and Nvidia will finally be put in its place and then AMD just disappoints. Hope this time is different.
They might be breaking hearts if they do repeat it, but they come close to the competition and fairer prices, which someone who cares more about price like me would be fine with it.
@@NatrajChaturvedi I think that AMD dropped the ball back in those days mainly because it was putting most of it's resources to try and compete with Intel. Now that they finally have a solid footing against Intel, that freed up financial and manpower resources that were invested in the graphics division. Given the fact that they have awesome results from 7nm in the CPU market, the likelihood of Big Navi actually NOT dropping the ball this time is very high.
The upcoming console gen using TSMC probably helped with them really getting chummy with AMD. They have at minimum 10 million APUs between MS and Sony just for this year. Nvidia's volume is 'low' compared to AMDs volume between desktop and console.
Also TSMC probably haven't forgotten that time Nvidia publicly scolded their 42nm node for being "practically useless as a technology"... because ATI beat the crap out of them.... on that very same node...
@Ignatius David Partogi TSMC don't care about amd's profit margin when they are getting paided the same. And AMD is probably happy that they have dev support and sharing tech thanks to consoles.
I have no desktop and a single high end GPU costs 5 times the minimum monthly wage in my country. There's no way I could even dream of buying one of those. Yet I still watch every one of these videos. Go AMD.
@@hasnihossainsami8375 that's where you're wrong. I had an Alienware 15 in 2012 that literally did just that. On battery they downclocked a lot to be able to run the thing but when you plugged it in it drew 300 watts at peak.
@@youhackforme Dude, an Alianware is a very niche market. Most notebooks have discrete or mid range GPUs that for the price can't afford the necessary cooling to draw 300w. So yeah, AMD will probably take a big chunk of Nvidia's market share
It's shocking that by the end of the year AMD may have gone from a total laughing stock just a few years ago to being an industry leader in two markets against competitors which are both much larger than AMD.
Thats what happens when the best have no competition to worry about they raise prices cut costs and features and try to get them stock prices higher make the shareholders happy what choice does the consumer have ? well AMD filled that hole
@@Koozwad Incorrect. She is the brains, too. She has a PhD from MIT in semiconductor design, and was one of the first ever to prove the concept and viability of silicon-on-insulator technology, which is now used throughout the industry. More than that, upon becoming CEO of AMD, she carried on her predecessor Rory Reed's company reforms which greatly helped to turn them around (especially the diversification of their portfolio and the restructuring of their insane debt loads). She has changed things even more, in her own right, by furthering the diversification of their portfolio and also by restructuring the entire company itself. A lot of credit ought to be and is given to Reed for turning AMD around, but Su has proven his changes to be the right ones and has significantly built upon that foundation and steered the company in all the right directions. Lisa Su is the mouth, but she most definitely is also the brains of AMD. But she hasn't done it alone, either. She has been smart enough to surround herself with good people, to recruit the right people for their respective jobs, and to get rid of the people who have been perennial thorns in AMD's side and holding the company back for so long.
Just like how AMD surprisingly one-upped Intel with the Ryzen CPU's, I hope AMD one-ups Nvidia with the RDNA 2; if nothing else, to force Nvidia to compete by offering at lower prices!
Force Nvidia to lower their prices is what I can hope for at the least. This is one of the reasons why you ought to be real dumb to not root for AMD, regardless of which GPUs or CPUs you prefer having.
AMD are gonna break a lot of hearts if they repeat their showing with GCN, Vega and Navi once again. Everytime people are convinced that this is the time AMD has an ace up its sleeve and Nvidia will finally be put in its place and then AMD just disappoints. Hope this time is different.
AMD doesn't seem to try to undercut NVIDIA's pricing though. If NVIDIA increases prices in a performance class, AMD also does it. And let's hope they won't have that driver debacle they still have with Navi. A lot of people have their cards crash in a lot of games and switch to NVIDIA if they can't get it to work with driver updates.
so an 800w would probably be minimum required for a system like that and if you wanted your power supply to run as efficient as possible you're looking at a 1600w
@@jamestor6700 Peak efficiency and overhead state that with a 750w draw, you'd want a 1000w PSU, so even though you could get away with the 800w, you wouldn't _want_ to. It's not the most widely known concept, but the math for getting an "ideal" PSU is: Your expected wattage useage+33.25%, rounded up=the PSU you'd want to get (provided no other upgrades are expected).
@@RAW_Reality "The supplies achieve their maximum efficiency when operated at 50% of their load. In fact, the manufacturers guarantee the maximum efficiency only when the supply is run at 50% load." thats from sunflower-uk(dot)com as well as velocitymicro(dot)com youtube hates links which is why I used such an odd format
In hindsight..... XD Looking back to the 1080ti days, AMD published that they were bringing out an all powerful card; an NVidia killer. NV really stepped up with the 1080ti and crushed all AMD's hopes and dreams. I cant help but feel that NV have been very canny this time around aswell: all leaks prior to today hinted at an underwhelming card lineup, with leaked specs roughly 50% of reality... was that done on purpose not to troll us, the consumers, but actually to troll AMD into believing they had a chance?!
The reason Pascal was so good is because it was Maxwell on a full node lower than 28nm. Maxwell was originally intended to be manufactured on 20nm, but due to the transition to finFETs, never came to fruition and remained on 28nm. They had to make major architectural improvements to achieve a "normal" generational leap on the same node. Thank Maxwell for the 1080ti lol. It's crazy that AMD couldn't do on 14nm what Nvidia did on 28nm. There are many Maxwell gpus that could hit 1600mhz.
Mah boi was lookin slick AF on that opening scene and turned into an ex-nam vet alley bum by the end of it... That's how you know homework was done, sources where triple checked and leaks where double verified to facilitate straight fact spittage! I salute you Sir.
@R3TGx Covid has made it very hard for Aussies to get hardware, that and the stores are charging insane prices, so I split the difference and still came out ahead. Sold my Rift S too, again made another $150 over the original price :)
To me choosing between Nvidia and AMD will probably come down to the implementation of the RT and the new power connector, if I have to upgrade my PSU to get the Nvidia GPU and the cards compete reasonably closely, it will swing it in AMD's favour for me.
AMD did mention 50% performance per watt increase for RDNA 2. Taking into account that perf. per watt usually takes into account uArch and node... the node AMD will be using is N7P (which only allows either 10% improvement in efficiency, or 7% increase in performance - so, one of the two, but NOT both). 40% from uArch and 10% from node. 2080ti = 50% faster than 5700XT at 4K (for the most part). That means that a 40CU RDNA 2 = 2080ti at 4K (at 225W) - without an IPC boost. Dropping clocks by 25% would result in about 12.5% drop in performance, 50% drop in power consumption (since power consumption increases/decreases exponentially with clocks) and (maybe) leading to something like this: 40CU RDNA 2 = 12.5% slower than 2080ti at 4K (at 112.5W - 135W) 48CU RDNA 2 = 5-10% faster than 2080ti at 4K (at 157.5W - 180W) 60CU RDNA 2 = 31-36% faster than 2080ti at 4K (at 225W - 250W). 72CU RDNA 2 = 49-54% faster than 2080ti at 4K (at 292.5W - 315W). 80CU RDNA 2 = 75-80% faster than 2080ti at 4K (at 337.5W - 350W). That's if you take into account rumors of 72 and 80CU versions and RDNA 2 topping out at 300-350W... which could be difficult since yields wouldn't necessarily be too great because an 80CU version of RDNA 2 would DOUBLE existing chip sizes. Of course, you could gain something similar with increases in clocks instead, but you'd end up hitting the point of diminishing returns like this. If AMD gains an IPC increase of about 7-10% as well, that would either increase performance by similar amount on above values, or they would remain the same with roughly 14-20% drop in power consumption. This is just my own personal estimate... by no means should this be taken as 'accurate'. There's also the fact AMD will be using raytracing technology in RDNA2... which may or may not affect above performance. AMD will be using a hybrid method for Raytracing as mentioned here: www.tweaktown.com/news/66455/amds-new-patent-explains-hybrid-ray-tracing-approach/index.html#:~:text=This%20method%2C%20according%20to%20AMD's,based%20ray%20tracing%20be%20solved.&text=This%20is%20the%20chip%20that,with%20ray%20tracing%20support%2C%20too. The method seems to be using existing hw (which has a different layout and uses it differently) for Raytracing without needing additional space (and without a performance impact that NV experiences in RTX 2xxx series).
I'm on a 390X only because the manufacturer ran out of cooling assemblies for my 290X (one of the fans had a shot bearing) and offered me a swap via Warranty RMA. XFX has generally been good to me.
My last Radeo Card was a 7970GHz Sapphire Vapor-X. I loved it. Went on to an R9 Fury X that coil whined so bad i brought it back and took a 980ti home. Now I'm waiting for the new cards 👍
The end of last year was the only time in a long time that they've had serious driver issues. And one of the reasons it took them so long to resolve was because they simply could not replicate on their rigs the failures people were reporting. But they eventually did resolve them nonetheless. Besides that, I don't think there will be a repeat of that with this launch. For one, those issues were specifically affecting the first generation of RDNA cards. They have been resolved, and now these upcoming cards are RDNA 2. For another, RDNA 2 is the GPU used in the Xbox Series X, and no doubt they have been working closely with Microsoft to finalize and refine their software and driver stacks on both the console _and_ for PC.
@@madmax2069 oh really???? My 5700 is working fine. It needed a bios update when I bought it but that is the fault of ASUS, not AMD. Also, most reviewers have stated that they can't even duplicate these complaints. I'm not saying that the drivers aren't without problems but everyone seems to forget that Nvidia's TDR problems went on for YEARS.
@@colesherrill7472 Spoken like a true Nvidia fanboy! XD I own 300 GPU's from 9 different manufacturers from 1986 to 2019. AMD and Nvidia both had shit drivers and good drivers. On average, since the HD4XXX series, AMD's have been better. Not to mention longevity of support! XD
That is sad truth about gaming industry and graphics technology in general. Performance percentage claims are joke as todays flagships cant even double 3 gen older cards fps. I remember times when new games did not even support 3 gen old cards. Today you see people praising in forums if developer allows you to run gtx580 on high settings :p
so he got the power connecter, the cooler, the ports, the TDP, the CUDA count(5376*2=10752), memory speed, core clock, 4k perf, launch date, CUDA 11 right the RT performance and the extra features are off by a lot though. so i think its fair to say he got it right over 70%. I don't really know why people call him lying.
I hope AMD punches Jensen real hard this year. They got way too cocky with their demo RTX 2000 series. Glad i stuck with GTX 1080 Ti and skipped the RTX 2000 test run. Well see how it goes in a few months.
I'm still rocking a 1070 man, and I'm looking forward to these new cards. Hope for one of the best competition this year between the two companies where the consumers benefit the most! 😁
i dont know why but my gut feeling is that nvidia is still gonna pull this off somehow. i feel like the 2000 series was a pretty weak move but that this gen they will get back on track. who knows though we just have to wait n see
AMD still dont have a card that beats a 1080ti... I mean what else can we say... If anyone out there thinks that AMD is beating NVIDIA this year when NVIDIA actually releases a new generation on the same die they are just smoking far too much weed.
@@FinnReinhardt Yes, you will. Thing about it is that if you listen carefully to this reviewers words he has hedged against being called out on anything after the fact. Oh he suggests stuff, but does he outright come and say that his prediction is that AMD will win the next gen? No. He says shit like they 'could' win. I'm happy to tell you right now - THEY WONT WIN, AMPERE WILL BLOW THEM AWAY.
I just want whichever side ends up being more powerful, but I’d be nervous about going RDNA2 based on my negative experiences with driver issues with my 5700XT. They were solved eventually but it’s so hard to tell what’s a driver issue and what isn’t and it starts making you paranoid with every glitch.
Yeah that's my only concern, I have had driver issues in the past as well with them and I just don't know this go around. Maybe best to wait and see what happens.
I ditched amd because the hassle i had with the hd 4890, paying top Dollar for their top card and every other driver breaking something so i swapped it with a gtx 275, next upgrade was a sweet gtx 580 and then a 680 and then a 970, the 1060 and now a 2070 super, never looked at team amd again. Thats hiw much drivers affect a user to. Make decisions for years.
Im never going AMD card again, since i built my first computer 20 years ago now, ive had 3 ATI/AMD cards. And the drivers have always been piss poor. Nvidia always release new drivers with new games, and are quick on the ball. While AMD can stick with shitty drivers for weeks or months for new titles. And thats not even going in to all the other issues... I hope AMD beat nvidia, honestly i do. Cos that can only be good for the market and prices. But as long as nvidias top cards are still within the ball park performance/cash ill still rather get a 5% downgrade in performance for the same price, just to get a nvidia card instead.
@@z0lid Yeah im in kinda the same boat as you, I built my first PC same time round 20 years back and used to go with AMD but have only ever had issues with them. Last few builds with Nvidia have been really solid and I would prefer to take a 5% performence hit if it means stability this go around. Nvidia need to be taken down a peg and we need competition but AMD really need to up there game in the driver department.
I just want AMD to wipe that punchable smug smile off of Jensen Huang's face. And maybe Lisa Su can steal his jacket too. Hopefully AMD can keep the momentum going, and stay on top for around 3 generations of cards or so, so that they can build up a war chest on the GPU side, just like they are trying to do on the CPU side.
What I'm most excited about, is games being built from the ground up for the RDNA architecture, since Xbox and Ps5 will finally be using the same chips as PC. Games should be very well optimized!
@TurboCMinusMinus usually game devs develop with the lowest common denominator in mind when creating a multiplatform game. So I don't see crossplatform gaming being that much affected by the SSD, there will.be minor differences, with the gap really showing in 1st party titles for the PS5 doing some crazy stuff
Tbh Price is really the only front amd can really compete. They only Start having raytracing and dont have things like DLSS etc. When it comes to drivers amd to this Point was not close to competing with Nvidia for 10+ years. Im happy if they get their shit together but i dont believe in Miracles.
@@TrueThanny Yeah, they've already doubled their share price this year. But of course the problem with buying stocks right now is another crash is likely to happen. So would be better to wait for that. But then it might not. What to do!!
For the love of God, I hope the drivers are good. I've had quite a few of my friends skipping the 5700/XT and going with a 2060/2070 just because of the issues that has/had been present.
Definitely a fair criticism. I struggled for quite a few months. Only to find my biggest issue was my Ryzen OC. PBO + AutoOC is apparently broke on my mobo and caused countless crashes(only while gaming of course). Thankfully Tom made a video about that a few months back and I got it fixed.
That was me right there. I had to return my 5700xt after a week because the amount of issues I had with that made me really mad. I bought a 2070 Super right after, and it worked right away.
@@SteveHonakerII likewise, 2950x build on Zenith Extreme and ASUS rog strix 5700xt, some minor issues along to road but ended up tracking down the issues with mobo, ryzen master, win10 and Radeon all not wanting to jive
Not everyone uses these cards for gaming. One area where AMD has been unable to compete is with high end 3D rendering engines like Arnold GPU, Octane and Redshift. There are a whole lot of VFX artist and animators that use GPU based rendering in software like Houdini, Maya and C4D etc. These rendering engines use CUDA and recently RTX to accelerate 3D rendering many many times over what CPU based rendering is capable of in the same time frame. With RTX for instance we have seen speed increases of up to 800% in Octane. Unfortunately for the professional market none of AMDs cards are supported by anything other than AMDs own ProRender at the moment which almost no one uses because quite frankly it sucks. It’s slow and it’s lack of features and lack of studio pipeline integration ( no cryptomattes, lack of AOVs etc ) have made it pretty much DOA. I realize to most gamers this is probably a non consequential edge market but Nvidia’s hardware has truly revolutionized the CG and VFX industry making it possible for small and medium sized studios to be able to compete with bigger players just by cramming workstations with a few GPUs each and giving small studios the final frame delivery capabilities that have up until now only been possible with expensive render farms. I built a workstation recently that has 4x 2080ti’s and final frame delivery in a recent project was around 45 seconds per frame. When doing the same scene using CPU based rendering ( 32 core ) the same frame took nearly 72 minutes. This is a pretty big deal and unfortunately AMD doesn’t do the R&D or the code development that Nvidia does to make these types of things possible on their side.
Yeah it is true that AMD suffers in that area. Simply because CUDA is more integrated then opencl. It's not like the cards would have less power to do things. Nvidia just did the right thing years ago and it became almost an standard albeit it's only on their cards. Trying to catch up to that seems rather impossible at this point in time even if AMD did invest money.
@@MrRyusuzaku Is not only that, the card are simply better at brute force computation. - double datapath and exec unit for concurrent int and fp32 execution (by thinking of how you usually work with vectors and calculus over indexes/pointers this is a killing feature) - RT core used to accelerate every algorithm that use BVH structure - tensor core for heavy matrix multiply They have more hardware functionality exposed than amd... the last trick in ampere A100 is the direct load from ram to L1 bypassing L2 and increased bandwidth of cache in general..
renderfarms and research centers alone buy more nvidia highend cards per year than actual gamers do , hence nevidia putting more and more focus into innovate technology like ai , rt, and cuda ... soon most autonomous cars will run with an nvidia card
Xtoor Gaming blender uses Redshift and Octane render too. Yes it’s got cycles for free but even but even cycles requires CUDA if your going to get any real speed out of it. Also when you start collaborating with studios on commercial projects many of them use Redshift and Arnold GPU as part of their production pipeline. If you want to earn a living then you have to adapt to their workflow and many if not most are using Arnold, and Redshift for production and Octane for look dev. They choose these render engines inside of whatever app be it Blender, C4D, Houdini or Maya because Nvidia / CUDA / RTX save a lot of time - with the bottom line usually being time = money and all ...
Rdna2 hasn't even been revealed yet. There was a lot of gimmicky charts in the nvidia reveal, but rasterization perfomance is still probably 40-50% faster than the 2080ti, which is what all the leaks said. Amd leaks said 40-50% as well. they will be competitive. ampere is the first time nvidia flagships have been competitively priced and it's probably because they know amd will compete
@@user-ps7ij6ge6d Everybody and there mom thought it was going to be 35% or less for $1200 to $1500. Instead it is roughly what you said. 40-50% on average, with more possible depending on the title. For $700. That is a MASSIVE difference from what anyone expected. Now also take into account that Nvidia is also doing this on an inferior 8nm Samsung node (which is closer to 10nm in regards to pitch and density), and they have tons of room for improvement. Why? Because Samsung 7nm EUV production is going to TRIPLE by years end. A 3080Ti that has better gaming performance than the 3090 is entirely possible (happened in the 10 series vs the Titan) within 8-10 months. Then a refresh of super cards immediately after. If we don't hear any credible leaks by the 17th when the 3080 is supposed to go on sale. It's almost 100% certain that AMD cant compete with Nvidia in terms of performance. They will only again be relegated to the low and mid tier. Where Nvidia will still out sell them lol.
@@user-ps7ij6ge6d the only leaks that have come out are wildly baseless. Literally the same B.S. that happens every year. AMD even did it directly with Vega. Hyped the crap out of it, only to disappoint like hell lmao. If AMD isn't even doing that THIS time. Then it's because they can't compete. I want to see spec sheet leaks from credible sources. Go ahead and save this comment so we can both come back after the 17th to see who was right. Edit: Price is VERY relevant when you are talking about cornering huge swaths of the market. No one thought that Nvidia would price their chips this cheap for this level of performance. It can potentially back AMD up into a corner if their 7nm wafer costs were far greater.
@@DESX312 ampere literally costs the exact same as last gen. 699 for 80, 499 for 70. basically 1499 for 2080ti, at least in canada. Also the marketing team that was responsible for vega left, it is a completely new team now. Not to mention original navi actually did what it was supposed to. 2070s performance for cheaper. in canada, the 5600 xt was over $100 cheaper than the 2060 for the same performance. Sure, nvidia will have more performance but rdna 2 will definitely compete
I’ve been on Nvidia for so long I can’t remember the last time I had an AMD card (myb 12 years ago) I cant believe what I’m saying but given AMD momentum and performance for cheap, I’m certain that card will be worth it. If the benchmarks are close I’m buying AMD.
MY last AMD GPU was the 6950 and I ditched it because the drivers were bad, and I was getting god rays cut in half diagonally in Bulletstorm and some mods for Doom3 just would not load. I switched to the nvidia GTX 580 and was much happier. Now I've got an expensive Gsync monitor so I'm locked into Nvidia for the forseable future, so it doesn't matter how good AMD gets for the price, I'm forced to look out for the best value nvidia 3000 series card.
I'm not fanboying over AMD, I'm fanboying that someone would teach Nvidia a lesson. Intel already got their lesson, and the only reason why I don't want them to go bankrupt is that we need competition. Nvidia is horrible, but much more conning ways. Anyone interested should watch AdoredTVs video of Nvidia anti-consumer ... (don't remember the title, but that will find it), not that it's exhaustive on the topic, but might show light for many people. Nvidia is master at getting away with their horrible shit, and people forget how much problems Nvidia has had, even in the recent history.
Hmm... So Ampere might be bottlenecked by rasterization, while big navi - by rt? Battle of bottlenecks gen. Ib4 nvidia's pulling out tessellation trick again with underground rays or some shit.
I thought he said 40% increase performance in rasterization from Ampere? Honestly it looks like the 7nm vs 8nm is going to be what keeps it close. Nvidia getting clobbered by not having as much sway with manufacturers since AMD is in both consoles.
We're gonna have to choose between the two it seems like, sucks either way. If this happens I'm gonnna hold off on buying anything until the Super refresh and just wait to play Cyberpunk.
UE5 will break any shinanigans. Nanite and Lumen are serious technology disruption. Their engine would force the bottleneck into shader and IO performance.
@@johnwilliams615 its coming lol. People dont realize that AMD was basically bankrupt in 2014 and doing marginally better by Zen's launch. Its only been 3 years for them to prove they can deliver on time and have competitive products, which they do now in almost every segment of computing besides AI. CDNA is coming soon. That will be their zen moment in AI cards. Again, in only 3 years they've done this
That's a part of the puzzle not being talked about. The current state of Cuda core optimizations and their denoising software is amazing. I don't see that in AMD cards.... Yet! but we shall see...
D.B.C.|T1M0 you do realize that 300W is most likely for the TITAN/3090? That won’t be in laptops lol, also, that wattage is normal for the highest performance card
@@miguelpereira9859 you realize the amount of engineering that goes into laptops? Plus, remember, you're getting a monitor, keyboard and touchpad with them as well. Packing anywhere near desktop levels of performance into a chassis under 30 mm thick is insane, and warrants the extra cost.
But in reality being forced to do everything with sony may have given them higher yields given the 3070 price, and they where able to make insanely fast cards for cheaper. I hope amd kills it with navi but i think you put way to much emphasis on them being forced to go with a larger process as being a thing that could kill them when in reality seems like it had very little impact if not helped them
I've been saying to my father and to everyone I know that the we should really wait for the end of this year because GPU market is going to be insane and really worth the wait
If AMD can guarantee proper drivers this time around they'll win, but if I have to choose between a (much) slower (more expensive) but stable Ampere or a unstable but (much) faster (cheaper) RDNA 2, I'll have to go the former.
I don't think it's a driver issue. I firmly believe that from the majority of the complaints, the problem is a hardware bug that wasn't fully ironed out before the release date. That would 100% explain the randomness of the issue and is something AMD would absolutely never admit. Here's to hoping AMD's engineering team actually managed to figure out what the problem was and fix it because hardware issues at that level are an absolute nightmare to debug.
Half of these "driver issues" are people who are stupid and simply try to install their new AMD GPU and drivers on top of their OS with a bunch of stuff installed by Nvidia on it, which causes conflicts and errors.
@@LiveType I have always had far more trouble with AMD than either nvidia or Intel with bugs in their drivers, especially their OpenGL implementations. This has been the case for 10+ years writing graphics code, and having to spend so much time tracking down bugs and finding workarounds to broken AMD drivers.
@@shawnpitman876 that is so true. They install AMD stuff without clean uninstalling their previous Nvidia software, and the registry entries and bits of software left over start to conflict and wreak havoc. I used an HD 7790 for years and it never gave me any issues.
@@Aethid Oh, I'm not saying AMDs drivers aren't bad/worse than Nvidia's. It's just that 1st Gen Navi likely has a hardware problem on top of AMDs lackluster software.
I'm loving the 17:30 cut hahaha I'm really on the fence with next gen. It feels like the choice will be between pure rasterization (AMD) or a complete, futuristic software solution (Nvidia). If DLSS turns out as good as it's looking right now what's the point of having 10/15% more pure power with AMD if flipping a switch on Nvidia's side nets you 40% higher FPS with comparable image quality? That's really my concern.
Dlls can be good, but how Many game developer makes game that support %20 Nvidia GPUs? Not Many! There just Are too few GPUs that support that technology. The situation gets better. 5 more years and %50 of Nvidia GPUs will support it. But there still Are amd and Intel cpus that don`t... it may suffer eventually the same fate as PhysX and GSynch and eventually fade out. So it depends on how Many new games in a year will support dlls. 4 games / year? 5 games / year? We don`t know...
Glad you started using the diagonal background watermark! Don't let people steal your slides! Keep them coming Tom. I've already started to see tech news websites cite your channel for their articles.
Well... shit. 🤣 Nvidia trolled all the "leakers". This is why I don't like the rumor mill and speculations regarding phones, gpus, cpus, consoles etc. Just fucking wait for the official announcement and take all the leaks with a ton of salt. I hope AMD will come up with a good product, that can compete with Nvidia's lineup, but given the situation, I would be surprised if they do. In the end the consumer will win from this fight, so I say bring it AMD, because Nvidia has fucking brought it brother.
Well, I hope this is AMD's year, they have struggled for a long time, and competition is good for the consumers. Great Video! I hop you keep them coming!
AMD is for all intents and purposes first generation ray tracing, so I'm curious how their cards will perform against Nvidia in the ray tracing department. That and software/driver.
@@TimStCroix yep..they Will even have a dlss counterpart..they showed the "directML super resolution" with forza Horizon 3 (might as quelle work on older dx games which would be bonkers) and It looks amazing..they said they could use this gp-gpu approach instead of the dedicated cores one (we know Nvidia loved dedicated stuff ti cito out competition)..but if this really works with basically every dx11/12 title at driver level Oh boy It could be massive. That could also be a way consoles Will upscale their res to 4k..like the new and better iteration of checkerboard rendering..using rdna2 will mean they could use this even on pc
@Gareth Tucker I mean new to the AMD GPU lineup. Nvidia is on their second generation ray tracing with Ampere. So I'm just interested in how well AMD does.
Yeah i knew turing was a demo for ray tracing and now im in the market for new high end gpu, im looking at both options and ray tracing performance will be very important to me, also dlss 3.0 seems very nice especially that i play those games that has TAA, this will be very interesting.
@@h1tzzYT as I said in the other commenti, there Is a chance microsoft's version of dlss Will work with every dx title as It was shown with forza horizon 3
Overall that final Nvidia leaker had the most reliable info. The RDNA2 info at that time was almost completly wrong. The stuff that was somewhat correct can be safely attributed to luck or changes on AMDs side further down the line. Overall conclusion was far too optimisitc. Seen worse prediction videos: I'd rate 6/10.
Same with AMD CPUs. I remember reading it's illogical Ryzen could ever beat the mighty Intel. No way is they are nearly as competent as Intel as AMD was a different company back when AMD was relevant etc
Venâncio Ferreira amd has been bushing the limits of technology recently unlike Nvidia disappointing 20 series. With the insane prices with not much performance boost. AMD actually cares for consumers offering a lower price product with competitive performance to more expensive brands
@@fredreickweaver809 But AMD has been working on raytracing behind the scenes for the past 2-3 years for the consoles... So it could swing either way...
"My hair was worn out at the end" Happens to the best of us. 😂😂 Jokes aside, some really spicy stuff in this vid. Really thinking about delaying to upgrade from my 980SLI once again if they are going for a refresh. But I badly need a new card, that "SLI" just doesn't cut it anymore for my 1440p 165hz gsync monitor. What would you say, get the top NV card and sell it before the new stackup comes out, or wait another half a year or so for the refresh?
@@DrakkarCalethiel well that is on you the newer models support both freesync and gsync usually. but i can imagine them to cost more. in any case nvidia is doing good so far
@@S55M3 Well if it is "G sync compatible" it will probably work with an AMD GPU. If it actually has Nvidia's proprietary hardware, I think you are stuck with Geforce. You could trade it for a Freesync monitor though or even sell it and use the money for a Freesync monitor.
Did it though? His Nvidia leaks were pretty accurate and AMD still hasn't even announced their new cards yet... I'll come back when they release theirs to make the judgement Personally I only really cared about the Nvidia leaks anyways because that's what I wanted to buy lol
Dude so glad your a full time you tuber now. Keep up the great work and heres to your channel growing larger. Note: Since Nvidia and AMD are competing with new software stacks, perhaps a future video comparing the differences between both companies. Ex: NVChache comparison is HBCC , Tensors cores comparison are Shaders and so on.
I came here to say just that. People need to understand, Nvidia has positioned themselves to be the leaders in super computers, they are a majority stake in the top 500 super computers. I realized with the 10 series that Nvidia will not take a second off. we watched them cannibalize themselves, going as far as to release two titans, when the 1080 ti bested the current titian, during the 20 series the released the "super" editions on top of ti editions... with the 3070,they've essentially recreated the 970, that card is insane... I'm glad my 1080ti allowed for me to skip the 20 series. a 3080 will have me set for a long long long time... 144+ fps ultra wide... 5120×2160... it's going to have to have another leap in 4-6 years for me to maybe upgrade that
@@FirstBornConservative for the enthusiast high end GPU segment, AMD is just a reminder for nVidia that they should not fall asleep. At best, AMD can beat RTX 3080 but nVidia will release a even more competittive card, maybe RTX 3080 with 12 GB of VRAM or similar after RDNA2 launches.
I still dont believe it.... damn its nVidia after all. nVidia to be the old AMD (pricing), unbeliavable, that was a surprise (not just price, even RTX 3000 specs).
@@trueminecraftfacts Dont trust companies in general..... marketing school teaching you basically how to betray customers and its actually not a joke, its fact just in "shiny" words. BTW I respect AMD/ATI for various reasons BUT they still betrayed/lied to the customers several times in some ways.....
I’ve always said gaming PCs make great heaters in the winter. You get basically 100% efficiency because anything that’s not used for processing is used to make heat. Nothing wasted.
09:28 This has been debunked. Nvidia is still in the process of acquiring specification approval from PCI-SIG to use this new 12 pin design, so it is NOT going to make it into the early launch Ampere series. Maybe a Ampere refresh on 7nm, 6 months later.
Hope this is true, competition is always great. Although, every time AMD has had a new architecture it has turned out to be mostly a let down. We'll see, though.
The upgrade to make the memory components (RAM, SDD card) work together to increase transfer speeds. Makes perfect sense. That way there will be less bottlenecks in the system anymore, brilliant. Good information my man!
I respect that initial conclusions can be revisited, from what I've heard I agree too, high power consumption and 8nm, that makes sense. Fingers are crossed they don't release a 3080ti at first and just a 3080, leaves them room to respond if Big Navi is any good or not (in their opinion)
I hope so too. At least it does give consumers that assurance that something better is coming rather Nvidia just ignoring AMDs advantage and only benchmarking DLSS and RT to say they are faster, then still selling for more. It would also make sense for Nvidia's image, if they just don't release their highest price tier even though Nvidia loses they can just say "our competeting product isn't out yet". Still that gives AMD time they have no direct competion for their top card but I think NVidia would prefer that than having their mighty xx80ti brand losing.
wasn´t 8nm to be around 51 Mtr/mm^2? This is a full node from turing cards, and only 17% more cuda cores?( Full dies :5376/4608 ) When turing (On the same node ) gave 20% (4608/ 3840) with comparable power comsumption, is samsung that shit of a factory?
I jumped ship from Intel to AMD this year on my build I put together in May with a 3900X. However it would take A LOT for me to consider the same going from NV to AMD. I've owned multiple AMD video cards in the past and they've generally always be problematic in one way or another (pretty much always drivers). The 480X I have in my backup gaming PC has been the best AMD card I've owned, but it's still had some overlay menu/draw issues in software like Autodesk Maya. Aside from AMD matching or beating NV performance, they would have to at least match them in feature set and drivers/software which I doubt will happen anytime soon or be near the quality NV has. For now I'd still never have an AMD GPU in a PC I had to rely on for anything outside gaming. If it has to do any sort of productivity work, I'll stick with NV until AMD can actually prove themselves.
Another thing to consider is that nvidia RnD is so much ahead of AMD. AMD always have to adopt technologies when Nvidia sets new standards. It is like everytime AMD starts feeling comfortable they get wrench thrown in.
And Nvidia need dlls support in games... how Many Nvidia GPUs support dlls... 20%? So do we get 3 games a year to support dlls? 4 games? 5 games? We don`t know. All in all not Many game developers make code Path that so few GPUs will support...
AMD's (good) relationship with TSMC seems to be a key to their current success on CPU and possible future success on GPU. Getting ahead of Intel and NVIDIA on miniaturization seems critical. I wonder if this was a master plan by AMD, or whether Intel and NVIDIA acting like arrogant buffoons was a free bonus?
@@clenbuterol4989 Tell that to the Intel CPU and NVIDIA GPU in my current desktop. I'm just calling balls and strikes. I'm sure AMD will eventually become arrogant and buffoonish as well in the future.
GT-AVIATOR im not really gaming too much right now...but whenever I do have some time to game, Im using my surface laptop 2 (gtx 1050) for very light gaming
Shreshta Jaiswal you may be right, but click bait titles and forming such a strong opinion before real news is even confirmed seems a bit disingenuous.
20:20 Henry Cavill sweating in his house somewhere...
Literal pocket change to him doe let's be fair
He'll just throw his 2080ti in the garbage when the new stuff comes out.
@@MrC77 he will sell it online to some bathwater buyer for the price of 4x3090ti, signed ofc.
He probably has the money to buy hundreds of 3080ti s
He is a multimillionaire, a 1000$ gpu wont hurt him.
He'll just give it to some random prolly xd
Dear drivers team, plz don't screw this
Oh, they will. It's AMD. :)
The reason they have messed up in the past is because AMDs software team is rather small so they take longer to fix issues etc
Ironically, on Linux the driver situation is the other way around. Everyone hates the nvidia drivers.
@@nissengummihone Yep Amd is better on Linux however everyone I know uses Windows so I guess it's not well known enough
What people fail to realise is Nvidia's driver team, is bigger than AMD in its entirety...put that into perspective...
AMD winning will be good for everyone, there's no denying that AMD forced Intel to do better, and the same thing will happen with Nvidia
Well, there is denying that... intel hasn’t done better yet lol
@@loganweist8250 That's Intel's fault now though, AMD did their homework, Intel is just slouching around
@Thomas Borisov To add on... AMD was working on their Zen architecture since 2012
When intel's chips named after lakes start to sink AMD's be Ryzen.
@@SRC267 Intel might be designing a new socket for their new tear lake lineup
Who's here after the official reveal livestream?
Farquaad was wrong. I'm sure he wanted Nvidia to show expensive, unworthy products.
This video didn’t age well
@@CHI-LO none of he's videos age well lol
@@alexfizz7402 Yeah. A lot of people always think that NVIDIA is always in the wrong and AMD is always in the right.
True Minecraft Facts they are definitely in the wrong now. Lol
Nvidia: “ignorance is bliss”
AMD: “we want to beat Nvidia but the laws of attraction don’t work like that”
Hello 2020!
Then release gpus that is actually good then? I had sold 5700 xt for an 2070s and couldn't be happier, driver working like an ass on AMD, perfwise? Not so much. Working software working like shit on AMD
I think this is probably good for the industry as a whole. We shouldn't want only a single company being able to produce the best chips in TSMC. They already have a lot going on between everything they are producing for Apple, AMD, Qualcomm etc. Samsung is likely to be heavily investing in EUV production at the moment, which is good. Some serious demand through Nvidia may help support this effort.
Samsung's been pumping out 7nm EUV products since Feb. In fact, Samsung is on their own 7nm++
Samsung's problem is that they're unreliable...in the sense that they'll shift production capacity around with priority for their own products.
17:24 he so confident in what he was saying that he had to change shirts
And again at 20:47... :)
That was pretty funny
@@louisfriend9323 and people still say he's a shill
This is not a Vega moment by the sound of it. This is Fermi 2.0. Better performance at unacceptable levels of power consumption and heat.
for the AMD side it's -Fury-X- , -VEGA- , -NAVI- , Big NAVI?
@@vh9network Probably more like the 5870 or R9 290X. If AMD prices sensibly.
AMD are gonna break a lot of hearts if they repeat their showing with GCN, Vega and Navi once again. Everytime people are convinced that this is the time AMD has an ace up its sleeve and Nvidia will finally be put in its place and then AMD just disappoints. Hope this time is different.
They might be breaking hearts if they do repeat it, but they come close to the competition and fairer prices, which someone who cares more about price like me would be fine with it.
@@NatrajChaturvedi I think that AMD dropped the ball back in those days mainly because it was putting most of it's resources to try and compete with Intel. Now that they finally have a solid footing against Intel, that freed up financial and manpower resources that were invested in the graphics division. Given the fact that they have awesome results from 7nm in the CPU market, the likelihood of Big Navi actually NOT dropping the ball this time is very high.
The upcoming console gen using TSMC probably helped with them really getting chummy with AMD. They have at minimum 10 million APUs between MS and Sony just for this year. Nvidia's volume is 'low' compared to AMDs volume between desktop and console.
Also TSMC probably haven't forgotten that time Nvidia publicly scolded their 42nm node for being "practically useless as a technology"... because ATI beat the crap out of them.... on that very same node...
@Ignatius David Partogi where are you getting the 1% profit number from?
@Ignatius David Partogi TSMC don't care about amd's profit margin when they are getting paided the same. And AMD is probably happy that they have dev support and sharing tech thanks to consoles.
@@emmetkeane3474 from nowhere! they're making up numbers to convey their idea
The current consoles apus already are made by tsmc. 28nm for regular consoles and 16nm for x and pro.
I have no desktop and a single high end GPU costs 5 times the minimum monthly wage in my country. There's no way I could even dream of buying one of those. Yet I still watch every one of these videos.
Go AMD.
AMD's 7nm GPUs may overtake Nvidia's in laptops. I mean, you can't just push hundreds of watts on a laptop just to win.
@@hasnihossainsami8375 I truly hope so. Can't stand Intel's iGPUs anymore.
@@hasnihossainsami8375 that's where you're wrong. I had an Alienware 15 in 2012 that literally did just that. On battery they downclocked a lot to be able to run the thing but when you plugged it in it drew 300 watts at peak.
@@gannonrosencrans5696 Brazil. Brazilian gamers need to rise up.
@@youhackforme Dude, an Alianware is a very niche market. Most notebooks have discrete or mid range GPUs that for the price can't afford the necessary cooling to draw 300w. So yeah, AMD will probably take a big chunk of Nvidia's market share
It's shocking that by the end of the year AMD may have gone from a total laughing stock just a few years ago to being an industry leader in two markets against competitors which are both much larger than AMD.
Thats what happens when the best have no competition to worry about they raise prices cut costs and features and try to get them stock prices higher make the shareholders happy what choice does the consumer have ? well AMD filled that hole
@@lucidx2 The problem was the leadership, or lack thereof. Lisa Su has turned everything upside down.
That's if his "sources" are true, I remember another ytber that they had all the shit last time and they were full of shit just for views
@@JustrazJD She's the mouth of AMD, not the brains.
@@Koozwad Incorrect. She is the brains, too. She has a PhD from MIT in semiconductor design, and was one of the first ever to prove the concept and viability of silicon-on-insulator technology, which is now used throughout the industry.
More than that, upon becoming CEO of AMD, she carried on her predecessor Rory Reed's company reforms which greatly helped to turn them around (especially the diversification of their portfolio and the restructuring of their insane debt loads). She has changed things even more, in her own right, by furthering the diversification of their portfolio and also by restructuring the entire company itself.
A lot of credit ought to be and is given to Reed for turning AMD around, but Su has proven his changes to be the right ones and has significantly built upon that foundation and steered the company in all the right directions. Lisa Su is the mouth, but she most definitely is also the brains of AMD. But she hasn't done it alone, either. She has been smart enough to surround herself with good people, to recruit the right people for their respective jobs, and to get rid of the people who have been perennial thorns in AMD's side and holding the company back for so long.
Just like how AMD surprisingly one-upped Intel with the Ryzen CPU's, I hope AMD one-ups Nvidia with the RDNA 2; if nothing else, to force Nvidia to compete by offering at lower prices!
Force Nvidia to lower their prices is what I can hope for at the least.
This is one of the reasons why you ought to be real dumb to not root for AMD, regardless of which GPUs or CPUs you prefer having.
AMD are gonna break a lot of hearts if they repeat their showing with GCN, Vega and Navi once again. Everytime people are convinced that this is the time AMD has an ace up its sleeve and Nvidia will finally be put in its place and then AMD just disappoints. Hope this time is different.
It's been almost confirmed for a while that prices aren't going down this Gen, they'll be the same.
So the 3060 will still be $350
AMD doesn't seem to try to undercut NVIDIA's pricing though. If NVIDIA increases prices in a performance class, AMD also does it. And let's hope they won't have that driver debacle they still have with Navi. A lot of people have their cards crash in a lot of games and switch to NVIDIA if they can't get it to work with driver updates.
@@shehrozkhan9563 Any sources? If you don't have any, it's anything but confirmed.
So an Intel processor using 330w and a gpu using 400w and.... a nuclear reactor to power it all? That's 750w without any other hardware. Holy shit.
That’s the middle setting on a space heater.
so an 800w would probably be minimum required for a system like that and if you wanted your power supply to run as efficient as possible you're looking at a 1600w
@@jamestor6700 holy shit, thats insane.
@@jamestor6700 Peak efficiency and overhead state that with a 750w draw, you'd want a 1000w PSU, so even though you could get away with the 800w, you wouldn't _want_ to.
It's not the most widely known concept, but the math for getting an "ideal" PSU is: Your expected wattage useage+33.25%, rounded up=the PSU you'd want to get (provided no other upgrades are expected).
@@RAW_Reality "The supplies achieve their maximum efficiency when operated at 50% of their load. In fact, the manufacturers guarantee the maximum efficiency only when the supply is run at 50% load." thats from sunflower-uk(dot)com as well as velocitymicro(dot)com
youtube hates links which is why I used such an odd format
Don't do that, don't give me hope
It has happened plenty times before, so it's bound to happen again.
A good Nvidia card would only drive AMD prices down, we win either way
@@andersjjensen but this time he talked to the person working in amd. Its not that he is estimating all this by himself.
Dont be fooled by this fanboy, NVIDIA just simply have the best technology. AMDs current cards still don't perform as well as an 1080ti...
@@Blight-fp3vt but their current card never claimed to beat the 1080ti but now they are doing with big navi.
In hindsight..... XD
Looking back to the 1080ti days, AMD published that they were bringing out an all powerful card; an NVidia killer. NV really stepped up with the 1080ti and crushed all AMD's hopes and dreams. I cant help but feel that NV have been very canny this time around aswell: all leaks prior to today hinted at an underwhelming card lineup, with leaked specs roughly 50% of reality... was that done on purpose not to troll us, the consumers, but actually to troll AMD into believing they had a chance?!
Nope.
@@firagabird ok lol
The reason Pascal was so good is because it was Maxwell on a full node lower than 28nm. Maxwell was originally intended to be manufactured on 20nm, but due to the transition to finFETs, never came to fruition and remained on 28nm. They had to make major architectural improvements to achieve a "normal" generational leap on the same node. Thank Maxwell for the 1080ti lol. It's crazy that AMD couldn't do on 14nm what Nvidia did on 28nm. There are many Maxwell gpus that could hit 1600mhz.
Mah boi was lookin slick AF on that opening scene and turned into an ex-nam vet alley bum by the end of it... That's how you know homework was done, sources where triple checked and leaks where double verified to facilitate straight fact spittage! I salute you Sir.
0:09 Tom: *takes a sip of tea / coffee* "I'm gonna be honest"
Me: Oh, here we go!
Sold my 2080 TI for $150 more than I paid for it :)
@R3TGx Lack of stock combined with COVID. Also Nvidia's impressive dominant mindshare.
*Stonks
@R3TGx Covid has made it very hard for Aussies to get hardware, that and the stores are charging insane prices, so I split the difference and still came out ahead. Sold my Rift S too, again made another $150 over the original price :)
To me choosing between Nvidia and AMD will probably come down to the implementation of the RT and the new power connector, if I have to upgrade my PSU to get the Nvidia GPU and the cards compete reasonably closely, it will swing it in AMD's favour for me.
im so happy for you dude!! lets gooo
40% with 300W sounds like crappy Samsung process
Sounds like high clock speeds (power consumption increases exponentially with clock speeds)
AdoredTV has a video speculating about the Samsung node Nvidia is using for Ampere, it might interest you,
th-cam.com/video/NTGkW9cRUKI/w-d-xo.html
AMD did mention 50% performance per watt increase for RDNA 2.
Taking into account that perf. per watt usually takes into account uArch and node... the node AMD will be using is N7P (which only allows either 10% improvement in efficiency, or 7% increase in performance - so, one of the two, but NOT both).
40% from uArch and 10% from node.
2080ti = 50% faster than 5700XT at 4K (for the most part).
That means that a 40CU RDNA 2 = 2080ti at 4K (at 225W) - without an IPC boost.
Dropping clocks by 25% would result in about 12.5% drop in performance, 50% drop in power consumption (since power consumption increases/decreases exponentially with clocks) and (maybe) leading to something like this:
40CU RDNA 2 = 12.5% slower than 2080ti at 4K (at 112.5W - 135W)
48CU RDNA 2 = 5-10% faster than 2080ti at 4K (at 157.5W - 180W)
60CU RDNA 2 = 31-36% faster than 2080ti at 4K (at 225W - 250W).
72CU RDNA 2 = 49-54% faster than 2080ti at 4K (at 292.5W - 315W).
80CU RDNA 2 = 75-80% faster than 2080ti at 4K (at 337.5W - 350W).
That's if you take into account rumors of 72 and 80CU versions and RDNA 2 topping out at 300-350W... which could be difficult since yields wouldn't necessarily be too great because an 80CU version of RDNA 2 would DOUBLE existing chip sizes.
Of course, you could gain something similar with increases in clocks instead, but you'd end up hitting the point of diminishing returns like this.
If AMD gains an IPC increase of about 7-10% as well, that would either increase performance by similar amount on above values, or they would remain the same with roughly 14-20% drop in power consumption.
This is just my own personal estimate... by no means should this be taken as 'accurate'.
There's also the fact AMD will be using raytracing technology in RDNA2... which may or may not affect above performance.
AMD will be using a hybrid method for Raytracing as mentioned here:
www.tweaktown.com/news/66455/amds-new-patent-explains-hybrid-ray-tracing-approach/index.html#:~:text=This%20method%2C%20according%20to%20AMD's,based%20ray%20tracing%20be%20solved.&text=This%20is%20the%20chip%20that,with%20ray%20tracing%20support%2C%20too.
The method seems to be using existing hw (which has a different layout and uses it differently) for Raytracing without needing additional space (and without a performance impact that NV experiences in RTX 2xxx series).
LOL I'm still on my old 290X, AMD's last GPU that beat Nvidia. Glad I waited and now it looks like I might be able to buy their next winning GPU.
Still on a Sapphire R9 290 vapor-x. What a beast: Hawai.
I'm on a 390X only because the manufacturer ran out of cooling assemblies for my 290X (one of the fans had a shot bearing) and offered me a swap via Warranty RMA. XFX has generally been good to me.
A 10 year old gpu lol
@@trent4439 290x came out in 2013 october.
My last Radeo Card was a 7970GHz Sapphire Vapor-X. I loved it. Went on to an R9 Fury X that coil whined so bad i brought it back and took a 980ti home. Now I'm waiting for the new cards 👍
Looks like NVIDIA would outperform AMD at cooking breakfast and keeping their owners warm during winter time :)
If I could attach a GPU powered grill to the top of my PC, I totally would. Nice panini with key gaming? Heck yeah!
They had us in the first half, not gonna lie
Search Japanese TH-camr cooks food on his computer CPU
But watch how the NV fan bois forget about all the heat shit talking they did and pretend that hotter is better.
@@JS-be8ll AMD : nvidia ,if you're hotter than me. Does that mean i am "cooler " than you ?.
But AMD needs to have it's drivers working and optimised at launch this time
And THATS the catch. I'd have a new AMD card if they could write a graphics driver to save their life.
The end of last year was the only time in a long time that they've had serious driver issues. And one of the reasons it took them so long to resolve was because they simply could not replicate on their rigs the failures people were reporting. But they eventually did resolve them nonetheless.
Besides that, I don't think there will be a repeat of that with this launch. For one, those issues were specifically affecting the first generation of RDNA cards. They have been resolved, and now these upcoming cards are RDNA 2. For another, RDNA 2 is the GPU used in the Xbox Series X, and no doubt they have been working closely with Microsoft to finalize and refine their software and driver stacks on both the console _and_ for PC.
@@colesherrill7472 I think you mean to say couldn't. AMD couldn't write driver's to save their life.
@@madmax2069 oh really???? My 5700 is working fine. It needed a bios update when I bought it but that is the fault of ASUS, not AMD. Also, most reviewers have stated that they can't even duplicate these complaints. I'm not saying that the drivers aren't without problems but everyone seems to forget that Nvidia's TDR problems went on for YEARS.
@@colesherrill7472 Spoken like a true Nvidia fanboy! XD
I own 300 GPU's from 9 different manufacturers from 1986 to 2019. AMD and Nvidia both had shit drivers and good drivers. On average, since the HD4XXX series, AMD's have been better. Not to mention longevity of support! XD
20:19 "I hope you waited to upgrade..." I sure did... either card would be a worthy replacement for my GTX680!
i'm still on a GTX970 lol
I was on 550Ti until a year ago
@@TristanSchaaf My son is still on a GTX570...
Guys, just get a used cheap RX580 and call it a day, it's such cheap upgrade than can run every game at 1080p 60fps fine.
That is sad truth about gaming industry and graphics technology in general. Performance percentage claims are joke as todays flagships cant even double 3 gen older cards fps. I remember times when new games did not even support 3 gen old cards. Today you see people praising in forums if developer allows you to run gtx580 on high settings :p
so he got the power connecter, the cooler, the ports, the TDP, the CUDA count(5376*2=10752), memory speed, core clock, 4k perf, launch date, CUDA 11 right
the RT performance and the extra features are off by a lot though.
so i think its fair to say he got it right over 70%. I don't really know why people call him lying.
I hope AMD punches Jensen real hard this year. They got way too cocky with their demo RTX 2000 series. Glad i stuck with GTX 1080 Ti and skipped the RTX 2000 test run. Well see how it goes in a few months.
I'm still rocking a 1070 man, and I'm looking forward to these new cards. Hope for one of the best competition this year between the two companies where the consumers benefit the most! 😁
Only had to look 5 comments down to find the 1080 ti virtue signal!
Still got a 980. Feel like a fool not getting a 1080 Ti.
i dont know why but my gut feeling is that nvidia is still gonna pull this off somehow. i feel like the 2000 series was a pretty weak move but that this gen they will get back on track. who knows though we just have to wait n see
1080ti is such a beast card. picked up mine for like 800$ a few months after release. came with a game or two also.
Nice haircut Tom!
He looks like he traveled back to the '90s.
@@handlemonium No no, tom stayed in the 90's.
I think AMD will go with another "Jebaited" and pull out a 5950XT with 80CUs.
This would be great.
Go for it.
A secret weapon just in case the RTX 3080 pulls too much ahead.
Tom Hsia but nvidia will also have the 3080 ti or 3090. They always have a backup card waiting
AMD still dont have a card that beats a 1080ti... I mean what else can we say... If anyone out there thinks that AMD is beating NVIDIA this year when NVIDIA actually releases a new generation on the same die they are just smoking far too much weed.
@@Blight-fp3vt We'll see..
@@FinnReinhardt Yes, you will. Thing about it is that if you listen carefully to this reviewers words he has hedged against being called out on anything after the fact. Oh he suggests stuff, but does he outright come and say that his prediction is that AMD will win the next gen? No. He says shit like they 'could' win. I'm happy to tell you right now - THEY WONT WIN, AMPERE WILL BLOW THEM AWAY.
I just want whichever side ends up being more powerful, but I’d be nervous about going RDNA2 based on my negative experiences with driver issues with my 5700XT. They were solved eventually but it’s so hard to tell what’s a driver issue and what isn’t and it starts making you paranoid with every glitch.
Yeah that's my only concern, I have had driver issues in the past as well with them and I just don't know this go around. Maybe best to wait and see what happens.
I ditched amd because the hassle i had with the hd 4890, paying top Dollar for their top card and every other driver breaking something so i swapped it with a gtx 275, next upgrade was a sweet gtx 580 and then a 680 and then a 970, the 1060 and now a 2070 super, never looked at team amd again. Thats hiw much drivers affect a user to. Make decisions for years.
@@najeebshah. funny you mentioned the gtx 580, you should hear Tom's opinion about the Fermi series and how the drivers were trash back then.
Im never going AMD card again, since i built my first computer 20 years ago now, ive had 3 ATI/AMD cards.
And the drivers have always been piss poor. Nvidia always release new drivers with new games, and are quick on the ball. While AMD can stick with shitty drivers for weeks or months for new titles. And thats not even going in to all the other issues... I hope AMD beat nvidia, honestly i do. Cos that can only be good for the market and prices. But as long as nvidias top cards are still within the ball park performance/cash ill still rather get a 5% downgrade in performance for the same price, just to get a nvidia card instead.
@@z0lid Yeah im in kinda the same boat as you, I built my first PC same time round 20 years back and used to go with AMD but have only ever had issues with them. Last few builds with Nvidia have been really solid and I would prefer to take a 5% performence hit if it means stability this go around. Nvidia need to be taken down a peg and we need competition but AMD really need to up there game in the driver department.
Tried to play it off that only your viewers or forum posters were hyping up big Navi performance when you were the one stoking the fire
I just want AMD to wipe that punchable smug smile off of Jensen Huang's face. And maybe Lisa Su can steal his jacket too. Hopefully AMD can keep the momentum going, and stay on top for around 3 generations of cards or so, so that they can build up a war chest on the GPU side, just like they are trying to do on the CPU side.
Lisa Su is about to peg Jensen.
Yeah no, They are both related to each other. Lisa is jensons niece
I hope she puts on a leather jacket at release.
Jensen is a pioneer in computer graphics. Any cockiness he displays is well deserved even if it comes back to bite him in the ass later.
What I'm most excited about, is games being built from the ground up for the RDNA architecture, since Xbox and Ps5 will finally be using the same chips as PC. Games should be very well optimized!
Devs will screw it up and Nvidia will bribe them. Just watch.
i hope so, i really would like a amd gpu again, been 14yrs last time i had a ati card.
I agree and some Gamer's forget this will help AMD GPU's with video games. Hopefully AMD GPU less expensive than Nvidia GPU's.
@TurboCMinusMinus usually game devs develop with the lowest common denominator in mind when creating a multiplatform game. So I don't see crossplatform gaming being that much affected by the SSD, there will.be minor differences, with the gap really showing in 1st party titles for the PS5 doing some crazy stuff
Thats part of the sucess of gcn finewine already the current gen basically uses an rx 570
Lol the switch from the (team) green shirt to the amd one was perfect. Well played sir. 😆
40% or 50% is still astounding.
It will come down to features, drivers and of course price..
Drivers are what kept me away from amd. But those prices always kept them as an option everytime I'm ready to upgrade.
@@Acenumba19 drivers are what caused me to sell my last AMD card (aaaand prices for that card were high during the mining craze😅)
Tbh Price is really the only front amd can really compete. They only Start having raytracing and dont have things like DLSS etc. When it comes to drivers amd to this Point was not close to competing with Nvidia for 10+ years. Im happy if they get their shit together but i dont believe in Miracles.
we still haven't seen real world tests by 3rd parties but I'm pretty sure this video won't age well after the anouncement...
DF did a little bit of testing and it's kinda legit
AMD shares are going to Sky Rocket after this console generation releases.
Probably not. Their P/E ratio is already quite high enough. They'll need to show increased earnings to warrant a further increase in share price.
@@TrueThanny Yeah, they've already doubled their share price this year. But of course the problem with buying stocks right now is another crash is likely to happen. So would be better to wait for that. But then it might not. What to do!!
Already baked into the stock price.
So happy I bought at $17 :)
(after hearing about early Ryzen leaks)
@@cynicle Smart:)
For the love of God, I hope the drivers are good. I've had quite a few of my friends skipping the 5700/XT and going with a 2060/2070 just because of the issues that has/had been present.
Definitely a fair criticism. I struggled for quite a few months. Only to find my biggest issue was my Ryzen OC. PBO + AutoOC is apparently broke on my mobo and caused countless crashes(only while gaming of course). Thankfully Tom made a video about that a few months back and I got it fixed.
That was me right there. I had to return my 5700xt after a week because the amount of issues I had with that made me really mad. I bought a 2070 Super right after, and it worked right away.
@@SteveHonakerII likewise, 2950x build on Zenith Extreme and ASUS rog strix 5700xt, some minor issues along to road but ended up tracking down the issues with mobo, ryzen master, win10 and Radeon all not wanting to jive
Yep, exactly. AMD has to prove itself to me stability and reliability wise before I entertain the idea of buying an AMD GPU again,
LANo yep ur the only consumer that matters.
Lmao changing your shirt at 17:27
Not everyone uses these cards for gaming. One area where AMD has been unable to compete is with high end 3D rendering engines like Arnold GPU, Octane and Redshift. There are a whole lot of VFX artist and animators that use GPU based rendering in software like Houdini, Maya and C4D etc. These rendering engines use CUDA and recently RTX to accelerate 3D rendering many many times over what CPU based rendering is capable of in the same time frame. With RTX for instance we have seen speed increases of up to 800% in Octane. Unfortunately for the professional market none of AMDs cards are supported by anything other than AMDs own ProRender at the moment which almost no one uses because quite frankly it sucks. It’s slow and it’s lack of features and lack of studio pipeline integration ( no cryptomattes, lack of AOVs etc ) have made it pretty much DOA. I realize to most gamers this is probably a non consequential edge market but Nvidia’s hardware has truly revolutionized the CG and VFX industry making it possible for small and medium sized studios to be able to compete with bigger players just by cramming workstations with a few GPUs each and giving small studios the final frame delivery capabilities that have up until now only been possible with expensive render farms. I built a workstation recently that has 4x 2080ti’s and final frame delivery in a recent project was around 45 seconds per frame. When doing the same scene using CPU based rendering ( 32 core ) the same frame took nearly 72 minutes. This is a pretty big deal and unfortunately AMD doesn’t do the R&D or the code development that Nvidia does to make these types of things possible on their side.
Yeah it is true that AMD suffers in that area. Simply because CUDA is more integrated then opencl. It's not like the cards would have less power to do things. Nvidia just did the right thing years ago and it became almost an standard albeit it's only on their cards. Trying to catch up to that seems rather impossible at this point in time even if AMD did invest money.
@@MrRyusuzaku
Is not only that, the card are simply better at brute force computation.
- double datapath and exec unit for concurrent int and fp32 execution (by thinking of how you usually work with vectors and calculus over indexes/pointers this is a killing feature)
- RT core used to accelerate every algorithm that use BVH structure
- tensor core for heavy matrix multiply
They have more hardware functionality exposed than amd...
the last trick in ampere A100 is the direct load from ram to L1 bypassing L2 and increased bandwidth of cache in general..
renderfarms and research centers alone buy more nvidia highend cards per year than actual gamers do , hence nevidia putting more and more focus into innovate technology like ai , rt, and cuda ... soon most autonomous cars will run with an nvidia card
who would use such thing when we got Blender for free !
Xtoor Gaming blender uses Redshift and Octane render too. Yes it’s got cycles for free but even but even cycles requires CUDA if your going to get any real speed out of it. Also when you start collaborating with studios on commercial projects many of them use Redshift and Arnold GPU as part of their production pipeline. If you want to earn a living then you have to adapt to their workflow and many if not most are using Arnold, and Redshift for production and Octane for look dev. They choose these render engines inside of whatever app be it Blender, C4D, Houdini or Maya because Nvidia / CUDA / RTX save a lot of time - with the bottom line usually being time = money and all ...
Who came back after seeing Nvidia put amd in the grave?
Rdna2 hasn't even been revealed yet. There was a lot of gimmicky charts in the nvidia reveal, but rasterization perfomance is still probably 40-50% faster than the 2080ti, which is what all the leaks said. Amd leaks said 40-50% as well. they will be competitive. ampere is the first time nvidia flagships have been competitively priced and it's probably because they know amd will compete
@@user-ps7ij6ge6d Everybody and there mom thought it was going to be 35% or less for $1200 to $1500.
Instead it is roughly what you said. 40-50% on average, with more possible depending on the title. For $700.
That is a MASSIVE difference from what anyone expected.
Now also take into account that Nvidia is also doing this on an inferior 8nm Samsung node (which is closer to 10nm in regards to pitch and density), and they have tons of room for improvement.
Why? Because Samsung 7nm EUV production is going to TRIPLE by years end. A 3080Ti that has better gaming performance than the 3090 is entirely possible (happened in the 10 series vs the Titan) within 8-10 months.
Then a refresh of super cards immediately after.
If we don't hear any credible leaks by the 17th when the 3080 is supposed to go on sale. It's almost 100% certain that AMD cant compete with Nvidia in terms of performance. They will only again be relegated to the low and mid tier. Where Nvidia will still out sell them lol.
@@DESX312 Performance wise, rdna 2 is on par with ampere by the leaks. the price is irrelevant in terms of performance
@@user-ps7ij6ge6d the only leaks that have come out are wildly baseless. Literally the same B.S. that happens every year.
AMD even did it directly with Vega. Hyped the crap out of it, only to disappoint like hell lmao.
If AMD isn't even doing that THIS time. Then it's because they can't compete.
I want to see spec sheet leaks from credible sources.
Go ahead and save this comment so we can both come back after the 17th to see who was right.
Edit: Price is VERY relevant when you are talking about cornering huge swaths of the market.
No one thought that Nvidia would price their chips this cheap for this level of performance.
It can potentially back AMD up into a corner if their 7nm wafer costs were far greater.
@@DESX312 ampere literally costs the exact same as last gen. 699 for 80, 499 for 70. basically 1499 for 2080ti, at least in canada. Also the marketing team that was responsible for vega left, it is a completely new team now. Not to mention original navi actually did what it was supposed to. 2070s performance for cheaper. in canada, the 5600 xt was over $100 cheaper than the 2060 for the same performance. Sure, nvidia will have more performance but rdna 2 will definitely compete
I’ve been on Nvidia for so long I can’t remember the last time I had an AMD card (myb 12 years ago) I cant believe what I’m saying but given AMD momentum and performance for cheap, I’m certain that card will be worth it. If the benchmarks are close I’m buying AMD.
Did you miss hd 5000, 7000 och 290 series of cards?
Only reason I don't go AMD for video cards is the poor driver support and lack of features like DLSS.
@@RyviusRan DLSS, I can do without no problem, but I agree 100% with ya on Drivers and control panel... it's was bad back then.
@@christophernylander6728 Didn't.. but friends had them, and remember them having issues sometimes, waiting for fixes and such for some time.
MY last AMD GPU was the 6950 and I ditched it because the drivers were bad, and I was getting god rays cut in half diagonally in Bulletstorm and some mods for Doom3 just would not load. I switched to the nvidia GTX 580 and was much happier. Now I've got an expensive Gsync monitor so I'm locked into Nvidia for the forseable future, so it doesn't matter how good AMD gets for the price, I'm forced to look out for the best value nvidia 3000 series card.
Why do people fanboy over gpus and fight each other in the comments? LOL
Its in our blood. Doesnt matter if its GPU, console or football team :D
I'm not fanboying over AMD, I'm fanboying that someone would teach Nvidia a lesson. Intel already got their lesson, and the only reason why I don't want them to go bankrupt is that we need competition. Nvidia is horrible, but much more conning ways. Anyone interested should watch AdoredTVs video of Nvidia anti-consumer ... (don't remember the title, but that will find it), not that it's exhaustive on the topic, but might show light for many people. Nvidia is master at getting away with their horrible shit, and people forget how much problems Nvidia has had, even in the recent history.
Argument for Arguments sake.
They buy one, and that is there "team." It is foolish logic that scales to every part of life.
because they're really a cult
Hmm... So Ampere might be bottlenecked by rasterization, while big navi - by rt? Battle of bottlenecks gen. Ib4 nvidia's pulling out tessellation trick again with underground rays or some shit.
RT water to simulate water flowing underground in cyberpunk 2077
I thought he said 40% increase performance in rasterization from Ampere? Honestly it looks like the 7nm vs 8nm is going to be what keeps it close. Nvidia getting clobbered by not having as much sway with manufacturers since AMD is in both consoles.
@@chaselewis6630 AMD has Ryzen CPUs on TSMC, too.
We're gonna have to choose between the two it seems like, sucks either way. If this happens I'm gonnna hold off on buying anything until the Super refresh and just wait to play Cyberpunk.
UE5 will break any shinanigans. Nanite and Lumen are serious technology disruption. Their engine would force the bottleneck into shader and IO performance.
Remember when Nvidia said that the 2080 Max-Q would be more powerful than the next-gen consoles?
Remember 3 years ago when AMD said Poor Volta? Still waiting on an AMD card that can beat Volta...
@@johnwilliams615 its coming lol. People dont realize that AMD was basically bankrupt in 2014 and doing marginally better by Zen's launch. Its only been 3 years for them to prove they can deliver on time and have competitive products, which they do now in almost every segment of computing besides AI. CDNA is coming soon. That will be their zen moment in AI cards. Again, in only 3 years they've done this
@@johnwilliams615 You are actually waiting for that?
@@denverbasshead I hope Lisa gets to retire wealthy. That lady did work organizing that company and look at the results
Now I feel bad for Intel. Raja almost drove Radeon to the ground, and now that he's gone RTG makes a comeback with a card about as good as HD 7970.
Anyone else getting wierd feedback from audio? some chirps at the edges of words, if that makes sense? Am I hearing the Matrix? Oh god...ive gotta go
Nah fam, you ain't hearing things. I got it too...
You hear that, Mr. Anderson? That is the sound of inevitability. It is the sound of your death. Goodbye, Mr. Anderson.
Sounds fine to me at 1080.
yeah im hearing it too
Raytracing and deep learning may change what is thought of as the performance crown.
That's a part of the puzzle not being talked about. The current state of Cuda core optimizations and their denoising software is amazing. I don't see that in AMD cards.... Yet! but we shall see...
Gaming laptop manufacturers: Aight, I’mma head out
Can you imagine having to design a laptop cooling solution for a 300-400W GPU? Fuck me ...
D.B.C.|T1M0 you do realize that 300W is most likely for the TITAN/3090? That won’t be in laptops lol, also, that wattage is normal for the highest performance card
Laptops are so fucking overpriced, it's insane
@@miguelpereira9859 you realize the amount of engineering that goes into laptops? Plus, remember, you're getting a monitor, keyboard and touchpad with them as well. Packing anywhere near desktop levels of performance into a chassis under 30 mm thick is insane, and warrants the extra cost.
Gaming laptops are a gimmick. You're not meant to game with high-end GPUs on laptops. Everything about them sucks for gaming.
But in reality being forced to do everything with sony may have given them higher yields given the 3070 price, and they where able to make insanely fast cards for cheaper. I hope amd kills it with navi but i think you put way to much emphasis on them being forced to go with a larger process as being a thing that could kill them when in reality seems like it had very little impact if not helped them
No, Nvidia maxed out, have a huge power draw and their FE cooler design costs $150 to make.
I've been saying to my father and to everyone I know that the we should really wait for the end of this year because GPU market is going to be insane and really worth the wait
You should probably stop annoying people that don't wanna hear about this kind of stuff
@@ofon2000 xD bro but they do they are all people who are looking into getting something new or know about the industries new
If AMD can guarantee proper drivers this time around they'll win, but if I have to choose between a (much) slower (more expensive) but stable Ampere or a unstable but (much) faster (cheaper) RDNA 2, I'll have to go the former.
I don't think it's a driver issue. I firmly believe that from the majority of the complaints, the problem is a hardware bug that wasn't fully ironed out before the release date. That would 100% explain the randomness of the issue and is something AMD would absolutely never admit.
Here's to hoping AMD's engineering team actually managed to figure out what the problem was and fix it because hardware issues at that level are an absolute nightmare to debug.
Half of these "driver issues" are people who are stupid and simply try to install their new AMD GPU and drivers on top of their OS with a bunch of stuff installed by Nvidia on it, which causes conflicts and errors.
@@LiveType I have always had far more trouble with AMD than either nvidia or Intel with bugs in their drivers, especially their OpenGL implementations. This has been the case for 10+ years writing graphics code, and having to spend so much time tracking down bugs and finding workarounds to broken AMD drivers.
@@shawnpitman876 that is so true. They install AMD stuff without clean uninstalling their previous Nvidia software, and the registry entries and bits of software left over start to conflict and wreak havoc.
I used an HD 7790 for years and it never gave me any issues.
@@Aethid Oh, I'm not saying AMDs drivers aren't bad/worse than Nvidia's. It's just that 1st Gen Navi likely has a hardware problem on top of AMDs lackluster software.
I've literally just been binging your past videos today and the instant I saw this video and I stopped my game
I'm loving the 17:30 cut hahaha
I'm really on the fence with next gen. It feels like the choice will be between pure rasterization (AMD) or a complete, futuristic software solution (Nvidia).
If DLSS turns out as good as it's looking right now what's the point of having 10/15% more pure power with AMD if flipping a switch on Nvidia's side nets you 40% higher FPS with comparable image quality? That's really my concern.
Because you'll have both with AMD. That's why they put ML on their RDNA2 cards.
Now play Cod MW with DLSS 2.0/3.0 and start trying to snipe people across the map
I want to know if multiplayer games can even be compatible with DLSS
Not only comparable, but some times better. Yeah, I also thought of that. Let's hope AMD has something to show.
Dlls can be good, but how Many game developer makes game that support %20 Nvidia GPUs? Not Many! There just Are too few GPUs that support that technology. The situation gets better. 5 more years and %50 of Nvidia GPUs will support it. But there still Are amd and Intel cpus that don`t... it may suffer eventually the same fate as PhysX and GSynch and eventually fade out. So it depends on how Many new games in a year will support dlls. 4 games / year? 5 games / year? We don`t know...
AMD has FidelityFX with upscaling and CAS.
Glad you started using the diagonal background watermark! Don't let people steal your slides! Keep them coming Tom. I've already started to see tech news websites cite your channel for their articles.
Well... shit. 🤣 Nvidia trolled all the "leakers".
This is why I don't like the rumor mill and speculations regarding phones, gpus, cpus, consoles etc. Just fucking wait for the official announcement and take all the leaks with a ton of salt.
I hope AMD will come up with a good product, that can compete with Nvidia's lineup, but given the situation, I would be surprised if they do. In the end the consumer will win from this fight, so I say bring it AMD, because Nvidia has fucking brought it brother.
Well, I hope this is AMD's year, they have struggled for a long time, and competition is good for the consumers. Great Video! I hop you keep them coming!
Loaded TH-cam just now to see that you uploaded a video literally a minute ago. Might as well watch early I guess xD.
Same XD
ECKS DEE AMIRITE FELLAS? HA HA XD
Exact same.. I was just thinking this and BOOM. Comes in with the clutch.
Being early to info like this does make it feel good, there’s no denying that 😁
I always watch a bit later so I can go through the comments
17:28 Insert AMD shill joke after clothing swap.
AMD is for all intents and purposes first generation ray tracing, so I'm curious how their cards will perform against Nvidia in the ray tracing department. That and software/driver.
I'm pretty sure Microsoft and Sony, both, have helped AMD come up with a good ray tracing solution. I predict they're going to surprise everyone.
@@TimStCroix yep..they Will even have a dlss counterpart..they showed the "directML super resolution" with forza Horizon 3 (might as quelle work on older dx games which would be bonkers) and It looks amazing..they said they could use this gp-gpu approach instead of the dedicated cores one (we know Nvidia loved dedicated stuff ti cito out competition)..but if this really works with basically every dx11/12 title at driver level Oh boy It could be massive. That could also be a way consoles Will upscale their res to 4k..like the new and better iteration of checkerboard rendering..using rdna2 will mean they could use this even on pc
@Gareth Tucker I mean new to the AMD GPU lineup. Nvidia is on their second generation ray tracing with Ampere. So I'm just interested in how well AMD does.
Yeah i knew turing was a demo for ray tracing and now im in the market for new high end gpu, im looking at both options and ray tracing performance will be very important to me, also dlss 3.0 seems very nice especially that i play those games that has TAA, this will be very interesting.
@@h1tzzYT as I said in the other commenti, there Is a chance microsoft's version of dlss Will work with every dx title as It was shown with forza horizon 3
Overall that final Nvidia leaker had the most reliable info. The RDNA2 info at that time was almost completly wrong. The stuff that was somewhat correct can be safely attributed to luck or changes on AMDs side further down the line. Overall conclusion was far too optimisitc. Seen worse prediction videos: I'd rate 6/10.
I really hope RDNA 2 has good ray tracing implementation.
It won’t...
John Williams Why are you saying it won't? I don't recall Tom saying that, which does not mean he didn't.
Who needs that shit. I want a lot of frames, no stupid bling bling shadows that cut my fps half.
I really hope they have something like dlss 3.0 or most modern games will run significantly better on ampere even with 10-20% more raw horsepower.
@@Fischihappen yeah, that would be nessesary to keep up with nvidia.
I came to see his beautiful jawline.. and tech tips on the upcoming AMD GPU
Stalker alert
wierdo alert
#notahomo
TheGuruStud that’s what we’re all here to see. ;)
Nice haircut, Tom. And thanks for all the effort.
10 months later I look back at this video and go.....
NOPE
How many times did we have this conversation when AMD is releasing new GPU's?
Same with AMD CPUs. I remember reading it's illogical Ryzen could ever beat the mighty Intel. No way is they are nearly as competent as Intel as AMD was a different company back when AMD was relevant etc
@@timothygibney5656 Nvidia doesn't sleep like Intel was sleeping.
Venâncio Ferreira amd has been bushing the limits of technology recently unlike Nvidia disappointing 20 series. With the insane prices with not much performance boost. AMD actually cares for consumers offering a lower price product with competitive performance to more expensive brands
I think Ampere will lose to Big Navi with ray tracing turned off, but then still beat them in ray tracing performance.
Seem reasonable, they get to do Rtx v2, and amd goes from scratch.
@@fredreickweaver809 But AMD has been working on raytracing behind the scenes for the past 2-3 years for the consoles... So it could swing either way...
Highly plausible. Won't bother me though. I mostly play FPS, light and particle effects are not your friend there. They screw with the sight lines.
@@andersjjensen if they have some form of rtx, its a buy
well RT is more important in the coming years, cyberpunk WD legion and now the nexgen consoles also support rt
"My hair was worn out at the end" Happens to the best of us. 😂😂
Jokes aside, some really spicy stuff in this vid. Really thinking about delaying to upgrade from my 980SLI once again if they are going for a refresh. But I badly need a new card, that "SLI" just doesn't cut it anymore for my 1440p 165hz gsync monitor.
What would you say, get the top NV card and sell it before the new stackup comes out, or wait another half a year or so for the refresh?
I'd say wait for the refresh, I semi regret buying a g sync monitor now because I'm stuck with the green team
VXR 2.0 Exactly the same happened to me. Got a new monitor a month ago with gsync.
@@DrakkarCalethiel well that is on you the newer models support both freesync and gsync usually. but i can imagine them to cost more. in any case nvidia is doing good so far
Never allow yourself to be locked into buying from one company.
@@S55M3 Well if it is "G sync compatible" it will probably work with an AMD GPU. If it actually has Nvidia's proprietary hardware, I think you are stuck with Geforce. You could trade it for a Freesync monitor though or even sell it and use the money for a Freesync monitor.
10:47 "five hundred and thirty.... seventy-six CUDA cores". I like it
whelp,
This didn't age well...
His sources are made up and he is a simple bull s...ter. Welcome to where people lie and make up stuff for the internet moneyz. I unsubbed
Like anything this guy says : Its all pro PS5 / AMD jerk circle bullshit.
I've read the title today and thought whats going on, good Thing that this TH-camr is talking shit 😅
Did it though? His Nvidia leaks were pretty accurate and AMD still hasn't even announced their new cards yet... I'll come back when they release theirs to make the judgement
Personally I only really cared about the Nvidia leaks anyways because that's what I wanted to buy lol
@@TitusW honestly, does it matter if he is right when all his speculations are just that and nothing more? seems like a shot in the dark
The AMD shirt transitions get me every time 😂
That hair, is there a suit coming next?
And a monocle 🧐
Yeah, this video was bullshit. Digital Foundry benchmarked the 3080 to have 1.6x to 1.8x the performance of a 2080 for the same price.
That 2080 was even overclocked!
Damn this is getting exciting !
I love all the info about GPUs recently but if you have anything, I’d love to hear more about 3rd gen ryzen!
Dude so glad your a full time you tuber now. Keep up the great work and heres to your channel growing larger.
Note: Since Nvidia and AMD are competing with new software stacks, perhaps a future video comparing the differences between both companies. Ex: NVChache comparison is HBCC , Tensors cores comparison are Shaders and so on.
This aged like milk
I came here to say just that.
People need to understand, Nvidia has positioned themselves to be the leaders in super computers, they are a majority stake in the top 500 super computers.
I realized with the 10 series that Nvidia will not take a second off. we watched them cannibalize themselves, going as far as to release two titans, when the 1080 ti bested the current titian, during the 20 series the released the "super" editions on top of ti editions...
with the 3070,they've essentially recreated the 970, that card is insane... I'm glad my 1080ti allowed for me to skip the 20 series. a 3080 will have me set for a long long long time... 144+ fps ultra wide... 5120×2160... it's going to have to have another leap in 4-6 years for me to maybe upgrade that
Like anything this guy says : All one sided rumors
@@FirstBornConservative Keep in mind, all of the over 4k showcases were with DLSS 2.0 on. Without it, it barely chugs along at 6/8k
@@FirstBornConservative for the enthusiast high end GPU segment, AMD is just a reminder for nVidia that they should not fall asleep. At best, AMD can beat RTX 3080 but nVidia will release a even more competittive card, maybe RTX 3080 with 12 GB of VRAM or similar after RDNA2 launches.
Tom rockin the comb over... yet needs more beer
Today's beer raises estrogen(female hormone). Too much beer(or well too high estrogen) and you get a 'beer belly' and man boobs.
AMD is gonna have to something AMAZING after that 3070 reveal. Basically a 2080Ti for $499! WTF
I still dont believe it.... damn its nVidia after all. nVidia to be the old AMD (pricing), unbeliavable, that was a surprise (not just price, even RTX 3000 specs).
@@samfkt So does that mean that if AMD said the same you WOULD STILL believe them but not Nvidia?
@@trueminecraftfacts Dont trust companies in general..... marketing school teaching you basically how to betray customers and its actually not a joke, its fact just in "shiny" words. BTW I respect AMD/ATI for various reasons BUT they still betrayed/lied to the customers several times in some ways.....
I’ve always said gaming PCs make great heaters in the winter. You get basically 100% efficiency because anything that’s not used for processing is used to make heat. Nothing wasted.
09:28 This has been debunked. Nvidia is still in the process of acquiring specification approval from PCI-SIG to use this new 12 pin design, so it is NOT going to make it into the early launch Ampere series. Maybe a Ampere refresh on 7nm, 6 months later.
Hope this is true, competition is always great. Although, every time AMD has had a new architecture it has turned out to be mostly a let down. We'll see, though.
Patrick Burke this time it is not new architecture. Just new version of Navi, so,They build on last year technology.
@@haukionkannel does that mean that the driver will be the same as navi 1?
@@haukionkannel RDNA 2 is a different architecture from RDNA. Similar, but with a good amount of changes. Much more efficient.
“Still five hundred and thirty six seven CUDA cores” huh I’ve never heard of that number before
20:48 here's your shortcut to the goodest of boys. Thank me later
The upgrade to make the memory components (RAM, SDD card) work together to increase transfer speeds. Makes perfect sense. That way there will be less bottlenecks in the system anymore, brilliant. Good information my man!
I thought this guy died in the Magic Bus in the middle of Alaska! So happy he is still alive and now a successful tech youtuber.
I respect that initial conclusions can be revisited, from what I've heard I agree too, high power consumption and 8nm, that makes sense. Fingers are crossed they don't release a 3080ti at first and just a 3080, leaves them room to respond if Big Navi is any good or not (in their opinion)
I hope so too. At least it does give consumers that assurance that something better is coming rather Nvidia just ignoring AMDs advantage and only benchmarking DLSS and RT to say they are faster, then still selling for more. It would also make sense for Nvidia's image, if they just don't release their highest price tier even though Nvidia loses they can just say "our competeting product isn't out yet". Still that gives AMD time they have no direct competion for their top card but I think NVidia would prefer that than having their mighty xx80ti brand losing.
wasn´t 8nm to be around 51 Mtr/mm^2? This is a full node from turing cards, and only 17% more cuda cores?( Full dies :5376/4608 ) When turing (On the same node ) gave 20% (4608/ 3840) with comparable power comsumption, is samsung that shit of a factory?
I jumped ship from Intel to AMD this year on my build I put together in May with a 3900X. However it would take A LOT for me to consider the same going from NV to AMD. I've owned multiple AMD video cards in the past and they've generally always be problematic in one way or another (pretty much always drivers). The 480X I have in my backup gaming PC has been the best AMD card I've owned, but it's still had some overlay menu/draw issues in software like Autodesk Maya.
Aside from AMD matching or beating NV performance, they would have to at least match them in feature set and drivers/software which I doubt will happen anytime soon or be near the quality NV has. For now I'd still never have an AMD GPU in a PC I had to rely on for anything outside gaming. If it has to do any sort of productivity work, I'll stick with NV until AMD can actually prove themselves.
Another thing to consider is that nvidia RnD is so much ahead of AMD. AMD always have to adopt technologies when Nvidia sets new standards. It is like everytime AMD starts feeling comfortable they get wrench thrown in.
who here after official ampere reveal :)
thats why I bagged a cheap RX580 for my new build and not an 5700XT..... knowing that RDNA2 is around the corner...
Tom I hope the channel is supporting you more than your previous job. Love your content, keep it up.
big fan of the new haircut! been loving the videos recently
DLSS 3.0 and Ray tracing is where RDNA needs to match nvidia.
And driver support
And Nvidia need dlls support in games... how Many Nvidia GPUs support dlls... 20%? So do we get 3 games a year to support dlls? 4 games? 5 games? We don`t know. All in all not Many game developers make code Path that so few GPUs will support...
Have you also noted the line that nVidia will push using DLSS 3.0 for every benchmark?
This aged like milk.
I am in love with this channel. I literally wait for a notification every day now.
Same. Dude's gonna blow up.
@@dapperstache7747 yup. He has the best tech news out of all the channels i know. So glad i found it
Will dlss 3.0 come to rtx 20 series too? Sorry if it’s a stupid question.
AMD's (good) relationship with TSMC seems to be a key to their current success on CPU and possible future success on GPU. Getting ahead of Intel and NVIDIA on miniaturization seems critical. I wonder if this was a master plan by AMD, or whether Intel and NVIDIA acting like arrogant buffoons was a free bonus?
@@clenbuterol4989 Tell that to the Intel CPU and NVIDIA GPU in my current desktop. I'm just calling balls and strikes. I'm sure AMD will eventually become arrogant and buffoonish as well in the future.
I would love for AMD to win this year! I just sold my rtx 2070 super, just waiting on whoever releases the best next gen gpus to upgrade
So what on earth are you gaming on atm?
GT-AVIATOR im not really gaming too much right now...but whenever I do have some time to game, Im using my surface laptop 2 (gtx 1050) for very light gaming
Lol even doubling 5700xt performance doesn’t even come close. Not even when compared to third tier card. Looks like nvidia stole candy from a baby.
Farquaad was wrong. I'm sure he wanted Nvidia to show expensive, unworthy products.
Shreshta Jaiswal you may be right, but click bait titles and forming such a strong opinion before real news is even confirmed seems a bit disingenuous.
@Shreshta Jaiswal In recent years, sure but back then actually not.....
Hopefully this competition will drive down the prices of video cards, doubtful that will happen though.
Nvidia Ampere or AMD Big Navi for VR Gaming on the HP Reverb G2?
I would say either, but we're going to have to wait for the independent benchmarks to be sure which is the better deal.