Revisit: Vega 56 & 64 at Same Clocks (800-1020MHz HBM2)
- Published Nov 20, 2024
- Revisiting our Vega 56 vs. Vega 64 shader comparison with the same clocks, now with more frequencies tested.
Ad: Be Quiet! Dark Base Pro 900 White on Superbiiz goo.gl/zHSK9j
Article: www.gamersnexu...
Part 1: • Vega 56 & 64 Shader Co...
We have a new GN store: store.gamersne...
Like our content? Please consider becoming our Patron to support us: / gamersnexus
** Please like, comment, and subscribe for more! **
Follow us in these locations for more gaming and hardware updates:
t: / gamersnexus
f: / gamersnexus
w: www.gamersnexus...
Editorial: Steve Burke
Video: Andrew Coleman
Links to Amazon and Newegg are typically monetized on our channel (affiliate links) and may return a commission of sales to us from the retailer. This is unrelated to the product manufacturer. Any advertisements or sponsorships are disclosed within the video ("this video is brought to you by") and above the fold in the description. We do not ever produce paid content or "sponsored content" (meaning that the content is our idea and is not funded externally aside from whatever ad placement is in the beginning) and we do not ever charge manufacturers for coverage.
The article is here: www.gamersnexus.net/guides/3072-vega-56-vs-vega-64-at-same-clocks-part-2-revisit
Part 1 is here: th-cam.com/video/jHRP1uVBqfY/w-d-xo.html
stop making us 64 owners cry
You guys crying about your Vega 64s. I got an FE and AMD hasn't updated it in three months. It probably performs worse than a 56 in gaming.
Maybe I should've gotten a Titan XP instead. Worse at quite a few workstation loads, but at least it games decently for the price.
I haven't seen my 64 go over 1530MHz, and even then only briefly. I get massive FPS drops; it goes down to 45 in League of Legends!! Granted, it will mostly sit at 110. PUBG drops everywhere, going right down to the 20s (min at the loading area); 38-45 isn't uncommon. No game feels stable. Right now I feel this isn't an improvement over my R9 390 at almost twice the cost!! I wonder if I just lost the silicon lottery, and if I did, at what point is it an RMA issue? Running benchmarks tonight to see if I can sort this before taking it further with MSI.
OK, turning off Chill stopped the frame drops. Much happier.
Steve, in previous comment sections of your videos people commented on you looking tired or drained. We fans just want you to be your normally cheerful and awesome self. We care about this channel and your health. Thank you for being the tech tuber with such a positive and well-tested/researched channel. You have the best environment in the comments and a thankful fan base. Please keep up the great work, and thank you for keeping thermals in mind; I rely on your case overviews to determine the best cooling case. Ty again.
KAT Erwhall
He stated in the previous video that he tried to keep it low because he was in a hotel room; you can't be making a lot of noise in those places. You'll get kicked out.
He sounds normal in this video because obviously he's in his own place...
It would be interesting to follow up on this since it's been a year. The 64 seems to be performing better in games now that Vega drivers have had time to mature.
It's been 2 years and I see no performance uplift on the Vega 64 over the Vega 56. My personal Vega 56 overclocks to 1700MHz easily, while my Vega 64 only reaches up to 1670MHz, and really there's no performance difference. Both cards are undervolted; the 56 consumes around 190W while the 64 is around 210W. I'd say the 56 sometimes pulls out a victory over the 64 and is a really good purchase. I have updated both cards to their latest 2020 drivers and must say I'm impressed with the Vega 56.
CAMTECH how does your Vega 56 only consume 190W? My Gigabyte OC V56 clocks at 1460MHz with the HBM at 930MHz at 180-200W. Core voltage is set to 960mV, but it always uses around 920mV, 50 less than the set goal.
@@batu-740 At 180W my Vega 56 is at around 1620MHz core, 970MHz memory.
At 160W my Vega 64 is at 1550MHz core, 1050MHz memory.
I just told you the max OC I achieved out of my Vega cards, that is 1700 on the 56 and 1670 on the 64.
At 1700MHz my Vega 56 consumes ~240W, and the 64 at 1670MHz consumes ~260W.
I have undervolted my Vega 56 to reach 1620MHz at just 975mV and have never seen power consumption over 190W.
Also, my Vega 64 is stable at 1525MHz at 975mV, and power is just around ~160W.
Watch my video for more details. I have posted a Vega 64 video. If you have any questions, you can message me in the comments section of my video.
@@camtech1190 just got a Vega 64 for my SFF hackintosh coronavirus project. I'll be checking out your channel.
@@mc365mc I have just uploaded a Vega 64 undervolt video. Hope it helps 🙂
Just want to say I really appreciate the editing and work that goes into these videos.
Love the in-depth look into the performance in different scenarios. Any time someone asks about performance I send them these videos.
You just made me love my vega 56 even more.
esoterjc I'm pissed and happy about getting a 56 for MSRP, but I'm still trying to figure out the logic behind it
God damn, so many benchmarks, such hard work done here. Appreciated!
3 cards stacked on top of each other, will you go for 4 next?
mGPU is now limited to 2 GPUs even on the AMD side :D
TheWarTube he means in the thumbnail
Perhaps, if 4 cards will stack at all on that table, lol. Isn't that 1070 Ti rumor going around of late, hmmm. Steve and crew may need to stack way more than 4 cards, lol.
Ha! What are you up to lately, ZDG? Haven't seen you in Discord!
been busy with 3D stuff on this wilfully inadequate FX CPU, lol. 3Delight and most other things do not play well together. Even with 3Delight limited to less than half the FX-8350, other stuff just lags and stutters so badly it's painful at best.
Even if it's too late, the results are very easy to understand. The Vega architecture is hard bandwidth-limited. Vega 64 has about 14% more shaders than Vega 56 (4096 vs. 3584), but the Vega 56's HBM2 runs significantly tighter memory timings. So there is a trade-off: Vega 64 at 56 settings results in basically the same performance, and Vega 56 at 64 settings also results in basically the same performance.
With the Vega 64 BIOS the memory runs at 1.35V; with the Vega 56 BIOS it runs at 1.25V. There is no other way to modify this voltage! Because of the higher voltage you can clock the memory higher on Vega 64, but you lose the tighter memory timings, so again there is a trade-off. If you have a Vega 56 that runs above 950MHz, you have to hit at least 1100MHz with the 64 BIOS flashed to gain performance, because of the memory timings. Of course, you can both clock the memory higher and tighten the memory timings at the same time, which gives the best possible performance per watt. You always increase performance on Vega by increasing bandwidth. You always improve (i.e. lower) power by reducing the core voltage state. You always stabilize the GPU by finding the clock that holds at the desired voltage state, never the other way around.
So, in a nutshell: increase the memory clock and tighten the memory timings; reduce the voltage state; find a stable GPU clock for the desired voltage state. Enjoy more performance at significantly lower power draw.
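For anyone who wants to try that three-step recipe on Linux, here is a minimal sketch using the amdgpu driver's sysfs overclocking interface. Everything in it is an assumption: card0 may not be your Vega, the exact command grammar differs across kernel versions, and the clocks/voltages are placeholders rather than known-stable values.

```python
# Sketch: Vega tuning via Linux amdgpu sysfs (run as root).
# Assumptions: card0 is the Vega and the kernel exposes pp_od_clk_voltage;
# clocks/voltages below are placeholders -- find your own stable values.
from pathlib import Path

DEV = Path("/sys/class/drm/card0/device")

def pp_write(node: str, value: str) -> None:
    (DEV / node).write_text(value)

pp_write("power_dpm_force_performance_level", "manual")
pp_write("pp_od_clk_voltage", "s 7 1622 1000")  # top core P-state: 1622 MHz @ 1000 mV
pp_write("pp_od_clk_voltage", "m 3 945 950")    # top HBM2 P-state: 945 MHz @ 950 mV
pp_write("pp_od_clk_voltage", "c")              # commit the new table
```

Note that this interface does not expose memory timings; tightening those, as the comment describes, took BIOS flashes or PowerPlay-table mods on Windows.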
18:31 _"Representative of at least 1% of them, since that's about the amount of cards we have from Vega 56, versus the initial launch."_
Seriously, *BRUTAL.* 🤣
Hey, thanks GN, I love this content as it's still relevant to me today 😂👍
The cards are bottlenecked by the geometry pipeline. Without the NGG fast path (primitive shaders) working in Vega's drivers, 4 polys per clock just isn't cutting it.
Thing is, AMD promised an increase to around 9 polys per clock, or somewhere around that IIRC. If we had gotten the Vega architecture improvements AMD promised back in January, this could have been a beast, but we didn't.
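A quick back-of-envelope check of that claim; the 4-per-clock and 9-per-clock figures are the commenter's, and the clock is just an example value:

```python
# Peak front-end triangle rate = triangles/clock x core clock. Both Vega 56
# and Vega 64 have the same 4 geometry engines, so at matched clocks their
# geometry ceiling is identical -- the 64's extra CUs can't help here.
def peak_gtris(core_mhz: float, tris_per_clock: float = 4) -> float:
    return core_mhz * 1e6 * tris_per_clock / 1e9

print(f"{peak_gtris(1590):.2f} Gtris/s")                    # ~6.36 for either card
print(f"{peak_gtris(1590, tris_per_clock=9):.2f} Gtris/s")  # ~14.3 with the promised NGG path
```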
It'll get where it needs to be; it's just that it might be in a year's time, which is the problem. I mean, we know to expect FineWine™ so Vega buyers will be happy, but it should've been there at launch.
T51B1 you should've gone with "Work in progress, ALWAYS" FineWine™
notlekrut basically Vega is the new Bulldozer. On paper it's the new super-architecture that destroys everything, and in reality, eh eh eh. A Hawaii/Fiji die shrink to 14nm would have been much better.
My Vega 56 (on water) is running the Vega 64 BIOS at 1050/1650MHz with +15% on the power limit.
Hottest I've seen is 52 degrees, and I LOVE it with my 4K FreeSync screen :D
1100/1700MHz also works, but I keep it a little lower.
Hey, I know this comment was made a year ago, but maybe you will still reply :D I wanted to ask: is your Vega 56 still holding up with the 64 BIOS? I will get a 56 in about a month or so, but I am not sure a Vega 56 will hold 64 clocks for a few years and still go strong, because it isn't so cheap that I could just fry it :D Thank you ~
I've given up on this card and yet here I am. Thanks, Steve.
Steve, always great content..
I think it might also be interesting to compare Vega to Fury clock for clock, because in theory, if the Vega 64 boosts correctly, it should be about 40-50% faster simply because of its clock speed advantage, but that isn't what we've been seeing at all.
Nice work.
I find it strange that we don't see a bigger difference between the 56 and the 64.
I wonder, though, what would happen if you left the core clock low (around 800MHz) and just boosted the HBM speed. It would be interesting to see if the extra compute units of the 64 made a difference in performance when the core is clocked very low. If they did, that may indicate that the memory speed is holding back the 64 vs. the 56.
If not, well, then the problem is somewhere else.
Keep up the good work
Yo Steve, you and the crew: get some rest! Thanks for the informative videos, but TH-cam isn't going anywhere!
In all seriousness, try to get Pantene Shampoo to sponsor this channel :D It is like the perfect place for their advertising.
So given the difference in cores is about 7:8 for the 56 vs. the 64, does that mean the extra cores aren't being used at all by the software (given the matched performance)? If so, could you try running the benchmarks on the Nvidia side with all the clocks matched and see if you get a similar situation at any point with those cards?
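That 7:8 ratio made explicit, using the public shader counts; a tiny sanity check of what perfect scaling would look like:

```python
# 56 CUs vs 64 CUs at 64 stream processors per CU.
v56_sps, v64_sps = 3584, 4096
print(f"ratio: {v56_sps / v64_sps:.3f}")                 # 0.875, i.e. 7:8
print(f"expected uplift: {v64_sps / v56_sps - 1:.1%}")   # ~14.3% if shaders were the limit
# The measured gap at matched clocks is roughly 0-5%, so the extra CUs mostly sit idle.
```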
I think the Vega 56 with some aftermarket cooler like the Morpheus might be a sick combo. I'll get Vega and test that shit as soon as it's under 400€ ;)
Going to explain what is going on with Forza 7 and the GTX 1080 Ti vs. RX Vega 64?
Great analysis, thanks. It leads to two conclusions: at least 512 stream processors are not being used by the V64 in certain scenarios. Maybe all Vega models don't scale well in a number of games and benchmarks, so the number of SPs in use might be even below 3584 across the board in many games. The indicators are the identical performance of the V64 and V56 despite the 56 having 8 CUs (512 SPs) fewer, the gaming performance widely perceived as underwhelming, the underwhelming fps/W gain compared to Polaris, and the relatively low clock-for-clock performance gain over the Fury X despite a brand-new architecture. -Vega remains an enigma, and I continue to think the cards still offer a lot of untapped gaming performance potential... maybe way above the 1080 Ti (?) -> see Forza 7. I really do believe the Vega drivers aren't just bad; they seem to be abysmally bad.-
Second conclusion, after thinking it over: it seems Radeon stream processors have an issue similar to the one AMD's CPU cores had before the Zen architecture: Radeon stream processors (SPs) simply lack per-core throughput compared to Nvidia's CUDA cores. So games that scale perfectly well across all Radeon SPs may run faster than on Nvidia GPUs of the same price range, while games that don't scale well, and that seems to be the majority, are more dependent on strong cores offering high IPC and clock speed. This appears strikingly similar to the performance issues AMD had with its FX CPUs against Intel's Core-i CPUs. That would also answer the question of why Radeon GPUs offer so much more total compute power and yet fail to beat less powerful Nvidia GPUs in gaming. It's the same story over and over again.
Final conclusion: Nvidia is simply better at gaming, and AMD needs to design cores with more "horsepower". Nvidia has much more powerful cores at higher clock speeds, and if needed they can easily produce massively parallel monster GPUs/compute accelerators with many cores, offering consumers both high IPC and many cores (the 1080 Ti and the Volta compute GPU). The reason Vega beats the 1080 Ti in Forza 7 is probably that Vega, for a change, scales well, while Nvidia's 1080 Ti for some reason does not; that may change with a simple driver fix. The Radeon group needs a Zen-like miracle to happen soon so as not to completely lose touch with the currently undisputed market leader.
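To put numbers on the "so much more total compute power" point, here is the standard peak-FP32 arithmetic with the reference boost clocks; peak FLOPS is a ceiling, not a gaming predictor:

```python
# Peak FP32 = 2 ops per clock (FMA) x shader count x clock.
def tflops(shaders: int, clock_mhz: int) -> float:
    return 2 * shaders * clock_mhz * 1e6 / 1e12

for name, shaders, mhz in [
    ("RX Vega 64 (4096 SPs @ 1546 MHz)",   4096, 1546),
    ("RX Vega 56 (3584 SPs @ 1471 MHz)",   3584, 1471),
    ("GTX 1080 Ti (3584 cores @ 1582 MHz)", 3584, 1582),
]:
    print(f"{name}: {tflops(shaders, mhz):.1f} TFLOPS")
# ~12.7 vs ~10.5 vs ~11.3 -- on paper Vega 64 leads the 1080 Ti,
# which is exactly the mismatch the comment is describing.
```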
Power isn't the issue; AMD's shaders are actually pretty dang powerful. The issue is literally everything around them: the front end can't feed that many shaders outside of highly parallel compute work, and the arch is limited to 4 shader engines and 4 geometry engines, so as you scale up, the chip ends up overweight on shaders, since geometry resources don't scale any further. The ROPs could probably also do with an increase, and the thing wants more memory bandwidth.
They could have made this thing in the same config as Hawaii and gotten pretty much the same gaming performance.
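That "overweight on shaders" point is visible in the public GCN configurations; a quick tabulation:

```python
# Stream processors per shader engine across GCN parts. The front end stopped
# scaling after Hawaii while the shader count kept growing.
parts = [
    ("Hawaii (R9 290X)",    2816, 4),
    ("Polaris 10 (RX 580)", 2304, 4),
    ("Fiji (Fury X)",       4096, 4),
    ("Vega 10 (Vega 64)",   4096, 4),
]
for name, sps, engines in parts:
    print(f"{name}: {sps // engines} SPs per shader engine")
# 704, 576, 1024, 1024 -- Fiji and Vega ask each engine to feed roughly
# 1.5-1.8x the shaders that Hawaii/Polaris do.
```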
Hi, great video as always. Can you please share the settings you used to stabilize the card, plus the power consumption and temperature at each setting? Thanks.
Here I am, with a factory 56 on 64 specs, staying cool and below 200W under load with a Morpheus II mounted on it. Got it for 300€ a year ago; now AMD releases the RX 5600 XT with less RAM for the same money.
It will remain a well-scaling middleweight champion for 2020+, and for my 32" 75Hz FreeSync 2K display, it keeps being the perfect choice.
And it shows: when you have a good feeling about something, you have to inform yourself if in doubt, which mostly proves the dumb rumors wrong in retrospect.
It also shows how most people lack even the most basic understanding of economics, and therefore muddle up their facts.
Vega 56 prices are fantastic, so I am confused about who exactly (in the US at least) would buy the newer Navi or Turing cards at the $250-$300 price point. That said, I myself went a bit higher-end and am rocking a Nitro+ 5700 XT running at ~1915MHz game clock at 1.068V and 1900MHz VRAM. The core draws ~170W under full load in those circumstances.
So I wonder what will happen if they release a Vega 32, if the rumors are true. How will that affect things? I can't imagine that card would have half the Vega 64's power. Maybe 70%.
Steve, I'm after your experience...
I have a Vega 56 with a waterblock in my custom loop; the card never gets over 40C.
First, some background.
I can overclock the crap out of it. I have put a Vega 64 BIOS on it, and even tried a Vega 64 LC BIOS to see how that went; no go (it crashed every time I tried to run a benchmark).
So I can get the HBM to 1100 no worries, no artifacts, nothing, but one step higher and it crashes every time. Is this the ceiling that nobody can exceed?
For the core, I have run it as high as 1750, and it clocks to that speed in Afterburner when I aim for it. I have gotten an 11965 graphics score in 3DMark Fire Strike Extreme.
So my question is: when you are overclocking the 56 and it crashes, what does it do?
When I push the card way up over 1700 and it crashes, it shuts down the computer completely. It doesn't stall or bluescreen or anything; it's like the power has been cut, just dead. I haven't had this sort of experience with a card before, and was wondering if something else could be causing the shutdown?
I have a Corsair 750W PSU with the Vega and an i5 6600K. Watching the power draw in the Corsair software, it doesn't get near the max, although it gets up into the high-500W area.
So, just chasing your experiences with the card?
Cheers for your time :)
Wish you would cover allocating system RAM to the Vega 64's HBM via HBCC... what is the most you can give it, and what is a good amount for most users for gaming, video rendering, and...
Great video,
and I can't wait for the Strix review (I know it's delayed due to driver stuff). Not that I want the Strix Vega, but rather I want to see what is going on with the BIOS and whatnot, as I would prefer to be able to switch to an OC BIOS that runs Vega 56 HBM2 at Vega 64 voltage, so no fucked warranty when the card dies and I can't flash the card back to stock.
It also makes me wonder whether Vega 56 performed too similarly to Vega 64 at the same HBM2 clocks, so they "had to" slow the VRAM to create the gaming difference at stock.
Interesting. Vega 64 is a really hard sell any way one slices it. Vega 56 isn't too bad of a card overall (well, I do wish power consumption were a bit lower, though I think it's livable). But given the current market situation with shortages and mining, it doesn't change my mind: I have a strong preference for the GTX series for recommendations and personal use. In time, if Vega 56 prices become more reasonable, I could bring myself around to recommending it or using it in a build; I just don't see that happening anytime soon. Then again, with GTX 1070 Ti rumors floating around, that may not even be in the (graphics) cards...
FreeSync vs. G-Sync should determine your purchase. If you have one or the other, your choice is simple, since buying a new monitor is dumb. If you don't have either, FreeSync is cheaper: the same display has about a $150-$200 delta in price, which should make the RX Vega 56 the right buy.
Bayan Zabihiyan For the first couple of months, sure. Watch his power consumption video on Vega vs. the 1070; you will quickly lose those $200 by paying much higher power bills...
Depends on sales and what screens are in question... I've been monitoring prices (haha, did it again) for my build, and I'm finding I wind up a little better in terms of short- and long-term costs going with Nvidia (so GTX/G-Sync). I am buying a pair of monitors; my 16:10 screens are nice (though "old"), but I want to step up in resolution this round.
Just go through the numbers and then decide whether the running cost can outweigh the price of a G-Sync monitor (over FreeSync) and a 1070 or even 1080.
I did exactly that, and after less than 3 years, buying a 1080 is actually cheaper than getting a Vega 56 for me.
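A rough version of that running-cost math; every input here is an assumption (the extra draw of the Vega system, gaming hours, and the electricity rate), so substitute your own numbers:

```python
extra_watts = 100       # assumed Vega 56 vs GTX 1070 gap under load
hours_per_day = 4       # assumed gaming time
price_per_kwh = 0.30    # assumed rate in EUR; German household rates were near this
years = 3

extra_kwh = extra_watts / 1000 * hours_per_day * 365 * years
print(f"{extra_kwh:.0f} kWh -> {extra_kwh * price_per_kwh:.0f} EUR over {years} years")
# ~438 kWh -> ~131 EUR: in the same ballpark as the FreeSync/G-Sync monitor delta,
# which is why the totals can flip either way.
```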
Honestly, sometimes you glean more from the revisits than the initial reviews... looking forward to watching this one
We learn a lot in the process. You're able to enter it with the experience of the last content piece behind you, so it's easier to figure out what steps may help with the new round of testing.
(Unrelated to the video, yo)
Coffee Lake single-thread in CB R15 leaked by Logan from Tek Syndicate on Instagram/Twitter: ~230 points
Hey Nexus, I was curious whether you could make a TH-cam guide, or even better, have already made a proper video guide on OCing both the Vega 56 & 64? Thanks! (Also, if anyone knows a good one, feel free to comment it here)
Do you guys test with MSAA/SSAA/any kind of backend/ROP-based AA on or off?
I will start saving for the next GTX XXXX Ti GPU instead... maybe next gen will be more 4k capable.
RTX 2080 Ti, $1200
@@owen-ng8oe lol fuck no for $1200. $650 yes
Would the core-count difference be more prominent while rendering in Blender using OpenCL? Or even in viewport rendering?
Could matter in Cycles/Blender rendering. That falls under the 'production' category that we haven't tested.
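For anyone who wants to run that compute test themselves, a rough sketch of driving Cycles on OpenCL from Blender's Python API; it assumes a 2.8x-era build that still ships the OpenCL backend, and the scene and output path are placeholders:

```python
# Render the current scene in Cycles on an OpenCL device (e.g. a Vega).
import bpy

prefs = bpy.context.preferences.addons["cycles"].preferences
prefs.compute_device_type = "OPENCL"
prefs.get_devices()                       # refresh the device list
for dev in prefs.devices:
    dev.use = (dev.type == "OPENCL")      # enable only the OpenCL GPU(s)

bpy.context.scene.cycles.device = "GPU"
bpy.context.scene.render.filepath = "/tmp/vega_compute_test.png"
bpy.ops.render.render(write_still=True)   # wall-clock this for a 56-vs-64 comparison
```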
Vega RX vs. Fury X Clock-for-Clock. Please.
Saving up for that 1080ti...
DavidGX I'm just going to wait for AMD's next line. Worst case, wait for a sale. I don't like Nvidia's practices, like requiring an account to get drivers.
the hexagons in the charts are a bit distracting :P..... maybe tone down their brightness a bit?
Still no proper Vega custom cards out....
November.
Well, partner boards are just quieter and cooler... There is no OC headroom on Vega...
Since you can't OC the card any more than the FE versions, there is no point in getting them out. Those cards are just no good, which makes me the most fucking sad; I fucking waited months for this shit to be released.
The Asus card Jay reviewed had BIOS or driver problems or something, so that wasn't a proper review of it. That, and even if you can't OC it more, you get better temperatures and noise, which is the main reason for AIB cards in the first place.
That's not a BIOS problem. AMD just pushed the Vega cards as far as they go to keep up with the 1070 and 1080, so there's no overclocking headroom no matter the cooling. The main reason for buying third-party cards is overclocking; temps and noise are secondary, at least for me...
So basically the R9 290X/390X was the most efficient in terms of CU count vs. performance?
Man, I really need a GPU upgrade, but these Vega cards are overpriced as hell and I'm not desperate enough to buy Nvidia yet. In my country you can get a 1080 for the price of a Vega 56, and the 1080s are also overpriced.
This is a really bad time to buy a GPU; hopefully things get better in the (near) future. Maybe prices will come down a bit after custom Vega cards start to drop.
So what is holding back the Vega 64 over the 56? There is a nice bump in core count, yet the results keep showing (for the most part) almost no difference at the same clocks.
Any thoughts on that?
Still would love to see some OpenCL (Blender/LuxMark) results. Pure compute test.
Here's the bottom line: buy the best card that's available right now for the budget you have. You can wait, and wait, and wait until you're blue in the face and the cows come home, and in the case of Vega, be massively disappointed. Just buy the best card available now. AMD has not made a compelling product; that's not your fault. Companies don't deserve sympathy. They are not charities. Buy the best product, regardless of who makes it. I feel like we have to employ a lot of "special thinking" to rationalize why people should buy Vega. (This is not a comment on GN's video, just in general.)
It's not "special thinking" to know the Vega 56 is a good card if you can find it at a decent price. Unless being concerned about cost is strange.
Being concerned about cost is totally fine, but right now at least, Vega is next to impossible to find at MSRP (or at all). Why wait around for it to come down when there are better options out there at MSRP that you can buy right now? That's my point at least.
Yes, you can wait and wait, and there's always something better in the pipeline, BUT there ARE times to buy and times to wait. It was time to buy when I was able to snag my R9 290 Gigabyte Windforce card for $259 Canadian back in late 2013. This is one of the times to wait, for the custom-cooled Vegas, which should be rolling into stores in October/November. Vega has features that will blow the 1000 series out of the water, and the game engine makers are just starting to enable them in their games. Forza 7 is just the beginning. Paying $750-$1000 for a 1080 Ti right now will make you kick yourself hard when Vega-optimized games start flooding out, and they WILL flood out very shortly.
Because I can get better options _under_ MSRP right now. I don't have to buy what is, in my opinion, a pretty wimpy offering from Nvidia at the price they want, just because it's cheaper than the other overpriced cards. Your bottom line isn't very practical, TBH.
Yeah right lol
Any evidence of better-binned HBM2 on Vega 64? While the conclusion of whether to recommend the 56 or the 64 may not change, higher expected HBM2 clocks would definitely affect the performance comparison.
You deserve more subscribers!!!
How do you get Vega 56 memory up to 1020MHz, you wizard?!? Mine gets unstable past 900, haha. (Although I don't have your experience and expertise.) I have the Red Devil edition, which allows extra current draw similar to what the Vega 64 allows, but do I still need to flash my 56 with the 64 BIOS for the additional memory speed?
15-ish% more CUs but basically 1% gaming performance scaling. Seems Vega 64 and 56 weren't meant for us gamers. However, in compute-leveraged titles like DiRT, it seems the 64 can almost run down a 1080 Ti. It would seem that if any game were optimized for Vega, it'd be much faster. I don't see that happening, though; the developers will follow the market share.
So AMD could have made a 3072-SP card, still come close to the 4096-SP card, and made it much smaller.
GCN currently doesn't scale; the 4 shader engines on every die bottleneck all those shaders, and we see this in Vega 64 vs. 56: those extra shaders are doing nothing. This card is a compute card. If Navi does multi-die, it might scale better, since they could use some "sweet spot" ratios, i.e., the RX 580 ratio but cut in half to make the die smaller, then put 4 of them on an interposer with HBM2. So 4 dies at 1024 cores and 2 shader engines each makes 8 shader engines with 4096 cores, way better than 4.
OK, question, Steve. If it's true in theory that the 56 clocks higher due to fewer shaders, have you tried flashing a 64 with a 56 BIOS to see if that holds? Or can that even be done?
AMD has to have known this. So why the hell did they clock Vega so inefficiently high, for a +-5% performance boost at +120W? Wtf
$1000 for a Vega 64 here, while a 1080 is $699 (Galax), so it's a bit of a no-brainer which one to buy. Unless you quite enjoy throwing your money away...
How do the max overclocks on both core and memory compare between the V64 and V56, given that the memory and core clock differently between them?
So do AMD cards just score lower in Unigine Heaven (and I guess Valley) compared to Nvidia ones? Trying out the Extreme preset, I got an FPS score of 132.6 with my 980 Ti.
At the 1080p test, I get ~3300 with tessellation disabled and ~2550 with Extreme. As the Vega 56 is around a 1070 in performance and the 980 Ti is also around a 1070, I was expecting a much higher score from the 64 at least.
Maybe the cards are bottlenecked by the geometry units. Perhaps they would pull apart below a certain level of polygons.
Anyway, this suggests Vega has some headroom for better graphics in some ways without losing performance. We will see after optimizations are made in the newer games.
Why didn't you try to lower the memory-bandwidth bottleneck? Why not 1400MHz (core) @ 1000MHz HBM, or an even lower core clock? That would be the only way to really prove that the CUs don't scale. Right now the chance that you are memory-bandwidth starved is still pretty high, since you also reduced the HBM frequency along with the lower core clocks.
Man, where are the partner cards? I'm sitting here ready to go for Vega, but not without a better cooler than the blower one...
Any CrossFire (now mGPU) tests coming? Would like to see how it does, since SLI has had really poor improvements this generation.
I wonder if it's possible to deactivate some compute units. And if it is possible, I wonder what would happen then.
Yo, why do you guys scale your charts so poorly? You're testing games in 4k. You're getting 30-40 FPS. The table doesn't need to go to 150 FPS. It looks more scrunched than Steve's hair.
Still want to see you zap that 56 with a 64 bios, then play with it.
How about clock-for-clock with the Fury X again? That is, if all the Vega features are enabled, like primitive discard.
I really hope it's a driver issue. Otherwise, Vega 20 will suck too and Navi will need a complete architectural change to be any good.
What kind of power consumption on each card?
What is the test between 11:15 and 11:45?
Okay, Superposition! Thank you, Steve!
where's the old desk?
I'm really kind of angry about this. I have a water-cooled 64; it hits 1727 and 1100 mem, Fire Strike 24989, Extreme 12190.
I've seen into the 25000 range in Fire Strike on my V64 Air with an EK block, and it only holds ~1700 core and 1100 mem.
What are those 8 extra CUs doing? 😂
1:32 OH
I like apples
Im more an orange guy
grapes
Now you're just comparing apples to oranges.
umm what's going on "AMD RX Vega 64 Outperforms NVIDIA GTX 1080 Ti By Up To 23% In DX12 Forza 7"
Exactly. How come their engine is so different from everyone else's?
It's the game itself. Once games start utilizing Vega's software and hardware, you will see improvements in performance.
ZankDigiTrash CPU/other bottleneck. At 2K/4K resolution Vega is classic shit
ZankDigiTrash I've seen very different results in the majority of benches compared to that German one.
At 1080p it beats the 1080 Ti, but at 1440p the Vega 64 slips to 18% slower, and so on.
It seems Vega at 1080p shares more of the workload with the CPU, but like I said, the German bench was the only one I have seen where it beats a 1080 Ti above 1080p.
This is not uncommon, as the 1080 Ti's performance shows at higher resolutions.
Might be due to the fact that AMD's cards fully support DX12, so at 1080p the CPU is used a lot more, whereas as soon as the resolution, and thus the GPU workload, increases, the real power of the 1080 Ti comes into play. So no, this isn't AMD being better; it's that Nvidia still doesn't properly support DX12.
So... although I can afford the 64, I might as well just get the 56 and OC it to get about the same performance as a 64 in games? I have a 34-inch FreeSync monitor and I want to get the best Vega I can, but I don't mind saving a few hundred bucks if the 56 will get me the same FPS...
Eric Lafond I don’t know why you would opt for this over the higher end Nvidia cards?
@George I don't know why people would opt for an inferior and more expensive Nvidia card, with drivers that have killed cards in the past, back when AMD was ahead
Get 1080
George Morley because I already invested in a kick-ass FreeSync monitor and paid $500 CAD less than for the equivalent G-Sync one.
How is my Vega 56 struggling so badly with 935MHz on the HBM? At 936 I'm getting artifacts. Any ideas?
You probably were unlucky and got a card with Hynix memory, which doesn't clock nearly as well as Samsung HBM2.
Check what you got with GPU-Z.
@@MerolaC thx for the reply. Just read more about it, and yeah, it's the Hynix. Unlucky me.
What have you got the memory controller (floor) voltage set to?
Set the last two states to at least 1000mV.
Also make sure that your core P-states (6 and 7) are at 1060 and 1070mV or higher. This might help you. Or not... as the Hynix rarely goes much over 935MHz, even with a modded BIOS and/or PowerPlay mods.
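A quick way to confirm which P-states the card is actually holding under load, assuming Linux/amdgpu and that card0 is the Vega (GPU-Z shows the same thing on Windows):

```python
# Print the core and memory DPM tables; the '*' marks the live state.
from pathlib import Path

dev = Path("/sys/class/drm/card0/device")
for node in ("pp_dpm_sclk", "pp_dpm_mclk"):
    print(f"--- {node} ---")
    print((dev / node).read_text())  # e.g. "3: 945Mhz *" = HBM2 holding its top state
```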
Forza 7 shows a difference between the 64 and the 56. And both outperform the GTX 1080 Ti.
SOMEBODY TELL FORD A MUSTANG WILL NEVER COMPARE TO A CHALLENGER
This is mad weird... why does the 64 even exist? Lol
Nonetheless, very interesting...
Looks like you may as well stick with the Vega 56... there is no real gain from choosing the V64 (air-cooled), meh...
*Forza 7 laughs at all the AMD jokes and winks*
Cory Friar At 4K Nvidia comes back in the averages, albeit still losing in the minimums. Something is up with that game's optimization.
deathbat6916 how is it that if AMD wins in a game, it's either made for AMD or it wasn't optimized?
Cory Friar I'm not saying that at all. The Division runs well on AMD and Nvidia, and it's consistent across all resolutions. For some reason this game and Dirt 4 love AMD at the lower resolutions but not so much at the higher resolutions. Nvidia wins at 4K, I'd imagine if it were more consistent on both vendors and AMD still won, that they'd win at 4K too.
Is your Vega 64 card broken in some way? Its performance is so different compared to other sites.
No.
The performance between sites is irrelevant as testing differs, scenes differ, etc. The delta between V64 and V56 is what matters, and that's basically the same as others.
Also, this test was conducted using different testing than our reviews, so you can't compare data like that, anyway.
This is not a comparison of v64 and v56. It is a comparison of them when they are clocked the same. v64 is clocked much higher out of the box, so this testing is unrelated.
I understand. Thanks.
Lars Bolduc Yeah, but the V64 has 14% more units, and this is never seen... it's consistently less than 5% AT BEST.
There must be something wrong in the driver or something else.
This has definitely been seen before. All the time. The R9 290 is almost as good as the R9 290X at the same clocks, the R9 Fury is almost as good as the R9 Fury X at the same clocks. It's just a limitation of AMD's GCN architecture, the number of cores doesn't matter much because of the memory bandwidth limitations.
Why doesn't AMD just make two cards, one with larger bandwidth and the other just higher clocked or with a higher number of cores?
They don't make money on RX Vega, so why bother.
You want to give me that case 😍
Nice
Welp
I wish we had more dullards buying AMD GPUs so Nvidia can keep pushing the performance envelope, and this is coming from an ATI/AMD buyer up until 2015.
The 1000 series is a refresh, and they pushed back the release of Volta for consumers because people didn't buy the competition, so let me know how fantasyland works out for you.
Vega has been one of the most disappointing "high end" (...) cards from AMD in several years.
Hey AMD, I got new slogan for ya to try out on a commercial:
"AMD, always one step behind, ALWAYS"
Then you have a fade-out of a sunset, with a Vega 56 on fire on the ground and some Chinese miner dude that hasn't showered in months dancing naked on top of it...
Only the GPU department, though; Intel appears to have shat their pants, if their reaction is anything to go by.
Ati
Meanwhile Polaris and Ryzen.
MeMad Max it's more like "Work in progress, ALWAYS" FineWine™. Those who buy Radeon cards used get the final product, while those buying new are lab rats, beta testers. Early Access, if you will.
lol that sucks ass and is the worst thing a company could do. Let's take the people that pay out the ass on day 1 and just fuck them with bad tech; years later we'll make it good, just when they are done with it. Why do people think FineWine is so good? They're just fucking over the original user (and also making themselves look like shit) by not having refined drivers from the start.
Vega Forza test
😒 meh whocares
Why do you think it is that ATI/RTG can hit home runs with its midrange cards, since, well, since before the X600-X800 series, but every high-end card since it became AMD has been the worst high-end gaming GPU ($500+) at release, yet the best midrange card on the market 2 years after release? (Unless they stick two GPUs on one board, i.e., the 7990, which was great, but for $1000, well.)
I'm past the disappointment that these were (gaming-wise) a good high-midrange/low-high-end card (the 56) and the worst high-end card (the 64), and that retailers just figured miners or home professionals would buy them all marked up a minimum of $150, save maybe a few hundred cards worldwide that sold at SEP/MSRP. They will be the best midrange cards in a couple of years, though. Heh, it just seems like this keeps happening (not the price hikes, the performance).
They are great at pushing the envelope and then letting Nvidia take what they start, refine it, and make it much better, while they, instead of improving on it, push the envelope again and end up in the same spot: ahead tech-wise but behind in performance, while other companies take their ideas once components become less expensive, mix in their own ideas where AMD's architecture failed, keep the good stuff, and end up way ahead.
We all know Nvidia could drop 3 Volta cards on V105 and V104, destroy Pascal and anything Vega-related, for the same MSRP while costing them less. They could literally hold the V103-V100 back on the consumer side for over 2 years, and AMD has no answer. None.
I waited for the Fury Nano, but it was waaay too expensive. I bought a GTX 970. Then I waited for Vega, but it turned out to be SHIT. Now I'm waiting for the GTX 1070 Ti. If it's not good enough, I will buy a GTX 1080.
3rd :3
456156th
Cut ya hair, dead ends everywhere, shave ya face, and brush ya teeth.
What the hell are dead ends? Hair is dead as soon as it comes out of the scalp.
Nvidia
So... there is no point in buying a Vega, and if you absolutely must, just buy the Vega 56?
lol amd
This pisses me off. GTX fa ta winzz
We know by now that Vega is trash... AMD fanboys, cry and get over it. I hear there is a nice GTX 1070 Ti coming out that will no doubt wreck Vega at $400.
Vega 56 beats the 1070, and Vega 64 is a little bit slower than, sometimes equal to, a 1080 in games, so where is the failure? I don't see one.
If Vega is trash, what is Polaris then? What is the sub-x70 Pascal line then? Potatoes that can't even run Minecraft? Lol, lil bitch, you outta your mind.
1 year later the 64 still beats the 1070 Ti...