AMD RX Vega 56 review: even if you could afford it, you probably shouldn’t buy one

The AMD Radeon RX Vega 56 is the second-tier graphics card in the red team’s top range of GPUs. That would normally make it a price/performance favourite, but that’s not the case.

The Vega 56 was always on a bit of a hiding to nothing. Team Radeon's Sith-inspired rule-of-two chip design means every GPU they create eventually gets released as a pair of gaming graphics cards. So, following on from the AMD RX Vega 64, comes their lower-spec RX Vega 56, struggling to keep up with a GTX 1070 and beaten bloody by Nvidia's GTX 1070 Ti. Poor lamb.

To get the most out of your new graphics card you’re going to need one of the best gaming monitors around.

Much of AMD's Vega struggle can be traced back to the choice of HBM2 as the memory technology used across both the professional and consumer variants of the Vega design. If AMD had simply opted for traditional GDDR5/X memory, Vega would have been here a lot sooner and, importantly, a lot cheaper. Unfortunately, swapping out the memory technology seems to be no easy task, or we might have seen lower-spec cards by now.

Bless the ol’ AMD Vega architecture, it’s had a bit of a hard time. Since it was first announced, in hushed whispers, as the first high-end Radeon GPU in an age, it’s been hailed both as the saviour of AMD’s graphics division and as the architecture that might damn it. In all honesty, it’s neither.

The lack of stock and confusion over pricing initially left a bit of a nasty taste in the mouth, with accusations flying around that the original $399 price given to reviewers for the RX Vega 56 was only a short-term deal.

That's almost a moot point now, with the nightmare of cryptocurrency miners grabbing every available GPU. Both Vega cards are now almost as rare as a reasoned Twitter debate, keeping prices artificially inflated at $529 (£463) even after the crypto market has calmed down a touch.

Click on the jump links to get to the section double-quick.

AMD Radeon RX Vega 56 benchmarks

AMD Radeon RX Vega 56 performance

AMD Radeon RX Vega 56 verdict

AMD Radeon RX Vega 56 specs


There’s not a huge difference between the GPUs at the heart of the top-end RX Vega 64 and this lower-caste RX Vega 56. As the name suggests, the second-tier card has 56 compute units (CUs) compared with the 64 CU count of the higher-spec card. That means AMD has jammed 3,584 GCN cores into the RX Vega 56 instead of the previous card’s 4,096 GCN cores.
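
If you want to sanity-check those numbers: GCN allots a fixed 64 stream processors to each compute unit, so the core counts fall straight out of the CU counts. A quick illustrative sketch (Python, purely for the arithmetic):

```python
# GCN packs a fixed 64 stream processors into every compute unit (CU),
# so the total core count is just the CU count multiplied by 64.
GCN_CORES_PER_CU = 64

for name, cus in [("RX Vega 64", 64), ("RX Vega 56", 56)]:
    cores = cus * GCN_CORES_PER_CU
    print(f"{name}: {cus} CUs x {GCN_CORES_PER_CU} = {cores:,} GCN cores")

# RX Vega 64: 64 CUs x 64 = 4,096 GCN cores
# RX Vega 56: 56 CUs x 64 = 3,584 GCN cores
```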

Other than that it’s purely a case of some tweaked power settings and lower clockspeeds - everything else is identical, down to the cooling design. That means we’re still talking about 8GB of HBM2 video memory and a 2,048-bit bus. It’s the same on-die configuration that makes the Vega 10 GPU a 486mm² mammoth (the GP104 used in the GTX 1080 is just 314mm² for comparison) with 12.5bn transistors inside it.
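
That 2,048-bit bus is also what gives Vega its headline memory bandwidth. As a rough illustration - note the 1.6Gbps effective per-pin rate is the RX Vega 56's quoted HBM2 spec, an assumption on our part rather than a figure from the review above:

```python
# Peak memory bandwidth = (bus width in bits / 8 bits-per-byte) * per-pin data rate.
bus_width_bits = 2048   # two stacks of HBM2, as described above
data_rate_gbps = 1.6    # RX Vega 56's quoted effective rate per pin (assumed spec)

bandwidth_gbytes_per_s = bus_width_bits / 8 * data_rate_gbps
print(f"Peak memory bandwidth: ~{bandwidth_gbytes_per_s:.0f} GB/s")  # ~410 GB/s
```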

There’s no liquid-chilled version of the RX Vega 56, nor are there any super-sexeh Silver Shroud variants. From the looks of things, AMD only made maybe five of either anyway, and I think those just went out to friends and family…

In terms of the actual Vega GPU architecture, we’ve gone into detail about what AMD’s engineers have been trying to create in our RX Vega 64 review. Essentially, they’re calling it their broadest architectural update in five years - stretching back to when AMD unleashed the original Graphics Core Next (GCN) design - and it brings with it some interesting new features.

One of those is the Infinity Fabric interconnect, binding the GPU to all the other components in the chip and making it a more modular architecture than previous generations. It’s also the key enabler for multi-GPU packages in the future. Then there’s the new compute unit design and AMD’s high-bandwidth cache controller (HBCC), both offering the promise of the tech improving as more developers start taking advantage of it.

AMD Radeon RX Vega 56 benchmarks

[Benchmark charts: DirectX 12 benchmarking]

AMD Radeon RX Vega 56 performance

The big challenge for the RX Vega 56 - outside of trying to magic up affordable stock for people to actually buy - is to push past the similarly priced Nvidia competition. Just as the RX Vega 64 is priced to go head-to-head with the GTX 1080, the RX Vega 56 was released into the wild with the GTX 1070 in its sights. Unfortunately Nvidia hit back with the similarly unavailable GTX 1070 Ti...

Like the RX Vega 64’s attempts to overthrow the GTX 1080, it’s kinda difficult to call an outright winner. As we said about the flagship Vega GPU, this is an architecture that seems to have been designed primarily for a gaming future that hasn’t yet arrived. As such, it’s rather lacklustre in its legacy gaming performance.

With games built using the last-gen DirectX 11 API, the second-class Vega is left trailing in the wake of the GTX 1070, let alone the GTX 1070 Ti. But when you start to bring in tests based around the newer DirectX 12 or Vulkan APIs, Vega’s modern architectural design allows it to take the lead.

Unfortunately, despite DX11 now being very much a legacy API - DX12 is over two years old - the majority of PC games are still being released using the older system. And that means, for the majority of PC games that come out over the next six months at least, the GTX 1070 is likely to retain its performance advantage.

Granted, Vega’s performance improvements in DX12 and Vulkan are encouraging for how it’ll fare down the line, as they become the dominant APIs. But right now, anyone lucky enough to find an RX Vega 56 for a decent price is still going to be paying the same money for a card that struggles to beat a smaller, more efficient, year-old GPU in pretty much every game in their Steam library.

That’s not the only competition for the RX Vega 56, however, as there’s also the small matter of fratricide. As is its wont, AMD hasn't made sweeping changes to the core configuration of the two Vega variants, despite the $100 difference in their respective SEPs. Essentially, the RX Vega 56 is simply operating with 12.5% fewer cores, yet in performance terms it’s only ever around 7-10% slower.
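
To put that trade-off into rough numbers, here's an illustrative back-of-the-envelope sketch. The $399/$499 SEPs and the 7-10% performance gap are this review's figures; taking the midpoint of that gap is our assumption:

```python
# Back-of-the-envelope price/performance comparison at suggested etail pricing (SEP).
vega64_price, vega56_price = 499, 399    # USD SEPs, $100 apart as noted above
core_deficit = 1 - 3584 / 4096           # RX Vega 56 has 12.5% fewer GCN cores
perf_deficit = 0.085                     # midpoint of the observed 7-10% gap

rel_perf = 1 - perf_deficit              # Vega 56 performance relative to Vega 64
rel_price = vega56_price / vega64_price  # ~80% of the flagship's price
print(f"Core deficit: {core_deficit:.1%}, performance deficit: {perf_deficit:.1%}")
print(f"RX Vega 56 offers ~{rel_perf / rel_price:.2f}x the Vega 64's performance-per-dollar")
```

On those numbers the cut-down card works out around 14% better value than the flagship, which is exactly why the fratricide framing fits.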

That means there’s a good chance gamers will end up going for this mildly chopped-down Vega instead of the much more expensive card. Or there would be, if the cards were consistently available at reasonable prices. We’ve seen etailers with RX Vega 64 SKUs available for around the same price as - and sometimes less than - that shop’s cheapest RX Vega 56.

Yeah, Vega’s weird.

AMD Radeon RX Vega 56 verdict

The overall design of the Vega architecture seems to have been about laying down a marker, defining a branching point for future generations of AMD's GPU technology. It's almost sacrificed legacy gaming performance for the promise of future applications of its feature set. The little extras the Vega architecture has baked into it look like they could be genuinely game-changing… but only if developers actually end up taking advantage of them. And that's a pretty big if.

If it were guaranteed that the HBCC and Rapid Packed Math shenanigans were going to be employed across the board, and not just by AMD best-buds Bethesda, then the new Radeon tech would definitely be the one to go for over the old-school Nvidia design. But it’s not a sure thing, and we don’t really know what performance improvements these Vega features might offer either.

Wolfenstein II: The New Colossus has shown some impressive Vega performance in the face of the GTX 1080 competition, with AMD's traditional speed wins in the low-level Vulkan API. But Vega is showing greater gains there than the Polaris architecture, which would indicate there is something to the Rapid Packed Math stuff. We're going to be doing some more investigating on that in the future.

If you were spending around $400 on a graphics card today - even if there were RX Vega 56s available at their suggested price - it would still be difficult to make the Radeon recommendation. It’s the more advanced architecture, but in raw performance terms the smaller, slightly cheaper, more efficient Nvidia GPU is likely to get you higher frame rates in more of the games you’re playing at the moment.

If AMD could have priced Vega more aggressively against the still-strong, two-year-old Nvidia Pascal architecture, then it would have had a better chance of taking the market by storm, but unfortunately the price of HBM2 is a big sticking point. And with AMD reportedly losing $100 on each card sold at the suggested price, it looks like it's done all it can on that front.

I do like what AMD are trying to do with Vega but, all things being equal, sacrificing current performance for the chance of higher frame rates in a few future games is likely to be too much of a stretch for most PC gamers.

PCGamesN verdict: 6/10

Comments
mikeweatherford7 - 9 months ago

These cards respond extremely well to undervolting. The difference can be dramatic: higher stable clocks, at much lower temps and lower fan levels. My reference Vega 56 can do 1,550MHz (actual full 3D load clock) at 1,040mV, with a 950MHz HBM clock. At these settings it's significantly faster than a 1070, on the heels of a 1080, at reasonable temps and noise levels. This result seems to be typical among Vega 56 owners.

Alucard0Reborn - 2 months ago

I know this is pretty old, but you're comparing this card to the 1080 Ti and I don't think that's fair, since its Nvidia equivalent is the 1070. Is there a reason you did that? Are you biased towards Nvidia? Or were you just comparing them because they came out around the same time? Just because a card comes out around the same time as another card doesn't mean they're competing against each other in that specific time period.

Dave James - 2 months ago

If you're referring to the video, then we were comparing the two reference card designs because they represent the pinnacle of the two companies' gaming GPU ranges, but if you're referring to the review, I'm not sure what you mean.

The benchmarks are all comparing the RX Vega 64 and 56 to the GTX 1070 and GTX 1080. We used the GTX 1080 for comparison as AMD themselves said that was the card they were targeting for the top-end Vega.

There's nothing in the conclusion comparing it to the GTX 1080 Ti at all. I can assure you there's no specific bias here; in fact, ours was one of the most positive Vega 64 reviews posted at launch.

Rumple4skin - 2 months ago

Well, I'm running the RX 580 in Wolfenstein II, everything maxed, Mein Leben profile, and getting a constant 60fps. I believe this card is considerably weaker than the Vega, so I call bullshit that the Vega is only comparable to the 1070 - the RX 580 is more on par with the 1060-1070.

DuoBlaze - 9 months ago

Those total power draw numbers are not at all like what I've observed. I've had a Vega 64 liquid card in my PC, with all-default WattMan settings, for weeks. During peak utilization (spikes, meaning above average) I still have not exceeded 475W total power draw from my entire PC. With my RX 480 that never exceeded 400W. Either that 345W number is wrong or I've got some major headroom for overclocking.

As it goes right now, using the balanced profile, the radiator fan is almost inaudible while gaming - barely beyond the noise level of my chassis fans - and the card runs much cooler than my air-cooled RX 480 did.

I feel like reviewers are not being accurate with power draw, noise and heat numbers.

Dave James - 9 months ago

I may be wrong here, but I think you might be getting confused between total platform power draw and the rated thermal design point (TDP) of the cards.

The TDP of your liquid-chilled Vega 64 is 345W, essentially meaning its cooling has been designed to dissipate that level of energy use under load.

The total platform power draw shown in the benchmarks above represents the amount of energy being drawn from the wall into our test rig's PSU. That was taken using a discrete energy meter while running Battlefield 4 at max settings at 1440p; the same method I've used for all of my GPU tests for a good few generations of graphics cards.

The temperature readings are taken from Afterburner, which we leave running throughout the entirety of our testing suite to catch the maximum temperature, as well as long-term GPU frequencies.

Anakhoresis - 9 months ago

Well, Tom's Hardware did report that they recorded peaks of 385W from the standard air-cooled card (just the card, not including the rest of the system components). Though their monitoring equipment is probably more sensitive to peaks.

project17 - 9 months ago

I must say the RX Vega 56 is an amazing card as of yesterday, as they had a driver problem with web browsing that has now been fixed. The gaming side is very good. I have a 1440p monitor with FreeSync at 75Hz refresh, so I cap the card at that, and have had all games running at high settings. Only Wildlands drops to about 50fps, but FreeSync takes care of that - I would not have noticed if I did not have Fraps running. A very good comeback from AMD.

oscarjung - 2 months ago

Well, as you were speculating, in Far Cry 5 Vega killed Nvidia. Your test results are all over the place, with the 1070 besting the 1080 and the Vega 64 losing to the 56 - I think you should recheck your rig and setup. For $400 I would take the Vega 56 over the 1070: you can OC it to trail the 1080, undervolt it, and it has FreeSync and better future potential and lifetime, since it was not launched in early 2016.

Galactic Squid - 2 months ago

Anti-AMD much? Reading this as someone with both cards, and in an unbiased manner, your agenda here is clear. Maybe you are just looking for clicks, but you picked the few benchmarks where the 1070 wins, and you HAD to include Time Spy or people would complain. I own both cards, and the Vega 56 is better than my 1070, if only marginally, in nearly every benchmark or game that I've run.

CMD_Storm - 2 months ago

So many things wrong here. First, AMD wouldn't have been better off with GDDR5; they couldn't even have made a card like that. HBM was needed because, in the end, the chip was so power hungry - it literally would have used 50% more power. Next, the guys who buy these are not console gamers who just compare stock settings, so that's incredibly misleading. Basically, for 99% of games a Vega 56 is a Vega 64 when you mod it. It blows the doors off 1070s and even beats out the 1070 Ti for the most part. It's basically neck and neck with an 1,800MHz 1080 - faster than stock cards, but slower than things like EVGA's or Asus's top picks. And then you play DX12 and realize it smokes everything but a Ti, and guess what, DX11 or 10 is not the future. For some reason people forget, or are blind to the fact, that it will age 10x better, and in a climate where we don't know how much GPUs will cost in a year, that's important. While studios design in DX12 and Vulkan, and realize that AMD is now back, at least with CPUs, they are getting a ton more love in drivers and optimizations. Before Ryzen they literally didn't exist to gaming companies, and 15% increases in performance are commonplace when games optimize for certain tech. AMD should get massive credit for starting to make HBM the new high-end normal. Guys like you demonized the decision so badly, and yet we will all benefit from it as Nvidia now has to issue its next line with it.

The other major thing against it was price, but guess what: it's neck and neck now with its Nvidia opponent, and getting FreeSync running at 100-144Hz is literally 50% the cost of G-Sync. Basically, if you want a 27-32 inch gaming monitor, it's $400-500 for a FreeSync one and $850 to $1,100 for a G-Sync one. So the overall package actually goes to AMD if a person is building a brand new rig. This topic was so much more nuanced and complex than you tech writers who were disappointed [and mostly Nvidia guys] made it out to be. As it sits now...

If you are upgrading a machine and have a nice 1440p monitor with G-Sync, it's a no-brainer to go Nvidia. Or if you want 4K at 60fps it's still Nvidia, or if you don't care and will use a capped fps. But TONS of guys are going from 1080p at 60Hz to 1440p at 144Hz+ because the tech has just now caught up to it, and as I said, AMD is the best overall bet right now. 1440p is simply stunning on a 32 inch 144Hz monitor, and nothing before the Vegas or the 1080 could really handle it.

Also, since Nvidia is such a market rapist and purely profit-driven at every turn, you know the next line will somehow be released in a torturous way where most guys will have to wait a year from now, and the 1080s will plummet to $300-ish because all they are good for is gaming and they are hyper-inflated now. The Vega will still be worth 75% of its value because it will still be in short supply and will ALWAYS mine like a banshee. So it's wise to buy a 56 until Nvidia's mid-grade card that can run 1440p arrives, then sell your 56 for $450-500 and call it an even trade.
