Date: August 14, 2017
Author(s): Rob Williams
After months and months of anticipation, AMD’s RX Vega series has arrived. The first model out-of-the-gate is the RX Vega 64, going up against the GTX 1080 in gaming. In lieu of a look at gaming to start our Vega coverage, we decided to go the workstation route – and we’re glad we did. Prepare yourself to be decently surprised.
April 30, 2018 Addendum: Updated performance can be found here.
When most people receive the latest top-end graphics card from AMD or NVIDIA, they get straight to testing its gaming performance. Me? Well, I’m not most people. I am hoping that there is some method to my madness, though. Hear me out.
AMD’s Radeon RX Vega is one of the worst-kept secrets in the history of the industry. Well before embargo-abiding press could release details, leakers around the globe told us everything we needed to know. That includes its performance relative to NVIDIA’s GeForce GTX 1080, the fact that it runs hotter and draws more power than the competition, and that overall, it’s not the launch AMD was hoping for.
As we were allowed to reveal a couple of weeks ago, the $499 USD RX Vega 64 is designed to go up against NVIDIA’s GeForce GTX 1080, also priced at $499 USD. At the moment, though, the least expensive GTX 1080 I can find on Amazon is ~$539 USD. It’s not expected that Vega 64 (and 56) will launch with their SRPs intact (current listings have had Vega at well over $1000), so the sad reality is, you’re going to be paying a premium on any GPU solution right now, unless you happen to get lucky.
Back to the story at hand: the myriad leaks surrounding RX Vega made it sound like Vega 64 would never actually be able to beat the GTX 1080 at gaming, so I decided to take a look at this card first from a workstation / compute perspective, to see if any unexpected advantages could be found. Perhaps surprisingly, there are indeed some, and a few of them are downright impressive.
Note: You can check out our preliminary gaming results here, more will come later when we get the full article prepared.
Admittedly, another reason I decided to take a look at compute performance first is that our workstation test rig was still hooked up, wrapping up testing conducted since I posted my look at AMD’s Radeon Pro WX 3100. In that review, six GPUs were tested in total; for this one, that’s been bumped to ten.
This is going to be the first of at least three articles surrounding the workstation and gaming performance of RX Vega. This article sees the Vega 64 tackle the usual gauntlet of workstation tests, whereas the next will take an in-depth look at gaming performance in our apples-to-apples tests. For those wanting quick and dirty gaming results, I have published a couple here. You can have a look at the reviewer’s kit we received right here.
|AMD Radeon Series||Cores||Core Base MHz||Core Boost MHz||FP32 (TFLOPS)||FP16 (TFLOPS)||Memory||Bandwidth||TDP|
|Radeon RX VEGA 64 LCE||4096||1406||1677||13.7||27.5||8192 MB||484 GB/s||345W|
|Radeon RX VEGA 64||4096||1247||1546||12.66||25.3||8192 MB||484 GB/s||295W|
|Radeon RX VEGA 56||3584||1156||1471||10.5||21||8192 MB||410 GB/s||210W|
|Radeon R9 Fury X||4096||1050||–||8.6||–||4096 MB||512 GB/s||275W|
|Radeon R9 Fury||3584||1000||–||7.16||–||4096 MB||512 GB/s||275W|
|Radeon R9 Nano||4096||1000||–||8.19||–||4096 MB||512 GB/s||175W|
|Radeon RX 580||2304||1257||1340||6.17||–||8192 MB||256 GB/s||185W|
|Radeon RX 480||2304||1120||1266||5.83||–||8192 MB||256 GB/s||150W|
|Radeon RX 570||2048||1168||1244||5.1||–||4096 MB||224 GB/s||150W|
|Radeon RX 470||2048||926||1206||4.94||–||4096 MB||211 GB/s||120W|
|Radeon RX 560||1024||1175||1275||2.61||–||4096 MB||112 GB/s||80W|
|Radeon RX 460||896||1090||1200||2.15||–||4096 MB||112 GB/s||75W|
|Radeon RX 550||512||1100||1183||1.21||–||4096 MB||112 GB/s||50W|
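The FP32 numbers in the table above follow directly from shader count and boost clock: each shader can retire one fused multiply-add (two FLOPs) per cycle, and Vega’s packed FP16 mode doubles that rate. A quick sanity check in Python:

```python
def fp32_tflops(shaders: int, boost_mhz: int) -> float:
    """Peak FP32 throughput: shaders x clock x 2 FLOPs (one FMA) per cycle."""
    return shaders * boost_mhz * 2 / 1e6  # shaders * MHz -> TFLOPS

# RX Vega 64: 4096 shaders at a 1546 MHz boost clock
fp32 = fp32_tflops(4096, 1546)
print(f"FP32: {fp32:.2f} TFLOPS")      # ~12.66, matching the table
print(f"FP16: {2 * fp32:.1f} TFLOPS")  # Vega's packed FP16 runs at 2x FP32
```

The same arithmetic reproduces the liquid-cooled edition’s 13.7 TFLOPS from its 1677 MHz boost clock.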
I mentioned at the outset that Vega’s higher-than-desired power consumption hasn’t been a secret, and the rated spec of 295W TDP for Vega 64 confirms that (RX 580 is 185W, by contrast). This unfortunately leads me to the first major complaint about Vega 64, or at least this particular Vega 64. The “reference” cooler (if you want to call it that) looks nice, but it’s far from being an ideal solution. The card will run hot, as 295W would suggest, but to the point where throttling can take place.
The existence of a liquid-cooled version of Vega 64 highlights the fact that the GPU itself can achieve far greater performance than this RX 480-style cooler allows. Sometimes, AMD and NVIDIA release new GPUs to reviewers without a reference version; companies like ASUS, GIGABYTE, MSI, PowerColor, and others will send us custom designs. The GTX 1050 and 1050 Ti were sampled like this, and I truly feel Vega 64 should have been, too.
The fact of the matter is, reviewers can’t show Vega 64 or 56 in the light they should actually be shown. Some reviewers may have received the liquid-cooled version of the card, and I’d expect to see huge gains in performance there. And, better still, the card is sure to run cooler, with reduced chance of ever throttling. If these reference designs mimicked an ASUS STRIX, with three large fans, I am sure that would have aided in performance as well.
The moral of the story is that at some point soon, we need to look at vendor cards once they begin to hit the market; hopefully by then, drivers will be even better optimized, giving us a nice uptick in performance. Still, this acts as a good first test – just don’t treat the performance as gospel. Flat out, I would not recommend buying a Vega card with this reference cooler. It can’t handle the heat, so throttling can occur, and noise levels will rise.
Alright, from the get-go, a couple of things need to be made clear. The performance seen on the RX Vega 64 is not meant to be representative of AMD’s workstation and compute performance on Vega in general. The upcoming WX 9100 and the preexisting Vega Frontier Edition will be equipped with Pro-optimized drivers, so while some of that performance will be reflected here, some won’t be.
AMD didn’t send out Vega FE review samples, so this is my very first look at Vega in this context. I will be able to add some Vega FE results to our charts in the weeks ahead, nonetheless, and likewise, a Quadro P5000 is currently en route to help complete the overall look at current WS GPU options.
On the following pages, I’ll be putting AMD’s Radeon RX Vega 64 through a gauntlet of real-world and synthetic tests, utilizing apps from Autodesk, Adobe, SPEC, SiSoftware, and a handful of others (including light gaming tests for good measure).
All tests are run at least twice to produce an accurate result, and if for some reason an odd result creeps up, I do a third run. In the case of this particular review, no tests had to go that route, as most of the benchmarks are very good at delivering similar results with each repeated run.
The Windows 10 Pro (Creators Update) install used for testing has a couple of things disabled: User Account Control, Firewall, Search Indexer, OneDrive, and all notifications. During the install, everything on the Customize screen was disabled. All testing is conducted at 2560×1440 resolution (with the exception of 4K 3ds Max testing), with driver Vsync options left default.
Techgage’s workstation GPU test PC is built to be reflective of a high-end desktop that rules out as much as it can of bottlenecks. Intel’s top-end Core i9-7900X is used here, giving us a ton of breathing room on the CPU side. Kingston’s super-fast KC1000 M.2 SSD and 64GB of its HyperX FURY DRAM gives us the same breathing room on the storage and memory side.
Here’s the full list of specs:
|Techgage Workstation Test System|
|Processor||Intel Core i9-7900X (10-core; 3.3GHz)|
|Motherboard||GIGABYTE X299 AORUS Gaming 7|
|Memory||Kingston HyperX FURY (4x16GB; DDR4-2666 16-18-18)|
|Graphics||AMD Radeon RX Vega 64 8GB (Radeon 17.30.1051 Beta)|
AMD Radeon Pro WX 7100 8GB (Radeon 17.Q2.1)
AMD Radeon Pro WX 5100 8GB (Radeon 17.Q2.1)
AMD Radeon Pro WX 4100 4GB (Radeon 17.Q2.1)
AMD Radeon Pro WX 3100 4GB (Radeon 17.Q2.1)
NVIDIA TITAN Xp 12GB (GeForce 385.12)
NVIDIA GeForce GTX 1080 Ti 11GB (GeForce 385.12)
NVIDIA Quadro P6000 24GB (Quadro 385.12)
NVIDIA Quadro P4000 8GB (Quadro 384.76)
NVIDIA Quadro P2000 4GB (Quadro 384.76)
|Storage||Kingston KC1000 960GB M.2 SSD|
|Power Supply||Corsair 80 Plus Gold AX1200|
|Chassis||Corsair Carbide 600C Inverted Full-Tower|
|Cooling||Corsair Hydro H100i V2 AIO Liquid Cooler|
|Et cetera||Windows 10 Pro (64-bit; build 15063)|
|For an in-depth pictorial look at this build, head here.|
The benchmark results are categorized and spread across the next six pages. On page 2, AMD’s ProRender plugin is used in Autodesk’s 3ds Max 2017 to render two scenes, while two de facto benchmarking tools, as well as a newbie, wrap it up: Cinebench, LuxMark, and V-Ray Benchmark. Page 3 is home to an encode and CAD test, thanks to Adobe’s Premiere Pro CC 2017 and two 4K projects, and also Autodesk’s AutoCAD 2016, exercised through the use of the excellent Cadalyst benchmark.
SPEC produces so many benchmarks worthy of inclusion in our workstation GPU content, that it’s earned itself its own page. So on page 4, SPECviewperf helps us gain an understanding of viewport performance across 9 different applications. SPECapc 3ds Max 2015 and Maya 2012 finish things up with exhaustive tests in their namesake Autodesk products.
Like SPEC, Sandra’s test suite is large, so page 5 is dedicated to four of its tests: Cryptography, Financial Analysis, Scientific Analysis, as well as memory bandwidth. Two quick and dirty gaming benchmarks are featured on page 6: Futuremark’s 3DMark, and Unigine’s Superposition. Finally, the last page includes power results (sadly, no temperatures this go around), as well as the final thoughts.
So without further ado, let’s get this train moving.
The best use case for GPUs in rendering is using them for realistic lighting, something achieved through AMD’s ProRender and NVIDIA’s Iray ray tracing renderers. ProRender can make use of NVIDIA’s hardware (with a warning about a lack of optimization), but the reverse isn’t true. Since it’s fair game as an OpenCL renderer, I test both AMD and NVIDIA cards with ProRender.
This testing makes use of Autodesk 3ds Max 2017, which is the most recent version ProRender supports (the same applies to Iray). Both scenes used for testing render with 500 iterations, giving us a nice looking result, but not a production one (that’d require at least 2,500). The scenes include an AMD autoshow, and a cool dragon; both of which can be snagged for free from the ProRender GitHub page.
The lack of professional optimization rears its ugly head here. Technically, the Vega 64 should stomp the WX 7100, but due to what I assume is a lack of optimizations, that doesn’t happen. The real winner of this entire lineup is, oddly enough, the lowbie Quadro P2000. It delivers middle-of-the-road performance for a mere $420 USD. The WX 7100 makes notable gains beyond that, earning its $620 USD price tag.
To compare our collection of workstation graphics cards across other renderers, Cinebench R15(.038), LuxMark 3.1, and V-Ray Benchmark are used. Cinebench is a good gauge of OpenGL performance in Cinema4D, whereas LuxMark tests the cards’ prowess for OpenCL. LuxMark is also used for gauging peak power draw (found on the final page), and while it doesn’t push the GPU as hard as a gaming benchmark does, it offers a realistic look at rendering performance-per-watt. Chaos’ V-Ray Benchmark is a brand-new entrant, acting like the others to give us a performance gauge based on its namesake rendering engine.
Well, well, well. Would you look at this? AMD’s Radeon RX Vega 64 falls just behind the Radeon Pro WX 7100 in Cinebench, but it soars past the same GPU in both LuxMark and V-Ray. In fact, I should stress that your eyes don’t deceive you: Vega 64 really does beat the entire lineup in LuxMark. That includes beating out the $5,000 Quadro P6000.
What’s really interesting about this result is the fact that this kind of domination couldn’t be seen in ray tracing via AMD’s ProRender 3ds Max test, proving that if you don’t see a performance boost with one renderer, you may with another.
LuxMark is admittedly a bit of a niche test, but V-Ray sure isn’t. Here, the Vega 64 only fell behind the Quadro P6000 and TITAN Xp. Gaming on Vega might be rough around the edges, but clearly, some compute tests fare much better.
October 5, 2017 Addendum: Our V-Ray results have been proven incorrect here, and have been retested. Please refer to here.
To test the accelerated encoding perks of different GPUs, Adobe’s Premiere Pro CC 2017 is used. For production, the best use of GPUs is to render the countless available filters, and to accelerate scaling down to other resolutions. Encoding one 1080p video to another might not exhibit much of a speed-up (or any at all) on the GPU, but 4K to 1080p can often benefit.
Two projects help test two different scenarios here. The first is a 1080p project that includes a bunch of filters, while the second makes use of the open source movie Tears of Steel to resize the 3840×2160 release (a 4096×2160 version is also available) down to 1080p.
Media encoding has been NVIDIA’s (and Intel’s, for what it’s worth) forte, and that’s easily seen when you compare the lower-end Quadros to even the high-end WX 7100. Fortunately, Vega suffers no questionable performance: it keeps up with NVIDIA in one encode, and comes very close in the other.
Some of SPEC’s benchmarks on the following page take a look at CAD performance, but AutoCAD is left out. So with the help of Cadalyst, a benchmark produced by the people at the website of the same name, both 2D and 3D performance is tested (along with I/O and CPU, but that isn’t needed here).
We saw the RX Vega 64 beat out AMD’s own Pro-targeted WX series (Polaris-based, only) in Adobe Premiere Pro, and here, with AutoCAD, we see improved performance yet again. Clearly, AutoCAD is NVIDIA’s domain, which is probably why I never hear AMD utter the name, but all things considered, Vega 64 performs very well here.
When it comes to benchmarking hardware for serious use cases, there is no place better to look than SPEC. I’ve dubbed the folks there as “the masters of benchmarking”, as each one of SPEC’s tools is meticulously crafted by professionals to deliver results as relevant and accurate as possible – a goal shared by us at Techgage.
Three SPEC suites are used for testing here, starting with SPECviewperf, for viewport performance across nine applications. SPECapc 3ds Max 2015 and Maya 2012 finish up the page to help us gauge performance in the respective Autodesk applications. I used to include SPECwpc, but realized it’s best left for comparing one machine to another, not one component to another.
There are a lot of results to go through here, so let’s go slow. In CATIA, AMD’s Vega 64 outperformed the Quadro P2000, while it slid in just behind the ~$280 WX 4100 in SolidWorks. It’s worth noting that as far as gaming GPUs in workstation scenarios go, Vega 64 managed to beat out the GTX 1080 Ti – not bad. The same can be said about AMD’s latest top-end card in Siemens NX – a gain of 2.5x is no joke.
Vega 64’s solid compute performance carries over to energy, medical, and Showcase, performing on par with NVIDIA’s GTX 1080 Ti in the first two, and exceeding its performance in the latter. So how about 3ds Max and Maya? There, Vega 64 delivers performance that far exceeds the more expensive WX 7100, so that’s what I’d call a good thing.
SPECviewperf showed us that the Vega 64 performed better than the WX 7100 in 3ds Max viewport performance, and according to SPECapc, overall use will see the same kind of gains. NVIDIA rules this particular roost, but Vega 64 puts up a good fight. It suffers compared to the Quadro P4000 at 4K, but that GPU costs $300 more, so it pretty much manages to scale in the end.
Uhh… let’s just move on, shall we?
OK, in all seriousness, I’d blame this performance loss to unoptimized drivers. I regret not having Vega FE listed here for the sake of comparison, but that is something that will be remedied in the weeks ahead.
On the previous page, I mentioned that SPEC is an organization that crafts some of the best, most comprehensive benchmarks going, and in a similar vein, I can compliment SiSoftware. This is a company that thrives on offering support for certain technologies before those technologies are even available to the consumer. In that regard, its Sandra benchmark might seem a little bleeding-edge, but at the same time, its tests are established, refined, and accurate across multiple runs.
While Sandra offers a huge number of benchmarks, we focus on just four of its GPU tests: Cryptography, Financial Analysis, Scientific Analysis, and memory bandwidth. Some of the results are a bit too complex for a graph, so a handful of tables are coming your way.
Hot damn. These are the kinds of gains AMD should promote out the wazoo. So many reviews posted today are likely to paint a rough picture of this card’s gaming performance, but on the other side of the fence, compute performance on Vega quite simply kicks ass. The results here can also give an impression of Vega 64’s future mining performance. Mining benchmarks you’ll see around the web in other launch reviews will show an edge over a top-end NVIDIA card, and I would not be surprised if AMD optimizes its driver down the road to improve mining performance further. Based on all of the compute performance seen here, Vega should technically be capable of better than the 30MH/s you’ll see reported today.
|Sandra 2017 – Financial Analysis (FP32)|
|NVIDIA TITAN Xp||14 G/s||2.5 M/s||6.7 M/s|
|NVIDIA Quadro P6000||11.6 G/s||2.3 M/s||6.5 M/s|
|NVIDIA GeForce GTX 1080 Ti||11.6 G/s||2.2 M/s||6 M/s|
|AMD Radeon RX Vega 64||9.4 G/s||3 M/s||4.4 M/s|
|NVIDIA Quadro P4000||6.5 G/s||1.1 M/s||2.9 M/s|
|AMD Radeon Pro WX 7100||5.2 G/s||1.3 M/s||1.9 M/s|
|NVIDIA Quadro P2000||3.8 G/s||656 k/s||1.8 M/s|
|AMD Radeon Pro WX 5100||3.4 G/s||478 k/s||672 k/s|
|AMD Radeon Pro WX 4100||2.2 G/s||531 k/s||773 k/s|
|AMD Radeon Pro WX 3100||2.5 G/s||321 k/s||467 k/s|
|Results in options-per-second. 1 GOPS = 1,000 MOPS; 1 MOPS = 1,000 kOPS.|
|Sandra 2017 – Financial Analysis (FP64)|
|AMD Radeon RX Vega 64||2.2 G/s||186 k/s||542 k/s|
|NVIDIA TITAN Xp||1.44 G/s||142 k/s||297 k/s|
|NVIDIA Quadro P6000||1.3 G/s||131 k/s||271 k/s|
|NVIDIA GeForce GTX 1080 Ti||1.3 G/s||134 k/s||272 k/s|
|NVIDIA Quadro P4000||622 M/s||63 k/s||129 k/s|
|AMD Radeon Pro WX 7100||958 M/s||81 k/s||239 k/s|
|NVIDIA Quadro P2000||360 M/s||36 k/s||75 k/s|
|AMD Radeon Pro WX 5100||406 M/s||49 k/s||97 k/s|
|AMD Radeon Pro WX 4100||395 M/s||35 k/s||98 k/s|
|AMD Radeon Pro WX 3100||219 M/s||18 k/s||55 k/s|
|Results in options-per-second. 1 GOPS = 1,000 MOPS; 1 MOPS = 1,000 kOPS.|
The RX Vega 64 continues to perform extremely well in compute tests, slotting in just behind the 1080 Ti in single-precision, and leading the pack in a very significant way in the double-precision test. That’s thanks to the fact that Vega runs double-precision at 1:16 the rate of single-precision, vs. 1:32 on the NVIDIA cards here.
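Those rate ratios make the double-precision gap easy to estimate from the FP32 peaks on page one. A back-of-the-envelope sketch (the 1080 Ti’s ~11.3 TFLOPS FP32 figure is my own assumption, not from this article, and Sandra’s measured results land below these theoretical peaks):

```python
# Estimate peak FP64 throughput from FP32 peak and the FP64:FP32 rate ratio.
# Vega runs FP64 at 1/16 of FP32; consumer Pascal runs it at 1/32.
def fp64_peak(fp32_tflops: float, ratio: float) -> float:
    return fp32_tflops * ratio

vega_64    = fp64_peak(12.66, 1 / 16)  # ~0.79 TFLOPS
gtx_1080ti = fp64_peak(11.3, 1 / 32)   # ~0.35 TFLOPS (11.3 TFLOPS FP32 assumed)
print(f"Vega 64 FP64 peak: {vega_64:.2f} TFLOPS")
print(f"1080 Ti FP64 peak: {gtx_1080ti:.2f} TFLOPS")
```

On paper, then, Vega 64 holds roughly a 2x double-precision advantage, which lines up with the Sandra FP64 tables above.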
|Sandra 2017 – Scientific Analysis (FP32)|
|GPU||GEMM||FFT||N-Body|
|NVIDIA Quadro P6000||6.4 TFLOPS||495 GFLOPS||5.9 TFLOPS|
|NVIDIA TITAN Xp||7 TFLOPS||258 GFLOPS||5.5 TFLOPS|
|AMD Radeon RX Vega 64||6 TFLOPS||344 GFLOPS||5.3 TFLOPS|
|NVIDIA GeForce GTX 1080 Ti||6 TFLOPS||217 GFLOPS||5.12 TFLOPS|
|NVIDIA Quadro P4000||3.1 TFLOPS||128 GFLOPS||2.7 TFLOPS|
|AMD Radeon Pro WX 7100||2.5 TFLOPS||210 GFLOPS||2.2 TFLOPS|
|NVIDIA Quadro P2000||1.8 TFLOPS||87 GFLOPS||1.7 TFLOPS|
|AMD Radeon Pro WX 5100||945 GFLOPS||138 GFLOPS||755 GFLOPS|
|AMD Radeon Pro WX 4100||1 TFLOPS||85 GFLOPS||917 GFLOPS|
|AMD Radeon Pro WX 3100||670 GFLOPS||70 GFLOPS||647 GFLOPS|
|GEMM = General Matrix Multiply; FFT = Fast Fourier Transform; N-Body = N-Body Simulation.|
|Sandra 2017 – Scientific Analysis (FP64)|
|GPU||GEMM||FFT||N-Body|
|AMD Radeon RX Vega 64||611 GFLOPS||164 GFLOPS||475 GFLOPS|
|NVIDIA TITAN Xp||352 GFLOPS||199 GFLOPS||277 GFLOPS|
|NVIDIA GeForce GTX 1080 Ti||332 GFLOPS||163 GFLOPS||267 GFLOPS|
|NVIDIA Quadro P6000||323 GFLOPS||128 GFLOPS||253 GFLOPS|
|NVIDIA Quadro P4000||156 GFLOPS||95 GFLOPS||127 GFLOPS|
|AMD Radeon Pro WX 7100||276 GFLOPS||80 GFLOPS||194 GFLOPS|
|NVIDIA Quadro P2000||90 GFLOPS||53 GFLOPS||84 GFLOPS|
|AMD Radeon Pro WX 5100||123 GFLOPS||56 GFLOPS||103 GFLOPS|
|AMD Radeon Pro WX 4100||109 GFLOPS||33 GFLOPS||84 GFLOPS|
|AMD Radeon Pro WX 3100||63 GFLOPS||33 GFLOPS||49 GFLOPS|
|GEMM = General Matrix Multiply; FFT = Fast Fourier Transform; N-Body = N-Body Simulation.|
With its beefy double-precision performance (as far as gaming cards go, at least), Vega 64 soars to the top of that respective chart, and manages to best the 1080 Ti in single-precision.
With its HBM2 memory in tow, the RX Vega 64 places right behind NVIDIA’s GTX 1080 Ti, by about 27GB/s. Its interface transfer, however, manages to best everything else in the lineup. How this correlates to real-world performance is hard to gauge, especially since it’s up to the rest of a GPU’s architecture to properly complement the bandwidth it’s given.
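For reference, Vega 64’s 484 GB/s figure falls straight out of its HBM2 configuration. A sketch of the arithmetic, assuming the commonly published specs of a 2048-bit bus at 945 MHz with a double data rate (those two figures aren’t from this article):

```python
def mem_bandwidth_gbs(bus_bits: int, clock_mhz: float, data_rate: int = 2) -> float:
    """Peak bandwidth in GB/s: bus width in bytes x effective transfer rate."""
    return bus_bits / 8 * clock_mhz * data_rate / 1e3

# Vega 64's HBM2: 2048-bit bus, 945 MHz, DDR
print(f"{mem_bandwidth_gbs(2048, 945):.0f} GB/s")  # ~484, matching the spec table
```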
Gaming is generally not a big focus for professional GPU lines, but the fact of the matter is, they can game. That especially applies to the top-tier cards on the market, as they all perform similarly to the top-tier gaming cards from the same vendor of the same generation. So what’s the caveat with gaming on workstation cards? A lack of game-specific optimizations.
While on the GeForce or Radeon (non-Pro) side, the companies constantly roll out updates that improve general performance in gaming or performance specific to one title, Quadro and Radeon Pro drivers don’t have the same granularity where gaming’s concerned.
Both the GeForce and Radeon cards in this lineup use their respective gaming drivers, so it stands to reason they’ll perform better than their workstation counterparts if specs are shared.
To get a quick gauge on the performance of our workstation GPU collection in gaming, we use Futuremark’s 3DMark and Unigine’s Superposition.
It’s not listed in these results, but RX Vega 64 is on par with the GTX 1080, with all GPU scores in 3DMark being extremely close between the two. Overall, the RX Vega 64 performs as expected here. AMD recently told me that it prefers to optimize its current drivers for DX12 and Vulkan over DX11, and these Futuremark results back that up.
Both of Futuremark’s tests put AMD’s RX Vega 64 fourth from the top, and Superposition does the very same thing. Of course, gauging the gaming performance of a gaming GPU is best done in a gaming review, and fortunately, one of those is en route. For quick and dirty results to preface the review, take a peek.
To test workstation GPUs for both their power consumption and temperature at load, I utilize a couple of different tools. On the hardware side, I use a trusty Kill-a-Watt power monitor which our GPU test machine plugs into directly. For software, I use LuxMark to stress the card for a wattage reading, and then start Unigine Superposition to stress the card in a gaming scenario to gauge the worst-case with temperatures (recorded with GPU-Z).
To test, the area around the chassis is checked with a temperature gun, with the average temp recorded. Once that’s established, the PC is turned on and left to sit idle for a few minutes. At this point, GPU-Z is run along with LuxMark. After LuxMark starts, I immediately choose the Hotel render, and then OpenCL GPU rendering. Peak power draw is monitored, and then Superposition is kicked off to push the card as hard as it can for temperature’s sake.
Unfortunately, the current version of GPU-Z doesn’t record the temperatures of RX Vega, and I couldn’t find any quick replacements. Still, I don’t think it will be hard to imagine that it would find itself on the top of the chart, based on what I’ve seen from this reference cooler.
The one graph I am able to provide backs up our assumption that RX Vega 64 is a market leader in power consumption – but not in a good way. I will note, however, that AMD provides different power profiles in the Radeon software that will help you drop overall wattage at a minor cost to performance. I did not have a chance to explore this too heavily in time for this article, but I did conduct this quick test:
Based on this one test, it would seem that using the power save profile would be wise – at least, if you believe shaving 77W from your total power draw is more important than +5% performance. AMD ships Vega with multiple power profiles, so you can spend some time eking as much performance out of your card as possible. I’ll do more testing on this as the week goes on, and report on my findings as soon as I can.
When I decided to defy all logic and make my debut look at AMD’s RX Vega one that treats it like a Radeon Pro card, I started to feel regret as I put more time into testing, because I just wasn’t seeing much worth reporting on. In fact, I almost decided to write a much more succinct article mostly looking at the areas where RX Vega shines.
I can honestly tell you that when I decided to suck it up and follow through with testing the entire workstation suite on RX Vega, I didn’t expect it to perform so well in so many different areas. From talking to site friends, all of whom have their looks at RX Vega gaming up today, it’s clear Vega 64 is going to slot in under the GTX 1080, matching it in some cases, but never besting it. I began to feel like a look at a non-workstation card in workstation scenarios, where the card would fall behind its competition, would be pointless – at least up until I compiled all of the results, and began to see impressive performance all over the place.
While it didn’t dominate the ProRender benchmark as much as I expected it to, the RX Vega 64 reigned supreme in LuxMark, matching the Quadro P6000 in the Hotel Lobby render, and storming past it in the Neumann and LuxBall renders. In video encoding, RX Vega 64 delivered performance close to on par with NVIDIA’s, and performance that far exceeded that of AMD’s WX Polaris line, including the WX 7100.
In crypto, the RX Vega 64 simply killed it, beating out every single GPU outside of the TITAN Xp in SHA2-256 Hashing (and even then, it was pretty damn close). In CATIA and SolidWorks, AMD’s latest top-end card managed to beat out NVIDIA’s GeForce GTX 1080 Ti. And, last but not least, it delivered market-leading performance in science and finance, while gaining a major advantage in double-precision performance (about 2x NVIDIA).
All in all, this is extremely impressive. If you’re a workstation user wanting a GPU that will give you good performance in both workstation and gaming workloads, the RX Vega 64 is, surprisingly enough, a pretty attractive choice…
…but that all said, there are some (maybe obvious) caveats that work against the RX Vega 64.
To reiterate what I said on the first page, the cooler design of the RX Vega 64 we were given doesn’t do its performance any favors. I couldn’t record temperatures due to lacking monitoring software, but given how loud the fan could get in gaming, it struck me as pretty obvious that some throttling had to have been occurring. In the power section above, I mentioned that AMD includes multiple power profiles with Vega, and in my opinion, one that isn’t the current default should be.
In rendering, total PC power draw hit 375W with RX Vega 64, and in gaming, that rose to 412W. At its default profile, the card delivers a 3DMark score about 2% better than the power saving profile dishes out, but at the cost of considerably higher power draw. +70W for +2% performance is ridiculous; multiply that +70W by every single Vega gamer out there who will simply run their GPU at its stock configuration. I think it’s great that AMD is giving people a choice, but I really think its default choice should be different.
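The arithmetic behind that complaint is straightforward. Using the whole-system gaming numbers quoted above (and since these are Kill-a-Watt readings for the entire PC, the per-GPU efficiency swing is actually larger than this sketch suggests):

```python
# Default vs. power-save profile, using the gaming figures quoted above:
# ~412 W total system draw at baseline perf; power save shaves ~70 W for ~2% perf.
default_watts, default_perf = 412, 1.00
save_watts, save_perf = 412 - 70, 0.98

eff_default = default_perf / default_watts  # performance per watt, baseline
eff_save = save_perf / save_watts
print(f"Perf-per-watt gain from power save: {eff_save / eff_default - 1:.1%}")
```

A roughly 18% system-level efficiency gain for a 2% performance hit is exactly why I’d argue the power-save profile deserves to be the default.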
I plan to delve more into how Vega handles its different power profiles later in the week. Following this review will be a full look at the gaming performance of the card, which will then be followed by a look at the “Best Playable” settings for our current arsenal of games at 4K and 3440×1440.
Ultimately, for gaming, RX Vega 64 sits near GTX 1080, but NVIDIA’s card comes out ahead overall. That’s really saying something considering the GTX 1080 came out 15 months ago. In compute, however, which has been the overall focus of this article, RX Vega 64 struck back, surpassing even the GTX 1080 Ti (and sometimes TITAN Xp) in select tests. That aspect of RX Vega is downright impressive. It seems very likely to me that sooner than later, mining performance will also see a boost on these cards, because the compute advantage is there, just waiting to be exploited.
I won’t have my final conclusions on the gaming aspect of RX Vega for another day at least, but you can read my initial opinions to satiate your appetite in the meantime. For compute workloads? If you use any one of the tools that RX Vega excels at running, you can definitely reap some nice rewards with this GPU. I’d just recommend holding out for vendor cards with improved coolers.
You can check out Amazon for prices, but be warned that the prices will be a bit silly to begin with, and it’s likely best to wait for AIB cards with better coolers to hit the market first. Even the Vega FE cards are a little pricey right now.