Date: July 16, 2010
Author(s): Rob Williams
In the market for a dual-GPU-capable AMD motherboard, one that's also capable of achieving some huge overclocks? Gigabyte has you covered with its 890FXA-UD5. In addition to native SATA 3.0 support, it offers USB 3.0, four PCI-E x16 graphics slots, a near-perfect board design and good pricing.
In late April, AMD released its first-ever six-core processors, the Phenom II X6 1090T and 1055T. At the same time, it also launched its latest highest-end chipset, the 890FX. Compared to the 890GX, there is little difference, but for those looking to run dual GPUs and achieve the highest overclocks, the right choice is 890FX. Price-wise, it's hard to even notice the difference between the two, so if a board is priced right, looks good and offers the functionality you need, picking one up is a no-brainer.
To help us put the 1090T processor through its paces at its launch, AMD sent along MSI's 890FXA-GD70, a board we briefly talked about in that review. My original goal was to get more 890FX offerings in to do a little roundup, but as MSI's board failed on us, and we likewise failed to receive another, the roundup idea began to fall through. There's ASUS, of course, but I wanted a mainstream roundup, and that company's mainstream part, the M4A89TD Pro, was too busy flying off the shelves, or something to that effect, and again, no sample.
As this review implies, Gigabyte was the only company able to send us a perfectly operational mainstream 890FX board, and while I feel a bit silly for not having other 890FX boards to compare to, I'll work with what I have. For the sake of comparison, I also benchmarked Gigabyte's MA790FXT-UD5P, which happens to look almost identical to this one, and uses AMD's last-gen highest-end chipset, the 790FX. Because the boards are so similar, the benchmarks will be interesting (or not).
Like the rest of Gigabyte’s top-tier motherboards, the “A” in the model name implies that it features full USB 3.0 support, which includes the “Power Boost” feature which provides additional power to such devices, including those that simply need to be plugged in to charge. Gigabyte first introduced this feature last fall, and it’s apparent that it’s a useful one, since other companies have since adopted the same idea.
For your SATA 3.0 needs, the board features two ports which are powered by AMD's SB850 southbridge. The board also touts an "Auto Unlock" feature, which gives you the power to potentially unlock "locked" cores on your processor, whether it be a dual-core or a triple-core. Your mileage may vary here, and as we've been testing this board with the six-core 1090T, the feature served no purpose.
As a whole, the 890FXA-UD5 doesn't look much different from most of Gigabyte's other boards. That's not necessarily a bad thing, though, as at first glance, the layout looks good, and there are a sufficient number of fan connectors scattered about (five in total, including the one for the CPU cooler). If I have just one complaint, it's that the eight-pin motherboard power connector is hidden behind the "Ultra Durable" heatsink, and as it's right in line with the blue cover, removing the connector once it's plugged in is a little tricky.
Taking a look at the bottom right-hand corner of the board, we can see eight SATA ports, with the two white ones being the 6Gbit/s offerings, along with a fan connector and ATX chassis connectors. You might notice also that there are three internal USB connectors, whereas on many boards, there are just two.
I also need to point out the fact that the BIOS battery is in a very easy-to-access spot, and it’s appreciated. It’s only once in a blue moon when I find myself having to resort to yanking the battery out, but when it’s located in an area where components need to be first removed (like under the GPU), it’s frustrating.
The board features four DIMM slots as expected, capable of supporting up to 16GB of RAM. For those who are still holding onto a floppy drive, or IDE storage devices, you’re well taken care of here. Another one of the fan connectors is located here, right in between these connectors and the 24-pin motherboard connector.
As the 890FX chipset supports graphics cards in a dual x16 configuration, seeing a couple of those slots here is not surprising. But, it is surprising to see four of them. Perfect if you want to run even more than two GPUs, for something like Folding@home. Or, you can use the spares for other PCI-E devices, even if they don't fully fill out the x16 slots.
In addition to the four x16 slots, there are two x1 and also a legacy PCI connector, for your network adapter, audio card, or what-have-you.
To the left of the CPU socket (real perspective, north-east in the picture) is an 8+2 power phase design. Eight are for the CPU itself, while the other two are for the memory and memory controller. In this same image, you can see the slightly hidden 8-pin motherboard connector, and also the CPU fan connector which is readily accessible.
On the back I/O panel we can see that not much is missing. In fact, nothing is missing that I can see. Both FireWire standards are taken care of, and in addition to those, there are ten USB ports (two are 3.0), 2x LAN ports, the six-channel audio, and also a PS/2 port for either a mouse or keyboard.
In the way of accessories, there are four SATA cables, a floppy connector, an IDE connector, the back I/O plate, some stickers, and of course, the manual and driver DVD.
Overall, the board and package seem to fit the $190 price tag well, and I can't see what could be added to make the entire deal even sweeter. The board is well-designed, looks good, has a smart layout, and lots of flexibility. Let's next check out its BIOS to see if it matches up.
If you’ve used a Gigabyte BIOS in recent years, any Gigabyte BIOS today isn’t going to throw you for a loop. For the most part, their design and function have remained mostly the same, aside from back-end tweaks and added options for the most die-hard of overclockers. In that way, there really isn’t much to complain about here, as anything you’d possibly want to tweak is available, and likewise for overclocking, all of the purely asinine voltage levels are also here.
One thing I do wish could be changed, though, is the inability to key in more of the values with the keyboard. I'm not sure about the host clock control, since I hardly ever touch it, but for things like multipliers and voltage values, everything has to be selected either by hitting Enter and scrolling through a list, or by using the + and – keys. Both are far more inconvenient than simply typing in the value.
In addition, there's a small problem in the voltage sections. When a voltage level is raised into dangerous territory, its entry changes from white to another color, say red, and once it does, you can't easily tell which option is currently selected, because there's no indication of any sort. If you have more than one voltage set to a "dangerous" level, this can get tedious fast, since it requires hitting the + or – key just to find out which value is changing. In the grand scheme, this is a minor issue, but I still find it an odd design.
For the most part, the BIOS here is solid, with no major complaints from me.
At Techgage, we strive to make sure our results are as accurate as possible. Our testing is rigorous and time-consuming, but we feel the effort is worth it. In an attempt to leave no question unanswered, this page contains not only our testbed specifications, but also a fully-detailed look at how we conduct our testing.
If there is a bit of information that we’ve omitted, or you wish to offer thoughts or suggest changes, please feel free to shoot us an e-mail or post in our forums.
The table below lists the hardware for our current motherboard-testing machine, which remains unchanged throughout all testing, with the exception of the motherboard. Each motherboard used for the sake of comparison is also listed here, along with the BIOS version used. In addition, each one of the URLs in this table can be clicked to view the respective review of that product, or if a review doesn’t exist, you will be led to the product on the manufacturer’s website.
AMD AM3 Test System
|Processors||AMD Phenom II X6 1090T – Six-Core, 3.20GHz, Stock Voltage|
|Motherboards||Gigabyte 890FXA-UD5 – 890FX, F3 BIOS (June 6, 2010)|
|||Gigabyte MA790FXT-UD5P – 790FX, F8K BIOS (July 5, 2010)|
|Memory||Corsair DOMINATOR 4x2GB – DDR3-1600 8-8-8-24, 1.65v|
|Graphics||ASUS Radeon HD 5850 1GB (Catalyst 10.6)|
When preparing our testbeds for any type of performance testing, we follow these guidelines:
Because it gives a more realistic interpretation of motherboard/CPU performance, we leave all of the power-related options in the BIOS to their default selection.
Our Windows 7 Desktop for AMD Motherboard Testing (Wallpaper Credit)
To aid with the goal of keeping accurate and repeatable results, we prevent certain services in Windows 7 from starting up at boot. This is due to the fact that these services have the tendency to start up in the background without notice, potentially causing slightly inaccurate results. Disabling "Windows Search" turns off the OS' indexing, which can at times utilize the hard drive and memory more than we'd like.
For all intents and purposes, we don't benchmark motherboards to see which performs better than the other. Unlike processors, graphics cards, and many other PC components, motherboards are not purchased for their speed, because any board with a given chipset should perform like all the rest. Most often, a motherboard purchase comes down to features, design and overclocking ability, and none of these are reflected in performance benchmarking.
Rather, the goal we set out when benchmarking motherboards is to make sure that one doesn’t falter in some way compared to the rest, so even though we do run benchmarks typical of our other performance-related articles, we use them here more as a stress-test. That way, if a motherboard does have a fault in some regard, we’ll be able to spot it.
For the sake of accomplishing this, we use Adobe’s Lightroom 3.0, Autodesk’s 3ds Max 2010, Futuremark’s 3DMark and PCMark Vantage, HD Tune Pro 3.5, SANDRA 2010 SP2, SPECviewperf 11 and TMPGEnc Xpress. Stability of the motherboard is tested at stock and overclocked speeds with LinX 0.6.4, an effective LINPACK stress-tester.
Aside from 3DMark Vantage, we don’t perform any game-related tests, as our experience has proved to us that they are a waste of time. You simply are not going to notice a difference between gaming on one motherboard and gaming on another, but we do use 3DMark Vantage for its ease, and its ability to push the entire system hard.
Futuremark is no stranger to most any enthusiast out there, as the company’s benchmarks have been used to gauge our PC’s worth for many years. Although the company’s 3DMark Vantage (which we also use for testing) is arguably more popular than PCMark Vantage, the latter is a great tool to measure a system’s overall performance across many different scenarios.
Unlike SYSmark, PCMark is more of a synthetic benchmark, as very little is shown to the user during the run. However, each test tackles a specific and common scenario typical of many computer users – enthusiasts and regular users alike – such as photo manipulation, gaming, music conversion, productivity, et cetera.
The main problem right now with PCMark is its inability (at least for us) to produce an overall score when being run under Windows 7. Even when run in compatibility mode (which is required by 3DMark), the application will crash during the Memories test, despite that particular test executing fine when run as its own suite. So, no overall score is produced, but the seven individual scores are.
While SYSmark uses modest numbers for its scoring, ranging in the hundreds, Futuremark opts for much higher scores across its entire suite, with the lowest being TV and Movies, which ranges around the 6,000 mark. On the high-end, our Intel SSD is capable of pushing the test's HDD scenario well beyond 20,000.
For the most part, the performance differences seen here are minor, but the 890FXA-UD5 does come out ahead. If the higher HDD score is something to go by, the improved SATA controller might have something to do with the rest of the test results also being improved.
Autodesk’s 3ds Max is without question an industry standard when it comes to 3D modeling and animation, with DreamWorks, BioWare and Blizzard Entertainment being a few of its notable users. It’s a multi-threaded application that’s designed to be right at home on multi-core and multi-processor workstations or render farms, so it easily tasks even the biggest system we can currently throw at it.
For our test, we use a project found on the samples DVD that comes included with 3ds Max. It features a bathroom scene with numerous objects and makes heavy use of ray-tracing. We render this scene at 1080p, to mimic a render that one might do for a 3D movie.
Photo manipulation benchmarks are more relevant than ever, given the proliferation of high-end digital photography hardware. For this benchmark, we test the system’s handling of RAW photo data using Adobe Lightroom, an excellent RAW photo editor and organizer that’s easy to use and looks fantastic.
For our testing, we take 100 RAW files (in Nikon’s .NEF file format) which have a 10-megapixel resolution, and export them as JPEG files in 1000×669 resolution, similar to most of the photos we use here on the website. We also apply a light sharpening effect for glossy paper. Such a result could also be easily distributed online or saved as a low-resolution backup. This test involves not only scaling of the image itself, but encoding in a different image format. The test is timed indirectly using a stopwatch, and times are accurate to within +/- 0.25 seconds.
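Since the export is timed by hand, the reported figure is only as precise as the stopwatch. As a minimal sketch of that measurement model (the helper names here are hypothetical, not part of our actual tooling), a batch can be timed end to end and the result snapped to the nearest quarter-second:

```python
import time

def time_batch(items, work):
    """Time a batch job (e.g., exporting 100 RAW files) end to end."""
    start = time.perf_counter()
    for item in items:
        work(item)
    return time.perf_counter() - start

def quantize_to_stopwatch(seconds, step=0.25):
    """Round a measured duration to the nearest quarter-second,
    mimicking the +/- 0.25 s precision of a hand-operated stopwatch."""
    return round(seconds / step) * step

if __name__ == "__main__":
    # Stand-in workload; the real test exports 100 NEF files as JPEGs.
    elapsed = time_batch(range(100), lambda n: n * n)
    print(quantize_to_stopwatch(123.87))  # 123.75
```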
So far, both boards are neck-and-neck, again flip-flopping a bit. The largest gain was seen with our Lightroom test, with the 890FXA-UD5 clearing its run close to 4 seconds faster than the 790FXT.
When it comes to video transcoding, one of the best offerings on the market is TMPGEnc Xpress. Although a bit pricey, the software offers an incredible amount of flexibility and customization, not to mention superb format support. From the get-go, you can output to DivX, DVD, Video-CD, Super Video-CD, HDV, QuickTime, MPEG, and more. It even goes as far as to include support for Blu-ray video!
There are a few reasons why we choose to use TMPGEnc for our tests. The first relates to the reasons laid out above. The sheer ease of use and flexibility is appreciated. Beyond that, the application does us a huge favor by tracking the encoding time, so that we can actually look away while an encode is taking place and not be afraid that we’ll miss the final encoding time. Believe it or not, not all transcoding applications work like this.
For our test, we take a 3.99GB RAW (FRAPS) AVI video of Call of Duty: Modern Warfare 2 gameplay with stereo audio and transcode it to Windows Media format, 20Mbit/s, at the native resolution of 1080p.
You can’t get much closer than this, literally.
SPEC, short for Standard Performance Evaluation Corporation, is an organization comprised of market-leading vendors with the sole purpose of developing realistic benchmarks that can be used to gauge the performance of hardware of all sorts, from full-blown workstations to individual components, such as CPUs and GPUs, along with memory subsystems and so forth.
SPECviewperf is one of the organization's flagship benchmarks, and it focuses on OpenGL viewport performance. For those who might be unaware, a viewport is the main screen developers work in when designing new products or renders, and for fluid motion, the action must be smooth. The vast majority of viewports in industry use utilize OpenGL (because of its cross-platform nature), so performance results from this benchmark are important.
This isn’t so much a great motherboard benchmark as it is a gauge for overall stability. Since this is a rather intensive test, it helps us out in making sure that there are no faults to be seen.
As expected, both boards delivered nearly the same performance throughout all of the tests. Interestingly, though, the 790FXT inched just ahead of the 890FXA overall.
While application performance shouldn’t vary much between motherboards, one area where we can see greater differences is with synthetic benchmarks – at least with those that test both the storage and memory bandwidth/latency. Even still, if differences are seen, you are very unlikely to notice the difference in real-world usage, unless the performance hit is significant, which we’ve not found on any board we’ve tested in the past.
To test the storage I/O, we use a tool that we've relied on for a number of years, HD Tune. The developer released a "Pro" version not long ago, so that's what we use for all of our storage-related benchmarking. The drive being tested is a secondary drive, installed into the first available SATA port, and is not the drive with the OS installed. To avoid potential latency, the drive is tested only once Windows 7 has been idle for at least five minutes and CPU usage remains stable at <1%.
I admit I found these results a little bit interesting, because unlike our PCMark Vantage test, the older 790FXT board scored higher here with HD Tune. The differences are not large enough to become a problem, but it’s still interesting nonetheless.
Like Futuremark, SiSoftware is another company that needs no introduction. As far back as I can remember using Windows, I was using Sandra to check up on my machine, and to stress it. Over time, the company has added in numerous ways to benchmark your PC, and there’s pretty much nothing it can’t tackle. The company even recently added in GPGPU benchmarking, so it’s really on top of things.
We’re back to normality here, with very little difference overall between the two boards.
Although we generally shun automated gaming benchmarks, we do like to run at least one to see how our GPUs scale when used in a ‘timedemo’-type scenario. Futuremark’s 3DMark Vantage is without question the best such test on the market, and it’s a joy to use, and watch. The folks at Futuremark are experts in what they do, and they really know how to push that hardware of yours to its limit.
The company first started out as MadOnion and released a GPU-benchmarking tool called XLR8R, which was soon replaced with 3DMark 99. Since that time, we’ve seen seven different versions of the software, including two major updates (3DMark 99 Max, 3DMark 2001 SE). With each new release, the graphics get better, the capabilities get better and the sudden hit of ambition to get down and dirty with overclocking comes at you fast.
Similar to a real game, 3DMark Vantage offers many configuration options, although many (including us) prefer to stick to the profiles which include Performance, High and Extreme. Depending on which one you choose, the graphic options are tweaked accordingly, as well as the resolution. As you’d expect, the better the profile, the more intensive the test.
Performance is the stock mode that most use when benchmarking, but it only uses a resolution of 1280×1024, which isn’t representative of today’s gamers. Extreme is more appropriate, as it runs at 1920×1200 and does well to push any single or multi-GPU configuration currently on the market – and will do so for some time to come.
Like the vast majority of our benchmarks throughout this review, neither motherboard totally overcomes the other in terms of performance, and gaming isn’t much different.
Before discussing results, let’s take a minute to briefly discuss what I consider to be a worthwhile overclock. As I’ve mentioned in past content, I’m not as interested in finding the highest overclock possible as much as I am interested in finding the highest stable overclock. To me, if an overclock crashes the computer after a few minutes of running a stress-test, it has little value except for competition.
How we declare an overclock stable is simple… we stress it as hard as possible for a certain period of time, both with CPU-related tests and also GPU-related, to conclude on what we’ll be confident is 100% stability throughout all possible computing scenarios.
For the sake of CPU stress-testing, we use LinX. Compared to other popular CPU stress-testers, LinX's tests are far more grueling, and proof of that is seen in the fact that it manages to heat the CPU up to 20°C hotter than competing applications, like SP2004. Also, LinX is just as effective on AMD processors. Generally, if the CPU survives the first half-hour of this stress, there's a good chance that it's mostly stable, but I strive for a 12-hour stress as long as time permits.
If the CPU stress passes without error, then GPU stress-testing begins, to ensure a system-wide stable overclock. To test for this, 3DMark Vantage's Extreme test is used, at an increased resolution of 2560×1600, looped nine times. If both the CPU and GPU tests pass without issue, we can confidently declare the overclock stable.
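The two-stage protocol above boils down to a simple gate: the long CPU stress must pass cleanly before the looped GPU test even starts, and a single failure anywhere rejects the overclock. Here's a sketch of that logic only; in practice the stressors are LinX and 3DMark Vantage's Extreme test, not Python callables:

```python
def run_stability_suite(cpu_stress, gpu_stress, gpu_loops=9):
    """Declare an overclock stable only if a long CPU stress pass and a
    looped GPU test both finish without error.

    cpu_stress/gpu_stress are callables returning True on a clean pass;
    in the real procedure these stand in for LinX and 3DMark Vantage.
    """
    if not cpu_stress():
        return False, "CPU stress failed"
    for i in range(gpu_loops):
        if not gpu_stress():
            return False, f"GPU loop {i + 1} failed"
    return True, "stable"

if __name__ == "__main__":
    # Stand-in stressors for demonstration; both always pass here.
    stable, reason = run_stability_suite(lambda: True, lambda: True)
    print(stable, reason)  # True stable
```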
Because we received a faulty motherboard with our X6 1090T launch kit, I wasn’t able to overclock the processor as I had hoped. Since then, I hadn’t found another opportunity, but with this review, I was finally able to buckle down and see what our sample is made of. Believe it or not, it seems to be built with less awesome than retail chips!
In looking around the Web, it seems like 4.0GHz overclocks are rather straight-forward to achieve, but that wasn’t the case with our sample. At least, not where stable operation is concerned. LinX is admittedly hardcore, but I can’t feel entirely confident unless it passes that test, and in this case, 4.0GHz and higher just didn’t happen, no matter the voltage or tweaks I made.
Temperatures were not the issue, either, since our Corsair H50 did a great job in keeping all of the cores at well under 50°C. After a bunch of tweaking, I found that even 3.95GHz wasn’t going to happen, but when I whittled the frequency down just a little bit more, 3.92GHz became stable:
Achieving this wasn't cut-and-dried, though, as I did have to raise the CPU voltage a fair bit, from the stock 1.475v up to 1.600v. This isn't a "dangerous" spot (the entry turns from white to pink at 1.875v), but I wouldn't want to go too much higher. For this overclock, I didn't have to touch any of the other voltages, but the RAM was still at 1.65v, as that's required to have it operate at its stock clock of DDR3-1600 8-8-8.
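For context, an AMD core clock is simply the reference (host) clock multiplied by the CPU multiplier; the 1090T's stock 3.20GHz is 200MHz x 16. I haven't detailed the exact combination used to reach 3.92GHz, so treat the routes below as illustrative only, since either a raised reference clock or the Black Edition's unlocked multiplier can get there:

```python
def cpu_clock_mhz(ref_clock_mhz, multiplier):
    """An AMD core clock is the reference (host) clock times the multiplier."""
    return ref_clock_mhz * multiplier

if __name__ == "__main__":
    print(cpu_clock_mhz(200, 16))  # 3200 -> the 1090T's stock 3.20GHz
    print(cpu_clock_mhz(245, 16))  # 3920 -> one hypothetical route to 3.92GHz
    print(cpu_clock_mhz(196, 20))  # 3920 -> another, via the unlocked multiplier
```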
Because this was my first foray into overclocking this CPU, it’s hard for me to critique the success of OC’ing on this board. As we get more boards in, we’ll see if we can’t push that chip just a bit higher. But from what I’ve seen, and all the time I spent overclocking with the board, I have no complaints. And from experience of overclocking with Gigabyte’s other boards, I am fairly confident that our sample just won’t go any higher, rather than it being the board at fault.
It goes without saying that power efficiency is at the forefront of many consumers’ minds today, and for good reason. Whether you are trying to save money or the environment – or both – it’s good to know just how much effort certain vendors are putting into their products to help them excel in this area. Both AMD and Intel have worked hard to develop efficient chips, and that’s evident with each new launch. The CPUs are getting faster, and use less power, and hopefully things will stay that way.
To help see what kind of wattage a given configuration draws on average, we use a Kill-A-Watt that's plugged into a power bar, which is in turn plugged into one of the wall sockets, with the test system plugged directly into the meter. The monitor and other components are plugged into the other socket and are not connected to the Kill-A-Watt. For our system specifications, please refer to our methodology page.
To test, the computer is first booted up and left to sit at idle for five minutes, at which point the current wattage is recorded if stable. To test for full CPU load, LinX is run with 6144MB of memory usage for a total of eight to ten minutes. During that run, the highest point the wattage reaches on the meter is captured and becomes our "Max Load".
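Put another way, the two reported figures reduce to a stable idle average and the peak reading seen under load. A quick sketch of that reduction, using made-up Kill-A-Watt samples rather than measurements from this review:

```python
def summarize_power(idle_samples, load_samples):
    """Reduce wattage readings to the two numbers we report:
    a stable idle figure and the peak seen during the load run."""
    idle = round(sum(idle_samples) / len(idle_samples))  # average of stable idle readings
    max_load = max(load_samples)                         # highest point on the meter
    return idle, max_load

if __name__ == "__main__":
    # Hypothetical wattage readings, not measured values from this review.
    idle_w, load_w = summarize_power([112, 113, 112, 112], [236, 248, 251, 244])
    print(idle_w, load_w)  # 112 251
```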
The feature-sets on these two boards are mostly the same, but the 890FXA-UD5 ups the ante with SATA 6Gbit/s and USB 3.0, so the minor hike in power consumption seems warranted.
Although this was the first 890FX board I really had the opportunity to spend a lot of time benchmarking the heck out of, I come out of testing impressed, although that in itself isn't too surprising. As a whole, the 890FXA-UD5 reminds me a lot of the 790FXT-UD5P, primarily because the differences lie with the chipset and SATA/USB 3.0... there's not too much else different.
The 890FXA-UD5 offers a huge amount of functionality, from the four PCI-E x16 slots to the five fan ports scattered about. The addition of SATA 3.0 and USB 3.0 makes the entire deal that much sweeter, but as time goes on, those features are becoming less of a luxury and more of a common occurrence. The whole package here is good, though, with nary a major complaint from me.
Price-wise, this board falls in at around $180, which puts it in line with other mainstream 890FX boards of its feature-set. There are some 890FX boards that cost less, but some features are of course sacrificed, and really, it's up to you to decide whether they're worth sacrificing. If you don't particularly care for the dual-GPU x16 capability, for example, you could always step down to an 890GX board. Even Gigabyte's own 890GPA-UD3H looks quite good and full-featured, and retails for just $140. It does offer fewer features as a whole, though, but nothing important that I can see.
If in the market for an 890FX board, you won’t go wrong with this one. I still can’t speak too much on overclocking though, until I have some basis for comparison. I’m leaning towards the fact that our 1090T just isn’t that great of an overclocker, but until I benchmark it with another 890XX board, I won’t be able to say for certain.
Have a comment you wish to make on this article? Recommendations? Criticism? Feel free to head over to our related thread and put your words to our virtual paper. There is no requirement to register in order to respond to these threads, but it sure doesn't hurt!
Copyright © 2005-2019 Techgage Networks Inc. - All Rights Reserved.