ASUS P6X58D Premium
by Rob Williams on May 3, 2010 in Motherboards

Intel may have launched its X58 chipset nearly a year-and-a-half ago, but board vendors continue to come out with new product as technologies improve. This past winter, ASUS released the P6X58D Premium, a high-end offering that boasts support for both SATA 3.0 and USB 3.0, and one that just begs to be pushed hard with overclocking.

Test System & Methodology

At Techgage, we strive to make sure our results are as accurate as possible. Our testing is rigorous and time-consuming, but we feel the effort is worth it. In an attempt to leave no question unanswered, this page contains not only our testbed specifications, but also a fully-detailed look at how we conduct our testing.

If there is a bit of information that we’ve omitted, or you wish to offer thoughts or suggest changes, please feel free to shoot us an e-mail or post in our forums.

Test System

The table below lists our machine’s hardware, which remains unchanged throughout all testing, with the exception of the motherboard. Each board used for comparison is also listed, along with the BIOS version used. In addition, each URL in this table can be clicked to view our review of that product; if a review doesn’t exist, the link leads to the product’s page on the manufacturer’s website.

Component: Model
Processor: Intel Core i7 Extreme 965 – Quad-Core, 3.2GHz, 1.25v
Motherboards:
  ASUS P6T Deluxe – X58-based, 0804 BIOS (11/04/08)
  ASUS P6X58D Premium – X58-based, 0808 BIOS (04/09/10)
  ASUS Rampage II Extreme – X58-based, 0705 BIOS (11/21/08)
  EVGA X58 SLI – X58-based, SZ21 BIOS (03/04/09)
  Gigabyte EX58-UD4P – X58-based, F6 BIOS (02/26/09)
  Gigabyte EX58-UD5 – X58-based, F4b BIOS (11/14/08)
  Intel DX58SO – X58-based, 2786 BIOS (11/12/08)
Memory: OCZ Gold 3x2GB – DDR3-1333 7-7-7-20-1T, 1.60v
Graphics:
Audio: On-Board Audio
Storage:
Power Supply:
Chassis:
Display:
Cooling:
Et cetera:

When preparing our testbeds for any type of performance testing, we follow these guidelines:

    General Guidelines

  • No power-saving options are enabled in the motherboard’s BIOS.
  • Internet is disabled.
  • No Virus Scanner or Firewall is installed.
  • The OS is kept clean; no scrap files are left in between runs.
  • Hard drives affected are defragged with Diskeeper 2008 prior to a fresh benchmarking run.
  • Machine has proper airflow and the room temperature is 80°F (27°C) or less.

    Windows Vista Optimizations

  • User Account Control (UAC) and screen saver are disabled.
  • Windows Defender, Firewall, Security Center, Search, Sidebar and Updates are disabled.

To aid our goal of accurate and repeatable results, we prevent certain Windows Vista services from starting at boot. These services have a tendency to start up in the background without notice, potentially skewing results slightly. Disabling “Windows Search”, for example, turns off the OS’ indexing, which can at times tax the hard drive and memory more than we’d like.
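As a sketch of what this looks like in practice, the snippet below builds the stock Windows `sc config` commands that stop a service from auto-starting. The service list here is illustrative, not Techgage’s exact set:

```python
# Hypothetical sketch: building "sc config" commands to keep background
# services (e.g. Windows Search) from starting at boot between runs.
# "start= disabled" (with the space after '=') is sc.exe's required syntax.
services = ["WSearch", "wuauserv", "WinDefend"]  # Search, Updates, Defender

def disable_commands(names):
    """Return one 'sc config' command string per service name."""
    return [f"sc config {name} start= disabled" for name in names]

for cmd in disable_commands(services):
    print(cmd)
```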

Application Benchmarks

When benchmarking a graphics card or processor, performance is expected to scale in a certain manner, but that’s not the case with motherboards. Since a motherboard tends to be only as fast as the hardware installed on it, we avoid redundancy by skipping an exhaustive collection of benchmarks. For the most part, two motherboards based on the same chipset should offer close to equal performance.

Our primary goal with motherboard-related benchmarking is to see if one motherboard is lacking in a certain area when compared to the rest. These discrepancies, if they exist, are usually caused by lackluster components on the board itself, which is why higher-end motherboards tend to see slightly better results than the more budget-oriented offerings.

To properly test the performance of a motherboard, we run a small collection of system-specific tools, such as SYSmark 2007, Sandra and HD Tune Pro. We then run real-world benchmarks using popular multi-media applications, such as Adobe Lightroom. To see how a board stacks up in the gaming arena, we benchmark using both Call of Duty: World at War and Half-Life 2: Episode Two.

We strongly feel that there is such a thing as too many benchmarks when it comes to a motherboard review, so we keep things light while still offering definitive performance data.

Game Benchmarks

In an attempt to offer “real-world” results, we do not utilize timedemos in any of our reviews. Each game in our test suite is benchmarked manually, with the minimum and average frames-per-second (FPS) captured with the help of FRAPS 2.9.6.
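As a simplified sketch of how those two numbers relate (FRAPS actually reports per-second minimums; this per-frame version is an assumption for illustration):

```python
# Sketch: deriving minimum and average FPS from per-frame render times (ms),
# the same raw data a frame-capture tool logs during a manual run.
def fps_stats(frame_times_ms):
    """Return (minimum FPS, time-weighted average FPS)."""
    min_fps = 1000.0 / max(frame_times_ms)  # the slowest frame sets the minimum
    avg_fps = len(frame_times_ms) * 1000.0 / sum(frame_times_ms)
    return min_fps, avg_fps

low, avg = fps_stats([16.7, 16.7, 33.3, 16.7])  # one slow frame drags the minimum down
```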

To deliver the best overall results, each title we use is exhaustively explored to find the best possible level in terms of intensiveness and replayability. Once a level is chosen, we play through it repeatedly to find the best possible route, and during official benchmarking we stick to that route as closely as possible. Since we are not robots and the game can throw in minor twists with each run, no two runs can be pixel-identical.

Each game and setting combination is tested twice, and if there is a discrepancy between the initial results, the testing is repeated until we see results we are confident with.

The two games we currently use for our motherboard reviews are listed below, with direct screenshots of the game’s setting screens and explanations of why we chose what we did.

Call of Duty: World at War

1680×1050
2560×1600


The Call of Duty series of war-shooters is without question one of the most gorgeous on the PC (and consoles), but what’s great is that the games are also highly optimized, so no one has to max out their machine’s specs in order to play. For that reason, the in-game options are maxed out in all regards.

Half-Life 2: Episode Two

1680×1050
2560×1600

It may have been four years ago that we were able to play the first installment of the Half-Life 2 series, but it’s held up well with its new releases and engine upgrades. This is one title that thrives on both a fast CPU and a fast GPU, and though it’s demanding at times, most any recent computer should be able to play the game with close to maxed-out detail settings, aside from the Anti-Aliasing.

In the case of very recent mid-range cards, the game will run fine all the way up to 2560×1600 with maxed-out detail, minus Anti-Aliasing. All of our tested resolutions use identical settings, with 4xAA and 8xAF.


  • http://Www.mylifewithgaming.wordpress.com/ Mitchell Hall

    I doubt I’ll get an answer because this is so old but does anyone know if this is still a capable motherboard today ? And also is it compatible with a gtx 980? Thanks in advance.

    • http://techgage.com/ Rob Williams

      This is still a fantastic board. It’s the CPU that’d matter most when considering a high-end GPU like the GTX 980, but considering the fact that X58 was the enthusiast platform, I doubt that’d be an issue. I’d personally pick up a 980 if I were still running X58.

      • http://Www.mylifewithgaming.wordpress.com/ Mitchell Hall

        Oh wow thanks a lot. I actually did buy the gtx 980 it came in today. What happened was I bought a pc that someone built a few years ago with the intention of upgrading it enough to get it running 4k.. I plan on buying a second gtx 980 in the coming weeks to run in SLI. I’m very much an amateur with all of this and so it’s really been overwhelming. I’m just glad that the card I bought is going to run ! I also bought a Kingston HyperX 240GB Solid State Drive but that hasn’t come in yet. I guess I should have really known for sure all the things I’m buying is going to work but I can always return them.

        • http://techgage.com/ Rob Williams

Considering that you’re looking to add a second 980 to the mix, I’d recommend Googling around to see if others are running that card in SLI on a PCIe 2.0 machine. If you can believe it, I don’t even have such a machine kicking around here anymore, so I unfortunately am unable to test.

          While 2.0 isn’t likely to be a big detriment for a single GPU, its bandwidth might be a bit limited for a second one. I am truly not sure.

          Actually, I’ll shoot off an email to NVIDIA and see if anyone there can tell me anything about it.

          • http://Www.mylifewithgaming.wordpress.com/ Mitchell Hall

            Wow that would be incredible, thanks. I saw when reading up a bit on it that it supported sli and crossfire so I just assumed it would be okay. From the positioning of it though it looks as though I would have to remove my WiFi card in order to make both gpus fit. I’m not 100% sure but just taking a look at the way it is set up at the moment it looks that way.

            I really appreciate you taking the time to check that out for me so thanks again !

          • http://techgage.com/ Rob Williams

            It’s not a matter of whether or not the cards will fit on the board; it has to do with the available bandwidth. PCIe 3.0 came out years ago and doubled the bandwidth of PCIe 2.0, and while low-end cards won’t be affected too much, high-end cards can be – especially in multi-GPU configurations. NVIDIA told me this:

            “Yep, not a good idea. Only consider going that route if he’s using a 1080p monitor with no plans to upgrade anytime soon.”

            Unfortunately, that’s what I expected. So while your CPU and other components might not be a bottleneck, the PCIe bus definitely is.
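For the curious, the bandwidth gap described above can be sanity-checked from the published per-lane signaling rates (a rough sketch; real-world throughput is lower once protocol overhead is counted):

```python
# Rough PCIe x16 bandwidth from spec numbers: transfer rate (GT/s) times
# encoding efficiency gives usable Gb/s per lane; 16 lanes, 8 bits per byte.
def x16_bandwidth_gb_per_s(gt_per_s, payload_bits, total_bits):
    per_lane_gbit = gt_per_s * payload_bits / total_bits
    return per_lane_gbit * 16 / 8

pcie2 = x16_bandwidth_gb_per_s(5.0, 8, 10)     # 8b/10b encoding  -> 8.0 GB/s
pcie3 = x16_bandwidth_gb_per_s(8.0, 128, 130)  # 128b/130b        -> ~15.75 GB/s
```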

          • http://Www.mylifewithgaming.wordpress.com/ Mitchell Hall

            Ahh I see that really sucks. Yeah I have a 40 inch 4k tv for a monitor. So what you’re saying is then, I will be fine running my single gtx 980 at the moment (which will hopefully allow at least 1080p/60fps) on this motherboard, but if I want to bring it up to 2 gtx 980s in sli to attempt 4k resolutions I’ll have to switch out the motherboard. Well I guess that will be a project for the near future. I’ll just install this for now. Sorry one other question.. the gtx 980 needs a 500w psu so does that mean if and when I decide to upgrade to 2 of them, will I need an 1000w power supply ?

          • http://Techgage.com/ Matthew Harris

            I hate to disagree with Rob here but you’re not going to need a new mobo. I’ve seen benchmarks run on PCI-e 3.0 8X (same bandwidth as 2.0 16X) running SLI with both 970’s and 980’s that showed no loss in FPS. You’re going to be fine.

          • http://techgage.com/ Rob Williams

            At what resolution? He’s looking to run 4K, which is far more bandwidth-heavy than 1440p or lower.

          • http://Www.mylifewithgaming.wordpress.com/ Mitchell Hall
          • http://techgage.com/ Rob Williams

            It’d be REALLY difficult for an SSD to simply not be compatible. Funny enough, I am using the exact same SSD. You’re fine with it.

          • http://Www.mylifewithgaming.wordpress.com/ Mitchell Hall

            Haha great. Sorry about all the questions, but this has been really helpful. I literally don’t know one person who knows about this stuff so I’ve kind of been on my own
            Thanks again.

          • http://techgage.com/ Rob Williams

The questions are no problem. Do you happen to have a registered copy of 3DMark? If you do, you could run the 4K test and tell me your score. Then I can run the same on our test platform here (which conveniently has a 4K monitor hooked up and a 980 in the machine).

          • http://Www.mylifewithgaming.wordpress.com/ Mitchell Hall

            I actually have no idea what that is haha but if it’s free I will gladly do it. I only have the single gtx 980 atm like I said and it just came in today so I’m planning on installing it tonight but yeah I would gladly do it if possible.

          • http://techgage.com/ Rob Williams

            There’s a free version but it doesn’t include the 4K test. Some GPUs just give a code for the app, so I just took my chances. It’s all good. I am not sure of a good benchmark to run on both rigs, because that one is specifically designed for 4K.

          • http://Www.mylifewithgaming.wordpress.com/ Mitchell Hall

            Oh maybe it will come with one then. Once i get time to get at it this evening I’ll check back let you know how it all went. Hoping this all goes smoothly haha.

          • http://techgage.com/ Rob Williams

            Nah, it’d come with the card. It’s no big deal, just would have been a simple way to test our machines against each other. Unigine @ 4K might be another good bet, but it’s not a benchmark “designed for” 4K. As Matt said, you could be fine getting a second card. I really wish I had an old rig to test that against. It’d make for an interesting article.

            I actually need to do some 980 SLI testing soon @ 4K, so maybe by the time you have everything together, we could compare results or something.

          • http://Www.mylifewithgaming.wordpress.com/ Mitchell Hall

            Yeah that’s what I meant, I haven’t even opened the box yet so it may be in there. I’ll let you know. That sounds great, anything I can do I will gladly do it. I love to learn about this stuff.

          • http://techgage.com/ Rob Williams

            Good to hear! It’s really a topic I should have covered long ago. I’m regretting it now.

          • http://Techgage.com/ Matthew Harris

            They tested it at 4K from what I remember and only saw a couple of percent difference between the two. Less than 2FPS on most games which is within the margin for error.

          • http://techgage.com/ Rob Williams

            A link would be helpful ;-)

          • http://Techgage.com/ Matthew Harris

            Rob, I watch so many youtube videos that my history will only go back 2 or 3 days. Finding a link will take me a while but I’ll give it a shot on Saturday when I’ve got some time to burn.

          • http://techgage.com/ Rob Williams

            Please don’t worry about it. I really can’t be bothered to watch a video for it :P

          • http://Techgage.com/ Matthew Harris
          • http://techgage.com/ Rob Williams

            A single timedemo isn’t a true gauge of things, especially when it’s a game that runs so well to begin with. I really wish I had a 2.0 rig kicking around so I could test this.

          • http://Techgage.com/ Matthew Harris

            That’s a quick link I dug up just to highlight it, but, the video I watched which tested nV hardware ran 4 or 5 games and the results were all in line with what he’s (she’s?) showing on that thread. They saw between 1-2 FPS difference between the two specs.

          • http://techgage.com/ Rob Williams

            That’s interesting, but considering the fact that NVIDIA told me that dual GTX 980s would be a bad idea with PCIe 2.0, I’m still a little skeptical. Most people don’t take benchmarking as seriously as I do, so even if I saw the video, I am not sure I’d believe it.

          • http://Techgage.com/ Matthew Harris

            Look for 3dMark results using 980’s in SLI on AMD AM3+ hardware. There isn’t a solitary PCIe 3.0 mobo out for AM3+ with the exception of the Sabertooth GEN3 which has simulated 3.0.

          • http://techgage.com/ Rob Williams

So… that’s interesting. I forgot that AMD’s boards didn’t have PCIe 3.0, so I guess I -can- do comparisons. I could do GTX 980 SLI testing between an AMD FX 9590 and an Intel 4790K.

            That’s if I can find the time. I am still trying to get out from under a massive pile of content.

          • http://Techgage.com/ Matthew Harris

            Yep. We’re still stuck in the dark ages. That is unless you want to run FM2+ which offers 3.0 but doesn’t offer more than 4 CPU cores and offers the added benefit of no GPU PhysX since the nV drivers will detect the APU as an AMD GPU and disable GPU acceleration.

          • http://techgage.com/ Rob Williams

            Aye, at this point the FX series is embarrassing. I helped a friend build such a rig a few months ago, and it was so difficult to find a motherboard that didn’t feel like it was years out-of-date. And I didn’t even realize at that time that it didn’t have PCIe 3.0.

            Such limitations might not matter in the grand scheme, but it does highlight just how far behind AMD is on the enthusiast stuff.

          • http://Www.mylifewithgaming.wordpress.com/ Mitchell Hall

            Really? Well that would be pretty great if it ends up working. I would have eventually upgraded my mobo anyway, but it would definitely be great to know I don’t have to in order to get the 4k resolutions I’m looking for. This is all very new to me. I’ve been a console gamer my whole life and still very much am but I’ve been excited to get into pc gaming for a while, and I figured if I was going to do it, I wanted to get the best visuals I possibly could. It’s also for quite a few titles that I’m really excited to play that are only on pc.

          • http://Techgage.com/ Matthew Harris

            I run AMD which doesn’t offer PCIe 3.0 for AM3+ so this topic has been of great interest to me.

          • http://techgage.com/ Rob Williams

            Look here for power requirements:

            http://techgage.com/article/asus-geforce-gtx-970-strix-edition-graphics-card-review/10/

            I hit 382W at full load using a single 980, which involves a PC with a six-core CPU overclocked to 4.5GHz. That means a 500W should suffice just fine. Add 200W for another 980, and that means a 700W should suffice. At that point I’d likely go with an 800W for the sake of just playing it safe, and giving everything some breathing room (and, it should be mentioned, the test bench only has a single SSD, not a bunch of HDDs and ODDs, which will tack on power requirements).
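The wattage math above can be sketched out; the 382 W measurement and ~200 W allowance for a second card come from the comment, while the 25% headroom factor is an assumption standing in for “playing it safe”:

```python
# Sketch of the PSU sizing arithmetic from the comment above.
single_980_load_w = 382   # measured full-system draw with one GTX 980 (quoted above)
second_card_w = 200       # rough allowance for an added GTX 980
headroom = 1.25           # ~25% margin, an assumed stand-in for "playing it safe"

sli_estimate_w = single_980_load_w + second_card_w   # 582 W estimated SLI load
recommended_w = sli_estimate_w * headroom            # ~728 W -> round up to an 800 W unit
```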