In last week’s review of Gigabyte’s P55A-UD4P, I raised a concern about using either S-ATA 3.0 or USB 3.0 devices and still achieving the best graphics performance possible. The problem is that because these devices share the PCI-E bus, and because P55 lacks an abundance of PCI-E lanes to begin with, the primary graphics port suffers degraded performance.
All you need to do is install an S-ATA 6Gbit/s drive or a USB 3.0 device, and the primary PCI-E slot is pushed down to 8x speed, which may affect gaming performance on some GPUs. I’m of the belief that the performance hit would be next to nothing for most GPUs out there, but I do believe it could be an issue with dual-GPU cards, which require a lot more bandwidth than single-GPU offerings.
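To put rough numbers on that, the sketch below tallies per-direction PCI-E 2.0 bandwidth at x16 versus x8 using the interface’s well-known figures (5 GT/s per lane with 8b/10b encoding, or about 500 MB/s per lane each way). These are theoretical ceilings, not benchmark results; how much of that headroom any one card actually needs is exactly what remains to be tested.

```python
# Rough, theoretical PCI-E 2.0 bandwidth figures; real-world GPU
# performance depends on how much of this a given card actually uses.

PCIE2_GTS_PER_LANE = 5.0          # 5 GT/s per lane, per direction
ENCODING_EFFICIENCY = 8 / 10      # PCI-E 2.0 uses 8b/10b encoding

def pcie2_bandwidth_mb_s(lanes: int) -> float:
    """Per-direction bandwidth in MB/s for a PCI-E 2.0 link."""
    gbits = PCIE2_GTS_PER_LANE * ENCODING_EFFICIENCY * lanes
    return gbits * 1000 / 8       # Gbit/s -> MB/s

for width in (16, 8):
    print(f"x{width}: {pcie2_bandwidth_mb_s(width):,.0f} MB/s per direction")

# x16: 8,000 MB/s per direction
# x8:  4,000 MB/s per direction -- still considerable, which is why
# single-GPU cards may shrug the drop off while dual-GPU cards may not.
```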
As it stands today, all of Gigabyte’s P55 offerings have this issue, but its X58 boards do not, because there is an ample supply of PCI-E lanes to begin with. I touched base with ASUS to see if its P55 boards suffered the same kind of PCI-E degradation, and the simple answer was, “no”. To get around any potential issue, ASUS implements what’s called a PLX chip, which takes PCI-E lanes from the PCH and creates a PCI-E 2.0 lane, which is then split in half, with half of the available bandwidth going to S-ATA 3.0 and the other half to USB 3.0.
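Taking that description at face value, the back-of-the-envelope arithmetic below works out what each half of a single PCI-E 2.0 lane amounts to, next to the theoretical ceilings of S-ATA 6Gbit/s and USB 3.0. This is only my reading of the explanation ASUS gave, not a confirmed block diagram of its boards.

```python
# Back-of-the-envelope split of one PCI-E 2.0 lane between the two
# controllers, per the explanation ASUS gave; actual routing may differ.

PCIE2_LANE_MB_S = 500                 # one PCI-E 2.0 lane, per direction

def interface_mb_s(gbits: float, efficiency: float) -> float:
    """Theoretical MB/s for a serial interface after line encoding."""
    return gbits * efficiency * 1000 / 8

sata3 = interface_mb_s(6.0, 8 / 10)   # S-ATA 6Gbit/s, 8b/10b encoding
usb3 = interface_mb_s(5.0, 8 / 10)    # USB 3.0, 8b/10b encoding
half_lane = PCIE2_LANE_MB_S / 2       # each device's share of the lane

print(f"S-ATA 6Gbit/s ceiling: {sata3:.0f} MB/s")   # ~600 MB/s
print(f"USB 3.0 ceiling:       {usb3:.0f} MB/s")    # ~500 MB/s
print(f"Half a PCI-E 2.0 lane: {half_lane:.0f} MB/s")  # 250 MB/s

# Each half falls short of its interface's theoretical ceiling, though
# few drives or USB devices today come close to those ceilings anyway.
```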
Adding a PLX chip has its upsides and downsides, but the only real downside I foresee is the added cost to the board. At this point in time, Gigabyte offers a few P55 boards that go as low as $130, while the least-expensive ASUS P55 board I could find retails for over $200. Those ASUS boards are of course far more feature-robust than Gigabyte’s $130 boards, but it is hard to look past the price difference when you are trying to put together a new build with both S-ATA 3.0 and USB 3.0 for as little money as possible.
One other thing I mentioned in last week’s review of Gigabyte’s board is that I’d like to test the effects of the degraded PCI-E slot, to see whether there is real reason for concern. After all, if all you are using is a modest GPU, it might not need the full bandwidth the slot ordinarily offers, and this may be why Gigabyte seems so nonchalant about the entire matter. I still believe dual-GPU cards could pose an issue, but that’s something I will test once I get my hands on what I need.
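When that testing happens, confirming the slot’s negotiated width is straightforward on a Linux test bed, since recent kernels expose it through sysfs. A hypothetical helper along those lines, assuming the GPU sits at the common 0000:01:00.0 address:

```python
# Hypothetical helper for a Linux test bed: read the negotiated PCI-E
# link width/speed from sysfs to confirm whether the slot dropped to x8.
# The device address (0000:01:00.0) is an assumption; find your GPU's
# actual address with `lspci | grep VGA` first.

from pathlib import Path

def link_status(pci_addr: str = "0000:01:00.0") -> dict:
    dev = Path("/sys/bus/pci/devices") / pci_addr
    return {
        name: (dev / name).read_text().strip()
        for name in ("current_link_width", "current_link_speed",
                     "max_link_width", "max_link_speed")
    }

if __name__ == "__main__":
    for key, value in link_status().items():
        print(f"{key}: {value}")
    # On an affected board, current_link_width should read 8 instead
    # of 16 once an S-ATA 6Gbit/s or USB 3.0 device is in use.
```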