GTX 480 SLI PCIe Bandwidth Perf. - x16/x16 vs. x4/x4
Previously we tested real-world gaming with x16/x16, x16/x8, and x8/x8 dual video card configurations and compared the results. Some of our readers were very surprised by the data. This time we have scaled down to x4/x4 to see if that negatively impacts performance at all in a single-display configuration.
If you haven’t been keeping up with our recent round of PCIe Bandwidth performance testing, you’ve missed some great testing and tremendously interesting results. Let’s recap a bit, and make sure that you know what we've tested so you can look back and make comparisons.
PCIe x16/x16 versus x16/x8 - In this evaluation we tested several different graphics cards in SLI and CFX configurations to see what the real-world differences were between x16/x16 and having the secondary card operate in x8 mode. Our discovery was that there was no discernible difference in gameplay performance, and no difference in gameplay experience. Having your secondary graphics card in an electrical x8 slot, or operating in x8 PCIe 2.0 mode, changed nothing. This was good news for people with newer systems who want to move video cards further apart for cooling reasons, or who are forced to run in this mode because other devices take up PCIe lanes or due to motherboard limitations.
PCIe x16/x16 versus x8/x8 - Next we took the fastest graphics card combination we have, GeForce GTX 480 SLI, and compared x16/x16 versus x8/x8 PCIe 2.0 modes. Our results revealed that x8/x8 mode lowered performance only with a triple-display setup at 5760x1200. There were definite signs of lower framerates on the graphs, though in-game the difference was hardly noticeable. We also found that at 2560x1600 x8/x8 made absolutely no difference at all, so if you were gaming on a single display there was no harm in running at x8/x8.
In light of this testing, a reader wanted to know whether x4/x4 (equivalent to x8/x8 PCIe 1.X) hurts gameplay performance at all. Therefore, in this quick look we are downgrading both video cards to PCIe x4 2.0 mode, but we are only going to run at 2560x1600. We already know x8/x8 affects performance at 5760x1200, so it stands to reason that x4/x4 would hurt it even more. However, x8/x8 did not affect performance at 2560x1600, so we want to see whether x4/x4 does. And quite frankly, we don't see too many HardOCP readers running Eyefinity or NV Surround with motherboards quite that old. So overall, we don't see much reason for the testing, but we just wanted to know too.
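For reference, the "equivalent" claim comes straight from the per-lane numbers: PCIe 1.x delivers roughly 250 MB/s per lane per direction, and PCIe 2.0 doubles that to roughly 500 MB/s (usable throughput after 8b/10b encoding overhead). A minimal sketch of the arithmetic (the function name is ours, just for illustration):

```python
# Approximate usable PCIe bandwidth per direction, after 8b/10b encoding.
# Per-lane throughput in MB/s: PCIe 1.x ~250, PCIe 2.0 ~500.
PER_LANE_MBPS = {1: 250, 2: 500}

def link_bandwidth(gen, lanes):
    """Return approximate one-direction bandwidth in MB/s for a PCIe link."""
    return PER_LANE_MBPS[gen] * lanes

print(link_bandwidth(2, 4))   # x4 PCIe 2.0 -> 2000 MB/s
print(link_bandwidth(1, 8))   # x8 PCIe 1.x -> 2000 MB/s, hence "equivalent"
print(link_bandwidth(2, 16))  # x16 PCIe 2.0 -> 8000 MB/s, 4x the tested link
```

So the x4/x4 configuration tested here gives each card a quarter of the per-card bandwidth of x16/x16, and the same bandwidth an x8/x8 PCIe 1.x board would provide.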
The system setup is as follows: MSI X58 Eclipse (two x16 PCIe slots native), Intel Core i7 920 @ 3.6GHz, 6GB DDR3, Dell 3007WFP, Win7 x64. We are using NVIDIA drivers 258.96 WHQL.
We are comparing GeForce GTX 480 SLI x16/x16 to x4/x4 in three games: Aliens vs. Predator, Battlefield: Bad Company 2, and Metro 2033. We are taping off pins to force x4 mode on both video cards.
GTX 480 SLI - 2560x1600
Aliens vs. Predator
In Aliens vs. Predator at 2560x1600 with 4X AA we do see some performance differences from downgrading to x4/x4 PCIe. The average framerate is lower, but not by much; what pops out more is the minimum framerate and the number of times performance dropped into the 32-45 FPS range. It seems x4/x4 produced more performance drops than x16/x16. The game was still more than playable even with those drops, but it is at least interesting that they show up at x4/x4.
Battlefield: Bad Company 2
In Battlefield: Bad Company 2 at 2560x1600 with 4X AA we were honestly shocked that there was no difference at all. Performance is right in line with x16/x16. There was one area, where a mortar is basically blowing up right in front of you, kicking up a lot of dust and debris (particles), and x4/x4 was actually faster there, but that comes down to the dynamic gameplay of the game. That explosion doesn't always happen the exact same way; sometimes it causes more of a performance hit depending on precisely when the character runs through the smoke and debris. So in summary, performance is exactly the same between x16/x16 and x4/x4 in this game.
Metro 2033
In Metro 2033 the pattern more closely follows the behavior of BC2 above: we simply experienced no meaningful performance differences. The minimum framerate is 3 FPS lower, and there were a few more dips below 30 FPS than at x16/x16, but in the game we honestly didn't notice this. The gameplay experience was the same, and the game felt the same. The average framerates are really close between the two.
The Bottom Line
The results are actually a bit shocking to us, to be honest. We weren't so surprised that in the previous evaluation x8/x8 caused no differences at 2560x1600 but did at 5760x1200. However, we thought that surely at x4/x4 PCIe 2.0 there would be some kind of bottleneck at 2560x1600, but the results have proven otherwise. Even with all the data that GTX 480 SLI is pushing across the PCIe bus, x4/x4 is NOT a bottleneck in a single-display setup at 2560x1600 with AA enabled. The only game to show us any difference was AvP, and even there it did not affect the gameplay experience. Therefore, if you are on an aging PCIe 1.X system at x8/x8 (equivalent to PCIe 2.0 x4/x4) driving a single display, fear not: you are not holding back the performance of GTX 480 SLI, and we suspect the same holds for any other CrossFireX or SLI configuration.