October 17, 2011
Almost nobody should do what I am about to describe – that is, install and use more than one video card. Nobody really needs that much graphics performance. It's also technically complex and a little expensive. But sometimes you gotta say to hell with rationality and embrace the overkill.
Why? Battlefield 3, that's why.
I've been a fan of the series from the earliest days of Battlefield 1942, and I lost hundreds of hours to Battlefield 2 and Battlefield: Bad Company 2. I even wrote about the original Battlefield 2 demo on this very blog six years ago. So, yeah, I'm a superfan from way back. As much as I was anticipating Battlefield 3, I have to say the open beta convinced me it is everything I always wanted, and more. Glorious sandbox warfare on an enormous, next-generation destructible battlefield is a beautiful thing.
Since PC was the lead platform for Battlefield 3, it is the rare current game that isn't dumbed down to PS3 and Xbox 360 console levels; it is a truly next-generation engine designed to scale over the next few years of PC performance.
This also means it's going to be rough on current PCs; at a minimum, you'll need a fast dual core CPU, and a modern video card with 512 MB or more of video memory. It only goes up from there. Way up. Like most games, Battlefield 3 is far more limited by video card performance than CPU performance. This is normally the place where I'd trot out my standard advice urging you to buy one of the new hotness video cards released this holiday season. But unfortunately, due to difficulties with the 40nm to 28nm process transition for ATI and NVIDIA, there aren't any new hotness video cards this year.
So what's a poor performance addicted Battlefield superfan to do? Double down and add another video card for more performance, that's what. Both ATI and NVIDIA have offered mature multi-GPU support for a few years now, and they've mostly settled on a simple Alternate Frame Rendering (AFR) strategy where each video card alternates between frames to share the graphics rendering work.
The little arrow there is a bridge attachment that you place on both cards so they can synchronize their work. Yes, there is a bit of overhead, but it scales surprisingly well, producing not quite double the performance but often in the area of 1.8x or so. Certainly enough to make it worth your while. You can technically add up to four video cards in this manner, but as with multiple CPUs your best bang for the buck is adding that second one; the third, fourth, and beyond provide increasingly diminished returns.
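The scaling curve above can be sketched with a toy model. This is purely illustrative — the per-card sync overhead figure is an assumption chosen to reproduce the ~1.8x number from benchmarks, not anything measured:

```python
# Hypothetical sketch of AFR scaling: near-linear in the number of
# cards, minus a synchronization penalty that compounds with each
# extra card. The 10% overhead figure is an assumption, not measured.

def afr_fps(single_card_fps, num_cards, sync_overhead=0.10):
    """Estimate AFR frame rate for num_cards identical GPUs."""
    ideal = single_card_fps * num_cards
    return ideal * (1 - sync_overhead) ** (num_cards - 1)

for n in range(1, 5):
    print(n, "cards:", round(afr_fps(44, n), 1), "fps")
# 2 cards come out to 1.8x a single card; the 3rd and 4th
# add progressively less.
```

Under this model a second card takes 44 fps to about 79 fps (1.8x), while the fourth card adds barely half of what the second did — which matches the diminishing-returns pattern described above.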
The good news is that the market crash in Bitcoin GPU mining (if you don't know what this is, don't ask… please) means there is a glut of recent video cards up for sale on eBay right now. I have the same AMD Radeon HD 5870 that I've had since early 2010. I picked up another 5870 on eBay for a mere $170. This is a great video card, well ahead of its time when it was originally released, and even now only 10% slower than the fastest video card AMD makes. I simply dropped the second card in my system and installed the bridge connector.
You may recognize this computer as a further tweaked version of my last build (which is still awesome, by the way, and highly recommended). Anyway, for this to work, you'll need to establish a few things about your computer before rushing out and buying that second video card.
- A motherboard that has two video card capable PCI Express slots. Most aftermarket and enthusiast motherboards have this, but if you bought a system from say, Dell, it's less clear.
- A power supply with enough headroom to drive two video cards. Warning: modern gaming video cards are major power hogs -- they easily pull 100 to 200 watts under load. Each. Sometimes more than that! Be absolutely sure you have a quality power supply rated for a minimum of 600 watts. Each video card will have two power connectors, either 6 or 8 pin. Check that your power supply offers enough connectors, or that you have converters on hand.
- A case with sufficient airflow to dissipate the 400 to 800 watts of CPU and GPU heat that you'll be generating. Understand that this is serious amounts of heat while gaming, way more than even the highest of high end PCs would normally produce. Yes, it is possible to do this quietly (at least in the typical idle case), but it will take some engineering work.
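The second and third checklist items boil down to a power budget. Here's a rough headroom check; the wattages are ballpark figures taken from the ranges in the list above (100-200 W per card under load), not measurements from any specific build:

```python
# Rough PSU headroom check before adding a second video card.
# All wattages are illustrative assumptions drawn from typical
# ranges, not measured values for any particular system.

def psu_headroom(psu_watts, cpu_watts, gpu_watts, num_gpus, other=75):
    """Return spare capacity in watts after estimated full load.
    'other' covers motherboard, drives, and fans."""
    load = cpu_watts + gpu_watts * num_gpus + other
    return psu_watts - load

# e.g. a 750 W supply, a ~125 W overclocked CPU, two ~200 W cards:
spare = psu_headroom(750, 125, 200, 2)
print(spare, "W spare")  # tight but workable; aim for ~20% margin
```

If the spare figure comes out near zero (or negative), the supply is undersized for dual cards — hence the 600 W minimum recommendation above.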
Beyond that, I found there are some additional peculiarities of multi-GPU systems that you need to be aware of.
- Make sure that the two cards you use are not only of the exact same family (minor vendor differences are OK) but also have identical clock and memory speeds. It's not supposed to matter, but I found that it did and I had to flash one of my cards to set it to default speeds to match the other card.
- Do not attempt to overclock your system while getting the multiple GPUs up and running. In particular, be extremely careful not to mess with the bus speed as timings are critical when dealing with two GPUs on the PCI Express bus synchronizing their work. Trust me on this one.
- Do a clean video driver remove and install the very very latest video drivers after putting the second card in. I recommend Driver Sweeper to remove any traces of your old drivers before rebooting.
Don't say I didn't warn you about this stuff, because I said it would be technically complex in the first paragraph. But after a lot of teething pains, I'm happy to report that multiple GPUs really do work as advertised. I can crank up games to the absolute maximum settings on my 27" monitor and get nearly constant 60 frames per second. As you can see in the below example, we go from 44 fps to 77 fps in Battlefield: Bad Company 2.
Now, Battlefield 3 (beta) is so very bleeding edge that I can't quite get it to max settings even with two GPUs in play. But I can now run very high settings, much higher than I could with a single GPU.
To be honest, it's unlikely I will continue with multiple GPUs through 2012 when the next-generation video cards are released. With every new video card introduction, you're supposed to get about the same performance in the new card as you did with two previous generation video cards working together. So at best this is a sort of sneak preview, cheating time by pretending we have a next-generation video card today. There are obvious efficiencies involved in performing that parallelization on a single GPU die rather than through two distinct video cards sitting on the PCI Express bus.
There's also the issue of micro-stuttering. I personally haven't found that to be a big problem unless you're pushing the graphics settings beyond what even two cards can reliably deliver. But if the frame rate dips low enough, the overhead of synchronization between the cards can interfere with overall frame rate in a perceptible way.
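Micro-stuttering is easy to miss in benchmarks because average fps hides it. A quick sketch of the effect — the frame times below are invented for illustration, not captured from real hardware:

```python
# Illustrative sketch of micro-stutter: two AFR cards can report the
# same average frame rate as a single card while alternating short
# and long frame times, which the eye perceives as stutter.
# The frame times here are invented, not measured.

def avg_fps(frame_times_ms):
    """Average frames per second over a list of frame times (ms)."""
    return 1000 / (sum(frame_times_ms) / len(frame_times_ms))

smooth  = [20, 20, 20, 20]   # single card: steady 50 fps
stutter = [10, 30, 10, 30]   # dual card AFR: same 50 fps average,
                             # but every other frame arrives late

print(avg_fps(smooth), avg_fps(stutter))  # both report 50.0 fps
```

Both sequences benchmark at an identical 50 fps, but the second one delivers frames at uneven intervals — which is why micro-stutter only becomes obvious when the rate dips low enough for the gaps to be perceptible.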
A single fast video card is always going to be the simpler, easier, and cheaper route. But multiple video cards sure is nifty tech in its own right, and it wasn't too expensive to get started with at $170. In the meantime, I'm having a ball playing with it, and I am dying to test my configuration with the final release of Battlefield 3 on October 25th. Join me if you like!
Posted by Jeff Atwood
You still running this on Vista?
Never been a fan of running two power hungry GPUs in parallel. I love to game and spend 2-3 hours a day playing games but the excessive power usage just isn't worth it for me. I'll keep playing BF3 at 1080p 45fps that my current i5-2500k, GTX460 system does.
What 27" monitor do you have?
old, single ATI 5870 power results (4.4 GHz CPU)
Idle at Windows desktop: 128w (multiple monitors)
Prime95 full load: 255w
video stress (game): 210w
Prime95 + video: 332w
new, dual ATI 5870 results (4.0 GHz CPU)
Idle at Windows desktop: 120w (single monitor) 136w (multiple monitors)
Prime95 full load: 215w
video stress (game): ~400w
video stress (furmark): ~530w (!!)
Prime95 + furmark: ~620w (!!)
Not exactly apples to apples since I lowered the overclock of the i7-2600k CPU a bit to compensate. We have obscene amounts of CPU power anyway, that's not even remotely the bottleneck in any current or new game. But, almost exactly a doubling of power consumption when gaming, which I guess isn't unexpected. That furmark thing is a monster, though. Really scary. No game loads the system as much as that damn synthetic benchmark does.
Some typical gaming numbers. I played a bit of Bulletstorm at max settings, 2048x1152 and got:
- CPU temp max 59c
- GPU temp max 64c, fan duty 49%
That's without turning the fan knob up at all. To be fair, Battlefield 3 loads it maybe 2x as much as this, but there's still a TON of headroom with the fan knob turned up.
With prime95 + furmark -- which I consider to be utterly absurd but a good representative of the "you will never ever see it this bad in real life" absolute crazy possible maximum:
- CPU temp 76c
- GPU temp 84c, fan duty 86%
I don't like running this test for too long because it's ... ridiculous, but it does trend toward stabilization at about those numbers. Note that the video card fans don't even get to 100% which indicates the cards are not totally stressed, temperature wise, even in the dumb furmark case.
Out of curiosity, does your case fit the Corsair H100 with push/pull fans without mods? I've been considering getting the 600T
For all your obsessive chasing of the latest and greatest, I'm surprised to see you still using internal drive bays.
Swappable hard-drives makes loads of things a whole lot easier.
Some problems creep up when using a mini-ITX build: you can have only one card. But thanks to some good cases and video card manufacturers, the GTX 590 can come to the rescue.
Another BF fan since 1942 here! :)
I just got a GTX 570 for BF3, and the beta ran great for me. At 1920x1200 with 4xAA and High settings I got around 45 FPS. When the game is released I'll tweak the settings and try to get a stable 60 FPS without sacrificing graphics; often it's the most subtle effects that are the most demanding.
In multiplayer games like BF3, though, you might sometimes want to run lower settings even if your machine could handle higher, because all those effects can obscure your view and make it harder to spot enemies. In BF: Vietnam, there was a lot of high grass to hide in, but if you turned the settings down the grass would disappear, making hiding enemies visible.
@jdege How in the name of god do you compare swappable hard drive bays to the latest and greatest? Why would you even need swappable hard drive bays? Nowadays, hard drives are so absurdly large (with the notable exception of SSDs) that you could fit any OS you could possibly hope to use on a single hard drive.
"Why would you even need swappable hard drive bays?"
Backups. Multiple OSes. Experimentation.
It's a lot easier to install Windows and Linux on separate drives, and to swap the drives, than to keep them working together in a dual boot.
Right now I'm in the process of trying out Ubuntu 11.10. So I've made a copy of my Ubuntu 10.04 disk, booted from it, and am currently running Ubuntu's upgrade process. If something goes wrong, it's easy enough to swap back to the original.
Like you said, HDs are cheap, these days. Enough so that having a bunch of them is quite reasonable. And being able to swap them in and out without having to crack the case is a major convenience.
I'm surprised this post came out specifically for BF3. At 1080p I ran the game on Ultra and had no problem with 60FPS even on Caspian with a ton of crap going on.
My setup is (all timings are stock, no o.c.):
Resolution is set at 1920x1080.
I think for most people SLI/Xfire for BF3 alone would not be needed. This should only be necessary if you're running a crazy resolution...say 2560 x 1600. If you look at the latest Steam survey, less than 1% have better than 1900 resolution.
People were doing multiple video cards a long time ago with 3DFX cards. Back then, SLI stood for Scan Line Interleave. Instead of interleaving frames, they would interleave individual lines of the same frame. Amazing how little things change.
It's a lot easier to install Windows and Linux on separate drives, and to swap the drives, than to keep them working together in a dual boot.
It isn't hard at all. Even if you re-install Windows, it takes all of two minutes to boot from the Linux install disc and put GRUB back in the MBR. Swapping drives is unnecessary mechanical wear-and-tear. It's like stopping to physically replace the gearbox in your car because it's too hard to push in the clutch and move the stick to switch gears.
"It isn't hard at all."
If it's just dual-booting a Windows and a Linux installation, perhaps. But how many versions of Windows do you have? Of Linux?
When you upgrade from Windows 7 to Windows 8, will you do it on your live system, or on a copy? If you do it on your live system, how long will it take to restore from backup, after something goes wrong?
For that matter, how long has been since you restored to a bare drive from your backups? If you've never done so, how do you know your backups are actually working?
"Swapping drives is unnecessary mechanical wear-and-tear."
Drives are cheap.
I'm really surprised that you are supporting Battlefield 3 after they announced that it would not be available on steam. This sounds exactly like the type of thing you would be against. If you haven't heard of this issue, I highly recommend you do some research.
The basic facts are:
Battlefield 3 won't be available on steam. In fact, EA has been pulling a number of games off steam.
EA is pushing the use of their own download manager. This download manager is terrible: it doesn't handle any of the DLC, patching, keys, or installation. The user still has to do that on their own.
EA has a terrible track record with DRM. Remember SecuROM? Thousands of users were unable to play games they legally purchased because of faulty DRM schemes. This has happened more than once on EA titles.
EA has stated that they will remove your ability to re-download games you have purchased 1 year after the purchase date (unless you pay extra for "re-download insurance").
EA has stated that they will delete your account (and your license to all content purchased under that account) after 2 years of inactivity.
This is a giant step backwards for the gaming community.
>I lost hundreds of hours to Battlefield 2 and Battlefield: Bad Company 2.
Forget the tech, I want to know how you find time to do games and still get so much done, dang.
I just bought a Radeon 6870 and that low 37.3 benchmark score makes me think I should take it back and get something else.
Any advice from the gallery? I need to stay under $200 total.
"I just bought a Radeon 6870 and that low 37.3 benchmark score makes me think I should take it back and get something else."
Look at the benchmark. 2560x1600 with max settings and 4xAA. If you're going to run that resolution with max settings then yes you will want a new card. If you're running 1920x1080 or 1600x900 you'll probably be ok. Also, some of the settings are redundant at higher resolutions. The higher the resolution, the less AA you need. The difference in 8x AA and no AA at 2560x1600 is much less noticeable than the same comparison at 1440x900.
"I just bought a Radeon 6870 and that low 37.3 benchmark score makes me think I should take it back and get something else. "
The 6870 is no slouch. If you look at the chart this is at 2560x1600 resolution with max settings. I doubt if you are on a budget you have a monitor at that resolution.
Also the chart is mainly SLI/CF configurations. The GTX 580 is the fastest single card for this particular game and still only manages 57 fps.
If you feel you must get a better card I would look at a HD6950 these have the same hardware as a HD6970. A firmware modification can swap it for you and unlock the extra processing free of charge. Nice little trick.
I know the third card doesn't do much for the frames, but it does something else. It rids you of the horrible micro-stutter.
Hold on, the lowest possible config on PC requires an 8800GT? In other words, if I haven't upgraded within the past few years, I won't actually be able to play the game? Unlike my console brethren, who'll be able to run it just fine on ancient hardware?
This is why anti-consolization people drive me nuts. I don't want to spend hundreds of bucks on upgrades, so why should I have a lesser experience on PC than on consoles, even though I have substantially better hardware?
Thanks for the feedback. I'll stick with the 6870 for a year or so then move to the next gen.
I am sorry, but I find this whole concept ridiculous.
First of all, I've been a gamer for the last 22 years of my life, starting with an ancient Amstrad 6128, and a software developer for the last 15 years. I could always find time to play games; sometimes I would even lock myself away for 1-2 days in order to complete a game so that I could get back to my work (I co-own a media & software company). I am not really into multiplayer, since I prefer a solid story and gameplay over a clan. Even so, I like being able to play a game with maximum details at my monitor's resolution, and as such I usually buy at least decent video cards (such as the Radeon X800, the nVidia 8800GTX and the Radeon HD 6850 a few months ago).
What I cannot fathom is the need to increase my electricity bill and my computer's noise to ridiculous extents by buying two power hungry video cards just to have slightly better visuals in a single game. I often wander around the net and see people that will always attempt to install and overclock the very best in their computers, usually in overclocker forums — users that in the vast majority of cases have absolutely no need for such watt-hour burners. The only "extreme" need I had was a third monitor, but that had a real and valuable ROI.
The main problem with the dual video card approach is that it allows the game companies to stay inefficient. I could buy an HD 6970, which is an immense powerhouse of a card (similar to what the 580 is), and still it wouldn't be enough according to Battlefield's developers! F*** that noise, I'd rather you sit down and make your engine more efficient, like Rage, Burnout: Paradise City and Crysis (2) did, long before Battlefield.
That's only the half of the story on the matter. The second issue I have is with the gamers who buy into the super hardware trend.
Honestly, do you really care if you are "only" able to achieve 2x or even 4x antialiasing on your computer, instead of an edge detect 8x? Will it make such a difference if the shadowmap used is of 8192^2 resolution instead of 16384^2? Will you really notice x64 tessellation?
Past a certain point in cost we have diminishing returns (to a ridiculous extent). Nor do you have to go very far in order to achieve 80% of the maximum graphical experience. To that extent, you'll probably never even notice the difference of the remaining 20% of a game's graphic potential in a game like Battlefield. I would go as far as to say that this is a con we willingly submit to, one not based on any realistic need, even if you see yourself as a gamer.
I had a few more things to say about the whole commando mentality some gamers develop (while lucky themselves for not having to join the army), but I digress. Cheers.
Wow, Your cabinet is very well laid out. No clutter of wires inside and also you have managed to avoid extra dirt that gets accumulated inside a cabinet overtime.
What is that material (grey coloured — is it foam?) that you have put in the blank spaces of your cabinet? I think that is what keeps dirt from accumulating inside. How did you manage the cables inside your cabinet?
Please throw some light on that too.
Note that the plain single 6970 still puts out over 40 fps on that scale.
Faster refresh than television, and this is on a 27" monster monitor (2560x1600 means a 27" ala Dell or Apple) on maximum quality.
As you sort-of state, this is all a complete waste of time.
But then, people who can buy a $900-1000 monitor for gaming can afford two high end video cards for marginal improvement, I guess...
(I have a 6950 at home, and I cannot get it to be "slow" in maximum quality on any game I've tried on a 1080p monitor.)
> What I cannot fathom is the need to increase my electricity bill and my computer's noise to ridiculous extents by buying two power hungry video cards just to have slightly better visuals in a single game.
BF3 is a truly next-generation engine, and real-time visuals like that simply require more horsepower than older engines.
Video quality is a bit sketchy, but this tech talk covers a lot of it:
It's amazing stuff.
(I agree that 4x AA isn't always *necessary*, but higher resolutions -- ideally the native resolution of your LCD panel -- does look significantly better.)
Jeff, no argument on the 30 vs 60fps; the numbers don't even come close to conveying how much smoother 60fps feels.
My argument is on the need to max out the graphics performance for the actual returns. You are paying $60 for the game plus $(triple digit number here) for the VGA plus the increase in the electricity cost. Is a game really worth all this?
True, you may not want to wait 2 years in order to get better hardware and be able to play Battlefield 3 smoothly when everyone will have jumped ship on Battlefield 4 or Call of Duty: Space warfare or whatever, but in those two years your two VGAs will become obsolete and will have to be replaced. Let's have a closer look:
I paid a rather measly €150 (about $200) for my 6850. Your 2x 5870s cost about $620 (about €450) according to the prices you've posted. If by saving $300 I have to stay content with Medium graphics, then by all means, so be it. And should we be talking about a single player experience that offers replayability, I could revisit the "powerhouse" with my single Radeon HD8870 or my nVidia 770 that will probably use less electricity too; I doubt that any game can become ugly in such an amount of time.
It's the same story with the 24GB of RAM posted a while ago. Sure it's nice and cheap, but you'll have to throw the DIMMs away when Sandy Bridge's successor arrives. Unless you have a real need for that much memory (by real, I mean something that makes you money and absolutely cannot be done efficiently with less memory), you've effectively wasted your money.
PS. To be honest, I've experienced a difference between 2x and 4x AA - it's actually noticeable. Still, even 2x looks better than no AA so I'll try to at least achieve that.
Regarding the native resolution, if I have to reduce my resolution due to performance reasons, I usually exit the full screen mode and play on windowed. My main display is a 27" one, so it looks big enough even with a 1440x900 game window resolution, plus Alt-tabbing works much more robustly :)
Jeff, have you considered aftermarket fans/heatsinks for your GPUs? I can't tell if they have them already or not; your image looks like it has heat pipes sticking out, but I'm not sure if those were part of the standard 5870.
I have a single 6950 with the BIOS flashed to make it a 6970, but it was just too noisy with the small fans. I added a Thermalright Shaman heatsink and fan (14 cm fan) and now it's virtually silent, especially as I reduce the PWM fan speed, and it still runs cooler. Admittedly this takes a third available PCI slot, but the difference is incredible compared to stock. Not sure if you could fit two cards in, though, without blocking all your PCI/PCI Express slots.
Think we need better motherboard and case designs for GPUs; to be honest, we shouldn't have to block expansion ports so often.
Tried gaming in stereo 3D? It's awesome!
> I'd rather you sit down and make your engine more efficient, like Rage, Burnout: Paradise City and Crysis (2) did, long before Battlefield.
Rage is your example of an optimized game? There's lots of reported issues even on consoles, not to mention on PC. It's basically unplayable on my HD6870+E8400 system that can handle Crysis 2 at ultra high settings, and Crysis isn't a good example of an optimized game either. Battlefield 3 beta performed better for me than Crysis 2 did.
> I just bought a Radeon 6870 and that low 37.3 benchmark score makes me think I should take it back and get something else.
This was exactly my situation and train of thought, until I realized that the benchmark was for a resolution larger than 1080p and used 4x AA.
"Rage is your example of an optimized game? There's lots of reported issues even on consoles, not to mention on PC."
Less so patched and with the latest ATI drivers.
I spoke about optimization, not QA. True, Rage has serious bugs with AMD VGAs, which is inexcusable, but on the other hand, after installing the update and their special driver I got excellent performance with the textures locked at 8192. The game didn't look as great as they made it out to be, but it was smooth on what its developers perceived as high quality settings.
Crysis 2 had a perfect framerate at high settings; it only got really bad later in the game when you encounter a rainy stage. Plus it was a "meh" of a game too.
In any case, I got hold of BF3 and tried it out. It behaved perfectly at Ultra settings with AA disabled (30-60fps steadily) and while it wasn't as smooth as in Low settings, now more than ever I can't justify the cost of a second video card. In other words, this means that DICE actually did a good job with its engine, but employed lots of scare tactics (perhaps to force people to buy expensive hardware). Keep in mind that I played the game in a 1920x1200 resolution.
PS. The graphics of Rage are a joke when compared to Frostbite 2's. It's difficult for me to believe that a genius like Carmack would produce something that lacks many modern features (such as dynamic lighting), especially since these were implemented in older versions of his engine.
I bet that foam keeps your CPU and other components nice, snug and warm.
Also, Christopher Elwell (christopherelwell.com) is an excellent web developer.
haha, you can delete that last post. I tried to post in your name, but apparently editing the cookie wasn't enough. Does say "You are currently signed in as Jeff Atwood." though.