dogen wrote:
Mostly good advice for someone wanting to buy a dedicated gaming rig. But your advice on integrated graphics is a bit out of date. I've had no problems playing recentish games (Portal, Bioshock Infinite, Skyrim, etc) on Intel 4000/5000 chipsets (the former in an Intel NUC -- very nice little back-of-the-monitor box; the latter, a Macbook Air) -- not at the highest quality, or with AA and/or aniso turned way up, but very playable nevertheless. And, I've enjoyed seeing a measurable drop in my monthly electricity usage since I moved my gaming rig back to my visualization lab at work (where it belongs). I'd say the drop in energy usage, combined with my home office no longer being unpleasantly hot, is well worth the small reduction in GFX quality.
Some modern single-player games that use draw-distance tricks can be dumbed down and played like it's 2005, especially titles that port to or from consoles and their low-budget graphics, and only if you're willing to use those low-budget settings. Others, like shooters, flight sims, and PC-only titles (Star Citizen, most every MMO, etc.), can't. That's why it's a pass.
Right now the BEST integrated graphics chips from Intel run worse than an NVIDIA GT 640 (at about 90% of its performance), a card released to the budget market in 2012. And the GT 640 is not exactly a great card or well respected:
... In early June, Kepler finally jumped in the kiddie pool aboard the GeForce GT 640, an offering situated firmly in budget territory. The card launched at $99 and currently retails in the $99.99-109.99 range, almost squarely opposite the Radeon HD 7750. Nvidia makes a couple other versions of the GT 640, too, but those are for PC vendors, and you won't find them listed at e-tailers like Newegg and Amazon.
So, yes, you can force the graphical settings down to something pretty low and get an almost playable FPS. But it's a big mistake: games only become more complex, and these chipsets can't even run older, popular games reasonably well. Look at these Battlefield FPS results:
http://www.xbitlabs.com/images/cpu/amd- ... ield-2.png
Battlefield 3 is one of the most popular online shooters. It is not a new title, yet it is still a rather heavy application that’s suitable for benchmarking flagship graphics cards. When running on integrated graphics cores at the Full-HD resolution, the game cannot be expected to have a playable frame rate even if you lower its visual quality settings to their minimums. You can only get something like an average 24 fps with slowdowns to 20 fps.
You also mentioned BioShock. Yes, you can really dumb it down and 'play it.' But here are the FPS benchmarks:
http://www.xbitlabs.com/images/cpu/amd- ... hock-2.png
BioShock Infinite runs on Unreal Engine 3, a highly popular and widely used game engine. Judging by the results, the engine doesn't work well with Intel's integrated graphics cores.
So we're not running games worth a damn on the Unreal Engine. That cuts out quality/decent graphics in the XCOM reboot, the Mass Effect series, the Infinity Blade series, the Gears of War series, the Borderlands series, the Batman series, and a ton of other games that aren't my style or interest (or, perhaps, aren't that great), but that other people like...
And we have the same issue with games on CryEngine: SOME of them barely run at an acceptable FPS at the lowest settings possible. Some just don't, even on the HD 4000 you mentioned:
http://www.xbitlabs.com/images/cpu/amd- ... ysis-1.png
And we haven't gotten into the tearing issue these chipsets have. But I think I've made my point: avoid integrated graphics if you want to game. You might be able to dumb SOME games down and get a low-quality experience; other games won't even load. What you're getting is really crappy, while a mere $150 card can get you at least two years of quality game play before you need to consider an upgrade.
Whereas the IG... Well, the results speak for themselves.