Haswell Update:

Because we have only managed to get hold of the top Haswell processor thus far, it is a little difficult to see where Haswell lies.  On the face of it, Haswell is more than adequate in our testing scenario for a single GPU experience and will perform as well as a mid-range CPU.  It is when you start moving up into more GPUs, more demanding games and higher resolutions that the big boys start to take control.

On almost all fronts, the i7-4770K is the preferred chip over anything Sandy Bridge-E – if not by virtue of its single-threaded speed, then by virtue of the price difference.  Sandy Bridge-E is still there if you need the raw CPU horsepower for other things.

Our analysis also shows that without the proper configuration in the BIOS, having a GPU at PCIe 2.0 x1 is really bad for scaling.  On the ASUS Z87 Pro, the third full-length PCIe slot is at x1 bandwidth, as it shares the four PCIe lanes from the chipset with other controllers on board – if it is moved up to PCIe 2.0 x4, then the other controllers are disabled.  Nonetheless, scaling at either PCIe 2.0 x1 or x4 cannot compete with a proper PCIe 3.0 x8/x4/x4 setup.
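
As a rough guide to why that x1 link hurts so much, consider the theoretical one-way bandwidth of each slot configuration: PCIe 2.0 carries about 500 MB/s per lane after 8b/10b encoding, while PCIe 3.0 carries about 985 MB/s per lane with 128b/130b encoding.  The short Python sketch below tabulates the ceilings – theoretical maximums, not measured numbers:

    # Theoretical one-way PCIe bandwidth per lane, in MB/s, after encoding
    # overhead (PCIe 2.0 uses 8b/10b, PCIe 3.0 uses 128b/130b).
    PER_LANE_MBS = {"2.0": 500, "3.0": 985}

    def slot_bandwidth(gen: str, lanes: int) -> int:
        """Theoretical one-way bandwidth of a slot in MB/s."""
        return PER_LANE_MBS[gen] * lanes

    for gen, lanes in [("2.0", 1), ("2.0", 4), ("3.0", 4), ("3.0", 8)]:
        print(f"PCIe {gen} x{lanes}: ~{slot_bandwidth(gen, lanes):,} MB/s")

    # Output:
    # PCIe 2.0 x1: ~500 MB/s    <- the shared chipset slot
    # PCIe 2.0 x4: ~2,000 MB/s
    # PCIe 3.0 x4: ~3,940 MB/s
    # PCIe 3.0 x8: ~7,880 MB/s

A GPU hanging off the x1 slot therefore gets a quarter of the bandwidth of the x4 configuration and around 6% of a PCIe 3.0 x8 link, which lines up with the poor scaling we measured.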

Over the course of Haswell's lifetime, we will update the results as we get hold of PLX-enabled motherboards for some of those x8/x8/x8/x8 layouts, not to mention the weird-looking PCIe 3.0 x8/x4/x4 + PCIe 2.0 x4 layouts seen on a couple of motherboards in our Z87 motherboard preview.

As mentioned in our last Gaming CPU testing, the results show several points worth noting.

Firstly, it is important to test accurately, fairly, and in good faith.  Performing a comparative test while misleading the audience through a failure to understand what happens underneath is a poor game to play.  Leave the bias at home and let the results do the talking.

In three of our games, having a single GPU makes almost no difference to which CPU performs the best.  Civilization V was the sole exception; it also has issues scaling when you add more GPUs if you do not have the most expensive CPUs on the market.  For Civilization V, I would suggest having only a single GPU and trying to get the best out of it.

In Dirt 3, Sleeping Dogs and Metro 2033, almost every CPU performed the same in a single GPU setup.  Moving up in GPU count, Dirt 3 leaned towards PCIe 3.0 above two GPUs, Metro 2033 started to lean towards AMD GPUs, and Sleeping Dogs was agnostic.

Above three GPUs, the extra horsepower from the single-threaded performance of an Intel CPU started to pay off, with as much as a 70 FPS difference in Dirt 3.  Sleeping Dogs was also starting to become sensitive to CPU choice.

We Know What Is Missing

As it has only been a month or so since the last Gaming CPU update, and with my hands deep in Haswell testing, new CPUs have not been streaming through the mail.  However, due to suggestions from readers and a little digging, I currently have the following list to acquire and test/retest:

Celeron G1101
Celeron G1620
Pentium G2020
Pentium G6950
i3-2100
i5-3570K
i5-4570T
i5-4670K
i3-560
i5-680
i5-760
i5-860
i5-880
i7-920
i7-950
i7-980X
QX9775
Q6600
Xeon E3-1220L v2
Xeon E3-1220v2
Xeon E3-1230v2
Xeon E3-1245v2
Athlon II X2 220
Athlon II X2 250
Athlon II X2 280
Athlon II X3 425
Athlon II X3 460
Sempron 145
Phenom II X3 740
Phenom II X4 820
Phenom II X4 925
Phenom II X6 1045T
FX-4130
FX-4200
FX-4300
FX-4350
FX-6200
FX-6350
A8-5600K + Core Parking retest
A10-5800K + Core Parking retest

As you can imagine, that is quite a list, and I will be breaking it down into sections and updates for everyone.

But for now, onto our recommendations.

Recommendations for the Games Tested at 1440p/Max Settings

A CPU for Single GPU Gaming: A8-5600K + Core Parking updates

If I were gaming today on a single GPU, the A8-5600K (or non-K equivalent) would strike me as a price competitive choice for frame rates, as long as you are not a big Civilization V player and do not mind the single-threaded performance.  The A8-5600K scores within a percentage point or two across the board in single GPU frame rates with both a HD7970 and a GTX580, as well as feeling the same in the OS as an equivalent Intel CPU.  The A8-5600K will also overclock a little, giving a boost, and comes in at a stout $110, meaning that some of those $$$ can go towards a beefier GPU or an SSD.  The only downside is if you are planning some heavy CPU work – if the software is Piledriver-aware, all is well, although most software is not, and in that case an i3-3225 or FX-8350 might be worth a look.
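
For readers wanting to apply a core parking adjustment themselves, the sketch below shows one common manual route on Windows via powercfg – to be clear, this is an illustrative script, not necessarily the exact update used in our retesting, and it assumes the documented powercfg alias CPMINCORES ("Processor performance core parking min cores") is present on your build.  Run it from an elevated prompt; setting the value to 100 keeps all cores unparked:

    # A minimal sketch, assuming Windows and an elevated prompt.
    import subprocess

    def unpark_all_cores() -> None:
        # Raise the core parking minimum to 100% on the active scheme (AC power).
        subprocess.run(["powercfg", "/setacvalueindex", "scheme_current",
                        "sub_processor", "CPMINCORES", "100"], check=True)
        # Re-apply the scheme so the change takes effect immediately.
        subprocess.run(["powercfg", "/setactive", "scheme_current"], check=True)

    if __name__ == "__main__":
        unpark_all_cores()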

It is possible to consider the non-IGP equivalents of the A8-5600K, such as the FX-4xxx variants or the Athlon X4 750K BE.  But as we have not had these chips in to test, it would be unethical to suggest them without data to back them up.  Watch this space – we have processors on the list to test.

A CPU for Dual GPU Gaming: i5-2500K or FX-8350

Looking back through the results, moving to a dual GPU setup obviously has some issues.  Various AMD platforms are not certified for dual NVIDIA cards, for example, meaning that while they may excel for AMD, you cannot recommend them for team Green.  There is also the dilemma that while in certain games you can be fairly GPU limited (Metro 2033, Sleeping Dogs), there are others where having the CPU horsepower can double the frame rate (Civilization V).

After the overview, my recommendation for dual GPU gaming falls at the feet of the i5-2500K.  This recommendation may seem odd – these chips are not the latest from Intel – but chances are that pre-owned they will be hitting a nice price point, especially if/when people move over to Haswell.  If you were buying new, the obvious answer would be an i5-3570K on Ivy Bridge rather than the 2500K, so consider this suggestion a minimum CPU recommendation.

On the AMD side, the FX-8350 puts up a good show across most of the benchmarks, but falls spectacularly in Civilization V.  If this is not the game you are aiming for and you want to invest in AMD, then the FX-8350 is a good choice for dual GPU gaming.

A CPU for Tri-GPU Gaming: i7-4770K with an x8/x4/x4 (AMD) or PLX (NVIDIA) motherboard

By moving up in GPU power we also have to boost the CPU power in order to see the best scaling at 1440p.  It might be a sad thing to hear, but the only CPUs in our testing that provide the top frame rates at this level are the top-line Ivy Bridge and Haswell models.  For a comparison point, the Sandy Bridge-E six-core results were often very similar, but the price jump to such a setup is prohibitive to all but the most sturdy of wallets.  Of course we would suggest Haswell over Ivy Bridge based on Haswell being the newer platform, but users who can get hold of the i7-3770K in a sale would reap the benefits.

As noted in the introduction, using 3-way on NVIDIA with Ivy Bridge/Haswell will require a PLX motherboard in order to get enough lanes to satisfy the SLI requirement of x8 minimum per card.  This also raises the bar in terms of price, as PLX motherboards start around the $280 mark.  For a 3-way AMD setup, an x8/x4/x4 enabled motherboard performs similarly to a PLX enabled one, and ahead of the slightly crippled x8/x8 + x4 variations.  However, investing in a PLX board would help when moving to a 4-way setup should that be your intended goal.  In either scenario, the i7-3770K or i7-4770K are the processors of choice from our testing suite.

A CPU for Quad-GPU Gaming: i7-3770K with a PLX motherboard

So our recommendation for four-way, based on results, would nominally be an i7-3770K.  We cannot recommend the 4770K as of yet, as we have no data to back it up!  That data will be coming in the next update, but if a prediction were to be made, the 4770K would be the preferred chip based on its single-threaded speed and its newer platform.

But even still, a four-way GPU configuration is for those insane few users that have both the money and the physical need for pixel power.  We are all aware of the law of diminishing returns, and more often than not adding that fourth GPU is taking the biscuit at most resolutions.  Despite this, even at 1440p we see awesome scaling in games like Sleeping Dogs (moving from three to four cards adds another 73% of a single card's frame rate, as the sketch below shows), and more recently I have seen four-way GTX680s give BF3 at Ultra settings a healthy 35 FPS minimum on a 4K monitor.  So while four-way setups are insane, there is clearly a usage scenario where having card number four matters.
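
For clarity, that scaling figure expresses the gain from the fourth card as a fraction of a single card's frame rate.  A toy calculation – the 40 FPS baseline here is hypothetical, chosen only to illustrate the arithmetic – looks like this:

    def incremental_scaling(fps_one: float, fps_three: float,
                            fps_four: float) -> float:
        """Gain from adding the fourth card, as a fraction of one card."""
        return (fps_four - fps_three) / fps_one

    # Hypothetical numbers: with a 40 FPS single-card baseline, +73% of a
    # single card means the fourth GPU adds roughly 29 FPS.
    print(f"{incremental_scaling(40.0, 105.0, 134.2):.0%}")  # -> 73%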

Our testing was pretty clear as to what CPUs are needed at 1440p with fairly powerful GPUs.  While the i7-2600K was nearly there in all our benchmarks, only two sets of CPUs consistently delivered the highest frame rates – the i7-3770K/4770K and any six-core Sandy Bridge-E.  As mentioned in the three-way conclusion, the price barrier to SB-E is a big step for most users (even if they are splashing out $1500+ on four big cards), giving the nod to an Ivy Bridge configuration.  Of course, that CPU will have to be paired with a PLX enabled motherboard as well.

One could argue that with overclocking the i7-2600K could come into play, and I do not doubt that is the case.  People building three- and four-way GPU monsters are more than likely to run extra cooling and overclock.  Unfortunately that adds plenty of variables and extra testing, which will have to wait for a later date.  For now our recommendation at stock, for 4-way at 1440p, is the i7-3770K.

What We Have Not Tested

In the intro to this update, I addressed a couple of points regarding testing 1440p over 1080p, as well as reasons for not using FCAT or reporting minimum FPS.  But one of the bigger issues brought up in the first Gaming CPU article comes from the multiplayer gaming perspective, when dealing with a 64-player map in BF3.  This is going to be a CPU-intensive situation for sure, with the CPU juggling the network interface, game processing and updates to the GPU.  The only issue from our side is repeatability.  I focused a lot on the statistics of reporting benchmarking results, and getting a consistent multiplayer environment for game testing that can be analyzed objectively is, for all intents and purposes, practically impossible.  Sure, I could play a few rounds in every configuration, but the FPS numbers would be all over the place based on how the rounds went.  I would not be happy publishing such data and then basing recommendations on it.
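
To put the repeatability problem in numbers, below is a minimal sketch – the FPS figures are made up purely for illustration – of the kind of confidence-interval check applied to repeated runs.  A scripted benchmark gives a tight interval; simulated multiplayer rounds give one so wide that ranking CPUs from it would be meaningless:

    import math
    import statistics

    def mean_with_ci(fps_runs, t=2.78):
        """Mean and 95% confidence half-width (t-value for n=5 runs, 4 dof)."""
        mean = statistics.mean(fps_runs)
        half = t * statistics.stdev(fps_runs) / math.sqrt(len(fps_runs))
        return mean, half

    # Made-up numbers: five runs of a scripted benchmark vs five 64-player rounds.
    canned      = [82.1, 81.7, 82.4, 81.9, 82.2]
    multiplayer = [71.0, 88.5, 64.2, 93.1, 77.8]

    for name, runs in (("canned", canned), ("multiplayer", multiplayer)):
        mean, half = mean_with_ci(runs)
        print(f"{name:11s}: {mean:5.1f} +/- {half:4.1f} FPS (95% CI)")

    # canned     :  82.1 +/-  0.3 FPS (95% CI)
    # multiplayer:  78.9 +/- 14.9 FPS (95% CI)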

The purpose of the data in this article is to help buying decisions based on the games at hand.  For a reader who plays more strenuous games, it is clear that riding the cusp of a CPU performance boundary might not be the best route, especially when modifications come into play that drag frame rates right down, or cause more complex calculations to be performed.  In that situation, it makes sense to play it safe with a more powerful processor, and as such our recommendations may not necessarily apply.  The recommendations aim for a balance between performance, price, and the state of affairs tested in this article at the present time; if a user knows that future titles are going to be demanding and needs a system for the next 3-5 years, some future-proofing will have to form part of the personal decision when it comes down to paying for hardware.

When friends or family come up to me and say ‘I want to play X and have Y to spend’ (not an uncommon occurrence), I try to match what they want with their budget – gaming typically gets a big GPU first and then a processor to match, depending on what sort of games they play.  With more CPUs under our belt here at AnandTech, and an added element of understanding of where the data comes from and how it was obtained, we hope to help make such decisions.

As always, we are open to suggestions!  I have had requests for Bioshock Infinite and Tomb Raider to be included – unfortunately each new driver update is still increasing performance for these titles, meaning that our numbers would not be relevant next quarter without a full retest.  I will hopefully put them in the testing with the next driver update.

Comments

  • ninjaquick - Tuesday, June 4, 2013 - link

    "If you were buying new, the obvious answer would be looking at an i5-3570K on Ivy Bridge rather than the 2500K"

    Ian basically wanted to get a relatively broad test suite, at as many performance points as possible. Haswell, however, is really quite a bit quicker. More than anything, this article is an introduction to how they are going to be testing moving forward, as well as a list of recommendations for different budgets.
  • dsumanik - Tuesday, June 4, 2013 - link

    Two-year-old mid-range tech is competitive with, and cheaper than, Haswell.

    Hence anandtech's recommendation.

    The best thing about haswell is the motherboards, which are damn nice.
  • TheJian - Wednesday, June 5, 2013 - link

    This is incorrect. It is only competitive when you TAP out the gpu by forcing it into situations it can't handle. If you drop the res to 1080p suddenly the CPU is VERY important and they part like the red sea.

    This is another attempt at covering for AMD and trying to help them sell products (you can judge whether it's intentional or not on your own). When no single card can handle the resolutions being forced on them (1440p) you end up with ALL cpu's looking like they're fine. This is just a case of every cpu saying hurry up mr. vid card I'm waiting (or we're all waiting). Lower the res to where they can handle it and cpus start to show their colors. If this article was written with 1080p being the focus (as even his own survey shows 96% of us use it OR lower, and adding in 1920x1200 you end up with 98.75%!!) you would see how badly AMD is doing vs Intel since the video cards would NOT be brick walled screaming under the load.

    http://www.tomshardware.com/reviews/neverwinter-pe...
    An example of what happens when you put the vid card at 1080p where cpu's can show their colors.
    "At this point, it's pretty clear that Neverwinter needs a pretty quick processor if you want the performance of a reasonably-fast graphics card to shine through. At 1920x1080, it doesn't matter if you have a Radeon HD 7790, GeForce GTX 650 Ti, Radeon HD 7970, or GeForce GTX 680 if you're only using a mid-range Core i5 processor. All of those cards are limited by our CPU, even though it offers four cores and a pretty quick clock rate."

    It's not just Civ5. I could point out how inaccurate the suggestions in this 1440p article are all day. Just start looking up cpu articles on other web sites and check the 1080p data. Most cpu articles show using a top card (7970 or 680 etc) so you get to see the TRUTH. The CPU is important in almost EVERY game, unless you shoot the resolution up so high they all score the same because your video card can't handle the job (thus making ANY cpu spend all day waiting on the vid card).

    I challenge anandtech to rerun the same suite, same chips at 1080p and prove I'm wrong. I DARE YOU.

    http://www.hardocp.com/article/2012/10/22/amd_fx83...
    More evidence of what happens when the gpu is NOT tapped out. Look at how Intel is KILLING AMD at hardocp. Even if you say "but eventually I'll up my res and spend $600 on a 1440p monitor", you have to understand that as you get better gpu's that can handle that res, you'll hate the fact you chose AMD for a cpu as it will AGAIN become the limiter.
    "Lost Planet is still used here at HardOCP because it is one of the few gaming engines that will reach fully into our 8C/8T processors. Here we see Vishera pull off its biggest victory yet when compared to Zambezi, but still lagging behind 4 less cores from Intel."

    "Again we see a new twist on the engine above, and it too will reach into our 8C/8T. While not as pronounced as Lost Planet, Lost Planet 2 engine shows off the Vishera processors advancements, yet it still trails Intel's technology by a wide margin."

    "The STALKER engine shows almost as big an increase as we saw above, yet with Intel still dealing a crippling gaming blow to AMD's newest architecture."
    Yeah, a 65% faster Intel is a LOT right? Understand if you go AMD now, once you buy a card (20nm maxwell etc? 14nm eventually in 3yrs?) you will CRY over your cpu limiting you at even 1440p. Note the video card Hardocp used for testing was ONLY a GTX 470. That's old junk; he could now run with a 7970ghz or 780gtx, up the res to 1080p and show the same results. AMD would get a shellacking.

    http://techreport.com/review/24879/intel-core-i7-4...
    Here, techreport did it in 1080p. 20% lower for the A10-5800 than the 4770K in Crysis 3. It gets worse with Far Cry 3 etc. In Far Cry 3 the i7-4770K scored 96fps at 1080p, yet AMD's A10-5800 scored a measly 68. OUCH. So roughly 30% slower in this game. HOLY COW man, check out Tomb Raider...Intel 126fps! AMD A10-5800 68fps! Does Anandtech still say this is a good cpu to go with? At the res 98.75% of us run at, YOU ARE WRONG. That's almost 2x faster in Tomb Raider at 1080p! Metro Last Light: Intel 93fps vs. AMD A10-5800 51fps, again almost TWO TIMES faster!

    From Ian's conclusion page here:
    "If I were gaming today on a single GPU, the A8-5600K (or non-K equivalent) would strike me as a price competitive choice for frame rates, as long as you are not a big Civilization V player and do not mind the single threaded performance. The A8-5600K scores within a percentage point or two across the board in single GPU frame rates with both a HD7970 and a GTX580, as well as feel the same in the OS as an equivalent Intel CPU."

    He's not even talking about the A10-5800 that got SMASHED at techreport as shown in the link. Note they only used a Radeon 7950. A 7970ghz or GTX 780 would be even less taxed and show even larger CPU separations. I hope people are getting the point here. Anandtech is MISLEADING you at best by showing a resolution higher than 98.75% of us are using and tapping out the single gpu. I could post a dozen other cpu reviews showing the same results. Don't walk, RUN away from AMD if you are a gamer today (or tomorrow). Haswell boards are supposed to take a Broadwell chip also, even more ammo to run from AMD.

    Ian is recommending a cpu that is lower than the one I show getting KILLED here. Games might not even be playable as the A10-5800 was hitting 50fps AVG on some things. What would you hit with a lower cpu avg, and worse what would the mins be? Unplayable? Get a better CPU. You've been warned.
  • haukionkannel - Wednesday, June 5, 2013 - link

    Hmm... If the game is fast enough at 1440p then it is fast enough for 1080p... We are talking about serious players. Who on earth would buy a 7970 or 580 for gaming at 1080p? That is serious overkill...
    We all know that Intel will run faster if we use 720p, just because it is a faster CPU than AMD; nothing new there since the era of the Pentium4 and Athlon2. What this article tells us is that if you want to play games with some serious GPU power, you can save money by using an AMD CPU for a single, or even in some cases a double, GPU setup. If you go beyond that the CPU becomes a bottleneck.
  • TheJian - Thursday, June 6, 2013 - link

    The killing happened at 1080p also which is what techreport showed. Since 98.75% of us run 1920x1200 or below, I'm thinking that is pretty important data.

    The second you put in more than one card the cpus separate even at 1440p. Meaning, next year's SINGLE card or the one after will AGAIN separate the cpus, as that single card will be able to wait on the CPU as the bottleneck goes back to the cpu. Putting that aside, hardocp showed even the mighty titan at $1000 had stuff turned off at 1080p. So you are incorrect. Is it serious overkill if hardocp is turning stuff off for a smooth game experience? 7970/GTX680 had to turn off even more stuff in the 780GTX review (titan and 780gtx mostly had the same stuff on, but the 7970ghz and 680gtx they compared to turned off quite a bit to remain above 30fps).

    I'm a serious player, and I can't run 1920x1200 with my radeon 5850 which was $300 when I bought it. I'm hoping maxwell will get me 30fps with EVERYTHING on in a few games at 1440p (I'm planning on buying a 27 or 30in at some point) and for the ones that don't I'll play them on my Dell 24 as I do now. But the current cards (without spending a grand and even that don't work) in single format still have trouble with 1080p as hardocp etc has shown. I want my next card to at least play EVERY game at 1920x1200 on my dell, and hope for a good portion on the next monitor purchase. With the 5850 I run a lot of games on my 22in at 1680x1050 to enable everything. I don't like turning stuff down or off, as that isn't how the dev intended me to play their game right?

    Apparently you think all 7970 and 580 owners are running 1440p and up? Ridiculous. The steam survey says you are woefully incorrect. 98.75% of us are running 1920x1200 or below and a TON of us have 7970, 680, 580 etc etc (not me yet) and enjoying the fact that they NEVER turn stuff down (well, apparently you still do on some games...see the point?). Only DUAL card owners are running above, as the steam survey shows; go there and check out the breakdown. You can see the population (even as small as that 1% is...LOL) has TWO cards running above 1920x1200. So you are categorically incorrect, or do steam's users change all their resolutions down just to fake a survey?...ROFL. Ok. Whatever. You expect me to believe they get done with the survey and jack it up for UNDER 30fps gameplay? Ok...

    Even here, at 1440p for instance, metro only ran 34fps (and last light is more taxing than 2033). How low do you think the minimums are when you're only doing 34fps AVERAGE? UNPLAYABLE. I can pull anandtech quotes that say you'd really like 60fps to NEVER dip below 30fps minimum. In that they are actually correct and other sites agree...
    http://www.guru3d.com/articles_pages/palit_geforce...
    "Frames per second Gameplay
    <30 FPS very limited gameplay
    30-40 FPS average yet very playable
    40-60 FPS good gameplay
    >60 FPS best possible gameplay

    So if a graphics card barely manages less than 30 FPS, then the game is not very playable, we want to avoid that at all cost.
    With 30 FPS up-to roughly 40 FPS you'll be very able to play the game with perhaps a tiny stutter at certain graphically intensive parts. Overall a very enjoyable experience. Match the best possible resolution to this result and you'll have the best possible rendering quality versus resolution, hey you want both of them to be as high as possible.
    When a graphics card is doing 60 FPS on average or higher then you can rest assured that the game will likely play extremely smoothly at every point in the game, turn on every possible in-game IQ setting."

    So as the single 7970 (assuming ghz edition here in this 1440p article) can barely hit 34fps, by guru3d's definition it's going to STUTTER. Right? You can check max/avg/min everywhere and you'll see there is a HUGE diff between min and avg. Thus the 60fps point is assumed good to ensure above 30 min and no stutter (I'd argue higher depending on the game, multiplayer etc as you can tank when tons of crap is going on). Guru3d puts that in EVERY gpu article.

    The single 580 in this article can't even hit 24fps and that is AN AVERAGE. So unplayable totally, thus making the whole point moot right? You're going to drop to 1080p just to hit 30fps and you say this and a 7970 is overkill for 1080p? Even this FLAWED article here proves you WRONG.

    Sleeping dogs right here in this review on a SINGLE 7970 UNDER 30fps AVERAGE. What planet are you playing on? If you are hitting 28.2fps avg your gameplay SUCKS!

    http://www.tomshardware.com/reviews/geforce-gtx-77...
    Bioshock infinite 31fps on GTX 580...Umm, mins are going to stutter at 1440p right? Even the 680 only gets 37fps...You'll need to turn both down for anything fluid maxed out. Same res for Crysis 3 shows even the Titan only hitting 32fps and with DETAILS DOWN. So mins will stutter right? MSAA is low, you have two more levels above this which would put it into single digits for mins a lot. Even this low on msaa the 580 never gets above 22fps avg...LOL. You want to rethink your comments yet? The 580's avg was 18 FPS! 1440p is NOT for a SINGLE 580...LOL. Only 25fps for 7970...LOL. NOT PLAYABLE on your 7970ghz either. Clearly this game is 1080p huh? Look how much time in the graph 7970ghz spends BELOW 20fps at 1440p. Serious gamers play at 1080p unless they have two cards. FAR CRY 3, same story. 7970ghz is 29fps...ROFL. The 580 scores 21fps...You go right ahead and try to play these games at 1440p. Welcome to the stutterfest my friend.
    "GeForce GTX 770 and Radeon HD 7970 GHz Edition nearly track together, dipping into the mid-20 FPS range."
    Yeah, Far Cry will be good at 20fps.

    Hitman Absolution has to disable MSAA totally...LOL. Even then 580 only hits 40fps avg.

    Note the tomb raider comment at 1440p:
    "The GeForce GTX 770 bests Nvidia’s GeForce GTX 680, but neither card is really fluid enough to call the Ultimate Quality preset smooth."
    So 36fps and 39fps avg for those two is NOT SMOOTH. 770 dropped to 20fps for a while.

    A titan isn't even serious overkill for 1080p. It's just good enough, and for hardocp a game or two had to be turned down even on it at 1080p! The data doesn't lie. Single cards are for 1080p. How many games do I have to show you dipping into the 20's before you get it? Batman AC barely hits 30's avg on a 7970ghz with 8xmsaa and you have to turn physx off (not nv physx, physx period). Check tom's charts for gpus.

    In hardocp's review of 770gtx 1080p was barely playable with 680gtx and everything on. Upping to 2560x1600 caused nearly every card to need tessellation down and physx off in Metro Last Light. 31fps min on 770 with SSAA OFF and Physx OFF!
    http://hardocp.com/article/2013/05/30/msi_geforce_...
    You must like turning stuff off. I don't think you're a serious gamer until you turn everything on and expect it to run there. NO SACRIFICING quality! Are we done yet? If this article really tells you to pair expensive gpus ($400-1000) with a cheapo $115 AMD cpu then they are clearly misleading you. It looks like that is exactly what they got you to believe. Never mind your double gpu comment paired with the same crap cpu, adding to the ridiculous claims here already.
  • Calinou__ - Friday, June 7, 2013 - link

    "Serious gamers play at 1080p unless they have two cards."

    Fun fact 2: there are properly coded games out there which will run fine in 2560×1440 on mid-high end cards.
  • TheJian - Sunday, June 9, 2013 - link

    No argument there. My point wasn't that you can't find a game to run at 1440p ok. I could cite many, though I think most wouldn't be maxed out doing it on mid cards and surely aren't what most consider graphically intensive. But there are FAR too many that don't run there without turning lots of stuff off, as many sites I linked to show. Also, 98.75% of us don't even have monitors that go above 1920x1200 (I can't see many running NON-native, but it's possible), so I'm not quite sure fun fact 2 matters much; my statement is still correct for nearly 99% of the world, right? :) There are probably a few people in here who care what the top speed of a Veyron SS is (maybe they can afford one, 258mph I think), but for the vast majority of us, we couldn't care less about it since we'll never buy a car over 100K. I probably could have said 50K and still be right for most.

    Your statement kind of implies coders are lazy :) Not going to argue that point either...LOL. Not all coders are lazy mind you...But with so much power on pc's it's probably hard not to be lazy occasionally, not to mention they have the ability to patch them to death afterwards. I can handle 1-2 patches but if you need 5 just to get it to run properly after launch on most hardware (unless adding features/play balancing etc like a skyrim type game etc) maybe you should have kept it in house for another month or two of QA :) Just a thought...
  • Sabresiberian - Wednesday, June 5, 2013 - link

    So, you want an article specifically written for gaming at 2560x1440 to do the testing at 1920x1080?

    Your rant starts from that low point and goes downhill from there.
  • TheJian - Thursday, June 6, 2013 - link

    You completely missed the point. The article is testing for .87% of the market. That is less than one percent. This article will be nice to reprint in 2-3yrs...then it may actually be relevant. THAT is the point. I think it's a pretty HIGH point, not low, and the fact that you choose to ignore the data in my post doesn't make it any less valid or real. Nice try though :) Come back when you have some data actually making a relevant point please.
  • Calinou__ - Friday, June 7, 2013 - link

    So, all the websites that are about Linux should shut down because Linux has ~1% market share? Nope.
