I have to mention power also. Considering that an HTPC is probably going to be on quite a bit, I would like to see some info on the power draw of these mobos. The same goes for the roundup!! Also, considering that energy prices are headed only UP, even slightly lower performance might be worth it in the long run!!
Frankly, I'm quite frustrated after waiting for the past half a year or more for a site like AT to come out with more mATX reviews (until this review, which is a start).
I realize there are a lot of gamers and OCers out there - very many AT readers. However, there are many (just as many?) non-gamer enthusiasts hoping to run stock-speed, SILENT, SFF systems out there - myself for one.
While lesser known sites have reviewed many of these products, I (and others like me I know of) have been waiting for AT to publish SOMETHING in the mATX / C2D (current and long-standing performance champ depending on the system config.) category for months on end! I realize you have dedicated folks working on each review category. However, AT as a whole still seems to have enough bandwidth to publish back-to-back LCD and heatsink reviews in a matter of a day or two each. Yet you seem to have held off on prioritizing mATX system reviews for some inexplicable milestone until yesterday. I recall reading a vague comment in one of your reviews around the end of the year regarding an 'upcoming' mATX review. In my opinion it was already too delayed a review. Little did I know I'd be waiting another two months for such a review.
Geez! Publish the review in parts if you must, but don't make your readers wait this long and think all is well! What's the point of releasing this type of review months after products became widely available and just a few months before the next round of technology updates?!
-----------------------
Second is a set of requests for the (personally) much-anticipated mATX review coming next week, as well as for future reviews:
Requests for the upcoming mATX review:
* Please try to include the Asus P5B-VM. It's one of the best-featured G965 boards currently available.
- How do mATX boards compare to ATX boards "for non-gaming tasks such as video / audio editing, general productivity, multi-tasking, etc."?
- "How much of a performance hit does a G965-type mATX motherboard with integrated graphics incur as a result of sharing memory bus bandwidth with the CPU, for NON-GAMING benchmarks, compared to regular C2D ATX boards?" (Assuming, of course, the user chooses to use integrated graphics vs. a discrete solution and has that enabled in the BIOS.)
Please BE SURE to address these and other such real-world topics and help make the review more meaningful for folks like me.
Requests for future reviews:
* Consider investing more time and effort in SFF / mATX / silent PC config-based reviews! Yes, there is an audience out there...
* For a site this major and popular - both with readers and vendors - you need to seriously evaluate your time-to-publishing lags for some of these reviews - C2D mATX roundup review for one. I realize there are a million things you can review and only 24 hours in a day. Delayed reviews (compared to when the products came out) don't help your readers as much - think luxury car depreciation over time... :)
I appreciate you reviewing the feedback and requests in detail. Hopefully we'll see some follow-up action based on this where appropriate.
Also, thanks for replying to some of the key questions I've had around mATX vs. ATX boards. The lack of a major performance delta is very good to hear about, at least for pre-Vista Windows OSes. Interesting.
Based on your comments in the forum posting re: Vista + IGP + memory latency, I am intrigued. If you are going to cover this in the upcoming review, feel free to say so and defer this question. Otherwise, I am curious what performance difference we are talking about between XP and Vista using IGP solutions. Any pointers for this comparison would help in deciding whether or not a Vista purchase is worthwhile from a performance standpoint in such categories / applications.
Also, have you transitioned to exclusively testing using Vista?
I'm buying a small-form-factor PC with an Intel 965G motherboard (it was the only option), and I'm plugging in an NVIDIA 8800GTX video card. So, I was wondering how these IGP motherboards (specifically this Intel one) perform in general with a vidcard plugged in. Is performance on par with (or at least somewhere close to) that of full-size motherboards? Or am I getting screwed?
It depends on the board that you buy. The Gigabyte GA-965G-DS3 allows for a decent level of overclocking (330FSB) and memory options (CAS 3 operation); the overall performance difference will not be noticeable in day to day activities when compared to a more enthusiast level board. A base G965 board will not offer the same overclocking options, and a couple of the boards only allow CAS 5 operation at DDR2-800, but once again, the performance delta overall will be less than 5% in most cases - nothing to be concerned about, especially given your choice of video card.
Great, thanks for the quick reply! This is a high-quality site =). The manufacturer is Maingear, and the board they're using for my system is simply identified generically as "Intel 965G Express," but based on your response, I have faith that I am, indeed, not getting screwed =). I've built all my previous machines, but I'm getting old and fat and lazy, so I figured I'd spend a few hundred extra and have someone do it for me. Not too worried about SLI or overclocking at this point...I'll accept whatever resolution I can run Oblivion in, as long as I can run it. Thanx again.
For the mATX review, you should include results for the Abit Fatal1ty F-I90HD.
It's basically the 690G version for Intel CPUs... and that'd allow a direct comparison between Conroe and AM2 CPUs as the chipset would be the same.
Just a thought...
Chuck
P.S. Plus, I'm sure there's a good amount of people that'd like to run Conroe on a cheap but good mATX board, and the Abit Fatal1ty F-I90HD looks to be about the best option out right now for that (albeit in limited quantities so far...), just too bad it doesn't have onboard Firewire (at least I don't think it does, it wasn't listed on the spec page), because then it'd have just about everything one could want...
Looks like there's definitely no Firewire... :( :( :(
What are these manufacturers thinking (or rather not thinking), not including Firewire on these boards? These would be totally complete solutions, especially this Abit with the optical out it has, if they'd only have Firewire on them...
...and the expansion is so limited that putting in an add-in Firewire card basically kills the options for TV tuner, capture, etc. additions.
Man...talk about something that's almost perfect that gets ruined by either a poor design decision or a poor bean counter decision... :(
"The 6150 performs okay considering the age of its core and we will see the new 6150SE and older 6100 chipset performing a few percent better overall but not enough to catch the 690G."
How would the 6100 be a few percent better when it is clocked lower?
The review over at Bit-tech.net (http://www.bit-tech.net/hardware/2007/03/02/amd_69...) says the 690G supports dual-link DVI and confirmed as much by sending 2560x1600 over DVI to the Dell 30-incher. This review, however, says "Larger 30" flat panel monitors won't be able to run at native resolution" and the technology overview article says "The digital outputs use TMDS transmitters that run at 165MHz". What's the deal?
The 690G supports Dual-Link DVI. We had stated this on page two but not in a separate section. I will reword the 2D paragraph to make this clear. As for the resolution, I am using a Samsung 30" panel and the current Vista drivers limit me to 2048x1536. I have sent a board to Jarred who has the Dell 30" to test on it. AMD still confirms that 2048x1536 is the "current" max resolution although we know the hardware has 2560x1600 capability according to one of our sources.
Hmmm, something's not quite right it seems. Can't see why they were able to send 2560x1600 if you couldn't. Would definitely appreciate Jarred checking it on the Dell, although I'd be surprised if it was a monitor issue. Who knows without trying. Have asked Bit-tech what OS they were using to get it to work. An XP vs. Vista issue perhaps? The related paragraph in the technology overview article mentions the TMDS transmitters run at 165MHz, which I understand is single-link? Have seen the 165MHz listed elsewhere for the 690G, so am curious where this info comes from if the chipset is dual-link? Unless I've misunderstood something about "165MHz"?
The DVI spec transmits data using the transition minimized differential signaling (TMDS) protocol. The DVI spec calls for each DVI output to have at least one TMDS "link" consisting of three data channels (RGB) and one control channel. The maximum speed at which a single 10-bit TMDS link may operate is 165MHz, offering 1.65Gbps of bandwidth per data channel. In real world terms, this means a single 10-bit TMDS link can drive a display at up to 1920x1200 (the actual maximum resolution can vary depending on the panel; the spec is 1920x1080). For most displays that's not a problem, but the 30" displays have a native resolution of 2560x1600, which exceeds the bandwidth a single TMDS link can deliver. So what do you do? Remember that the DVI spec calls for at least one TMDS link, but each DVI port can support up to two TMDS links (the 690G has dual TMDS links), thus doubling the maximum bandwidth and enabling support for a 30" display (if driver support is present) or even some of the new 27" units that can run at 2048x1536.
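To put some rough numbers on that, here is a quick back-of-the-envelope sketch (my own figures, not from the review): it assumes a 60Hz refresh and roughly 12% blanking overhead, which approximates CVT reduced-blanking timings, just to show why 1920x1200 squeaks under a single 165MHz link while 2560x1600 needs the second one.

```python
# Rough check: does a given display mode fit within one 165MHz TMDS link?
# Assumptions (mine, not from the article): 60Hz refresh and ~12% blanking
# overhead, which approximates CVT reduced-blanking timings.

SINGLE_LINK_MAX_MHZ = 165.0  # max pixel clock of a single TMDS link per the DVI spec

def required_pixel_clock_mhz(h_active, v_active, refresh_hz=60, blanking=0.12):
    """Approximate pixel clock (MHz) needed to drive a mode, blanking included."""
    total_pixels_per_frame = h_active * v_active * (1 + blanking)
    return total_pixels_per_frame * refresh_hz / 1e6

for w, h in [(1920, 1200), (2048, 1536), (2560, 1600)]:
    clk = required_pixel_clock_mhz(w, h)
    links = 1 if clk <= SINGLE_LINK_MAX_MHZ else 2
    print(f"{w}x{h}@60Hz -> ~{clk:.0f} MHz pixel clock, needs {links} TMDS link(s)")
```

With those assumptions you get roughly 155MHz for 1920x1200 (one link), and about 211MHz and 275MHz for 2048x1536 and 2560x1600 respectively, both of which only fit once the second TMDS link is in play.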
Thanks for the reply Gary. That was precisely my understanding of the situation, which is why I found the following quote from the technology overview article confusing: "The digital outputs each use TMDS transmitters that run at 165MHz." This sentence didn't come across as saying the digital outputs had 2 TMDS "links" but rather just 1 running at 165MHz (hence single-link). Perhaps you could reword it to explain that each link runs at 165MHz but that there are actually 2 links, in order to support the higher resolutions afforded by dual-link DVI. Don't mean to be picky, just think this part could be a little clearer :-)
As for the resolution cap at 2048x1536 you guys are experiencing, the Bit-Tech guys have confirmed they got 2560x1600 working on XP and suggest your problem is an issue with the current Vista drivers.
That's cleared that up then (it was merely a driver issue). Anyhow, two questions:
1) Both digital outputs support HDCP but are on separate display controllers. Does that mean they have two built-in crypto ROMs (one for each controller), given that separate crypto ROMs are required for each controller/output? If they do have two, then why only allow HDCP on one output at a time?
2) On a related point (the upcoming mobile version of the chipset), what connection do laptops use internally for their screens? The reason I ask is that I'm interested in getting a laptop in the future which supports HDCP both on the laptop screen and over an external digital connection to a larger display.
I know it's onboard video and stuff, but a 3DMark06 score of 313? They should be able to do better than that - see who can get it into the 1000s first. Although unlikely, it'd be a nice alternative to buying a video card for a basic computing system.
What's worse is that the G965 scores almost twice as high in 3DMark06... and then falls flat on its face in actual gaming tests. (Well, most of them anyway.)
quote: What's worse is that the G965 scores almost twice as high in 3DMark06... and then falls flat on its face in actual gaming tests. (Well, most of them anyway.)
I think it might be a combination of Vista too. Half Life 2 can score ~20 fps with the G965 at the same settings AT tested at when using Windows XP. I would also like to see how it performs in XP. It seems the G965 suffers more from Vista than other IGPs.
I think it would be good insurance, given how amazingly late 690G is, to please confirm with AMD that 690G motherboards will definitely support the AM2+ CPUs this late summer/fall.
And before people remind me that this is already fact, we have not, to my knowledge, seen AMD themselves confirm this... which, for something so seemingly simple to confirm, is getting disturbingly telling.
When AnandTech updates their article and says that they've gone back and confirmed with AMD that all 690G boards being released will support AM2+, or AMD themselves says it, then we'll know for sure. Until then, it's rumor...
AMD has not officially stated whether the Agena/Kuma will be drop-in compatible with current AM2 chipsets (or even the AM2 socket). We'd certainly love to know, but we're still waiting along with everyone else.
Do they have any idea, and have you specifically put the question to them? I'm sure you have contacts at AMD...
...because from what I can tell, if a discrete graphics card is used, this chipset is looking like when Dr. Evil says, 1 million dollars! ...and then everyone is like, Uh, big deal...
...this thing should have released in Sept. of last year and then become the de facto AMD chipset, not be released now - as you point out - with MCP68 right around the corner and the G35 coming also.
This chipset really looks to me like a could-have-been. Good work ATI (and then AMD)!!!
We asked the question Chuck, have not received an official answer yet. While backwards compatibility has always been discussed as possible by AMD, we are still not convinced about any of the current motherboards. One only has to look at the Conroe launch last year and realize that while the chipsets were compatible, the motherboards were not without an update. We just recently saw this again with Kentsfield. We wish this chipset would have been released last fall also. ;)
I guess now I've got to really sit down and decide what's the best course of action for my godson's and cousins' builds. They're both going to be budget builds, but I don't want to build them an AM2 system and basically have it be End Of Lifed in 3-4 months.
You'd think if AMD wanted to stop the hemorrhaging they're seeing on the enthusiast side, they'd make a statement about AM2+ compatibility now, rather than wait and just keep losing more and more. Not that a lot won't go over to the Intel side, but still, tell me 690G and, say, MCP68 will be the only AM2 chipsets that can take AM2+ CPUs, and now at least I've got a comfortable long term upgrade path.
Leave me in doubt, and I might as well get a 1333FSB Intel board and go to the dark side...
Looking forward to that mATX review...you think it'll be out this time next week, or towards Friday?
Honestly, the fact that AMD hasn't come out and said the next gen CPUs will work in AM2 speaks volumes in my book. Perhaps they are just trying to keep things quiet so that a bunch of people won't complain that their particular board won't run the new processors (some 939 boards wouldn't work with X2, after all, and there were some complaints saying AMD "guaranteed backwards compatibility"). Hopefully that's all it is, but I am seriously concerned that Kuma and Agena will not work in the vast majority of AM2 boards - and that's assuming they'll work in any at all.
If AMD doesn't support older boards with the new processors, they are going to need some really impressive performance to keep people from raising Cain. As it stands, if a reasonably fast X2 5200+ or so isn't good enough for your long-term needs, I certainly wouldn't purchase a new AM2 system with the hope of an upgrade until the truth comes out.
Final thought: The Quad FX platform has clearly been stated as being forwards compatible with native quad core Barcelona chips. If AMD is willing to make that commitment, why not make a similar commitment with AM2 and Agena/Kuma?
Are we going to see some details regarding HD (1080P) playback and whether it can do it comfortably or not? I appreciate you made a small comment about it, but this was lifted from the original look at the 690G chipset a few days ago, so there's no real update in this review. I'm just puzzled no sites are taking a closer look at this, considering it's surely one of the main points, if not the whole point, of HDMI being there.
I'm not interested at all in using this for games; I want a 1080p capable machine.
I'm considering a board based on the 690G for my new HTPC, but now I see it won't be able to output 1080P? Yipes... even the lowly 6150 can output at 1080P, correct?
Games are not important with regard to this board, but if it can't output at 1080p, what use would it be in an HTPC??
Also, any ETA on when the mATX roundup will be released?
The board will not do 1080P over the HDMI port at this time. 720P is working fine. The mATX roundup will be up on the 19th, provided my heart is still working by that time; have to say that testing under Vista is not a pleasant experience. ;)
Gary, thanks for the info. Is the lack of ability to do 1080p related to Vista, or will it just not do it at this stage? Is it likely BIOS/driver updates will improve this?
The platform will not do 1080P playback at this time in a consistent manner. As stated, we normally would end up with a slide show or a blank screen. AMD has told us 1080P will be possible with a driver update, proper playback support (PowerDVD or WinDVD), and a processor along the lines of a 5200+. We received a new driver update to address video quality issues we found late in testing, but 1080P was not addressed yet. I am just as anxious as everyone else to see if it will do 1080P. ;)
This has to be one of the worst reviews ever done at AnandTech; it almost makes you think somebody was paid to do it this badly.
we fully believe the majority of the performance difference lies in the chipset selection.
Is this a joke or what? The 2.6GHz 5200+ against a 1.86GHz Core 2 in media encoding and you think it is the chipset?! Every other test you made put the E6300 in between the 3800+ and 4200+.
Nero Recode 2 performance:
AnyDVD Rip = 3-way tie more or less
Shrink = 6150 leads, G965 second, 690G last (despite 6150 and 690G using the same CPU)
Shrink/Burn = G965 first, 6150 and 690G virtually tied.
Full quote, instead of your selected text: "Of course, we are using a mid-range AM2 processor against the budget C2D part (the AMD price cuts have helped matters there, as the price difference is currently only about $35) but we fully believe the majority of the performance difference lies in the chipset selection. It is only in the shrink and burn tests that we see the Intel platform flexing its muscles...."
In other words, the difference we saw in the Shrink test indicates that the 6150 chipset is better for this task than 690G. We definitely know that the Core 2 Duo is faster at equivalent CPU prices than X2 chips, but we're looking at platforms and chipsets and not just CPUs.
Gawd, people, if you're going to tell us that we'll "have to play at 800x600" if we use these integrated graphics, why not test the games at 800x600 and report framerates? Find the highest (lowest) settings necessary to get a playable experience and tell us that. No one's going to run a game at 1024x768, get 15fps, and then give up; they're going to crank the settings and res down until they can play the game.
It's not enough to just say "integrated graphics are unsuitable for even casual gamers, buy a discrete card" and then not quantify the difference.
I just wanted to say I'm looking forward to the mATX roundup
Nice article overall, thanks. And you might want to invest in some cheap Intel and AMD processors (the low-end, around $100 each), just to be able to compare them (I'm not suggesting complete testing on every processor possible).
I just read the first page of this review, and wonder if a 15 year old kid wrote this.
Let's grow up a little and stop thinking the world revolves around people that play video games all day and whine about big companies that actually think people use computers for real work!
There's a perfectly good reason that Intel, et al, release IGPs that don't have great 3D processing power. I mean think before you write something so inane. The world doesn't all play games, and not everyone is an idiot and wants to play a mindless 3D shoot 'em up game and thinks extra resolution makes it so much more fun. Pac-Man, Space Invaders and Defender were incredibly crude by today's standards, but people loved them. It's the idea, not the resolution.
Also keep in mind it's not like there is no downside to creating IGPs with all this functionality. You burn up electricity with those extra transistors, even when the vast majority of people, the vast majority of the time, aren't using them. You also generate more heat, and you make them more expensive. Why should someone that doesn't play 3D games pay for this when they don't use it? Why should we use more electricity so some simpletons that want to zap aliens can have that ability in an IGP? Isn't it enough they make cards specifically for that? Shouldn't only the people that want this capability have to pay for it? Why should other machines be crippled with this heat generating, electricity using capability when they never use it?
So, if you're still incredulous about why companies still make relatively slow IGPs, it's because they fill the need for the vast majority of the people that buy them. And if they do not, there are options available that do. That's a perfect solution, you get it if you need it, and don't if you don't. Why whine about it so much?
quote: I just read the first page of this review, and wonder if a 15 year old kid wrote this.
I wish I was 15 again. :)
quote: Let's grow up a little and stop thinking the world revolves around people that play video games all day and whine about big companies that actually think people use computers for real work!
I fully understand the world does not revolve around gaming and that most people use computers for work. However, these same people might have kids who want to play games, or they decide to try it also, and without knowledge of what it takes to run a game properly they pick up the phone and call Dell or HP or go on-line. Unfortunately, the majority of the machines they will see and probably can afford are going to be IGP solutions. The reps or the machine specs on-line will lead you to believe the $650 unit you just picked out will cover all of your needs, and that is wrong. One of Dell's top selling units is the 521 series and is advertised as "Built for Smooth, Advanced Performance, Edit your videos and photos, play games and more." Guess what, it just barely does most of that, and we hear people complain all the time that their PC is not fast enough for this application or this game, or a myriad of other uses.
quote: So, if you're still incredulous about why companies still make relatively slow IGPs, it's because they fill the need for the vast majority of the people that buy them. And if they do not, there are options available that do. That's a perfect solution, you get it if you need it, and don't if you don't. Why whine about it so much?
Probably the same reason we whined about Yugos and Pintos and the reason you no longer see them in production. Sure, they filled a basic need, but once you got in a Civic or a Corolla you fully realized that something faster, better, and of higher quality was available for nearly the same price. The technology is there to greatly improve IGP performance at very little cost. We do not expect these solutions to ever provide the same level of performance as dedicated GPU cards, but we do expect more out of them. Intel promised the world they would have a class leading DX9 IGP solution in the G965; what we ended up with was a solution that barely worked better than the 945G. If you are going to advertise functionality that leads one to believe you can play the latest games on it and mass market it, then expect us to report our findings and to call them out on it if it does not meet the marketing specifications. The same holds true for NVIDIA, AMD, SIS, or VIA in this market sector. We have the same issue with the low end video cards also; in some cases their performance is no better than the IGP solutions, which means you just spent $100 for the same performance, yet the words all over the product box would lead you to believe otherwise.
My big problem with what you said in the article is that you are blaming Intel, and to a lesser extent, for creating a great chipset for the vast majority of people that use it. The negative remarks about how they are making it difficult for those poor developers are completely misplaced; they make an excellent product that does what it should. In your response, you are now changing your position, I believe. It seems that Dell and HP are misrepresenting products they are selling, and that is the root problem. I am not privy to that, but if that is your contention, I have no problem with it. It could be correct, but pointing fingers at Intel and other IGP makers is misplaced. They make a product for a huge market and address it well, and if some maker misrepresents it, that's their issue and not the maker's. That would be like complaining that a Ford Mustang isn't as nimble as a Lotus Elise, because sometimes dealers say it's the best handling car. Is it Ford's fault that it is being sold incorrectly?
Your remarks about the Pinto and Yugo are totally misplaced. The Pinto died because it had a nasty tendency to explode when hit in the rear, and the Yugo was unreliable to the extreme. Neither was particularly well suited for its target audience, but the class of cars itself was. You kind of make my point for me. A lot of people still like subcompact cars for exactly the reason I stated - they don't need a big monster and they don't want to pay for the gas or the car in the first place. They get what they need that does the job adequately.
It's kind of childish, and megalomaniacal, to think the whole world should have to pay more for their computers, and burn more power doing it, so a small subset of people can benefit because game developers can develop to a higher standard. So the business world, and everyone that doesn't play 3D games, should subsidize game players by having to buy something useless that uses extra power, probably makes more noise, and costs more - all so game players can get developers to design to something better. Sorry, I don't agree, and apparently neither do any of the big companies that actually make these decisions.
With regards to operating systems, a lot of companies still use Windows 2000. The business world is a lot more pragmatic and doesn't see a need to move to something just because Microsoft came out with it. Outside of new machines coming with XP instead of 2000, was there really a compelling reason to move to it? Not for most people.
This last remark is directed at whoever it was that said that the Intel chipset took more power and was slower. Well, think a bit: it's a fully functional chipset with the memory controller on board, so it should use more power, whereas the memory controller is not in the chipset for the AMD processors. Also, Intel typically makes their chipsets on old lithography, not the newest stuff, although I don't know anything specific with regards to these chipsets.
quote: My big problem with what you said in the article is that you are blaming Intel, and to a lesser extent, for creating a great chipset for the vast majority of people that use it. The negative remarks about how they are making it difficult for those poor developers are completely misplaced; they make an excellent product that does what it should. In your response, you are now changing your position, I believe. It seems that Dell and HP are misrepresenting products they are selling, and that is the root problem. I am not privy to that, but if that is your contention, I have no problem with it. It could be correct, but pointing fingers at Intel and other IGP makers is misplaced. They make a product for a huge market and address it well, and if some maker misrepresents it, that's their issue and not the maker's.
Our issue continues to be the technical and marketing spins of the various companies that sell these solutions as if they will do everything a consumer expects out of a PC. I am not addressing the office crowd and was clear about that in the article. I am addressing the home user who buys a machine and has certain expectations of it based upon the sales literature or advertisements. I deal with the OEMs on an almost daily basis, and while they have certain cost targets to achieve, one of their primary issues is that they have to constantly upsell a unit if the consumer says the word "gaming". Generally, this upsell does not work and they lose the sale as the consumer thinks they are getting the bait and switch routine. One of the top listed support call questions or concerns is, "My computer will not run this game or the game does not run properly." Like it or not, people do use the home PC to play games or enjoy multimedia applications. The last time I looked at the numbers, the PC game market is still very viable and alive. I also realize that not everyone plays games, but that does not mean decent 3D performance is not required, especially with Vista being launched now. This changes everything for the consumer market: it does not matter if Windows 2000 or XP is still the best option for most people, as the consumer buying these machines will receive Vista.
I am not advocating that the IGP solutions run the latest games at 100FPS. I am advocating that the base level of performance should be better, at least to the point where 1280x1024 at 30FPS with decent settings is viable - not because I want it, but because a significant number of consumers would like an inexpensive PC solution that the kids can also use for playing games, or maybe they are buying a second system for the family and do not want to spend as much for a discrete video card as they do for the system. Settling for 640x480 graphics is not the answer; improving the performance of the solution is the answer. We constantly expect our machines (yes, even the office crowds) to perform faster, whether we admit it or not. I have yet to come across a coworker that was satisfied with their current hardware once they used a next generation system. So, I do not think it is wrong to expect an improvement in the advance of IGP solutions targeted to the home crowd as an all-in-one solution.
As for the marketing spins, this is one of my main issues with the G965, and why I have a real issue with it when I address the gaming issues. This is straight from their website -
"Gamers and Media enthusiasts have been long demanding better technology for unparallel ultra-realistic visual experience and the demand will be even greater in the future. Intel has answered the current demand to support future technologies by unveiling its hybrid graphics architecture with the introduction of Intel® Graphics Media Accelerator 3000 (Intel® GMA 3000) family engine. This state-of-the-art hybrid architecture evolved as a balance between fixed function and programmability. The graphics engine of the Intel® Graphics Media Accelerator 3000 family consists of a scalable array of symmetric processing components known as execution units (EUs) which can be programmed to dynamically process graphics or media data alike. Programmability of the EUs adds flexibility and enables support for future graphics and media usages by upgrading the driver. Execution units support dynamic load balancing, multi-threading, and multi-functional data processing, resulting in increased performance to enable a more compelling gaming and visual experience for main stream users.
Intel’s next generation architecture shows Intel’s continued innovation by delivering greater flexibility and performance to meet the needs for the current and future consumer and corporate applications.
The new Intel Graphics Media Accelerator architecture with its greater performance and flexibility will be the basis for many new consumer and corporate applications. For consumers, the Intel® Graphics Media Accelerator X3000 engine supported on Intel® G965 Express chipset has been optimized for enhanced 3D to allow for greater game compatibility and realism with new video and display features to deliver a theater-like entertainment experience through Intel® Clear Video Technology. For corporate users, Intel® Graphics Media Accelerator 3000 engine incorporated in Intel® Q965/Q963 Express chipsets continues to offer stability, ease of use, and lower power consumption, in a cost effective solution. The Intel® Graphics Media Accelerator 3000 family engine is eligible for the Microsoft Windows Vista*Premium logo.
Intel’s next generation Graphics Media Accelerator that provides new levels of graphics and video responsiveness in a cost effective solution. The Intel graphics engine is integrated into the Graphics Memory Controller Hub (GMCH) and enables a low power, reduced form factor solution compared to power-hungry discrete graphics cards.
With Intel’s next-generation Graphics Media Accelerator, Intel continues to drive innovative platform enhancements that increase the overall performance for the end user. This next-generation Intel graphics architecture is designed to be extensible and offers extraordinary features. This section discusses the benefits of the Intel® Graphics Media Accelerator 3000 architecture features and innovations: programmability, dynamic load balancing, multi-threading, multi-function, 32 bit full precision compute, and dynamic and static flow control which enable more stunning graphics in games and video applications."
Notice they lead off their marketing spin with the word "Gamer" and continue with a demanding better technology statement. I think that pretty much indicates the market they were addressing with this chipset when the X3000 core was being developed. Guess what, the words that Intel used to launch this chipset are the same ones I am advocating in my opinion.
Thanks for your comments, they are certainly welcome and we respect your opinion. :)
You mention gamer, and for the games I play this chipset works fine for almost all of them. Did you notice the word "mainstream"? It's not for enthusiast gamers who want the most modern games and the highest settings. But, still, I agree they exaggerate. I think the crux of our disagreement is, you seem to think that an IGP should run very demanding, very modern games at fairly high settings well. I think that's entirely unrealistic. I think that's what add-in cards are for. These are not for the enthusiast, and I don't think it's feasible to expect so much of them. Anything that could would be very power consuming, expensive, and big. They are already putting fans on some chipsets! This to me is entirely unacceptable; I like my PC very quiet. If you like performance and fans and a hot room, that's what cards are for.
I guess we mainly disagree on the target market for these items. You expect them to be serious gaming platforms, I don't. There are so many tiers of add-on cards, I don't see that being proper segmentation, and people don't want to pay for it. Put another way, if you make these more powerful than entry level add-in cards, what do people buy that want basic 3D performance without needing anything too powerful? They have to spend more money, and get a hotter, power hungry machine because there is nothing that works well for them? Whereas, the way it is now, if they want something better, they go with a 965 and get an add-in card. Not only that, I think these things fit a lot of people pretty well. There is almost nothing I run that requires more than a Radeon 9000, and I'm not nearly as unique as I might like to think I am. There's a big market for this level of performance.
Now, do computer companies exaggerate their claims? Well, as I said, we agree on that part. They always have, always will. It's capitalism, so I guess we have to accept it. Capitalism isn't perfect, but overall it works.
I agree Vista will necessarily raise the bar a bit for 3D performance, and in fact Intel did improve their IGP to address that. We only disagree on the amount, not that it was done at all. But, even with Vista Microsoft hedged their bets and did not make the requirements have any teeth. You don't need "Aero" after all, and in some version you don't even get it. Don't they deserve some blame too? How difficult would it have been for them to make it a requirement? Think about it, if they don't sell Vista, they are still going to sell XP, and the upgrade market has never been that big compared to the amount they sell with new PCs. So, why no egg on their face?
I disagree completely with 1280x1024 being necessary, or that parents want their kids playing brainless games all the time. Playing at 640 x 480 is fine, or 800 x 600, and just saying it isn't, without any backing, isn't very useful. Do you remember Pac-Man, or the Atari 2600? Or Galaga? Or Space Invaders? They were much more popular than any single game now, and they were a lot of fun (Defender was my favorite). They were incredibly crude by today's standards, but people still loved them. Some poor little kid that has to play at 800 x 600 is just not going to get my sympathy; if the game is a fun game, it just won't matter. If you think the level of detail of an explosion is what makes the game fun, then I think you've lost touch with mainstream human behavior and are a bit obsessive about this level of detail. It's not what matters, even if you have a bunch of obsessive people who think it is. If it were, games played at 256x192 or whatever, with 4 colors, that were two dimensional, that had big time lag when a lot was going on, would never have been popular. For those that are extreme though, there are cards.
I would agree with you if IGPs were making NO progress. But they are, and they will continue to. You know what would be interesting for you guys to do (at least to me)? A comparison between IGPs and older video cards, and do a trend where you compare them for the last several years to see how far IGPs have been behind cards, and see what the trend is with that. It would be imprecise, and in some cases difficult, but I think the results would be illuminating for just about all of us. I'm kind of curious if they are further behind now than they were say in the 810 days, or if they're trailing by the same amount of time. My guess, and it's just a guess, is it's about the same lag. Wouldn't you be interested in that too?
If you want the best example of misrepresenting the capabilities of a product, look at the 3DMark06 results and then look at the gaming results. The Intel GPU that you're so fond of happens to score the best in 3DMark (nearly twice as fast!), and yet it almost entirely fails to run two popular games/game engines. That's Intel's fault, and it's Intel's drivers that appear to be mostly to blame. How many places will post the 3DMark score as representing a product, because it's "easy to compare", and completely neglect to show that actual performance is very poor?
For all we know, the G965 has the power to actually outperform the other IGPs and it's only the drivers holding it back. Unlikely, but possible. And while you appear to be more than happy with the IGP offerings, we would like more, and we would like more quality - in drivers, performance, features, etc.
You complain about us putting out our opinion on what an IGP should be, but basically opinion is all you give in response. Plenty of people will wholeheartedly agree with our stance; others will agree with yours. That graphics are becoming more necessary in modern computers is basically a given - though certainly a lot of people and businesses still run Windows 98 and 2000, and they'll continue to run XP likely until after Vista's successor ships. Me, I'll take XP over 2000 any day - if for nothing else than the task manager that now has a networking tab. Yup, call me crazy, but I appreciate most of the new features that went into XP, and I'm sure I'll eventually appreciate Vista as well. (SuperFetch alone is going a LONG way towards winning me over!)
G965 is hardly better than 945G, which was barely better than 915G. X1250 likewise is only an incremental step up from X1100. Is there a compelling reason to upgrade operating systems? Sometimes. Is there a compelling reason to not upgrade? Sometimes. What about IGPs - what's the compelling reason to upgrade from 865G if all you need is 2D acceleration?
I guess we should all just forget DX8, 9, 10... probably ditch DX7 as well, forget 3D and move back into the glory days of pure 2D graphics acceleration. For all the people that would like that move, there are lots of others (many more, I'd wager) that would feel it is castrating innovation. Long term, we need new software and hardware and advances in both. How many people switched to OSX because it was an overall better OS for their needs? Only a few percent, but if Apple continued to evolve their OS and nothing happened with Windows, eventually Windows would disappear.
Q965 and 690V are more stripped down versions of the IGPs in question, and those are targeted at the business market that you seem to prefer. We're not looking at those; we're looking at the "consumer" line of IGPs, and we're finding them severely lacking. If there were no real difference in performance between the business and consumer offerings, then there would be no point in having different products. There is a difference, however, indicating that Intel and AMD are both aware of the need for better performance in the home market. Right now, that better performance is largely missing. What it comes down to is that we feel Intel, AMD, and others could do a lot better. For you, that might not be important, but right now a lot of IGPs are only good for a bare level of functionality. Fine for business, but not for a lot of home users; they might as well forget the home market and just focus on business IGPs if the offerings are going to continue to unimpress.
Finally, your comment about power is once again incorrect. Intel Core 2 Duo with GPU x uses less power than AMD Athlon X2 with GPU x, in the majority of situations. Intel doesn't have an IMC in that scenario either. Both X1250 and G965 are "fully functional chipsets". The AMD platform ends up using quite a bit less power. Is it because Intel uses old lithography? Probably, but again, that just reillustrates the point that Intel is not doing anything to move the IGP market forward. If we could run X2 processors with a G965 chipset, we would find as usual that Intel uses less power for the CPU. The net result is that the better platform for low power computing currently looks to favor AMD, at least in the IGP market.
You're changing the argument. I didn't say Intel didn't exaggerate claims, I'm saying the world doesn't need 3D graphics with superior performance for most people, like the supposition in the article said. I don't want it, and if you do, go buy it and stop insisting that everyone gets it too. You have options.
You may want more performance, etc..., and there are cards for you. Go buy them instead of complaining about an item that wasn't made for the enthusiast. It's like complaining a Chevrolet Cavalier isn't fast enough. So go buy a Corvette, and leave the gas mileage people the Cavalier. Stop insisting that we need it.
I don't complain about you giving an opinion, I complain about your opinion. You think that everyone should subsidize game players. I don't. I never said XP wasn't for anyone, just most people don't care one way or another. I prefer 2000, it's faster and I can run it on machines that are less powerful. I still think OS/2 is better than either, but that's another topic :P. Vista might be good eventually, but I don't feel compelled to buy it because it doesn't do anything I need. When it runs something I want to run, and Win 2K doesn't, I'll probably buy it. Until then, I saved a lot of money skipping XP.
"Hardly" is too ambiguous a term, and what you consider hardly could be considered considerable. What is obvious is they run fine for most of the people that need them, and they are not for enthusiasts that need more than they are targetted for. Intel is in the business of making money, and their graphics sell better than anyone else's. So, they're obviously doing something right, even though you can shoot aliens with them as well as you'd like. But then, that's what Nvidia and AMD/ATI are for. That's what the video slots are for.
Innovation for the sake of innovation is always a mistake, and if you don't believe that look at the P7 and you'll see a perfect example. And exaggerating something to make a point shows just how weak your point really is. Intel will continue to boost performance when it makes sense to, meaning it doesn't add much in terms of power requirements and cost, and adds something useful for the target of the product.
Your supposition on Apple is without support. Unless you think you know more than the analysts that cover Apple, who think they gained a lot of mindshare because of the iPod. Also, moving to Intel helps, since it is better than the POWER platform was for them.
With regards to home users, you sort of have a point, because they do sell cheaper versions. But you seem to think the world is filled with kids that play games, and simple ones that don't want to play games that require a lot of thinking. A lot of people, myself included, like playing strategy games, and some are fine playing games without all the bells and whistles running. I don't mind playing at 640 x 480 even for arcade games, because it doesn't change the game ideas, and that's what is engaging. The funny part is you guys seem to think the world is filled with people that play extreme games because that is a lot of who reads your web site, but it isn't. Most people are happy to surf the internet, email, play card games online, and do other things that don't stress 3D capabilities much, and would prefer to save the money and electricity, and noise, and heat, and not get something they don't need or want. If they need or want more, there is more out there. Put a different way, you say they should be more powerful because they have weaker siblings; has it occurred to you there are several lines of cards above them, and by contrast they don't have to be so powerful because of this? They are home machines, not enthusiast machines. That's what add-in cards are for.
You are misrepresenting my comments when you call them incorrect. You said that the highest performing IGP used the least power. I mentioned the chipset, not both pieces, and you would expect, everything else being equal, that the Intel chipset would use more, because it has a memory controller as part of the chipset.
You're dead wrong about Intel not moving the IGP market forward, and you even contradict yourself earlier in your remarks. It's moving it along slowly by your own admission. I'm not sure I agree on the speed, but we both agree that it is moving along. The fact is, Intel chipsets with graphics sell really well, so they are meeting the market demands. Maybe not yours, but, again, that's what add-in cards are for.
quote: I don't complain about you giving an opinion, I complain about your opinion.
I have to say that it appears to be the same thing. Gary has further addressed the situation below. You are certainly entitled to your opinion on things, but likewise we are entitled to ours. For non-gaming, non-3D work (and to a lesser extent, non-video tasks) just about any IGP released since 2000 will perform acceptably. If that's all you need for a PC, you're set. Clearly, Intel *doesn't* feel that's all that's necessary, as they have integrated a lot more into their IGP and their marketing touts those new features. However, while we have more, it is in many instances inadequate.
If someone is happy playing older games at 640x480, that will almost certainly lead to them wanting to play newer games. If they only want to play strategy games, that will inevitably lead to looking at newer strategy games (Civ IV for example) that require 3D graphics. I think we can make a pretty good case that for anyone that does any gaming beyond solitaire and web-based games, the move to 3D quickly becomes necessary unless you simply want to relive the glory days of 2000 and earlier.
There is a market for add-in cards, but that market exists in part because the IGP offerings are so anemic in terms of performance, and there's plenty of obfuscation there as well. Many OEM budget PCs don't even give the buyer the option of installing a truly capable GPU (at least GeForce 7600 or Radeon X1650) in the online configuration tools. The best IGP on the market currently rates about the performance of an X300/9600. Given that we have discrete GPUs ranging from entry level $50 cards up through uber-powerful $600 monsters, we'd like to see a few more options for IGP that get above the "not even entry-level performance". Both AMD and Intel have two IGP chipset levels, but they are simply "too slow" and "even slower - but it's for business so that's okay" (at least when it comes to 3D). If they're going to the trouble of making a faster model, why not actually make it useful? And why all the bogus marketing if it doesn't matter?
I too have seen far too many support questions wondering why a "recent" PC can't run a game some child got for Christmas or whatever. The companies selling and producing these products aren't out to educate the people about what's really required for certain computing tasks, so that's what we do. Most people apparently get that, and we're not the only ones complaining about the state of IGP. I've read about it in quite a few magazines, but I'm sure you disagree with their opinion as well.
For better or worse, IGP still goes into 40% of PCs sold. No wonder people get frustrated and purchase gaming consoles... not because a PC can't handle games with the correct configuration, but because the base models that so many buy are crippled at the starting gate. Don't even get me started about OEM systems that don't include GPU expansion slots (which are thankfully starting to become less common).
I understand your argument, and I respectfully disagree. Intel will sell millions of their IGP whether they suck rocks or they actually perform okay. They will sell millions of CPUs whether they are worse than the competition (P4 vs. Athlon 64) or better than the competition (Core 2 vs. Athlon X2). They are taking the easy way out on the IGP and millions of people unknowingly get "Powerful multimedia and gaming capabilities" that simply aren't. Intel is capable of doing better on IGPs - lower power and better performance, all in the same package. They don't do that for a variety of reasons, but what's good for a big corporation is rarely ideal for consumers. I for one care more about what will benefit consumers than what will make Intel, IBM, AMD, Dell, etc. happy. You don't work for one of them, by chance? You're awfully upset about a few minor paragraphs.
I think we disagree on the purpose of this chipset. I have never liked shared memory video, ever since the PC jr. I thought it was a bad idea, and I still do. They are a low end segment and people should not expect performance from them. My expectations for them are not that high, and I think they're perfect for their market segment. I don't think IGPs should be particularly powerful or play the most modern games well, that's what add-in cards are for. It probably comes down to market segmentation.
You're doing the same thing Gary is: you are blaming Intel, et al, for what OEMs are doing with their computers. Intel has improved the video capability, and no doubt will continue to, and it will inevitably lag behind video cards. But then, why shouldn't it? They make the product; they aren't like IBM used to be, selling the end product to consumers too, so I'd say the system makers have to fix that part and segment it well. Then again, who is really to blame? You can't blame makers; you have to put the blame squarely where it belongs, on the consumer. If the consumer demands better stuff, they will get it. I totally disagree with you on items sucking "rocks". For the first time I can remember, the technology actually has a huge impact on what sells. Look at the market share losses Intel suffered because of the miserable Prescott. Look at the low prices AMD is forced to sell their pathetic K8s for now that the P8 is out. I don't ever remember this happening before; if it had, would x86 have won out (68K was much better)? It didn't happen completely, but it DID happen to a degree that had a material impact on both companies, and certainly helped Intel create a more prosaic and effective product in the Core 2. This is a company that is known for bizarre microprocessors too, from the 432 to the 860 to the Itanium (really more HP, but still), so a normal microprocessor that just works well probably killed them to make :P.
We have a fundamental disagreement on the segment, I guess that's it. I think they should leave off where the add-in cards start. Cards having their own memory and processor should be better than a integrated chipset with shared memory and shared die space with other functions. Call me a purist, but this makes more sense than this being faster than a card made specifically for that. Also consider that the reason why these machines sell so well is because of the cost. Intel, et al, has to balance the cost with the features, including other aspects like buffers, sound, etc... and make it all in the best package for their intended audience. Since you can't upgrade the chipset on the motherboard, and you can upgrade the sound and video, I think they do a pretty good job.
You probably have never played Civ 4, or other strategy games like it, if you make that remark. They do not even need anything remotely powerful to play well. I run it fine on a Radeon 9000, although processor speed is extremely important for it. Civ III was a beast too in terms of CPU performance.
I'm sure Intel has their reasons for making the IGP the way they have, and you certainly do not know better than they what they are capable of. Could they make it lower power and more powerful? Well, it's possible, but then it would cost more. Would it target the market better? Probably not, otherwise they'd probably do it. But, of course, they are working on the next chipset that will, so it's not all bad.
The root problem with stuff like this is the consumer, and the consumer has chosen these chipsets, either actively or passively. If they rejected it like they rejected, for example, RD-RAM, or to a lesser extent the Prescott, then Intel wouldn't keep making them and would come out with a product that fit the market better. But these products do fit the market. You talk about how many people that complain about this or that, but do you ever think about how many people that buy these units are not complaining and are perfectly happy and saved some money because it was all they needed? Anandtech readers are a small, very inaccurate subset of the computing world at large, and you can't base a whole lot off of what readers here say. Computers 20 years ago were mainly for hobbyists, but now the general public uses them and the reality is, there is not an industry in the world where there isn't some mismatching. You think no one complains about cars? Dishwashers? Speakers? Call your local Sears to find out if they ever get complaints :P. It's the nature of the beast, and it won't go away if Intel makes ultra-powerful IGPs (which again, I find oxymoronic). There's always a tradeoff.
I don't agree to _how_ this criticism was stated, but I _do_ agree to its content.
I am a PC worker, too.
I am not getting paid for playing directX 10 games.
I have a PC for using Word, Excel, Access, maybe a browser and a PDF reader. That is basically it. Now you tell me: do I need a Core 2 Duo 6800 and a GeForce 8800GTX for that?
Nope. I need no graphics power at all (1280x1024x32 2D desktop - that's all) and my no. 1 priority for a processor is one that is _silent_ and nothing else.
I think most of the people missed the comments or observations in the article. The article was geared to proving or disproving the capabilities of the 690g and in a way the competing platforms. It was obvious to me the office crowd was not being addressed in this article and it was the home audience that the tests were geared towards. I think the separation between the two was correct.
The first computer I bought from Gateway was an IGP unit that claimed it would run everything and anything. It did not and pissed me off. After doing some homework I realized where I went wrong and would never again buy an IGP box unless the video and memory is upgraded, even if it is not for gaming. I have several friends who bought computers for their kids when World of WarCraft came out and bitched non-stop at work because their new Dell or HP would not run the game. At least the author had the balls to state what many of us think. The article was fair and thorough in my opinion although I was hoping to see some 1080P screen shots. Hint Hint
My point (besides correcting a mistake) is that I think this test is gravely imbalanced... you are testing - as you have said yourself - an office chipset, so why do it with an overpowered CPU?
Office PCs in small businesses are bought on price, and where is the difference, when running a mail program, between a $1000 Core 2 Duo and the smallest, cheapest AMD offering for less than $100?
quote: My point (besides correcting a mistake) is that I think this test is gravely imbalanced... you are testing - as you have said yourself - an office chipset, so why do it with an overpowered CPU?
We were not testing an office chipset. We are testing chipsets marketed as an all-in-one solution to the home, home/office, multimedia, HTPC, and casual gaming crowds. The office chipsets are the Q965/963 and 690V solutions. The G965 and 690G are not targeted at office workers and were not tested as such. Our goal was to test these boards in the environment and with the applications they are marketed to run.
We mentioned this above, but basically we were looking to keep platform costs equal. Sure, the X2 3800+ is half as expensive and about 30% slower than the 5200+, but since the Intel side was going to get an E6300 (that's what we had available), the use of a low-end AMD X2 would have skewed results the other direction. We could have used an X2 4800+ to keep costs closer, but that's an odd CPU choice as well, since we would recommend spending the extra $15 to get the 5200+.
The intent was not to do a strict CPU-to-CPU comparison, as we've done that plenty (as recently as the http://www.anandtech.com/cpuchipsets/showdoc.aspx?...">X2 6000+ launch). We wanted to look at platforms and keep them relatively equal in the cost department. All you have to do is look at the power numbers to see that the 5200+ with 690G compares quite well (and quiet well) to the E6300 with G965.
The major selling point of this chipset is basically that it supports HDMI output. That's nice, and for HTPC users it could be a good choice. Outside of that specific market, though, there's not a whole lot to put this IGP chipset above other offerings. That was what we were hoping to convey with the article. It's not bad, but neither is it the greatest thing since sliced bread.
If you care at all about GPU performance, all of the modern IGP solutions are too slow. If you don't care, then they're all fast enough to do whatever most people need. For typical business applications, the vast majority of companies are still running Pentium 4, simply because it is more than sufficient. New PCs are now coming with Core 2 Duo, but I know at least a few major corporations that have hundreds of thousands of P4 and P3 systems in use, and I'm sure there are plenty more. Needless to say, those corporations probably won't be touching Vista for at least three or four years - one of them only switched to XP as recently as two years back.
Perhaps it's because the companies releasing these products make so much noise about how much better their new IGP is compared to the older offerings from their competitors? If AMD had released this and said, "This is just a minor update to our previous IGP to improve features and video quality; it is not dramatically faster and is not intended for games" then we would cut them some slack. When all of the companies involved are going on about how much faster percentage-wise they are than the competition (never mind that it's 5 FPS vs. 4 FPS), we're inclined to point out how ludicrous this is. When Intel hypes the DX9 capability of their G965 and yet still can't run most DX9 applications, maybe someone ought to call them on the carpet?
Obviously, these low-performance IGPs have a place in the business world, but Vista is now placing more of a demand on the GPU than ever before, and bare minimum functionality might no longer be adequate for a lot of people. As for power, isn't it interesting that the HIGHEST PERFORMANCE IGP ends up using the least amount of power? Never mind the fact that Core 2 Duo already has a power advantage over the X2 5200+!
So, while you might like to call us names and dismiss us as inane 15-year-olds, there was certainly thought put into what we said. Just because something works okay doesn't mean it's great, and we are going to point out the flaws in a product regardless of marketing hype. Given how much effort Intel puts into their CPUs, a little more out of their IGP and drivers is not too much to ask for.
Maybe they didn't intend their products to be tested in the way you did. As someone pointed out, playing at 800 x 600 isn't that bad, and doesn't ruin the experience unless you have an obsession. Incredibly crude games were incredibly fun, so the resolution isn't going to make or break a game, it's the ideas behind it that will.
You can't be serious about what you want AMD to say. You know they can't; they are in competition, and stuff like that would be extremely detrimental to them. Percentages are important, because people may not be running the same games as you are, at the same settings. You would prefer they use absolutes, as if those would give more information? Did AMD actually tell anyone these were excellent for all types of games? I never saw that.
With regard to CPUs and GPUs, you are trying to obfuscate the point. Everyone uses a CPU, some more than others. But they do sell lower-power ones, and even single-core ones. Not everyone uses 3D functionality. In case you don't get it: I DON'T want it on certain machines of mine. I don't run stuff like that on them, and I don't want the higher power use or heat dissipation problems that come with it. What you call effort isn't effort at all; it's a tradeoff. Don't confuse it with getting something for nothing if Intel puts more into it. You pay for it, and that's the problem. People who use it should pay for it; people who don't, shouldn't have to just so the kiddies can play their shoot 'em ups.
Just so you know, I'm both. I have mostly work machines, but two play machines. I like playing some games that require a good 3D card, but I just don't like the mentality that the whole world should subsidize a bunch of gameplayers when they don't need it. That's what add-in cards are for. I would be equally against it if no one made 3D cards because most people didn't need them. I like choices, and I don't want to pay for excessive 3D functionality on something that will never use it, to help gameplayers out. Both existing is great, and IGPs will creep up as they always have, when it becomes inexpensive (both in power and initial cost) to add capability, so the tradeoff is minor.
Does this chipset support 5.1 LPCM over HDMI or not??? Or, more plainly, can someone send 5.1 audio (games, HD movies, etc.) digitally to a receiver with the 690G? According to your previous article on the 690G, 5.1 at 48kHz was supported over the HDMI port. Now it's back to two-channel plus AC3 bitstream. Which is it?
It is two channel plus AC3 over HDMI. That is the final spec on production level boards and drivers. We will have a full audio review up in a week or so that also utilizes the on-board codec.
Why is this happening? Why on earth can't they produce a PC HDMI audio solution that outputs up to 7.1 LPCM (96kHz/24-bit) for ALL sources!?! They already do that for two-channel sources!!!! Do you have any info from the hardware vendors regarding the reason(s) they will not produce such a straightforward and simple solution?!?
PS: There are lots of people demanding a TRUE PC HDMI audio solution, not these S/PDIF hacks...
I'm also interested to know more specifics about the audio side of this chipset. The support of HDMI v1.3 suggests that with an appropriate driver and supporting playback software Dolby TrueHD and DTS-HD bitstreams should be able to be sent via HDMI to a v1.3 receiver with the necessary decoders. Is this a possibility?
Great review, thanks... I know I asked this a couple of times already, but is there a mATX roundup planned here at AT? I'd like to see the Asus M2NPV-VM and Abit NF-M2 NView compared with their 690G counterparts, as this segment makes for most of the computer sales in most places? BTW, weren't you plagued by memory compatibility issues with the M2NPV-VM or any of the boards tested? This Asus board proved extremely picky in my experience...
The roundup is scheduled for the 19th; we're trying to pull it in. What BIOS and memory are you using on the M2NPV-VM? So far I have not run into any real issues except with 2GB modules. The Abit board is one of my favorites so far. ;)
There shouldn't be a question mark at the end of the "most sales" phrase... There are also a couple typos, sorry about that. Where's the edit button anyway? ;)
I don't really understand the point of comparing chipsets/motherboards between processor families. Subsystem performance figures can show glaring deficiencies, but otherwise it really boils down to a CPU comparison. The "media/audio encoding" and "media performance" sections are certainly CPU-centric. And pitting a $230 X2 5200+ against a $185 E6300 winds up handicapping the Intel contestant. Shouldn't the $222 E6400 have been used instead?
As stated in the article, AMD is marketing AM2 and the 690G/V as a platform designed to compete against the G/Q965 and Core 2 Duo solution. The 690G is targeted at the multimedia, HTPC, home/office, and casual gaming crowd and was tested as such. We looked at the total price of a base Core 2 Duo and a decent G965 board and then matched the processor choice that would come closest to the price and performance of the Intel offering while meeting the platform cost. Our tests were chosen based upon the target audience for each platform in the home environment. This was not a review of office-level machines, as the Q965/963 and 690V are targeted at the business user.
The conclusion mentions that the G965 + E6300 costs around $300 compared to $315 for the 690G + 5200+ (or 6150 + 5200+), so it's more or less a fair "equivalent price" platform comparison. The E6400 ends up being faster than the E6300, but it would still be slower in a few tests (as the text mentions) and even faster in the tests where the E6300 already holds the lead. Nothing new there - we've pretty much beat the "Core 2 Duo is faster" drum to death. We feel anyone looking at the 690G is going to be interested in the platform as a whole much more than in whether or not it is faster than equivalently priced Core 2 offerings.
There may be too many variables, but perhaps you could come up with a way to normalize the benchmarks. For instance, run the gaming tests first with an ultra high-end graphics card to try to isolate the performance delta for each platform/CPU combo you will test with. Then run the game benchmarks with the IGP solutions and adjust the scores based on the previous tests. Just a thought off the top of my head.
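A rough sketch of that normalization idea (purely illustrative; the platform names and frame rates below are hypothetical, and in practice CPU and GPU limits don't combine this cleanly):

```python
# Hypothetical illustration of the normalization suggested above.
# Step 1: benchmark each platform with the same high-end discrete card,
#         so the difference is (mostly) the CPU/platform contribution.
# Step 2: divide that contribution back out of the IGP results, leaving
#         a number that is more attributable to the integrated graphics.

discrete_fps = {"690G + 5200+": 95.0, "G965 + E6300": 100.0}   # hypothetical baseline runs
igp_fps      = {"690G + 5200+": 26.4, "G965 + E6300": 24.9}    # hypothetical IGP runs

baseline = max(discrete_fps.values())
for platform, fps in igp_fps.items():
    platform_factor = discrete_fps[platform] / baseline   # CPU/platform handicap
    normalized = fps / platform_factor                    # handicap removed
    print(f"{platform}: raw {fps:.1f} fps -> normalized {normalized:.1f} fps")
```

Whether such a correction is meaningful is debatable, since a game can shift between CPU- and GPU-limited scenes, but it at least separates the two contributions roughly.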
Ah, but you're evaluating a chipset here, not a platform or a system solution. Having said that, I agree that it IS difficult to compare chipsets that are targeted at different CPUs. In such a case, a better way to evaluate might be to take an AMD and an Intel CPU that are similar in performance (not in price), and use them to compare their corresponding chipsets. That would highlight the differences between the chipsets. You could always mention the prices alongside, or do a separate price/performance comparison.
My point is that a price/performance comparison should complement a pure performance comparison, not the other way around.
3 CUBED - Friday, March 9, 2007 - link
I have to mention power also. Considering that a HTPC is probably going to be on quite a bit, I would like to see some info on the power draw from these mobos. The same goes for the roundup!! Also, considering that energy prices are headed only UP, even a little lower performance might be worth it in the long run!!
Thanks Kasper.
MrNeutrino - Thursday, March 8, 2007 - link
Guys,
First, the feedback:
Frankly I'm quite frustrated from waiting for a site like AT for the past half a year or more to come out with more mATX reviews (until this review, which is a start).
I realize there are a lot of gamers OCers out there - very many AT readers. However, there are many (just as many?) non-gamer enthusiasts hoping to run stock-speed,SILENT, SFF systems out there - myself for one.
While lesser known sites have reviewed many of these products, I (and others like me I know of) have been waiting for AT to publish SOMETHING in the mATX / C2D (current and long-standing performance champ depending on the system config.) category for months on end! I realize you have dedicated folks working for each review category. However, AT - a site as a whole - still seems to have enough bandwidth to publish back-to-back LCD and heatsink reviews in a matter of day or two each. Yet you seem to have held off on prioritizing mATX system reviews for some inexplicable milestone until yesterday. I recall reading a vague comment in one of your reviews around the end of the year regarding an 'upcoming' mATX review, if I remember correctly. In my opinion it was already too delayed a review. Little did I know I'd be waiting another two months for such a review.
Geez! Publish the review in parts if you must, but don't make your readers hold off for this long and think all is well! What's the point of releasing this type of review, months after products became widely available and just a few months before the next round of technology updates?!
-----------------------
Second is a set of requests for the (personally) much anticipated upcoming mATX review next week 'as well as' for future reviews:
Requests for the upcoming mATX review:
* Please try to include Asus P5B-VM. One of the currently best featured G965 MBs.
* Please include at least one C2D ATX MB for comparison! My vote is for the Asus P5B-E. I can't stress this enough! I have yet to receive any 'quantitative' (read: benchmark-backed) response in the forums (http://forums.anandtech.com/messageview.aspx?catid...">here and http://forums.anandtech.com/messageview.aspx?catid...">here) on the following topics (quoting from my earlier posts):
- how do mATX boards compare to ATX boards "for non-gaming tasks such as video / audio editing, general productivity, multi-tasking etc.?"
- "How much of a performance hit does a G965 type mATX motherboard with integrated graphics incur as a result of sharing memory bus bandwidth with the CPU, for NON-GAMING benchmarks, compared to regular C2D ATX boards?" (Assuming of course, the user chooses to use integrated graphics vs. discrete solution and has that enabled in BIOS.)
Please BE SURE to address these and other such real-world topics and help make the review more meaningful for folks like me.
Requests for future reviews:
* Consider investing more time and effort in SFF / mATX / silent PC config based reviews! Yes, there is an audience out there...
* For a site this major and popular - both with readers and vendors - you need to seriously evaluate your time-to-publishing lags for some of these reviews - C2D mATX roundup review for one. I realize there are a million things you can review and only 24 hours in a day. Delayed reviews (compared to when the products came out) don't help your readers as much - think luxury car depreciation over time... :)
Thanks.
Gary Key - Thursday, March 8, 2007 - link
Hi,
Your suggestions and comments are appreciated. I did reply in the forums this morning.
:)
MrNeutrino - Friday, March 9, 2007 - link
Thanks Gary.
I appreciate you reviewing the feedback and requests in detail. Hopefully we'll see some follow-up action based on this as and where appropriate.
Also, thanks for replying to some of the key questions I've had around mATX vs. ATX boards. Lack of major performance delta is very good to hear about, at least for pre-Vista Windows OSes. Interesting.
Based on your comments in the forum posting re: Vista + IGP + memory latency, I am intrigued. If you are going to cover this in the upcoming review, feel free to say so and defer this question. Otherwise, I am curious what performance difference we are talking about between XP and Vista using IGP solutions. Any pointers for this comparison would help in deciding whether or not a Vista purchase is worthwhile from a performance standpoint in such categories / applications.
Also, have you transitioned to exclusively testing using Vista?
blawck - Wednesday, March 7, 2007 - link
I'm buying a small-form-factor PC with an Intel 965G motherboard (it was the only option), and I'm plugging in an NVIDIA 8800GTX video card. So, I was wondering how these IGP motherboards (specifically this Intel one) perform in general with a vidcard plugged in. Is performance on par with (or at least somewhere close to) that of full-size motherboards? Or am I getting screwed?
Gary Key - Wednesday, March 7, 2007 - link
It depends on the board that you buy. The Gigabyte GA-965G-DS3 allows for a decent level of overclocking (330FSB) and memory options (CAS 3 operation), so the overall performance difference will not be noticeable in day-to-day activities when compared to a more enthusiast-level board. A base G965 board will not offer the same overclocking options, and a couple of the boards only allow CAS 5 operation at DDR2-800, but once again the performance delta overall will be less than 5% in most cases - nothing to be concerned about, especially given your choice of video card.
blawck - Wednesday, March 7, 2007 - link
Great, thanks for the quick reply! This is a high-quality site =). The manufacturer is Maingear, and the board they're using for my system is simply identified generically as "Intel 965G Express," but based on your response, I have faith that I am, indeed, not getting screwed =). I've built all my previous machines, but I'm getting old and fat and lazy, so I figured I'd spend a few hundred extra and have someone do it for me. Not too worried about SLI or overclocking at this point... I'll accept whatever resolution I can run Oblivion in, as long as I can run it. Thanks again.
chucky2 - Wednesday, March 7, 2007 - link
For the mATX review, you should include results for the Abit Fatal1ty F-I90HD. It's basically the 690G version for Intel CPUs... and that would allow a direct comparison between Conroe and AM2 CPUs, as the chipset would be the same.
Just a thought...
Chuck
P.S. Plus, I'm sure there's a good number of people who'd like to run Conroe on a cheap but good mATX board, and the Abit Fatal1ty F-I90HD looks to be about the best option out right now for that (albeit in limited quantities so far...). It's just too bad it doesn't have onboard Firewire (at least I don't think it does; it wasn't listed on the spec page), because then it'd have just about everything one could want...
Gary Key - Wednesday, March 7, 2007 - link
We should receive that board next week. I will do my best to include it in the roundup.
chucky2 - Wednesday, March 7, 2007 - link
Awesome if you can, Gary; cool if you can't...
...March looks like the month of motherboard reviews... :)
Chuck
chucky2 - Wednesday, March 7, 2007 - link
BlingBlingArsch of the AnandTech forums linked to some pictures of the board, and there's one of the back panel I/O: http://img256.imageshack.us/img256/5498/board234cx...
Looks like there's definitely no Firewire... :( :( :(
What are these manufacturers thinking (or rather not thinking), leaving Firewire off these boards? These would be totally complete solutions - especially this Abit, with the optical out it has - if only they'd put Firewire on them...
...and the expansion is so limited that putting in an add-in Firewire card basically kills a slot needed for TV tuner, capture, etc. additions.
Man...talk about something that's almost perfect that gets ruined by either a poor design decision or a poor bean counter decision... :(
Chuck
Myrandex - Wednesday, March 7, 2007 - link
"The 6150 performs okay considering the age of its core and we will see the new 6150SE and older 6100 chipset performing a few percent better overall but not enough to catch the 690G."How would the 6100 be a few percent better when it is clocked lower?
Renoir - Tuesday, March 6, 2007 - link
The review over at http://www.bit-tech.net/hardware/2007/03/02/amd_69...">Bit-tech.net says the 690G supports dual-link DVI and confirmed as much by sending 2560x1600 over DVI to the Dell 30-incher. This review, however, says "Larger 30" flat panel monitors won't be able to run at native resolution," and the technology overview article says "The digital outputs use TMDS transmitters that run at 165MHz". What's the deal?
Gary Key - Wednesday, March 7, 2007 - link
The 690G supports dual-link DVI. We had stated this on page two but not in a separate section; I will reword the 2D paragraph to make this clear. As for the resolution, I am using a Samsung 30" panel and the current Vista drivers limit me to 2048x1536. I have sent a board to Jarred, who has the Dell 30", to test on it. AMD still confirms that 2048x1536 is the "current" maximum resolution, although we know the hardware has 2560x1600 capability according to one of our sources.
Renoir - Wednesday, March 7, 2007 - link
Hmmm, something's not quite right, it seems. Can't see why they were able to send 2560x1600 if you couldn't. Would definitely appreciate Jarred checking it on the Dell, although I'd be surprised if it was a monitor issue. Who knows without trying. I have asked Bit-tech what OS they were using to get it to work. An XP vs. Vista issue perhaps? The related paragraph in the technology overview article mentions the TMDS transmitters run at 165MHz, which I understand is single-link? I have seen the 165MHz figure listed elsewhere for the 690G, so I am curious where this info comes from if the chipset is dual-link. Unless I've misunderstood something about "165MHz"?
Gary Key - Wednesday, March 7, 2007 - link
The DVI spec transmits data using the transition minimized differential signaling (TMDS) protocol. The DVI spec calls for each DVI output to have at least one TMDS "link" consisting of three data channels (RGB) and one control channel. The maximum speed at which a single 10-bit TMDS link may operate is 165MHz, offering 1.65Gbps of bandwidth per data channel. In real-world terms, this means a single TMDS link can drive a display at up to 1920x1200 (the actual maximum resolution can vary depending on the panel; the spec baseline is 1920x1080). For most displays that's not a problem, but the 30" displays have a native resolution of 2560x1600, which exceeds the bandwidth a single TMDS link can deliver. So what do you do? Remember that the DVI spec calls for at least one TMDS link, but each DVI port can support up to two TMDS links (the 690G has dual TMDS links), thus doubling the maximum bandwidth and enabling support for a 30" display (if driver support is present) or even some of the new 27" units that can run at 2048x1560.
Renoir - Thursday, March 8, 2007 - link
Thanks for the reply, Gary. That was precisely my understanding of the situation, which is why I found the following quote from the technology overview article confusing: "The digital outputs each use TMDS transmitters that run at 165MHz." This sentence didn't come across as saying the digital outputs had 2 TMDS "links" but rather just 1 running at 165MHz (hence single-link). Perhaps you could reword it to explain that each link runs at 165MHz but that there are actually 2 links, in order to support the higher resolutions afforded by dual-link DVI. I don't mean to be picky, I just think this part could be a little clearer :-)
As for the resolution cap at 2048x1536 you guys are experiencing, the Bit-Tech guys have confirmed they got 2560x1600 working on XP and suggest your problem is an issue with the current Vista drivers.
Gary Key - Thursday, March 8, 2007 - link
I have a new Vista driver as of today. Here are the specs -
DVI - Supports dual link up to 2560x1600.
HDMI - maximum resolution supported is 1920x1080 (using an HDMI-DVI cable you can go up to 1920x1200)
VGA - maximum resolution support depends on monitor refresh rates and aspect ratios:
2048x1536 @ 85 Hz in 4:3 format
2560x1440 @ 75 Hz in 16:9 format
2728x1536 @ 60 Hz in 16:9 format
2456x1536 @ 60 Hz in 16:10 format
Hope that helps.
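As a quick sanity check of the single- versus dual-link DVI limits discussed a few comments up (a minimal sketch; the ~10% blanking overhead is an assumed figure, and real monitor timings vary):

```python
# Back-of-the-envelope check of single- vs. dual-link DVI limits.
# Assumes roughly 10% blanking overhead on top of the active pixels
# (real CVT/GTF timings differ); the DVI pixel-clock ceilings are per spec.

SINGLE_LINK_MHZ = 165.0              # max pixel clock for one TMDS link
DUAL_LINK_MHZ   = 2 * SINGLE_LINK_MHZ

def pixel_clock_mhz(width, height, refresh_hz=60, blanking=0.10):
    """Approximate pixel clock (MHz) needed for a given display mode."""
    return width * height * refresh_hz * (1 + blanking) / 1e6

for w, h in [(1920, 1200), (2560, 1600)]:
    clk = pixel_clock_mhz(w, h)
    if clk <= SINGLE_LINK_MHZ:
        link = "fits single-link"
    elif clk <= DUAL_LINK_MHZ:
        link = "needs dual-link"
    else:
        link = "exceeds dual-link DVI"
    print(f"{w}x{h}@60Hz -> ~{clk:.0f} MHz ({link})")
```

With those assumptions, 1920x1200@60Hz lands around 150MHz (within a single link), while 2560x1600@60Hz needs roughly 270MHz, which is why the 30" panels require the second link.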
Renoir - Thursday, March 8, 2007 - link
That's cleared that up then (it was merely a driver issue). Anyhow, 2 questions:
1) Both digital outputs support HDCP but are on separate display controllers. Does that mean they have 2 built-in cryptoroms (1 for each controller), given that separate cryptoroms are required for each controller/output? If they do have 2, then why only allow HDCP on one output at a time?
2) On a related point (the upcoming mobile version of the chipset): what connection do laptops use internally for their screens? The reason I ask is that I'm interested in getting a laptop in the future that supports HDCP both on the laptop screen and over an external digital connection to a larger display.
jonman03 - Tuesday, March 6, 2007 - link
I know it's onboard video and stuff, but a 3DMark06 score of 313? They should be able to do better than that; see who can get it into the 1000s first. Although unlikely, it'd be a nice alternative to buying a video card for a basic computing system.
http://www.plugcomputers.com">Custom Gaming Computers
JarredWalton - Tuesday, March 6, 2007 - link
What's worse is that the G965 scores almost twice as high in 3DMark06... and then falls flat on its face in actual gaming tests. (Well, most of them anyway.)
IntelUser2000 - Wednesday, March 14, 2007 - link
I think it might be a combination with Vista too. Half-Life 2 can score ~20 fps with the G965 at the same settings AT tested at when using Windows XP. I would also like to see how it performs in XP. It seems the G965 suffers more from Vista than other IGPs do.
chucky2 - Tuesday, March 6, 2007 - link
I think it would be good insurance, given how amazingly late the 690G is, to please confirm with AMD that 690G motherboards will definitely support the AM2+ CPUs this late summer/fall.
And before people remind me that this is already fact: we have not, to my knowledge, seen AMD themselves confirm this... which, for something so seemingly simple to confirm, is getting disturbingly telling.
When AnandTech updates their article and says that they've gone back and confirmed with AMD that all 690G boards being released will support AM2+, or AMD themselves says it, then we'll know for sure. Until then, it's rumor...
Chuck
JarredWalton - Tuesday, March 6, 2007 - link
AMD has not officially stated whether the Agena/Kuma will be drop-in compatible with current AM2 chipsets (or even the AM2 socket). We'd certainly love to know, but we're still waiting along with everyone else.
chucky2 - Tuesday, March 6, 2007 - link
Do they have any idea, and have you specifically put the question to them? I'm sure you have contacts at AMD...
...because from what I can tell, if a discrete graphics card is used, this chipset is looking like when Dr. Evil says "one million dollars!" ...and then everyone is like, "Uh, big deal..."
...this thing should have released in Sept. of last year and then become the de facto AMD chipset, not be released now - as you point out - with MCP68 right around the corner and the G35 coming also.
This chipset really looks to me like a could-have-been. Good work ATI (and then AMD)!!!
Chuck
Gary Key - Tuesday, March 6, 2007 - link
We asked the question, Chuck; we have not received an official answer yet. While backwards compatibility has always been discussed as possible by AMD, we are still not convinced with any of the current motherboards. One only has to look at the Conroe launch last year and realize that while the chipsets were compatible, the motherboards were not without an update. We just recently saw this again with Kentsfield. We wish this chipset had been released last fall also. ;)
chucky2 - Tuesday, March 6, 2007 - link
Thanks guys, that's about all I can ask for.
I guess now I've got to really sit down and decide what's the best course of action for my godson's and my cousin's builds. They're both going to be budget builds, but I don't want to build them an AM2 system and basically have it be end-of-lifed in 3-4 months.
You'd think if AMD wanted to stop the hemorrhaging they're seeing on the enthusiast side, they'd make a statement about AM2+ compatibility now, rather than wait and just keep losing more and more. Not that a lot won't go over to the Intel side anyway, but still: tell me 690G and, say, MCP68 will be the only AM2 chipsets that can take AM2+ CPUs, and now at least I've got a comfortable long-term upgrade path.
Leave me in doubt, and I might as well get a 1333FSB Intel board and go to the dark side...
Looking forward to that mATX review...you think it'll be out this time next week, or towards Friday?
Chuck
JarredWalton - Tuesday, March 6, 2007 - link
Honestly, the fact that AMD hasn't come out and said the next-gen CPUs will work in AM2 speaks volumes in my book. Perhaps they are just trying to keep things quiet so that a bunch of people won't complain that their particular board won't run the new processors (some 939 boards wouldn't work with X2, after all, and there were some complaints saying AMD "guaranteed backwards compatibility"). Hopefully that's all it is, but I am seriously concerned that Kuma and Agena will not work in the vast majority of AM2 boards - and that's assuming they'll work in any at all.
If AMD doesn't support older boards with the new processors, they are going to need some really impressive performance to keep people from raising Cain. As it stands, if a reasonably fast X2 5200+ or so isn't good enough for your long-term needs, I certainly wouldn't purchase a new AM2 system with the hope of an upgrade until the truth comes out.
Final thought: the Quad FX platform has clearly been stated as being forwards compatible with native quad-core Barcelona chips. If AMD is willing to make that commitment, why not make a similar commitment with AM2 and Agena/Kuma?
dmce - Tuesday, March 6, 2007 - link
Are we going to see some details regarding HD (1080P) playback and whether it can handle it comfortably or not? I appreciate you made a small comment about it, but this was lifted from the original look at the 690G chipset a few days ago, so there's no real update in this review. I'm just puzzled no sites are taking a closer look at this, considering it's surely one of, if not the whole point of HDMI being there.
I'm not interested at all in using this for games; I want a 1080p-capable machine.
PokerGuy - Wednesday, March 7, 2007 - link
I'm considering a board based on the 690G for my new HTPC, but now I see it won't be able to output 1080P? Yipes... even the lowly 6150 can output at 1080P, correct?
Games are not important with regard to this board, but if it can't output at 1080p, what use would it be in a HTPC??
Also, any ETA on when the mATX roundup will be released?
Gary Key - Wednesday, March 7, 2007 - link
The board will not do 1080P over the HDMI port at this time. 720P is working fine. The mATX roundup will be up on the 19th, provided my heart is still working by that time; I have to say that testing under Vista is not a pleasant experience. ;)
dmce - Friday, March 9, 2007 - link
Gary, thanks for the info. Is the inability to do 1080p related to Vista, or will it just not do it at this stage? Is it likely BIOS/driver updates will improve this?
Man, to me it sounds like - other than having video and audio in one cable - HDMI is not the way to go.
Better to have DVI w/ HDCP, it sounds like... plus, the connector is more beefcake; no falling out accidentally with DVI.
Chuck
Gary Key - Tuesday, March 6, 2007 - link
The platform will not do 1080P playback at this time in a consistent manner. As stated, we normally would end up with a slide show or a blank screen. AMD has told us 1080P will be possible with a driver update, proper playback support (PowerDVD or WinDVD), and a processor along the lines of a 5200+. We received a new driver update to address video quality issues we found late in testing, but 1080P was not addressed yet. I am just as anxious as everyone else to see if it will do 1080P. ;)
savantu - Tuesday, March 6, 2007 - link
This has to be one of the worst reviews ever done at AnandTech; it almost makes you think somebody was paid to do it this badly.
quote: we fully believe the majority of the performance difference lies in the chipset selection.
Is this a joke or what? The 2.6GHz 5200+ against a 1.86GHz Core 2 in media encoding, and you think it is the chipset?! Every other test you made put the E6300 in between the 3800+ and 4200+.
goinginstyle - Tuesday, March 6, 2007 - link
Have you ever run a Conroe on a VIA or 945P chipset? If you have, then you know what was meant by his statement.
JarredWalton - Tuesday, March 6, 2007 - link
Try reading it in context:
Nero Recode 2 performance:
AnyDVD Rip = 3-way tie more or less
Shrink = 6150 leads, G965 second, 690G last (despite 6150 and 690G using the same CPU)
Shrink/Burn = G965 first, 6150 and 690G virtually tied.
Full quote, instead of your selected text: "Of course, we are using a mid-range AM2 processor against the budget C2D part (the AMD price cuts have helped matters there, as the price difference is currently only about $35) but we fully believe the majority of the performance difference lies in the chipset selection. It is only in the shrink and burn tests that we see the Intel platform flexing its muscles...."
In other words, the difference we saw in the Shrink test indicates that the 6150 chipset is better for this task than 690G. We definitely know that the Core 2 Duo is faster at equivalent CPU prices than X2 chips, but we're looking at platforms and chipsets and not just CPUs.
UserNO - Tuesday, March 6, 2007 - link
Gawd, people, if you're going to tell us that we'll "have to play at 800x600" if we use these integrated graphics, why not test the games at 800x600 and report framerates? Find the highest (lowest) settings necessary to get a playable experience and tell us that. No one's going to run a game at 1024x768, get 15fps, and then give up; they're going to crank the settings and res down until they can play the game.
It's not enough to just say "integrated graphics are unsuitable for even casual gamers, buy a discrete card" and then not quantify the difference.
Gary Key - Tuesday, March 6, 2007 - link
We tested the games at 800x600 and the results will be in our mATX roundup along with dedicated video card scores.
In the meantime -
800x600 - HQ settings (FPS)
       690G    6150    G965
BF2    20.68   17.4    DNF
HL2    35.7    28.8    5.3
CoH    26.4    21.7    24.9
Calin - Wednesday, March 7, 2007 - link
I just wanted to say I'm looking forward to the mATX roundup.
Nice article overall, thanks. And you might want to invest in some cheap Intel and AMD processors (low-end, around $100 each), just to be able to compare them (I'm not suggesting complete testing of every processor possible).
TA152H - Tuesday, March 6, 2007 - link
I just read the first page of this review, and wonder if a 15-year-old kid wrote it.
Let's grow up a little and stop thinking the world revolves around people who play video games all day and whine about big companies that actually think people use computers for real work!
There's a perfectly good reason that Intel, et al, release IGPs that don't have great 3D processing power. I mean think before you write something so inane. The world doesn't all play games, and not everyone is an idiot and wants to play a mindless 3D shoot 'em up game and thinks extra resolution makes it so much more fun. Pac-Man, Space Invaders and Defender were incredibly crude by today's standards, but people loved them. It's the idea, not the resolution.
Also keep in mind it's not like there is no downside to creating IGPs with all this functionality. You burn up electricity with those extra transistors, even when the vast majority of people, the vast majority of the time, aren't using them. You also generate more heat, and you make them more expensive. Why should someone who doesn't play 3D games pay for this when they don't use it? Why should we use more electricity so some simpletons who want to zap aliens can have that ability in an IGP? Isn't it enough that they make cards specifically for that? Shouldn't only the people who want this capability have to pay for it? Why should other machines be crippled with this heat-generating, electricity-using capability when they never use it?
So, if you're still incredulous about why companies still make relatively slow IGPs, it's because they fill the need for the vast majority of the people that buy them. And if they do not, there are options available that do. That's a perfect solution, you get it if you need it, and don't if you don't. Why whine about it so much?
Gary Key - Tuesday, March 6, 2007 - link
I wish I was 15 again. :)
I fully understand the world does not revolve around gaming and that most people use computers for work. However, these same people might have kids who want to play games, or they decide to try it themselves, and without knowledge of what it takes to run a game properly they pick up the phone and call Dell or HP or go online. Unfortunately, the majority of the machines they will see and probably can afford are going to be IGP solutions. The reps or the machine specs online will lead you to believe the $650 unit you just picked out will cover all of your needs, and that is wrong. One of Dell's top-selling units is the 521 series, and it is advertised as "Built for Smooth, Advanced Performance. Edit your videos and photos, play games and more." Guess what: it just barely does most of that, and we hear people complain all the time that their PC is not fast enough for this application or that game, or a myriad of other uses.
Probably the same reason we whined about Yugos and Pintos, and the reason you no longer see them in production. Sure, they filled a basic need, but once you got in a Civic or a Corolla you fully realized that something faster, better, and of higher quality was available for nearly the same price. The technology is there to greatly improve IGP performance at very little cost. We do not expect these solutions to ever provide the same level of performance as dedicated GPU cards, but we do expect more out of them. Intel promised the world they would have a class-leading DX9 IGP solution in the G965; what we ended up with was a solution that barely worked better than the 945G. If you are going to advertise functionality that leads one to believe you can play the latest games on it, and mass market it, then expect us to report our findings and to call them out on it if it does not meet the marketing specifications. The same holds true for NVIDIA, AMD, SiS, or VIA in this market sector. We have the same issue with the low-end video cards; in some cases their performance is no better than the IGP solutions, which means you just spent $100 for the same performance, yet the words all over the product box would lead you to believe otherwise.
Thank you for your comments! :)
TA152H - Wednesday, March 7, 2007 - link
Gary,
My big problem with what you said in the article is that you are blaming Intel, and to a lesser extent the other IGP makers, for creating a great chipset for the vast majority of the people who use it. The negative remarks about how they are making it difficult for those poor developers are completely misplaced; they make an excellent product that does what it should. In your response, you are now changing your position, I believe. It seems that Dell and HP are misrepresenting products they are selling, and that is the root problem. I am not privy to that, but if that is your contention, I have no problem with it. It could be correct, but pointing fingers at Intel and other IGP makers is misplaced. They make a product for a huge market and address it well, and if some maker misrepresents it, that's their issue and not the maker's. That would be like complaining that a Ford Mustang isn't as nimble as a Lotus Elise because sometimes dealers say it's the best-handling car. Is it Ford's fault that it is being sold incorrectly?
Your remarks about the Pinto and Yugo are totally misplaced. The Pinto died because it had a nasty tendency to explode when hit in the rear, and the Yugo was unreliable in the extreme. Neither was particularly well suited for its target audience, but the class of cars itself was. You kind of make my point for me. A lot of people still like subcompact cars for exactly the reason I stated - they don't need a big monster and they don't want to pay for the gas or the car in the first place. They get what they need, and it does the job adequately.
It's kind of childish, and megalomaniacal, to think the whole world should have to pay more for their computers, and burn more power doing it, so a small subset of people can benefit because game developers can develop to a higher standard. So the business world, and everyone who doesn't play 3D games, should subsidize gameplayers by having to buy something useless that uses extra power, probably makes more noise, and costs more - all so gameplayers can get developers to design to something better? Sorry, I don't agree, and apparently neither do any of the big companies that actually make these decisions.
With regard to operating systems, a lot of companies still use Windows 2000. The business world is a lot more pragmatic and doesn't see a need to move to something just because Microsoft came out with it. Outside of new machines coming with XP instead of 2000, was there really a compelling reason to move to it? Not for most people.
This last remark is directed at whoever said the Intel chipset took more power and was slower. Well, think a bit: it's a fully functional chipset and should, whereas the memory controller is not in the chipset for the AMD processors. Also, Intel typically makes their chipsets on older lithography, not the newest stuff, although I don't know anything specific with regard to these chipsets.
Gary Key - Wednesday, March 7, 2007 - link
Our issue continues to be the technical and marketing spin of the various companies that sell these solutions as if they will do everything a consumer expects out of a PC. I am not addressing the office crowd and was clear about that in the article. I am addressing the home user who buys a machine and has certain expectations of it based upon the sales literature or advertisements. I deal with the OEMs on an almost daily basis, and while they have certain cost targets to achieve, one of their primary issues is that they have to constantly upsell a unit if the consumer says the word "gaming". Generally, this upsell does not work and they lose the sale, as the consumer thinks they are getting the bait-and-switch routine. One of the top listed support call questions or concerns is, "My computer will not run this game, or the game does not run properly." Like it or not, people do use the home PC to play games or enjoy multimedia applications. The last time I looked at the numbers, the PC game market is still very viable and alive. I also realize that not everyone plays games, but that does not mean decent 3D performance is not required, especially with Vista being launched now. This changes everything for the consumer market; it does not matter whether Windows 2000 or XP is still the best option for most people, because the consumer buying these machines will receive Vista.
I am not advocating that the IGP solutions run the latest games at 100FPS. I am advocating that the base level of performance should be better, at least to the point where 1280x1024 at 30FPS with decent settings is viable - not because I want it, but because a significant number of consumers would like an inexpensive PC solution that the kids can also use for playing games, or maybe they are buying a second system for the family and do not want to spend as much for a discrete video card as they do for the system. Settling for 640x480 graphics is not the answer; improving the performance of the solution is the answer. We constantly expect our machines (yes, even the office crowd's) to perform faster, whether we admit it or not. I have yet to come across a coworker who was satisfied with their current hardware once they used a next-generation system. So, I do not think it is wrong to expect an improvement in the advance of IGP solutions targeted at the home crowd as an all-in-one solution.
As for the marketing spin, this is one of my main issues with the G965, and why I have a real issue with it when I address the gaming issues. This is straight from their website -
"Gamers and Media enthusiasts have been long demanding better technology for unparallel ultra-realistic visual experience and the demand will be even greater in the future. Intel has answered the current demand to support future technologies by unveiling its hybrid graphics architecture with the introduction of Intel® Graphics Media Accelerator 3000 (Intel® GMA 3000) family engine. This state-of-the-art hybrid architecture evolved as a balance between fixed function and programmability. The graphics engine of the Intel® Graphics Media Accelerator 3000 family consists of a scalable array of symmetric processing components known as execution units (EUs) which can be programmed to dynamically process graphics or media data alike. Programmability of the EUs adds flexibility and enables support for future graphics and media usages by upgrading the driver. Execution units support dynamic load balancing, multi-threading, and multi-functional data processing, resulting in increased performance to enable a more compelling gaming and visual experience for main stream users.
Intel’s next generation architecture shows Intel’s continued innovation by delivering greater flexibility and performance to meet the needs for the current and future consumer and corporate applications.
The new Intel Graphics Media Accelerator architecture with its greater performance and flexibility will be the basis for many new consumer and corporate applications. For consumers, the Intel® Graphics Media Accelerator X3000 engine supported on Intel® G965 Express chipset has been optimized for enhanced 3D to allow for greater game compatibility and realism with new video and display features to deliver a theater-like entertainment experience through Intel® Clear Video Technology. For corporate users, Intel® Graphics Media Accelerator 3000 engine incorporated in Intel® Q965/Q963 Express chipsets continues to offer stability, ease of use, and lower power consumption, in a cost effective solution. The Intel® Graphics Media Accelerator 3000 family engine is eligible for the Microsoft Windows Vista*Premium logo.
Intel’s next generation Graphics Media Accelerator that provides new levels of graphics and video responsiveness in a cost effective solution. The Intel graphics engine is integrated into the Graphics Memory Controller Hub (GMCH) and enables a low power, reduced form factor solution compared to power-hungry discrete graphics cards.
With Intel’s next-generation Graphics Media Accelerator, Intel continues to drive innovative platform enhancements that increase the overall performance for the end user. This next-generation Intel graphics architecture is designed to be extensible and offers extraordinary features. This section discusses the benefits of the Intel® Graphics Media Accelerator 3000 architecture features and innovations: programmability, dynamic load balancing, multi-threading, multi-function, 32 bit full precision compute, and dynamic and static flow control which enable more stunning graphics in games and video applications."
Notice they lead off their marketing spin with the word "Gamer" and continue with a demanding better technology statement. I think that pretty much indicates the market they were addressing with this chipset when the X3000 core was being developed. Guess what, the words that Intel used to launch this chipset are the same ones I am advocating in my opinion.
Thanks for your comments, they are certainly welcome and we respect your opinion. :)
TA152H - Thursday, March 8, 2007 - link
Gary,
You mention "gamer", and for the games I play this chipset works fine for almost all of them. Did you notice the word "mainstream"? It's not for enthusiast gamers who want the most modern games at the highest settings. But, still, I agree they exaggerate. I think the crux of our disagreement is that you seem to think an IGP should run very demanding, very modern games at fairly high settings well. I think that's entirely unrealistic. I think that's what add-in cards are for. These are not for the enthusiast, and I don't think it's feasible to expect so much of them. Anything that could would be very power-consuming, expensive, and big. They are already putting fans on some chipsets! That to me is entirely unacceptable; I like my PC very quiet. If you like performance and fans and a hot room, that's what cards are for.
I guess we mainly disagree on the target market for these items. You expect them to be serious gaming platforms; I don't. There are so many tiers of add-on cards that I don't see that being proper segmentation, and people don't want to pay for it. Put another way, if you make these more powerful than entry-level add-in cards, what do people buy who want basic 3D performance without needing anything too powerful? They have to spend more money and get a hotter, more power-hungry machine because there is nothing that works well for them? Whereas, the way it is now, if they want something better, they go with a G965 and get an add-in card. Not only that, I think these things fit a lot of people pretty well. There is almost nothing I run that requires more than a Radeon 9000, and I'm not nearly as unique as I might like to think I am. There is a big market for this level of performance.
Now, do computer companies exaggerate their claims? Well, as I said, we agree on that part. They always have, and they always will. It's capitalism, so I guess we have to accept it. Capitalism isn't perfect, but overall it works.
I agree Vista will necessarily raise the bar a bit for 3D performance, and in fact Intel did improve their IGP to address that. We only disagree on the amount, not that it was done at all. But even with Vista, Microsoft hedged their bets and did not give the requirements any teeth. You don't need "Aero" after all, and in some versions you don't even get it. Don't they deserve some blame too? How difficult would it have been for them to make it a requirement? Think about it: if they don't sell Vista, they are still going to sell XP, and the upgrade market has never been that big compared to the amount they sell with new PCs. So, why no egg on their face?
I disagree completely that 1280x1024 is necessary, or that parents want their kids playing brainless games all the time. Playing at 640x480 is fine, or 800x600, and just saying it isn't, without any backing, isn't very useful. Do you remember Pac-Man, or the Atari 2600? Or Galaga? Or Space Invaders? They were much more popular than any single game now, and they were a lot of fun (Defender was my favorite). They were incredibly crude by today's standards, but people still loved them. Some poor little kid who has to play at 800x600 is just not going to get my sympathy; if the game is a fun game, it just won't matter. If you think the level of detail of an explosion is what makes a game fun, then I think you've lost touch with mainstream human behavior and are a bit obsessive about that level of detail. It's not what matters, even if a bunch of obsessive people think it is. If it were, games played at 256x192 or whatever, with 4 colors, that were two-dimensional, and that had big-time lag when a lot was going on, would never have been popular. For those that are extreme, though, there are cards.
I would agree with you if IGPs were making NO progress. But they are, and they will continue to. You know what would be interesting for you guys to do (at least to me)? A comparison between IGPs and older video cards, tracked over the last several years to see how far IGPs have trailed cards and where that trend is heading. It would be imprecise, and in some cases difficult, but I think the results would be illuminating for just about all of us. I'm kind of curious whether they are further behind now than they were in, say, the 810 days, or whether they're trailing by about the same amount of time. My guess, and it's just a guess, is that it's about the same lag. Wouldn't you be interested in that too?
JarredWalton - Wednesday, March 7, 2007 - link
If you want the best example of misrepresenting the capabilities of a product, look at the 3DMark06 results and then look at the gaming results. The Intel GPU that you're so fond of happens to score the best in 3DMark (nearly twice as fast!), and yet it almost entirely fails to run two popular games/game engines. That's Intel's fault, and it's Intel's drivers that appear to be mostly to blame. How many places will post the 3DMark score as representing a product, because it's "easy to compare", and completely neglect to show that actual performance is very poor?
For all we know, the G965 has the power to actually outperform the other IGPs and it's only the drivers holding it back. Unlikely, but possible. And while you appear to be more than happy with the IGP offerings, we would like more, and we would like more quality - in drivers, performance, features, etc.
You complain about us putting out our opinion on what an IGP should be, but basically opinion is all you give in response. Plenty of people will wholeheartedly agree with our stance; others will agree with yours. That graphics are becoming more necessary in modern computers is basically a given - though certainly a lot of people and businesses still run Windows 98 and 2000, and they'll likely continue to run XP until after Vista's successor ships. Me, I'll take XP over 2000 any day - if for nothing else than the task manager that now has a networking tab. Yup, call me crazy, but I appreciate most of the new features that went into XP, and I'm sure I'll eventually appreciate Vista as well. (SuperFetch alone is going a LONG way towards winning me over!)
G965 is hardly better than 945G, which was barely better than 915G. X1250 likewise is only an incremental step up from X1100. Is there a compelling reason to upgrade operating systems? Sometimes. Is there a compelling reason to not upgrade? Sometimes. What about IGPs - what's the compelling reason to upgrade from 865G if all you need is 2D acceleration?
I guess we should all just forget DX8, 9, 10... probably ditch DX7 as well, forget 3D and move back into the glory days of pure 2D graphics acceleration. For all the people that would like that move, there are lots of others (many more, I'd wager) that would feel it is castrating innovation. Long term, we need new software and hardware and advances in both. How many people switched to OSX because it was an overall better OS for their needs? Only a few percent, but if Apple continued to evolve their OS and nothing happened with Windows, eventually Windows would disappear.
Q965 and 690V are more stripped-down versions of the IGPs in question, and those are targeted at the business market that you seem to prefer. We're not looking at those; we're looking at the "consumer" line of IGPs, and we're finding them severely lacking. If there were no real difference in performance between the business and consumer offerings, then there would be no point in having different products. There is a difference, however, indicating that Intel and AMD are both aware of the need for better performance in the home market. Right now, that better performance is largely missing. What it comes down to is that we feel Intel, AMD, and others could do a lot better. For you, that might not be important, but right now a lot of IGPs are only good for a bare level of functionality. Fine for business, but not for a lot of home users; they might as well forget the home market and just focus on business IGPs if the offerings are going to continue to underwhelm.
Finally, your comment about power is once again incorrect. Intel Core 2 Duo with GPU x uses less power than AMD Athlon X2 with GPU x in the majority of situations. Intel doesn't have an IMC in that scenario either. Both X1250 and G965 are "fully functional chipsets". The AMD platform ends up using quite a bit less power. Is it because Intel uses old lithography? Probably, but again, that just re-illustrates the point that Intel is not doing anything to move the IGP market forward. If we could run X2 processors with a G965 chipset, we would find as usual that Intel uses less power for the CPU. The net result is that the better platform for low-power computing currently looks to favor AMD, at least in the IGP market.
TA152H - Wednesday, March 7, 2007 - link
Jared, you're changing the argument. I didn't say Intel didn't exaggerate claims; I'm saying the world doesn't need 3D graphics with superior performance for most people, like the supposition in the article said. I don't want it, and if you do, go buy it and stop insisting that everyone gets it too. You have options.
You may want more performance, etc..., and there are cards for you. Go buy them instead of complaining about an item that wasn't made for the enthusiast. It's like complaining a Chevrolet Cavalier isn't fast enough. So go buy a Corvette, and leave the gas mileage people the Cavalier. Stop insisting that we need it.
I don't complain about you giving an opinion, I complain about your opinion. You think that everyone should subsidize game players. I don't. I never said XP wasn't for anyone, just most people don't care one way or another. I prefer 2000, it's faster and I can run it on machines that are less powerful. I still think OS/2 is better than either, but that's another topic :P. Vista might be good eventually, but I don't feel compelled to buy it because it doesn't do anything I need. When it runs something I want to run, and Win 2K doesn't, I'll probably buy it. Until then, I saved a lot of money skipping XP.
"Hardly" is too ambiguous a term, and what you consider hardly could be considered considerable. What is obvious is they run fine for most of the people that need them, and they are not for enthusiasts that need more than they are targetted for. Intel is in the business of making money, and their graphics sell better than anyone else's. So, they're obviously doing something right, even though you can shoot aliens with them as well as you'd like. But then, that's what Nvidia and AMD/ATI are for. That's what the video slots are for.
Innovation for the sake of innovation is always a mistake, and if you don't believe that look at the P7 and you'll see a perfect example. And exaggerating something to make a point shows just how weak your point really is. Intel will continue to boost performance when it makes sense to, meaning it doesn't add much in terms of power requirements and cost, and adds something useful for the target of the product.
Your supposition on Apple is without support. Unless you think you know more than the analysts who cover Apple, who think it gained a lot of mindshare because of the iPod. Also, moving to Intel helps, since it is better than the POWER platform was for them.
With regards to home users, you sort of have a point, because they do sell cheaper versions. But you seem to think the world is filled with kids that play games, and simple ones that don't want to play games that require a lot of thinking. A lot of people, myself included, like playing strategy games, and some are fine playing games without all the bells and whistles running. I don't mind playing at 640 x 480 even for arcade games, because it doesn't change the game ideas, and that's what is engaging. The funny part is you guys seem to think the world is filled with people that play extreme games because that is a lot of who reads your web site, but it isn't. Most people are happy to surf the internet, email, play card games online, and do other things that don't stress 3D capabilities much, and would prefer to save the money, electricity, noise, and heat, and not get something they don't need or want. If they need or want more, there is more out there. Put a different way: you say they should be more powerful because they have weaker siblings; has it occurred to you that there are several lines of cards above them, and that by contrast they don't have to be so powerful because of this? They are home machines, not enthusiast machines. That's what add-in cards are for.
You are misrepresenting my comments when you call them incorrect. You said that the highest performing IGP used the least power. I mentioned the chipset, not both pieces, and you would expect, everything else being equal, that the Intel chipset would use more, because it has a memory controller as part of the chipset.
You're dead wrong about Intel not moving the IGP market forward, and you even contradict yourself earlier in your remarks. It's moving it along slowly by your own admission. I'm not sure I agree on the speed, but we both agree that it is moving along. The fact is, Intel chipsets with graphics sell really well, so they are meeting the market demands. Maybe not yours, but, again, that's what add-in cards are for.
JarredWalton - Wednesday, March 7, 2007 - link
I have to say that it appears to be the same thing. Gary has further addressed the situation below. You are certainly entitled to your opinion on things, but likewise we are entitled to ours. For non-gaming, non-3D work (and to a lesser extent, non-video tasks) just about any IGP released since 2000 will perform acceptably. If that's all you need for a PC, you're set. Clearly, Intel *doesn't* feel that's all that's necessary, as they have integrated a lot more into their IGP and their marketing touts those new features. However, while we have more, it is in many instances inadequate.
If someone is happy playing older games at 640x480, that will almost certainly lead to them wanting to play newer games. If they only want to play strategy games, that will inevitably lead to looking at newer strategy games (Civ IV for example) that require 3D graphics. I think we can make a pretty good case that for anyone that does any gaming beyond solitaire and web-based games, the move to 3D quickly becomes necessary unless you simply want to relive the glory days of 2000 and earlier.
There is a market for add-in cards, but that market exists in part because the IGP offerings are so anemic in terms of performance, and there's plenty of obfuscation there as well. Many OEM budget PCs don't even give the buyer the option of installing a truly capable GPU (at least GeForce 7600 or Radeon X1650) in the online configuration tools. The best IGP on the market currently rates at about the performance of an X300/9600. Given that we have discrete GPUs ranging from entry level $50 cards up through uber-powerful $600 monsters, we'd like to see a few more options for IGP that get above "not even entry-level" performance. Both AMD and Intel have two IGP chipset levels, but they are simply "too slow" and "even slower - but it's for business so that's okay" (at least when it comes to 3D). If they're going to the trouble of making a faster model, why not actually make it useful? And why all the bogus marketing if it doesn't matter?
I too have seen far too many support questions wondering why a "recent" PC can't run a game some child got for Christmas or whatever. The companies selling and producing these products aren't out to educate the people about what's really required for certain computing tasks, so that's what we do. Most people apparently get that, and we're not the only ones complaining about the state of IGP. I've read about it in quite a few magazines, but I'm sure you disagree with their opinion as well.
For better or worse, IGP still goes into 40% of PCs sold. No wonder people get frustrated and purchase gaming consoles... not because a PC can't handle games with the correct configuration, but because the base models that so many buy are crippled at the starting gate. Don't even get me started about OEM systems that don't include GPU expansion slots (which are thankfully starting to become less common).
I understand your argument, and I respectfully disagree. Intel will sell millions of their IGP whether they suck rocks or they actually perform okay. They will sell millions of CPUs whether they are worse than the competition (P4 vs. Athlon 64) or better than the competition (Core 2 vs. Athlon X2). They are taking the easy way out on the IGP and millions of people unknowingly get "Powerful multimedia and gaming capabilities" that simply aren't. Intel is capable of doing better on IGPs - lower power and better performance, all in the same package. They don't do that for a variety of reasons, but what's good for a big corporation is rarely ideal for consumers. I for one care more about what will benefit consumers than what will make Intel, IBM, AMD, Dell, etc. happy. You don't work for one of them, by chance? You're awfully upset about a few minor paragraphs.
TA152H - Thursday, March 8, 2007 - link
Jared, I think we disagree on the purpose of this chipset. I have never liked shared memory video, ever since the PCjr. I thought it was a bad idea, and I still do. They are a low-end segment and people should not expect performance from them. My expectations for them are not that high, and I think they're perfect for their market segment. I don't think IGPs should be particularly powerful or play the most modern games well; that's what add-in cards are for. It probably comes down to market segmentation.
You're doing the same thing Gary is: you are blaming Intel, et al, for what OEMs are doing with their computers. Intel has improved the video capability, and no doubt will continue to, and it will inevitably lag behind video cards. But then, why shouldn't it? They make the product; they aren't like IBM used to be, selling the end product to consumers too, so I'd say the system makers have to fix that part and segment it well. Then again, who is really to blame? You can't blame makers; you have to put the blame squarely where it belongs, on the consumer. If the consumer demands better stuff, they will get it. I totally disagree with you on items sucking "rocks". For the first time I can remember, the technology actually has a huge impact on what sells. Look at the market share losses Intel suffered because of the miserable Prescott. Look at the low prices AMD is forced to sell their pathetic K8s for now that the P8 is out. I don't ever remember this happening before; if it had, would x86 have won out (the 68K was much better)? It didn't happen completely, but it DID happen to a degree that had a material impact on both companies, and certainly helped Intel create a more prosaic and effective product in the Core 2. This is a company that is known for bizarre microprocessors too, from the 432 to the 860 to the Itanium (really more HP, but still), so a normal microprocessor that just works well probably killed them to make :P.
We have a fundamental disagreement on the segment; I guess that's it. I think they should leave off where the add-in cards start. Cards having their own memory and processor should be better than an integrated chipset with shared memory and shared die space with other functions. Call me a purist, but that makes more sense than the IGP being faster than a card made specifically for the purpose. Also consider that the reason why these machines sell so well is the cost. Intel, et al, have to balance the cost with the features, including other aspects like buffers, sound, etc., and make it all into the best package for their intended audience. Since you can't upgrade the chipset on the motherboard, and you can upgrade the sound and video, I think they do a pretty good job.
You probably have never played Civ 4, or other strategy games like it, if you make that remark. They do not need anything even remotely powerful to play well. I run it fine on a Radeon 9000, although the processor speed is extremely important for it. Civ III was a beast too in terms of CPU performance.
I'm sure Intel has their reasons for making the IGP the way they have, and you certainly do not know better than they what they are capable of. Could they make it lower power and more powerful? Well, it's possible, but then it would cost more. Would it target the market better? Probably not, otherwise they'd probably do it. But, of course, they are working on the next chipset that will, so it's not all bad.
The root problem with stuff like this is the consumer, and the consumer has chosen these chipsets, either actively or passively. If they rejected it like they rejected, for example, RDRAM, or to a lesser extent the Prescott, then Intel wouldn't keep making them and would come out with a product that fit the market better. But these products do fit the market. You talk about how many people complain about this or that, but do you ever think about how many people who buy these units are not complaining and are perfectly happy and saved some money because it was all they needed? Anandtech readers are a small, very unrepresentative subset of the computing world at large, and you can't base a whole lot off of what readers here say. Computers 20 years ago were mainly for hobbyists, but now the general public uses them, and the reality is, there is not an industry in the world where there isn't some mismatching. You think no one complains about cars? Dishwashers? Speakers? Call your local Sears to find out if they ever get complaints :P. It's the nature of the beast, and it won't go away if Intel makes ultra-powerful IGPs (which, again, I find oxymoronic). There's always a tradeoff.
Final Hamlet - Tuesday, March 6, 2007 - link
I don't agree with _how_ this criticism was stated, but I _do_ agree with its content. I am a PC worker, too.
I am not getting paid for playing DirectX 10 games.
I have a PC for using Word, Excel, Access, maybe a browser and a PDF reader. That is basically it. Now you tell me: Do I need a Core 2 Duo 6800 and a GeForce 8800 GTX for that?
Nope. I need no graphics power at all (1280x1024x32 2D desktop - that's all) and my no. 1 priority for a processor is one that is _silent_ and nothing else.
I am quite sure I am not alone out there...
goinginstyle - Tuesday, March 6, 2007 - link
I think most of the people missed the comments or observations in the article. The article was geared to proving or disproving the capabilities of the 690G and, in a way, the competing platforms. It was obvious to me the office crowd was not being addressed in this article and it was the home audience that the tests were geared towards. I think the separation between the two was correct. The first computer I bought from Gateway was an IGP unit that claimed it would run everything and anything. It did not, and it pissed me off. After doing some homework I realized where I went wrong and would never again buy an IGP box unless the video and memory are upgraded, even if it is not for gaming. I have several friends who bought computers for their kids when World of Warcraft came out and bitched non-stop at work because their new Dell or HP would not run the game. At least the author had the balls to state what many of us think. The article was fair and thorough in my opinion, although I was hoping to see some 1080P screen shots. Hint hint.
Final Hamlet - Tuesday, March 6, 2007 - link
Too bad one can't edit one's comments... My point (besides correcting a mistake) is that I think this test is gravely imbalanced... you are testing - as you have said yourself - an office chipset - then why do you do it with an overpowered CPU?
Office PCs in small businesses are bought on price, and where is the difference in using a mail program between a Core 2 Duo for $1000 and the smallest and cheapest AMD offering for less than $100?
Gary Key - Tuesday, March 6, 2007 - link
We were not testing an office chipset. We are testing chipsets marketed as an all-in-one solution to the home, home/office, multimedia, HTPC, and casual gaming crowd. The office chipsets are the Q965/963 and 690V solutions. The G965 and 690G are not targeted at office workers and were not tested as such. Our goal was to test these boards in the environment and with the applications they are marketed to run.
JarredWalton - Tuesday, March 6, 2007 - link
We mentioned this above, but basically we were looking to keep platform costs equal. Sure, the X2 3800+ is half as expensive and about 30% slower than the 5200+. But since the Intel side was going to get an E6300 (that's what we had available), the use of a low-end AMD X2 would have skewed results the other direction. We could have used an X2 4800+ to keep costs closer, but that's an odd CPU choice as well, as we would recommend spending the extra $15 to get the 5200+. The intent was not to do a strict CPU-to-CPU comparison, as we've done that plenty (as recently as the X2 6000+ launch: http://www.anandtech.com/cpuchipsets/showdoc.aspx?...). We wanted to look at the platforms and keep them relatively equal in the cost department. All you have to do is look at the power numbers to see that the 5200+ with 690G compares quite well (and quiet well) to the E6300 with G965.
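To make the cost matching concrete, here is a rough sketch in Python. The CPU prices are the approximate street prices quoted elsewhere in this thread; the board prices are assumptions chosen only to land near the ~$300 / ~$315 platform totals mentioned in the conclusion, so treat the exact figures as illustrative rather than definitive:

    # Rough platform-cost comparison (early 2007, approximate street prices).
    # Board prices are assumptions picked to match the ~$300 / ~$315 totals
    # quoted in the conclusion; adjust them to whatever your retailer charges.
    platforms = {
        "Intel G965 + Core 2 Duo E6300": {"cpu": 185, "board": 115},
        "AMD 690G + Athlon 64 X2 5200+": {"cpu": 230, "board": 85},
    }

    for name, parts in platforms.items():
        total = parts["cpu"] + parts["board"]
        print(f"{name}: ${parts['cpu']} CPU + ${parts['board']} board = ${total}")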
The major selling point of this chipset is basically that it supports HDMI output. That's nice, and for HTPC users it could be a good choice. Outside of that specific market, though, there's not a whole lot to put this IGP chipset above other offerings. That was what we were hoping to convey with the article. It's not bad, but neither is it the greatest thing since sliced bread.
If you care at all about GPU performance, all of the modern IGP solutions are too slow. If you don't care, then they're all fast enough to do whatever most people need. For typical business applications, the vast majority of companies are still running Pentium 4, simply because it is more than sufficient. New PCs are now coming with Core 2 Duo, but I know at least a few major corporations that have hundreds of thousands of P4 and P3 systems in use, and I'm sure there are plenty more. Needless to say, those corporations probably won't be touching Vista for at least three or four years - one of them only switched to XP as recently as two years back.
JarredWalton - Tuesday, March 6, 2007 - link
Perhaps it's because the companies releasing these products make so much noise about how much better their new IGP is compared to the older offerings from their competitors? If AMD had released this and said, "This is just a minor update to our previous IGP to improve features and video quality; it is not dramatically faster and is not intended for games," then we would cut them some slack. When all of the companies involved are going on about how much faster percentage-wise they are than the competition (never mind that it's 5 FPS vs. 4 FPS), we're inclined to point out how ludicrous this is. When Intel hypes the DX9 capability of their G965 and yet it still can't run most DX9 applications, maybe someone ought to call them on the carpet? Obviously, these low performance IGPs have a place in the business world, but Vista is now placing more of a demand on the GPU than ever before, and bare minimum functionality might not be adequate for a lot of people. As for power, isn't it interesting that the HIGHEST PERFORMANCE IGP ends up using the least amount of power? Never mind the fact that Core 2 Duo already has a power advantage over the X2 5200+!
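To put quick numbers on that percentage point (a small Python sketch; the frame rates are purely illustrative):

    # Hypothetical frame rates showing why a percentage lead can mislead
    # when both results sit far below a playable ~30 FPS.
    fps_new, fps_old = 5.0, 4.0
    lead_pct = (fps_new - fps_old) / fps_old * 100
    print(f"{lead_pct:.0f}% faster than the competition")            # "25% faster"
    print(f"...yet {fps_new} FPS is still nowhere near a playable 30 FPS")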
So, while you might like to pull out the names and call us inane 15 year olds, there was certainly thought put into what we said. Just because something works okay doesn't mean it's great, and we are going to point out the flaws in a product regardless of marketing hype. Given how much effort Intel puts into their CPUs, a little bit more out of their IGP and drivers is not too much to ask for.
TA152H - Wednesday, March 7, 2007 - link
Jared, maybe they didn't intend their products to be tested in the way you did. As someone pointed out, playing at 800 x 600 isn't that bad, and doesn't ruin the experience unless you have an obsession. Incredibly crude games were incredibly fun, so the resolution isn't going to make or break a game; it's the ideas behind it that will.
You can't be serious about what you want AMD to say. You know they can't; they are in competition and stuff like that would be extremely detrimental to them. Percentages are important, because they may not be running the same games as you are, at the same settings. You would prefer they use absolutes, as if those would give more information? Did AMD actually tell anyone these were excellent for all types of games? I never saw that.
With regards to CPUs and GPUs, you are trying to obfuscate the point. Everyone uses a CPU, some more than others. But they do sell lower power ones, and even single core ones. Not everyone uses 3D functionality. In case you don't get it: I DON'T want it on certain machines of mine. I don't run stuff like that on them, and I don't want the higher power use or heat dissipation problems from it. What you call effort isn't at all; it's a tradeoff. Don't confuse it with getting something for nothing if Intel puts more into it. You pay for it, and that's the problem. People who use it should pay for it; people who don't shouldn't have to, just so the kiddies can play their shoot 'em ups.
Just so you know, I'm both. I have mostly work machines, but two play machines. I like playing some games that require a good 3D card, but I just don't like the mentality that the whole world should subsidize a bunch of gameplayers when they don't need it. That's what add-in cards are for. I would be equally against it if no one made 3D cards because most people didn't need them. I like choices, and I don't want to pay for excessive 3D functionality on something that will never use it, just to help gameplayers out. Both existing is great, and IGPs will creep up as they always have, when it becomes inexpensive (both in power and initial cost) to add capability, so the tradeoff is minor.
StriderGT - Tuesday, March 6, 2007 - link
Does this chipset support 5.1 LPCM over HDMI or not??? Or, more plainly, can someone send 5.1 audio (games, HD movies, etc.) digitally to a receiver with the 690G? According to your previous article on the 690G, 5.1 at 48 kHz was supported over the HDMI port. Now it's back to 2 channel and AC3 bitstream. Which is it?
Gary Key - Wednesday, March 7, 2007 - link
It is two channel plus AC3 over HDMI. That is the final spec on production level boards and drivers. We will have a full audio review up in a week or so that also utilizes the on-board codec.
StriderGT - Thursday, March 8, 2007 - link
Why is this happening? Why on earth can't they produce a PC HDMI audio solution that outputs up to 7.1 LPCM (96 kHz/24-bit) for ALL sources!?! They already do that for 2 channel sources!!!! Do you have any info from the hardware vendors regarding the reason(s) they will not produce such a straightforward and simple solution?!? PS: There are lots of people demanding a TRUE PC HDMI audio solution, not these S/PDIF hacks...
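For a sense of how modest the requirement is, here is a back-of-the-envelope Python calculation (raw PCM only, ignoring HDMI packing and protocol overhead, so take it as a rough illustration rather than a spec figure):

    # Rough LPCM bandwidth estimate, ignoring HDMI packing/protocol overhead.
    channels, sample_rate_hz, bits_per_sample = 8, 96_000, 24
    bitrate_mbps = channels * sample_rate_hz * bits_per_sample / 1_000_000
    print(f"7.1 LPCM @ 96 kHz / 24-bit is roughly {bitrate_mbps:.1f} Mbit/s")  # ~18.4 Mbit/s
    # A tiny fraction of what the HDMI link itself carries, so raw bandwidth
    # is not the obstacle here.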
Renoir - Tuesday, March 6, 2007 - link
I'm also interested to know more specifics about the audio side of this chipset. The support of HDMI v1.3 suggests that, with an appropriate driver and supporting playback software, Dolby TrueHD and DTS-HD bitstreams should be able to be sent via HDMI to a v1.3 receiver with the necessary decoders. Is this a possibility?
SignalPST - Tuesday, March 6, 2007 - link
I'm interested in this topic as well. Then again, I'm still waiting for them to come out with an HDMI sound card.
StriderGT - Wednesday, March 7, 2007 - link
Unfortunately there are lots of us who are still waiting for a true HDMI PC audio solution. You can check the thread I started with many technical details for that matter here: http://www.avsforum.com/avs-vb/showthread.php?t=79...
Patrese - Tuesday, March 6, 2007 - link
Great review, thanks... I know I asked that a couple of times already, but is there a mATX roundup planned here at AT? I'd like to see the Asus M2NPV-VM and Abit NF-M2 NView compared with their 690G counterparts, as this segment makes up most of the computer sales in most places. BTW, weren't you plagued by memory compatibility issues with the M2NPV-VM or any of the boards tested? This Asus board has been extremely picky in my experience...
Gary Key - Wednesday, March 7, 2007 - link
The roundup is scheduled for the 19th; we're trying to pull it in. What BIOS and memory are you using on the M2NPV-VM? So far I have not run into any real issues except with 2GB modules. The abit board is one of my favorites so far. ;)
Patrese - Tuesday, March 6, 2007 - link
There shouldn't be a question mark at the end of the "most sales" phrase... There are also a couple typos, sorry about that. Where's the edit button anyway? ;)
RamarC - Tuesday, March 6, 2007 - link
I don't really understand the point of comparing chipsets/motherboards between processor families. Subsystem performance figures can show glaring deficiencies, but otherwise it really boils down to a CPU comparison. The "media/audio encoding" and "media performance" sections are certainly CPU-centric, and pitting a $230 X2 5200+ against a $185 E6300 winds up handicapping the Intel contestant. Shouldn't the $222 E6400 have been used instead?
Gary Key - Tuesday, March 6, 2007 - link
As stated in the article, AMD is marketing the AM2 and 690G/V as a platform designed to compete against the G/Q965 and Core 2 Duo solution. The 690G is targeted at the multimedia, HTPC, home/office, and casual gaming crowd and was tested as such. We looked at the total price of a base Core 2 Duo and decent G965 board and then matched the processor choice that would come closest to the price and performance of the Intel offering while meeting the platform cost. Our tests were chosen based upon the target audience for each platform in the home environment. This was not a review of office-level machines, as the Q965/963 and 690V are targeted at the business user.
JarredWalton - Tuesday, March 6, 2007 - link
The conclusion mentions that the G965 + E6300 costs around $300 compared to $315 for the 690G + 5200+ (or 6150 + 5200+), so it's more or less a fair "equivalent price" platform comparison. The E6400 ends up being faster than the E6300, but still slower in a few tests (as the text mentions) and even faster in those tests where the E6300 already holds the lead. Nothing new there - we've pretty much beat the "Core 2 Duo is faster" drum to death. We feel anyone looking at the 690G is going to be interested in the platform as a whole much more than in whether or not it is faster than equivalently priced Core 2 offerings.
mostlyprudent - Tuesday, March 6, 2007 - link
There may be too many variables, but perhaps you could come up with a way to normalize the benchmarks. For instance, run the gaming tests first with ultra high-end graphics to try and isolate the performance delta for each platform/CPU combo you will test with. Then run the game benchmarks with the IGP solutions and adjust the scores based on the previous tests. Just a thought off the top of my head.
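Something along the lines of this rough Python sketch of the idea (all frame rates are hypothetical, purely to show the arithmetic):

    # Sketch of the suggested normalization (hypothetical numbers).
    # Step 1: run both platforms with the SAME high-end discrete card; the ratio
    #         of those scores approximates the CPU/platform contribution.
    # Step 2: divide each platform's IGP score by its platform factor, so what
    #         remains is (roughly) attributable to the IGP itself.
    discrete_fps = {"G965 + E6300": 92.0, "690G + X2 5200+": 85.0}  # same add-in GPU
    igp_fps      = {"G965 + E6300": 4.0,  "690G + X2 5200+": 9.0}   # onboard graphics

    reference = max(discrete_fps.values())
    for platform, fps in igp_fps.items():
        platform_factor = discrete_fps[platform] / reference   # <= 1.0
        normalized = fps / platform_factor
        print(f"{platform}: raw {fps} FPS -> normalized {normalized:.1f} FPS")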
asliarun - Tuesday, March 6, 2007 - link
Ah, but you're evaluating a chipset here, not a platform or a system solution. Having said that, I agree that it IS difficult to compare chipsets that are targeted at different CPUs. In such a case, a better way to evaluate might be to take an AMD and an Intel CPU that are similar in performance (not in price), and use them to compare their corresponding chipsets. That would highlight the differences between the chipsets. You could always mention the price alongside, or do a separate price/performance comparison. My point is that a price/performance comparison should complement a pure performance comparison, not the other way around.