32 Comments
Anonymous - Thursday, June 2, 2005 - link
AndyKH knows what he's talking about. Listen to him.
Michael2k - Saturday, May 21, 2005 - link
Personally I can't wait to see what this implies for future PowerMacs; multicore and 3GHz+.
AndyKH - Saturday, May 21, 2005 - link
#26: I think we are arguing in slightly different directions.
IIRC, the original MIPS R2000 had a decent C compiler made for it, and that compiler was able to take advantage of the branch delay slot. So while I appreciate the difficulties in making a good compiler, I don't agree that it will be all that difficult to achieve respectable performance in branch-heavy code. You might end up with empty branch delay slots at times, but finding instructions that will be executed regardless of whether the branch is taken is fairly easy for the compiler, compared to grouping instructions for concurrent execution on the very wide Itanium architecture.
However, if the part you wrote about abstracting away from the architecture means abstracting *completely* away from all the peculiarities of the Cell processor (mainly the fact that SPEs can't handle general-purpose code), then I agree with you. Then again, the PS2 also has a "few" peculiarities of its own, and the dev houses still manage to write well-performing code.
I think the big hurdle will be to develop a multithreaded game engine, and then "just" adapt the engine for "the other" architecture. AFAIK, the truly impressive PS2 games have not been achieved by writing the code once and compiling it for multiple architectures.
All that said, I still believe the Xbox CPU will be easier to program for, because all the threads will be executing on identical cores. But… it all depends on the tools available, and Sony might have some fairly good ones ready by now.
Ritesh - Friday, May 20, 2005 - link
Anand, I would like to point you to a blog entry made by Major Nelson: http://www.majornelson.com/2005/05/20/xbox-360-vs-... It's a four-part series comparing the two consoles in detail. Although it was posted by someone who's in the Microsoft camp, I would like to hear your comments on the figures he posted. They definitely seem reliable and detailed, but I would like an unbiased opinion on them from someone who isn't from either MS or Sony.
Reflex - Friday, May 20, 2005 - link
The G5s cannot be used fully; some of their components are designed for operations the 360 CPU will not have. So the dev kits cannot harness the full power of two G5s; the main reason for the dual processors is multithreading practice.
The 360 CPU, btw, is 3 cores @ 3.2GHz. So a dual-CPU 2.5GHz dev kit would be roughly half the performance, assuming everything about the CPUs was equal, which it is not. So the estimate is not off base.
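Counting clocks alone (a rough sketch only, since as noted the cores are not equal): the dev kit has 2 x 2.5GHz = 5GHz aggregate versus the 360's 3 x 3.2GHz = 9.6GHz, and 5/9.6 is about 52%, i.e. roughly half.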
Anonymous - Friday, May 20, 2005 - link
"It takes two Apple G5s to power a 30% capacity Xbox 360 demo?"So the MS claim is this:
I'm assuming dual-processor 2.5GHz G5s - so that's four of them - four 2.5GHz G5 CPUs from IBM running three times slower than a single, yet-to-be-seen, 3.2GHz CPU from IBM.
Either IBM has made an unheard-of leap in CPU performance, or the Macs were running slow code, or we won't see anything like this kind of jump in performance.
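Rough clock math, assuming per-clock parity (surely wrong in detail, but it frames the claim): four G5s give 4 x 2.5GHz = 10GHz aggregate, while 30% of the 360's 3 x 3.2GHz = 9.6GHz is about 2.9GHz-equivalent. If 10GHz of G5 really delivers only 2.9GHz of 360-equivalent work, the implied per-clock gap is well over a factor of three.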
What's the betting?
Reflex - Friday, May 20, 2005 - link
#25: The only way I see that being the case is if:
a) Coders optimize every application like hell for it, and are intimately familiar with how it works in order to avoid poor coding decisions. The drawback is that their code would be very difficult to write and would not be very portable.
b) A very, very advanced compiler is developed that takes all the work out of coding for the architecture. Unfortunately, compilers take years to write and refine; people are still finding refinements for C compilers 25 years after they were created. I wouldn't hold my breath for this one.
I'm not saying it isn't possible, but it would only happen if Cell is a permanent new processor line that will be used for generations and refined/advanced over time. I don't see that happening any more than it happened with the Emotion Engine; Sony will use Cell for 5 years and then do something different for the PS4. Without platform stability there is no real purpose in learning the ins and outs of an architecture well enough to take something that was created to be specialized and make it general-purpose.
AndyKH - Friday, May 20, 2005 - link
#17 & #19It might just be that it won't even suck that much at branch heavy code. All the articles I have read about the subject speculate that the pipeline will be very short, and that is the logical thing to assume when you remove all of the out of order logic from the core. One could speculate that it might be entirely possible to do the branch target or jump target calculation already in the second pipeline stage as it is the case in the wonderful MIPS R2000 processor, which (for all I know) is THE preferred architecture when teaching university students about computer architecture. If you couple that with a very advanced compiler that probably do a very good job of filling the branch delay slot(s) with usable instructions, then you could potentially have a very respectable performance in branch heavy code as well. Of course problems might arise when executing very tight loops with very few instructions in each loop, but I really don’t think it will be such a big problem as you make it to be.
Anand, what is your take on this?
oysteini - Friday, May 20, 2005 - link
I think the PC's biggest problem is the fact that it will take a long time until most normal users have a multi-processor (or multi-core) system, which will make it a bit risky for developers to rely on this kind of architecture. And what will be the standard? 2, 4, 8 processors? A physics PPU? Standards are the main advantage consoles have, and this time it might take a bit longer for PCs to catch up, because of the new multi-processor architectures found in the next-gen consoles.
Chris - Thursday, May 19, 2005 - link
Can't wait to see what the PC world counters with.
3.2 GHz Athlon FX X4 with 4 cores?
That would eat both of these CPUs alive.
Anonymous - Thursday, May 19, 2005 - link
OS X performance is irrelevant. It's running on an NT kernel.
Anonymous - Thursday, May 19, 2005 - link
Everyone knows that the G5s don't play games well. Apple has just hired OpenGL experts to try to raise OS X performance. Still, saying it's only at 30% now and actually releasing hardware that gives 100%... well, we'll see...
The great Xbox 360 E3 hoax - Thursday, May 19, 2005 - link
The great Xbox 360 E3 hoax
http://gamesradar.msn.co.uk/news/default.asp?paget...
bsectionid=1586
[19/05/05 03:52]
People think they're sampling Xbox 360, but they're not – can you guess what they're really playing?
Entering the Microsoft E3 stand is a wonderful experience – green and full of Xbox 360 activity. Activity we will be covering with some kind of ninja passion over the next few days – and you won't want to miss any of it, regardless of what PlayStation 3 fanboys are telling you on web forums.
However, all is not as it seems.
Despite the reassuring presence of an actual Xbox 360 unit locked away in a visible compartment in the demo stations, and despite the games being playable using an Xbox 360 pad, it's not an Xbox 360 people are playing.
Poking our noses around the side of the demo station and peering through the vaguely camouflaging air grills revealed the shocking truth: behind the Xbox 360 unit sat two Apple G5 computers.
So, are alpha-stage Xbox 360 games really so powerful they need a combined effort from two G5 machines to properly run Xbox 360 software? Quite possibly.
And, we were told by a man demoing Kameo that the game was only running at 30% of Xbox 360's capacity! Kameo featured pleasantly detailed next-gen graphics and, more impressively, hundreds of on-screen characters, all displaying individual AI routines – all at 30%.
It takes two Apple G5s to power a 30% capacity Xbox 360 demo? This may well be the case and if it's true then this console's only just started – this is just the beginning. Its games, once developers have sat down and spent a decent amount of time with full development kits, will just get better and better.
So, it seems there are a few more rounds to fight in the PS3 vs Xbox 360 match-up before the winner can lift up its gloves with any true confidence after all.
Reflex - Thursday, May 19, 2005 - link
#18: Thanks for pointing out the branch prediction issue. These people with the "Cell is my next desktop PC" statements really do not understand that it simply would *not* be a high-performance general-purpose PC. Now, a Cell co-processor, perhaps on a PCI-E add-in card: that would be a good idea. But as a main CPU it would be a terrible experience.
GrandTrain - Thursday, May 19, 2005 - link
The Cell can't handle highly branched code well. Believe it or not, the Cell's SPEs have no hardware branch prediction. There are mechanisms in place to minimize the effect (software branch hints, perhaps highly efficient compilers, etc.), but there's the answer to your query.
My question is... shouldn't people be making a bigger deal about the PS3 GPU not having EDRAM? ATI's 360 part does have the embedded RAM (10MB), so full-screen anti-aliasing is effectively free performance-wise. We all know enabling FSAA comes at the cost of a huge performance hit bandwidth- (and frame-rate-) wise... will this not level the (graphics) playing field between the 360 and the PS3?
It does make me wonder when I hear about all the "jaggies" in Sony's PS3 demo games... if they had the RSX enable FSAA, would these "real time" demos slow to a crawl? Time will tell.
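Back on the branch point, here's roughly what minimizing the effect in software can look like in practice: the compute-both-and-select idiom, sketched in plain C (nothing Cell-specific, illustrative only):

    /* Branchy version: two data-dependent branches, costly on a core
       with no hardware branch prediction. */
    int clamp_branchy(int x, int lo, int hi) {
        if (x < lo) return lo;
        if (x > hi) return hi;
        return x;
    }

    /* Branch-free version: compute both candidates and select. Compilers
       typically lower these ternaries to conditional-move/select
       instructions, so no branch is ever taken. */
    int clamp_branchless(int x, int lo, int hi) {
        int t = (x < lo) ? lo : x;
        return (t > hi) ? hi : t;
    }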
knitecrow - Thursday, May 19, 2005 - link
Running tech demos or just the game engine is a completely different story from running actual games.
I mean, we all know the Cell is amazing at floating-point operations and should be great for well-behaved code like engine or physics code. But how does it run highly branched code? How does AI or gameplay code run on it?
I hope you can find some developers to talk to you, on or off the record, about what is real and what is hype.
fluffy - Wednesday, May 18, 2005 - link
Anand, I wouldn't say that multithreaded development will be a huge issue for the Cell. After all, it only has one main processor core, in contrast to the Xbox 360's three. So, at worst, the developer will have to deal with 2 threads.
However, the Cell does have a bunch of auxiliary vector processors. Developers for the PS2 are already used to these, and it will be a case of "more, faster, better" rather than working with something revolutionary.
FredMT - Wednesday, May 18, 2005 - link
Wait for a Korean to sell a mobo for the Cell, run Linux, etc....
Add to that Sony releasing a watered-down version of their SDK like they did for the PS2, and then we'll have some fun.
Anonymous - Wednesday, May 18, 2005 - link
While it is true that CPUs won't catch up to these new consoles for a while, I have a feeling PCs as gaming platforms will be able to catch up pretty quickly. It would be interesting to hear if Ageia is showing anything new at E3...
Chris - Wednesday, May 18, 2005 - link
Can't wait for PS3 mod chips and a Pukklink/PS2link port :)
I got to play with the PPC750e in the GameCube and found the instruction set quite interesting.
Anonymous - Wednesday, May 18, 2005 - link
...oh man, an April Fool's joke...
=P
...and I totally FELL FOR IT...
=D
...thanks for letting me know about AMD in the graphics card biz...
;)
Anonymous - Wednesday, May 18, 2005 - link
#10: As long as the Cell has the basic instructions, it'll be capable of anything. I heard the same thing about the Alpha RISC, yet AutoCAD ran very nicely on it. The Cell has the bandwidth to make up for any shortcomings.
Reflex - Wednesday, May 18, 2005 - link
#8: You're going to be waiting for a very long time. That is not in the cards; the Cell has some serious compromises that preclude it from being a general computing processor for the mass market...
static1117 - Wednesday, May 18, 2005 - link
I am stoked about the next-gen consoles. I will probably get the PS3 because I have an extensive library of PS1 and PS2 games.
Anyway, I was reading the tech specs and what people were saying about both systems, and it got me thinking about which next-gen system will reign supreme. After thinking about it, I decided that it comes down to one thing: the games. If the games aren't good on one system, but they are on the other, then we will have our winner.
So watch out for which games come out for which system.
Cdeck - Wednesday, May 18, 2005 - link
No one is referencing what I think is the most impressive demo: the real-time rendering of the digital terrain model developed by the Cell team.
This demo was done solely on the Cell, without the aid of the GPU. I am more than familiar with this field, and I know the horsepower required for this achievement.
I don't know about you guys, but I am not putting another nickel into my PC. I will wait for a Korean company to sell me a motherboard for the Cell, run Linux, and leave my gaming experiences up to the PS3.
icarus4586 - Tuesday, May 17, 2005 - link
lol"When asked to comment, ATI’s spokesperson, Louis Frappe-Mocha, had this to say..."
"In a similar vein, NVIDIA remain officially unconcerned, as their spokeswoman, Sandy Beaches, stated..."
"Intel’s Hugh Jass had this to say about the AMD board..."
Reflex - Tuesday, May 17, 2005 - link
#5 - Check the date on that article... it's an April Fool's joke.
john - Tuesday, May 17, 2005 - link
What about the AMD commitment to the graphics card market?
http://www.hexus.net/content/reviews/review.php?dX...
I'm longing for you to comment on this possibility; my next PC would be an A64 X2 with an AMD GPU!!!
=)
Reflex - Tuesday, May 17, 2005 - link
Anand -
Mark Rein (Epic) made a post over at VE3D clearing up the demo issue (http://ve3dboards.ign.com/message.asp?topic=197460... He stated flat out that the demos shown were real in the case of Unreal 3, the EA games, and the Sony tech demos. However, all the rest you saw were pre-rendered approximations (including the Killzone 2 demo everyone is going on about). Mark feels they should be attainable, so I will qualify the statement with that; however, they were *not* rendered in real time on a Cell-based dev system.
My skepticism on this is that we've heard all this before. The PS2 showed off a ton of CGI approximations of what gaming on the PS2 would be like, and it never materialized. Furthermore, while it's nice that Sweeney says the Cell isn't a problem to program for, it was also stated in Mark's post that what he showed off was a demo of the RSX chip; he hasn't begun to tap Cell yet. So at this stage he is, more than anything, reassuring cross-platform licensees of Unreal 3 that it will run on the PS3.
Remember that Sweeney's job is to sell game engines. No one is going to license Unreal 3 for a PS3 game if he makes it sound as though he's having difficulty with the platform. I am not calling him dishonest; I am only pointing out that facts are being omitted.
From a programming perspective you make some solid points about general-purpose vs. special-purpose. However, given the power of the general-purpose CPUs we are talking about here, the flexibility of the 360 system architecture, and the fact that it does have some dedicated hardware where needed, I really don't think it's at much of a disadvantage, if any.
Brian - Tuesday, May 17, 2005 - link
What I want to know about the next generation of machines is whether they will easily connect to my PC monitor. Ever since the Dreamcast, I've been playing my machines on my super-crisp high-definition monitor instead of the sad little TV. Yet I've always had to find someone (usually in Korea or Taiwan) to develop a bit of hardware that would connect the console to my monitor. Will these next-generation machines, which output high definition as standard, connect to monitors that already support high-definition video without my having to plunk down a lot of greenbacks?
Please!!
uncjigga - Tuesday, May 17, 2005 - link
But how do the PC development environment and software play into the whole equation? The hardware means nothing if we don't have PC games that can make good use of it, and this is always the problem when it comes to console vs. PC debates.
Will we have to wait for DirectX 10/WGF 1.0 and unified shaders before we see developers releasing PC games comparable to Xbox 360/PS3? Will we have to upgrade to Longhorn to take advantage of these architectures?
I'm also curious to see what role Intel's new bus architecture and the continued development of HyperTransport, memory speeds, and PCI-E will play in delivering bandwidth comparable to what the next-gen consoles have. My guess is that we'll need EDRAM on video cards to boost bandwidth before the system bus, memory, and PCI-E catch up.
ksherman - Tuesday, May 17, 2005 - link
I always enjoy your interesting musings... well typed, Anand!