Sony just announced the PlayStation 4, along with some high-level system specifications. The specs are in line with what we've heard for quite some time:

  • 8-core x86-64 CPU using AMD Jaguar cores (built by AMD)
  • High-end PC GPU (also built by AMD), delivering 1.84TFLOPS of performance
  • Unified 8GB of GDDR5 memory for use by both the CPU and GPU with 176GB/s of memory bandwidth
  • Large local hard drive

Details of the CPU aren't known at this point (8 cores could imply a Piledriver-derived architecture or 8 smaller Jaguar cores, the latter being more likely), but either way this will be a big step forward over the PowerPC-based general purpose cores on Cell from the previous generation. I wouldn't be too put off by the lack of Intel silicon here; it's still a lot faster than what we had before, and at this level price matters more than peak performance. The Intel performance advantage would have to be much larger to dramatically impact console performance. If we're talking about Jaguar cores, then there's a bigger concern long term from a single-threaded performance standpoint.

Update: I've confirmed that there are 8 Jaguar-based AMD CPU cores inside the PS4's APU. The CPU + GPU are on a single die. Jaguar will still likely have better performance than the PS3/Xbox 360's PowerPC cores, and it should be faster than anything ARM-based out today, but there's not huge headroom going forward. While I'm happier with Sony's (and MS') CPU selection this time around, I always hoped someone would take CPU performance in a console a bit more seriously. Given the choice between spending transistors on the CPU vs. GPU, I understand that the GPU wins every time in a console; I'm just always an advocate for wanting more of both. I realize I never wrote up a piece on AMD's Jaguar architecture, so I'll likely be doing that in the not too distant future. Update: I did.

The choice of 8 cores is a somewhat unusual one. Jaguar's default compute unit is a quad-core machine with a large shared L2 cache; it's likely that AMD placed two of these units together for the PlayStation 4. The last generation of consoles saw a march towards heavily threaded machines, so it's no surprise that AMD/Sony want to continue the trend here. Clock speed is unknown, but Jaguar is good for a mild increase over its predecessor, Bobcat. Given the large monolithic die, AMD and Sony may not have wanted to push frequency as high as possible in order to keep yields up and power down. While I still expect CPU performance to move forward in this generation of consoles, it's worth remembering that the PowerPC cores in the previous generation ran at very high frequencies. The IPC gains afforded by Jaguar have to be significant in order to make up for what will likely be a lower clock speed.
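
To put a rough number on that trade-off: the PPE in Cell and the Xbox 360's Xenon cores ran at 3.2GHz, while low-power Jaguar parts are expected to clock considerably lower. The sketch below uses a purely hypothetical 1.6GHz Jaguar clock (Sony hasn't disclosed the real figure) just to show how much per-clock uplift is needed to break even on a single core:

```python
# Illustrative only: per-core break-even math for Jaguar vs. last-gen PPC cores.
# The Jaguar clock below is a placeholder; Sony has not disclosed the PS4's CPU clock.
ppc_clock_ghz = 3.2       # Cell PPE / Xbox 360 Xenon clock (known)
jaguar_clock_ghz = 1.6    # assumed, for illustration only

# Per-core throughput scales roughly with (IPC) x (clock), so the IPC ratio
# Jaguar needs just to match one of the old PPC cores is:
breakeven_ipc_ratio = ppc_clock_ghz / jaguar_clock_ghz
print(f"Jaguar needs ~{breakeven_ipc_ratio:.1f}x the per-clock throughput to match one 3.2GHz PPC core")
```

Out-of-order execution should get Jaguar past that bar comfortably, and eight cores versus the handful of general purpose cores last generation adds multithreaded headroom on top.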

We don't know specifics of the GPU, but with it approaching 2 TFLOPS we're looking at a level of performance somewhere between a Radeon HD 7850 and 7870. Update: Sony has confirmed the actual performance of the PlayStation 4's GPU as 1.84 TFLOPS. Sony claims the GPU features 18 compute units; if this is GCN based, that works out to 1152 SPs and 72 texture units. It's unclear how custom the GPU is, however, so we'll have to wait for additional information to really know for sure. The highest end PC GPUs are already faster than this, but the PS4's GPU is a lot faster than the PS3's RSX, which was derived from NVIDIA's G70 architecture (used in the GeForce 7800 GTX, for example). I'm quite pleased with the promised level of GPU performance with the PS4. There are obvious power and cost constraints that would keep AMD/Sony from going even higher here, but this should be a good leap forward from current gen consoles.
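
As a quick sanity check on those numbers: a standard GCN compute unit contains 64 single-precision ALUs and 4 texture units, with each ALU capable of one fused multiply-add (two FLOPs) per cycle, so the 18 CU count and the quoted 1.84 TFLOPS together imply a clock speed. The math below assumes a stock GCN CU layout, which hasn't been confirmed for this part:

```python
# Rough GCN math for the PS4's GPU; assumes a stock GCN compute unit layout.
SPS_PER_CU = 64        # single-precision ALUs per GCN compute unit
TMUS_PER_CU = 4        # texture units per GCN compute unit
FLOPS_PER_SP = 2       # one fused multiply-add per cycle = 2 FLOPs

cus = 18
sps = cus * SPS_PER_CU                  # 1152 stream processors
tmus = cus * TMUS_PER_CU                # 72 texture units

quoted_tflops = 1.84
implied_clock_mhz = quoted_tflops * 1e12 / (sps * FLOPS_PER_SP) / 1e6

print(f"{sps} SPs, {tmus} TMUs, implied clock ~{implied_clock_mhz:.0f} MHz")
# -> 1152 SPs, 72 TMUs, implied clock ~799 MHz
```

An implied clock just under 800MHz would sit a bit below AMD's desktop Pitcairn parts, which is consistent with the 7850/7870 framing above.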

Outfitting the PS4 with 8GB of RAM will be great for developers, and using high-speed GDDR5 will help ensure the GPU isn't bandwidth starved. Sony promised around 176GB/s of memory bandwidth for the PS4. The lack of solid state storage isn't surprising; hard drives still offer a dramatic advantage in cost per GB vs. an SSD. If the drive is user replaceable, swapping in an SSD would be a nice compromise.
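
For what it's worth, 176GB/s is exactly what a 256-bit GDDR5 interface delivers at an effective 5.5Gbps per pin. Sony hasn't confirmed either figure, so treat the configuration below as an assumption that simply fits the quoted bandwidth:

```python
# Hypothetical GDDR5 configuration that reproduces Sony's quoted 176GB/s.
bus_width_bits = 256      # assumed memory interface width (not confirmed)
data_rate_gbps = 5.5      # assumed effective data rate per pin (not confirmed)

bandwidth_gb_per_s = (bus_width_bits / 8) * data_rate_gbps
print(f"{bandwidth_gb_per_s:.0f} GB/s")   # -> 176 GB/s
```

A 128-bit interface would need an 11Gbps effective data rate to hit the same number, well beyond what GDDR5 ships at today, which is why a 256-bit bus seems like a safe bet.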

Leveraging Gaikai's cloud gaming technology, the PS4 will be able to act as a game server and stream the video output to a PS Vita, wirelessly. This sounds a lot like what NVIDIA is doing with Project Shield and your NVIDIA powered gaming PC. Sony referenced dedicated video encode/decode hardware that allows you to instantaneously record and share screenshots/video of gameplay. I suspect this same hardware is used in streaming your game to a PS Vita.

Backwards compatibility with PS3 games isn't guaranteed; instead, Sony will leverage cloud gaming to stream older content to the box. There's some sort of dedicated background processor that handles uploads and downloads, and can even apply updates while the system is off. The PS4 also supports instant suspend/resume.

The new box heavily leverages PC hardware, which is something we're expecting from the next Xbox as well. It's interesting that this is effectively how Microsoft entered the console space back in 2001 with the original Xbox, and now both Sony and MS have returned to that philosophy with their next gen consoles in 2013. The PlayStation 4 will be available this holiday season.

I'm trying to get more details on the CPU and GPU architectures and will update as soon as I have more info.

Source: Ustream

Comments

  • tzhu07 - Thursday, February 21, 2013 - link

    I don't play console games anymore, but they have clear advantages over something like a PC. Consoles are simple and easier to operate, great for multi-player with people in the same room, and have exclusives that you can't get on the PC (especially if you're a fan of Japanese developers). They are more family-friendly and you don't have to tweak any settings within games. Also, crashes and freezes are far less frequent on consoles.

    If I was still into gaming like I was when I was a teenager, I'd buy a PS4 or Xbox 720.
  • This Guy - Thursday, February 21, 2013 - link

    I'd have to disagree. A PC game without intrusive DRM can be a hell of a lot easier to operate. Most of my PC games just load. I don't have to sign in, I don't have to select my save location. I just click on the game icon, wait ten seconds, click continue and I'm on my way.

    As for crashes, the only ones I've had often were in Fallout 3 and NV, and even then that was using mods.

    My main console gaming experience is Forza 4. A minute just to load the game. Last time I checked I had more than 50 hours in menus (not including upgrading/tuning). That was for ~250 hours of play all up. Four days of my life just waiting.
  • SlyNine - Saturday, February 23, 2013 - link

    PCs also have great games you won't see on consoles.
  • poohbear - Thursday, February 21, 2013 - link

    I was looking everywhere for detailed info on the specs, thanks for putting up this info and your thoughts on it!!!! Really more than I expected, as some people were saying it's gonna be 4GB RAM (which would've been terrible if it's supposed to last 8 years). The thing is these specs aren't anywhere near futureproof... in 2 years they'll be completely low end, let alone 5 years! Anyways, from the games I saw being previewed, the baseline for graphics will be like BF3 Ultra on PC, which is still fantastic!
  • ilikemoneygreen - Thursday, February 21, 2013 - link

    For PC building at least there is no option to install GDDR5 RAM. That kind of RAM is only in GPUs to my knowledge. Is this equivalent to, say, the system's DDR3 RAM? I also realize DDR3 is built into the lower-end GPUs, but I guess my question goes to differentiating between the GPU RAM and the system RAM. They say the PS4 has 8GB of GDDR5 RAM... but it's like everyone is acting like it's the system RAM. Can the whole system run off the GPU RAM? Kind of a 2fer? I hope I phrased this well enough because I'm not sure if I actually got it across correctly.
    Just so I'm clear here (don't need any wasted replies explaining that DDR3 is worse): I know DDR3 RAM in video cards sucks and GDDR5 RAM rocks. GDDR5 is better.
  • Xorrax - Thursday, February 21, 2013 - link

    The 8GB of GDDR5 *is* also the system ram. It's unified and shared between CPU and GPU.
  • silverblue - Thursday, February 21, 2013 - link

    Using GDDR5 throughout the system got me thinking as well. I've never really known the latency involved with GDDR5 but it sure means that Jaguar has a huge amount of bandwidth.

    As others have said, a strong CPU isn't really too necessary here, and as more and more games are multithreaded, I expect even two Jaguar compute units can spit out a decent amount of work (they are, in essence, as powerful as K8 cores, albeit with a huge list of ISA extensions available to them). I do wonder if HSA comes into any of this. Regardless, perhaps it'll convince developers to spend more time on properly threading their games with so many cores available to them, and with them using x86, porting should be easier.

    As for "built by AMD", I thought they were just licensing the tech out for somebody else to make it?
  • tipoo - Thursday, February 21, 2013 - link

    The GDDR5 is one unified pool. The CPU and GPU both access the same pool.
  • Mugur - Thursday, February 21, 2013 - link

    I know I'm asking much, but some more technical details regarding the cpu/gpu/ram would be nice... :-)

    8 GB of GDDR5 and 7850 performance levels are nice (hopefully not comparing to a mobile GPU part). Of course, it also depends on the GDDR5 speed. Unified fast cache would be nice too.

    But the CPU is very, very slow by today's (PC gaming) standards. Even with 8 cores, unless it's 3 GHz, the performance won't be enough to push the graphics part to its limit. Don't forget that a game is not quite the optimal task to spread across more than 4 cores, so single threaded performance is still important. Hell, I've touched notebooks with this kind of CPU and they are painfully slow compared to the cheapest Core i3. They are much better than Atoms, I agree, but that's not the comparison I would make. We need at least desktop Trinity CPU performance here.

    Anyway, I know it's impossible, but I would like to see XBMC on this someday... :-)
  • Xorrax - Thursday, February 21, 2013 - link

    Do you feel that the CPU performance of existing consoles (Cell and PPC in the Xbox 360) is lacking? Jaguar is much, much better for integer computation than either of those two machines, even when clocked at a far lower rate. For example, the Power architecture in the 360 has *horrible* IPC as it is in-order, with many pipeline hazards. Clock for clock, Jaguar should be on the order of 3-4 times as fast as the PPE in Cell and the Xbox 360, as it is out-of-order, has extensive register renaming, and a very good branch predictor. Couple that with an increase in real core count, and it seems like a sensible decision for Sony to have made.
