
AMD sent us word that tomorrow they will be hosting a Livecast celebrating 30 years of graphics and gaming innovation. Thirty years is a long time, and certainly we have a lot of readers who weren't even around when AMD had its beginnings. Except we're not really talking about the founding of AMD; the company got its start in 1969. It appears this is more a celebration of their graphics division, formerly ATI, which was founded in… August 1985.

AMD is apparently looking at a year-long celebration of the company formerly known as ATI, Radeon graphics, and gaming. While they're being a bit coy about the exact contents of the Livecast, we do know that there will be three game developers participating along with a live overclocking event. If we're lucky, maybe AMD even has a secret product announcement, but if so they haven't provided any details. And while we can now look forward to a year of celebrating AMD graphics and most likely a final end-of-the-year party come next August, why not start out with a brief look at where AMD/ATI started and where they are now?

[Image: Commodore 64 computer. Source: Wikimedia Commons, Evan-Amos]

I'm old enough that I may have owned one of ATI's first products, as I began my addiction (er, career) as a technology enthusiast way back in the hoary days of the Commodore 64. While the C64 started shipping a few years before ATI was founded, Commodore was one of ATI's first customers, and they were largely responsible for an infusion of money that kept ATI going in the early days.

By 1987, ATI began moving into the world of PC graphics with their "Wonder" brand of chips and cards, starting with an 8-bit PC/XT-based board supporting monochrome or 4-color CGA output. Over the next several years ATI would move to EGA (640x350 with an astounding 16 colors) and VGA (16-bit ISA and 256 colors). If you wanted a state-of-the-art video card like the ATI VGA Wonder in 1988, you were looking at $500 for the 256K model or $700 for the 512K edition. But all of this is really old stuff; where things start to become interesting is in the early 90s with the launch and growing popularity of Windows 3.0.

[Image: ATI Mach 8 ISA card. Source: Wikimedia Commons, Misterzeropage]

The Mach 8 was ATI's first true graphics processor. It was able to offload 2D graphics functions from the CPU and render them independently, and at the time it was one of the few video cards that could do this. Sporting 512K-1MB of memory, it was still an ISA card (or MCA, if you happened to own an IBM PS/2).

Two years later came the Mach 32, ATI's first 32-bit capable chip, with support for ISA, EISA, MCA, VLB, and PCI slots. Mach 32 shipped with either 1MB or 2MB of DRAM/VRAM and added high-color (15-bit/16-bit) and later True Color (the 24-bit color we're still mostly using today) to the mix, along with a 64-bit memory interface. Two years after that came the Mach 64, which brought support for up to 8MB of DRAM, VRAM, or the new SGRAM. Later variants of the Mach 64 also started including 3D capabilities (and were rebranded as Rage, see below), and we're still not even in the "modern" era of graphics chips yet!


[Image: Rage Fury MAXX]

Next in line was the Rage series of graphics chips, and this was the first line of graphics chips built with 3D acceleration as one of the key features. We could talk about competing products from 3dfx, NVIDIA, S3 (of ViRGE fame), Matrox, Rendition, and others here, but let's just stick with ATI. The Rage line appropriately began with the 3D Rage I in 1996, and it was mostly an enhancement of the Mach64 design with 3D support added on. The 3D Rage II was another Mach64-derived design, with up to twice the performance of the 3D Rage. The Rage II also found its way into some Macintosh systems, and while it was initially a PCI part, the Rage IIc later added AGP support.

That part was followed by the Rage Pro, which is when graphics chips first started helping with geometry processing (circa 1998 with DirectX 6.0 if you're keeping track), and you could get the Pro cards with up to 16MB of memory. There were also low-cost variants of the Rage Pro in the Rage LT, LT Pro, and XL models, and the Rage XL may hold the distinction of being one of the longest-used graphics chips in history; even in 2005 or thereabouts there were many servers still shipping with that chip on the motherboard providing graphics output. In 1998 ATI released the Rage 128 with AGP 2X support (the enhanced Rage 128 Pro added AGP 4X support among other things a year later) and up to 32MB RAM. The Rage 128 Ultra even supported 64MB in its top configuration, but that wasn't the crowning achievement of the Rage series. No, the biggest achievement for Rage was the Rage Fury MAXX, ATI's first dual-GPU card, which used alternate frame rendering to provide up to twice the performance.
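The Rage Fury MAXX pulled this off by putting two GPUs on a single board and alternating whole frames between them. At its core, alternate frame rendering is just round-robin frame assignment; here is a purely conceptual sketch in Python (the function name is my own invention, and no real driver API works this way):

```python
# Conceptual sketch of alternate frame rendering (AFR): multiple GPUs
# take turns rendering whole frames, roughly multiplying throughput by
# the GPU count when neither chip stalls. Purely illustrative.

def assign_frames_afr(frame_count, gpu_count=2):
    """Return a list mapping each frame index to the GPU that renders it."""
    return [frame % gpu_count for frame in range(frame_count)]

# With two GPUs, GPU 0 renders frames 0, 2, 4 and GPU 1 renders 1, 3, 5.
print(assign_frames_afr(6))  # [0, 1, 0, 1, 0, 1]
```

The catch, then as now, is that each GPU needs its own copy of the frame's data, and uneven frame times between the two chips can make delivery feel less smooth than the raw frame rate suggests.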


[Image: Radeon 9700 Pro]

And last but not least we finally enter the modern era of ATI/AMD video cards with the Radeon line. Things start to get pretty dense in terms of releases at this point, so we'll mostly gloss over things and just hit the highlights. The first-generation Radeon brought support for DirectX 7 features, the biggest being hardware support for transform and lighting (T&L) calculations, basically a way of offloading additional geometry work from the CPU. The second-generation Radeon chips (sold under the Radeon 8000 and lower-numbered 9000 models) added DirectX 8 support, marking the first appearance of DirectX-compliant programmable pixel and vertex shaders in ATI's GPUs.
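The "transform" half of hardware T&L boils down to multiplying every vertex by a 4x4 matrix each frame, per-vertex math that had previously burned CPU cycles. A minimal sketch of that math (illustrative only; the helper name is mine, not any real API):

```python
# The "transform" in transform and lighting: apply a 4x4 matrix to each
# vertex. DirectX 7-era hardware T&L moved this per-vertex arithmetic
# off the CPU and onto the GPU. Illustrative sketch only.

def transform_vertex(matrix, vertex):
    """Apply a 4x4 row-major matrix to an (x, y, z) vertex (w assumed 1)."""
    x, y, z = vertex
    v = (x, y, z, 1.0)
    return tuple(sum(matrix[row][col] * v[col] for col in range(4))
                 for row in range(3))

# A translation matrix that shifts geometry by (10, 0, 0).
translate = [
    [1, 0, 0, 10],
    [0, 1, 0, 0],
    [0, 0, 1, 0],
    [0, 0, 0, 1],
]
print(transform_vertex(translate, (1.0, 2.0, 3.0)))  # (11.0, 2.0, 3.0)
```

Multiply that by tens of thousands of vertices per frame and it's easy to see why moving the work into dedicated hardware was the headline DirectX 7 feature.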

Perhaps the best of the Radeon breed was the R300 line, with the Radeon 9600/9700/9800 series cards delivering DirectX 9.0 support and, more importantly, holding onto a clear performance lead over chief competitor NVIDIA for nearly two solid years! It's a bit crazy to realize that we're now into our tenth (or eleventh, depending on how you want to count) generation of Radeon GPUs, and while the overall performance crown is often hotly debated, one thing is clear: games and graphics hardware wouldn't be where they are today without the input of AMD's graphics division!

That's a great way to finish things off, and tomorrow I suspect AMD will have much more to say on the subject of the changing landscape of computer graphics over the past 30 years. It's been a wild ride, and when I think back to the early days of computer games and then look at modern titles, it's pretty amazing. It's also interesting to note that people often complain about spending $200 or $300 on a reasonably high-performance GPU, when the reality is that the top-performing video cards have often cost several hundred dollars; I remember buying an early 1MB True Color card for $200 back in the day, and that was nowhere near the top-of-the-line offering. The amount of compute performance we can now buy for under $500 is awesome, and I can only imagine what the advances of another 30 years will bring us. So, congratulations to AMD on 30 years of graphics innovation, and here's to 30 more years!

Source: AMD Livecast Announcement


32 Comments


  • cblakely - Friday, August 22, 2014 - link

    "If you wanted a state-of-the-art video card like the ATI VGA Wonder in 1998, you were looking at $500 for the 256K model or $700 for the 512K edition."

    1988 maybe?
  • JarredWalton - Friday, August 22, 2014 - link

    Yup... fat fingered or just too many dates in my head. :)
  • StevoLincolnite - Saturday, August 23, 2014 - link

    "3dfx, NVIDIA, Virge, S3"

    I think you mean: "3dfx, nVidia, S3".
    Virge was actually an S3 branded GPU, aka, the S3 Virge DX/XG and not a separate company.
    Plus we also had competitors such as Matrox, Rendition, Trident, NEC etc'.
  • Mathos - Saturday, August 23, 2014 - link

    He's leaving out Rendition (Verite chip) and Matrox from that time period as well.
  • StevoLincolnite - Sunday, August 24, 2014 - link

    Already noted that. :P

    Then we have: "The second generation Radeon chips (sold under the Radeon 8000 and lower number 9000 models) added DirectX 8 support, the first appearance of programmable pixel and vertex shaders in GPUs."

    Which isn't entirely accurate either.
    Yes it was one of the first GPU's with Direct X 8.0/8.1 compatible programmable pixel shaders, but it's far from the first implementation of programmable pixel shaders as a whole.

    If you go back to the GPU before the 8000 series you had programmable pixel shaders on the Radeon 7500, albeit it wasn't flexible enough to be compliant with Direct X.
  • Mr Perfect - Friday, August 22, 2014 - link

    It appears this is more a celebration of their graphics division, formerly ATI, which was founded in… August, 1985.


    Maybe I'm just suffering from friday brain here, but wouldn't a start date of '85 make them 29 years old?
  • ruggia - Friday, August 22, 2014 - link

    Seems like its meant to be a year-long celebration, so this is more like a pre-gaming kick off party
  • Karl Hungus - Friday, August 22, 2014 - link

    Next to my 3dfx Voodoo 2, my Radeon 9700 Pro is the card I have the best memories of. I bought it on release and it was relevant for many years. How they squandered that lead I will never understand.
  • just4U - Friday, August 22, 2014 - link

    wait ... what? Squandered? They were behind till the 9x series came out and have been trading blows with Nvidia's Geforce line ever since.
  • Samus - Friday, August 22, 2014 - link

    Yeah Karl you act like ATI/AMD hasn't even been relevant. NVidia had some real doozies back in the Fermi days (just a few years ago) and they've literally been trading blows with every generation of GPU. However, I think ATI/AMD is in real trouble when the high-performance Maxwell-based 880/870 are released this Fall. The fact Maxwell is about twice as efficient as Kepler alone is going to require some real overhaul of AMD's "Islands" architecture.
