Seven billion dollars.

That’s the amount that Intel is going to spend in the US alone on bringing up its 32nm manufacturing process in 2009 and 2010.

These are the fabs Intel is converting to 32nm:

In Oregon, Intel has the D1D fab, which is already producing 32nm parts, and D1C, which is scheduled to start 32nm production at the end of this year. Then there are two fabs in Arizona, Fab 32 and Fab 11X, both of which come online in 2010.

By the end of next year the total investment just to enable 32nm production in the US will be approximately seven billion dollars. In a time when all we hear about is bailouts, cutbacks and recession, this is welcome news.

If anything, Intel should have a renewed focus on competition given that its chief competitor finally woke up. That focus is there. The show must go on. 32nm will happen this year. Let’s talk about how.

The Manufacturing Roadmap
Comments

  • Oyvind - Wednesday, April 15, 2009

    7-10 posts here are about the fact that today's software doesn't use more than 2 cores in an efficient way. Well, 2-3 years ago there was close to none. Did Valve, Epic and others build frameworks for using multiple CPUs before the hardware base was in place? The answer is no. Do most big software houses today put a big effort into scaling over more cores? The answer is yes. Should Intel/AMD wait until the software houses catch up? I don't understand it, but the vocal majority seems to answer this with a yes?
    My question: when the big software houses are done with their multi-CPU frameworks, don't you believe they will then scale over n CPUs? User input, rendering/GPU work, AI at n times the depth of today, etc. (a rough sketch of the idea follows this comment). All real-life architecture is parallel; software is not yet, but hopefully that will change.

    ----
    If life is good and you have insanely much money, you stop developing, you don't need to prioritize, and you slowly sink back into your pillow. Yes, AMD is fighting uphill, but if they manage to survive, nature has proven that fighting against uneven odds will give you a medium- to long-term edge (OK, if you survive). Tons of money doesn't save anything. I'm not sure they'll survive, but if they don't, a new company with clever engineers will rise somewhere in the future. Yes, we need competition, and there always will be.
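    A minimal sketch of the "scale over n CPUs" idea the comment gestures at (my illustration, not the commenter's or the article's; all names are arbitrary): split a batch of independent work items across however many hardware threads the machine reports, so throughput grows with core count.

    ```cpp
    // Illustrative only -- not from the article. Divide independent work
    // items across n worker threads, where n is whatever the hardware
    // reports. Each worker sums its own slice into a private slot, so
    // there is no shared mutable state and no locking.
    #include <algorithm>
    #include <cstdio>
    #include <numeric>
    #include <thread>
    #include <vector>

    int main() {
        const unsigned n = std::max(1u, std::thread::hardware_concurrency());

        std::vector<double> work(1000000);
        std::iota(work.begin(), work.end(), 0.0);  // fill with 0, 1, 2, ...

        std::vector<double> partial(n, 0.0);
        std::vector<std::thread> pool;
        const std::size_t chunk = work.size() / n;

        for (unsigned t = 0; t < n; ++t) {
            const std::size_t lo = t * chunk;
            const std::size_t hi = (t + 1 == n) ? work.size() : lo + chunk;
            pool.emplace_back([&work, &partial, t, lo, hi] {
                double s = 0.0;
                for (std::size_t i = lo; i < hi; ++i) s += work[i];
                partial[t] = s;  // one writer per slot: no data race
            });
        }
        for (auto& th : pool) th.join();

        const double total = std::accumulate(partial.begin(), partial.end(), 0.0);
        std::printf("threads=%u total=%.0f\n", n, total);
        return 0;
    }
    ```

    The same fan-out/join shape generalizes to the user-input/rendering/AI split the comment mentions: independent tasks run on their own cores and meet at a synchronization point such as a frame boundary.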
  • mattigreenbay - Friday, March 6, 2009

    This is the end of AMD. Unless this turns out like the P4 (not likely), AMD will have to release their 32nm process first or soon after [or better yet, a 16nm ultra-fast processor, and while I'm still dreaming, make it free] and have it perform better (also not likely). Poor AMD. I was going to buy a Phenom II, but Intel seems the way to go, future-wise. AMD will be liquidated, as well as VIA, and Intel will go back to selling way overpriced processors that perform worse than an i386 [Windows 7 certified].
  • mattigreenbay - Friday, March 6, 2009

    But it'll come with a free super-fast Intel GPU. (Bye bye Nvidia too.) :(

  • arbiter378 - Sunday, November 22, 2009

    Intel doesn't make fast GPUs. Even when they tried with that AGP GPU, ATI and Nvidia killed it. They won't let a new player into the graphics market without a fight. Lastly, Intel has been trying to beat AMD for 40-something years, and they're still not even close to beating them. Now that AMD has acquired ATI, they have superior graphics patents.
  • LeadSled - Friday, February 20, 2009

    What is really amazing is the process-shrink timetable. It looks like they will meet the timetable for our first quantum dot processors, theorized to occur at the 1.5nm process node by the year 2020.
  • KeepSix - Saturday, February 14, 2009

    I guess I can't blame them for changing sockets all the time, but I'm not sure if I'll be switching any time soon. My Q6600 hasn't gone past 50% usage yet, even when extreme multi-tasking (editing HD video, etc.)

    I'd love to build an i7 right now, but I just can't justify it.
  • Hrel - Thursday, February 12, 2009

    > On the mainstream quad-core side, it may not make sense to try to upgrade to 32nm quad-core until Sandy Bridge at the end of 2010. If you buy Lynnfield this year, chances are that you won’t feel a need to upgrade until late 2010/2011.

    So if you buy a quad-core, 8-thread, 3.0GHz processor, you will "NEED" to upgrade in one year?! What?! It doesn't make sense to upgrade just for the sake of having the latest. Upgrade when your computer can't run the programs you need it to anymore, or when you have the extra money and will see at least a 30 percent increase in performance. You should be good for at least 2 years with Lynnfield, and probably 4 or 5.
  • QChronoD - Thursday, February 12, 2009

    He's saying that the people who have no qualms about throwing down a grand on just the processor are going to want to upgrade to the 32nm next year.

    However for the rest of us that don't shit gold, picking up a Lynnfield later this year will tide us over until 2011 fairly happily.
  • AnnonymousCoward - Thursday, February 12, 2009

    > However for the rest of us that don't shit gold, picking up a Lynnfield later this year will tide us over until 2011 fairly happily.

    My C2D@3GHz will hold me over to 2011...
  • MadBoris - Thursday, February 12, 2009

    I watch roadmaps from time to time and I know where AMD has potential.

    Simplify the damn roadmap, platforms, chipsets, sockets!

    Seriously, I need a spreadsheet and a calculator to keep it all straight.

    Glad Anand gave a summary of where and when it makes sense to upgrade, but I just don't have the patience to filter through it all to the point where I have a working knowledge of it.

    One thing AMD has been good at in the past, if they continue it, is keeping upgrades simple. I don't want a new motherboard and new socket with nearly every CPU upgrade. I'm not sure if mobo makers love it or hate it; obviously they get new sales, but it's kind of nuts.
    This alone, knowing I have some future-proofing in the mobo, makes CPU upgrades appealing and easy, and something I would take advantage of.

    As far as the GPU/CPU goes, it's nothing I will need for years to come. We will have to wait until it permeates the market before it gets used by devs, just like multicore. It will at least take consoles implementing it before game devs start utilizing it, and even then it's liable to take a lot of steps back in performance (it's only hype now)...
