
Thread: Intel Larrabee all but dead for consumers


  1. #1
    No Hobbits Allowed Isengard's Avatar
    Join Date
    Feb 2009
    Location
    Cape Town
    Posts
    8,409

Intel Larrabee all but dead for consumers

    It looks like Intel’s bid to become a major player in consumer graphics chips has ended in disaster — for now.

    The world’s biggest chip maker has been working for years on Larrabee, a chip with dozens of cores for processing graphics. It was the company’s major competitive thrust at Nvidia and the graphics division of Advanced Micro Devices. But the company has canceled the consumer version of Larrabee, as first reported in the SemiAccurate blog.

    “Larrabee silicon and software development are behind where we had hoped to be at this point in the project,” said Nick Knuppfler, a spokesman for Intel in Santa Clara, Calif. “Larrabee will not be a consumer product.”

    In other words, it’s not entirely dead. It’s mostly dead. Instead of launching the chip in the consumer market, Intel will make it available as a software development platform for both internal and external developers. Those developers can use it to develop software that can run in high-performance computers.

    But Knuppfler said that Intel will continue to work on stand-alone graphics chip designs. He said the company would have more to say about that in 2010.

    The setback will let Nvidia and AMD breathe sighs of relief. Intel was offering a very different architecture that would have competed with the stand-alone graphics chips those companies make. By one measure, Intel said the performance of the initial Larrabee design for "throughput computing" applications used in supercomputers is "extremely promising." The design drew praise and interest at the Game Developers Conference this year. Evidently, however, its consumer graphics performance was weak.

    Intel will continue to make graphics components that are integrated into PC support chips known as chip sets. But those are typically not good enough to run high-end 3-D games and high-resolution videos.


    Pity, it's always good to have more competition.

  2. #2
    Persistent illusion GReeN_ORB's Avatar
    Join Date
    Jun 2009
    Posts
    1,871


    Can't say I'm really surprised, to be honest. For Intel to attempt to join a stable market, in which the competing firms have decades of research behind their names, is pretty optimistic. It's even more optimistic to assume a chip like that could compete effectively with the current generation of GPUs from Nvidia/ATI/VIA/etc.

    As you say though, it is a pity. Cheaper, faster-performing GPUs are what we want...
    The difference between pizza and your opinion is that I asked for pizza.
