another take on Xbox 360's genesis (originally 60 shader GPU - 16 core CPU)

AirRaid

Learning from failure

The inside story on how IBM out-foxed Intel with the Xbox 360

By Dean Takahashi -- Electronic Business, 5/1/2006



Learning from failure is a hallmark of the technology business. Nick
Baker, a 37-year-old system architect at Microsoft, knows that well. A
British transplant at the software giant's Silicon Valley campus, he
went from failed project to failed project in his career. He worked on
such dogs as Apple Computer's defunct video card business, 3DO's failed
game consoles, a chip startup that screwed up a deal with Nintendo, the
never successful WebTV and Microsoft's canceled Ultimate TV satellite
TV recorder.

But Baker finally has a hot seller with the Xbox 360, Microsoft's video
game console launched worldwide last holiday season. The adventure on
which he embarked four years ago would ultimately prove that failure is
often the best teacher. His new gig would once again provide copious
evidence that flexibility and understanding of detailed customer needs
will beat a rigid business model every time. And so far the score is
Xbox 360 one and the delayed PlayStation 3 nothing.

The Xbox 360 console is Microsoft's living room Trojan horse, purchased
as a game box but capable of so much more in the realm of digital
entertainment. Since the day after Microsoft
terminated the Ultimate TV box, in February 2002, Baker has been
working on the Xbox 360 silicon architecture team at Microsoft's campus
in Mountain View, Calif. He is one of the 3DO survivors who now gets a
shot at revenge against the Japanese companies that vanquished his old
firm.

"It feels good," says Baker. "I can play it at home with the kids. It's
family-friendly, and I don't have to play on the Nintendo anymore."

Baker is one of the people behind the scenes who pulled together the
Xbox 360 console by engineering some of the most complicated chips ever
designed for a consumer entertainment device. The team labored for
years and made critical decisions that enabled Microsoft to beat Sony
and Nintendo to market with a new box, despite a late start with the
Xbox in the previous product cycle. Their story, captured here and in a
forthcoming book by the author of this article, illustrates the ups and
downs in any big project.

When Baker and his pal Jeff Andrews joined games programmer Mike
Abrash in early 2002, they had clear marching orders. Their bosses all
dictated what Microsoft needed this time around: Microsoft CEO Steve
Ballmer; Robbie Bach, running the Xbox division; Xbox hardware chief
Todd Holmdahl; Greg Gibson, in charge of Xbox 360 system architecture;
and silicon chief Larry Yang.

They couldn't be late. They had to make hardware that could become much
cheaper over time and had to pack as much performance into a game
console as they could without overheating the box.

Trinity taken

The group of silicon engineers started first among the 2,000 people in
the Xbox division on a project that Baker had code-named Trinity. But
they couldn't use that name, because someone else at Microsoft had
taken it. So they named it Xenon, for the colorless and odorless gas,
because it sounded cool enough. Their first order of business was to
study computing architectures, from those of the best supercomputers to
those of the most power-efficient portable gadgets. Although Microsoft
had chosen Intel and Nvidia to make the chips for the original Xbox,
the engineers now talked to a broad spectrum of
semiconductor makers.

"For us, 2002 was about understanding what the technology could do,"
says Greg Gibson, system designer.

Sony had teamed up with IBM and Toshiba to create a full-custom
microprocessor from the ground up. They planned to spend $400 million
developing the Cell architecture and even more fabricating the chips.
Microsoft didn't have the time or the chip engineers to match the
effort on that scale, but Todd Holmdahl and Larry Yang saw a chance to
beat Sony. They could marshal a host of virtual resources and create a
semicustom design that combined both off-the-shelf technology and their
own ideas for game hardware. Microsoft would lead the integration of
the hardware, own the intellectual property, set the cost-reduction
schedules, and manage its vendors closely.

They believed that this approach would get them to market by 2005,
which was when they estimated Sony would be ready with the PlayStation
3. (As it turned out, Microsoft's dreams were answered when Sony, in
March, postponed the PlayStation 3 launch until November.)

More important, an IP ownership strategy for the chips could
dramatically cut Microsoft's costs compared with the original Xbox.
Microsoft had lost an estimated $3.7 billion over four years, or
roughly $168 per box. By cutting costs, Microsoft could erase a lot of
red ink.

Balanced design

Baker and Andrews quickly decided they wanted to create a balanced
design, trading off power efficiency and performance. So they
envisioned a multicore microprocessor, one with as many as 16 cores, or
miniprocessors, on one chip. They wanted a graphics chip with 60
shaders, or parallel processors for rendering distinct features in
graphics animations.

Laura Fryer, manager of the Xbox Advanced Technology Group in Redmond,
Wash., solicited feedback on the new microprocessor. She said that game
developers were wary of managing multiple software threads associated
with multiple cores, because the switch created a juggling task they
didn't have to do on the original Xbox or the PC. But they appreciated
the power efficiency and added performance they could get.
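
What made those multiple threads a juggling act is easy to see in
miniature. The C++ sketch below is purely illustrative, with made-up
subsystem names and nothing taken from Microsoft's toolchain: it fans a
frame's independent work out across worker threads and then
synchronizes before the frame can finish, the kind of coordination
developers never had to think about on the single-core original Xbox.

// Illustrative sketch only: hypothetical per-frame subsystems split
// across worker threads, then joined before the frame completes.
#include <cstdio>
#include <thread>
#include <vector>

void update_physics() { std::printf("physics updated\n"); }
void update_ai()      { std::printf("AI updated\n"); }
void mix_audio()      { std::printf("audio mixed\n"); }

int main() {
    // Fan the independent subsystems out across hardware threads.
    std::vector<std::thread> workers;
    workers.emplace_back(update_physics);
    workers.emplace_back(update_ai);
    workers.emplace_back(mix_audio);

    // Join them all before rendering, so the frame sees a consistent
    // world state; getting this synchronization right is the extra
    // burden that came with multiple cores.
    for (auto& w : workers) w.join();
    std::printf("frame rendered\n");
    return 0;
}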

Microsoft's current vendors, Intel and Nvidia, didn't like the idea
that Microsoft would own the IP they created. For Intel, allowing
Microsoft to take the x86 design to another manufacturer was as
troubling as signing away the rights to Windows would be to Microsoft.
Nvidia was willing to do the work, but if it had to deviate from its
road map for PC graphics chips in order to tailor a chip for a game
box, then it wanted to get paid for it. Microsoft didn't want to pay
that high a price. "It wasn't a good deal," says Jen-Hsun Huang, CEO of
Nvidia. Microsoft had also been through a painful arbitration on
pricing for the original Xbox graphics chips.

IBM, on the other hand, had started a chip engineering services
business and was perfectly willing to customize a PowerPC design for
Microsoft, says Jim Comfort, an IBM vice president. At first IBM didn't
believe that Microsoft wanted to work together, given a history of
rancor dating back to the DOS and OS/2 operating systems in the 1980s.
Moreover, IBM was working for Microsoft rivals Sony and Nintendo. But
Microsoft pressed IBM for its views on multicore chips and discovered
that Big Blue was ahead of Intel in thinking about these kinds of
designs.

When Bill Adamec, a Microsoft program manager, traveled to IBM's chip
design campus, in Rochester, N.Y., he did a double take when he arrived
at the meeting room where 26 engineers were waiting for him. Although
IBM had reservations about Microsoft's schedule, the company was
clearly serious.

"For us, 2002 was about understanding what the technology could do."
-Greg Gibson, Microsoft system designer

Meanwhile, ATI Technologies assigned a small team to conceive a
proposal for a game console graphics chip. Instead of pulling out a
derivative of a PC graphics chip, ATI's engineers decided to design a
brand-new console graphics chip that relied on embedded memory to feed
a lot of data to the graphics chip while keeping the main data pathway
clear of traffic, which was critical for avoiding bottlenecks that
would slow down the system.
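
The reasoning behind the embedded memory is easy to check with rough
arithmetic. The sketch below uses assumed, illustrative numbers, not
figures from the article, to estimate how much raw traffic a 720p frame
buffer generates per second; that is the load the on-chip memory was
meant to absorb so the main data pathway stayed free for textures and
CPU traffic.

// Illustrative arithmetic only: rough frame-buffer traffic estimate.
#include <cstdio>

int main() {
    // Assumed numbers for the sake of the estimate.
    const double width = 1280, height = 720;  // 720p render target
    const double bytes_per_pixel = 8;         // 32-bit color + 32-bit depth
    const double overdraw = 3;                // times each pixel is touched
    const double fps = 60;

    const double gb_per_second =
        width * height * bytes_per_pixel * overdraw * fps / 1e9;

    std::printf("~%.1f GB/s of raw framebuffer traffic\n", gb_per_second);
    // Alpha blending and antialiasing multiply this further, which is
    // why keeping that traffic in on-chip memory mattered.
    return 0;
}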

Stomaching IBM

By the fall of 2002, Microsoft's chip architects had decided that they
favored the IBM and ATI solutions. They met with Ballmer and Gates, who
wanted to be involved in the critical design decisions at an early
juncture. Larry Yang recalls, "We asked them if they could stomach a
relationship with IBM." Their affirmative answer pleased the team.

By early 2003, the list of potential chip suppliers had been narrowed
down. At that point, Robbie Bach, the chief Xbox officer, took his team
to a retreat at the Salish Lodge, on the edge of Washington's beautiful
Snoqualmie Falls, made famous by the Twin Peaks TV show. The team
hashed out a battle plan. They would own the IP for silicon that could
take the costs of the box down quickly. They would launch the box in
2005 at the same time as Sony would launch its box, or even earlier.
The last time, Sony had had a 20-month head start with the PlayStation
2. By the time Microsoft sold its first 1.4 million Xboxes, Sony had
sold more than 25 million PlayStation 2s.

Those goals fit well with the choice of IBM and ATI for the two pieces
of silicon that would account for more than half the cost of the box.
Each chip supplier moved forward, based on a "statement of work," but
Gibson kept his options open, and it would be months before the team
finalized a contract. Both IBM and ATI could pull blocks of IP from
their existing products and reuse them in the Microsoft chips.
Engineering teams from both companies began working on joint projects
such as the data pathway that connected the chips. ATI had to make
contingency plans, in case Microsoft chose Intel over IBM, and IBM also
had to consider the possibility that Microsoft might choose Nvidia.

Hacking embarrassment

Through the summer, Microsoft executives and marketers created detailed
plans for the console launch. They decided to build security into the
microprocessor to prevent hacking, which had proved to be a major
embarrassment on the original Xbox. Marketers such as David Reid all
but demanded that Microsoft try to develop the new machine in a way
that would allow the games for the original Xbox to run on it.
So-called backward compatibility wasn't necessarily exploited by
customers, but it was a big factor in deciding which box to buy. And
Bach insisted that Microsoft had to make gains in Japan and Europe by
launching in those regions at the same time as in North America.

For a period in July 2003, Bob Feldstein, the ATI vice president in
charge of the Xenon graphics chip, thought Nvidia had won the deal, but
in August Microsoft signed a deal with ATI and announced it to the
world. The ATI chip would have 48 shaders, or processors that would
handle the nuances of color shading and surface features on graphics
objects, and would come with 10 megabytes of embedded memory.

IBM followed with a contract signing a month later. The deal was more
complicated than ATI's, because Microsoft had negotiated the right to
take the IBM design and have it manufactured in an IBM-licensed foundry
being built by contract chip maker Chartered Semiconductor. The chip
would have three cores and run at 3.2 gigahertz. It was a little short
of the 3.5 GHz that IBM had originally pitched, but it wasn't off by
much.

By October 2003, the entire Xenon team had made its pitch to Gates and
Ballmer. They faced some tough questions. Gates wanted to know if there
was any chance the box would run the complete Windows operating system.
The top executives ended up giving the green light to Xenon without a
Windows version.

"They were on the highest wire with the shortest net." -J Allard,
Corporate Vice President, Microsoft

The ranks of Microsoft's hardware team swelled to more than 200, with
half of the team members working on silicon integration. Many of these
people were like Baker and Andrews, stragglers who had come from failed
projects such as 3DO and WebTV. About 10 engineers worked on "Ana," a
Microsoft video encoder chip, while others managed the schedule and
cost reduction with IBM and ATI. Others supported suppliers, such as
Silicon Integrated Systems, the supplier of the "south bridge," the
communications and input/output chip. The rest of the team helped
handle relationships with vendors for the other 1,700 parts in the game
console.

Ilan Spillinger headed the IBM chip program, which carried the code
name Waternoose, after the spiderlike creature from the film Monsters,
Inc. He supervised IBM's chief engineer, Dave Shippy, and worked
closely with Microsoft's Andrews on every aspect of the design program.

Games at center

Everything happened in parallel. For much of 2003, a team of industrial
designers created the look and feel of the box. They tested the design
on gamers, and the feedback suggested that the design seemed like
something that either Apple or Sony had created. The marketing team
decided to call the machine the Xbox 360, because it put the gamer at
the center. A small software team led by Tracy Sharp developed the
operating system in Redmond. Microsoft started investing heavily in
games. By February 2004, Microsoft sent out the first kits to game
developers for making games on Apple Macintosh G5 computers. And in
early 2004, Greg Gibson's evaluation team began testing subsystems to
make sure they would all work together when the final design came
together.

IBM assigned 421 engineers from six or seven sites to the project,
which was a proving ground for its design services business. The effort
paid off, with an early test chip that came out in August 2004. With
that chip, Microsoft was able to begin debugging the operating system.
ATI taped out its first design in September 2004, and IBM taped out its
full chip in October 2004. Both chips ran game code early on, which was
good, considering that it's very hard to get chips working at all when
they first come out of the factory.

IBM executed without many setbacks, fixing bugs with two revisions of
the chip's layers. The company was able to
debug the design in the factory quickly, because IBM's fab engineers
could work on one part while the Chartered engineers could debug a
different part of the chip. They fed the information to each other,
speeding the cycle of revisions. By January 30, 2005, IBM taped out the
final version of the microprocessor.

ATI, meanwhile, had a more difficult time. The company had assigned 180
engineers to the project. Although games ran on the chip early,
problems came up in the lab. Feldstein said that in one game, one frame
of animation would freeze as every other frame went by. It took six
weeks to uncover the bug and find a fix. Delays in debugging threatened
to throw the beta-development-kit program off schedule. That meant that
thousands of game developers might not get the systems they needed on
time. If that happened, the Xbox 360 might launch without enough games,
a disaster in the making.

The pressure was intense. But Neil McCarthy, a Microsoft engineer in
Mountain View, designed a modification of the metal layers of the
graphics chip. By doing so, he enabled Microsoft to get working chips
from the interim design. ATI's foundry, Taiwan Semiconductor
Manufacturing Co., churned out enough chips to seed the developer
systems. The beta kits went out in the spring of 2005.

Meanwhile, Microsoft's brass was worried that Sony would trump the Xbox
360 by coming out with more memory in the PlayStation 3. So in the
spring of 2005, Microsoft made what would become a fateful decision. It
decided to double the amount of memory in the box, from 256 megabytes
to 512 megabytes of graphics double-data-rate 3 (GDDR3) chips. The
decision would cost Microsoft $900 million over five years, so the
company had to pare back spending in other areas to stay on its profit
targets.

Microsoft started tying up all the loose ends. It rehired Seagate
Technology, which it had hired for the original Xbox, to make hard disk
drives for the box, but this time Microsoft decided to have two SKUs,
one with a hard drive, for the enthusiasts, and one without, for the
budget-conscious. It brought aboard both Flextronics and Wistron, the
current makers of the Xbox, as contract manufacturers. But it also laid
plans to have Celestica build a third factory for building the Xbox
360.

Just as everyone started to worry about the schedule going off course,
ATI spun out the final graphics chip design in mid-July 2005. Everyone
breathed a sigh of relief, and they moved on to the tough work of
ramping up manufacturing. There was enough time for both ATI and IBM to
build a stockpile of chips for the launch, which was set for November
22 in North America, December 2 in Europe and December 10 in Japan.

Flextronics debugged the assembly process first. Nick Baker traveled to
China to debug the initial boxes as they came off the line. Although
assembly was scheduled to start in August, it didn't get started until
September. Because the machines were being built in southern China,
they had to be shipped over a period of six weeks by boat to the
regions. Each factory could build only as many as 120,000 machines a
week, running at full tilt. The slow start, combined with the
multiregion launch, created big risks for Microsoft.

Pins and needles

The hardware team was on pins and needles. The most-complicated chips
came in on time and were remarkable achievements. Typically, it took
more than two years to do the initial designs of complicated chip
projects, but both companies were actually manufacturing inside that
time window.

Then something unexpected hit. Both Samsung and Infineon Technologies
had committed to making the GDDR3 memory for Microsoft. But some of
Infineon's chips fell short of the 700 megahertz specified by
Microsoft. Using such chips could have slowed games down noticeably.
Microsoft's engineers consulted and decided to start sorting the chips,
not using the subpar ones. Because GDDR3 700-MHz chips were just
ramping up, there was no way to get more chips. Each system used eight
chips. The shortage constrained the supply of Xbox 360s.
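
The sorting decision amounts to binning, as sketched below with made-up
test data; this is an illustration only, not Microsoft's actual test
flow. Measure each part, discard anything under the specified 700 MHz,
and divide what survives by the eight chips each console needs.

// Illustrative binning sketch with hypothetical tested speeds (MHz).
#include <algorithm>
#include <cstdio>
#include <vector>

int main() {
    const std::vector<int> tested_mhz = {700, 712, 688, 700, 695, 705,
                                         700, 710, 702, 699, 700, 715,
                                         700, 701, 700, 700};
    const int spec_mhz = 700;     // speed Microsoft specified
    const int chips_per_box = 8;  // each Xbox 360 used eight chips

    // Keep only parts that meet spec rather than slow the whole system.
    const long good = std::count_if(tested_mhz.begin(), tested_mhz.end(),
                                    [&](int mhz) { return mhz >= spec_mhz; });

    std::printf("%ld of %zu chips meet spec, enough sets for %ld consoles\n",
                good, tested_mhz.size(), good / chips_per_box);
    return 0;
}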

Microsoft blamed the resulting shortfall of Xbox 360s on a variety of
component shortages. Some users complained of overheating systems. But
overall, the company said, the launch was still a great achievement. In
its first holiday season, Microsoft sold 1.5 million Xbox 360s,
compared to 1.4 million original Xboxes in the holiday season of 2001.
But the shortage continued past the holidays.

Leslie Leland, the hardware evaluation director, says she felt
"terrible" about the shortage and that Microsoft would strive to get a
box into the hands of every consumer who wanted one. But Greg Gibson,
the system designer, says that Microsoft could have worse problems on
its hands than a shortage. The IBM and ATI teams had outdone
themselves.

The project was by far the most successful that Nick Baker had ever
worked on. One night, hoisting a beer and looking at a finished
console, he said that it felt good.

J Allard, the head of the Xbox platform business, praised the chip
engineers such as Baker: "They were on the highest wire with the
shortest net."


http://www.reed-electronics.com/eb-mag/article/CA6328378?pubdate=5/1/2006
 
AirRaid said:
Learning from failure

The inside story on how IBM out-foxed Intel with the Xbox 360

By Dean Takahashi -- Electronic Business, 5/1/2006
Nice copying. And make that Rochester, Minnesota.
 
: Learning from failure
:
: The inside story on how IBM out-foxed Intel with the Xbox 360
:

Changed your screen-name once again, eh punk? Really, who gives a
shit....

j.
 
Jack said:
: Learning from failure
:
: The inside story on how IBM out-foxed Intel with the Xbox 360
:

Changed your screen-name once again, eh punk? Really, who gives a
shit....

j.

We do. Copy and paste is a waste of bandwidth when you can just post a
link and maybe a short paragraph describing the content of the article.

I would also watch who you address your comments to; all this does is
put people on notice to avoid your posts and put you in their kill
files. It might also be a good idea to know who you are addressing, so
you don't make yourself out to be feeble-minded and stupid.

Rthoreau
 