photonic x86 CPU design

  • Thread starter: Nathan Bates
Nathan Bates

A rumor is that yet another x86 startup company (as in Cyrix)
has opened shop on Freedom Circle (which, by no coincidence,
is next door to Transmeta). Dozens of tech companies have tried to
compete in the x86 arena, and all have failed (including AMD, in
terms of profitability).
The twist is that this x86 CPU will be based on photonic technology.
Photonics opens a world of design and architecture possibilities,
compared to electronic devices, where power dissipation has
historically limited architectural enhancements and innovations.
 
YKhan said:
As soon as I hear photonics in relation to CPUs, I immediately think
scam.

Yousuf Khan
Intel recently demonstrated an in-silicon infrared laser, so I think
it will not remain this way much longer. There are not many other
ways to go if you want to keep increasing bandwidth inside the chip.

Regards,
Evgenij
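
A rough sketch of why optics helps with bandwidth: wavelength-division
multiplexing lets many independent channels share a single waveguide.
The channel count and per-channel rate below are assumed figures for
illustration, not numbers from Intel's demonstration:

    /* Back-of-envelope WDM arithmetic: aggregate bandwidth of one
     * on-chip optical link. Both figures are assumptions. */
    #include <stdio.h>

    int main(void)
    {
        const int    channels = 64;    /* assumed wavelengths per waveguide */
        const double per_chan = 10e9;  /* assumed 10 Gbit/s per wavelength */

        printf("one waveguide: %.0f Gbit/s aggregate\n",
               channels * per_chan / 1e9);
        return 0;
    }

With those assumptions, a single waveguide carries 640 Gbit/s, which
is the kind of headroom the posters here are imagining.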
 
Nathan said:
A rumor is that yet another x86 startup company (as in Cyrix)
has opened shop on Freedom Circle (which, by no coincidence,
is next door to Transmeta). Dozens of tech companies have tried to
compete in the x86 arena, and all have failed (including AMD, in
terms of profitability).

I presume you mean "not as much profit" rather than "unprofitable."

The twist is that this x86 CPU will be based on photonic technology.
Photonics opens a world of design and architecture possibilities,
compared to electronic devices, where power dissipation has
historically limited architectural enhancements and innovations.

Do post when they start shipping anything other than glowing
projections. As with fusion and quantum computing, it's clear that it
would be much better than any existing alternative, if you could get
it working in any cost-effective way.

In future tech, I'm much more impressed with the possibilities of that
company which can produce quantities of boron-doped semiconducting
diamond for substrates. That seems to be deliverable with technology
which exists currently, although not all in the same place.

I spent a few decades at a major R&D lab: research is proving it can
be done; development is finding out how. There's a lot of engineering
needed to get photonic computing going. Just as a first thought, I
would think that a RISC design would be easier to emulate, if that
were a goal.
 
Bill said:
I spent a few decades at a major R&D lab: research is proving it can
be done; development is finding out how. There's a lot of engineering
needed to get photonic computing going. Just as a first thought, I
would think that a RISC design would be easier to emulate, if that
were a goal.

Or some kind of embedded RISC core, with no FPU or anything like that.

Yousuf Khan
 
The twist is that this x86 CPU will be based on photonic technology.
Photonics

Can you say "bling bling"?
I just hope they got themselves lots of room for those flashlights and
mirrors.
 
Evgenij Barsukov said:
Intel recently demonstrated an in-silicon infrared laser, so I think
it will not remain this way much longer. There are not many other
ways to go if you want to keep increasing bandwidth inside the chip.

Not speaking for my employer, Intel, but I agree with YKhan that
a photonic x86 is not a real possibility any time soon.

But M. Barsukov, do you really work for TI?
That's where your post is from.
 
As soon as I hear photonics in relation to CPUs, I immediately think
scam.

Manufacturing a CPU on silicon was an enormous scam.
Just melt worthless sand into tiny wafers and sell each one for $1,000.

Seriously, here's an intriguing article mentioning
Intel, AMD, FreeScale, and Transmeta regarding photonics:
http://www.extremetech.com/article2/0,1558,1779951,00.asp

From design through tape-out, a new x86 CPU based on CMOS technology
will take three years minimum to develop. A photonic x86 will take
much longer, maybe 5-7 years, and that depends on the rate of
advancement of photonic technology.
Sounds like the classic gamble for Silicon Valley VCs.
 
|> >As soon as I hear photonics in relation to CPUs, I immediately think
|> >scam.
|>
|> Manufacturing a CPU on silicon was an enormous scam.
|> Just melt worthless sand into tiny wafers and sell each one for $1,000.

A long time back, someone wrote an article about a forthcoming
silicon shortage if computer use kept expanding.

|> Seriously, here's an intriguing article mentioning
|> Intel, AMD, FreeScale, and Transmeta regarding photonics:
|> http://www.extremetech.com/article2/0,1558,1779951,00.asp

Sigh. Most of that is about the electro-optical converters,
which are produced in large numbers today but are not scalable.
If Luxtera or anyone else can manage to integrate those with
CPUs, it would make a massive difference to interconnects and
might even be used inside a chip to reduce latency.

|> From design through tape-out, a new x86 CPU based on CMOS technology
|> will take three years minimum to develop. A photonic x86 will take
|> much longer, maybe 5-7 years, and that depends on the rate of
|> advancement of photonic technology.
|> Sounds like the classic gamble for Silicon Valley VCs.

Complex optical logic is another game entirely. Yes, maybe 5-7 years.
But also maybe 50-70. Sane people don't believe tight schedules for
developing new technology that is known to be difficult.


Regards,
Nick Maclaren.
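
A quick back-of-envelope on the on-chip latency point: even at the
speed of light, a signal does not get far in one cycle of a very fast
clock. The clock rate, waveguide index, and die size below are all
assumptions for illustration:

    /* How far does a signal travel per clock cycle? All figures
     * are assumed for illustration. */
    #include <stdio.h>

    int main(void)
    {
        const double c       = 3.0e8;  /* speed of light in vacuum, m/s */
        const double n_guide = 2.0;    /* assumed effective index of an on-chip waveguide */
        const double freq    = 100e9;  /* hypothetical 100 GHz clock */
        const double die     = 0.020;  /* assumed 20 mm die edge, m */

        double per_cycle = (c / n_guide) / freq;   /* metres per cycle */

        printf("signal travels %.1f mm per cycle\n", per_cycle * 1e3);
        printf("crossing a %.0f mm die costs ~%.0f cycles\n",
               die * 1e3, die / per_cycle);
        return 0;
    }

Under those assumptions a signal covers about 1.5 mm per cycle, so
even an optical path across the die is more than a dozen cycles away.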
 
|> |> From design through tape-out, a new x86 CPU based on CMOS technology
|> |> will take three years minimum to develop. A photonic x86 will take
|> |> much longer, maybe 5-7 years, and that depends on the rate of
|> |> advancement of photonic technology.
|> |> Sounds like the classic gamble for Silicon Valley VCs.

|> Complex optical logic is another game entirely. Yes, maybe 5-7 years.
|> But also maybe 50-70. Sane people don't believe tight schedules for
|> developing new technology that is known to be difficult.
Jeez! After we get nuclear fusion tackled, we won't need photonics.
Processors can then scale to 1.21 GW. Indeed, my money would be on
Mr. Fusion first. After all, it has a 50-year head start.
 
Can you say "bling bling"?
I just hope they got themselves lots of room for those flashlights and
mirrors.

I understand flashlights (when the power company wants the bill paid), but
mirrors?
 
Evgenij Barsukov said:
Intel recently demonstrated an in-silicon infrared laser, so I think
it will not remain this way much longer. There are not many other
ways to go if you want to keep increasing bandwidth inside the chip.

Regards,
Evgenij

The problem is not (or mostly not) the generation of the light; it is
the switching based on the light. Pretty much all the technologies
available to do this suck in one way or another.
And, of course, it's pretty much irrelevant whether they get a
photonic CPU to work or not. Hook up your fancy 100 GHz CPU to
existing RAM, and the results aren't going to impress anyone much.

Maynard
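
To put rough numbers on Maynard's point, a minimal sketch of a
CPU-plus-fixed-DRAM runtime model. The miss rate, core CPI, and DRAM
latency below are invented for illustration:

    /* If the core gets faster but DRAM latency stays fixed, memory
     * stalls dominate. All figures are assumptions. */
    #include <stdio.h>

    static double runtime(double freq_hz)
    {
        const double insts  = 1e9;    /* instructions in the workload */
        const double cpi    = 1.0;    /* assumed core cycles per instruction */
        const double misses = 0.01;   /* assumed cache misses per instruction */
        const double t_dram = 50e-9;  /* assumed fixed DRAM latency, 50 ns */

        return insts * cpi / freq_hz + insts * misses * t_dram;
    }

    int main(void)
    {
        double t_slow = runtime(3e9);    /* a 3 GHz electronic CPU */
        double t_fast = runtime(100e9);  /* the fancy 100 GHz CPU */

        printf("3 GHz:   %.2f s\n", t_slow);
        printf("100 GHz: %.2f s\n", t_fast);
        printf("speedup: %.1fx, not 33x\n", t_slow / t_fast);
        return 0;
    }

Under those assumptions the 33x clock increase buys only about a 1.6x
speedup, which is Maynard's point in miniature.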
 
keith said:
Jeez! After we get nuclear fusion tackled, we won't need photonics.
Processors can then scale to 1.21 GW. Indeed, my money would be on
Mr. Fusion first. After all, it has a 50-year head start.

Well, if you have a 50-year head start but are nowhere near the
finishing line, perhaps you are running in the wrong direction?

:-)

-k
 
The problem is not (or mostly not) the generation of the light; it is
the switching based on the light. Pretty much all the technologies
available to do this suck in one way or another.
And, of course, it's pretty much irrelevant whether they get a
photonic CPU to work or not. Hook up your fancy 100 GHz CPU to
existing RAM, and the results aren't going to impress anyone much.

Hey, you could hook the bugger up directly to some of that holographic
memory I've been hearing great things about... for the past x0 years
(choose x according to your age).
 
fammacd=! said:
Hey, you could hook the bugger up directly to some of that holographic
memory I've been hearing great things about... for the past x0 years
(choose x according to your age).

I first heard about holographic storage "proposed"[*] for the
Illiac IV in the mid '60s.

[*] not sure it was serious
 
Hey, you could hook the bugger up directly to some of that holographic
memory I've been hearing great things about... for the past x0 years
(choose x according to your age).

You mean diamond/ruby/gas based? The one working in labs today?
 
In comp.arch Maynard Handley said:
And, of course, it's pretty much irrelevant whether they get a photonics
CPU to work or not. Hook up your fancy 100GHz CPU to existing RAM, and
the results aren't going to impress anyone much.

That was what came to mind here.

If you ignore that, however, and also assume that a first-generation
photonic product wouldn't be able to support the complexity of a
modern x86 chip, would there be a market for a 100 GHz 8086? Or an
80386, etc.? Would you be willing to run Windows 3.1 again if it ran
20x faster than XP on a traditional modern x86?

G.
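
For what it's worth, the raw instruction-throughput arithmetic on G.'s
question, ignoring the memory problem above. The CPI and IPC figures
are rough assumptions:

    /* Does a 100 GHz 8086-class core beat a modern superscalar x86
     * on raw instruction throughput? Figures are assumptions. */
    #include <stdio.h>

    int main(void)
    {
        double photonic = 100e9 / 10.0;  /* assumed ~10 cycles per 8086-class instruction */
        double modern   = 3e9 * 2.5;     /* assumed 3 GHz superscalar at ~2.5 IPC */

        printf("100 GHz 8086-class: %.1f GIPS\n", photonic / 1e9);
        printf("3 GHz modern x86:   %.1f GIPS\n", modern / 1e9);
        return 0;
    }

On those assumptions the two are in the same ballpark (10 vs 7.5
GIPS), and each 8086 instruction does far less work, so the 20x
figure would depend entirely on the workload.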
 