photonic x86 CPU design

  • Thread starter: Nathan Bates
If you ignore that however and also assume that a first generation
photonic product wouldn't be able to support the complexity of a
modern x86 chip, would there be a market for a 100GHz 8086? Or
80386, etc.? Would you be willing to run Windows 3.1 again if it
ran 20x faster than XP on a traditional modern x86?

G.

It doesn't work that way. The question is based on lots of false
premises.

DS
 
Gavin said:
That was what came to mind here.

If you ignore that however and also assume that a first generation
photonic product wouldn't be able to support the complexity of a
modern x86 chip, would there be a market for a 100GHz 8086? Or
80386, etc.? Would you be willing to run Windows 3.1 again if it
ran 20x faster than XP on a traditional modern x86?

The question would be irrelevant if it reflected the reality of the
hardware. Given a CPU 20x faster than the state of the art at the time,
you could run an x86 emulator on it and the native instruction set
wouldn't matter; it would still be faster than current hardware. If it
ran some subset of x86 natively, that's a bonus, not a requirement. You
don't say "cost effective", but I think the assumption is there; 20x
faster in any usable way would probably do it.

If a photonic computer of reasonable speed and cost existed, I suspect
there would be a quick port of at least one O/S to it, and x86 would
become history in the high-end market. Clearly, if you port something
like Linux you get a lot of tools which either just work or work with
little effort, because 32-bit and 64-bit versions are already in use
and many of the portability problems have been fixed.

You could assume that any new CPU being built today would be
little-endian (LSB first) and two's complement, so even though some
programs bake in those assumptions, they are unlikely to fail.
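
A quick sketch of checking both assumptions in plain C (nothing here is
specific to any particular CPU, it's just the kind of probe a port
would run early on):

    #include <stdio.h>
    #include <string.h>
    #include <limits.h>
    #include <stdint.h>

    /* Probe the two assumptions above: byte order and two's-complement
     * negatives.  Purely illustrative; any hosted C compiler will do. */
    int main(void)
    {
        uint32_t word = 0x01020304;
        unsigned char *bytes = (unsigned char *)&word;

        if (bytes[0] == 0x04)
            printf("little-endian (LSB first)\n");
        else
            printf("big-endian (MSB first)\n");

        /* Reinterpret the bit pattern of -1; all ones means two's
         * complement. */
        int neg = -1;
        unsigned int bits;
        memcpy(&bits, &neg, sizeof bits);
        if (bits == UINT_MAX)
            printf("two's-complement negatives\n");
        else
            printf("not two's complement\n");

        return 0;
    }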

I believe 32 bit Windows was ported to Alpha, so given a reason it could
be ported to a new CPU as well. I'm not sure that even a chance to own
the workstation market would be reason enough, though; the Alpha market
was too small to support itself, and most buyers don't spend the money
to buy the fastest CPU available now.

The real question is whether this new system would be so fast that people
would go to the effort of porting anything BUT Linux, since it would have
to make sen$e to do so. Linux would get ported because (a) there are
lots of recent ports to serve as examples, (b) people will do it to
prove they can, and (c) you can get a grad student to do almost anything
for a little money and a thesis topic.
 
Nick said:
|> >As soon as I hear photonics in relation to CPUs, I immediately think
|> >scam.
|>
|> Manufacturing a CPU on silicon was an enormous scam.
|> Just melt worthless sand into tiny wafers and sell each one for $1,000.

A long time back, someone wrote an article about the forthcoming
silicon shortage, if computer use kept expanding.

|> Seriously, here's an intriguing article mentioning
|> Intel, AMD, FreeScale, and Transmeta regarding photonics:
|> http://www.extremetech.com/article2/0,1558,1779951,00.asp

Sigh. Most of that is about the electro-optical converters,
which are produced in large numbers today but are not scalable.
If Luxtera or anyone else can manage to integrate those with
CPUs, it would make a massive difference to interconnects and
might even be used inside a chip to reduce latency.

My first thought was HyperTransport. But 10GHz isn't that fast... people
are using that for ethernet in labs, based on silicon.
 
So how exactly does holographic memory work?

Do you mean "how is it *supposed* to work"?

Remember, you can cut a hologram in half and still have a complete
picture, though the S/N ratio is reduced. With this in mind, it's
really a method of spreading information over a wide area. Many
redundancy methods do similar things. It does sound kewll though. ;-)
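
The cut-it-in-half property is roughly what you get if every bit is
spread across many cells and read back by voting on the survivors. A toy
sketch of that idea follows (only an illustration of the redundancy, not
how any real holographic medium encodes data):

    #include <stdio.h>
    #include <string.h>

    /* Spread every bit of a byte across many cells of a "medium", then
     * erase half the medium and recover the byte by majority vote.
     * Loss shows up as reduced margin, not as missing bits -- roughly
     * the hologram-cut-in-half behaviour. */

    #define COPIES 16
    #define CELLS  (8 * COPIES)

    int main(void)
    {
        unsigned char data = 0xA7;
        signed char medium[CELLS];

        /* Encode: copy i of bit b lands at cell i*8 + b, so copies of
         * the same bit are scattered rather than adjacent. */
        for (int i = 0; i < COPIES; i++)
            for (int b = 0; b < 8; b++)
                medium[i * 8 + b] = ((data >> b) & 1) ? 1 : -1;

        /* Damage: wipe the second half of the medium (0 = "unknown"). */
        memset(medium + CELLS / 2, 0, CELLS / 2);

        /* Decode: sum the surviving copies of each bit, take the sign. */
        unsigned char out = 0;
        for (int b = 0; b < 8; b++) {
            int vote = 0;
            for (int i = 0; i < COPIES; i++)
                vote += medium[i * 8 + b];
            if (vote > 0)
                out |= 1u << b;
        }
        printf("in 0x%02X  out 0x%02X\n", data, out);
        return 0;
    }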
 
Well, if you have a 50 year head start, but are nowhere near the
finishing line, perhaps you are running in the wrong direction?

:-)

Well, CNF hasn't been around 50 years. Perhaps you'd rather invest in
that? ;-)
 
My first thought was HyperTransport. But 10GHz isn't that fast... people
are using that for ethernet in labs, based on silicon.

With how many NAND4s in between clocks? GHz <> GHz.
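
To put rough numbers on that (the 5 ps NAND4 delay and the logic depths
below are made-up round figures, not anyone's real process data):

    #include <stdio.h>

    /* Clock rate implied by logic depth, given an assumed per-gate delay.
     * Shows why a 10 GHz link clock and a 10 GHz CPU clock are very
     * different claims. */
    int main(void)
    {
        double gate_delay_ps = 5.0;      /* assumed NAND4-ish delay */
        int depths[] = { 2, 12, 30 };    /* serdes-like, short pipe, deep pipe */

        for (int i = 0; i < 3; i++) {
            double cycle_ps = depths[i] * gate_delay_ps;
            printf("%2d gates/cycle -> %6.1f ps cycle -> %5.1f GHz\n",
                   depths[i], cycle_ps, 1000.0 / cycle_ps);
        }
        return 0;
    }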
 
Yeah but how does laser-burning pictures onto photographic plates
relate to computer memory, especially RAM?

Yousuf Khan
 
Yousuf said:
Yeah but how does laser-burning pictures onto photographic plates
relate to computer memory, especially RAM?
I think the practical application would be backup: some medium fast
enough and cheap enough to take regular backups with.

I can buy a TB of IDE disk for ~$1k, but I can't buy a practical backup
system any cheaper than another two (or more) sets of drives that are
removable in some way. To balance the cost of backup against the cost of
main storage, I would guess reusable media would need to be ~$100/TB,
and use-once media no more than $25-35/TB, so you could sell it into the
home/SB environments.
 
Keith said:
Pictures are made from pixels, which are bits.


The application then was mass storage.

OIC, so it's not so much a replacement for RAM or Flash, but more a
replacement for CD/DVDs, etc.?

Yousuf Khan
 
YKhan said:
OIC, so it's not so much a replacement for RAM or Flash, but more a
replacement for CD/DVDs, etc.?

The designs I saw could be considered that, as long as you are talking
about the RW varieties of those devices. They used crystals of something
like lithium niobate to record the data. They also had some "odd" access
characteristics. When you flashed the laser to read a block of data, you
got a huge chunk (~100 MB) at a time, with an extraordinary transfer
rate, but it put so much energy into the system that it had to cool down
for several seconds before being accessed again. Of course, this was
just one proposed device and AFAIK, it never made it past the prototype
stage.
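
Plugging in made-up numbers for the burst rate and the cool-down (only
the ~100 MB block size comes from what I remember of the design), the
sustained rate comes out far less impressive than the burst:

    #include <stdio.h>

    /* Sustained vs. burst rate for a read-then-cool-down device.  The
     * 10 GB/s burst rate and 5 s cool-down are placeholders; only the
     * ~100 MB block size is taken from the description above. */
    int main(void)
    {
        double block_mb   = 100.0;     /* data per laser flash     */
        double burst_mbps = 10000.0;   /* assumed burst rate, MB/s */
        double cooldown_s = 5.0;       /* assumed recovery time    */

        double read_s    = block_mb / burst_mbps;
        double sustained = block_mb / (read_s + cooldown_s);

        printf("burst     : %8.1f MB/s\n", burst_mbps);
        printf("sustained : %8.1f MB/s (one %g MB block per %.2f s)\n",
               sustained, block_mb, read_s + cooldown_s);
        return 0;
    }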
 