The end of the road for the DIY PC?

  • Thread starter: Yousuf Khan

Yousuf Khan

Intel has announced that they will stop making replaceable CPUs after
Haswell. From now on, all CPUs are supposed to be in BGA packaging,
which means you can only attach CPUs to the motherboard by soldering
them on. You won't be seeing that in any home DIYer's toolkit, so it's
the end of the road for that upgrade mechanism.

I've been upgrading my original system since 1987, and right now there
are no original pieces remaining in it, but I can trace each of the
pieces back in a chain to the original 8088 PC-XT clone that I bought
back then. I suppose it was bound to happen: not many people build
their own PCs anymore, and for many years now it's been cheaper to buy
a full brand-new system than to upgrade an old one.

Although this is just an Intel announcement and AMD hasn't said it
would do the same thing, I don't see AMD not following suit; it would
help their financial situation too, probably even more than Intel's.

I suppose you could keep upgrading if you buy a full new motherboard
alongside your CPU, though you'd probably have to buy new memory as well.

Yousuf Khan

Intel’s Haswell Could Be Last Interchangeable Desktop Microprocessors -
Report - X-bit labs
http://www.xbitlabs.com/news/cpu/di...hangeable_Desktop_Microprocessors_Report.html
 
Yousuf said:
Intel has announced that they will stop making replaceable CPUs after
Haswell. From now on, all CPUs are supposed to be in BGA packaging,
which means you can only attach CPUs to the motherboard by soldering
them on. [...]

There's always a solution.

Remember that Foxconn makes their own sockets for motherboards, and
they also make motherboards. The motherboard industry could cook up
a flexible solution all on their own.

There are a ton of cheesy adapters out there, and lots of opportunity
for someone to come up with a solution. All that's needed is sufficient
lead time to do the engineering and make it reliable.

http://www.primedistributing.com/ProductImages/enplas/p-PA-BGA-SMT.jpg

And if Intel makes tested silicon die available as a purchase option,
someone can package them at an MCM factory and put any kind of lead
or contact on them that you want.

This is just an opportunity for someone - a middle man - to make some cash.

Paul
 
Yousuf said:
Intel has announced that they will stop making replaceable CPUs
after Haswell. From now on, all CPUs are supposed to be in BGA
packaging,

My first PC (besides a couple of pocket computers) was a real IBM PC,
circa 1983. I upgraded that as much as I could over three years (but
never had a hard drive), and my next PC in 1987 was a 24 MHz
Harris-80286-based motherboard made (IIRC) by Western Digital, with an
integrated MFM (or was it RLL?) Winchester controller. I never did own
any 386-based boards; I moved directly to various 486 boards, skipped
the Pentium 1, and instead owned various P2 and P3 boards (slocket,
370, etc.). Was never into AMD.

I use Windows 98SE (fortified with KernelEx) for my primary systems,
based on 2.x GHz Socket 478 Pentium and Celeron CPUs. I have several
ASRock/VIA-based Socket 775 boards waiting to be used as my next-gen
systems (I still plan on running Win98 on them). I build the PCs at
$dayjob, and our developers have Gigabyte Socket 775 boards with 4-core
CPUs and Nvidia 7k or 8k video cards with dual displays, and don't
really need anything more. They run Windoze 7 (some still run XP), and
we probably will never migrate those to Windoze 8 or beyond (those guys
will retire before Windoze 7 becomes obsolete). Heck, our NNTP and HTTP
servers still run NT4 on P2-800 Gigabyte Intel 440BX motherboards -
24/7/365 for more than 10 years now.

For the general-purpose SOHO desktop, Socket 478 was a real workhorse
for years, and Socket 775 didn't really seem to hang around for long -
but I seriously doubt that the average person (SOHO, institutional, or
corporate) needs more CPU (not counting portable use or gamers).

So basically I've been out of touch with motherboards since maybe
Q3 2007, but I seriously doubt I've missed anything in the past 5
years, except that the boards are getting more colorful and have
sick-looking heat sinks.
 
Intel has announced that they will stop making replaceable CPUs after
Haswell. From now on, all CPUs are supposed to be in BGA packaging,
which means you can only attach CPUs to the motherboard by soldering
them on. [...]

What good does it do them to make it so you can't install a CPU? I
would think that would cut CPU sales!
 
Loren said:
What good does it do them to make it so you can't install a CPU? I
would think that would cut CPU sales!

There are other packaging concepts, such as putting the CPU
on a removable card. This might not be that easy to do for
a P4, but for processors with QPI or DMI it might work out.

http://www.asrock.com/mb/spec/upgrade.asp?Model=939CPU Board

That makes the failure group a bit smaller. And if the
edge connector was standardized, it might allow swapping the
processor card between systems.

That does screw up cooling, in that existing heatpipe coolers
might not fit in the confined space available.

It's just a matter of where you move the interface point.
BGA to PGA adapter plus PGA socket. Processor card with
socket for FSB bus, whatever it happens to be.

Paul
 
Yousuf Khan said:
Intel has announced that they will stop making replaceable CPUs after
Haswell. From now on, all CPUs are supposed to be in BGA packaging,
which means you can only attach CPUs to the motherboard by soldering
them on. You won't be seeing that in any home DIYer's toolkit, so it's
the end of the road for that upgrade mechanism.

and a whole bunch more at
http://www.youtube.com/results?search_query=bga+soldering

What, you mean you don't have a heat gun in your electronics toolbox
along with the soldering iron, or a hot-air station sitting on the
shelf? The spouse will get pissed if you don't clean up the pancake
griddle after using it to remove and resolder the BGA parts. You must
have soldering wick, though. It just means you'll have to pull those
old-school soldering techniques out of your back-store memory and learn
how to desolder and solder BGA parts.

You're just spoiled by sockets that made it possible for home users
with no (or destructive) soldering skills to add components to a mobo.
Maybe the parts vendors are getting tired of the returns from boobs who
don't employ anti-static measures, who overclock, overheat, or
otherwise destroy good parts. Soldering on the CPU, chipset, memory,
and other components would certainly up the reliability of the assembly
while reducing returns from ignorant, lazy, or sloppy users.

That wouldn't prevent first-time soldering of the CPU onto the BGA grid.
The mobo maker could just make a plastic frame to hold the chip in place
(both for position along with affixing to the mobo via spring clip) and
the user would use a soldering iron with a tip designed for the BGA grid
pattern. The user would buy the mobo they want, the CPU they want, and
then do a one-time solder of the CPU onto the mobo. After all, after
you buy the mobo and CPU and put them together, how often have you
actually replaced the CPU? Yeah, if the CPU goes bad then you have to
replace it but have you had to do so? When the CPU gets too old,
underpowered, or lacking in firmware features, do you really replace
just the CPU or do you replace the CPU, mobo, memory, and the whole
smash to upgrade to newer hardware?

Also, you can already buy mobo+CPU combos from online vendors. Most
times they pre-install the CPU, so all you have to do is attach the
heatsink+fan (and sometimes you don't have to do that if you stay with
the stock HSF for the CPU). So instead of them sliding the CPU into the
ZIF socket for you, they'll have an inventory of pre-soldered
combinations and you pick one to buy.
 
Apple was/is like that: limited options for changing out hardware. If
Intel completely removes the DIY aspect of a PC, then they are handing
business over to Apple. Also, a lot of third-party vendors will
probably close up shop.

Fixed hardware + a Bing OS (aka Windows 8) = a fast-declining PC
industry. Sounds like the 1990s Atari ST and Amiga all over again.
 
Sounds like the 1990's Atari ST, AMIGA all over again.

Really, every computer. Sure, you could buy an S-100 bus system in the
early days, but there was limited ability to upgrade, despite all the
boards plugging into a motherboard that was nothing but sockets.

It was easy to move to the Z80 from the 8080. But the bus was very much
tied to the 8080, so "foreign" CPUs took a lot of adapting. Even the
front panel on the Altair was too specific to the 8080 to be useful
with another CPU. The standardization was often because of CP/M, the
operating system: since it was written to keep the I/O in a small
section, one could fairly easily adapt it to other hardware (as long as
it used the 8080).

So the real upgrade path was the 16-bit CPU, preferably the 8088 or
8086. But then there were other issues besides differing bus signals,
such as a lack of address lines for more RAM. There were various
schemes to deal with that, but it took time before standardization set
in, and by then it was mostly too late.

When MITS came out with a 6800-based computer in the fall of 1975, they
put a different bus on it, and when SWTP put out their computer (which
was a far more successful 6800 system than the MITS one), it used yet
another bus (though that bus tended to be used by other 6800-based
computers).

The Digital Group, which was more like a hobby trying to turn into a
commercial product, used its own bus, which made it easy to have
different CPU boards, but they never went further than the Z80 and
maybe the 6502.

The Apple II was fairly flexible, so one could get Z80 cards for it,
then later 6809 cards, and at some point 68000 cards. But they were
workarounds, and usually the 6502 did the I/O.

Let's not forget that the original IBM PC was no different from the
Amiga or Atari. All three had CPUs in sockets, but there was no plug-in
replacement that made things faster. You could work around that, but it
would need a whole board. And you'd be stuck with the existing clock
frequency unless you had complicated timing methods (to run the CPU
faster but the bus at its regular rate).

It was only with time that the "IBM PC" became more flexible. And that
was more a crossover between the CPUs and the motherboard manufacturers.
So you could put in a faster CPU, but that's because the motherboard
company anticipated faster speeds and put in jumpers. That meant the CPU
companies had to keep the other companies informed of where they were
going.

In the 386 era there was some level of variability, so you could get a
cheaper one that had no math coprocessor built in (and oddly then find a
math coprocessor to add later).

It was really only in much more recent times that a motherboard had
some hope of being usable over time, and that was because the CPUs
generally stopped changing that much, speed being the key variable. If
the motherboard anticipated upgrades, and the CPU kept the same package
and other features, then you could use the motherboard for a few years.
Usually a new motherboard was needed if the data bus bumped up in size,
the eventual exception being the 32-to-64-bit upgrade.

Otherwise, it would be no different from the Amiga or Atari, except
that by that point nobody was making CPUs to plug into the expansion
bus (I once found an 80286 card that did that), so you had to replace
the motherboard. But then, the motherboard probably cost about as much
as one of those plug-in upgrade boards had in the past, and the new
motherboard didn't have to compromise. The one good thing was that the
case was generally generic, so the new motherboard fit (well, so long
as the area for connectors at the back matched up or could be
replaced).

Michael
 
What good does it do them to make it so you can't install a CPU? I
would think that would cut CPU sales!

After speaking to my son-in-law, who works at Intel doing CPU design,
this will only be for mobile (laptop) machines. Desktops will remain as
they are now. For some reason, Intel feels this will benefit foreign
markets more than domestic. I wasn't interested in why that is, so I
didn't ask him.
 
Intel has announced that they will stop making
replaceable CPUs after Haswell. From now on, all CPUs
are supposed to be in BGA packaging, which means you
can only attach CPUs to the motherboard by soldering them on.

I've seen BGA sockets for a long time, but I don't know how close their
pins are. Also, what's really the difference between BGA and the LGA
packages Intel currently uses?

OTOH some motherboard-CPU combination deals are sold at the price of the CPU alone.
 
After speaking to my son-in-law, who works at Intel doing CPU design,
this will only be for mobile (laptop) machines. Desktop will remain as
it is now. For some reason Intel feels this will benefit foreign
markets more than domestic. I wasn't interested in why that is so I
didn't ask him.

Ok, that makes a lot more sense.

One rarely messes with a laptop CPU anyway. Removing the socket makes
it a bit smaller and lighter, things that matter in laptops.
 
I've seen BGA sockets for a long time, but I don't know how close their
pins are. Also, what's really the difference between BGA and the LGA
packages Intel currently uses?

BGA parts come prepared with the solder already in place, ready to be
laid onto a supporting component so that many contacts can be soldered
at once (usually in industrial assembly, or on a wave station) -
although I don't see in principle what precludes someone with a working
familiarity with wave stations from, say, cutting off every pin from an
AMD CPU, sitting them in the socket, and laying an isolated blob of hot
solder on every pin stub of the CPU, remelting into solid contacts with
the pins placed in the socket. LGA is sort of the reverse of PGA: it
turns the pins into miniature springs (elbows) that accept tension from
the mounting bracket and lever arm as the utterly flat CPU is lowered
onto, and then pressed into, the Land Grid Array.

I've personally straightened out bent LGA pins, by the way. Why would I
do such a thing? A computer that won't boot, and at least a horological
hour under a jeweler's loupe with a fine sewing pin and a razor blade,
is obviously great fun.

A pretty draconian statement for Big MT to whittle down on size
thankfully into a nibble, excuse me while I burp.
 
The mobo maker could just make a plastic frame to hold the chip in place
(both for position along with affixing to the mobo via spring clip) and
the user would use a soldering iron with a tip designed for the BGA grid
pattern. The user would buy the mobo they want, the CPU they want, and
then do a one-time solder of the CPU onto the mobo.
[...]

That's some funny stuff right there.

Unless you're serious, of course...

Cheers!
 
daytripper said:
[...]
That's some funny stuff right there.

Unless you're serious, of course...

Cheers!

I was serious. You do know what "ball" means in BGA, right? It's a
ball of solder. So why can't the chip, even a CPU, come prepped with
the balls of solder on its pads, and the mobo come with balls of solder
on its grid, using feedthroughs so the solder can be reached from the
backside of the board? Then all you have to do is keep the chip pressed
against the grid, keep it aligned, heat up the soldering gun with a
matching grid tip, and melt all the solder to weld the chip to the
grid.

You've never applied new solder to the underside of a PCB so that it
heats the solder on the other side through a feedthrough, letting you
use solder wick on a side you cannot otherwise reach with a soldering
iron? Heat travels.

Of course, we're talking about DIYers who know how to solder, who know
that solder flows towards the heat source and what level of heat to
apply - not the boobs who barely know how to push down the lever on a
ZIF socket. Not having sockets doesn't mean you can't DIY. It means the
DIYer will need better skills than pushing stuff into a socket or slot.
 
BGA is prepared with solder already placed and ready to lay into a
supportive component for soldering multiples of contacts (usually in
industrial assembly practices or on a wave station);-

BGAs are reflowed, not wave soldered. A hot atmosphere slowly brings
the temperature of the components and board towards the melting point
of the solder, and then the temperature goes above the melting point
for a few seconds. The surface tension of the liquefied solder holds
the package in place for the short time it takes for the solder to
freeze as the temperature is brought back down.

Wave soldering is used for through hole components, and 'back-side'
surface mount parts (after they are affixed with adhesive).
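The ramp-soak-peak-cool shape described above can be sketched as a
piecewise-linear profile. The segment temperatures and durations below
are illustrative assumptions in the ballpark of common lead-free
processes, not a vendor spec; real numbers come from the solder-paste
datasheet:

```python
# Piecewise-linear sketch of a reflow oven profile:
# (stage name, start temp C, end temp C, duration s).
# All values are illustrative assumptions, not a real spec.
profile = [
    ("preheat", 25, 150, 90),
    ("soak",    150, 180, 90),
    ("ramp",    180, 245, 60),   # rises through the ~217 C lead-free liquidus
    ("cool",    245, 50, 60),
]

def seconds_above(profile, threshold_c):
    """Total time the linear segments spend above threshold_c."""
    total = 0.0
    for _name, start, end, dur in profile:
        lo, hi = min(start, end), max(start, end)
        if hi <= threshold_c:
            continue  # segment never exceeds the threshold
        # fraction of the segment spent above the threshold
        frac = 1.0 if lo >= threshold_c else (hi - threshold_c) / (hi - lo)
        total += frac * dur
    return total

print(seconds_above(profile, 217))  # time above liquidus for this sketch
```

With these numbers the package spends roughly half a minute above the
liquidus, which is the window in which the surface tension effect
described above does its work.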
 
Wave soldering is used for through hole components, and 'back-side'
surface mount parts (after they are affixed with adhesive).

I've a couple of welders in the garage and have run into that, except
I'll use magnets, clamps, stick and a monkey, or whatever else to hold
it in place while getting a MIG tip or flux rod to heliarc into a
through-hole if it's first been established for that purpose.
Interesting though to know. Never can tell when a garbage can for all
the junk mail will turn into welding up a handy-dandy USPS mailbox
project.
 
VanguardLH said:
[...]
So why can't the chip, even a CPU, come prepped with the balls of
solder on its pads, the mobo come with balls of solder on its grid, and
all you have to do is keep the chip pressed against the grid, keep it
aligned, heat up the solder gun with a matching grid tip, and just melt
all the solder to weld the chip to the grid?
[...]

You at least want to solder all the balls at the same time.

There is a magic alignment effect, where the wetted contacts
tend to "pull" the chip into alignment, such that the chip
rotates to the grid of contacts underneath. You want the
solder to fill the pads properly, which is going to
happen if all the balls melt at the same time
and the chip settles into place.

If you were a home user, and desperate for adventure,
you could try a toaster oven. That's the closest thing
to IR reflow you can arrange for real cheap. Some people
used the toaster oven method, to fix Nvidia GPU solder joints.
But I would still put this idea in the "repugnant" category.
You have absolutely no control of the temperature profile
that way, and the toaster oven is going to be heating all sorts
of stuff you don't want heated (think "burned plastic").

At the factory, they use an X-ray machine to verify BGA soldering. On a
processor, two-thirds of the balls could be VCC and GND, and those
wouldn't be candidates for boundary-scan verification. An X-ray can
uncover balls damaged by the "popcorn" problem, for example. And more
than one X-ray is taken: by holding the X-ray machine at an angle, you
can photograph the balls from either side. No home user would be able
to verify the solder job was completed properly. You wouldn't want to
burn some power connections because too many VCC or GND pins were open
circuit.

http://upload.wikimedia.org/wikipedia/commons/thumb/6/6c/BGA_joint_xray.jpg/600px-BGA_joint_xray.jpg

(Voids caused by excessive moisture level on the solder balls.)

http://glenbrook.webworksnow3.com/blog/wp-content/uploads/2011/03/bga_fig4.gif

With care, I'm told you can get defectivity down to around 1 ball in
100,000. That means that if you solder down a hundred chips, each
having 1000 balls underneath, on average one of the chips will have a
single bad solder joint. It would take a little effort and expense,
though, to get that good at it. The results of home users doing such
soldering aren't going to be that good.
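The 1-in-100,000 figure above works out as straightforward
expected-value arithmetic; a quick sketch (the defect rate and the
100-chips/1000-balls counts are the numbers from the paragraph above,
nothing else is assumed):

```python
# Expected bad joints at a per-ball defect rate of 1 in 100,000.
p_bad = 1e-5            # probability a single ball joint is defective
balls_per_chip = 1000
n_chips = 100

total_joints = balls_per_chip * n_chips      # 100,000 joints
expected_bad = p_bad * total_joints          # 1.0 bad joint on average

# Chance that every joint on every chip comes out good:
p_all_good = (1 - p_bad) ** total_joints     # ~0.37, roughly e**-1

print(expected_bad, round(p_all_good, 2))
```

So "one of the chips will have a bad joint" holds on average; a bit
over a third of the time the whole batch of a hundred boards reflows
with no bad joint at all.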

Paul
 
VanguardLH said:
[...]
You've never applied new solder to the underside of a PCB so it heats
the solder on the other side through a feedthrough to use solder wick
on the other side when you cannot otherwise reach the other side with a
soldering iron? Heat travels.
[...]
If you knew how to solder, you would know a solder wick is used to
desolder.
 
GMAN said:
[...]
If you knew how to solder, you would know a solder wick is used to
desolder.

Or merely to remove excess solder....

Chris
 
GMAN said:
[...]
If you knew how to solder, you would know a solder wick is used to
desolder.

Or merely to remove excess solder....

Chris

True, but if your skill level is so poor that you need to remove a lot
of excess solder, you shouldn't really be doing it in the first place.
 