It's All IBM

  • Thread starter: Arapahoe

Arapahoe

When you really boil it down...in the last 60 years of computing...it's
really all been IBM.

I mean, I've been reading this book "Hackers" which goes through all the
*underground* computer movements and DEC, and Altair and probably if I
bother to keep reading, Apple and so on.

But you know, as I see it -- basically computing is divided into 3 epochs.

I) The IBM Mainframe - 1950 to 1980

The Mainframe was all that really mattered in computing for those 30
years. You could take all the /hacking/ and DEC and minicomputers and
flush them down the toilet and what would be left is the real computing.
The computing of commerce and industry that really /computerized/
America. The computing of Defense and Simulation.

And 99 percent of that was on IBM mainframes.


II) The IBM PC - 1980 to 2000

The basic architecture of the PC that is used by 96 percent of the world
was developed by IBM -- and that architecture is still in use today.

The IBM PC is 2nd to the mainframe in design and importance in
computing. Apple is about as important as the Altair. A trivial
implementation, before IBM came up with the machine that Corporate
America really wanted.


III) IBM and Linux, 2000-???

Oh, did you notice that I left Microsoft off this short list? That is
because IBM was planning a PC with a multitasking OS as early as the
1960s. They called it basically a Smart Terminal because it could
offload some of the server processing. And that's about what most PCs
do, because real processing is done on servers. As far as Microsoft
goes, I see them as a transitory period -- basically allowing PC
hardware to mature to the point that UNIX -- a real OS -- could be
allowed to run on PCs. And now it does.
 

"Hackers" is a great book....

Read a book called "Big Blues" to learn the rest of the story.

IBM was a big, batch-processing monolith that worked FUD to the limit.
Read about the IBM consent decree.
Antitrust + IBM is another good topic.

IBM never made the "best" computers; DEC was always better, but IBM had
more money.
Lots more money.
DECs were used in transaction processing, like banks, while IBMs were
still all batch.
 
Arapahoe said:
When you really boil it down...in the last 60 years of
computing...it's really all been IBM.

I mean, I've been reading this book "Hackers" which goes through all
the *underground* computer movements and DEC, and Altair and probably
if I bother to keep reading, Apple and so on.

But you know, as I see it -- basically computing is divided into 3
epochs.

I) The IBM Mainframe - 1950 to 1980

The Mainframe was all that really mattered in computing for those 30
years. You could take all the /hacking/ and DEC and minicomputers and
flush them down the toilet and what would be left is the real
computing. The computing of commerce and industry that really
/computerized/ America. The computing of Defense and Simulation.

And 99 percent of that was on IBM mainframes.
True, because they were a monopoly, just like Microsoft is today....
They drove the others out of the market, or nearly. Hacking back then
had a vastly different meaning than it does today. Do you know the
difference?
II) The IBM PC - 1980 to 2000

The basic architecture of the PC that is used by 96 percent of the
world was developed by IBM -- and that architecture is still in use
today.
I don't have MFM drives or any ISA slots in my computer, and other than the
CPU made by Intel, the architecture of the 'PC' is vastly different.
The IBM PC is 2nd to the mainframe in design and importance in
computing. Apple is about as important as the Altair. A trivial
implementation, before IBM came up with the machine that Corporate
America really wanted.
No, not to IBM; they didn't even really want to fund the project. Apple
and the Altair played a much bigger role at the beginning than IBM. During
the age of kit computers (the Altair and the like), IBM was a joke, since
they tried to market an overpriced kit computer, which flopped big time.
Apple was the FIRST to introduce a computer that YOU didn't have to
assemble yourself.
I wouldn't call either trivial; if anything, IBM was trivial during that
time, at least to computer hobbyists.
III) IBM and Linux, 2000-???

Oh, did you notice that I left Microsoft off this short list? That is
because IBM was planning a PC with a multitasking OS as early as the
1960s. They called it basically a Smart Terminal because it could
offload some of the server processing. And that's about what most PCs
do, because real processing is done on servers. As far as Microsoft
goes, I see them as a transitory period -- basically allowing PC
hardware to mature to the point that UNIX -- a real OS -- could be
allowed to run on PCs. And now it does.

You may not like it, but Microsoft has played a big role in making the PC
affordable, though I would say Compaq did even more. Linux still has a long
way to go in the desktop realm, but it is doing well in the back office
(servers).
Even Windows NT and DOS are real OSs, and you could call Win3.11 a real OS
depending on your definition; do you even know what an OS really does?
I guess UNIX on the PC doesn't exist? Even Microsoft sold Xenix (Unix) a
long, long time ago.... (1983 or so) PCs have been running UNIX for over 20
years now....
 
Patricia said:
IBM was a big, batch processing monolith that worked FUD to the limit.
Read about the IBM descent decree.
Anti-trust+IBM is another good topic.

Well, I wouldn't say that.

For example, CICS is the most used computer interface in the world. It is
the 'green' screen that you typically see in banks, hospitals, etc.

CICS is the interface to MVS, AIX, TSO, VM and so on -- the operating
systems that really run the world.

Here's something else -- CICS is /stateless/ -- as in ... yep, you guessed
it, http. So a CICS session is just like a Web browsing session. Only
IBM beat the web to it by, oh, about two decades :D
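Not actual CICS code, but the stateless pattern is easy to sketch in a few
lines of Python (names entirely hypothetical): the server keeps no memory
between requests, so each request must carry its whole context -- just as a
CICS pseudo-conversational transaction passes a COMMAREA, or an HTTP request
carries its cookies and form fields.

```python
# Toy sketch of a stateless exchange (hypothetical, not real CICS or HTTP):
# the server holds no session state, so every request is self-contained.

def handle_request(request):
    """Handle one self-contained request; nothing is remembered afterwards."""
    balance = request["balance"]          # context arrives with the request
    if request["action"] == "deposit":
        balance += request["amount"]
    elif request["action"] == "withdraw":
        balance -= request["amount"]
    # The updated context goes back to the client, which must
    # resend it with its next request.
    return {"balance": balance}

# Two independent requests; the server remembers nothing between them.
r1 = handle_request({"action": "deposit", "balance": 100, "amount": 50})
r2 = handle_request({"action": "withdraw", "balance": r1["balance"], "amount": 30})
print(r2["balance"])  # 120
```

The same shape shows up in a COMMAREA round trip and in an HTTP cookie: the
state lives on the wire (or at the client), not in the server's memory.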
 
Arapahoe said:
When you really boil it down...in the last 60 years of computing...it's
really all been IBM.
IBM was there, but it hasn't been all IBM. The others contributed quite a
bit, and pushed IBM to do better. Without the others, computers would still
be behemoths in corporate computer rooms with wizards required to figure out
how to make them do whatever needed to be done. There would be no pc
revolution.
I mean, I've been reading this book "Hackers" which goes through all the
*underground* computer movements and DEC, and Altair and probably if I
bother to keep reading, Apple and so on.

But you know, as I see it -- basically computing is divided into 3 epochs.

I) The IBM Mainframe - 1950 to 1980

The Mainframe was all that really mattered in computing for those 30
years. You could take all the /hacking/ and DEC and minicomputers and
flush them down the toilet and what would be left is the real computing.
The computing of commerce and industry that really /computerized/
America. The computing of Defense and Simulation.

And 99 percent of that was on IBM mainframes.

Uh, 99 percent? I don't think so. More like half to two thirds.
And they were having problems in the '80s. For a time it didn't look like
IBM was going to be viable, because their mainframes weren't doing so well
with all the new, smaller machines coming along.

I've personally watched a VAX 11/750 beat out an IBM 3081GX on certain
tasks. This was the second smallest VAX, with 2 MB memory, versus the
largest IBM with about 32 MB. (big at the time). The 3081GX architecture
just wasn't up to doing certain tasks. I could have put about a dozen of
those VAXen into the 3081, and still had room for a few users.

If it weren't for having other machines besides IBM, you wouldn't have UNIX.
IBM felt that batch processing was the way to go, and would not budge from
their opinion. Even their "real time" operating systems were hopped-up
batch systems. Security was a nightmare; JCL was taken to ridiculous
extremes.
II) The IBM PC - 1980 to 2000

The basic architecture of the PC that is used by 96 percent of the world
was developed by IBM -- and that architecture is still in use today.

Sorry, wrong planet. IBM looked at what everyone else was doing (and
eating IBM's lunch, for that matter), and started to do the same thing. They
pretty much copied the architecture of the PC and made it their own.
Notice, their motherboards are Intel. They tried a few things with a
different bus, but that wasn't so great. The only thing IBM did was to
firm up a standard which was evolving quickly. I'm not sure, but I've
heard that the exec who approved trying to make the PC was in a bit of hot
water.

The IBM PC is 2nd to the mainframe in design and importance in
computing. Apple is about as important as the Altair. A trivial
implementation, before IBM came up with the machine that Corporate
America really wanted.

IBM didn't come up with anything new. Apple did. While it is generally
missed, the Tandy computer and VisiCalc probably made the PC revolution get
into high gear. Apple wasn't as available as a Tandy, and VisiCalc finally
made getting a computer worthwhile for business. Apple predated Tandy, but
didn't have the marketing ability of Radio Shack. Apple was also a closed
architecture, while the Tandy system wasn't. Later, others started to
manufacture their own machines. There were hundreds of companies, each
taking Intel chips and motherboards, assembling them, and trying to stay
ahead of the competition. The '80s computer revolution was probably one of
the most exciting periods of technical competition ever. IBM came in on the
tail end of it.

III) IBM and Linux, 2000-???

Oh, did you notice that I left Microsoft off this short list? That is
because IBM was planning a PC with a multitasking OS as early as the
1960s. They called it basically a Smart Terminal because it could
offload some of the server processing. And that's about what most PCs
do, because real processing is done on servers. As far as Microsoft
goes, I see them as a transitory period -- basically allowing PC
hardware to mature up to the point that UNIX -- a reall OS -- could be
allowed to run on PCs. And now it does.
Hmm, you also forgot a lot of other details. You forgot Xerox too. Have
you any idea of what the Xerox contributions were?

The 1960s implementation at IBM was canned because they figured that there
was no reason anyone would need their own computer. They continued this
thought right up until the handwriting was breaking through the wall.

While servers do some processing, most processing is done on individual
machines. The servers tend to handle data flow and coordination between the
various machines. That is why they call it distributed processing.
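A toy sketch of that division of labor (purely illustrative, not any
particular system): the coordinator only splits the data and merges results
-- the coordination role described above -- while the workers do the actual
computation.

```python
# Toy sketch of distributed processing: the coordinator (server) only
# splits up work and gathers results; the workers (individual machines)
# do the actual computation.

def worker(chunk):
    """The 'individual machine': this is where the processing happens."""
    return sum(x * x for x in chunk)

def coordinator(data, workers, chunk_size):
    """Split data into chunks, hand each to a worker, merge the results."""
    chunks = [data[i:i + chunk_size] for i in range(0, len(data), chunk_size)]
    results = []
    for i, chunk in enumerate(chunks):
        # Round-robin dispatch: coordination and data flow, not computation.
        results.append(workers[i % len(workers)](chunk))
    return sum(results)

total = coordinator(list(range(10)), [worker, worker], chunk_size=3)
print(total)  # sum of squares 0..9 = 285
```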

Michael
 
IBM was there, but it hasn't been all IBM. The others contributed quite a
bit, and pushed IBM to do better. Without the others, computers would still
be behemoths in corporate computer rooms with wizards required to figure out
how to make them do whatever needed to be done. There would be no pc
revolution.


Uh, 99 percent? I don't think so. More like half to two thirds.

And they were having problems in the 80's. For a time it didn't look like
IBM was going to be viable because their mainframes weren't doing so well
with all the new, smaller machines coming along.

I've personally watched a VAX 11/750 beat out an IBM 3081GX on certain
tasks. This was the second smallest VAX, with 2 MB memory, versus the
largest IBM with about 32 MB. (big at the time). The 3081GX architecture
just wasn't up to doing certain tasks. I could have put about a dozen of
those VAXen into the 3081, and still had room for a few users.

If it weren't for having other machines besides IBM, you wouldn't have UNIX.
IBM felt that batch processing was the way to go, and would not budge from
their opinion. Even their "real time" operating systems were hopped-up
batch systems. Security was a nightmare; JCL was taken to ridiculous
extremes.


Sorry, wrong planet. IBM looked at what everyone else was doing, (and
eating IBM's lunch for that matter), and started to do the same thing. They
pretty much copied the architecture of the PC and made it their own.
Notice, their motherboards are Intel. They tried a few things with a
different bus, but that wasn't so great. The only thing IBM did was to
firm up a standard which was evolving quickly. I'm not sure, but I've
heard that the exec that approved trying to make the PC was in a bit of hot
water.



IBM didn't come up with anything new. Apple did. While it is generally
missed, the Tandy computer and Visicalc probably made the PC revolution get
into high gear. Apple wasn't as available as a tandy, and Visicalc finally
made getting a computer worthwhile for business. Apple predated Tandy, but
didn't have the marketing ability of Radio Shack. Apple was also a closed
architecture, while the Tandy system wasn't. Later, others started to
manufacture their own machines. There were hundreds of companies, each
taking Intel chips and motherboards, assembling them, and trying to stay
ahead of the competition. The '80s computer revolution was probably one of
the most exciting periods of technical competition ever. IBM came in on the
tail end of it.


Hmm, you also forgot a lot of other details. You forgot Xerox too. Have
you any idea of what the Xerox contributions were?

The 1960's implementation at IBM was canned because they figured that there
was no reason anyone would need their own computer. They continued this
thought right up until the handwriting was breaking through the wall.

While servers do some processing, most processing is done on individual
machines. The servers tend to handle data flow and coordination between the
various machines. That is why they call it distributed processing.

Michael

Good post, left intact....
 
Arapahoe said:
II) The IBM PC - 1980 to 2000

The basic architecture of the PC that is used by 96 percent of the
world was developed by IBM -- and that architecture is still in use
today.

BWAHAHAHAHAHAHAHHAHA!!!!

8-bit ISA slots. CGA graphics. Tone generated sound. 64K of 200ns RAM, if
you were rich. 5MB MFM disk drives.

Yeah, I suppose in the linux world things don't quite move as fast as
treacle on a cold day.
 
BWAHAHAHAHAHAHAHHAHA!!!!

8-bit ISA slots. CGA graphics. Tone generated sound. 64K of 200ns RAM, if
you were rich. 5MB MFM disk drives.

Yeah, I suppose in the linux world things don't quite move as fast as
treacle on a cold day.

It was a start......

And where do you think you would be today if IBM hadn't started the ball
rolling?

Paper tape maybe?
 
Patricia said:
It was a start......

And where do you think you would be today if IBM hadn't started the
ball rolling?

They didn't 'start the ball rolling,' you stupid ****. They did what they
always do. They ummed and ahhed about making a business decision. And they did
that as early as the '60s. So go and **** yourself. Nobody else will, you
clag-cunted trollop.
 
<snip everything off-topic for alt.sci.physics>

Where's the Physics?

Why did you post this to alt.sci.physics? Just a rude habit?

Think before you thread.


Tom Davidson
Richmond, VA
 
Herman said:
If it weren't for having other machines besides IBM, you wouldn't have
UNIX.

And if it weren't for IBM creating AIX, nobody would care.

Sorry, wrong planet. IBM looked at what everyone else was doing, (and
eating IBM's lunch for that matter), and started to do the same thing.

Same thing? Like Apple? Which was a closed system, as were all other PCs
of the day.

IBM's brilliant vision was creating an 'open' system, where anybody could
create add-on boards and hardware.

That and bringing it to the public at a reasonable cost was the Real PC
Revolution. You could flush the whole 'Homebrew' B.S. down the toilet
because it wasn't as radical as IBM's vision.

IBM didn't come up with anything new. Apple did.

Oh, yes, right. Steve Jobs dumpster-dived at Xerox PARC and you call it
new... HAHAHA!
Hmm, you also forgot a lot of other details. You forgot Xerox too. Have
you any idea of what the Xerox contributions were?

Stealing a lot of ideas that had been kicking around at IBM all during the
60s.

While servers do some processing, most processing is done on individual
machines. The servers tend to handle data flow and coordination between

Please...separate /important/ processing from a word processor loaded into
memory....
 
Wayther Palabi said:
Well, I wouldn't say that.

For example, CICS is the most used computer interface in the world. It is
the 'green' screen that you typically see in banks, hospitals, etc.


Are they still using those old green screens? Much more colorful ones
are now available.

CICS is the interface to MVS, AIX, TSO, VM and so on -- the operating
systems that really run the world.


CICS (Customer Information Control System) is only one interface to
MVS, mostly supporting end user interactive applications. TSO is
another interface to MVS, supporting application program development,
and interactive control of batch processes. VM (Virtual Machine) is
another operating system designed so other operating systems like MVS,
OS390, and DOS/VSE can run under it, as well as its own subordinate
operating system and interface, CMS, which is used for both program
development and applications. CICS is not supported directly under
VM, but can run under an operating system such as MVS, OS390, or
DOS/VSE which in turn runs under VM.

In DOS/VSE (if anyone still uses it), the interactive interface ICCF
supporting program development ran under CICS. But in turn, CICS ran
logically in an ICCF virtual partition. That sort of leads to the IBM
joke about VM. IBM said they were working on getting VM to run under
VM. Then in turn, they wanted to get the host VM to run under another
VM. The goal, they said, was to eliminate the hardware altogether!

Of course nobody believed them, because they knew that IBM sells
hardware!

Here's something else -- CICS is /stateless/ -- as in ... yep, you guessed
it, http. So a CICS session, is just like a Web browsing session. Only,
IBM beat out the web by, oh, about two decades :D


What I don't understand is why IBM, which has done everything right in
recent years as far as I can see, is probably on its way to
marginalizing Microsoft, and consistently makes money, still has
its stock languishing so badly!

Double-A
 
Arapahoe said:
When you really boil it down...in the last 60 years of computing...it's
really all been IBM.

I mean, I've been reading this book "Hackers" which goes through all the
*underground* computer movements and DEC, and Altair and probably if I
bother to keep reading, Apple and so on.

But you know, as I see it -- basically computing is divided into 3 epochs.

I) The IBM Mainframe - 1950 to 1980

The Mainframe was all that really mattered in computing for those 30
years. You could take all the /hacking/ and DEC and minicomputers and
flush them down the toilet and what would be left is the real computing.
The computing of commerce and industry that really /computerized/
America. The computing of Defense and Simulation.

And 99 percent of that was on IBM mainframes.


II) The IBM PC - 1980 to 2000

The basic architecture of the PC that is used by 96 percent of the world
was developed by IBM -- and that architecture is still in use today.

The IBM PC is 2nd to the mainframe in design and importance in
computing. Apple is about as important as the Altair. A trivial
implementation, before IBM came up with the machine that Corporate
America really wanted.


III) IBM and Linux, 2000-???

Oh, did you notice that I left Microsoft off this short list? That is
because IBM was planning a PC with a multitasking OS as early as the
1960s. They called it basically a Smart Terminal because it could
offload some of the server processing. And that's about what most PCs
do, because real processing is done on servers. As far as Microsoft
goes, I see them as a transitory period -- basically allowing PC
hardware to mature to the point that UNIX -- a real OS -- could be
allowed to run on PCs. And now it does.

I agree, and they're to blame for Bill Gates' amassed fortune and the torment
that has been inflicted on 98% of PCs globally.
 
In <[email protected]>, on
07/06/2004
at 12:59 PM, (e-mail address removed) (Double-A) said:
Are they still using those old green screens? Much more colorful
ones are now available.

AFAIK, IBM hasn't made the green screens for a couple of decades. They
switched to amber for monochrome, and the color screens became a lot
more affordable than initially.
In DOS/VSE (if anyone still uses it),

They still use VSE/ESA. But I don't know of anybody using the old free
base.
That sort of leads to the IBM joke about VM. IBM said they were
working on getting VM to run under VM.

Don't laugh; that's been available for decades, and is useful,
especially when dealing with incompatible brand-x hardware. Take the
GTE IS/7800 - please!
What I don't understand is why IBM, which has done everthing right
in recent years as far as I can see,

It isn't. It's still sticking with its object-code-only (OCO) policies
on mainframe software, and it isn't doing what is necessary to get its
products into schools.

--
Shmuel (Seymour J.) Metz, SysProg and JOAT <http://patriot.net/~shmuel>

Unsolicited bulk E-mail subject to legal action. I reserve the
right to publicly post or ridicule any abusive E-mail. Reply to
domain Patriot dot net user shmuel+news to contact me. Do not
reply to (e-mail address removed)
 