Ultimate in over-the-top cell speculation. Intel manufactures Cell. Microsoft withers.

  • Thread starter: Robert Myers
Robert Myers said:
An example of what Teller was talking about (and I can't find the
exact quote, but you can easily find quotes of him advocating drastic
changes to the country's secrecy policies) was the inadvertent
shipment of machines to make precision ball bearings to the Soviet
Union at the height of the cold war. That slip allowed them to MIRV
their warheads, a major escalation of the arms race. The Soviets
didn't know how to make ball bearings? Apparently not.

Bob, you're apparently under the impression that the gyroscopes at the
heart of the inertial guidance packages used to direct ICBM warheads
used ball bearings. That ain't so. The gyros used gas bearings;
specifically, nitrogen gas since the presence of oxygen would
gradually change the delicate balance over time. Small vanes on the
rotating part assured that there was _no_ metal-to-metal contact.

These gas-bearing based guidance packages were developed by MIT
initially under the guidance of Prof. Charles Stark Draper. Later,
the Charles Stark Draper Lab, operating under MIT's roof, carried on
this work - even when they had to move the Lab to Florida because of
all the peaceniks in Cambridge MA during the latter stages of the
Vietnam war.

I was intimately involved with this stuff at a first-tier
subcontractor during the 60's and most of the 70's.

Are you possibly confusing the inadvertent (supposedly) shipment of
Japanese multi-axis milling equipment to the Soviet Union, making it
possible for the Soviets to produce very quiet propellers for their
submarines? I think the company involved was Toshiba.
 
Well, no, at least not as far as the functioning of my memory and
understanding goes. I remember the submarine propeller incident,
which involved export by a Japanese company, not an American company,
as you stated. I can't find a respectable reference on the web to the
Bryant Chucking Grinder Company episode, but here is a reference to a
respectable reference:

http://www.nwowatcher.com/ebooks/The Best Enemy Money Can Buy - By Antony Sutton.pdf

<quote>

Perhaps the best-informed American scholar in the field of Soviet
history and overall strategy is Prof. Richard Pipes of Harvard
University. In 1984, his chilling book appeared, Survival Is Not
Enough: Soviet Realities and America's Future (Simon &
Schuster). His book tells at least part of the story of the Soviet
Union's reliance on Western technology, including the infamous Kama
River truck plant, which was built by the Pullman-Swindell company of
Pittsburgh, Pennsylvania, a subsidiary of M. W. Kellogg Co.
Prof. Pipes remarks that the bulk of the Soviet merchant marine, the
largest in the world, was built in foreign shipyards. He even tells
the story (related in greater detail in this book) of the Bryant
Chucking Grinder Company of Springfield, Vermont, which sold the
Soviet Union the ball-bearing machines that alone made possible the
targeting mechanism of Soviet MIRV'ed ballistic missiles.

</quote>

The ball bearings part of the story never seemed particularly
plausible to me, but that was the story as it was reported. It may
well have been a cover.

RM
 
George said:
So let's say
I come up with a novel, revolutionary algorithm, e.g. a practical solver for
the traveling salesman problem with true optimal solutions; I then design
the method for implementation and code it all up. Now I'm supposed to give
it away because it uses libraries which are OS?

If you don't like it, don't use the open-source libraries. Why should
*you* get to profit from the work of the people who wrote those
libraries? You used their ideas and their work, for free; why shouldn't
they get to use your ideas and your work, for free?
 
One of the very few things Edward Teller said that I agreed with was
that the things that really make a difference in national security
don't need to be classified because you can't write down, transmit, or
easily steal the secrets, anyway. The prizes of World War II were the
actual rocket scientists, not their blueprints or even prototypes.

Players more or less _have_ to contribute to these communal efforts,
and their assets are the people who really understand what's going on.
Take your eye off the ball for a short period, and you're quickly out
of the game.

Harrumph - "join the clique or wither" - lost bodies and squandered
opportunities. There are any number of important works which have been
developed in near-seclusion. Mediocrity loves "peers" and their
self-regarding committees.
You don't want RedHat's actual packaged software? No problem. But if
it breaks, you're on your own or at the mercy of community resources.
That's neither free software nor commercial software, but RedHat _is_
making money off software.

It is not *creating* *anything* - sorry but I don't see charging for
packages as making $$ from software.
From an end user's point of view, I don't know that the biggest
concern works much differently either way. Unless your favorite
software is kept up to date so that it can live happily with the
latest kernel, you could be out of luck. Have it happen to you just
once, spend some time digging through mail lists trying to figure out
how the kernel headers changed, and you realize what a problem it is.
Wouldn't happen with commercial software? Look at your prized Watcom
compiler.

Now, now... I have used Watcom's compilers and have not said in any terms
that I prized them, other than that they exist (existed commercially) and
are/were another alternative... in fact a very valuable one when M$ didn't
have the goods, less so now. Oh and Watcom perished because of business
mistakes by a greedy Sybase - it sank along with the rest of Sybase.
There is so much room for creativity that I don't really see that the
GPL is all that much of a hindrance to making money. This is
_America_, George.

I think having an identified target market with money is more
important than having a novel algorithmic twist.

More important for what - either you're being obtuse or missing the point.
What I'm getting at is the survival, or not, of the sort of company which
employs analysts/programmers who design and write software and try to make
a living from that endeavour.
I don't think so. The PowerPC part of Cell is really crippled
relative to a G5. You really have to be able to exploit the SPE's to
make Cell competitive, and I don't think any compiler anywhere is
going to compile c or c++ to effective Cell software because the
programming model is so different.

Instead of letting the PowerPC do actual work, you let it create a
thread and pass it off to an SPE. Then, if an SPE pipeline stalls on
the task, you don't care so much because it's only 1 of 16, whereas
the PPC has only two paths, both of them in-order.

The natural programming model is something like Kahn networks or
Synchronous Dataflow. Lots of work done, but applications would have
to be rewritten at the source code level.
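
To make the offload model concrete, here is a minimal sketch in plain C,
with a POSIX thread standing in for an SPE context. The spe_task and
spe_worker names are invented for illustration; this is not the real Cell
SDK API, only the shape of the "package the work, hand it off, keep going"
idea.

/*
 * Conceptual sketch only: a POSIX thread stands in for an SPE context.
 * spe_task and spe_worker are invented names, NOT the Cell SDK (libspe)
 * API.  Build with: cc -pthread sketch.c
 */
#include <pthread.h>
#include <stdio.h>

#define N 1024

struct spe_task {               /* work descriptor the "PPE" fills in */
    const float *in;
    float       *out;
    int          count;
};

static void *spe_worker(void *arg)  /* stands in for an SPE program   */
{
    struct spe_task *t = arg;
    for (int i = 0; i < t->count; i++)
        t->out[i] = 2.0f * t->in[i];    /* real kernel would be SIMD  */
    return NULL;
}

int main(void)
{
    static float in[N], out[N];
    for (int i = 0; i < N; i++)
        in[i] = (float)i;

    struct spe_task task = { in, out, N };
    pthread_t spe;

    /* The "PPE" does little work itself: build the descriptor, hand
     * it off, and go queue more work (or just block, as here).       */
    pthread_create(&spe, NULL, spe_worker, &task);
    pthread_join(spe, NULL);

    printf("out[10] = %g\n", out[10]);
    return 0;
}

On the real hardware each SPE would also have to DMA its working set into
a 256KB local store, which is exactly why the Kahn-network / dataflow
framing fits the machine better than ordinary shared-memory C.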

What I'm saying is that for the bulk of installed, hum-drum software on a
PC/workstation, the performance just doesn't matter that much.
 
You forget that IBM turned over 500ish patents to the open-source
community. You're not looking beyond the razors. You've just flunked
Gillette marketing 101. ;-)

No I didn't forget - I didn't know in the first place.:-) If they were
software patents then I'm glad they did that because they should never have
been awarded in the first place IMO. That *is* the world we are supposed
to live in now I guess, with the EC[ptui] looking likely to force through
approval of this iniquity as well (their parliament is being brushed aside
by the EC[ptui] crypto-fascists), but that doesn't make it right. Just
wait till the Chinese get themselves organized under such a framework.
You are not "fettered" by having used GPL tools. You may indeed sell your
tools as OCO. IIRC, you may not package that code as part of yours. I'm
not a frappin' programmer <spit>, but that's my understanding.

As you well know, with any high level language it's impossible to
distribute software without its library content. Anything which might
currently allow that, on a limited basis, is just another rule, which is up
for change on the whim of whoever has the reins today.
Your understanding of employer relationships is a little out of date too.
Many are encouraged to participate in OSS, within obvious conflict-of-interest
barriers.

Things may be different where you are. FWIS, if anything, employer
restrictions on outside and post-employment activities are getting more
onerous and broader in their coverage.
I'm not sure I agree. I'm not sure I understand the difference between an
algorithm and a process. Ok, I do work in the patent arena, but I do shy
away from anything with software in it. Processes aren't software though,
but it could easily be argued that they are algorithms. I'm not smart
enough to know the difference. You?

Agree on what?... the patenting of algorithms? It's only in the past 20
years or so that algorithms have been patentable - prior to that they were
classed as an idea which is/was(?) not patentable; protection is/was
available under copyright of the expression of the idea. Not sure how that
sits vs. hardware processes but some differences are obvious... at least
under the old rules.

It's difficult to go into such things in a public forum but I was somewhat
peripherally involved in an early algorithm patent err, quarrel; this thing
was hailed on national news as a "mathematical breakthrough", though it was
really only a twist on well known published methods. The abuse was glaring
and inequitable - the only ones (large corps) who had the clout to do
anything about it had a broad cross-license agreement with the (large corp)
originator of the patent, so didn't care. The little guys got
"penetrated"... even though their implementation of a modified version of
the algorithm blew the big guy's one away.

We now have the (resulting) situation where hardly anything of note gets
published anymore, as universities rush to the patent office to exact their
pound of flesh. Apart from any legal ramifications, the previous situation
was healthier and much more apt to produce real innovation, from my POV.
The only ones who benefit from the status quo are the usual shysters.
 
Robert Myers said:
So, as we have discovered, if one country does the proof of principle,
and only the vaguest outlines of how it's done can be discovered, a
determined adversary can often duplicate the results, even under very
challenging circumstances. Keeping things secret doesn't do much
good.

Not in the long run. And if the other side has Klaus Fuchs and the
Rosenbergs....
But it works for Coke. :-)
An example of what Teller was talking about (and I can't find the
exact quote, but you can easily find quotes of him advocating drastic
changes to the country's secrecy policies) was the inadvertent
shipment of machines to make precision ball bearings to the Soviet
Union at the height of the cold war. That slip allowed them to MIRV
their warheads, a major escalation of the arms race. The Soviets
didn't know how to make ball bearings? Apparently not.

Soviet military equipment was both very sophisticated and amazingly
crude, in the same piece of equipment. Top notch airframes and
primitive avionics. That sort of thing.
I'm glad. Need help on your next application, E&TS is ready.
Cell has both the interconnect bandwidth and the execution paths to
make a worthy successor to vector supercomputers.

As to the actual prospects? Who wouldn't be cautious at this point?
The age imbalance (with some exceptions :-) ) in who is showing
interest and excitement and who is huffily standoffish is striking.

RM
Now I'm curious. Which is which?
 
No I didn't forget - I didn't know in the first place.:-) If they were
software patents then I'm glad they did that because they should never have
been awarded in the first place IMO.

I'm not sure I agree with you here (though I must stress the "sure" part
since that area makes me queasy too). Processes _are_ patentable. What is
software other than a rigid process?
That *is* the world we are supposed
to live in now I guess, with the EC[ptui] looking likely to force through
approval of this iniquity as well (their parliament is being brushed
aside by the EC[ptui] crypto-fascists), but that doesn't make it right.
Just wait till the Chinese get themselves organized under such a
framework.

The Chinese are joining the EU? We have *nothing* to worry about! ;-)

As you well know, with any high level language it's impossible to
distribute software without its library content. Anything which might
currently allow that, on a limited basis, is just another rule, which is
up for change on the whim of whoever has the reins today.

Again (and please folks, correct me): if the tools you use are GPL tools,
you are *not* required to GPL or ship the sources of your derivative
works. You are required to ship, or otherwise make available, the GPL'd
software you used. AIUI, there is no requirement to turn your source
code over to anyone, unless you decide it's to your benefit.
Things may be different where you are. FWIS, if anything, employer
restrictions on outside and post-employment activities are getting more
onerous and broader in their coverage.

Again, I'm not a programmer, but they are under fewer restrictions than
we hardware dweebs are. AIUI, programmers can donate stuff to OSS, but I
can't donate the same sorts of things to OpenCores. Obviously one has to
be aware of any conflicts of interest.

Agree on what?... the patenting of algorithms? It's only in the past 20
years or so that algorithms have been patentable - prior to that they
were classed as an idea which is/was(?) not patentable; protection
is/was available under copyright of the expression of the idea. Not
sure how that sits vs. hardware processes but some differences are
obvious... at least under the old rules.

Is there a difference? I'm sure you'll be horrified to know that even
"business processes" are patentable. If you have a process to do
*anything* it is patentable (within the obvious patent criteria).
It's difficult to go into such things in a public forum but I was
somewhat peripherally involved in an early algorithm patent err,
quarrel; this thing was hailed on national news as a "mathematical
breakthrough", though it was really only a twist on well known published
methods. The abuse was glaring and inequitable - the only ones (large
corps) who had the clout to do anything about it had a broad
cross-license agreement with the (large corp) originator of the patent,
so didn't care. The little guys got "penetrated"... even though their
implementation of a modified version of the algorithm blew the big guy's
one away.

There is nothing new here. Even if the "little guy" did get a patent he
hasn't the resources to defend the patent against a predator. Gould was a
famous counterexample that proved this. ;-)
We now have the (resulting) situation where hardly anything of note gets
published anymore, as universities rush to the patent office to exact
their pound of flesh. Apart from any legal ramifications, the previous
situation was healthier and much more apt to produce real innovation,
from my POV. The only ones who benefit from the status quo are the usual
shysters.

Perhaps. I see this problem differently. As far as I'm concerned this
is an issue of ownership. If the widget was "discovered" under contract
from "you and me", what's this patent thing?
 
Again (and please folks, correct me): if the tools you use are GPL tools,
you are *not* required to GPL or ship the sources of your derivative
works. You are required to ship, or otherwise make available, the GPL'd
software you used. AIUI, there is no requirement to turn your source
code over to anyone, unless you decide it's to your benefit.

Not if the tools you used are covered by the LGPL, or Lesser GPL,
originally the Library GPL:

http://www.gnu.org/copyleft/lesser.html

That page contains a link to "Why You Shouldn't Use LGPL for Your Next
Library":

http://www.gnu.org/philosophy/why-not-lgpl.html

which page will probably confirm many of George's worst fears about
open source (or, at least, the RMS version of open source).

The position stated is that, if you link to a GPL library, you have a
derivative work subject to the GPL. It's hard to see how it could be
otherwise for an executable binary, but I don't really see how you
could make such a claim if a vendor shipped object modules. Then
again, I'm not a lawyer and have no desire to become one. As a
practical matter, I don't know of an example of someone actually
shipping object code to be linked against a library intended for
Linux.
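
For what it's worth, the arrangement people usually point to as the
(debatable) boundary is run-time loading: the program never names the
library at link time, it just dlopen()s it. A minimal sketch, with
"libwhatever.so.1" and "do_something" as made-up names:

/*
 * Minimal sketch of loading a shared library at run time instead of
 * naming it at link time.  "libwhatever.so.1" and "do_something" are
 * made-up names.  Build with something like: cc prog.c -ldl
 */
#include <dlfcn.h>
#include <stdio.h>

int main(void)
{
    void *handle = dlopen("libwhatever.so.1", RTLD_LAZY);
    if (!handle) {
        fprintf(stderr, "dlopen: %s\n", dlerror());
        return 1;
    }

    /* Look the symbol up by name and call it through a function
     * pointer; the library never appears on the link line.           */
    int (*do_something)(int) = (int (*)(int))dlsym(handle, "do_something");
    if (!do_something) {
        fprintf(stderr, "dlsym: %s\n", dlerror());
        dlclose(handle);
        return 1;
    }

    printf("result = %d\n", do_something(42));
    dlclose(handle);
    return 0;
}

The only point being made is that no library code is copied into the
binary at build time. Whether that actually changes the derivative-work
question is, again, something I'd leave to the lawyers.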

As to the whim of the day, it would be very difficult for the whim of
any single person or likely collection of persons to change the
licensing terms for a particular library, since derivative works must
be licensed under the LGPL. The only way I could see to un-LGPL
something would be to get every contributor to agree to such a change.
Not likely in the case of a library covered by LGPL.

RM
 
[George Macdonald wrote]:
Harrumph - "join the clique or wither" - lost bodies and squandered
opportunities. There are any number of important works which have been
developed in near-seclusion. Mediocrity loves "peers" and their
self-regarding committees.
Well, maybe.

The Open Research Compiler is not licensed under GPL, but it is open
source. It doesn't keep up with Intel's compiler for Itanium, but
(unlike gcc) it stays in the hunt.

_And_ whatever is learned about compilers, about intermediate
representations, and about computation will become part of the general
fund of knowledge.

Self-regarding committees? I'll take them any day over Bill Gates'
arrogant mediocrity factory. Recent characterization of Gates from a
venture capitalist at a public forum: spent his career turning other
people's ideas into mediocre products.

Unfair to tar closed source with Bill Gates? Equally unfair (and
unworthy of you, really) to make sweeping characterizations of open
source.
It is not *creating* *anything* - sorry but I don't see charging for
packages as making $$ from software.
You _do_ have to make the package work.

More important for what - either you're being obtuse or missing the point.
What I'm getting at is the survival, or not, of the sort of company which
employs analysts/programmers who design and write software and try to make
a living from that endeavour.
I have my moments. Mathworks sells a version of MATLAB for Linux.
They seem to be doing okay.

The GNU parts of GNU/Linux that are just unavoidable are licensed
under the Lesser GPL, as discussed in another thread. You can count
on the safety of that arrangement with about the same certainty as you
can count on driving on the right hand side of the road as a
convention in the US.

That said, I still think that having an identified target market with
money is more important than the quality of your ideas. Or rather,
the idea that counts is how whatever idea of whatever quality (even if
it's just repackaging Linux) will serve a target market with money.
You seem to chafe at that reality.

While I admire Mathworks as a company, I think their product is a
terrible idea, for the same reason I think corporations insane to keep
_their_ intellectual property in a proprietary format owned by
Microsoft. Whatever I may think of it, people with money to spend
think it's just fine, and Mathworks has EE/CS departments that really
ought to know better teaching their products.

Maybe I am a little obtuse. You seem to think that open source has
made the software business unprofitable and/or unattractive. I can
point you to links that show that 60% of new US venture capital money
is going into software. It's _hardware_ that's become unattractive to
venture capital, and there isn't a thing about hardware that's open
source.
What I'm saying is that for the bulk of installed, hum-drum software on a
PC/workstation, the performance just doesn't matter that much.

Well, let's see. In terms of products I understand, a 2.4GHz Celeron
seems to be the entry level office product these days. The in-order
PPC front end to the Cell running at (say) 3GHz can keep up with that?
Doesn't sound completely implausible.

Graphics applications are a big market for Mac, everyone agrees that
image processing applications like Photoshop will hum on Cell, and
most of the work that's already been done on Cell-type architectures
has been applications like image processing. Who knows. Never say
never.

One vast unknown here is whether the software model in Sony's patent
is going to go anywhere. Once you have taken the trouble to
reformulate software so that it creates little packets that go out
seeking resources on which to execute and created the infrastructure
to support that execution model, you can use SPE's and other Cell
processors pretty transparently, I would think.

IBM claims that Cell will run AIX. I'd think that hardware that could
cope with a jillion threadlets, didn't care if a few of its zillions
of execution paths got stalled, and could virtualize nearly arbitrary
numbers of machines would be ideally suited to servers, but Keith here
is going to jump in and tell me that no one will be interested because
it's not x86.

RM
 
Not in the long run. And if the other side has Klaus Fuchs and the
Rosenbergs....
But it works for Coke. :-)

I didn't say it the first time, but I can't resist saying that the
reason the proof of principle is so important, even if you don't know
the details, is that it allows you to focus all your resources on the
path with a known possible favorable outcome.

I'm glad. Need help on your next application, E&TS is ready.

See my comments to George Macdonald about the funding climate for
hardware. If there's money, it's going to be at a place like Mercury
Computer Systems or BBN. Now, as to the possibilities for software,
.... but that's understandably not what you want to hear.

Aside from the expectations of venture capitalists, the profound shift
in hardware is that (with the notable exception of IBM's Power
architecture) small systems lead the way. Even the free-spending
national labs can't buck that trend.
Now I'm curious. Which is which?

Well, now that you mention it, there may be a bimodal distribution.
The age group that went to school, studied computer architecture, and
graduated believing that ________ was going to change computing
profoundly is the group that doesn't seem even to want to talk about
it (unless ________ was streaming architectures, maybe). The
youngsters are excited. The people who mostly believe it was all
thought up in the era of Project MAC and System 360 are showing
glimmerings of interest. Maybe there's hope.

RM
 
If you don't like it, don't use the open-source libraries. Why should
*you* get to profit from the work of the people who wrote those
libraries? You used their ideas and their work, for free; why shouldn't
they get to use your ideas and your work, for free?

You mean don't use a compiler which is available for Linux?... shut any
potential Linux market off? Fer Chrissakes I'm talking about mundane OS
interfacing to do file IO/management and basic process housekeeping. I'm
not interested in their GUI bloat. By your account(?) some genius who
devises a ground-breaking method for a previously unsolved problem, and
programs it up himself, has to turn his jewel over to the OS "community"...
so *they* can profit/steal from his unique creativity?

I'm perfectly willing to pay the going rate for the compiler and its
non-GUI libraries - the OS crowd needs to get their act together to cater
for this.
 
George said:
You mean don't use a compiler which is available for Linux?...

Or find one that's not GPL. You could write one, ferinstance. Then you
wouldn't have to pay anybody anything, or give away anything to anybody.
Of course, there'd be a certain up-front cost in time and effort, but
that's for you to decide.
shut any
potential Linux market off?

That's for you to decide. The non-free software market works one way;
the free software market works differently. Only you can decide whether
you want to play in that market, but you don't have the right to demand
that the rules change to suit you.
By your account(?) some genius who
devises a ground-breaking method for a previously unsolved problem, and
programs it up himself, has to turn his jewel over to the OS "community"...

He doesn't *have to* do anything. He can keep his idea to himself. He
can publish it in a blog. He can run it on *some other* OS.
so *they* can profit/steal from his unique creativity?

You mean, like the way he intends to profit/steal from the creativity
and time and work of the people who made it possible for him to have an
OS to work with in the first place?
I'm perfectly willing to pay the going rate for the compiler and its
non-GUI libraries - the OS crowd needs to get their act together to cater
for this.

They don't *need* to. That's the decision they've made. If you don't
like it, stick to Windows - MS will be happy to take your money.

(I should point out here that I'm a lifelong Windows user/developer
myself, and yes, I - or my employer - pay for all my software. But that
doesn't mean I don't *understand* where the F/OSS folks are coming from,
or that they don't have a right to play their game by their rules.
Neither you nor anyone else is "entitled" to make money on Linux software.)
 
Or find one that's not GPL. You could write one, ferinstance. Then you
wouldn't have to pay anybody anything, or give away anything to anybody.
Of course, there'd be a certain up-front cost in time and effort, but
that's for you to decide.

You think that's not been done?:-) We're supposed to be more productive
now by not duplicating past efforts!
That's for you to decide. The non-free software market works one way;
the free software market works differently. Only you can decide whether
you want to play in that market, but you don't have the right to demand
that the rules change to suit you.


He doesn't *have to* do anything. He can keep his idea to himself. He
can publish it in a blog. He can run it on *some other* OS.

.... and not reap any benefits from his innovations! Some err, "solution".
This is slightly more than an "idea" I'm talking about - it would warrant
some reward in the normal scheme of things.
You mean, like the way he intends to profit/steal from the creativity
and time and work of the people who made it possible for him to have an
OS to work with in the first place?

The umm, scope of the "creativity" is somewhat different... ergo intrinsic
value! Is this what OS is about then?... trading baubles for crown jewels?
They don't *need* to. That's the decision they've made. If you don't
like it, stick to Windows - MS will be happy to take your money.

(I should point out here that I'm a lifelong Windows user/developer
myself, and yes, I - or my employer - pay for all my software. But that
doesn't mean I don't *understand* where the F/OSS folks are coming from,
or that they don't have a right to play their game by their rules.
Neither you nor anyone else is "entitled" to make money on Linux software.)

I've done my share of developing myself and my "lifelong" extends well
beyond Windows so you can err, unpuff yourself. The problem here is simply
a case of superimposing an ideology on a business model beyond practical
terms.
 
George said:
You think that's not been done?:-) We're supposed to be more productive
now by not duplicating past efforts!

And you can save yourself those efforts, if you wish, by using GPL tools
- as long as you're willing to comply with the license under which they
are published.
... and not reap any benefits from his innovations!

Says who? There's nothing that says that you can't sell GPLed software
for money.
Some err, "solution".

I.e. not a solution *you* like. Other people feel differently.
This is slightly more than an "idea" I'm talking about - it would warrant
some reward in the normal scheme of things.

Again - he doesn't *have to* use/develop for Linux. If he's looking to
make money, he can develop his software on Windows.

The umm, scope of the "creativity" is somewhat different... ergo intrinsic
value! Is this what OS is about then?... trading baubles for crown jewels?

An operating system is a "bauble" compared to this guy's tool? This guy
thinks rather highly of himself, doesn't he?
I've done my share of developing myself and my "lifelong" extends well
beyond Windows so you can err, unpuff yourself.

I'm not "puffed" to begin with - my development experience extends
beyond Windows as well. I was simply pointing out that I'm not a F/OSS
shill.
The problem here is simply
a case of superimposing an ideology on a business model beyond practical
terms.

It sounds more to me like you're trying to impose a business model
(selling proprietary software for money) onto an ideology (Free
Software) that isn't as compatible with your goals as you'd like it to be.
 
On Mon, 04 Apr 2005 17:47:51 -0400, Mike Smith said:
Says who? There's nothing that says that you can't sell GPLed software
for money.

*YOU* just said right above the alternative was to keep it to himself or
publish in a *blog*?? Make up your mind.
I.e. not a solution *you* like. Other people feel differently.

What?... there are people falling over themselves to give away perfectly
marketable, dole-avoiding, high value technology? I wonder what they eat
and where they live?:-[]
An operating system is a "bauble" compared to this guy's tool?

Of course it is - Unix is a >35-year-old OS with a few recent trinkets &
twists added, plus new device support... and a GUI which doesn't matter a
whit for serious computing.
It sounds more to me like you're trying to impose a business model
(selling proprietary software for money) onto an ideology (Free
Software) that isn't as compatible with your goals as you'd like it to be.

Well the case being discussed was a hypothetical, but not unlikely,
scenario so it doesn't concern *me* personally right now. This
hypothetical also presupposes a high value product which has a guaranteed
base of customers knocking at the door with big $$ - if OS can't cater, it
loses.<shrug>
 
Of course it is - Unix is a >35-year-old OS with a few recent trinkets &
twists added, plus new device support... and a GUI which doesn't matter a
whit for serious computing.

Ok, I wuz trying to sucker someone into this discussion that has more
knowledge here than I. But... It's my understanding that the libraries
that require GPL are few and aren't needed at all for "serious computing".
The compilers can be used for the "serious computing" as long as these
libraries aren't statically linked. Some can be dynamically linked and
some stuff can be piped (ported) into the "serious computing" part that
can be kept as a jewel.

My example was an ASIC or FPGA router. The "jewel" here is the
Place&route algorithms (serious computing) which have no need to access
the troubled libraries. Open a port to communicate to the GUI, publish
the GUI, and be done with it.
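
A minimal sketch of that split, assuming an invented one-line text
protocol: the closed engine is its own process reading commands on stdin
and answering on stdout, and the published GUI just drives it through a
pipe. The ROUTE/QUIT commands and place_and_route name are made up for
illustration.

/*
 * Sketch of the split described above: the closely held engine is a
 * standalone process speaking a tiny, invented text protocol on
 * stdin/stdout; the published (GPL'd) GUI drives it through a pipe.
 */
#include <stdio.h>
#include <string.h>

static void place_and_route(const char *netlist)
{
    /* ... the proprietary place-and-route algorithms live here ...   */
    printf("ROUTED %s ok\n", netlist);
    fflush(stdout);
}

int main(void)
{
    char line[512];

    while (fgets(line, sizeof line, stdin)) {
        line[strcspn(line, "\n")] = '\0';   /* strip trailing newline */

        if (strncmp(line, "ROUTE ", 6) == 0)
            place_and_route(line + 6);
        else if (strcmp(line, "QUIT") == 0)
            break;
        else {
            printf("ERR unknown command\n");
            fflush(stdout);
        }
    }
    return 0;
}

The GUI side can launch this with popen() or talk to it over a socket;
only a command stream crosses the boundary, so the engine binary never
links against the GPL'd GUI code. Whether that satisfies everyone's
reading of the license is, of course, the same argument as before.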

Ok, pick it apart. I'm no expert here and can't even spell 'C'. What I
do know is that there are closely held packages in this arena that do
Linux. It's obviously not impossible.

<snip>
 
Ok, I wuz trying to sucker someone into this discussion that has more
knowledge here than I. But... It's my understanding that the libraries
that require GPL are few and aren't needed at all for "serious computing".
The compilers can be used for the "serious computing" as long as these
libraries aren't statically linked. Some can be dynamically linked and
some stuff can be piped (ported) into the "serious computing" part that
can be kept as a jewel.

I've heard so many different angles on this that I'd never pretend to be an
expert. Unless the compiler comes with a well delineated "fence" to
library functions forbidden for commercial exploitation, it's all a PITA to
me. I don't want to have to become a legal expert to know what's
allowed... or get caught with my pants down later.
My example was an ASIC or FPGA router. The "jewel" here is the
Place&route algorithms (serious computing) which have no need to access
the troubled libraries. Open a port to communicate to the GUI, publish
the GUI, and be done with it.

Yes that is a seriously tough problem - any time you have cascading
dependencies on discrete decisions, it's a "hard" problem. Scheduling of
circuit board stuffing is another one.
Ok, pick it apart. I'm no expert here and can't even spell 'C'. What I
do know is that there are closely held packages in this arena that do
Linux. It's obviously not impossible.

OK - there must be a way. I hope I can find it when needed.:-)

BTW did you see my post on 3d partitioning of IC/PCBs? Just wondered what
you thought?
 
In comp.sys.ibm.pc.hardware.chips keith said:
My example was an ASIC or FPGA router. The "jewel" here is
the Place&route algorithms (serious computing) which have
no need to access the troubled libraries. Open a port to
communicate to the GUI, publish the GUI, and be done with it.

This is a question of "what makes a work derivative" and
subject to some parent licence (GPL).

Stallman addressed this concern with the LGPL (Lesser GPL, formerly the
Library GPL) that most (but not all) free libraries use. It is
convoluted text, but sec. 5 would appear to make a statically linked
executable a derivative work while exempting dynamically linked ones.

-- Robert
 
This is a question of "what makes a work derivative" and
subject to some parent licence (GPL).

Stallman addressed this concern with the LGPL (Lesser GPL, formerly the
Library GPL) that most (but not all) free libraries use. It is
convoluted text, but sec. 5 would appear to make a statically linked
executable a derivative work while exempting dynamically linked ones.

This is the *problem* here - convoluted text... always open to err,
"higher" levels of interpretation and difficult to comprehend in the first
place by the people who need to know. Read: I don't wanna have to read
this crap.

As for static vs. dynamically linked that seems like a can of worms to me -
there's always *some* code which is statically linked, e.g. the usual
file/print/device IO functions usually have substantial code statically
linked. And if I'm charging for my software am I allowed to supply the
dynamic library on the same media or does the customer have to get that
from an OS source?
 
On Thu, 07 Apr 2005 12:59:28 GMT, Robert Redelmeier wrote:
This is the *problem* here - convoluted text... always open to err,
"higher" levels of interpretation and difficult to comprehend in the first
place by the people who need to know. Read: I don't wanna have to read
this crap.

As for static vs. dynamically linked that seems like a can of worms to me -
there's always *some* code which is statically linked, e.g. the usual
file/print/device IO functions usually have substantial code statically
linked. And if I'm charging for my software am I allowed to supply the
dynamic library on the same media or does the customer have to get that
from an OS source?

If this is a real concern for you and not merely a hypothetical, why
not talk to somebody at a place that does release proprietary software
to run on Linux, like the Mathworks? Maybe Larry E. could offer you
some pointers.

RM
 