Prescott with 64-bit extensions coming in June

  • Thread starter: Judd
Carlo said:
You mean you actually had documentation at the places you worked? Must be
nice, where I am it's do this project and figure out where you need to pull
the data from on your own. Very annoying, getting easier since I've been
with this company for a few months now, I'm actually getting familiar with
some of their DBs, but it's still just a huge mess. Very frustrating
experience.

Carlo

Format the whole thing and tell the management a Win32 worm ate the
thing up.
 
Judd said:
I don't know when dual core will get here. They surprised the heck
out of me with the release announcement of the EM64T. I figured that
was 2005 at the earliest. Funny how things change. If AMD is
pushing dual core, then Intel will release it earlier than you expect.

Basically what Intel has announced is that they will bring out 64-bit P4
desktop chips at the same time that they introduce their Nocona
dual-processor Xeon server chips. So Nocona and Prescott 64-bit will come
out at the same time, instead of at different times.

Yousuf Khan
 
Judd said:
Other than server DB and some large image processing programs, what
other programs are doing this? I have found this to be a rarity, thus
the exception and not the norm.

I would assume this would become more of the norm rather than the exception
once 64-bit is entrenched. There are big advantages to accessing files with
a mmap function rather than the traditional file i/o functions. Not the
least of which is that accessing mmapped files is just like accessing
memory, and secondly you bypass having to do a context switch to the kernel
at several levels.
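The difference can be pictured with a minimal Python sketch (file path and contents here are purely illustrative): once a file is mapped, its bytes are read by ordinary indexing, with page faults rather than explicit read() calls bringing data in.

```python
import mmap
import os
import tempfile

# Write a small scratch file (path and contents are illustrative).
fd, path = tempfile.mkstemp()
os.write(fd, b"hello, mapped world")
os.close(fd)

with open(path, "rb") as f:
    # Map the file; afterwards it is addressed like an ordinary byte
    # buffer, with no read() syscall per access.
    with mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ) as m:
        first_word = bytes(m[0:5])  # plain slicing, no explicit I/O call

os.remove(path)
print(first_word)
```

This is only the single-process case; the sharing and flushing behavior discussed elsewhere in the thread is where the bigger wins (and complications) show up.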

Yousuf Khan
 
Bill Davidsen said:
Think about the overhead of resetting the segment register before
using each memory access. Does that sound like a practical thing?
Segments protect processes from one another, not a process from
itself.

No, segments could save processes from themselves too. There are separate
code and data segments.

Why would you need to reset the segment register for every memory access?
The data segment would be cached in the DS register, the code in CS
register, and the stack in the SS, plus two extra DS-like segment registers
in the form of ES and FS.

Yousuf Khan
 
There is no way I'm moving to WinXP, much less SP2. At home I've
stopped with Win2K SP2 and will go no further. At work I'm
required to have SP4. It's their machine, they can do what they
wish with it. I'm not agreeing to be a M$uck slave though.

The important thing though is that most Joe-users WILL run WinXP (if
they aren't already), and they probably will get the latest service
pack. Maybe not right away, but slowly but surely the older systems
will be phased out. With SP2 users will be protected by firewalls by
default, they will have hardware and/or software protection to reduce
the effects of buffer overruns, and their machines will automatically
get the latest updates unless they are specifically disabling this
feature.

Not a sure-fire fix by a long shot, but it should be a BIG improvement
from the current situation.
 
Not so, there are existing applications which are already running out of
address space when they mmap() large files. This is a technique which is
a big win if you have a bunch of processes sharing a file, although it
makes little difference over file i/o if you have a single process and
don't have all the flushing and locking issues.

Also note that as real memory approaches the 2GB line, virtual memory is
still "interesting". The normal machine with 2GB will be here long before
the 2010 date Intel fobs off as fact.
 
firewalls. My machine was attacked, and shut down, even though I
had the AV program up-to-snuff, and all applicable M$uks patches
installed.

Wat??? U actually trusted M$ software to protect u???? Keith, u r
getting senile! :pPpPp

A good software firewall + hardware firewall are essential nowadays
against these worms & trojans. AV doesn't work since they are like
lawyers, costs you more money than they are worth and only useful
AFTER the accident. I haven't had an AV program for years and I'm on
Win2K too.

--
L.Angel: I'm looking for web design work.
If you need basic to med complexity webpages at affordable rates, email me :)
Standard HTML, SHTML, MySQL + PHP or ASP, Javascript.
If you really want, FrontPage & DreamWeaver too.
But keep in mind you pay extra bandwidth for their bloated code
 
Yousuf Khan said:
I would assume this would become more of the norm rather than the exception
once 64-bit is entrenched. There are big advantages to accessing files with
a mmap function rather than the traditional file i/o functions. Not the
least of which is that accessing mmapped files is just like accessing
memory, and secondly you bypass having to do a context switch to the kernel
at several levels.

I don't disagree with you at all. 2 GB is generally plenty big for most
small to medium level applications which is the majority of what people use
today.
 
Judd said:
I don't disagree with you at all. 2 GB is generally plenty big for
most small to medium level applications which is the majority of what
people use today.

What I meant was that even if the processor's memory doesn't go past 4GB
on early 64-bit desktops, applications which create greater-than-4GB
datafiles will still benefit from the 64 bits. Not the least of which is
the ability to open super-4GB files using mmap rather than standard file
i/o.
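The point can be made concrete with a sparse file (a sketch; it assumes a 64-bit OS and Python build and a filesystem that supports sparse files, and the sizes are illustrative): a file just over 4GB maps in one piece, something no 32-bit process could address at once.

```python
import mmap
import os
import tempfile

GB = 1 << 30
fd, path = tempfile.mkstemp()

# Make a file just over 4 GB in apparent size; it is sparse, so
# almost no disk blocks are actually allocated.
os.lseek(fd, 4 * GB, os.SEEK_SET)
os.write(fd, b"\x7f")
os.close(fd)

size = os.path.getsize(path)  # 4 GB + 1 byte

with open(path, "rb") as f:
    # On a 64-bit system the whole file fits in one flat mapping;
    # a 32-bit process has no way to address this range at once.
    with mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ) as m:
        last_byte = m[size - 1]

os.remove(path)
print(size, last_byte)
```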

Yousuf Khan
 
a?n?g?e? said:
Wat??? U actually trusted M$ software to protect u????

No, the CIO thinks it's good enough (though rumors have it that
we're all Linux by YE 2005; yeah, right). I do what I'm told
(for at least another week or two).
Keith, u r getting senile! :pPpPp

I have been told that before. Do you know my wife? ;-)

I am going to try to kick the WinHabit at home for SuSE next week
(give or take bringing up a new system, and some learning), so
perhaps I am nuts. ;-)
A good software firewall + hardware firewall are essential nowadays
against these worms & trojans. AV doesn't work since they are like
lawyers, costs you more money than they are worth and only useful
AFTER the accident. I haven't had an AV program for years and I'm on
Win2K too.

Well, the CIO dictates what we use. Our network is *supposed* to
be trusted. (yeah, right!) ...with *MORONS* using OE and opening
trash?! A hardware firewall at work is not going to happen
either. That's all under the shadow of the networking folks.

At home things are a little different. Note that I've never been
infected in either place though.
 
@twister01.bloor.is.net.cable.rogers.com>, news.20.bbbl67@spamgourmet.com says...
What I meant was that even if the processor's memory doesn't go past 4GB
on early 64-bit desktops, applications which create greater-than-4GB
datafiles will still benefit from the 64 bits. Not the least of which is
the ability to open super-4GB files using mmap rather than standard file
i/o.

Don't forget the memory that's been committed (to even suspended
tasks and the ever-present M$ memory leaks). A 2GB virtual space
isn't a lot with even a 1GB real system! Of course Intel and
apologists won't admit that a larger address space is needed
until Intel offers it to the masses (and that will be *LONG*
before the claimed 2010).
 
No, it's vapor. Until it's on the shelves of CompUSA and prepackaged from
vendors, it's vapor. Your statements would be considered wholly dishonest
from the consumer POV.

It is *not* vapor to developers right now and its non-appearance at
CompUSA is not *known* to be for technical reasons. To say it's not
"built" is rubbish by any acknowledged software standards.
You bet it's cheaper! Intel in comparison to AMD is not cheaper, but when
Intel releases 64-bit products, the 32-bit products will be cheaper (prices
slashed, etc.). You are comparing apples to oranges.

We'll see when it happens and if so, how that stands up. Previous attempts
at selling crippled, intentionally and otherwise, hardware, e.g. 386SX,
486SX, Celeron, etc. have had mixed results.
Uh, the average consumer doesn't know jack. This is why marketing is so
important. They don't care so long as it sounds good ("That thang got a
hemi!") and runs their software better. You are using your considerable
intelligence of computers as a benchmark for what others know. I'm just
letting you know that it's unrealistic.

As I've tried to get across, a large part of the consumer market is now
"upgrade"... == wiser. The consumer market is becoming increasingly
sophisticated in its appreciation... and demands. Digital cameras,
digi-videocams, home media centers/networks, etc will do that.
I don't know when dual core will get here. They surprised the heck out of
me with the release announcement of the EM64T. I figured that was 2005 at
the earliest. Funny how things change. If AMD is pushing dual core, then
Intel will release it earlier than you expect.

You have definitely not been paying attention then. The 64-bit nature of
Prescott has been exposed for over a year now:
http://www.chip-architect.com/. It is also interesting that AFAIK we have
not even been given a clue on its performance in 64-bit mode.

Rgds, George Macdonald

"Just because they're paranoid doesn't mean you're not psychotic" - Who, me??
 
As I've tried to get across, a large part of the consumer market is now
"upgrade"... == wiser. The consumer market is becoming increasingly
sophisticated in its appreciation... and demands. Digital cameras,
digi-videocams, home media centers/networks, etc will do that.

I work with a lot of families through high/middle school connections,
here in Silicon Valley, and it's my experience that the bulk of
computer users really don't have a clue what's in their box. They
know the brand name (HP, Dell, whatever) and (generally) the OS, but
don't have any idea the CPU speed, AMD or Intel, even how much RAM or
how big a HD they have. They get advice from a salesman at Best Buy
or a friend who reads PC Mag, buy it, and that's it until it quits
working right and they call someone to help.

Sure, the techies know that stuff, but there are so many more people
who still have all the vendor-installed icons on their desktop (Free
AOL Trial!) a year after they bought it because they don't know what
they can delete or not, or even that they *can* delete it. IME, the
bulk of PC consumers today have no more idea what's inside their PC
than they do what's inside their TV.


Neil Maxwell - I don't speak for my employer
 
KR Williams said:
Don't forget the memory that's been committed (to even suspended
tasks and the ever-present M$ memory leaks). A 2GB virtual space
isn't a lot with even a 1GB real system! Of course Intel and
apologists won't admit that a larger address space is needed
until Intel offers it to the masses (and that will be *LONG*
before the claimed 2010).

Well, obviously Intel has already admitted to it, witness EM64T.

Yousuf Khan
 
In comp.sys.ibm.pc.hardware.chips Yousuf Khan said:
What I meant was that even if the processor's memory doesn't go past 4GB
on early 64-bit desktops, applications which create greater-than-4GB
datafiles will still benefit from the 64 bits. Not the least of which is
the ability to open super-4GB files using mmap rather than standard file
i/o.

You don't even need to be able to go as far as individual super-4GB files...
just over 3GB of files and physical memory in total (2GB on Windows). Not
to mention that given how most OSes today handle physical memory, things
start getting kludgy after a gig or two, rather than at 4GB.
 
In comp.sys.ibm.pc.hardware.chips KR Williams said:
Also note that as real memory approaches the 2GB line, virtual memory is
still "interesting". The normal machine with 2GB will be here long before
the 2010 date Intel fobs off as fact.

My own suspicion is that a lot of power users will be going to 2GB as soon
as prices fall back to where they were, oh, 6 months ago. I know enough
people building 1GB machines as a matter of course even with the much higher
memory prices that as soon as they drop back, I suspect they'll start
doubling the RAM rather than saving the money.

Ditto for regular users going to 1GB. Next big price drop after that, we'll
be at 2GB.
 
I work with a lot of families through high/middle school connections,
here in Silicon Valley, and it's my experience that the bulk of
computer users really don't have a clue what's in their box. They
know the brand name (HP, Dell, whatever) and (generally) the OS, but
don't have any idea the CPU speed, AMD or Intel, even how much RAM or
how big a HD they have. They get advice from a salesman at Best Buy
or a friend who reads PC Mag, buy it, and that's it until it quits
working right and they call someone to help.

Sure, the techies know that stuff, but there are so many more people
who still have all the vendor-installed icons on their desktop (Free
AOL Trial!) a year after they bought it because they don't know what
they can delete or not, or even that they *can* delete it. IME, the
bulk of PC consumers today have no more idea what's inside their PC
than they do what's inside their TV.

Like I said, there's a difference between first-timers, which you seem to
be talking about (high/middle school), and the 2nd-time-arounders. Go look at
some of the forums on digital cameras, video processing etc. Then there's
the kids who are into network parties and the likes - I see them getting up
to speed pretty quickly. The difference here is that where the motivation
for owning a computer is to do something useful with it, rather than just
because they want to have a computer, people learn quite quickly and know
what they want.

Rgds, George Macdonald

"Just because they're paranoid doesn't mean you're not psychotic" - Who, me??
 
Wat??? U actually trusted M$ software to protect u???? Keith, u r
getting senile! :pPpPp

It's become fairly clear that when M$ does its monthly "update service"
that within a week or two, there will be new worms, trojans etc. released
which exploit the latest announced bug.... err, vulnerability.
A good software firewall + hardware firewall are essential nowadays
against these worms & trojans. AV doesn't work since they are like
lawyers, costs you more money than they are worth and only useful
AFTER the accident. I haven't had an AV program for years and I'm on
Win2K too.

It depends a lot on the individual user and the network environment their
computer lives in. Many people don't know enough to protect themselves:
I've seen people get infected because they didn't know enough to realize
that when they hovered the mouse cursor over a URL in their e-mail (spoofed
from a friend) the URL actually pointed to a local attached file. Another
common way to get infected is where people take notebooks home every
evening - they bring the worm back with them the next day, thus bypassing
the company firewall.

You don't have to be stupid to get infected but it helps. There are, of
course, people who think that clicking LiveUpdate first thing every morning
is total protection; the same guys often indulge in all kinds of risky
behavior, with "enhanced search/toolbar" doohickeys and peer-to-peer crap
software.... because they think they're "protected".

Rgds, George Macdonald

"Just because they're paranoid doesn't mean you're not psychotic" - Who, me??
 