I really have no idea what % loss this is for "most office
tasks". You might very well be correct if said tasks are
not memory bound and sufficient cache makes CPU speed the
most important factor.
However, I'd hate to have a machine that bogged down horribly
on memory-bound tasks (photo editing?)
True enough, I'm not saying that integrated video is for everyone,
just that it isn't nearly as bad as it was at one time and for a LOT
of people it's well past the "good enough" point.
As a point of note though, it's really more a question of 2D graphics
vs. 3D stuff where you REALLY hit the difference, not memory-bound
tasks.
The good reason is probably cost.
Of course it's cost, though in more ways than just up-front cost.
Having integrated video also tends to simplify things for IT
departments in that it reduces the number of variables in their
systems. Just one type of video card across a whole line of
computers makes things a lot easier when looking at drivers and
images.
Do you have benchmarks
on the speed loss from integrated video?
I have seen some, though I'm having trouble tracking down any recent
ones that really compare integrated video to non-integrated video on
anything other than games. Here's one old example:
http://www.realworldtech.com/page.cfm?ArticleID=RWT110500000000
Intel's i810 and i815 chipsets, along with nVidia's first nForce
chipset, were some of the first integrated video chipsets where the
performance hit wasn't all that great. Later chipsets have closed the
gap even further, primarily through new techniques that greatly reduce
how often the graphics core needs to go out to main memory.
Of course, the flip side to this coin is that there are some add-in
video cards that are quite respectable and sell for dirt-cheap these
days, so it's still tough to justify integrated video to anyone other
than real penny-pinchers.