marsha said:
No, this is not how we work. No one user is dependent on another user's computer except when they need a printer that is local to that other user. Three of the users have local unique printers that other users will use. But on the rare occasion that they need another user's local printer and that person is not there, they just put it off for a day. They never use files from another user's computer. What our specialty software does is access the data folder on that one "server" computer and put the changed data back on that one computer. That is it in a nutshell. The problem (the whole problem) is when one of the users cannot access that one "server" computer. The fix is to shut down the server computer and all the user computers, then turn on the server computer, let it come up completely, and then turn on the user computers. On a rare day, I will have to reset the passwords for one or more users on the "server" computer. For some reason, that seems to fix things so the users can reconnect and get their data.
That is pretty much it as far as our little office goes. Each of the users connects to a switch via Ethernet. It is a D-Link DSS superswitch. That connects to another D-Link DSS superswitch, which connects to the DSL modem and the "server" computer. There you have it. That's it as far as our system goes.
Everyone seems to be telling me that our system should handle more than it is. But when users can't connect, they get very aggravated. We think that the special software we use is at fault. We have talked to many other companies doing what we do, and they have had similar problems with our software, made the switch to a better-known package, and are now happy. I guess we will have to do the same, but since everyone has told me that a peer-to-peer network set up the way ours is should be able to handle more concurrent connections, I thought I would try to make it work.
Ok. Obviously it's impossible for me to evaluate your software, or whether it actually contributes to the problems. I would have to examine it more closely and know much more about it to even begin to give relevant feedback.
BUT, there does seem to be one consistent and explainable problem, and that's the connection limit. If you're telling me that rebooting the system(s) gets ppl connected again, then it's VERY LIKELY you're running into the XP connection limits (XP Pro allows only 10 concurrent inbound connections, XP Home only 5). To repeat, MS purposely limits concurrent connections so ppl don't try to turn a peer-based network into a (pseudo) client-server network. When you reboot your PC, you are effectively clearing all your open network connections. I suspect this clears up the problem because the very first thing clients do after a reboot is attempt to connect to the application. Since little if anything else has consumed other network connections since that reboot, the chances of success are greatly increased. But then over time, the various peers do other work, connecting and disconnecting, including the peer w/ the shared files. Eventually, inevitably, some time down the road, either the client-peer or the server-peer (for lack of better terms) exhausts its connections, and you're stuck again.
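
If you want to confirm that's what's happening, watch the connection count on the "server" PC as it creeps toward the limit. Here's a rough Python sketch of the idea -- purely an assumption that Python and the psutil package are even available on that box; the built-in "net session" command will show you much the same thing for file-sharing sessions:

# count_smb.py - rough sketch: count established inbound file-sharing
# connections on the "server" peer so you can see how close it is to
# the XP limit. Assumes the psutil package is installed.
import psutil

SMB_PORTS = {139, 445}  # NetBIOS session service / direct-hosted SMB

def count_inbound_smb():
    count = 0
    for conn in psutil.net_connections(kind="tcp"):
        # only count fully established connections terminating on the
        # local file-sharing ports
        if (conn.status == psutil.CONN_ESTABLISHED
                and conn.laddr and conn.laddr.port in SMB_PORTS):
            count += 1
    return count

if __name__ == "__main__":
    n = count_inbound_smb()
    print("Established file-sharing connections: %d (XP Pro caps these at 10)" % n)

Run it on the server-peer when ppl start getting refused; if that number is bumping up against 10, you've found your culprit.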
Trying to manage these network connections w/ XP is very difficult because you can use up network connections in very subtle ways. A typical PC is opening, using, and dropping network connections all the time, with a variety of software applications. Just running Internet Explorer can open FOUR (4) network connections alone! More and more applications these days are using the network for program updates, product activation, whatever, so it doesn't take much to reach that 10-connection limit on even a modest system. The situation is exacerbated when you try to turn one of those "peers" into a quasi-server, since now *it* becomes the target of MANY concurrent connection requests. IT JUST WON'T WORK WITH XP! Yeah, you can get away w/ it when the application is rarely accessed, or the likelihood of concurrent access is low. But that's not what typically happens as the environment grows. That one machine is being asked to act as a "server", but it just doesn't have the capacity (at least in terms of network connections, and perhaps other limitations as well, like processing power, access control, etc.).
If you wanted to continue stretching the current peer architecture, you could perhaps spend time meticulously trying to limit network connections, on both the client-peers and the server-peer. It's a hassle, and not always easy to do, but you might be able to improve the situation somewhat w/ careful analysis. For example, on the client-peers, it's possible to make a registry change to limit Internet Explorer connections. So if a user visits their homepage, instead of Internet Explorer opening up four connections to download the webpage, graphics, etc., you might limit it to only one or two. Obviously this would impact the performance of Internet Explorer, but it would save some network connections for your primary application.
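
The settings involved are the documented MaxConnectionsPerServer / MaxConnectionsPer1_0Server values under HKEY_CURRENT_USER. You can set them w/ regedit, a .reg file, or a little script; here's a rough Python sketch, assuming Python is handy on the client-peers and using a limit of 2 purely as an example:

# limit_ie.py - rough sketch: cap the number of parallel connections IE
# opens to any one web server. Applies per user; affects new IE sessions.
import winreg

IE_SETTINGS = r"Software\Microsoft\Windows\CurrentVersion\Internet Settings"

def limit_ie_connections(per_server=2):
    key = winreg.OpenKey(winreg.HKEY_CURRENT_USER, IE_SETTINGS, 0,
                         winreg.KEY_SET_VALUE)
    try:
        # HTTP 1.1 and HTTP 1.0 servers are governed by separate values
        winreg.SetValueEx(key, "MaxConnectionsPerServer", 0,
                          winreg.REG_DWORD, per_server)
        winreg.SetValueEx(key, "MaxConnectionsPer1_0Server", 0,
                          winreg.REG_DWORD, per_server)
    finally:
        winreg.CloseKey(key)

if __name__ == "__main__":
    limit_ie_connections(2)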
On the server-peer side, I wouldn't allow anyone to use that PC for foreground applications. Anyone using that PC as their desktop is almost assuredly going to consume additional network connections from time to time, and thus hinder access by the client-peers. You could also shut down unnecessary services on that machine to further conserve network connections. It's a little difficult for me to recommend specific services to shut down since I don't know which you must have, which are only niceties, and which are running needlessly. But it's not uncommon for ppl to have quite a few unnecessary services running. By default, XP turns a LOT of services ON so as to make the system highly functional. But in reality, many are not needed by all users, and running them only consumes valuable resources (including memory, CPU cycles, and network connections) that could otherwise be made available to other processes. There are many resources on the web that describe these services and which are most likely to be safe to disable for most ppl. If interested, I'll find a few of those sites, just let me know.
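
To get a quick inventory of what's running before you start disabling anything, something like this rough Python sketch will dump the running services and how they start -- again assuming psutil is installed; the Services control panel or "sc query" gives you the same picture:

# list_services.py - rough sketch: list the Windows services currently
# running, with their startup type, so you can pick candidates to disable.
# Assumes the psutil package is installed.
import psutil

for svc in psutil.win_service_iter():
    info = svc.as_dict()
    if info["status"] == "running":
        print("%-30s %-10s %s" % (info["name"], info["start_type"],
                                  info["display_name"]))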
But at a minimum, I suspect moving to NT, W2K, or W2K3 to get past your connection limits would be a good first step. If you want to save costs, or at least determine whether this is indeed the problem, you could try some of the techniques described above to see if you can improve the situation. If you do have success, then at a minimum you at least KNOW what one source of the problem is. But in the long run, I believe you will eventually succumb to needing a true server. You could save a few bucks by picking up a cheap copy of NT off eBay, if only to see if it helps (it can be had for the price of lunch). NT is no longer supported by MS, but many organizations still use it anyway. (btw, make sure it's NT *server*, not *workstation*). It might be worth spending a few bucks on NT as an experiment, or perhaps it will even suffice for a couple of years. Who knows. But it's an avenue I would consider in your circumstances, where cost may be a big factor.
HTH
Jim