You find the files either by searching with Google, or (in my
experience, certainly) by being in the relevant community which
publishes the torrent files in the first place. For instance, I might
be reading a review of a game, and there's a torrent of a demo
available. No need to search.
I finally found a definition of BitTorrent that was dumbed down enough for
me to understand why BitTorrent doesn't offer a search engine. The
definition reads:
"BitTorrent has no network in the sense of KaZaA or Napster - it's a
protocol. People or companies wanting to distribute a file essentially
create their own private P2P network which only consists of whoever is
downloading the file at the time. These miniature networks are formed around
a "tracker", which is a server program operated by the entity wishing to
share a file." (http://www.joestewart.org/p2p.html)
This means there is no "network" in the traditional P2P sense, and
therefore no way to search peers for files.
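The tracker model in the quoted definition can be sketched as a tiny rendezvous service: it knows nothing about file content or search, it only maps a torrent to the peers currently in its swarm. (This is a simplified illustration; a real tracker speaks HTTP with bencoded responses, and all names here are hypothetical.)

```python
# Minimal sketch of a BitTorrent-style tracker: no search, no global
# network -- just "who else is downloading this same file right now?"

class Tracker:
    def __init__(self):
        self.swarms = {}  # info_hash -> set of (host, port) peers

    def announce(self, info_hash, peer):
        """A peer announces itself and gets back the rest of the swarm."""
        swarm = self.swarms.setdefault(info_hash, set())
        others = [p for p in swarm if p != peer]
        swarm.add(peer)
        return others

tracker = Tracker()
tracker.announce("abc123", ("10.0.0.1", 6881))          # first peer, empty swarm
peers = tracker.announce("abc123", ("10.0.0.2", 6881))  # sees the first peer
```

Note that nothing here lets you ask "which torrents contain file X?" - that is exactly why finding the .torrent file is left to websites and communities.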
While a .NET DLL will certainly help for some clients, if you *require*
.NET you've just removed a huge bunch of users.
You are right. But, as the Microsoft worm turns, more and more users will
be able to download the framework. And, if the .Net framework is available
via the network, a simplified Win32 app could download it via the same
file-sharing network.
I'll also make the protocol public so that other people can code clients in
the language of their choice.
Yahoo doesn't require significant amounts. It's free and pretty easy -
less time consuming for most people than downloading and installing
.NET, for instance.
This is something I have posted about also. If Microsoft is really banking
on .Net going forward (and I believe they are), why in the world didn't
Microsoft think through its distribution better than it did?
The first thing I don't get with the .Net distribution is why the .Net
framework is not a critical update on the Windows Update site.
The second thing I don't get is this: the .Net framework has a neat feature
that allows a program run from a network location (intranet or internet) to
download only the DLLs it needs, as it needs them. Why wasn't this same type
of functionality built into the Windows OS (or even the .Net EXEs, or the
.Net setup for .Net applications in Visual Studio) for programs that need
the .Net runtime to run? If it were, simply running a .Net exe would
initiate a download and installation of only the .Net DLLs required to run
the .Net application.
Each PC would then trickle down the .Net framework as it is needed and would
avoid the massive 23+MB download.
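The trickle-down idea above amounts to lazy loading with a local cache: fetch each component only on first use, and never fetch it twice. A rough sketch (all names are made up for illustration; `fetch_from_network` stands in for a real download):

```python
# Sketch of "trickle-down" runtime installation: fetch each DLL on first
# use and cache it locally, instead of one monolithic 23+ MB install.

class LazyRuntime:
    def __init__(self, fetch_from_network):
        self.fetch = fetch_from_network
        self.cache = {}      # dll name -> bytes already downloaded
        self.downloads = 0   # count of actual network fetches

    def load(self, dll_name):
        if dll_name not in self.cache:
            self.cache[dll_name] = self.fetch(dll_name)
            self.downloads += 1
        return self.cache[dll_name]

runtime = LazyRuntime(lambda name: b"<contents of " + name.encode() + b">")
runtime.load("System.Xml.dll")
runtime.load("System.Xml.dll")   # second load hits the cache, no download
```

Over time each PC accumulates only the DLLs its applications actually touch, which is the "trickle" effect described above.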
Arguably, making people put a *bit* more effort into getting technical
support isn't a bad idea - it might mean that by the time they ask the
question they're interested in, they have a vested interest in asking
it in a useful way.
Technical support aside, the main reason for using the Microsoft Passport
service, as I understood it, was to have a single login for many websites.
Microsoft has charged so much for the service that this is not an option for
the majority of small businesses on the internet (which just happens to be a
majority of all businesses on the internet).
Sure. I'll remain cynical about it being a BT-beater until that happens
though... it's not like the world has been short of P2P apps and had to
make do with any old rubbish.
You should be cynical. I would be if I were in your shoes. And, Lord knows,
we have a few too many file-sharing applications with no real open standards
that anyone follows. Also, the massive volume of user requests that flows
through the major systems (like Gnutella and Kazaa) seems to slow them down
quite a bit.
There are good points to a lot of systems. One model uses distributed
servers: the distributor of the file-sharing software runs a main server
that only accepts connections from mini-servers run by individuals
interested in starting a file-sharing server for their particular interests.
This lets people connect to mini-servers that specialize in the type of file
they are looking for, which speeds searching and increases the likelihood of
finding your file.
The main server simply keeps a log of all mini-servers and their specialties
so that users can choose a mini-server to connect to. Connecting to more
than one at a time would be nice, but is not an essential feature.
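The main server described above is really just a directory of mini-servers keyed by specialty. A minimal sketch of that lookup (hostnames and specialties are invented for the example):

```python
# Sketch of the distributed-server model: the main server keeps only a
# directory of mini-servers and their specialties; users pick one to join.

class Directory:
    def __init__(self):
        self.servers = {}   # address -> specialty string

    def register(self, address, specialty):
        self.servers[address] = specialty

    def find(self, specialty):
        """Return the mini-servers that advertise this specialty."""
        return [addr for addr, s in self.servers.items() if s == specialty]

d = Directory()
d.register("jazz.example.net:7000", "jazz mp3s")
d.register("demos.example.net:7000", "game demos")
matches = d.find("game demos")
```

Since the main server never touches file content, it stays cheap to run, and search load is pushed out to the specialized mini-servers.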
Others use hashes to find exact file matches. While this has the added
benefit of finding exact matches for a file, it also has a drawback: a file
that is 99% or even 90% the same may suffice, or may be able to donate
portions of the file being sought. It also means a boatload of returns for a
search when most of the returns may be basically the same file. Absolute bit
precision is not usually needed in something like an MP3 or a movie trailer.
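One way to get the best of both, sketched below under the assumption of fixed-size chunks: hash a file per chunk rather than as a whole, so a nearly identical copy can still donate the chunks it shares with the file being sought. (The 4-byte chunk size is only for the toy example; real systems use chunks of many kilobytes.)

```python
import hashlib

# Sketch: per-chunk hashing lets a ~90%-identical file donate the
# chunks it has in common with the target, instead of matching all-or-nothing.

def chunk_hashes(data, chunk_size=4):
    return [hashlib.sha1(data[i:i + chunk_size]).hexdigest()
            for i in range(0, len(data), chunk_size)]

def shared_chunks(a, b, chunk_size=4):
    """Count positions where both files have an identical chunk."""
    ha, hb = chunk_hashes(a, chunk_size), chunk_hashes(b, chunk_size)
    return sum(1 for x, y in zip(ha, hb) if x == y)

original = b"AAAABBBBCCCCDDDD"
variant  = b"AAAABBBBXXXXDDDD"   # differs in one 4-byte chunk
```

A whole-file hash would report these two as unrelated; per-chunk hashing shows the variant can supply three of the four pieces.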
Another nice feature would be automatically disconnecting from a file server
if the download time exceeds a certain, user-supplied limit. Each user could
also have a ranking that indicates the number of downloads
initiated/completed, the same for uploads, and the average time online.
This would help users choose a more reliable file server, and should steer
more traffic toward the more reliable file servers.
And, why should you have to tell a file-sharing application to "find new
sources" when it knows it can't download the file from the original file
server (because it is no longer online, or whatever)? Shouldn't it just
search and find the file automatically (perhaps failing after X tries or Y
minutes without a match)?
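The ranking idea above could be as simple as a score combining completion rate with time online. The weights and the 24-hour cap below are purely illustrative assumptions, not a worked-out formula:

```python
# Sketch of a peer reliability score: completed transfers over initiated,
# plus a capped bonus for average time online. Weights are illustrative.

def reliability(initiated, completed, avg_hours_online):
    if initiated == 0:
        return 0.0   # no history yet -> rank at the bottom
    completion_rate = completed / initiated
    # Cap the uptime bonus so a long-running but flaky peer can't dominate.
    uptime_bonus = min(avg_hours_online / 24.0, 1.0)
    return 0.8 * completion_rate + 0.2 * uptime_bonus

# A peer that completes everything and is always online scores ~1.0;
# a brand-new peer with no history scores 0.0.
```

The client could then prefer high-scoring peers automatically, and also drive the "retry other sources" behavior: when a transfer stalls, drop the peer and move down the ranked list.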
The biggest change could come in the way we share and retrieve files.
Current methods usually result in many partial downloads because of
file-servers going on- and off-line. Since the very nature of a peer file
server is transient, this should be taken into consideration when designing
the system. Thus far, (IMHO) it has not been addressed in any meaningful
manner.
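Designing for transient peers mostly means making partial downloads first-class: track which chunks have arrived, and resume from any peer that has the rest. A minimal sketch of that bookkeeping (names are hypothetical):

```python
# Sketch: a chunk map makes a partial download resumable from any peer,
# instead of restarting when the original file server goes offline.

class PartialDownload:
    def __init__(self, num_chunks):
        self.chunks = [None] * num_chunks   # None = not yet received

    def receive(self, index, data):
        self.chunks[index] = data

    def missing(self):
        """Indexes still needed -- exactly what to request from a new peer."""
        return [i for i, c in enumerate(self.chunks) if c is None]

    def complete(self):
        return not self.missing()

dl = PartialDownload(4)
dl.receive(0, b"part0")   # from the first server, before it went offline
dl.receive(2, b"part2")   # from a different peer, later
```

When a server disappears, nothing is lost: the client just asks the next source for `missing()` rather than for the whole file.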
To be sure, it will not be an easy task. But, especially if files that are
placed onto the network by companies trying to distribute demos and what-not
can be shared easily and securely, the network may be able to generate
revenue from those companies. It is imperative to have a stream of income
if the system is to remain viable and advancing. (Only kids can code for
free... and, even then, not for long.)
I do have 2 other small businesses to run, but I hope to find time to get a
BETA out by Dec 24th.