Linux automatic gnutella client brain dump

I've been thinking about this all day. It seems somewhat doable.

I have a need for a Linux gnutella client that would function the way wget does. This would be on a private gnutella network, and the filenames on this network are guaranteed to be the same among all the servers.

Before I start working on something like this, I'd like some feedback on the idea (and to hear whether something like it already exists).

--begin brainstorm--
For example, the Linux boot disk would boot up with root on a ramdisk and fire off the equivalent of "gnutellaget foo101.img | dd of=/dev/hda". This would let us image machines from images that do not reside on a central server, but in a gnutellanet server cloud. Two or more servers would be "authoritative" for an image, for redundancy.
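Roughly, I picture the gnutellaget piece starting out as a small sketch like this. resolve_sources() is made up here; it stands in for the query/query-hit round trip, and since the actual file transfer is just an HTTP GET the rest is streaming to stdout:

#!/usr/bin/env python3
# Rough sketch of "gnutellaget": stream a fetched image to stdout so it can be
# piped straight into dd.  resolve_sources() is hypothetical -- it stands in
# for the Gnutella query/query-hit exchange and just hands back candidate URLs.
import sys
import urllib.request

def resolve_sources(filename):
    """Hypothetical: ask the server cloud who has `filename`, return URLs."""
    raise NotImplementedError("wrap the Gnutella query here")

def fetch_to_stdout(filename, chunk_size=64 * 1024):
    for url in resolve_sources(filename):
        try:
            with urllib.request.urlopen(url, timeout=15) as resp:
                while True:
                    chunk = resp.read(chunk_size)
                    if not chunk:
                        return True          # whole file streamed
                    sys.stdout.buffer.write(chunk)
        except OSError:
            continue                         # dead source, try the next one
    return False

if __name__ == "__main__":
    if not fetch_to_stdout(sys.argv[1]):
        sys.exit("no source could serve " + sys.argv[1])

One obvious hole: if a source dies mid-stream this restarts from byte zero, which would corrupt the dd pipeline. A real version would resume with a Range header from the bytes already written, which is really the load-balancing idea below.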

If the client could perform Gnucleus-style load balancing and still maintain the ability to send the output to stdout, that would be even better. It would also allow one of the servers to die during the transfer and another to step up and fill in the gap.
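Something like this is what I have in mind for that part: hand out pieces round-robin across the known sources, let any other source pick up a piece when one dies, and keep writing in order so the output can still go to stdout (piece size and names are invented):

# Sketch of round-robin range fetching with failover.  `sources` would come
# from the query step above, `total_size` from the query hit.
import sys
import urllib.request

def fetch_range(url, start, end):
    req = urllib.request.Request(url, headers={"Range": "bytes=%d-%d" % (start, end)})
    with urllib.request.urlopen(req, timeout=15) as resp:
        return resp.read()

def balanced_fetch(sources, total_size, piece=1024 * 1024):
    offset = 0
    turn = 0
    while offset < total_size:
        end = min(offset + piece, total_size) - 1
        for attempt in range(len(sources)):
            url = sources[(turn + attempt) % len(sources)]
            try:
                sys.stdout.buffer.write(fetch_range(url, offset, end))
                break
            except OSError:
                continue            # that server died; let another fill the gap
        else:
            raise RuntimeError("all sources failed for bytes %d-%d" % (offset, end))
        offset = end + 1
        turn += 1                   # rotate which server gets the next piece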

The clients would not share files, and they would only connect to the server cloud, not to each other. A client would hold multiple connections into the server cloud, but it would not route messages between them.
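The receive loop on each cloud connection then boils down to something like the sketch below: keep the query hits that answer our own queries, and drop everything else on the floor instead of forwarding it. I'm going from memory on the 0.4 descriptor header (16-byte GUID, type, TTL, hops, 4-byte little-endian payload length), so treat the offsets as a guess; the handlers are stubs.

import struct

QUERY_HIT = 0x81

def read_descriptor(sock):
    header = sock.recv(23)               # fixed-size descriptor header
    if len(header) < 23:                 # (a real client would loop on recv)
        return None
    guid = header[:16]
    kind, ttl, hops = header[16], header[17], header[18]
    length = struct.unpack("<I", header[19:23])[0]
    payload = sock.recv(length) if length else b""
    return guid, kind, payload

def pump(sock, pending_queries, on_hit):
    while True:
        msg = read_descriptor(sock)
        if msg is None:
            return                       # connection closed
        guid, kind, payload = msg
        if kind == QUERY_HIT and guid in pending_queries:
            on_hit(payload)              # a response to one of our own queries
        # anything else (pings, other people's queries, pushes) is simply
        # dropped here -- never forwarded to our other cloud connections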

A problem I could see is the possibility of the server cloud becoming segmented. That would be bad, especially if our client made all of its connections to an isolated half and missed the other side.

A feature the server cloud might grow is data replication. A server could maintain a sliding-window statistic of images requested vs. query responses seen, to determine how popular an image is (i.e., 15 people asked for foo204.img and a machine routing the responses saw 2 distinct responders, vs. another case with 300 requests and 2 responses). Other servers could clone that image into temporary space to automatically spread the load while a particular image is popular. During a cloning transfer, the destination could immediately begin sharing the data it has already received; for example, if it is grabbing an image and has 100 KB of a 2 GB file, a client could grab that 100 KB from it combined with the rest from the other 2 servers.
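The popularity statistic could be as dumb as something like this; the window length and clone threshold are pulled out of thin air:

# Sketch of the sliding-window popularity statistic: per image, count queries
# seen recently against the distinct sources seen answering them, and flag
# images worth cloning when the ratio gets lopsided (the 300-requests,
# 2-responders case).
import time
from collections import defaultdict, deque

WINDOW = 15 * 60          # seconds of history to keep
CLONE_RATIO = 20          # requests per known source before we clone

class PopularityTracker:
    def __init__(self):
        self.requests = defaultdict(deque)   # filename -> timestamps of queries seen
        self.sources = defaultdict(set)      # filename -> distinct responding servers

    def _trim(self, filename, now):
        q = self.requests[filename]
        while q and now - q[0] > WINDOW:
            q.popleft()

    def saw_query(self, filename, now=None):
        now = now or time.time()
        self.requests[filename].append(now)
        self._trim(filename, now)

    def saw_response(self, filename, server):
        self.sources[filename].add(server)

    def should_clone(self, filename, now=None):
        self._trim(filename, now or time.time())
        demand = len(self.requests[filename])
        supply = max(len(self.sources[filename]), 1)
        return demand / supply >= CLONE_RATIO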

Comments, questions, gaping holes? Does anyone else want what I'm smoking? :)

Thanks.
