Old January 2nd, 2002
Stigeide

First, I know that a request with TTL=120 will (thank God) not survive - it was only meant as a comparison between the old, inefficient method and my Smart method. It is easier to compare the efficiency of two methods if they consume the same amount of bandwidth.

Anyway, you have to admit that the current method of sending searches out blindly is inefficient?

My Smart method would use two lists of hosts for each client:
One list to send and forward queries to.
This list should be cultivated with hosts that respond to your searches. That way you gain the great benefit of being close to hosts that host the files you want.
One list to receive queries from.
You should not care who is on this list, but you know that these hosts prefer your files, if they are using the Smart method.
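To make the two-list idea concrete, here is a minimal sketch in Python. All names (`SmartPeer`, `on_query_hit`, `on_incoming_query`) are my own; the post does not specify a data structure, so this just assumes the send list is ranked by recency of hits and capped at a fixed size:

```python
class SmartPeer:
    """Sketch of the two-list idea: one list of hosts we send queries
    to, one set of hosts we accept queries from."""

    def __init__(self, max_send=32):
        self.send_to = []       # hosts that have answered our searches, best first
        self.recv_from = set()  # hosts that chose us as a query target
        self.max_send = max_send

    def on_query_hit(self, host):
        # Cultivate the send list: a host that returned a hit moves to
        # the front, so hosts close to our "taste" are tried first.
        if host in self.send_to:
            self.send_to.remove(host)
        self.send_to.insert(0, host)
        del self.send_to[self.max_send:]  # keep the list bounded

    def on_incoming_query(self, host):
        # We don't care who is on this list; they picked us because
        # we answered one of their searches before.
        self.recv_from.add(host)
```

A host that keeps returning hits stays near the front of `send_to`, while hosts that never answer eventually fall off the end of the list.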

My claim is that this method will make searches much more efficient, because:
Searches are only sent to hosts that actually have files.
The probability that a host that sees your search will return a hit is higher, because those hosts are closer to you in "taste".
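The contrast with blind flooding can be sketched as a forwarding routine that sends only to the best-ranked hosts from the cultivated send list. This is a hypothetical illustration, not code from the post; `send` stands in for whatever network primitive the client actually uses:

```python
def forward_query(query, send_to, send, fanout=4):
    """Forward a query only to the top-ranked hosts from the
    cultivated send list, instead of flooding every neighbour.

    send_to: hosts ordered best-first (hosts that returned hits before)
    send:    callable(host, query) - assumed network send primitive
    fanout:  how many hosts to contact per query
    """
    targets = send_to[:fanout]  # only hosts likely to have matching files
    for host in targets:
        send(host, query)
    return targets
```

With blind flooding, every neighbour receives every query regardless of whether it has ever answered one; here the same bandwidth is concentrated on the few hosts most likely to return a hit.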

You can think of it as insiders and outsiders. The outsiders are freeloaders and send their requests to the insiders. The insiders send their requests to other insiders.
A cute picture:
http://www.geocities.com/stigeide/s.html

Peace!
Stig Eide