#4
June 13th, 2002
Eagle3Eyes
Guest
 
I have a cool suggestion

About the removal of candidate hosts in case of errors...

For one, offer more than one option:

Make one option the following exponential retry scheme:
First retry after one time unit. If that doesn't result in anything, then instead of deleting the candidate host (which generates a lot of extra search traffic!), increase its exponent, so the next retry happens after two units, then 4, 8, 16, 32, and so on; very quickly it would be days before the next attempt. If any attempt results in a connection, reset the exponent to zero. Give the user two settings: the exponent at which the candidate host is finally eliminated (forgotten), and how many hosts must be known below some minimum exponent before a new search is triggered.
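A minimal sketch of what I mean (Python just to be concrete; names like HostEntry, FORGET_EXPONENT and MIN_FRESH_HOSTS are made up for illustration, not taken from any actual client):

[code]
import time

class HostEntry:
    """One candidate host with its retry state (illustrative)."""

    def __init__(self, address, base=2.0, unit=1.0):
        self.address = address
        self.base = base              # adjustable back-off base
        self.unit = unit              # size of one time unit, in seconds
        self.exponent = 0             # grows by one per failed attempt
        self.next_try = time.time()   # earliest time of the next attempt

    def on_success(self):
        # A working connection resets the exponent to zero,
        # so a good host that hit a temporary limit recovers fully.
        self.exponent = 0

    def on_failure(self):
        # Instead of deleting the host, schedule the next attempt
        # base**exponent units from now: 1, 2, 4, 8, ... for base 2.
        self.next_try = time.time() + self.unit * self.base ** self.exponent
        self.exponent += 1

# Both thresholds would be user-configurable:
FORGET_EXPONENT = 32   # drop a host once its exponent reaches this
MIN_FRESH_HOSTS = 10   # trigger a new search only below this count

def prune_and_maybe_search(hosts, max_exponent=4):
    """Forget hopeless hosts; report whether a new search is needed."""
    hosts[:] = [h for h in hosts if h.exponent < FORGET_EXPONENT]
    fresh = sum(1 for h in hosts if h.exponent <= max_exponent)
    return fresh < MIN_FRESH_HOSTS   # True means: run a new search
[/code]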

How about that? The reason is that I see it all the time now: I successfully download a fragment and then I lose the wonderful high-speed host because it ran into a limit, and with the next manual search I get the same host back with a good connection again. Think about all the useless search traffic to be saved! Think about the efficiency of having full file-host lists again, like in 4.6 or so!

Admit it, folks, you love this idea too! Only one or two digits are needed to record the state of a host... and with base 2 it is at most 32 tests in about 4 gigaseconds (roughly 136 years).
Actually, make the base adjustable too, because I just realized that the last few tests would then happen only every 68, 34, 17, 9, 4, 2, and 1 years or so... that's not too efficient.

With base 1.5, every test pause is 50 percent longer than the last, which with a one-second unit packs all 32 tests into about ten days.
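You don't even need a spreadsheet; a few lines are enough to check the totals for any base (assuming a one-second unit and 32 tests, as above):

[code]
def total_wait(base, unit=1.0, tests=32):
    """Sum of all retry pauses: unit * (base^0 + base^1 + ... + base^(tests-1))."""
    return sum(unit * base ** k for k in range(tests))

for base in (2.0, 1.5):
    t = total_wait(base)
    print(f"base {base}: 32 tests take {t:,.0f} s (~{t / 86400:.1f} days)")

# base 2.0: 32 tests take 4,294,967,295 s (~49710.3 days)  -> about 136 years
# base 1.5: 32 tests take 862,878 s (~10.0 days)
[/code]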

Anyhow, you see how with three or four parameters (time unit, base, forget exponent, minimum fresh-host count) you get a whole world of options and efficiency?

cheers

P.S.: take a spreadsheet to think about it ...