#10, April 9th, 2002
Nosferatu (Daemon)

Some figures

OK, the statistical maths is very hard going and I'd probably get it wrong, and you probably wouldn't understand it (even if you do understand statistical maths!).

Using the Binomial calculator at
http://www.anu.edu.au/nceph/surfstat...me/tables.html

I can quickly plug in n=17 and p=0.6, as determined previously, and find the standard deviation: about 2.
So for the horror scenario where 40% of people disallow searches that aren't for some specific resource, and you aren't searching for that specific resource, we can say that:
5% of the time you get 10 hosts in under 13 tries
33% of the time you get 10 hosts in under 15 tries
67% of the time you get 10 hosts in under 19 tries
95% of the time you get 10 hosts in under 21 tries
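
If you want to check these figures without the online calculator, here is a rough Python sketch of the same arithmetic (the function name tries_bands is mine, and it just applies the standard binomial mean/standard-deviation formulas with thresholds at one and two standard deviations, nothing gnutella-specific):

  import math

  def tries_bands(hosts_wanted, p):
      # Expected number of tries to collect hosts_wanted hosts when
      # each try succeeds with probability p.
      mean = hosts_wanted / p                # e.g. 10 / 0.6 ~= 17
      n = round(mean)                        # plug-in n for the binomial sd
      sd = math.sqrt(n * p * (1 - p))        # binomial standard deviation
      # Thresholds at mean - 2sd, - 1sd, + 1sd, + 2sd, as used above.
      return mean, sd, [round(mean + k * sd) for k in (-2, -1, 1, 2)]

  mean, sd, bands = tries_bands(10, 0.6)
  print(round(mean, 1), round(sd, 2), bands)
  # prints: 16.7 2.02 [13, 15, 19, 21], the figures above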

For a still fairly bad situation where only 20% of people disallow those searches, plugging in n=12 and n=13 (the mean is 12.5) and p=0.8, as determined earlier, gives a standard deviation of about 1.4, so:
5% of the time 10 hosts in under 10 tries
33% of the time 10 hosts in under 11 tries
67% of the time 10 hosts in under 14 tries
95% of the time 10 hosts in under 15-16 tries
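
The same sketch reproduces this second set of figures:

  mean, sd, bands = tries_bands(10, 0.8)
  print(round(mean, 1), round(sd, 2), bands)
  # prints: 12.5 1.39 [10, 11, 14, 15]

(The sd comes out 1.39 here because round(12.5) picks n=12; n=13 gives 1.44, hence roughly 1.4 either way.)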

I couldn't find any online application that would graph these outcomes in a useful way.

I wonder whether any of the big commercial vendors have their own gnutella network modellers. If so, they could figure out better what would happen. I guess they wouldn't tell us, though.

I wonder if there is a project yet to write a gnutella network model? It would be useful for exploring proposed protocol modifications, and I guess not much different from writing a client.

The hard part would be writing analysis routines to make the data meaningful.
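
For what it's worth, the crudest possible version of such a model fits in a few lines, if you reduce "the network" to one coin flip per host, which is all my figures above assume anyway. This is a toy sketch under that assumption, not a real client or modeller:

  import random
  import statistics

  def tries_until(hosts_wanted=10, p=0.6, rng=random.Random(42)):
      # Count queries until hosts_wanted hosts have answered, each host
      # answering independently with probability p.
      hosts = tries = 0
      while hosts < hosts_wanted:
          tries += 1
          if rng.random() < p:
              hosts += 1
      return tries

  runs = sorted(tries_until() for _ in range(10000))
  print("median tries:", statistics.median(runs))
  print("5% of runs done within", runs[len(runs) // 20], "tries")
  print("95% of runs done within", runs[19 * len(runs) // 20], "tries")

A real modeller would also need topology, TTLs and query routing on top of this, which is exactly where those analysis routines would earn their keep.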

Nos