Thread: downloading
Old July 20th, 2001
Unregistered (Guest)
 

I find that when trying to download files using Gnucleus, I have about a 1 in 10 chance of actually getting the file. Most of the files I try to download say 'Refused' or 'Pending', or the download vanishes immediately after saying 'Checking Firewall'.

Is this because I need to meet specific requirements in order to download the majority of people's files? Such as needing to share (n) files, or (x) bytes of data, or have (z) KB of free bandwidth?

I have downloaded relatively large files (a couple of hundred MB) using various GNUtella clients. But generally, a similar thing occurs with the other GNUtella clients, in that only about 1 in every 10 files I try to download is actually successful.

Also, to the Gnucleus developer: why don't you put...

1) Segmented downloading into Gnucleus, just like GetRight does. This way people with fast connections can download files more quickly.
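Roughly, the segment math for (1) could look like this (a Python sketch for illustration only; since Gnutella transfers are plain HTTP, each segment would presumably map to an HTTP byte-range request, and the function name here is made up):

```python
def split_ranges(file_size, segments):
    """Split a file of file_size bytes into contiguous (start, end)
    byte ranges, one per segment, for parallel downloading."""
    base = file_size // segments
    ranges = []
    start = 0
    for i in range(segments):
        # Give any remainder bytes to the last segment.
        end = start + base - 1 if i < segments - 1 else file_size - 1
        ranges.append((start, end))
        start = end + 1
    return ranges

print(split_ranges(100, 4))  # -> [(0, 24), (25, 49), (50, 74), (75, 99)]
```

Each (start, end) pair would then go to a separate connection, and the pieces get stitched back together on disk.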

2) When a file has more than one host, it should download from one host while simultaneously attempting to download the same file from the various other hosts (just to see which one is the fastest); if it finds any host which is faster than the current one, it'll switch the download over to the new one.
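The core of (2) is just keeping whichever host moved the most bytes over the same probe window (a hypothetical Python sketch; the host names and the probing itself are assumptions, not anything Gnucleus does):

```python
def fastest_host(samples):
    """Pick the host with the highest measured throughput.
    samples maps host -> bytes received over the same probe window."""
    return max(samples, key=samples.get)

# Probe each candidate host briefly, then switch the main
# download to whichever one proved fastest.
probes = {"hostA": 12_000, "hostB": 48_000, "hostC": 30_000}
best = fastest_host(probes)  # -> "hostB"
```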

3) I don't know if you can do this already, but it would be a very powerful thing to have: a type of script where users could simply type a set of commands and leave their computer on overnight, or even for days, and it would get what they're looking for.

The script format could be something like:

----script.txt---

cfg.max_hunts = 3; // Max. of three simultaneous hunts taking place.

HUNT("my hunt 1")
{
find = "any search string";
limit.size_min = 100M; // k/K suffix means kilobytes, m/M means megabytes, etc.
limit.size_max = 500M;
limit.speed_max = 16k; // 16 KB/sec max download speed.
timeout = 5m; // after 5mins of no success
retry = 1h; // retry search'n'download in 1 hour's time.
files = 3; // download 3 different files which meet the above criteria.
}

HUNT("my hunt 2")
{
find = "another search string";
}

--EOF--

Then the script could just be loaded in and run.
In English, what the above script would do is simply run the first three hunts in the script. There happen to be only two, so it'll just do them both. The first hunt, called "my hunt 1", would search for the string "any search string", with a size range of 100 MB to 500 MB and a speed range of 0 KB/sec to 16 KB/sec. After five minutes of no success*, it aborts the hunt and retries 1 hour later.

However, if successful, it'll download 3 different files which meet the above criteria. So if you leave the script running overnight, you could wake up in the morning with three movies of the same thing, which happen to be different sizes/formats. You could add an additional check called 'tolerance', specified most likely in megabytes, which would prevent two files being too close in size, because a lot of the movies I've seen tend to be, say, 100 MB, 101 MB, 105 MB, etc.

*success = defined as actually getting a file to begin downloading.
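Putting the size limits and the 'tolerance' idea together, the file-picking step of a hunt might look something like this (a Python sketch of the proposed behaviour only, not anything Gnucleus actually implements; all names are made up):

```python
def pick_files(results, size_min, size_max, count, tolerance):
    """From search results (a list of (name, size_bytes) pairs), pick
    up to `count` files inside [size_min, size_max] whose sizes differ
    from every already-picked file by at least `tolerance` bytes."""
    picked = []
    for name, size in results:
        if not (size_min <= size <= size_max):
            continue
        if any(abs(size - s) < tolerance for _, s in picked):
            continue  # too close in size: probably the same movie
        picked.append((name, size))
        if len(picked) == count:
            break
    return picked
```

With the 100 MB / 101 MB / 105 MB example above and a tolerance of, say, 4 MB, it would keep the 100 MB and 105 MB files and skip the 101 MB one as a likely duplicate.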

Also, in parallel, there would be a second hunt going on, "my hunt 2", which is a really basic hunt: it's just a search. The other parameters would be treated as defaults, i.e.:

limit.size_min = 0;
limit.size_max = 'INFINITY';
limit.speed_min = 0;
limit.speed_max = 'INFINITY';
timeout = 'NEVER TIMEOUT';
retry = 'RETRY IMMEDIATELY'; // this wouldn't even occur unless there is a timeout; but if someone specifies a timeout and no retry, it'll retry straight after timing out.
files = 1; // It'll just download one file.
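Filling in those defaults is really just a dictionary merge (again a Python sketch under the same assumptions; the key names mirror the script fields above):

```python
import math

# Defaults applied to any hunt that doesn't set these fields itself.
HUNT_DEFAULTS = {
    "size_min": 0,
    "size_max": math.inf,
    "speed_min": 0,
    "speed_max": math.inf,
    "timeout": None,   # never time out
    "retry": 0,        # retry immediately (only relevant after a timeout)
    "files": 1,        # just download one file
}

def with_defaults(hunt):
    """Return the hunt's settings with unspecified keys filled in."""
    return {**HUNT_DEFAULTS, **hunt}

hunt2 = with_defaults({"find": "another search string"})
```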

Whaddya think guys, do you think it would be kewl to have a script in Gnucleus?

ps: I would like to help out on these GNU programs. But I just don't have any spare time in my busy game dev. schedule, I can only really suggest ideas.

/Ichidan - ichidan256@yahoo.com