Gnutella Forums


Unregistered July 20th, 2001 03:13 AM

downloading
 
I find that when trying to download stuff using Gnucleus, I have about a 1 in 10 chance of being able to download the file. Most of the files I try to download say 'Refused' or 'Pending', or the download vanishes immediately after saying 'Checking Firewall'.

Is this because I need to meet specific requirements in order to download the majority of people's files, such as needing to share (n) files, or (x) bytes of data, or (z) Kbytes of free bandwidth?

I have downloaded relatively large files (a couple hundred megs) using various Gnutella clients. But generally, a similar thing occurs with the other Gnutella clients, in that only about 1 in every 10 files I try to download is actually successful.

Also to the Gnucleus Developer: Why don't you put...

1) Segmented downloading into Gnucleus, just like GetRight does. This way people with fast connections can download stuff more quickly ;)

2) When a file has more than one host, it should download from one host and at the same time keep attempting to download the same file from the various other hosts (just so it can see which one is the fastest); then, if it finds any host that is faster than the current one, it'll switch downloading to this new one.

3) I don't know if you can do this already, but it would be a very powerful thing to have: a type of script where users could simply type a set of commands and leave their computer on overnight, or even for days ;), and they would get what they're looking for.

The script format could be something like:

----script.txt---

cfg.max_hunts = 3; // Max. of three simultaneous hunts taking place.

HUNT("my hunt 1")
{
find = "any search string";
limit.size_min = 100M; // k/K suffix means kilobytes, m/M means megabytes, etc.
limit.size_max = 500M;
limit.speed_max = 16k; // 16 Kbyte/sec max download speed.
timeout = 5m; // after 5 mins of no success
retry = 1h; // retry the search'n'download in 1 hour's time.
files = 3; // download 3 different files which meet the above criteria.
}

HUNT("my hunt 2");
{
find = "another search string";
}

--EOF--

Then the script could just be loaded in and run.
In English, what the above script would do is simply run the first three hunts in the script; there happen to be only two, so it'll just do them both. The first hunt, called "my hunt 1", would first search for the string "any search string", with a size range of 100M to 500M and a speed range of 0 KB/sec to 16 KB/sec. After five minutes of no success*, it aborts the hunt and retries an hour later.
However, if successful, it'll download 3 different files which happen to meet these criteria. So if you leave the script running overnight, you could wake up in the morning with three movies of the same thing, which happen to be of different size/format. You could add an additional check called 'tolerance', specified most likely in megabytes, which would prevent two files being too close in size, because a lot of the movies I've seen tend to be, say, 100MB, 101MB, 105MB, etc.

*success = actually getting a file to begin downloading.

Also, in parallel, there would be a 2nd hunt going on, "my hunt 2", which is a really basic hunt; it's just a search. What this would do is treat the other parameters as defaults, i.e.:

limit.size_min = 0;
limit.size_max = 'INFINITY';
limit.speed_min = 0;
limit.speed_max = 'INFINITY';
timeout = 'NEVER TIMEOUT';
retry = 'RETRY IMMEDIATELY'; // this wouldn't even occur unless there is a timeout, but if someone does specify a timeout and no retry, then it'll retry straight after timing out.
files = 1; // It'll just download one file.
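
(For illustration, here's a minimal Python mock-up of how those hunts and defaults might be represented. Everything in it - the Hunt type, the parse_size helper, the tolerance field, the file name - is hypothetical, nothing that exists in Gnucleus; the defaults simply mirror the list above.)

----hunt_sketch.py---

# Hypothetical mock-up of the proposed hunt script; none of this exists
# in Gnucleus. Defaults mirror the list above.
import math
from dataclasses import dataclass

def parse_size(text):
    """Turn '100M' or '16k' into bytes (k/K = kilobytes, m/M = megabytes)."""
    units = {"k": 1024, "m": 1024 ** 2}
    if text[-1].lower() in units:
        return int(float(text[:-1]) * units[text[-1].lower()])
    return int(text)

@dataclass
class Hunt:
    name: str
    find: str                     # search string
    size_min: int = 0             # bytes
    size_max: float = math.inf    # 'INFINITY'
    speed_min: int = 0            # bytes/sec
    speed_max: float = math.inf   # 'INFINITY'
    timeout: float = math.inf     # seconds; 'NEVER TIMEOUT'
    retry: float = 0.0            # seconds; 'RETRY IMMEDIATELY'
    files: int = 1                # download one file by default
    tolerance: int = 0            # min. size gap in bytes between chosen files

hunts = [
    Hunt("my hunt 1", find="any search string",
         size_min=parse_size("100M"), size_max=parse_size("500M"),
         speed_max=parse_size("16k"), timeout=5 * 60, retry=60 * 60,
         files=3),
    Hunt("my hunt 2", find="another search string"),  # everything else default
]
--EOF--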

Whaddya think guys, do you think it would be kewl to have a script in Gnucleus?

PS: I would like to help out on these GNU programs, but I just don't have any spare time in my busy game dev schedule; I can only really suggest ideas.

/Ichidan - ichidan256@yahoo.com

chr_rossi July 20th, 2001 04:41 AM

Re: downloading
 
[Is this because I need to meet specific requirements in order to download the majority of people's files, such as needing to share (n) files, or (x) bytes of data, or (z) Kbytes of free bandwidth?]

No.....

[But generally, a similar thing occurs with the other Gnutella clients, in that only about 1 in every 10 files I try to download is actually successful.]

Yes, it is sometimes a little bit frustrating, agreed - but it is a game of luck, and with a little patience the odds are good that you will get it. Alas, I fear the freeloader issue will not be solved soon.

I am not a developer, but it seems to me that this kind of script would generate a huge amount of mostly unnecessary net traffic. As far as I know, this was also the basis of the criticism of Phex's (a good servent, too :) automated search behaviour. But I will not condemn what I haven't seen - maybe there would be a way to do this without too many annoyances for others.

Anyway, I think you can post this in the appropriate Gnucleus forum on SourceForge, too. Maybe you will get better answers there....


Greetings....

Unregistered July 20th, 2001 09:47 AM

>Also to the Gnucleus Developer: Why don't you put...

Suggestions are good :)

>1) Segmented downloading...

It is a bitch to code, but will be done

>2) When a file has more than 1 host, it should download from one host, and at the same time keep attempting to download the same file from the various other hosts...

Version 1.3.4 my friend

>3) I don't know if you can do this already, but it would be a very powerful thing to have. A type of script, where users could...

I've had this idea too. It's a bitch to code and could be done, but I'm not going to be doing it anytime soon, only because there are more important things to implement, like segmented downloading. People who want to join the project are welcome to code a scripting feature into Gnucleus.
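
(For illustration, a rough Python sketch of what segmented downloading across several hosts - ideas 1 and 2 above - could look like. It assumes each host serves the file over plain HTTP with Range support, which Gnutella transfers are based on; the function names, segment count, and URLs are all made up, not Gnucleus code.)

----segmented_sketch.py---

# Illustrative only: split a file of known size into ranges and pull each
# range from a different host in parallel, assuming plain HTTP Range support.
import concurrent.futures
import urllib.request

def fetch_segment(url, start, end):
    """Fetch bytes start..end (inclusive) from one host."""
    req = urllib.request.Request(url, headers={"Range": "bytes=%d-%d" % (start, end)})
    with urllib.request.urlopen(req) as resp:
        return start, resp.read()

def segmented_download(urls, size, out_path, segments=4):
    """Write the assembled file to out_path, one segment per worker."""
    step = size // segments
    ranges = [(i * step, size - 1 if i == segments - 1 else (i + 1) * step - 1)
              for i in range(segments)]
    with open(out_path, "wb") as out:
        with concurrent.futures.ThreadPoolExecutor(max_workers=segments) as pool:
            jobs = [pool.submit(fetch_segment, urls[i % len(urls)], s, e)
                    for i, (s, e) in enumerate(ranges)]
            for job in concurrent.futures.as_completed(jobs):
                start, data = job.result()
                out.seek(start)   # each segment lands at its own offset
                out.write(data)
--EOF--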

Unregistered July 20th, 2001 09:55 AM

Segmented downloading...
 
I like this idea but worry about its ability to work properly... does Gnutella build any sort of error checking into the protocol? To me this feature would only be useful if the pieces were exactly the same file; otherwise you would get either corrupted MP3s or possibly two different songs. I'm all about faster downloads, but not at the expense of the quality of those downloads (downloading one file at twice the speed is nice, but when it arrives as a messed-up file and you've got to redownload it, that just adds to the time). Maybe use MD5 to checksum everything, but then the question is how you integrate that into the Gnutella protocol, because as far as I know there are no clients that do this...
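
(For illustration, a minimal Python sketch of the MD5 check being suggested. As noted above, the Gnutella protocol at this time carries no file hashes, so where the expected digest comes from is exactly the open question.)

----md5_sketch.py---

# Illustrative sketch: compare a downloaded file against a known MD5 digest.
# How 'expected' would travel over Gnutella is the unsolved part.
import hashlib

def md5_matches(path, expected):
    h = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):  # hash in 64 KB chunks
            h.update(chunk)
    return h.hexdigest() == expected
--EOF--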

my $0.02

Unregistered July 20th, 2001 12:42 PM

Gnucleus is cool
 
Whenever Gnucleus resumes a file, it rolls back 4096 bytes to make sure the new file is the same as the one you already have.

Someone would have to guess all 4096 bytes correctly to give you a different file. It's like a very long key.

When segmented downloading is implemented, rollbacks will be used to ensure all the pieces match up exactly.
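
(For illustration, a small Python sketch of that rollback check. fetch_range is a hypothetical stand-in for the client's actual transfer code, not a real Gnucleus function.)

----rollback_sketch.py---

# Illustrative only: before resuming, re-fetch the last 4096 bytes we already
# have from the new host and make sure they match what is on disk.
ROLLBACK = 4096

def can_resume(path, local_size, fetch_range):
    """fetch_range(start, end) should return bytes start..end from the host."""
    start = max(0, local_size - ROLLBACK)
    with open(path, "rb") as f:
        f.seek(start)
        ours = f.read()
    theirs = fetch_range(start, local_size - 1)
    return ours == theirs  # resume only if the overlap matches exactly
--EOF--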

