Gnutella Forums

Gnutella Forums (https://www.gnutellaforums.com/)
-   Gnucleus (Windows) (https://www.gnutellaforums.com/gnucleus-windows/)
-   -   Gnucleus do what it wants ... (https://www.gnutellaforums.com/gnucleus-windows/2835-gnucleus-do-what-wants.html)

Unregistered August 6th, 2001 02:08 AM

Gnucleus do what it wants ...
 
Hi,
When I use Gnucleus (best Gnutella client :) ), it changes some properties. For example, before ver. 1.3.5.0 it was able to wait only 5 sec. before reconnecting for a transfer. Now it's always 30 sec., even if I put another value in the properties. Another strange thing: when I want to stay connected to 5 nodes (min & max), sometimes it changes to another value like 3 (often...).
So does somebody know how to wait only 5 sec. before trying a transfer again, and how to make the other values stay constant? Thx.

swabby August 6th, 2001 10:17 AM

The unchangeable 30-second retry is a bug and will be fixed in the next version.

If bandwidth limits are applied and Gnucleus uses more bandwidth than you allotted it, then nodes will be dropped and the preferences will be changed to reflect that. I should probably think of a better way to implement this.
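For illustration, the node-dropping behavior described above could be sketched roughly like this (a hypothetical Python sketch with invented names; the real Gnucleus logic is C++ and is not shown here):

```python
# Hypothetical sketch: if measured usage exceeds the user's bandwidth
# allotment, drop nodes and silently lower the max-connections preference
# to match. All names and numbers here are invented for illustration.

def enforce_bandwidth(nodes, usage_kbps, limit_kbps, prefs):
    """Drop nodes until usage fits the limit; update prefs to reflect it."""
    while nodes and usage_kbps > limit_kbps:
        dropped = nodes.pop()          # drop the most recently added node
        usage_kbps -= dropped["kbps"]  # reclaim its share of the bandwidth
    prefs["max_nodes"] = len(nodes)    # the preference follows reality
    return nodes, usage_kbps, prefs

# 5 nodes at 4 KB/s each = 20 KB/s of usage against a 12 KB/s limit
nodes = [{"id": i, "kbps": 4} for i in range(5)]
nodes, usage, prefs = enforce_bandwidth(nodes, 20, 12, {"max_nodes": 5})
```

This would explain the poster's observation of a min/max of 5 quietly becoming 3: two nodes get dropped to fit the limit, and the preference is rewritten to 3.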

Unregistered August 6th, 2001 12:09 PM

Hi,
Ver. 1.3.5.0 does not attempt to reconnect at all on my machine
(Win2K). The countdown goes as expected, then just stops with
"Pending, 3 Hosts Found" (30, 29, 28, ... 3, 2, 1, Pending).
The auto re-search function seems to work great. I don't like the
dynamic sorting of search finds, though; it makes it harder to scan your
finds until the search is completely finished. Before this change, your
sort preferences could be applied and scanned from the top of the
list to the bottom without the risk of a late return being placed in
the top part of the list unnoticed, since all latecomers were placed
at the bottom of the sorted list. Dynamic sorting could stay,
but with an option to disable it.
The partial-file Resume function works flawlessly for me. I have
resumed some very large files a dozen times without ever getting
any corrupted files.
Stability isn't a problem on Win2K, but the Upload Bandwidth limit
function causes an almost immediate crash when enabled. This
occurs with ver. 1.3.1.0, which I am using now because of the
failure of the retry after ? sec. function in ver. 1.3.5.0.
Is it possible to upgrade Gnucleus without going all the way to
ver. 1.3.5.0? Are past releases available for download anywhere?
Even using ver. 1.3.1.0, Gnucleus is still, hands-down, the best
Gnutella client available. Thanks

swabby August 6th, 2001 01:31 PM

There's a bug in 1.3.5: if you don't have a download limit set, it will not try to download files. I will release a beta soon, so get that and it will be fixed.

I've been having some problems with resuming; anyone else? Sometimes the file is corrupted when it really shouldn't be, since I was just downloading from that server a second ago.

Stability should be very good in this version, especially for serving files for extended amounts of time, and the beta should introduce some better logging of uploads so memory isn't hogged. Limiting upload bandwidth I have to check out, because I think it is broken. You can downgrade your version at the Gnucleus page on SourceForge, but I don't suggest it, since 1.3.5 cleans up sockets after itself, which is good for other internet programs running on your computer.

Unregistered August 6th, 2001 01:55 PM

Hi (again... I opened this thread),

Thank you, Swabby. I am very pleased to have an answer directly from the developer... So I'll wait (Gnucleus is really worth it). But please don't abandon the bandwidth limit, for my network's health ;)

SRL August 6th, 2001 05:10 PM

I've had the occasional corrupt file too. I think trash sometimes gets added to the end of the file; if I delete the last few k from the file and try again, it usually works.

Unregistered August 6th, 2001 09:24 PM

Reply to Swabby from 2nd unregistered poster above:

Hi, Swabby

I specified a download limit and it solved my problem with 1.3.5
Thanks for the prompt reply.

I am very surprised you are having any problems at all resuming!
My experience has been limited mostly to large MPG video files
(30-600 MB), where resumability is a must. The only time I have
ever found a file that could not be resumed was one instance in
which I inadvertently let my C: drive run completely out of free
space during a download. Gnucleus notified me of my error right
before it crashed, and I consider that to be complimentary, given
the circumstances. The file being downloaded at the time was
not resumable: although Gnucleus did attempt to resume the
download, it reported a corrupted download file in the "extended
info" upon connection and wouldn't finish the file, thereby saving
the wasted time and bandwidth of resuming what would obviously
have been a corrupted finished file. I have successfully resumed
downloads that were interrupted due to network disconnections,
Gnucleus crashes (yes, I tried to limit upload bandwidth), and
even power failures (no UPS). IMHO the unequalled resume
capabilities of Gnucleus set it apart from all the other Gnutella clients.

Unregistered August 6th, 2001 09:42 PM

Question for SRL-

That's a good idea - delete the last few k. How do you do that?

SRL August 7th, 2001 06:01 PM

You had to ask huh? ;-)

Well... I'm sure there are easier ways, but I use this Perl script...

+++++++++++++++++++++++++++

use strict;
use warnings;
use Win32::Clipboard;

## remove 2048 bytes from the end of the file ##
my $tval = 2048;

## use the filename from the clipboard ##
my $clp = Win32::Clipboard();
my $fn  = $clp->Get();

## lstat returns a 13-element list; index 7 is the size in bytes ##
my $size  = (lstat($fn))[7];
my $nsize = $size - $tval;

print "$fn\n\n";
print "$size -> $nsize\n";

open(my $fh, '+<', $fn) or die "Can't open file: $!\n";
truncate($fh, $nsize)   or die "Can't truncate: $!\n";
close($fh);
+++++++++++++++++++++++++++

It takes the filename from the clipboard and removes $tval bytes from the end. (I think the Win32::Clipboard module was an add-on to Perl, BTW.)
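For anyone without Perl installed, the same trim-the-tail idea can be sketched in Python (the filename is passed in directly rather than taken from the clipboard; `TRIM` matches the script's $tval of 2048 bytes):

```python
# Rough Python equivalent of the Perl script above: chop a fixed number
# of possibly-garbage bytes off the end of a partially downloaded file.
import os

TRIM = 2048  # bytes to remove from the end of the file

def trim_tail(fn, trim=TRIM):
    """Truncate the last `trim` bytes off the file at `fn`."""
    size = os.path.getsize(fn)
    new_size = max(size - trim, 0)  # never truncate below zero
    os.truncate(fn, new_size)       # drop the trailing bytes in place
    return size, new_size
```

As with the Perl version, run it on a copy first if the file matters; the truncation is done in place and cannot be undone.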

<smart tag> August 8th, 2001 12:51 PM

Problems downloading? Oh yes!
 
--
Swabby:
I've been some having some problems with resuming, anyone else?
--

You could say that again, yes. I've had nothing but problems with resuming. See my earlier posts.

It looks like I got a totally different version of 1.3.5.0 than everyone else, who all seem to brag about how easy it is to resume downloading partially downloaded files.

I have not been able to make it work except for one time.
I am not only talking about downloading, but also about trying to find clients which have the files. The result is always "Searching, 0 hosts found" and "Unable to connect" (because of 0 hosts).



Powered by vBulletin® Version 3.8.7
Copyright ©2000 - 2024, vBulletin Solutions, Inc.
SEO by vBSEO 3.6.0 ©2011, Crawlability, Inc.

Copyright © 2020 Gnutella Forums.
All Rights Reserved.