Gnutella Forums

Gnutella Forums (https://www.gnutellaforums.com/)
-   Download/Upload Problems (https://www.gnutellaforums.com/download-upload-problems/)
-   -   Requiries & searches (https://www.gnutellaforums.com/download-upload-problems/11758-requiries-searches.html)

Unregistered May 25th, 2002 01:14 PM

Requiries & searches
 
For a couple of weeks now I've noticed that most files I try to download become "Requery sent, waiting XXXX for download" and permanently stay that way. But lately I've found that by running new searches SOME of these files start to download again... So basically my question is: what the hell is the point of the requery if it doesn't work? If a download is dropped, why doesn't it say something like "Download stopped, please re-search" rather than giving you the false hope that it'll come back to life? I remember it worked a while ago, but if I'm going to have to babysit the computer and constantly re-search for the things I want, it isn't worth it. Also, why does "force resume" no longer work for the "Requery sent" files?

Will this be corrected in a new version?

Unregistered May 25th, 2002 09:18 PM

I agree...
 
So frustrating - eom - where is Napster or BearShare...

Unregistered May 27th, 2002 08:47 PM

Requery sent, what the HELL?? ;((
 
This is something I do not get... why does it do this? Is it something they can fix? Or when will it start working? I am sick of this [EDIT FOUL LANGUAGE] and stuff!

Treatid May 28th, 2002 11:59 AM

Nothing needs to be fixed.

LimeWire will periodically re-search (requery) for new hosts to download from. There is no need to do anything manually or to otherwise babysit LimeWire.

However, if you manually do a search which finds new hosts then LimeWire will make use of those new hosts.

Mark
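Treatid's description - automatic periodic requeries, with manual searches feeding the same pending download - can be sketched as a toy model. This is purely illustrative: the class and method names are invented here, and only the rough behavior (not LimeWire's actual code or interval) comes from this thread.

```python
# Toy model of a LimeWire-style pending download. Names and the
# interval are illustrative assumptions, not LimeWire internals.
REQUERY_INTERVAL = 45 * 60  # seconds; roughly the interval mentioned later in this thread


class PendingDownload:
    def __init__(self, name):
        self.name = name
        self.hosts = []          # known sources for this file
        self.last_requery = 0.0  # time of the last automatic requery

    def tick(self, now, search):
        """Automatic requery: if there is still no source and enough
        time has passed, quietly search again for new hosts."""
        if not self.hosts and now - self.last_requery >= REQUERY_INTERVAL:
            self.last_requery = now
            self.hosts.extend(search(self.name))

    def feed_manual_results(self, results):
        """A manual search that happens to find matching hosts also
        feeds this pending download - which is why re-searching by
        hand can make a stalled download come alive."""
        self.hosts.extend(h for h in results if h not in self.hosts)
```

In this sketch, the difference between "babysitting" and not is only how soon new hosts arrive: the automatic path waits out the interval, while a manual search delivers results immediately.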

AlienPenguin May 30th, 2002 09:01 AM

As a matter of fact you *do* need to babysit LimeWire... otherwise you will end up with lots of different incomplete copies of the same file, differing only in name. In fact, every requery brings the possibility of restarting a download that doesn't need to be restarted.

Treatid May 30th, 2002 11:02 AM

LimeWire will try to get the file for you - if it can't find a host with the part it needs, it will start a new download - but it won't forget about the first part.

So - yes, you can get lots of partial downloads of a file. But if LimeWire restarts a download, it is because it cannot find an original host - in which case LimeWire *does* need to restart.

I'm not sure how baby-sitting would improve this - except that you can delete some of the partials, which means that on average it takes longer to get that file.

Of course, if you want to fiddle, that's fine. But some fiddling gets in the way of the automagic features.

Mark
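The trade-off Treatid describes - a restart adds a new incomplete copy, but the earlier partials are kept, and deleting them means re-fetching those bytes later - can be modeled very loosely. This is a hypothetical sketch based only on this thread, not on how pre-hashing LimeWire actually stored partials.

```python
class IncompleteFolder:
    """Toy model: each restart creates a fresh incomplete copy, while
    older partials stay on disk. Entirely illustrative - the class and
    its bookkeeping are assumptions, not LimeWire behavior."""

    def __init__(self, total_size):
        self.total_size = total_size
        self.partials = []  # bytes held by each incomplete copy, oldest first

    def restart(self):
        """Original host gone: start a fresh copy rather than give up."""
        self.partials.append(0)

    def receive(self, nbytes):
        """Bytes arrive for the most recent copy."""
        self.partials[-1] += nbytes

    def bytes_lost_if_old_partials_deleted(self):
        """Deleting the older partials means these bytes would have to
        be downloaded again - the 'takes longer on average' cost."""
        return sum(self.partials[:-1])
```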

AlienPenguin May 30th, 2002 03:39 PM

Yes, but when downloading *large* files like DivX movies you end up with gigabytes of useless data. Also, if you manually do a requery you can often find exact matches for the name - maybe new clients - (I dunno why the automatic requery doesn't). I hope that with hashing all this will go away :-)

efield May 31st, 2002 09:16 AM

Deleting the partially downloaded files in the Incomplete folder may not be a good idea. I think someone else on this board said that if LimeWire fails to find a partial file on resume, the download.dat file is cleared. Any other downloads might then be interrupted.

Unregistered June 1st, 2002 11:03 AM

But wasn't the point of the requery to look up new hosts, or at least to continue from the old host? Then why is it that you can sit there for hours on end and have nothing download, but once you use one of the "tricks" (closing and restarting LimeWire, or simply running new searches), some of these downloads come alive and start progressing? Doesn't this suggest that the requery isn't fulfilling its purpose?

AlienPenguin June 12th, 2002 08:28 AM

That was exactly my point... the requery works but is not very effective. I know that searching every 2 seconds would clog the net, but there should be a reasonable timeout. In about 80% of cases, simply re-searching makes the download start again.

Joakim Agren June 12th, 2002 10:20 AM

Hello!

This is how requery sent works:

If you find a file on a host and initiate a download, but that host has disconnected (or closed LW's connection while continuously disconnecting and reconnecting to get a better Gnutella network connection to hosts running LW rather than Morpheus or Gnucleus), you get the "Requery Sent" message, and LW starts continuously checking whether that host is reachable again. Success depends on whether the host has a static or a dynamic IP: with a static IP the download will of course start when that host reconnects to the Internet; with a dynamic IP it will of course fail.

There are a couple more functions embedded in this feature. About once every 45 minutes, LW checks for potentially new hosts that have the same file by performing a hidden search, comparing name and file size in the result list; if it finds a file that matches almost exactly, it attempts a download from that source. You can speed this up: if you manually run a search and one of the results matches the requery, LW will attempt a download from that source, which is sometimes exactly the same source as before and sometimes a completely new host.

So "Requery Sent" is not the same as the busy signal. It is only meant to be used when you want a file really badly - a "most wanted" function. It is time-consuming, but at least more honest than previous versions of LW, where you almost always got the busy signal even when the host was not there. Also, do not use the requery function for very large 700 MB files like DivX movies, because it can result in a very large Incomplete folder.
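The "compare name and file size, then attempt a download on an almost-exact match" step Joakim describes might look something like this. The real matching rules are internal to LimeWire; the function names and the name-normalization used here are illustrative guesses.

```python
def matches_requery(wanted_name, wanted_size, result_name, result_size):
    """Rough sketch of an 'almost exact' match: identical file size and
    essentially the same name (ignoring case and punctuation).
    Hypothetical heuristic, not LimeWire's actual comparison."""
    if result_size != wanted_size:
        return False
    norm = lambda s: "".join(ch.lower() for ch in s if ch.isalnum())
    return norm(result_name) == norm(wanted_name)


def pick_new_sources(wanted_name, wanted_size, results):
    """From a (hidden or manual) search result list of
    (host, name, size) tuples, keep only sources whose file matches
    the pending requery closely enough to attempt a download."""
    return [host for host, name, size in results
            if matches_requery(wanted_name, wanted_size, name, size)]
```

Under a heuristic like this, a manually triggered search simply hands the same matcher a fresh result list sooner than the 45-minute hidden search would, which fits the reports above that re-searching often revives a stalled download.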

