Gnutella Forums  


Download/Upload Problems: problems with downloading or uploading files through the Gnutella network.





#1, April 21st, 2003
sdsalsero (Gnutella Muse; joined December 19th, 2001; 173 posts)

no more auto-requeries?!

Do I understand correctly, that the new "Could Not Download; Awaiting Sources" message means that LW is not automatically re-doing the search for that file? If so, that's just ridiculous! The only reason I leave a failed download in my download window is because I'm still hoping to find it. The whole point of computers is to do things for us, i.e., labor-saving. If I have to manually re-do the search, that's no benefit to me. I can't stay home all day and click Repeat Search...

I understand the need to save bandwidth but disabling one of the main features is just totally misguided.

Here are some alternative suggestions:

1. Group and compress communications between Ultrapeers, e.g. wait 1.0 seconds before passing along requests and then use an open-source zip function to reduce bandwidth (see the sketch after these suggestions).

2. "Expose" the Repeat Search parameters of each incomplete download, i.e., ability to right-click a file and see what known source IPs exist(ed) and what search term(s) generated the original download. If you've been searching for a file for a long time, you'll probably have developed a list of source IPs that have the file but who aren't on-line most of the time. If you can add those IPs to the file's "Source IPs" list, the computer can automatically check them every 5 minutes (or whatever).

Finally, were auto-requeries (before they were cancelled!) doing a search for the exact filename or were they doing a search on the original search term? If the former- DUH! There's your problem! Every failed download should maintain a copy of the original search term and use that instead of the particular filename. The reasons: a) it'll be shorter and save bandwidth; b) it'll find more potential hits!
#2, April 21st, 2003
Joakim Agren (Trouble Shooter; joined June 4th, 2002; Örebro, Sweden; 366 posts)

Re: no more auto-requeries?!

Quote:
Originally posted by sdsalsero
Do I understand correctly, that the new "Could Not Download; Awaiting Sources" message means that LW is not automatically re-doing the search for that file? If so, that's just ridiculous! The only reason I leave a failed download in my download window is because I'm still hoping to find it. The whole point of computers is to do things for us, i.e., labor-saving. If I have to manually re-do the search, that's no benefit to me. I can't stay home all day and click Repeat Search...

I understand the need to save bandwidth but disabling one of the main features is just totally misguided.


Yes, it means that you have to search manually for new sources for the file. The requeries were considered by the LW developers to be an evil function for Gnutella, since they increased traffic but never really worked as advertised. LW used to send out a requery to find alternate locations for the file once every 50 minutes, but most of the time it did not find any, and the 'requery sent' status could last for days. Clearly a waste of bandwidth for nothing. The new 2.9x versions will drop all requery messages, in an attempt by the LW team to stop other vendors from using them in future versions. Requeries are no good for Gnutella! This was clearly not one of the main functions of LimeWire, as you suggest.

Quote:

Here are some alternative suggestions:

1. Group and compress communications between Ultrapeers, e.g. wait 1.0 seconds before passing along requests and then use an open-source zip function to reduce bandwidth.


I think that some kind of message compression technique is currently being worked on by the LimeWire team.

Quote:

2. "Expose" the Repeat Search parameters of each incomplete download, i.e., ability to right-click a file and see what known source IPs exist(ed) and what search term(s) generated the original download. If you've been searching for a file for a long time, you'll probably have developed a list of source IPs that have the file but who aren't on-line most of the time. If you can add those IPs to the file's "Source IPs" list, the computer can automatically check them every 5 minutes (or whatever).


Since the vast majority of users on Gnutella use dynamic IPs, this would not work effectively. It would also consume bandwidth.
Quote:

Finally, were auto-requeries (before they were cancelled!) doing a search for the exact filename or were they doing a search on the original search term? If the former- DUH! There's your problem! Every failed download should maintain a copy of the original search term and use that instead of the particular filename. The reasons: a) it'll be shorter and save bandwidth; b) it'll find more potential hits!


This could work if everyone on Gnutella used specific search terms, such as the name of a particular song. But since the original search term is often just an artist name, the result list would be somewhat larger than for that particular song, so I think this function would have increased traffic on Gnutella slightly due to the extra query hits. I also think the requeries used a specific filename, not the original search term.

I think requeries are a thing of the past now!
__________________
Sincerely Joakim Agren!
#3, April 21st, 2003
stief (A reader, not an expert; joined January 11th, 2003; Canada; 4,613 posts)

sdsalsero, FWIW, I have been able to leave LW 2.9.8.2 unattended for 6-8 hours and some files have completed. I do have to spend a lot of time setting up a blocked-host list (to filter the spam results that mask good sources) and trying to build up a list of good alternates by using narrow searches.

Yeah, I'd like to see automatic requeries back if they couldn't be abused by spammers who flood the network with spurious results. Maybe requeries for narrow searches (fewer than 10 results), repeated at a decreasing rate based on previous manual searches?

I think Acq .84 has temporarily brought back limited automatic requeries, but newer Ultrapeers seem to have some logic to block clients that use them.
#4, April 21st, 2003
sdsalsero (Gnutella Muse; joined December 19th, 2001; 173 posts)

Thanks for the replies, guys.

I understand the need to reduce bandwidth waste but, really, I hope there is a way to improve the requery. For instance, maybe maintain a copy of the original query and then requery along with a size or file-hash? That way, the only hits would be the same file.
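Roughly, I mean something like this sketch: the requery keeps the original keywords, but a hit only counts if its size and hash match the file we already started. All names are hypothetical, just to illustrate:

Code:
// One remembered requery per incomplete download. The query that goes
// out on the wire is just the original (short) search term; size and
// SHA-1 are checked locally against each result. Hypothetical names.
class RequeryRecord {
    final String originalKeywords; // what the user actually typed
    final long fileSize;           // learned from the first download attempt
    final String sha1Base32;       // hash of the file we want to finish

    RequeryRecord(String originalKeywords, long fileSize, String sha1Base32) {
        this.originalKeywords = originalKeywords;
        this.fileSize = fileSize;
        this.sha1Base32 = sha1Base32;
    }

    // shorter than a full filename, so it saves bandwidth on the wire
    String queryString() {
        return originalKeywords;
    }

    // accept a search result only if it is verifiably the same file
    boolean matches(long resultSize, String resultSha1) {
        return resultSize == fileSize && sha1Base32.equalsIgnoreCase(resultSha1);
    }
}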

As for blocking requeries, are they labeled as such? They must be, or else the Ultrapeers couldn't differentiate and block them.

I've been a paying supporter of LW since day 1 (I'm on my 3rd subscription now), but this loss of functionality is really, really pissing me off...
___________________

stief, what's this 2.9.8.2 version? I'm running LW 2.9.8-Pro on W2K, and it's a copy I downloaded within 24-48 hrs of its availability.
#5, April 21st, 2003
stief (A reader, not an expert; joined January 11th, 2003; Canada; 4,613 posts)

re 2.9.8.2

An anonymous poster, "thebigname", posted it late last week. I'm not sure what improvements it has, but I think it's related to some of the tweaks trap_jaw mentioned. Here's the link (no Pro available): http://www9.limewire.com:82/download/
I'll look up the original post and edit the link back here.

http://www.gnutellaforums.com/showth...=&postid=68802

[sorry about the editing hassle--couldn't format the link properly]

If you try this one, be prepared for a lot of "could not move to lib" errors.

Last edited by stief; April 21st, 2003 at 07:27 PM.
#6, April 22nd, 2003
trap_jaw (Distinguished Member; joined September 21st, 2002; Aachen; 733 posts)

Quote:
Originally posted by sdsalsero
I understand the need to reduce bandwidth waste but, really, I hope there is a way to improve the requery. For instance, maybe maintain a copy of the original query and then requery along with a size or file-hash? That way, the only hits would be the same file.
Statistically, requeries were always a complete waste of bandwidth: 95% did not even return any results. Query-by-hash is the most inefficient thing you can do on Gnutella at the moment. It means sending a query to thousands of peers with a very, very low probability that anybody has this particular file. If more people had that file, you would get them as alternate locations while downloading.

But it's not just that requeries are inefficient; they were also abused by other vendors. The network load was unacceptable, so they had to be removed.

Quote:
As for blocking requeries, are they labeled as such? They must be else the Ultrapeers couldn't differentiate and block them.
LimeWire's requeries are labeled as such. But there are a couple of other ways to identify requeries, because they often contain the full filename or the hash of a file.

Quote:
I've been a paying support of LW since day 1 (i'm on my 3rd subscription now) but this loss of functionality is really really pissing me off...
There is no other way if some people can't play by the rules. There are clients out there that send requeries every five seconds. And if you want to ban their requeries, you have to ban all requeries.
__________________
In the mornings I eat cornflakes and in the evenings I eat bread
And when I've lived long enough, I die and I'm dead

--Fischmob
#7, April 22nd, 2003
sdsalsero (Gnutella Muse; joined December 19th, 2001; 173 posts)

Baby? Bathwater?!

Trap_jaw, I know you're not one of the developers, but please post the following suggestions on the developer mailing-list:

1. The ability to continue searching automatically for files is a BASIC feature of any P2P app. If they're unable to design an intelligent requery function, they're going to lose their audience.

2. For every file left in requery mode, keep a record of every non-firewalled IP that has ever been reported with the file and periodically request the file anew from those IPs. This will not waste any Gnutella bandwidth since it's a direct request! Also, allow the enduser to right-click each file and edit the IP list, i.e., if you've been manually keeping a list. No, I don't think that the existence of dynamic IPs invalidates this function! (See the sketch after this list.)

3. Keep track of leaf-node's requeries and throttle them. This could become part of the G2 protocol, e.g. "Do not requery more often than every 120 seconds."

4. If you're seeing a lot of unsuccessful requeries, why not try to improve them? If you assume that requeries are a required function, then making them more effective will reduce bandwidth. Requerying for the original search-term plus the filesize would be more likely to 'hit' than searching for the exact filename. Again, this could become part of the G2 protocol, e.g. "no 'filename' over 20 char and no filehashes in requeries."

5. Don't throttle or block manual requeries! (right-click on search tab, select Repeat Search)
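For suggestion 2, a minimal sketch: keep a user-editable list of known sources per incomplete file and probe them directly on a timer. Hypothetical names, just to illustrate; a real client would follow a successful probe with an ordinary HTTP download request:

Code:
import java.net.InetSocketAddress;
import java.net.Socket;
import java.util.List;
import java.util.concurrent.CopyOnWriteArrayList;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

// Probes remembered source IPs for one incomplete file every 5 minutes.
// No Gnutella broadcast is involved: each probe is a direct connection.
class SourceRetrier {
    private final List<InetSocketAddress> knownSources = new CopyOnWriteArrayList<>();
    private final ScheduledExecutorService scheduler =
            Executors.newSingleThreadScheduledExecutor();

    void addSource(String host, int port) { // user can edit this list
        knownSources.add(new InetSocketAddress(host, port));
    }

    void start() {
        scheduler.scheduleAtFixedRate(this::probeAll, 0, 5, TimeUnit.MINUTES);
    }

    private void probeAll() {
        for (InetSocketAddress source : knownSources) {
            try (Socket s = new Socket()) {
                s.connect(source, 3000);    // 3-second timeout per host
                resumeDownloadFrom(source); // hypothetical hand-off
            } catch (Exception hostOffline) {
                // not reachable right now; try again next cycle
            }
        }
    }

    private void resumeDownloadFrom(InetSocketAddress source) {
        // request the file from this host over HTTP (not shown)
    }
}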

(Thank you...)

Last edited by sdsalsero; April 22nd, 2003 at 07:53 AM.
#8, April 22nd, 2003
trap_jaw (Distinguished Member; joined September 21st, 2002; Aachen; 733 posts)

Quote:
Baby? Bathwater?!
An ultrapeer can only handle so many queries per second. If that limit is reached, further queries will be dropped, meaning they won't return any results. Even worse, it can mean that search results have to be dropped.

This discussion already happened on the development mailinglist. Until / unless some sort of distributed hash lookup table is created, there will be no more requeries with LimeWire.

Quote:
1. The ability to continue searching automatically for files is a BASIC feature of any P2P app. If they're unable to design an intelligent requery function, they're going to lose their audience.
Napster didn't have it, and Kazaa doesn't have it (you have to click 'find more sources' to do a requery; however, some Kazaa hacks do have it). Audiogalaxy didn't have it either.

Quote:
2. For every file left in requery mode, keep a record of every non-firewalled IP that has ever been reported with the file and periodically request the file anew from those IPs. This will not waste any Gnutella bandwidth since it's a direct request! Also, allow the enduser to right-click each file and edit the IP list, i.e., if you've been manually keeping a list. No, I don't think that the existence of dynamic IPs invalidates this function!
LimeWire already does this. LimeWire keeps track of all clients that ever downloaded the file and tells every other client requesting a file who else to ask for it. It's called the download mesh. It works spectacularly with partial-file sharing, a feature that will be implemented in the near future (probably not 3.0, but maybe in v3.1). Of course LimeWire could be more aggressive, e.g. retrying hosts that did not respond, but that does not increase the performance very much.
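In outline, the mesh is just a per-file registry of alternate locations keyed by the file's SHA-1, which gets passed along to each requester. A minimal sketch under that reading (hypothetical names; this is not LimeWire's actual implementation):

Code:
import java.net.InetSocketAddress;
import java.util.Map;
import java.util.Set;
import java.util.concurrent.ConcurrentHashMap;

// For each file (keyed by its SHA-1 urn), track every host known to
// have it, and share that list with the next requester.
class DownloadMesh {
    private final Map<String, Set<InetSocketAddress>> altLocs =
            new ConcurrentHashMap<>();

    // record a host that served (or successfully downloaded) this file
    void addAltLoc(String sha1Urn, InetSocketAddress host) {
        altLocs.computeIfAbsent(sha1Urn, k -> ConcurrentHashMap.newKeySet())
               .add(host);
    }

    // hand a requester everyone else known to have the same file
    Set<InetSocketAddress> altLocsFor(String sha1Urn) {
        return altLocs.getOrDefault(sha1Urn, Set.of());
    }
}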

Quote:
3. Keep track of leaf-node's requeries and throttle them. This could become part of the G2 protocol, e.g. "Do not requery more often than every 120 seconds."
LimeWire does not plan on implementing G2. Besides, any requery frequency higher than one or two requeries per hour has proven to be harmful, and even at that rate, with ten downloads it would take five to ten hours to requery all of them. There are also some clients (QTrax, although it is not available anymore) that would frequently drop their ultrapeer connections to defeat any such protection. It's much safer to drop all requeries.
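A per-leaf throttle of the kind suggested would only be a few lines, something like the sketch below (hypothetical names, not real LimeWire code), but its weakness is exactly the reconnect trick just described:

Code:
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Allow at most one requery per file per leaf connection per interval;
// drop the rest. State is per connection, so a client that drops and
// re-opens its ultrapeer connection (as QTrax did) starts fresh.
class RequeryThrottle {
    private static final long MIN_INTERVAL_MS = 30L * 60 * 1000; // 30 minutes
    private final Map<String, Long> lastSeen = new ConcurrentHashMap<>();

    boolean allow(String leafConnectionId, String sha1Urn) {
        String key = leafConnectionId + "|" + sha1Urn;
        long now = System.currentTimeMillis();
        Long prev = lastSeen.get(key);
        if (prev != null && now - prev < MIN_INTERVAL_MS) {
            return false; // requerying too often: drop the message
        }
        lastSeen.put(key, now);
        return true;
    }
}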

Quote:
4. If you're seeing a lot of unsuccessful requeries, why not try to improve them? If you assume that requeries are a required function, then making them more effective will reduce bandwidth. Requerying for the original search-term plus the filesize would be more likely to 'hit' than searching for the exact filename. Again, this could become part of the G2 protocol, e.g. "no 'filename' over 20 char and no filehashes in requeries."
LimeWire was never requerying for the exact filename. However, the fuzzier the search, the more unwanted results, and the more wasted bandwidth. Searching for the SHA-1 hash alone was about as exact as you can get using a broadcast search: it would return all results that could be used to resume a download, and only those.

Quote:
5. Don't throttle or block manual requeries! (right-click on search tab, select Repeat Search)
It's impossible to distinguish between manual and automatic requeries. 'Repeat Search' is de facto not a requery but a completely new query for the same keywords.
__________________
In the mornings I eat cornflakes and in the evenings I eat bread
And when I've lived long enough, I die and I'm dead

--Fischmob
#9, April 22nd, 2003
sdsalsero (Gnutella Muse; joined December 19th, 2001; 173 posts)

OK, I'm satisfied that this wasn't done lightly! And thanks for clarifying that a manual use of Repeat Search isn't being discarded.
#10, April 24th, 2003
rockhaus (Apprentice; joined April 24th, 2003; CO; 7 posts)

In addition to the problems I've described elsewhere, I'm very disappointed that in 2.9.8 a half-done file never resumes.

After uninstalling and reinstalling 2.9.8 (in order to troubleshoot), I'm finding no improvement in my problems, and the 10.2.5/2.9.8 combo is generally very inefficient.

If I restore a previous LW version, I expect the same awesome experience as before, but if that's going to 'hurt others', I won't do it.

Thoughts?

-r

