Gnutella Forums

Gnutella Forums (https://www.gnutellaforums.com/)
-   LimeWire Beta Archives (https://www.gnutellaforums.com/limewire-beta-archives/)
-   -   fix for "Awaiting Sources"? (https://www.gnutellaforums.com/limewire-beta-archives/21088-fix-awaiting-sources.html)

sdsalsero July 17th, 2003 11:42 AM

trap_jaw,
Just so you know, I really do appreciate your efforts to respond to all the requests people make of LW.

Having said that, what problem is there with my idea of transforming Awaiting Sources into, e.g. "2 Sources off-line, retrying in 60min" ?

I'm using 3.3.0-Beta, and I still find the majority of my download requests at Awaiting after 10-15 minutes. Force Resume then puts most of them back to Busy or Downloading.
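The status change being suggested here could be as simple as formatting the retry state into the download table. A minimal sketch of the idea (hypothetical helper and parameter names -- this is not LimeWire code):

```python
def awaiting_status(offline_sources: int, retry_in_minutes: int) -> str:
    """Format a more informative replacement for 'Awaiting Sources'.

    Illustrative only: shows how the known number of off-line sources
    and the next scheduled retry time could be surfaced to the user.
    """
    return f"{offline_sources} Sources off-line, retrying in {retry_in_minutes}min"

# e.g. awaiting_status(2, 60) -> "2 Sources off-line, retrying in 60min"
```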

trap_jaw4 July 17th, 2003 01:04 PM

Quote:

Originally posted by sdsalsero
Having said that, what problem is there with my idea of transforming Awaiting Sources into, e.g. "2 Sources off-line, retrying in 60min" ?

There's nothing wrong with it; I just don't see that it does much good. It may help you complete one download out of the ten or twenty that fail (if you are lucky), but it is not a solution for the problem that some connections frequently fail. Those hosts aren't just unstable for a few minutes; it seems they are unstable all the time.

sdsalsero July 17th, 2003 06:26 PM

Fine, so there are a lot of unstable hosts. Why not automate the required workaround?

stief July 17th, 2003 07:03 PM

I think you're right that "Awaiting Sources" needs some work, sdsalsero.

Still, whatever "automated requeries" end up being called, more efficient Gnutella messages are needed. At 15 KB/s, I use up 500 MB of bandwidth as an ultrapeer (UP) in about 10 hrs (not sure if the stats include compression)--and that's not including HTTP traffic up and down. My ISP gives me a rough limit of 1 GB/day without too much hassle.
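The arithmetic behind that figure is easy to check: 15 KB/s sustained over 10 hours comes out just over 500 MB, consistent with what stief reports.

```python
# Back-of-the-envelope check of the bandwidth figure quoted above.
rate_kb_s = 15                 # sustained upload rate, KB/s
seconds = 10 * 3600            # 10 hours
total_mb = rate_kb_s * seconds / 1024   # KB -> MB (1 MB = 1024 KB)
# total_mb is about 527 MB, in line with the ~500 MB observed
```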

I could disable UP mode--but more UPs are needed, no? If Gnutella could be more efficient at connecting searches and downloads, I'd be all in favour of automating the connections.

[btw--I'm getting double notifications of posts--and resetting profile hasn't helped]

trap_jaw4 July 17th, 2003 07:20 PM

You do not want to flood hosts with connection requests when they don't accept any connections. I've already read complaints from users who haven't run a Gnutella servent in days or even weeks and are still being hammered with connection requests.
In addition, a malicious attacker could spoof addresses in the download mesh & query hits to use Gnutella as a DDoS tool.
If a host is already overloaded, more connection requests won't improve download stability. On the contrary, the host will have even less bandwidth available for uploads, and its connections will become even less stable.

These are just a few of the reasons why you have to be careful about retrying unresponsive hosts.

There have to be better ways to improve connection stability than hammering hosts with retries.

Blackbird July 17th, 2003 08:07 PM

Stief:

A bit off topic...I do about 2.7 GB/day upstream as an ultrapeer. I did not know that ISPs have upstream and downstream limits for individual users (I know they do for businesses). Any idea how I can find that out?

sdsalsero July 17th, 2003 10:42 PM

Blackbird,
If your ISP hasn't complained, don't ask or bring their attention to it!

Trap_jaw,
I'm a network admin, so I appreciate the concerns regarding network "elegance". However, an extra ping or two every hour from the dozen or so requestors that might be trying to get a popular file isn't going to overwhelm anybody. Now, obviously, there should be some reasonable limit on how long the software continues to try. Maybe set it the same as Days To Keep Incompletes? That way, people could control it. Or just default it to a week? That doesn't seem excessive to me. Insisting that end-users keep hitting Force Resume and/or Repeat Search is turning us into lab rats, continually hitting the button for food (or was it pleasure?). :-)

Blackbird July 17th, 2003 11:20 PM

Well, I already sent an e-mail to them, so too late.

trap_jaw4 July 18th, 2003 02:48 AM

Quote:

Originally posted by sdsalsero
However, an extra ping or two every hour from the dozen or so requestors that might be trying to get a popular file aren't going to overwhelm anybody.
One or two requests per hour don't hurt, that's right. But the assumption that there are only a dozen requestors is a little optimistic. When I shut down LimeWire, I get 500-2000 incoming connection requests per hour for quite a while (usually until my ISP resets my IP, which happens every 24 hours).

Quote:

Now, obviously, there should be some reasonable limit on how long the s/w continues to try. Maybe set it the same as the Days To Keep Incompletes? That way, people could control it. Or, just default it to a week? That doesn't seem excessive to me.
I would be very careful about that. Users with static IPs are probably a minority. Besides, I think very few people have the patience to wait a whole week.

Quote:

Insisting that end-users continue to hit Force Resume and/or Repeat Search is turning us into lab rats, continually hitting the button for food (or was it pleasure?). :-)
Well, the Repeat Search button was necessary because all automatic requeries were banned to save the network. Maybe automated requeries will come back in the next version.

