Gnutella Forums

hello there everybody November 21st, 2004 04:45 AM

More efficient searches and downloads (3 ideas)
 
Noticed that the system doesn't have an automatic way to refresh downloads, i.e. one that finds more sources automatically. How about a system that automatically refreshes your searches, say every hour, so that you don't have to babysit longer downloads and the slowdown on these longer downloads is reduced.

Also, could there be an option to set up an exclusive link with another individual on the network (e.g. for uploads)? This would make it easier for two people to share a file that may be particularly popular and would otherwise generate a lot of interest from other users. I understand that you can reduce the upload slots to one, but you still have others queuing. Would an exclusive link necessarily help to maximise the rate of a download, as presumed? Surely it would at least make it easier for the person you know to connect immediately to your file.

In the same way that LimeWire recognises in a search a particular file that you are already downloading, would it not be possible to conduct searches looking solely for the file that you are already downloading, sort of like a "super filter"?

stief November 21st, 2004 05:51 AM

Automatic searches are banned (seriously discouraged) on the Gnutella network.

They can flood the network and prevent regular searches and transfers, so the developers had to give up this very popular and attractive feature.

hello there everybody November 21st, 2004 09:55 AM

Automatic searches
 
Yeah, but if the searches were automatically issued at intervals of an hour, surely this would not "flood" the system, as an individual could manually generate many more searches in the same space of time.

As I see it, this would not be such a problem, but it would aid users in the sense that they would not have to babysit. You could also ensure that the program does not auto-search once the download is complete, so as to minimise network traffic.
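A rough sketch of how that could look, assuming made-up Download and SearchService stand-ins (these are not real LimeWire classes): one requery per hour, cancelled as soon as the download finishes so no extra traffic is generated afterwards.

Code:
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.ScheduledFuture;
import java.util.concurrent.TimeUnit;

public class HourlyRequery {

    // Stand-in interfaces for illustration only, not real LimeWire classes.
    interface Download {
        boolean isComplete();
        String keywords();              // the query used to start the download
    }

    interface SearchService {
        void search(String keywords);   // sends one ordinary gnutella query
    }

    private final ScheduledExecutorService timer =
            Executors.newSingleThreadScheduledExecutor();

    /** Schedules one requery per hour until the download finishes. */
    public void watch(Download download, SearchService searches) {
        final ScheduledFuture<?>[] handle = new ScheduledFuture<?>[1];
        handle[0] = timer.scheduleAtFixedRate(() -> {
            if (download.isComplete()) {
                handle[0].cancel(false);           // done: stop generating traffic
            } else {
                searches.search(download.keywords());
            }
        }, 1, 1, TimeUnit.HOURS);
    }
}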

stief November 21st, 2004 10:30 AM

The developers tried the one hour interval, but still no good. They even tried only ONE of the downloads once every hour, and it was still too much.

Really, this is such an obvious and desirable feature that the devs have been trying and trying to find a workable solution that can be used by everyone. So far, any attempt at a solution just causes a flood, and the developers quickly start threatening to ban each other's clients.

hello there everybody November 21st, 2004 11:13 AM

Automatic searches
 
What if it was based on a non-time-based factor, such as the search being triggered by the download being held in a queue or sitting stagnant for an extended period?

Something that would not happen for every download, but only when required; would this still be too much?
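Something like this, perhaps, again with made-up Download/SearchService stand-ins and an assumed two-hour threshold: the requery only fires when a download has been queued or stalled past the threshold, and never for healthy or finished downloads.

Code:
import java.time.Duration;
import java.time.Instant;

public class StallTriggeredRequery {

    // Stand-in interfaces for illustration only, not real LimeWire classes.
    interface Download {
        boolean isComplete();
        boolean isQueuedOrStalled();    // no active source, or queued remotely
        Instant lastProgressTime();     // last time any bytes arrived
        String keywords();
    }

    interface SearchService {
        void search(String keywords);
    }

    private static final Duration STALL_THRESHOLD = Duration.ofHours(2); // assumed value

    private Instant lastRequery = Instant.MIN;

    /** Call periodically (e.g. once a minute) from the download manager. */
    public void maybeRequery(Download d, SearchService searches, Instant now) {
        if (d.isComplete() || !d.isQueuedOrStalled()) {
            return;     // healthy or finished downloads never trigger traffic
        }
        boolean stalledLongEnough =
                Duration.between(d.lastProgressTime(), now).compareTo(STALL_THRESHOLD) > 0;
        boolean notRequeriedRecently =
                Duration.between(lastRequery, now).compareTo(STALL_THRESHOLD) > 0;
        if (stalledLongEnough && notRequeriedRecently) {
            searches.search(d.keywords());
            lastRequery = now;
        }
    }
}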

stief November 21st, 2004 12:22 PM

I wish I knew. All these ideas sound reasonable if they would work in practice. The major developers, like LimeWire and BearShare, who have the tools to measure the impact on the network, are trying either workarounds or new ideas. You can follow the discussions about how the gnutella protocol is constantly being revised by reading the daily (or archived) posts on http://groups.yahoo.com/group/the_gdf/

btw--I like the idea of watching when the machine has had no mouse activity for a few hours, and then doing a "repeat search" for any pending downloads.
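A toy sketch of that idle trigger, just to make the idea concrete (nothing here is LimeWire API, and the three-hour threshold is an assumed value): poll the pointer position and run the repeat search once it hasn't moved for the threshold.

Code:
import java.awt.MouseInfo;
import java.awt.Point;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

public class IdleRequeryTrigger {

    private static final long IDLE_THRESHOLD_MS = 3 * 60 * 60 * 1000L; // assumed: 3 hours

    private Point lastPosition;
    private long lastMovementMs = System.currentTimeMillis();

    /** repeatSearchForPendingDownloads is a stand-in for the client's repeat-search action. */
    public void start(Runnable repeatSearchForPendingDownloads) {
        ScheduledExecutorService poller = Executors.newSingleThreadScheduledExecutor();
        poller.scheduleAtFixedRate(() -> {
            Point current = MouseInfo.getPointerInfo().getLocation();
            long now = System.currentTimeMillis();
            if (!current.equals(lastPosition)) {
                lastPosition = current;
                lastMovementMs = now;       // user is active, reset the idle clock
            } else if (now - lastMovementMs >= IDLE_THRESHOLD_MS) {
                repeatSearchForPendingDownloads.run();
                lastMovementMs = now;       // avoid repeating on every poll tick
            }
        }, 1, 1, TimeUnit.MINUTES);
    }
}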

Currently, though, if enough sources have been found for a file, LW will continue trying them until all are tried. This usually means that a file left unattended overnight will complete without babysitting. The key is to repeat the search a few times to build up the alternate locations, then leave it alone. In practice, this works as well as or better than requeries. Every time you download a chunk of a file, the host is also supposed to send a list of the alternate locations it knows, too.
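For what it's worth, here is a rough sketch of that alternate-location exchange as I understand it: the serving host lists other hosts for the same file as ip:port pairs in an HTTP header (X-Alt in newer clients). The exact header name and the default port below are assumptions, not taken from a spec.

Code:
import java.net.InetSocketAddress;
import java.util.ArrayList;
import java.util.List;

public class AltLocParser {

    private static final int DEFAULT_GNUTELLA_PORT = 6346; // conventional default, assumed

    /** Parses a comma-separated "ip:port, ip" header value into socket addresses. */
    public static List<InetSocketAddress> parse(String headerValue) {
        List<InetSocketAddress> alternates = new ArrayList<>();
        for (String entry : headerValue.split(",")) {
            String trimmed = entry.trim();
            if (trimmed.isEmpty()) continue;
            String host = trimmed;
            int port = DEFAULT_GNUTELLA_PORT;
            int colon = trimmed.lastIndexOf(':');
            if (colon > 0) {
                host = trimmed.substring(0, colon);
                port = Integer.parseInt(trimmed.substring(colon + 1));
            }
            // keep the address unresolved: we only want to queue it, not hit DNS here
            alternates.add(InetSocketAddress.createUnresolved(host, port));
        }
        return alternates;
    }

    public static void main(String[] args) {
        // example header value with two made-up alternates
        System.out.println(parse("10.0.0.5:6346, 192.168.1.9:6347"));
    }
}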

cheers

arne_bab November 21st, 2004 09:37 PM

Correction: BearShare does _exactly one_ requery every hour for the downloads (exactly one, not one per download).

