Gnutella Forums (https://www.gnutellaforums.com/)
-   General Gnutella Development Discussion (https://www.gnutellaforums.com/general-gnutella-development-discussion/)
-   -   Improve Download Host/Speed (https://www.gnutellaforums.com/general-gnutella-development-discussion/5329-improve-download-host-speed.html)

Moak November 15th, 2001 06:32 AM

Quote:

Originally posted by Pataya
Limewire does not need to discuss with you. Stop it, because most gnutella developers are not interested in talking with interested users.
After reading the latest the_GDF postings (I do not visit there regularly), I have to agree. Most issues around swarming, superpeers and more are discussed there and never brought to a wider public. It seems a bunch of developers want to discuss only there.

As long as no other programmers or developers are interested in using this forum, I will reduce my posts here. Together with the Xolox developers it has been fun bringing in our own experience and ideas! Thx. :) Right now the Xolox team is busy, so I hope Gnutella development will not get stuck as it did over the last six months. Finding a home for an open Gnutella developer community is still a topic I am interested in.

So long, Moak

Dude November 27th, 2001 11:39 AM

Re: More about a Gnutella Swarming idea
 
Quote:

Finding the most requested files is the first item: every client could maintain statistics on its most-uploaded files (files often downloaded from it by other users) and tell other clients which they are (e.g. once on connect). I think we should not use search queries to maintain these statistics, because they are too inaccurate, and upcoming query caches or superpeers (both of which I highly recommend) would falsify the results.
Sniffing Queries/QueryHits can be an addition for maintaining statistics on the most requested files. Such passive sniffing (or call it passive searching) costs no extra traffic and should already be implemented in a modern client (to improve automatic requeries in a healthy way). Yes, the results will be falsified by query caches... but passive search will help, especially when users do not share enough files to build good upload statistics.
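
(To make the two signals concrete, here is a rough sketch in Python of a client keeping both an active upload count and a passive sniffed-hit count. It is illustrative only: the class, the method names, and the 0.1 weight are assumptions, not taken from any real servent.)

```python
from collections import Counter

class PopularityStats:
    """Sketch: track which shared files the network actually wants."""

    def __init__(self):
        self.upload_counts = Counter()  # active signal: completed uploads from us
        self.sniffed_hits = Counter()   # passive signal: QueryHits we route anyway

    def record_upload(self, file_id: str) -> None:
        # Called whenever another user finishes downloading a file from us.
        self.upload_counts[file_id] += 1

    def sniff_query_hit(self, file_id: str) -> None:
        # Called for every QueryHit passing through this node; this is the
        # "passive searching" above -- it costs no extra traffic.
        self.sniffed_hits[file_id] += 1

    def top_files(self, n: int = 10) -> list:
        # Blend both signals. Sniffed hits get a low weight because query
        # caches and superpeers can inflate them (the falsification worry).
        combined = Counter(self.upload_counts)
        for fid, hits in self.sniffed_hits.items():
            combined[fid] += 0.1 * hits  # the 0.1 weight is an arbitrary assumption
        return [fid for fid, _ in combined.most_common(n)]
```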

Aigamisou November 30th, 2001 02:33 PM

NOOOOOOO!!!

Moak,
I am learning how to be a developer. I don't always have a lot of things to contribute to discussions since I am such a newbie... I learn a lot by reading your ideas, opinions, and questions. I will be sad to see you go :(

Moak November 30th, 2001 08:28 PM

Thx thx Aigamisou :) Meanwhile we have established a new Developers Forum (this one)... I hope we can attract even more developers, geeks, geekgirls, network gurus, friendly moderators and people with ideas, and build a coders' community for Gnutella's future... and I'm posting more again.

Aigamisou December 3rd, 2001 05:04 PM

*yeah*
 
Good, I am glad to see you are going to stick around. We do not agree on some things, but I like to hear from people who do not agree... it helps me learn. You have good ideas.

Nobody January 14th, 2002 05:39 AM

One thought on swarming: why do you need to find out which files are popular and download them "in the background"? You would only cause even more traffic...

The only thing you need is a cache, with its size depending on connection speed. If an ADSL user has a 2 GB cache file, that should be OK for him/her nowadays... When you download something it goes into the cache, then it is copied into the download folder. If your downloads exceed the 2 GB limit, the files that were most uploaded from you stay in the cache, and the others are replaced by the newly downloaded ones - so no freeloaders.
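
(A minimal sketch of that eviction rule, in Python. The 2 GB default, the data layout, and the method names are assumptions for illustration; real disk handling is omitted.)

```python
class SwarmCache:
    """Sketch: fixed-size cache that keeps the files most uploaded from us."""

    def __init__(self, limit_bytes: int = 2 * 1024**3):  # ~2 GB, per the ADSL example
        self.limit = limit_bytes
        self.files = {}  # file_id -> [size_bytes, upload_count]

    def add(self, file_id: str, size: int) -> None:
        # Every completed download lands in the cache first.
        self.files[file_id] = [size, 0]
        self._evict()

    def note_upload(self, file_id: str) -> None:
        # Uploads from the cache make a file more valuable to keep.
        self.files[file_id][1] += 1

    def _evict(self) -> None:
        # While over budget, drop the least-uploaded file first, so the
        # cache retains what other users actually fetch -- no freeloading.
        while sum(size for size, _ in self.files.values()) > self.limit:
            victim = min(self.files, key=lambda fid: self.files[fid][1])
            del self.files[victim]
```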

And on the modem user problem: they are not the only ones who "steal" bandwidth. A 56/36k modem user can upload at about 4k/s and download at about 8k/s (approx.) - roughly a 2:1 ratio.

With ADSL/cable at 384/64 or 512/128, the gap is bigger (6:1 and 4:1 respectively), so those users (me too) get even more than they give....

Multicasting? Wouldn't that be possible?

efield January 14th, 2002 11:07 AM

A note for those not aware, Freenet shares through a cache that contains files requested on the network. The cache is encrypted so you do not know what is being shared.

<http://freenetproject.org/>

Unregistered February 9th, 2002 02:29 PM

I like the swarming idea :)

Unregistered March 24th, 2002 04:59 AM

I agree with 'Nobody' - I think it sux to use bandwidth downloading files that I personally won't ever use.

But I think it's a great idea to have a cache of files I have previously downloaded which are deemed "useful" by the Gnutella network, however it decides that.

I have a question though - if I search for "britney spears", what is going to stop me from getting individual results for all the sub-parts of that one giant, popular Britney Spears MPEG, when all I want is her latest MP3? (OK, bad example, but you get what I mean?)

Basically, it does come down to a protocol change, I think, even if a very small one.

What about a user searching with an old version? If you just use the current protocol and put the filtering at the client level, then old clients will be swamped with search results for partial files. You would be creating, say, 100 part-files in place of every 1 file now, and then distributing each of them, say, 100 times as widely - so 100 x 100 = 10,000 query hits where before there would have been one.

I don't want that!
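
(For illustration, here is roughly what that client-level collapsing could look like in Python, for new clients that understand the extra field. The "full_hash" and "host" keys are hypothetical stand-ins for whatever the protocol change would carry; old clients, which are the worry above, would get no such relief.)

```python
from collections import defaultdict

def collapse_partial_hits(query_hits: list) -> list:
    """Sketch: merge per-part QueryHits into one result per whole file."""
    grouped = defaultdict(list)
    for hit in query_hits:
        # Each hit is assumed to be a dict carrying the hash of the
        # *complete* file its part belongs to -- the proposed new field.
        grouped[hit["full_hash"]].append(hit)

    results = []
    for full_hash, parts in grouped.items():
        results.append({
            "full_hash": full_hash,
            "sources": len(parts),  # shown to the user as a single entry
            "hosts": sorted({p["host"] for p in parts}),
        })
    return results
```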

Also, the point about caching files with offensive content is very valid. This is why Freenet works the way it does: no one can hope to create a filter that catches every file they would find offensive, and many people find it morally objectionable to knowingly host files they find objectionable. So Freenet encrypts everything, so that you can't know what you are hosting. You can say: either I play or I don't, but at least I will never know what horrors I am distributing. Therefore the culpability rests on the people sharing those files.

This moral benefit is not a clear benefit to all users, but the legal benefit certainly is. If you can't know what is being stored, in most countries that currently means you aren't responsible for it; and secondly, no one can point the finger of blame, because they can't decode the contents unless they already know what the file is (in which case they could well be the people who caused the file to be stored on your system - it's a clever protocol!).

Unless you can answer these questions, "real swarming" should be strictly optional, and not enabled by default.

In that case, how many people will enable it? Perhaps clients could be set up so that people can only download swarmed files if they also host them, to encourage people to turn it on. I dunno.

