Gnutella Forums

Gnutella Forums (https://www.gnutellaforums.com/)
-   LimeWire Beta Archives (https://www.gnutellaforums.com/limewire-beta-archives/)
-   -   LimeWire 3.3.0 Beta! (https://www.gnutellaforums.com/limewire-beta-archives/21074-limewire-3-3-0-beta.html)

afisk July 15th, 2003 08:05 AM

LimeWire 3.3.0 Beta!
 
Well, we're continuing to roll out the betas at a fast pace. LimeWire 3.3.0 includes the much-awaited partial file sharing feature. In this version, as soon as you've downloaded 1 megabyte of a file, you start sharing that file with other users. This is particularly useful for larger files, when many of the available sources may only have partial content. It's also useful for discovering new sources of a file over the course of your download, as a user who attempts to download your partial file can also tell you about new sources to download from.
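The 1 MB threshold described above can be sketched roughly as follows; the figure comes from the announcement, but the class and method names are purely illustrative, not LimeWire's actual code:

```python
# Sketch of the partial-file-sharing rule described in the post:
# once 1 MB of a file has been downloaded, start sharing it.
# The threshold is from the announcement; all names are hypothetical.

SHARE_THRESHOLD = 1 * 1024 * 1024  # bytes received before sharing begins

class PartialDownload:
    def __init__(self, file_size):
        self.file_size = file_size
        self.verified_bytes = 0  # bytes received so far that we can serve

    def on_chunk_received(self, nbytes):
        # Record newly downloaded bytes and report whether the partial
        # file is now advertised to other users.
        self.verified_bytes += nbytes
        return self.is_shareable()

    def is_shareable(self):
        # Advertise the partial file once the 1 MB threshold is crossed.
        return self.verified_bytes >= SHARE_THRESHOLD
```

A downloader that reaches your partial file can then also report back alternate sources, which is the source-discovery benefit the post mentions.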

LimeWire 3.3.0 also has a number of improvements to downloads that should further improve download success rates. The free version is available from:

http://www.limewire.com/english/content/beta.shtml

and the pro version is available from your pro download page.

Thanks very much for everyone's assistance in testing the beta.

- LimeWire Team

Norm July 15th, 2003 06:42 PM

Adam,

My Pro page is showing 3.2.3

Norm

sdsalsero July 16th, 2003 08:17 AM

Norm,
Scroll to the bottom of your Pro download-page and you'll see an additional comment, "If you are interested in trying out the LATEST BETA of LimeWire PRO, please click here." This comment only appears when there's a beta-version available.
_________________________

Afisk,
I've seen the partial-file uploads working -- sweet! One question, though: How does this avoid duplicate uploads from the original source? For instance, if I download the first 1MB of a file, then the source-IP goes off-line, and I end-up sharing my 1MB of file with a dozen other people. Now, the source-IP comes back on-line. Do I and the others all request the same 2nd MB of the file? Assuming it's bigger than just 2MB, I would hope we all request different parts of the file so that, if the source goes off-line again before we all finish, we can all share our different parts and hopefully end-up with a completed file.

Also, when will there be an option to disable Host Browsing of your own machine? I understand that this won't absolutely prevent people from seeing your files (since you need to post your file-list to your UltraPeers) but I think it will help (privacy) more than it will hurt (convenience for others).

trap_jaw4 July 16th, 2003 09:14 AM

Quote:

Originally posted by sdsalsero
One question, though: How does this avoid duplicate uploads from the original source? For instance, if I download the first 1MB of a file, then the source-IP goes off-line, and I end-up sharing my 1MB of file with a dozen other people. Now, the source-IP comes back on-line. Do I and the others all request the same 2nd MB of the file? Assuming it's bigger than just 2MB, I would hope we all request different parts of the file so that, if the source goes off-line again before we all finish, we can all share our different parts and hopefully end-up with a completed file.
This may be a small improvement, but do you think it's a good idea to fragment the partial files any more than absolutely necessary (you might want to preview a file, for example)? How often do you suppose a case like this will occur?

Quote:

Also, when will there be an option to disable Host Browsing of your own machine? I understand that this won't absolutely prevent people from seeing your files (since you need to post your file-list to your UltraPeers) but I think it will help (privacy) more than it will hurt (convenience for others).
You don't upload your file-list to your Ultrapeer. You just upload a list of hashed keywords that cannot be used to reconstruct what you are sharing.
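The hashed-keyword table trap_jaw4 describes works roughly like this sketch. Gnutella's actual Query Routing Protocol uses its own hash function and table format; this illustrative version substitutes CRC32 and a plain set, so only the idea (buckets, not filenames, reach the ultrapeer) carries over:

```python
# Illustrative sketch of a QRP-style keyword table: a leaf sends its
# ultrapeer only hash buckets derived from its shared filenames, never
# the filenames themselves. Gnutella's real QRP hash differs; CRC32 is
# used here purely for illustration.
import zlib

TABLE_SIZE = 65536  # number of hash buckets

def keyword_buckets(text):
    # Split a filename or query into keywords and hash each to a bucket.
    return {zlib.crc32(w.lower().encode()) % TABLE_SIZE for w in text.split()}

def build_table(shared_files):
    # The union of all keyword buckets is what the leaf uploads.
    table = set()
    for name in shared_files:
        table |= keyword_buckets(name)
    return table

def might_match(table, query):
    # The ultrapeer forwards a query to the leaf only if every query
    # keyword lands in an occupied bucket.
    return all(b in table for b in keyword_buckets(query))
```

The ultrapeer can see which buckets are occupied, but bucket indices alone do not reveal which files produced them, which is trap_jaw4's point about the list not being reconstructable.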

afisk July 16th, 2003 01:03 PM

Hi sdsalsero-

We haven't completely optimized the algorithm from the perspective of making sure that the rarest chunks are always downloaded first, a la BitTorrent. This is basically because we're coming at the problem from another angle -- from the angle of a file-sharing app that simply wants to also make partial files available. In BitTorrent, the problem is slightly different simply because node transience is more of an issue -- nodes are expected to go offline as soon as the file is complete. With Gnutella, it's expected that many nodes will have the complete file.

Not that we shouldn't make optimizations along these lines, but they're just not as big optimizations on Gnutella as they are with BitTorrent because the networks simply behave differently.
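For reference, the BitTorrent-style "rarest first" selection afisk alludes to can be sketched in a few lines. This is a generic illustration of the technique, with hypothetical names, not LimeWire's or BitTorrent's actual implementation:

```python
# Sketch of rarest-first chunk selection: among the chunks we still
# need, pick the one held by the fewest known sources, so scarce data
# spreads before a source disappears. Illustrative names throughout.
from collections import Counter

def pick_rarest_chunk(needed_chunks, peer_chunk_maps):
    # Count how many known peers hold each chunk we still need.
    availability = Counter()
    for chunks in peer_chunk_maps:
        for c in chunks:
            if c in needed_chunks:
                availability[c] += 1
    # Only chunks at least one peer can serve are candidates.
    candidates = [c for c in needed_chunks if availability[c] > 0]
    if not candidates:
        return None
    return min(candidates, key=lambda c: availability[c])
```

On Gnutella, where many nodes hold complete files, the payoff from this ordering is smaller than on BitTorrent, which is the tradeoff afisk describes.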

I hope I'm addressing your point and not just rambling!

Thanks.

Norm July 16th, 2003 04:21 PM

Thanks, sdsalsero

Norm

sdsalsero July 17th, 2003 11:34 AM

Thanks afisk, trap_jaw.
You're welcome Norm.

re my partials, I've seen lots of uploads marked as "Complete". I assume this means they finished downloading what portion of the file I had, rather than meaning they'd finished downloading a complete copy of the file (which I don't have).

It also brings up the question: How do you download from 2+ sources simultaneously? Again, I'm assuming that there's some sort of leap-frog logic so that you're not requesting the same "next chunk" from both sources.
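The "leap-frog logic" sdsalsero guesses at is, in swarming downloaders generally, disjoint byte-range assignment: each connection is handed a different unclaimed range, so no two sources ever serve the same chunk. A minimal sketch, with an assumed chunk size and hypothetical names:

```python
# Sketch of disjoint range assignment for a multi-source download:
# each source gets the next unclaimed byte range, so two connections
# never request the same chunk. Chunk size and names are assumptions,
# not LimeWire's actual values or code.

CHUNK = 256 * 1024  # request granularity in bytes (illustrative)

def assign_ranges(file_size, unclaimed_offset, sources):
    # Hand each source the next unclaimed chunk, in order, and return
    # the assignments plus the new unclaimed offset.
    assignments = {}
    offset = unclaimed_offset
    for src in sources:
        if offset >= file_size:
            break  # nothing left to claim
        end = min(offset + CHUNK, file_size)
        assignments[src] = (offset, end - 1)  # inclusive, HTTP Range style
        offset = end
    return assignments, offset
```

As each source finishes its range, it is simply handed the next unclaimed one, which produces the leap-frog pattern.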

Finally, trap_jaw, you say that a functional file-list is not uploaded to Ultrapeers?! Then, Heck, if there was an option to block Browse Host in LW (as there is in other servents) then there'd be a lot more work involved for someone trying to document what files everyone is sharing...

trap_jaw4 July 17th, 2003 01:18 PM

Quote:

Originally posted by sdsalsero
Finally, trap_jaw, you say that a functional file-list is not uploaded to Ultrapeers?! Then, Heck, if there was an option to block Browse Host in LW (as there is in other servents) then there'd be a lot more work involved for someone trying to document what files everyone is sharing...
There are simpler solutions for finding out what you are sharing than browsing your files. If you run a few ultrapeers, you can passively extract all you need to know from the queryHit traffic, and within a couple of hours you will know what thousands of users are sharing (even on servents that do not support browse host).

If you wanted to make gnutella secure, you'd have to proxy all file-transfers and rely on Push-like mechanisms to start all downloads. It's possible and not too complicated (you don't even have to remove the ability to browse hosts to achieve anonymity), but all uploads and downloads would only be half as fast and the download mesh would not work nearly as well. That is a price I'm not really willing to pay. There are other networks that already offer the kind of security you seem to seek, like Freenet, WASTE or GnuNET.



Copyright © 2020 Gnutella Forums.
All Rights Reserved.