Gnutella Forums

Gnutella Forums (https://www.gnutellaforums.com/)
-   LimeWire Beta Archives (https://www.gnutellaforums.com/limewire-beta-archives/)
-   -   LimeWire 1.9 Beta Available (https://www.gnutellaforums.com/limewire-beta-archives/6262-limewire-1-9-beta-available.html)

afisk December 7th, 2001 05:55 PM

LimeWire 1.9 Beta Available
 
LimeWire has released the beta version of LimeWire 1.9, available at:

http://www.limewire.com/index.jsp/download_beta

This is one of the most important LimeWire releases to date, with features including:

<ul>
<li>"Swarm Downloads," or the ability to download a single file from multiple sources so you get the file more quickly. LimeWire will automatically select the best hosts to simultaneously download from when you select a download group.
<li>"UltraPeers," or more powerful computers on the network that many "client" peers can connect to. Over time, this change will dramatically increase the number of files that you can find while also reducing the bandwidth used for messaging, leaving more bandwidth for file transfers. This is a particularly useful feature for modem users, as you will see a significant increase in download speeds.
<li>LimeWire now offers the option to only close the program after all of your current uploads and downloads are complete, meaning that you should be able to successfully download more files.
<li>Hosts that you can chat with are now displayed with a chat icon in the search and download windows. There is also now a button that you can use to begin a chat session with the selected host.
<li>File and browser launching now work on Mac OS 10.1.
<li>The dreaded "Type 2" error when using the Mac menu on Mac OS 9.2.1 has been fixed.
<li>Many minor bug fixes.
</ul>

We've been working very hard over here on the LimeWire team to get this version out, so we really hope you like it! We understand everyone's frustration with the ads and the bundled software, but please be patient with us as we try to make this a network where we can make money in more sensible and useful ways.

Thanks.

sweeppicker December 7th, 2001 07:29 PM

Thanx Adam

U guys are still #1. I don't mind the adware stuff....Don't let the rude
comments by immature and ungrateful folks get to u. You're all doing
a great job....Time to test drive the most exciting release yet

Thanx!:)

xYSpecOpsYx December 7th, 2001 08:16 PM

Ah, the swarming downloads make up for the ads!!! THIS IS THE BEST CLIENT!!! The new features will certainly make people rise above the ads and start acting like part of the community!!! Great job LimeWire.

Unregistered December 7th, 2001 09:20 PM

this could be illegal
 
you have NO RIGHT to force users to run cydoor spyware.
in your EULA it says that cydoor comes bundled with the install and it also says this
Quote:

"If you wish to remove this program from your system, be sure to remove the Cydoor ad-support item as well. You can do this through the 'Add/Remove Programs' applet in the Control Panel"
so you have no right to stop users from using LimeWire after removing cydoor. nowhere in your EULA does it say that users MUST run cydoor in order to use LimeWire.
also your install is broken because it does not create a cydoor uninstall entry in the 'Add/Remove Programs' applet in the Control Panel.

anti-bearshare December 7th, 2001 09:42 PM

If you dont want to see the ads then try running a better OS than Windows. For example *BSD (I run FreeBSD) or Linux....they dont have ads :]. Its only b/c Window users are stupid, most of the population, and Windows supports things like cydoor and all that other crap.


But besides that. LimeWire 1.9 kicks ***!! I love the stats in the title bar of the application, incomplete file option, upload options, and shutdown option. The UltraPeer and swarmed downloading technologies are also great. Keep up the good work!!

#655321 December 8th, 2001 06:24 AM

Swarming? Where??
 
I'm sorry but I can't see any hint of LimeWire using swarmed downloads in beta 1.9. As far as i observe it, each file is still downloaded from a single source only.

- I tried to download several files (each having about 10 mirrors, all 4-star quality and cable connections), granting me a great download speed of 0.5 to 2.0 kb/s on average using ISDN...

- the download list only shows "downloading from xxx.xxx.xxx.xxx", indicating a single-source-transfer.

- furthermore, the downloads in a download list all have EXACTLY the same filename, therefore i do believe that they were sorted by filename only and not by sha1-hash or at least file length.

how do i see that Limewire is actually swarming a download ? i didn't notice any difference at all in its download behaviour.

Also if it really does support swarming, a research button should be added in order to search for new sources when all the old ones don't work because they're outdated or something...

Unregistered December 8th, 2001 10:19 AM

Thanks very much for the support from many of you.

Just a quick response to the comment about Cydoor -- the version of Cydoor that we install is nothing more and nothing less than an ad engine. It does not maintain any information about you. It is really important to us to make this distinction, because Cydoor is the only component that we do not give you the option not to install, and it does not collect information about you. It's basically just like a banner ad on a web site.

Cydoor does not appear in Add/Remove programs because it is uninstalled when LimeWire is uninstalled.

Unregistered December 8th, 2001 10:20 AM

In response to the question about the swarmed downloads, basically they will not always go into effect. They will only go into effect for files that are exactly the same size, so this almost always means that a given group will contain several swarm "buckets" of exact file matches. Then within this group, you will only swarm from hosts that you successfully connect to. So, in practice this means that often a group of 10 hosts will not generate a swarmed download, although sometimes it will.

We may also implement some changes that should improve this considerably (implementing persistent connections using HTTP 1.1, for example), but this is the way it is for the current release.

Let me know if you want any more details!
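The bucketing rule described in this post can be sketched in a few lines. This is a minimal illustration only, not LimeWire's actual code; the class and method names are invented, and real grouping also involves name matching and connection attempts.

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

// Illustrative sketch: group search results into swarm "buckets" by exact
// file size. Only hosts inside one bucket are candidates for swarming.
public class SwarmBuckets {
    // hosts.get(i) offers a file of sizes.get(i) bytes.
    public static Map<Long, List<String>> bucketBySize(List<String> hosts, List<Long> sizes) {
        Map<Long, List<String>> buckets = new LinkedHashMap<>();
        for (int i = 0; i < hosts.size(); i++) {
            buckets.computeIfAbsent(sizes.get(i), k -> new ArrayList<>()).add(hosts.get(i));
        }
        return buckets;
    }

    public static void main(String[] args) {
        List<String> hosts = List.of("10.0.0.1", "10.0.0.2", "10.0.0.3");
        List<Long> sizes = List.of(4_200_000L, 4_200_000L, 4_199_999L);
        Map<Long, List<String>> b = bucketBySize(hosts, sizes);
        // Two hosts share an exact size and can be swarmed from; the third
        // differs by one byte and lands in its own bucket.
        System.out.println(b.get(4_200_000L)); // [10.0.0.1, 10.0.0.2]
    }
}
```

This is why a group of 10 results often yields no swarm: the sizes rarely match exactly unless the files really are identical.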

Morgwen December 8th, 2001 11:45 AM

[QUOTE]Originally posted by anti-bearshare
[B]Its only b/c Window users are stupid, most of the popluation, and Windows supports things like cydoor and all that other crap.[/B][/QUOTE]

The only stupid thing I can see here is your comment! :mad:

Morgwen

anti-bearshare December 8th, 2001 12:17 PM

[QUOTE]Originally posted by Morgwen
[QUOTE]Originally posted by anti-bearshare
Its only b/c Window users are stupid, most of the popluation, and Windows supports things like cydoor and all that other crap.[/QUOTE]

The only stupid thing I can see here is your comment! :mad:

Morgwen[/QUOTE]
heh, Well its the truth.

crohrs December 8th, 2001 12:31 PM

swarming
 
#655321:

If you are swarming a file, you will see "downloading from X hosts" instead of the usual "downloading from xxx.xxx.xxx.xxx".

Also, there is a feature to search for new results. Right click on the tab to bring up the menu.

John Blackbelt Jones December 8th, 2001 02:33 PM

How about giving a 'Search for File' option for the files you're downloading, in case the download fails? Or an 'automatically re-search for failed downloads and try to resume them, but keep an eye on the network load caused' option in the options.

afisk December 8th, 2001 03:58 PM

The two unregistered posts in a row were from me, by the way -- I had not realized that I wasn't logged in.

Unregistered December 8th, 2001 05:39 PM

"Find more hosts" from Fasttrack Clients
 
Yes, the two options mentioned two posts earlier are really necessary, because currently you can only download from the same person. That is not good when you want to stop the client while downloads are running. After a restart, the client could automatically resume the downloads (search for the file again, then download it from different hosts).

John Blackbelt Jones December 8th, 2001 10:04 PM

Passive search would be a nice feature, too, and it wouldn't create much network traffic...

anti-bearshare December 9th, 2001 08:20 AM

Actually it would create no network traffic at all. Since it doesnt send any queries out, its just watching for results people send. Now what would be cool is to have it automatically download a specific file it finds, granted you absolutely know what you're looking for. Maybe it could check the results by how big the file is, etc. Just an idea.

Unregistered December 9th, 2001 08:38 AM

I very much like the idea of automatically requerying the network for a file in the case where you try to resume a download on startup -- this is the ideal place for this kind of feature.

The delicate part here is that automatically sending searches out on the network has the potential to create a LOT of traffic -- it could potentially double the messaging traffic on the network if not done carefully. This is why we have not yet added something like this.

There's definitely room for a nice feature addition in this general area, we just have to work out exactly how it would work so that it doesn't do more harm than good. Other clients have done this (Phex, I believe still does this), but they are only really able to get away with it because they just aren't that popular, so the effect, while surprisingly detrimental in our studies, does not grind everything to a halt.

anti-bearshare December 9th, 2001 10:01 AM

uhhh, no one is talking about "requerying" (sending queries out at a specific interval) the network. Everyone is talking about PASSIVE searching. Meaning it would monitor the results sent out to the other hosts you're connected to that pass through you. For example


Host A, B, and C are connected together as I have shown,


A <-> B <-> C


Host A has a filename named 111

Host B has a filename named 222

Host C has a filename named 333


Well host B wants file 333, but he doesnt want to send a query out to the network because he knows it is such a broad/general search it would generate a lot of traffic. So he does a PASSIVE SEARCH. The PASSIVE SEARCH is just sitting there watching results go past and seeing if any of them contain 333.

ok....

Well Host A also wants file 333. He doesnt really care about network traffic, he just wants the file. So he sends out a query for 333. Well the query goes to Host B then to Host C. Well Host B doesnt have file 333 so it doesnt send anything back. But Host C does. So it sends a PONG (I think) back through the connection, to Host B then to Host A saying Host C has that file. Well when that result passes through Host B, the PASSIVE SEARCH picks it up and displays it. Then if Host B wants that file he just tries to directly connect to Host C to download the file, like it does already.


I hope this cleared some things up for you.
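The A/B/C scenario above can be sketched as code. This is an illustrative mock, not LimeWire's routing code: a node that relays query results for its neighbors simply inspects each result as it forwards it, sending no query of its own. All names are invented.

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of a passive search: watch results routed for other hosts and
// collect any that match a keyword this node is interested in.
public class PassiveSearch {
    private final String wanted;                    // keyword this node watches for
    private final List<String> matches = new ArrayList<>();

    public PassiveSearch(String wanted) { this.wanted = wanted; }

    // Called for every query-hit result this node routes on behalf of others.
    public void onRoutedResult(String host, String fileName) {
        if (fileName.contains(wanted)) {
            matches.add(host + "/" + fileName);     // remember where to download from
        }
        // ...the result is still forwarded toward the original querier as usual...
    }

    public List<String> matches() { return matches; }

    public static void main(String[] args) {
        // Host B wants "333" and passively watches results for A's query go by.
        PassiveSearch b = new PassiveSearch("333");
        b.onRoutedResult("hostC", "333.mp3");       // reply headed for Host A
        b.onRoutedResult("hostC", "444.mp3");       // not a match, just forwarded
        System.out.println(b.matches()); // [hostC/333.mp3]
    }
}
```

No extra messages enter the network; the cost is only the local bookkeeping on results already passing through.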

Unregistered December 9th, 2001 04:32 PM

HUGE??
 
Does 1.9 implement HUGE (full file hashes)?

Also, I would like to see researching (not just passively) the network based on the file hash.

afisk December 9th, 2001 06:19 PM

I was just responding to John Blackbelt Jones's message about requerying the network when a download fails -- it's a good idea in theory, but it's just always a little dangerous to put in features that automatically increase network traffic without the user explicitly initiating it. So, it just requires some careful thought, which probably would mean only applying it in very specific situations.

As far as passive searching, that's another issue entirely. It is also a potentially very valuable feature, and could reduce network traffic considerably. We conducted some thorough studies of this about a year ago now, however, and at that time the network was too dynamic for cached query replies (the message that responds to a file hit, a little different from a PONG, but sort of similar) to make that much of a difference. We could revisit this issue at some point, however, as the basic idea is inarguably sound.

I don't know if you do any programming in Java, but it might be interesting to hack up the source a bit and try to quantify again how effective this could be. UltraPeers in particular could change the effectiveness of this.

Oh, and sorry I keep not logging in on this thread -- I keep forgetting that I haven't logged into this site yet on my girlfriend's fancy Titanium. =)

Thanks.

afisk December 9th, 2001 06:22 PM

LimeWire 1.9 does not implement HUGE, although a future version likely will. While HUGE is inarguably a more robust solution than our current implementation, we've found that approximate file name matching combined with exact file length matches works just as well in many cases. The only danger is that sometimes file names do not approximately match even though they are the same file -- the type of oversight that a HUGE implementation would not make.

We should have it in there at some point, however.

Unregistered December 9th, 2001 06:31 PM

file hashing using md4, sfv or better yet md5 would be really sweet.
also url support like edonkey.
give url w/ filename, size and hash and client will automatically search for exactly that file and d/l it.

afisk December 9th, 2001 07:01 PM

HUGE would almost definitely be implemented using SHA-1. I'm not familiar with eDonkey, but I'll check it out. A HUGE implementation would include a "download mesh" as well, where the URLs of other users with the file would also be returned in query replies.
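The core of a SHA-1 based scheme like HUGE is just hashing file contents so that two files match exactly regardless of their names. A minimal sketch (the class name is invented; the real HUGE proposal also defines URN syntax and protocol details):

```java
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;

// Sketch: identify a file by the SHA-1 digest of its bytes.
public class FileHash {
    public static String sha1Hex(byte[] contents) {
        try {
            MessageDigest md = MessageDigest.getInstance("SHA-1");
            StringBuilder hex = new StringBuilder();
            for (byte b : md.digest(contents)) hex.append(String.format("%02x", b));
            return hex.toString();
        } catch (NoSuchAlgorithmException e) {
            throw new IllegalStateException(e);  // SHA-1 ships with every JDK
        }
    }

    public static void main(String[] args) {
        byte[] song = "same bytes, different names".getBytes(StandardCharsets.UTF_8);
        // Identical contents hash identically, so "band-album-title.mp3" and
        // "band -- title.mp3" would be recognized as the same file.
        System.out.println(sha1Hex(song).equals(sha1Hex(song.clone()))); // true
    }
}
```

With such a digest in query replies, swarm grouping no longer depends on filenames or sizes at all.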

anti-bearshare December 9th, 2001 08:18 PM

yeah.....Adam, I didnt know if that was you (b/c you've posted some without logging in before) or it was some average Joe Blow telling everyone what he thinks LimeWire needs b/c he wants to download his porn faster. I just took it as an average Joe. But it will help out others who dont know what a PASSIVE SEARCH is.

afisk December 9th, 2001 08:30 PM

Hey, no problem -- feel free to go ahead and tell me if some of my posts sound stupid (I'm sure many of them do anyway!) -- all the better for a lively exchange of ideas =).

JimmySmith December 9th, 2001 08:57 PM

1.9 A disappointment, get eDonkey2000
 
I dont care what you fools say, the "swarm downloads" suck and never work.... LW 1.9 is the biggest disappointment yet -- FastTrack prog's are still a million times better (even though I hate them too!) ---------- >> eDonkey2000 <<---------- is probably the best gnutella prog. right now, even though it probably has fewer users, at least you can download faster than 0.5 KB/s (-- I'm downloading Rush Hour 2 at 202 KB/s right now)!!!!!

eDonkey2000's FLIPPIN' SIMULTANEOUS MULTIPLE SOURCE DOWNLOADS ACTUALLY WORK!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!

.:Ron:. December 9th, 2001 09:17 PM

afisk
LimeWire Developer::::::.....

I'm sorry to say, but I'd have to agree with JimmySmith, eDonkey2K seems to download a lot faster than limewire due to their multiple source downloading system. Also: I can't seem to stay connected to limewire as easy as with eDonkey....

You guys R cool, and i'd much rather use limewire, but the swarm downloads really don't work which makes downloading VERY slow.

Keep trying guys, I'll download the next version and hopefully you'll have everything fixed.

.:Ron:.

doobybrain December 10th, 2001 12:03 PM

its just as cool as ever. cant wait till it gets outta beta though. :)

Unregistered December 11th, 2001 06:07 AM

what a HUGE disappointment :(
 
I thought the source code for HUGE was complete? Why hasn't it been implemented? Will we see it in 2.0?

I think that this is an extremely important feature that will advance the gnutella network.

crohrs December 11th, 2001 07:21 AM

The HUGE proposal will probably be added later. We already have enough new code to test for 2.0!

JohnReam December 11th, 2001 02:56 PM

eDonkey2000 ??
 
Some of you have mentioned alternatives to LimeWire, such as eDonkey2000 and FastTrack.

Which is best? Where can I find these clients to try out?


I still use LimeWire 1.7c. I am NOT a fan of Ads or Spyware. I like Limewire and would be happy to pay a reasonable amount (PayPal) for an ad-less version.

But I think I need to start checking alternatives to LimeWire.

JohnReam December 11th, 2001 04:11 PM

Alternatives to LimeWire....
 
I just surfed to http://www.taxster.f2s.com. This is a really good site that compares most of the file-sharing programs.

One of his favorites for Applications and Games is Direct Connect, with eDonkey2000 close behind.

Anybody ever try Direct Connect ?

John Blackbelt Jones December 11th, 2001 04:42 PM

Maybe you picked the wrong forum to elaborate the blessings of DirectConnect.

Unregistered December 11th, 2001 05:09 PM

This is the best site for other programs but this is the Limewire forum after all.

http://www.zeropaid.com/

Unregistered December 11th, 2001 09:33 PM

Mac OS X version of limewire beta
 
the link that you have on your website only allows me to download the 1.8b version of limewire and not the limewire 1.9 beta that it is supposed to link to.

Moak December 12th, 2001 12:17 AM

Hi,

is the "Sparky"-like Pong caching implemented into this beta version?

crohrs December 12th, 2001 06:44 AM

pong-caching
 
Hi Moak. LW 1.9 does not use pong-caching, mainly because we have enough new things in it to keep us busy. :-) Ultrapeers do shield leaf nodes from all pongs except new ultrapeer pongs, unless the leaf explicitly pings the ultrapeer.
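For context, the pong caching Moak asks about can be sketched roughly as follows. This is an illustrative mock, not LimeWire or "Sparky" code: a node keeps a small cache of recently seen pongs (host addresses) and answers incoming pings from it instead of broadcasting them.

```java
import java.util.ArrayDeque;
import java.util.Deque;
import java.util.List;

// Sketch of a pong cache: answer pings locally from recently seen pongs.
public class PongCache {
    private static final int CAPACITY = 6;          // illustrative size
    private final Deque<String> recentPongs = new ArrayDeque<>();

    // Record a pong (host:port) observed on any connection.
    public void addPong(String hostPort) {
        if (recentPongs.size() == CAPACITY) recentPongs.removeLast();
        recentPongs.addFirst(hostPort);             // freshest entries first
    }

    // Reply to a ping from the cache instead of forwarding it to neighbors.
    public List<String> answerPing() { return List.copyOf(recentPongs); }

    public static void main(String[] args) {
        PongCache cache = new PongCache();
        for (int i = 1; i <= 8; i++) cache.addPong("10.0.0." + i + ":6346");
        // Only the freshest CAPACITY entries survive.
        System.out.println(cache.answerPing().size()); // 6
    }
}
```

The bandwidth saving comes from ping/pong traffic no longer flooding the network; the ultrapeer shielding crohrs describes attacks the same traffic from another angle.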

Unregistered December 12th, 2001 09:48 AM

Just curious, you said that there is already enough new code for 2.0? What other features will be added?

When will HUGE be implemented?

-gnutellafan

Abaris December 12th, 2001 01:54 PM

improving swarm downloads
 
You have pointed out that files are only bundled for a swarm download if they are in the same download group and have the same size, and that they are bundled into download groups only if they have exactly the same filename. however, it turns out that this way of swarming fails way too often and is nothing less than a disappointment - i'm sorry to tell you, but it's the truth.

I think it is very unlikely to find a lot of files that have exactly the same filename. limewire should rather group downloads into download lists by size only. after all, the files were returned as replies to the same query string, and if the file size is exactly the same, it is very likely that the files are too.

If that is not enough safety you can still implement a rollback interval in order to check if the files are the same (gnucleus does so, i think it uses a 4 kb rollback, but i am not sure).

i think that would greatly increase the number of mirrors and the effectiveness of swarming. i find it *very* annoying that i get results of files that have exactly the same size and only differ slightly in the title (for example, some are called band-album-title and others are called band -- title). although it is obvious that these files are equal, limewire does not recognize them as mirrors.

And of course there should be an option for researching files for an active download, adding the results to the mirror list if they equal the file. this does not mean any kind of auto-requerying. but when i resume a file that is already 80%, but the hosts in the mirror list are no longer running, i want to search again for sources of that very file instead of restarting at zero.

i think gnucleus has the most stable implementation of all this (although it does not swarm, it only downloads from one mirror in the list at one time). you should try to get your mirror lists as accurate as those of gnucleus are, this *is* possible and swarming would then be no problem at all. this could all be done easily even without HUGE.

--- awaiting comments

afisk December 12th, 2001 02:13 PM

HUGE is clearly the best way to implement this, and this is where the majority of improvements in this area will come from, along with HTTP 1.1.

The file matching does not require exact names -- it uses an approximate matcher to identify files with similar names. I do agree with your point that files of the exact same length for a given query are almost definitely the same file, in which case you could bypass the approximate matching by name all together. We may implement this in our final release.

I disagree that this will significantly increase the number of swarmable files, however, as it is quite rare that identical files have names that are so different that the approximate matcher does not group them together.

We will likely implement the researching for more hosts for a given download feature at some point.

The easiest and most robust way of getting more hosts is to build a download mesh -- this is really the ideal, and it is not that hard. We will be doing this fairly soon.

Thanks for the suggestions -- the file length one in particular we may just go ahead and implement.
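One way an approximate name matcher like the one afisk mentions could work is by comparing normalized tokens. This is a hypothetical sketch under that assumption, not LimeWire's actual matcher; the class, method, and threshold are all invented for illustration.

```java
import java.util.Arrays;
import java.util.HashSet;
import java.util.Set;

// Sketch: treat two file names as the same file if their lowercase
// alphanumeric tokens mostly overlap.
public class ApproxMatch {
    static Set<String> tokens(String name) {
        Set<String> t = new HashSet<>(Arrays.asList(name.toLowerCase().split("[^a-z0-9]+")));
        t.remove("");                               // drop empty leading token
        return t;
    }

    // Lenient rule: allow the smaller token set to miss at most one token.
    public static boolean similar(String a, String b) {
        Set<String> ta = tokens(a), tb = tokens(b);
        Set<String> common = new HashSet<>(ta);
        common.retainAll(tb);
        return common.size() >= Math.min(ta.size(), tb.size()) - 1;
    }

    public static void main(String[] args) {
        // Abaris's example: "band-album-title" vs "band -- title" should group.
        System.out.println(similar("band-album-title", "band -- title")); // true
        System.out.println(similar("band-album-title", "other-song"));    // false
    }
}
```

Combined with an exact size match, a rule like this groups Abaris's examples into one bucket while keeping unrelated names apart.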

Unregistered December 13th, 2001 09:26 AM

bitzi and source code
 
I know bitzi supplies source code for HUGE. Was there a java version of the source code? If not, how difficult is it to convert the source code (I imagine that it is pretty tough)?

Will BearShare be implementing HUGE? Vinnie has said in the past that he will not have multisource downloads w/o filehashes. I am sure LW and BS have discussed this. As the two clients make up the majority of the network it would be nearly fully implemented if the 2 would support it.

Any timeline?

What benefits does HTTP 1.1 provide?

-gf

afisk December 13th, 2001 09:51 AM

I know that the Bitzi guys coded up a Java implementation at one point, but I don't think it's publicly available yet. I just sent a message to them checking on the status, though.

As far as HTTP 1.1 goes, it's basically nice for two reasons:

1) persistent connections, so you don't have to disconnect when you've received your part of the file, and then reconnect to request another part.

2) the ability to simply connect without embedding precisely what you're asking for in the connection header -- this allows you to connect to everyone you can connect to, and then decide which parts of the file to request from which hosts. This is nice because you have a little more information (who you can actually connect to) before you have to decide which parts of the file to request from who.
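Concretely, both points come down to issuing byte-range requests over one kept-alive connection. A minimal sketch (the `/get/...` path follows Gnutella's usual upload request form, but the exact request a client sends is simplified here):

```java
// Sketch: with HTTP/1.1, one persistent connection can carry several
// byte-range requests in turn, so a downloader can decide which chunk to
// claim from a host after connecting, and ask for the next chunk without
// reconnecting.
public class RangeRequests {
    // Build a GET for bytes [start, end] (inclusive) of a shared file.
    public static String rangeGet(String path, long start, long end) {
        return "GET " + path + " HTTP/1.1\r\n"
             + "Connection: keep-alive\r\n"          // keep socket open for the next chunk
             + "Range: bytes=" + start + "-" + end + "\r\n\r\n";
    }

    public static void main(String[] args) {
        // Two consecutive chunk requests that could share one connection.
        System.out.print(rangeGet("/get/42/song.mp3", 0, 99_999));
        System.out.print(rangeGet("/get/42/song.mp3", 100_000, 199_999));
    }
}
```

Under HTTP 1.0 each of those requests would cost a fresh connection, and the range would have to be chosen before knowing which hosts answer at all.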

I think Vinnie will implement HUGE, although I'm not positive. As far as a timeline for the hashes, it should be pretty soon. It's not at the top of our priority list, but it's not far from it.

Thanks.

afisk December 13th, 2001 09:55 AM

Oh, and I just implemented the change to group same-size search results together. Due to the technical details of the code, the first group can contain files of the same size as other groups, but after that every same-size file is put in the same bucket. I'm about to commit the code now.

Abaris December 13th, 2001 10:50 AM

i am very curious about this download mesh - could you explain how it is supposed to work and what its advantages are? do you have any technical article available at limewire or the gdf?

--- thanks

off topic:
what do you think about implementing channel chat on IRC? Gnucleus already does this, but it has far too few users...the channel is deserted...is it an option for some time in the future?

Jay Presper Eckart December 13th, 2001 07:28 PM

Well it's about god damn time!
 
All you fools dissing limewire don't know what you're talking about! This is f___ing amazing compared to the previous gnutella programs! Who cares about the stupid spyware! What do you f____ing expect for free??

The swarm downloads kick ***! Instead of limewire's usual 2 KB/s Im downloading at 30-100 consistently! & since my network admin blocked morpheus/kazza (that a__hole) this is my main source of free sh_t!

Oh yeah, for you fools saying the swarm dl's dont work, you might need to leave limewire running for a while to get 'em to work.

THANKS FELLAZ!

Unregistered December 14th, 2001 05:51 AM

download mesh
 
I believe the download mesh he is referring to is where one client that hosts a file will also keep track of other clients with the same file. When the host receives a request for the file it will also send information regarding other hosts where the same file can be found. This will allow for more hosts to swarm from.

-gf

Abaris December 14th, 2001 05:51 AM

nice that swarming works for a flaming ignorant with a highspeed connection beating on people who utter criticism. this is a beta release man and we are just trying to help the lime folks in making it even better. you might be glad because you can load your stuff with 30-100 now, but there are many many modem users who would be glad about 5-6. of course swarming is one of the greatest improvements ever in gnutella. but compared to swarming with morpheus and edonkey it is not a match at all. so relax and smile about the success at yours, but don't flame us because we would like to benefit as well.

afisk December 14th, 2001 07:32 AM

Jay, thanks very much for the enthusiastic support, and I'm glad you're enjoying the program. I do really want to discourage anybody on the forums from personally attacking each other, though -- let's try to keep this a nice and clean operation, ok?

The final release should have significant improvements in a number of areas, although they might not be immediately apparent. We will probably be releasing another beta today with the new changes.

As far as the download mesh goes, the previous poster was correct. A download mesh relies on the basic observation that you can easily keep a record of who has successfully downloaded a file from you, storing their ip address. Then, when you respond to another query for a file that you have, you can also return the ip addresses of everyone else who has successfully downloaded that file from you -- this provides new peers for swarmed downloads extremely cheaply.

The mesh part really comes in when you request the file from one of the ips returned in the first list, and it returns another list of ips of hosts who have successfully downloaded the file from it, and so on. Quite quickly (and with very few messages) you can get a long list of hosts who have a given file. This addresses the issue of scalability in a whole new way, as you can easily know all you need to know (that they have the damn file) about hosts otherwise not within your network horizon.
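The bookkeeping afisk describes can be sketched as follows. This is an illustrative mock with invented names, not LimeWire's implementation: each host records its successful downloaders and hands that list out, and a downloader merges the lists it receives.

```java
import java.util.LinkedHashSet;
import java.util.List;
import java.util.Set;

// Sketch of a download mesh node: remember who downloaded a file from us,
// advertise them as alternate locations, and merge lists from other sources.
public class DownloadMesh {
    private final Set<String> knownSources = new LinkedHashSet<>();

    // Called after a peer finishes downloading the file from this host.
    public void recordSuccessfulDownloader(String ip) { knownSources.add(ip); }

    // What this host would attach to a query reply for the file.
    public List<String> alternateLocations() { return List.copyOf(knownSources); }

    // A downloader merges the location lists it receives from each source.
    public void mergeFrom(DownloadMesh other) { knownSources.addAll(other.knownSources); }

    public static void main(String[] args) {
        DownloadMesh a = new DownloadMesh(), b = new DownloadMesh();
        a.recordSuccessfulDownloader("1.1.1.1");
        a.recordSuccessfulDownloader("2.2.2.2");
        b.recordSuccessfulDownloader("3.3.3.3");
        a.mergeFrom(b);  // one request to b yields b's whole list of peers
        System.out.println(a.alternateLocations().size()); // 3
    }
}
```

Each request to a known source can return that source's whole list, so the set of swarm candidates grows with very few messages and no extra queries on the network.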

afisk December 14th, 2001 07:35 AM

Oh, and we should have mentioned that swarm downloading is completely turned off for modem users. We did this because there is a potential shortage of upload slots on the network if everyone starts swarming, and modem users (we thought) could not really get much out of swarming anyway.

We are, of course, open to changing this if given a good reason. Do the other programs allow swarming for modem users? Does it improve things that greatly?

Apologies for not mentioning the modem user exclusion sooner.

Unregistered December 14th, 2001 08:46 AM

BTW, dl limit does not seem to work when you're swarming.

Right now, I have dl limited to 8 simultaneous, but I have 8 dl slots active from which multiple are downloading from 2 or more hosts.

DL limit SHOULD count swarming also. Right now it seems that I could have limit of 8 and still have 8 files downloading with each swarming from 12 hosts which in total would mean 96 active connections.

Not very bandwidth-effective, eh? ;)



Powered by vBulletin® Version 3.8.7
Copyright ©2000 - 2025, vBulletin Solutions, Inc.

Copyright © 2020 Gnutella Forums.
All Rights Reserved.