Sharing large numbers of files - performance problem?

In an attempt to get a little more legitimate content onto the Gnutella network (I fear that without enough legitimate content, network providers will habitually block it), I put up a host running the latest LimeWire (2.9.11) to serve out a collection of approximately 10,000 well-named fine art images of 100-200K each. On startup, CPU usage builds to 100% after about 30 minutes, at which point network activity begins to drop off. The application then typically crashes after a further hour in this state. The box isn't particularly powerful (2x450MHz PII, 384MB, Win2K). Disk activity is minimal, with no paging or thrashing.

Would I be right in thinking that the load (and subsequent problem) is the result of incoming searches? If so, are there recommendations for how many files a single LimeWire instance should serve out? Or is it simply the case that the software has never been tuned for larger collections of smaller files? Any assistance appreciated.
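To illustrate the search-load hypothesis: if each incoming Gnutella query were matched by scanning every shared filename, per-query CPU cost would grow linearly with the library size, and a 10,000-file library would multiply that cost for every query the ultrapeer forwards. A keyword inverted index makes per-query cost proportional to the number of hits instead. The sketch below is hypothetical (class and method names are mine, not LimeWire's actual code) and just demonstrates the data structure:

```java
import java.util.*;

public class KeywordIndex {
    // Map from lowercase keyword to the indices of files whose names contain it.
    private final Map<String, List<Integer>> index = new HashMap<>();
    private final List<String> fileNames = new ArrayList<>();

    // Tokenize the filename once at share time and register each keyword.
    public void addFile(String name) {
        int id = fileNames.size();
        fileNames.add(name);
        for (String word : name.toLowerCase().split("[^a-z0-9]+")) {
            if (word.isEmpty()) continue;
            index.computeIfAbsent(word, k -> new ArrayList<>()).add(id);
        }
    }

    // Answering a query touches only the matching entries, not the whole library.
    public List<String> query(String keyword) {
        List<String> hits = new ArrayList<>();
        for (int id : index.getOrDefault(keyword.toLowerCase(),
                                         Collections.emptyList())) {
            hits.add(fileNames.get(id));
        }
        return hits;
    }

    public static void main(String[] args) {
        KeywordIndex idx = new KeywordIndex();
        idx.addFile("Botticelli - Birth of Venus.jpg");
        idx.addFile("Modigliani - Reclining Nude.jpg");
        System.out.println(idx.query("nude")); // prints [Modigliani - Reclining Nude.jpg]
    }
}
```

Even with an index, a keyword that matches thousands of files still produces thousands of hits to serialize per query, so popular keywords remain expensive.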
The hashing is (I believe) complete - LimeWire reports the files as being shared, both in the summary indicator and in the library. Whereas the initial startup (disconnected) took a few hours to report all files as available, it now takes only a few minutes.
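For context on why the first startup took hours while later ones take minutes: Gnutella clients identify shared files by a SHA-1 hash (the `urn:sha1` identifier), which requires reading every byte of every file once; once computed, the hashes can be cached across restarts. A minimal sketch of per-file hashing in Java (my own helper, not LimeWire's code):

```java
import java.io.IOException;
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;

public class FileHasher {
    // Compute the SHA-1 digest of a file as a hex string, streaming the
    // contents in 64K chunks so memory use stays constant regardless of size.
    public static String sha1Hex(Path file)
            throws IOException, NoSuchAlgorithmException {
        MessageDigest md = MessageDigest.getInstance("SHA-1");
        try (InputStream in = Files.newInputStream(file)) {
            byte[] buf = new byte[64 * 1024];
            int n;
            while ((n = in.read(buf)) != -1) {
                md.update(buf, 0, n);
            }
        }
        StringBuilder sb = new StringBuilder();
        for (byte b : md.digest()) {
            sb.append(String.format("%02x", b));
        }
        return sb.toString();
    }
}
```

For 10,000 images of 100-200K each, that is only 1-2 GB of I/O, so hashing itself is a one-time cost and unlikely to explain sustained 100% CPU after startup.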
Ah, I think I may have found the problem. About 4,000 of the 10,000 images had 'nude' in their title. D'oh! I've nixed the Botticelli and Modigliani collections along with a few others to see if this improves matters. Failing that, I might try digging out a Java profiler to see if there's a particular bottleneck. Are the developers open to feedback of this sort?
Copyright © 2020 Gnutella Forums.
All Rights Reserved.