Gnutella Forums


Unregistered September 21st, 2001 05:19 AM

Unrealistic search, HD is permanently on !!!!
 
I have 3000 files in my upload search directory; by the time one search ends a new one begins, and my HD is permanently on !!!!

To make ver 1.12 usable I had to take the main upload dir offline.

Make it search once a day, or user-defined !!

porttikivi September 21st, 2001 07:00 AM

Hmm... your 3000 files should be listed in RAM, and searches should be processed without accessing the disk. How much memory does Windows say it's using? Is it paging for some reason?

Unregistered September 21st, 2001 10:05 AM

I have 512 meg in the system, P3 600 (512K).

Win 2k says that mem usage is ~270 meg and Xolox shows that it's holding about 40 meg.

thanks

The Seeker September 21st, 2001 11:57 AM

Just for fun.
 
Ok, I went and shared my c:\windows directory and d:\ (Compaq recovery drive) to get my total number of shared files up to 3632 (1.45 GB). After the initial scan, I saw almost no hard drive activity. I saw a marked increase in CPU usage (probably due to searching) but not much extra resource use. Less than 3 MB more than before, which means <1 KB of RAM is used per indexed file.
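
A quick sanity check of that per-file figure, using the numbers quoted above (the ~3 MB delta is only a rough reading, so this is ballpark arithmetic, not a measurement):

```python
# Ballpark check of the per-file RAM cost quoted above.
shared_files = 3632
extra_ram_bytes = 3 * 1024 * 1024   # "less than 3 MB more than before"

per_file = extra_ram_bytes / shared_files
print(f"~{per_file:.0f} bytes (~{per_file / 1024:.2f} KB) per indexed file")
# prints roughly 866 bytes, i.e. under 1 KB per indexed file
```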

If your hard drive is constantly grinding, there are two reasons I can think of why this would be the case:

1) You are downloading one or more very large files (in the hundreds of megs) and they are nearing completion. At the end of a download, the segments that get put together are increasingly large, and copying a few hundred megs takes a while (see the sketch at the end of this post). I'll take this opportunity to echo the request made not long ago for FAT manipulation over filestream copying: it would be MUCH faster, and no extra HDD space would be needed, though it may be a little harder to figure out and manage without bugs.

2) You have a very fast internet connection, have set your speed settings to 'Max', and someone else with a very fast connection is downloading lots of stuff from you. I don't have a LAN so I can't be perfectly sure, but when downloading locally between BearShare and XoloX, XoloX would cause a lot more HDD grinding when sending files than BearShare does. (Poor/nonexistent pre-buffering for sends?)
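
To make explanation 1) concrete, here is a minimal sketch of the kind of stream-copy merge a client might do when a segmented download finishes (the file names and chunk size are made up; this is not XoloX's actual code). Every byte of every segment is read and rewritten into the final file, so finishing a multi-hundred-megabyte download means a few hundred megs of extra disk traffic and, briefly, roughly double the disk space.

```python
import os
import shutil

def merge_segments(segment_paths, output_path, chunk_size=64 * 1024):
    """Naive stream-copy merge: every segment is read and rewritten into
    the output file, so disk traffic equals the full size of the download."""
    with open(output_path, "wb") as out:
        for seg in segment_paths:
            with open(seg, "rb") as src:
                shutil.copyfileobj(src, out, chunk_size)
    # Segments can only be deleted after the copy succeeds, so for a moment
    # the finished download occupies roughly twice its final size on disk.
    for seg in segment_paths:
        os.remove(seg)

# Hypothetical usage: merging four large segments of a DivX download.
# merge_segments(["movie.part0", "movie.part1", "movie.part2", "movie.part3"],
#                "movie.avi")
```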

Unregistered September 21st, 2001 01:14 PM

Your Explanation 1) seems to be correct.

Thanks

PS: I was DLing lots of DivXs

Moak September 21st, 2001 02:49 PM

> FAT manipulation over filestream copying

Great idea. Does it work on different filesystems (FAT/NTSC... at least not on remote/network filesystems, but those are surely rarely used for a download directory)?

Another idea (which only reduces the needed filespace) is to split downloads into jobs while downloading (let's say 10 MB each). Merging all the parts together will then only eat 10 MB of extra space, not double the file size. A bit of extra logic for downloading and uploading partials is needed then. Also, if Xolox joins those parts together as soon as possible (appending completed segments to the first segment right away), there might be less or no harddisk heart attack (grin) when a huge download finishes.
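
A minimal sketch of that incremental-join idea (the 10 MB segment size and the function names are assumptions, not how Xolox actually works): each completed, in-order segment is appended to the head file and deleted right away, so both the extra disk space and the final copy are bounded by one segment instead of the whole file.

```python
import os

SEGMENT_SIZE = 10 * 1024 * 1024   # assumed 10 MB download jobs

def append_ready_segments(head_path, pending_segments, is_complete):
    """Append each completed, in-order segment to the head file and delete it.
    At most one segment (~10 MB) is ever copied or held as extra space,
    instead of re-copying the whole file when the download finishes."""
    still_pending = []
    gap_found = False
    with open(head_path, "ab") as head:
        for seg in pending_segments:
            if not gap_found and is_complete(seg):
                with open(seg, "rb") as src:
                    head.write(src.read())    # copies ~10 MB, not hundreds
                os.remove(seg)
            else:
                gap_found = True              # preserve order: stop at the first gap
                still_pending.append(seg)
    return still_pending

# Hypothetical usage, called whenever a segment finishes downloading:
# pending = append_ready_segments("movie.avi.head", pending, is_complete)
```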

(Btw, when I share 3000 files I have nearly no harddisk access when not downloading/uploading, so it must be Seeker's explanation one.)

PS: Seeker are you programming C++/Delphi/something?

The Seeker September 21st, 2001 03:49 PM

Nope
 
The only language I know most of the syntax and conventions for is QBASIC. I've dabbled a little with Rapid-Q, which is basically an OO Basic-for-Windows language, but I haven't learned to manage windows or network stuff yet, so I can't make my own Gnutella client. If enough people nag me to do it I might work on it though. ;)

Unregistered September 28th, 2001 01:29 PM

Yeah, you guys seriously need to work on CPU/disk utilisation when downloading large files. I've had to uninstall because it now just hangs at startup, churning the disk and eating 80% CPU, and the UI is unresponsive.

Unregistered September 28th, 2001 02:30 PM

> Great idea. Does it work on different filesystems (FAT/NTSC... at
You better say NTFS.

> PS: Seeker are you programming C++/Delphi/something?
Xolox is written using Delphi

puffa October 14th, 2001 03:18 AM

FAT manipulation?

Yipe.

I understand the principle and why you suggest it, but it scares me to trust someone else to manipulate the FAT.

Microsoft I have to trust, but a third party?

The files only really need to be stitched together when all are complete. That's only a single job.

I am prejudiced because I'm running a striped array, so the stitching doesn't cause me any great problems. If FAT patching were implemented, I'd want it to be an option I could turn off.

