Gnutella Forums

Gnutella Forums (https://www.gnutellaforums.com/)
-   General Discussion (https://www.gnutellaforums.com/general-discussion/)
-   -   Phex maxing out my CPUs (https://www.gnutellaforums.com/general-discussion/42958-phex-maxing-out-my-cpus.html)

Comrade Fidel August 11th, 2005 06:47 AM

Phex maxing out my CPUs
 
Why does Phex max out my CPU when I run it?

Surely it doesn't do *that* much behind the scenes? It's actually been causing overheating problems. Yes, I need a better heat management solution, but in the meantime I'd like to be able to leave Phex running without being around to keep an eye on my CPU temp...

As a workaround, I set the affinity to one CPU (it's a hyperthreaded CPU, so that throttles Phex a bit, but not much).

Also, I noticed that it will sporadically lose connection to all other hosts, reporting "Socket Closed" on the network tab. Why is this?

Here's my configuration:

Phex 2.6.0.87
Java 1.5.0_04-b05 (mixed mode, sharing)
Windows NT 5.1 ("XP") Professional (Version 2002)

Intel Pentium 4 3.01 GHz Hyperthreading
1 GB DDR-400 RAM
Plenty of free hard disk space

Thanks,
Fidel

GregorK August 11th, 2005 07:23 AM

What are your Phex settings?

Things that will cause high CPU usage are:

- Too many shared files! Incoming searches need to be checked against a local keyword database. If this database is very large, each lookup can take a while and eats CPU.

- Initial hashing of shared files. Files need to be hashed before they can be shared. If you share many files or very big files this can take a while, but it should stop once finished.

- A high number of parallel download connections. Each one uses a little bit of your CPU.

- A very high number of network connections. Though these should eat up your bandwidth faster than your CPU, they can also add up.

- Activated debug settings.
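As a rough illustration of the hashing cost in the second point, here is a minimal Java sketch of SHA-1 hashing a file before it can be shared. The class and method names are hypothetical for illustration, not Phex's actual code; the point is simply that this loop runs once over every byte of every shared file:

```java
import java.io.FileInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;

public class ShareHasher {
    // Hash a stream with SHA-1, reading it in chunks. A servent must
    // do this once per shared file, which is why the initial scan of
    // a large library can keep a CPU busy for quite a while.
    public static String sha1Hex(InputStream in)
            throws IOException, NoSuchAlgorithmException {
        MessageDigest md = MessageDigest.getInstance("SHA-1");
        byte[] buf = new byte[64 * 1024];
        int n;
        while ((n = in.read(buf)) != -1) {
            md.update(buf, 0, n);
        }
        StringBuilder sb = new StringBuilder();
        for (byte b : md.digest()) {
            sb.append(String.format("%02x", b));
        }
        return sb.toString();
    }

    public static void main(String[] args) throws Exception {
        try (InputStream in = new FileInputStream(args[0])) {
            System.out.println(sha1Hex(in));
        }
    }
}
```

Reading in 64 KB chunks keeps memory flat no matter how big the file is; the CPU cost is proportional to total shared bytes, which matches the "should stop when finished" behaviour described above.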


Hope that helps you..


The message "Socket Closed" usually indicates that you lost the network connection to that host.
If you lose all connections at once, it could mean your internet connection dropped.

Gregor

Comrade Fidel August 12th, 2005 02:36 AM

Settings...
 
Hi Gregor, thanks for your response. I'll see about getting my number of shared files under control (I have thousands of files shared out).

My network connections are restricted to about 20. With regards to my other settings, here is an educated-guess selection from my config file:

mDownloadMaxRetry=999
mDownloadRetryWait=4000
mMaxUpload=5
mMaxUploadPerIP=1
mNetConnectionTimeout=8000
mNetMaxHostToCatch=1000
mNetMaxRate=102400
mNetMaxSendQueue=500
mNetMinConn=4
mPushTransferTimeout=30000
mSearchMaxConcurrent=10
mUploadMaxSearch=64
maxConcurrentConnectAttempts=8
maxTotalDownloadWorker=99
maxUploadQueuePollTime=120
maxUploadQueueSize=100
maxWorkerPerDownload=99
orderingMethod=322
peerConnections=4
segmentHijacking=-1
segmentMultiple=4096
segmentTransferTime=90

Do you have any additional suggestions with regard to my settings?

Also, is there a FAQ or other document around regarding optimisation of Phex?

Thanks greatly for your help!

Cheers,
Fidel

kitonne August 13th, 2005 06:19 PM

Same issue here... the JRE is hogging the CPU after Phex 2.6.0.87 has been up and running for a while. XP Pro, SP2, P4-2800 with 1 GB RAM, i845 chipset, NAT, default settings for Phex, 3 Mbps/512 Kbps. Increasing the default number of allowed connections makes the problem worse, but no setting changes seem to cure it. Java 1.5.04, the latest per Sun's web site. Maybe a Java runtime issue, but suggestions are most welcome - I don't have any experience debugging in a Java environment, but am willing to try :)

About 20 shared files, and no more than 8 downloads at the same time...

Looked under phex directory, and found an error log which is full of the following entry, repeated a gazillion times (20KBytes worth of log file, anyway):

050811 07:42:31,0312:Error/GLOBAL: java.lang.IllegalStateException: Cant merge, thread does not hold segment lock. - Exception: java.lang.IllegalStateException: Cant merge, thread does not hold segment lock.
at phex.download.swarming.SWDownloadSegment.mergeWithNextSegment(Unknown Source)
at phex.download.swarming.SWDownloadFile.mergeSegments(Unknown Source)
at phex.download.swarming.SWDownloadWorker.handleDownload(Unknown Source)
at phex.download.swarming.SWDownloadWorker.run(Unknown Source)
at phex.common.ThreadPool$Worker.run(Unknown Source)
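For context on what that exception means, here is a hedged sketch of the guard pattern the message suggests. All names here are hypothetical for illustration, not Phex's real source: a merge method mutates shared download state, so it asserts that the calling thread already holds the segment's lock and fails fast otherwise rather than corrupt the segment list:

```java
import java.util.concurrent.locks.ReentrantLock;

public class Segment {
    private final ReentrantLock lock = new ReentrantLock();
    private long length;

    public Segment(long length) { this.length = length; }

    public ReentrantLock getLock() { return lock; }

    // Merging adjacent segments must happen under this segment's lock.
    // If a race lets a worker in without holding it, throw instead of
    // silently corrupting shared state - producing exactly the kind of
    // "does not hold segment lock" log entry seen above.
    public void mergeWithNext(Segment next) {
        if (!lock.isHeldByCurrentThread()) {
            throw new IllegalStateException(
                "Cant merge, thread does not hold segment lock.");
        }
        length += next.length;
    }

    public long getLength() { return length; }
}
```

If the check fires repeatedly, two download workers are racing on the same segment, which fits Gregor's later reply that a race condition in the download code was the suspected cause.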

Any suggestions would be greatly appreciated....

Comrade Fidel August 13th, 2005 10:26 PM

Follow-up...
 
Just as a follow-up, I greatly reduced the size of my library, and this seems to have cured Phex of its CPU hogging, and also addressed a stability issue it was causing.

Primarily, I did the following on the Windows system:

1. Unshared the Program Files directory and sub-directories.

2. Unshared a directory with some very large ISO files and ZIPs.

3. Unshared directories which have had files moved or deleted, or sub-directories deleted, then re-shared them after Phex had settled down.

With no other foreground apps running, my P4 3 GHz (hyperthreaded) runs at around 5 to 10% CPU, and Phex has been up for nearly sixteen hours.

I may experiment with the maximum number of connections. I've already set maximum downloads per file to 99, and upped the download limiter.

On a slightly different topic, can someone tell me: in the statistics tab, are the traffic metrics in bytes or bits? E.g., if it says I'm downloading 900 KB/s, is that 900 kilobytes or 900 kilobits? I need to know this to make sure I don't blow out my monthly limit, as I have a fairly fast Internet connection.

Thanks,
Fidel

GregorK August 14th, 2005 03:23 AM

@Fidel..
your config values look fine, and since reducing shared files helped, that might have been the main cause. 5-10% CPU use is not unusual.

Regarding the connection count, the number of ultrapeer-to-ultrapeer connections should be at least 16. Otherwise you will have a hard time connecting to anyone.
Using 99 parallel download connections is a lot and usually not necessary. The additional resource use is usually much higher than the gained download speed.
The statistics tab measures in bytes, and 1024 bytes is one kilobyte.
BTW the displayed speed is not very accurate...
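To put that in concrete numbers for the bandwidth-cap question, here is a small conversion sketch. The 900 KB/s figure is just the example from the question above, and the month length and naming are my own assumptions:

```java
public class BandwidthMath {
    // The statistics tab shows kilobytes per second; ISPs usually
    // quote line speed in kilobits per second, so multiply by 8.
    public static long kiloBytesToKilobits(long kBps) {
        return kBps * 8;
    }

    // Rough volume in gigabytes if a rate is sustained around the
    // clock for a 30-day month (using 1 GB = 1024 * 1024 KB).
    public static double monthlyGigabytes(long kBps) {
        long seconds = 60L * 60 * 24 * 30;
        return (double) (kBps * seconds) / (1024 * 1024);
    }

    public static void main(String[] args) {
        System.out.println(kiloBytesToKilobits(900)); // 7200 kbit/s
        System.out.println(monthlyGigabytes(900));    // roughly 2225 GB
    }
}
```

So a sustained 900 KB/s is about 7.2 Mbit/s of line speed, and would chew through over two terabytes in a month if left running flat out, which is why the bytes-vs-bits distinction matters for a monthly cap.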


@kitonne
Can you try to stop all downloads and look if it reduces the CPU use?
If this doesn't help try to run Phex in Leaf mode for a while and check if this helps.
To find the problem area, it's helpful to identify whether it's caused by network connections or download connections.

The error log you are seeing is known and should be fixed in the next release. It's only critical if it repeats so often that it burdens your CPU (multiple times per second). In that case it might be caused by a corrupted download file. You can try to identify it by stopping all except one download.

Gregor

Comrade Fidel August 14th, 2005 07:18 AM

Max connections
 
What is the recommended number of ultrapeer to ultrapeer connections? Is there a limit beyond which Phex will start to misbehave or consume large amounts of system resources?

F.

kitonne August 15th, 2005 04:48 PM

Left it running for more than 8 hours, with no downloads. It was responsive, except for the "library" tab, which takes forever to open (3-4 seconds, compared to less than 1/4 of a second for the other tabs). 19 files in the upload directory, which should not be an issue... No new errors in the log from sitting idle a full day. It looks like the problem only occurs after some downloads have already occurred; simply being attached to the network is not an issue.

After the problem occurs, it does not matter whether I kill all downloads and searches or not - the Java runtime still takes 100% of the CPU and needs to be killed (the task is not responsive to the normal window close command - maybe I did not wait long enough).

GregorK August 16th, 2005 08:42 AM

Re: Max connections
 
@fidel:
The default number of UP2UP connections is 32.
The resource use for each additional connection depends very much on the connection itself (fast/slow/how many further connections), but on average it's almost linear.

@kitonne
The Library tab should perform much better in the next release.
There was a race condition with downloads that I fixed a few days ago, but it should not always occur, if at all. Maybe (hopefully) you are the special case, and it's also fixed in the next release.

Comrade Fidel August 17th, 2005 07:13 AM

Danke...
 
Thanks Gregor! Phex is now well behaved and doing its work. Well done on a great job!

Do you have any intention to support other networks/protocols besides Gnutella in the foreseeable future?

F.

