Gnutella Forums


wowpow June 26th, 2008 02:38 AM

Search Results Stall at 25-30 Percent
 
Hi, this is my first post on this forum.

I have noticed, from version 2.8.6.95 up to the newest version, that search results start to stall at around 25 to 30% and then simply don't continue, no matter what I am looking for.

I become an ultrapeer in no time (less than an hour) and I am connected to 99 hosts.

Also, I noticed that even if you stop the search after it stalls, it somehow remains active, and if you start another search the CPU usage goes up.

I have put the IP of the computer running Phex in the router's DMZ and uninstalled the MS firewall too. Btw, XP is a clean install too, running on Java 5; I also tried Java 6, and both showed the same problems.

Any help appreciated :Smilywais:.

arne_bab June 27th, 2008 09:14 AM

The searches only run as long as they haven't generated enough results (dynamic querying). The % slowly increases nonetheless (the cut-off line should be somewhere around 500-1000 results, though if I remember correctly it is lower for rare files).
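
In pseudo-Java, the cut-off might look roughly like this (a minimal sketch; the class name, method name and exact thresholds are my guesses for illustration, not Phex's actual code):

[code]
// Minimal sketch of the dynamic-querying cut-off described above.
// Names and thresholds are invented for illustration only.
public final class DynamicQueryCutoff {
    private static final int COMMON_CUTOFF = 1000; // upper end of the 500-1000 band
    private static final int RARE_CUTOFF   = 500;  // reportedly lower for rare files

    /** Keep the query alive only while it hasn't generated enough results. */
    public static boolean keepRunning(int resultCount, boolean looksRare) {
        int cutoff = looksRare ? RARE_CUTOFF : COMMON_CUTOFF;
        return resultCount < cutoff;
    }
}
[/code]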

Also, the additional results you see after stopping are those sent in by other clients who received your search request.

Stopping the search stops Phex from sending out new requests, but it doesn't stop others from replying to already-sent requests, and Phex doesn't throw away results (except clear spam, or when requested by the user or by a defined and active rule).
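
In code terms, roughly (a hypothetical sketch, not Phex's real classes):

[code]
import java.util.List;
import java.util.concurrent.CopyOnWriteArrayList;

// Hypothetical sketch: "stop" only halts outgoing requests; replies
// to requests already sent are still accepted and displayed.
final class SearchSession {
    private volatile boolean stopped;
    private final List<String> results = new CopyOnWriteArrayList<String>();

    void stop() {
        stopped = true;                // no further query requests go out
    }

    void sendNextRequestIfActive() {
        if (!stopped) {
            // ... dispatch the next dynamic-query request here ...
        }
    }

    /** Late replies from already-sent requests still land here. */
    void onQueryHit(String result, boolean isClearSpam) {
        if (!isClearSpam) {            // kept unless spam or filtered by an active rule
            results.add(result);
        }
    }
}
[/code]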

Best wishes,
Arne

wowpow July 8th, 2008 03:13 PM

Well, that is not the case. If I search for an mp3 file that is really common, BearShare for example throws out 1000-3000 results, while Phex starts to stall at exactly 25-30%, sometimes even sooner, and just stops even though the query is still running, on the same mp3 file I searched for in BS. It doesn't go further than 150-200 results and then simply nothing else comes. If I repeat the search several times I get not much more than 150-300. That's odd, as Gnutella holds 1000-3000 hits for the same file in BS.

Btw, thanks for the reply!!

GregorK July 9th, 2008 01:48 AM

Phex will not extend the query range once 200-300 results are returned. This is done to reduce the network load and to comply with the rules for being a "good Gnutella citizen". Searches for words like 'mp3' are exactly the reason why this is done. Such a query will likely travel only one hop and cause enough results to be returned in a short time. Phex figures that the query is too popular and dynamically reduces the search horizon, or maybe even stops the whole query. On the other hand, if you search for rare content with few results, Phex will extend the range to reach more hosts.
If slow hosts still return delayed results, though, they won't get rejected; they are still displayed.
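
Roughly, in sketch form (the names, TTL values and the cap here are my own illustration, not actual Phex source):

[code]
// Illustrative sketch of a dynamic query widening its horizon only
// while results are scarce. All names and numbers are assumptions.
final class QueryHorizon {
    private static final int RESULT_CAP = 300; // the 200-300 cap described above
    private static final int MAX_TTL = 4;      // assumed maximum hop range

    private int ttl = 1;                       // start with a one-hop probe
    private int results;

    void onResults(int count) {
        results += count;
    }

    /** Widen the search only while the query still looks rare. */
    boolean extendHorizon() {
        if (results >= RESULT_CAP) {
            return false;  // popular query: stop here, be a good citizen
        }
        if (ttl < MAX_TTL) {
            ttl++;         // rare content: reach out to more hosts
            return true;
        }
        return false;      // nothing further to probe
    }
}
[/code]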

I recommend that you refine your query further with more specific search terms.

wowpow July 9th, 2008 07:26 AM

I said "an mp3", not the word mp3. So, to make it clear: an mp3 from Elvis, no matter what title, returns not even 5% of what BearShare does. Talking about who is a decent net citizen won't justify Phex reducing queries when grandpa BearShare actually does the opposite. It's also a turn-off that Phex isn't that widespread and people might be looking for something other than Phex.

I have even tried it from one of the earliest versions until now, but if I look for something I want everything: not 1%, not 5%; I want at least 80% of what Gnutella has to offer, like the 95% in BearShare.

Lowering a query, OK, I understand, but Phex lowering its outcome down to 10% is not good. If you could find a compromise the community can live with, that's OK, but reducing it that much is imho too much.

arne_bab July 10th, 2008 04:23 AM

I think we might tackle it from a different direction:

What is the reason why you need more than 200 search results on a specific search term?

What additional value do you get from results number 300 through number 3000?

wowpow July 16th, 2008 02:32 AM

It's very simple: if you limit the queries through Gnutella, it also means that you limit the file results (yes, we all know about spam files). So if you limit the query, you also forget that those spam files count as results, and you find out that you don't even get the results above that 150-300 you mentioned. We have filters, yes, but they only filter the results; the spam included in the query counts as results too. So the main point is: the more results, the more chance you get to have a decent file above the spam, and rare files also show up better if you can search with a wider-range query than 150-300.

arne_bab July 16th, 2008 02:58 PM

The other side is overall network health:

If all searches return just about 300 results, then only 300 results per query have to be transported. That means the network has less traffic to carry per search, so it can be scaled up to become more efficient.

So much for the technical reason.
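
As a back-of-the-envelope example (the per-result message size is just an assumed number to make the ratio concrete):

[code]
// Back-of-the-envelope: capping results caps the reply traffic one
// query generates. The 100 bytes per result is an assumed average.
public final class ReplyLoadExample {
    public static void main(String[] args) {
        int bytesPerResult = 100;                 // assumption
        int capped   = 300  * bytesPerResult;     // ~30 KB per query
        int uncapped = 3000 * bytesPerResult;     // ~300 KB per query
        System.out.println("capped: " + capped + " bytes, uncapped: "
                + uncapped + " bytes (" + (uncapped / capped) + "x the load)");
    }
}
[/code]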

Now for my own experience: yes, spam bugs me too, and very much so.

There are simply files which are very hard to find, because spam stops the search, so legitimate results don't even come into reach.

And though there once was an idea for stopping spam for good, that idea only got implemented in a LimeWire version and never made its way into the mainstream LW.

So at the moment, we don't have good enough anti-spam algorithms, and Gnutella is widespread enough that people spam it.

It might be that the technical reason will soon be too weak, so we'll have to adapt the way of searching.

The current way of searching is damn efficient, but it gets spammed too badly to allow finding rare files.

IDEA: Don't stop rare searches after just some results. Always carry them through till we have at least 300 results. The spam will make sure that the network doesn't get hit by this (I know this sounds strange, but we can leverage the spammers to keep the network safe from greedy searches - they won't like this :) ).
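
In sketch form (hypothetical, not implemented anywhere):

[code]
// Hypothetical sketch of the idea above; nothing like this is in Phex.
// Rare searches are never stopped early: they run until at least
// MIN_RESULTS have arrived. Spam replies count toward that total, so
// on a spammed network even a greedy search terminates quickly.
final class RareSearchPolicy {
    private static final int MIN_RESULTS = 300;

    /** Carry a rare search through until enough results have arrived. */
    static boolean shouldContinue(int resultsSoFar) {
        return resultsSoFar < MIN_RESULTS;
    }
}
[/code]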

ursula July 16th, 2008 03:22 PM

Hi, wowpow, Gregor and arne!

Some questions that come to my mind... Perhaps/probably not applicable...

wowpow says that wowpow is acting as an ultrapeer...
Nice, but what is the total size of the files being shared by wowpow?

What is wowpow's real-world upstream bandwidth?

Should wowpow be acting as an ultrapeer?
(I am suggesting, from observations over the years, that a minimum of 95% of those on the Gnutella network should NOT elect to be ultrapeers.)

What else is wowpow doing on the internet at the same time as wowpow's searching efforts?

What port is wowpow using?

Fun questions, hmmm?

arne_bab July 16th, 2008 06:56 PM

He can choose to be an ultrapeer, and it is not up to us to deny him that :)

And though I have a few tricks up my sleeve to get past the spam, I am bothered by it as well.

It keeps me from some of the nicest fansubbed anime series :)

