Gnutella Forums

Gnutella Forums (https://www.gnutellaforums.com/)
-   XoloX Feature Request (https://www.gnutellaforums.com/xolox-feature-request/)
-   -   SHA1 File Hash and MetaData (https://www.gnutellaforums.com/xolox-feature-request/3848-sha1-file-hash-metadata.html)

gnutellafan September 21st, 2001 12:49 PM

SHA1 File Hash and MetaData
 
Please add the SHA1 file hash as per the Full-file Hashes (FFHP) v0.9 proposal.

Also, supporting MetaData would be great!
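For reference, a full-file SHA1 as requested takes only a few lines to compute. The `urn:sha1:` Base32 formatting shown here follows the convention used by Gnutella's later hash/URN extensions; whether FFHP v0.9 specifies exactly this encoding is an assumption:

```python
import base64
import hashlib

def full_file_sha1(path, chunk_size=64 * 1024):
    """Hash an entire file with SHA1, reading in chunks to bound memory."""
    h = hashlib.sha1()
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            h.update(chunk)
    return h

def sha1_urn(path):
    """Format the digest as a urn:sha1: Base32 string.
    SHA1 is 160 bits, so Base32 yields exactly 32 chars with no padding."""
    digest = full_file_sha1(path).digest()
    return "urn:sha1:" + base64.b32encode(digest).decode("ascii")
```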

Moak September 21st, 2001 03:12 PM

Both would be damn cool, man. Wasn't the latest consensus MD5 over SHA1, or am I mixing up GDF debates with FastTrack?

Moak

gnutellafan September 22nd, 2001 05:11 AM

No, MD5 was originally proposed, but many people preferred SHA1 because it is tougher to crack.

The protocol has not been officially accepted, but discussion on file hashes has stopped at the GDF. If some of the most popular clients implemented it, many others would follow. The protocol is backward-compatible with older clients and could be revised later as needed.

Moak September 22nd, 2001 07:06 AM

Okay, what is the advantage of SHA1 over MD5, and what does "crack" mean in that context?

Let me think "out loud". The purpose of a hash is to group identical files or find more matching files for resume. Security is no issue here, and while bad clients may theoretically lie, a later file test is done in the overlapping resume.
I think a hash should follow some guidelines: a) it should be fast to build; b) it should be unique enough to find identical files within a typical horizon (not highly secure); c) it should stay identical even when people change metadata (e.g. MP3 ID tags). So maybe it is enough to build a hash over the first few MBs (skipping the very first bytes where some metatags are); even partials would be hit.
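The partial-hash idea above — hash only a bounded prefix of the file, skipping the leading bytes where tags may sit — could be sketched as follows. The skip and length values are illustrative assumptions, not anything agreed in this thread:

```python
import hashlib

def partial_hash(path, skip_bytes=4096, max_bytes=4 * 1024 * 1024):
    """Hash up to max_bytes of a file, skipping the first skip_bytes so
    that leading metadata does not change the hash.
    Caveat: ID3v1 tags actually sit at the *end* of an MP3, and ID3v2
    headers are variable-length, so fixed offsets are only a rough heuristic."""
    h = hashlib.sha1()
    with open(path, "rb") as f:
        f.seek(skip_bytes)
        remaining = max_bytes
        while remaining > 0:
            chunk = f.read(min(64 * 1024, remaining))
            if not chunk:
                break
            h.update(chunk)
            remaining -= len(chunk)
    return h.hexdigest()
```

Because only a prefix is hashed, two copies of a file that differ solely in their first few KB of metadata would still match — which is exactly the grouping behavior described above, at the cost of weaker uniqueness.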

I don't know which algorithm fits these guidelines best. Can anyone help, or has experience with different hashes?

CU, Moak

gnutellafan September 24th, 2001 07:21 AM

I think the concern is that if a less secure file hash is used, one could engineer bad data that matches a legitimate file's hash and would then be shared, corrupting much of the data on the Gnutella network, especially once swarmed downloads are implemented.
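To make that concrete: with swarming, a downloader accepts pieces from many hosts, and the advertised hash is the only thing tying those pieces to the real file. A minimal sketch of the verification step (the function name is made up for illustration):

```python
import hashlib

def verify_download(data: bytes, advertised_sha1_hex: str) -> bool:
    """Accept downloaded bytes only if they match the advertised SHA1.
    If the hash were weak enough that an attacker could engineer a second
    input with the same digest, garbage would pass this check, be kept,
    and then be re-shared, spreading the corruption across the network."""
    return hashlib.sha1(data).hexdigest() == advertised_sha1_hex
```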

Moak September 24th, 2001 08:00 AM

thx, could you explain this a little more?

Can't follow yet, sorry, maybe my brain is too slow. I think security can't be achieved here: bad clients can answer query hits with random hashes (like Mandragore), or a client has a bug and gives wrong hashes or wrong partials. So every resume _must_ be overlapped. That is something XoloX does right now, and therefore XoloX's resume works fine, while for example BearShare's resume is terrible (did not check if it's better in 2.30); you get a lot of files with artifacts. It might be more important to have a fast & small hash.
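The overlapped resume described here — re-requesting the tail of what you already have and comparing it before appending — might look roughly like this; the overlap size is an arbitrary assumption:

```python
def check_overlap(partial: bytes, incoming: bytes, overlap: int = 512) -> bool:
    """Overlapped resume check: the remote host is asked to resend from
    (len(partial) - overlap), so the first `overlap` bytes of the incoming
    stream must match the tail of the local partial file.
    A mismatch means the remote copy differs (or the host lied), so we
    refuse to append rather than produce a file with artifacts."""
    if len(partial) < overlap or len(incoming) < overlap:
        return False
    return partial[-overlap:] == incoming[:overlap]
```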

The Seeker September 24th, 2001 01:49 PM

Metadata
 
As noted in this thread, XoloX does not interpret the extra data the latest versions of BearShare and Gnotella (confirmed) send out. As more and more users upgrade to the latest versions of these clients, the number of downloadable MP3 results will drop until practically no files can be downloaded from them unless an exact file name is used. I say this because only the first hit from a BearShare or Gnotella host is usable; the rest get corrupted by XoloX as it improperly reads the metadata as file size or something.

If the developers don't want to spend the time incorporating file property scanners into XoloX, or adding an extra column in the search tab to show extra info, they should at least modify the software to correctly interpret results that contain metadata so the files can be downloaded.

