message digest of large files

From: James Whitwell (jams_at_momento.com.au)
Date: 08/17/05


Date: Wed, 17 Aug 2005 17:32:44 +1000

Hi,

We're trying to use message digests as a way to uniquely identify large
binary files (around 50-60 MB). Is there a limit to the size of the file
that we feed through, say, SHA-1? I was thinking we could start a new
hash every 10 MB or so. Is there any information on this topic that
someone could point me to?
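(For reference: SHA-1 is defined for inputs up to 2^64 - 1 bits, so a 60 MB file is nowhere near any limit, and there's no need to start a new hash every 10 MB. The usual approach is to feed the file through one hash object in chunks. A minimal sketch using Python's hashlib, with a made-up 1 MB chunk size:)

```python
import hashlib
import io

def sha1_of_stream(stream, chunk_size=1024 * 1024):
    """Hash a file-like object in fixed-size chunks so the whole
    file never has to fit in memory at once."""
    h = hashlib.sha1()
    # iter() with a sentinel keeps reading until read() returns b""
    for chunk in iter(lambda: stream.read(chunk_size), b""):
        h.update(chunk)
    return h.hexdigest()

# Example: 60 MB of zero bytes, hashed 1 MB at a time.
data = io.BytesIO(b"\x00" * (60 * 1024 * 1024))
digest = sha1_of_stream(data)
```

Because update() is incremental, chunked hashing produces exactly the same digest as hashing the whole file in one call.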

Thanks,
;) James.
