Re: [sleuthkit-users] Photorec carver
From: Simson G. <si...@ac...> - 2015-02-17 22:54:45
My thoughts:

- De-duplication is so important in modern forensic processing that you might want to make it a core function of the Autopsy pipeline.

- Before that, it might make sense for modules to be able to perform hashing themselves and then include the hash when they submit files for analysis. If a system is I/O bound, computing the hash might be essentially free, especially if it is a lightweight hash like MD5.

- So I agree, it makes sense to do the hash calculation in the PhotoRec module and for the module to check whether the carved object has already been processed. (A rough sketch of that check follows the quoted message below.)

Simson

> On Feb 17, 2015, at 5:43 PM, Brian Carrier <ca...@sl...> wrote:
>
>
> On Feb 17, 2015, at 3:01 PM, Nanni Bassetti <dig...@gm...> wrote:
>
>> It could be useful to run Photorec only on the unallocated space
>
> It does.
>
>> and then a special module for deleting the duplicate files by hash comparison.
>> Deleted files and carved files would be compared, and the duplicate carved files deleted...
>
> Hmm, that could be interesting, but a bit challenging with the Autopsy pipelines. Files aren't hashed until they are added to the central database and scheduled for analysis. Hash calculation is the first step in the pipeline.
>
> We could do the calculation in the PhotoRec module; it's just another I/O round trip and a database query, so the question is whether carving generates enough duplicate hits to be worth the effort.
>
> Thoughts?
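For what it's worth, here is a minimal sketch of the idea in plain Java, kept deliberately generic: it uses only the JDK (MessageDigest for MD5), and the names seenHashes and submitForAnalysis are placeholders standing in for the real lookup against the case database and the real call that schedules the carved file for analysis; they are not Autopsy API calls.

import java.io.IOException;
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;
import java.util.HashSet;
import java.util.Set;

/**
 * Illustrative only: hash each carved file with MD5 and skip it if the
 * same hash has already been processed, so duplicates never enter the
 * pipeline. In a real module the Set would be replaced by a query
 * against the central database.
 */
public class CarvedFileDedup {
    private final Set<String> seenHashes = new HashSet<>();

    public void process(Path carvedFile) throws IOException, NoSuchAlgorithmException {
        String md5 = md5Hex(carvedFile);
        if (seenHashes.contains(md5)) {
            return;                        // duplicate: already carved or already in the case
        }
        seenHashes.add(md5);
        submitForAnalysis(carvedFile, md5); // hand the file, with its hash, to the pipeline
    }

    private static String md5Hex(Path file) throws IOException, NoSuchAlgorithmException {
        MessageDigest md = MessageDigest.getInstance("MD5");
        try (InputStream in = Files.newInputStream(file)) {
            byte[] buf = new byte[8192];
            int n;
            while ((n = in.read(buf)) != -1) {
                md.update(buf, 0, n);
            }
        }
        StringBuilder sb = new StringBuilder();
        for (byte b : md.digest()) {
            sb.append(String.format("%02x", b & 0xff));
        }
        return sb.toString();
    }

    private void submitForAnalysis(Path file, String md5) {
        // placeholder: add the carved file and its precomputed hash to the case
    }
}

Since the file has to be read off disk to be carved anyway, folding the MD5 computation into that same pass is where the "essentially free" argument comes from.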