Re: [sleuthkit-developers] Splitting and progress
From: Brian C. <ca...@sl...> - 2003-09-13 17:10:35
> I'm on the verge of a new release of the Indexed search
> functionality. I was wondering how the progress on the Autopsy
> restructuring was going. If you plan to release a new version
> soon, I will hold my release to go with it. Otherwise I will
> release a version for Sleuthkit 1.65 and Autopsy 1.74.

Not anytime soon.

> Has anybody else already tested/used these tools? I would like to
> receive some feedback if possible...

Have you run the tests that I developed for FAT keyword searching? I
released these a couple of weeks back on the CFTT@yahoogroups list.

http://www.cerias.purdue.edu/homes/carrier/forensics/tests/test2

> I have just conducted a forensic investigation on a 100 GB disk
> and therefore got ample opportunity to test the new indexing
> features.
>
> A few numbers:
> Indexing of the disk took only 9 hours and resulted in 4.7 billion
> indexed points.

Do you know how that time compares with FTK?

> All letter and number combinations of length 3 to 8 were indexed.
> Combining of the index files into a single index took 4.5 hours
> and resulted in a 17 GB file.
> Searches for a specific word resulted in 75,000 hits and only took
> 7 seconds to perform.

Very cool. One of the current limitations of the Autopsy search is
that it does not reconstruct files before searching, so a string that
crosses fragmented sectors within a file will not be found. Changing
the search method to examine each file (using 'fls' and 'icat') would
be VERY slow (comparable to running 'sorter'). I would hope that the
indexing could make the per-file search much faster.

thanks,
brian
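
[For illustration, a minimal sketch of the per-file search approach
mentioned above: walk the file system with 'fls', reconstruct each
file with 'icat', and scan it for a keyword. The image name, file
system type, keyword, and the simplistic parsing of fls output are
assumptions made for the example; the point is to show why reading
every file on a large image is far slower than consulting a prebuilt
index.]

    #!/usr/bin/env python
    # Sketch: per-file keyword search over a disk image using The
    # Sleuth Kit's 'fls' and 'icat'.  Because icat follows a file's
    # block chain, strings that cross fragmented sectors stay intact,
    # but every file must be read in full for every search.
    import re
    import subprocess

    IMAGE = "disk.dd"      # hypothetical raw image
    FSTYPE = "fat"         # hypothetical file system type
    KEYWORD = b"evidence"  # hypothetical keyword

    # List allocated files recursively with full paths; lines look
    # roughly like "r/r 1234:  path/to/file".
    fls = subprocess.run(["fls", "-r", "-p", "-f", FSTYPE, IMAGE],
                         capture_output=True, check=True)

    for line in fls.stdout.decode(errors="replace").splitlines():
        m = re.match(r"r/r\s+(\d+):\s+(.*)", line)
        if not m:
            continue  # skip directories, deleted entries, etc.
        inode, name = m.group(1), m.group(2)

        # Reconstruct the file's content and scan it for the keyword.
        icat = subprocess.run(["icat", "-f", FSTYPE, IMAGE, inode],
                              capture_output=True)
        if KEYWORD in icat.stdout:
            print("hit: inode %s (%s)" % (inode, name))

[With an index of the kind described in the quoted message, a keyword
lookup touches only the index file rather than every file's data,
which is where the 7-second search time comes from.]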