Re: [sleuthkit-users] Achieving best performance in image acquisition
From: Gary F. <ga...@in...> - 2011-04-14 17:03:41
On 04/06/11 09:04:11, Marco Leone wrote:
> I'm going to use some WD 2TB external disk USB 2.0 (no USB 3.0 or
> e-sata port available) and a laptop.
> To sum up, what's the best I can achieve in terms of performance and
> how to achieve it with the simple hardware and tools I can easily
> find.

The USB 2.0 interface will likely become the bottleneck. As far as eSATA goes, we have used an inexpensive PCMCIA/PCI-e add-on card with good success. We have also used FireWire 800 on a MacBook Pro.

If you stay with USB 2.0, consider using compression (gzip, etc.): on a newer laptop, the CPU time spent compressing may cost far less than the time lost transferring uncompressed data over a slow interface. The trade-off is that the resulting data may not be in exactly the format you need (for loopback mounts, VMs, and so on). Alternative formats such as AFF (http://www.forensicswiki.org/wiki/AFF) combine compression with a degree of random access, the ability to mount the images, and so on.

Something I thought about trying a while back is pcopy (http://freshmeat.net/projects/pcopy/). Its documentation indicates that pcopy achieves better copy times by performing multiple transfers in parallel. The difficulty with a program like that is that a forensic copy generally requires checksums as well, and most checksum algorithms are serial. Also, parallel I/O is not likely to gain much over a slow connection like USB 2.0. I would like to see some comparisons/analysis of the gains from parallelized copies versus conventional serial bulk copies, as well as suggestions on how to deal with the "checksum problem".

- Gary
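For what it's worth, the compress-while-hashing idea can be done in a single pipeline, so the source is read only once and the recorded digest refers to the raw data rather than the compressed file. This is only a minimal sketch assuming a Linux/bash environment with dd, gzip, and md5sum available; the file names are illustrative stand-ins (in real use the input would be the source device, e.g. /dev/sdX), and it is not a vetted forensic procedure:

```shell
#!/usr/bin/env bash
# Sketch: compress during acquisition so the slow link carries less
# data, while computing a checksum of the raw stream in the same pass.
# "sample.img" is a hypothetical stand-in for the source device.
set -euo pipefail

# Create a small stand-in for the source device (demonstration only).
dd if=/dev/urandom of=sample.img bs=1M count=4 2>/dev/null

# One read of the source: compress to the destination file, and hash
# the decompressed stream so the digest matches the raw data.
dd if=sample.img bs=1M 2>/dev/null \
  | gzip -c \
  | tee sample.img.gz \
  | gunzip -c \
  | md5sum | awk '{print $1}' > sample.md5

# Sanity check: re-read the source and compare digests.
[ "$(md5sum sample.img | awk '{print $1}')" = "$(cat sample.md5)" ] \
  && echo "hash OK"
```

The gunzip leg costs some extra CPU, but it keeps the checksum tied to the original bytes without a second pass over the (slow) source device; it does not, of course, help with the parallel-copy case, where the serialization of the hash remains the problem.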