[sleuthkit-developers] NTFS data run collisions
From: Hu, H. - 0. - M. <Hon...@ll...> - 2014-03-25 21:01:00
Hi, I'm an NTFS rookie with a question about data runs. Are there any normal reasons why two different files might have overlapping data runs, i.e. be mapped to some of the same clusters/blocks on the disk?

For a research project, I would like to do the following: given a sector on the disk, determine which file (if any) owns the data in that sector. The first thing I tried was to build a simple block-to-filename hash table. For each file, I look at its data runs and put them into the table. With both TSK and the analyzeMFT library, and using a clean Windows XP disk image, I get a non-trivial number of block collisions. Is this normal behavior? I would have thought that the block assignments would be unique. I have not been successful in finding any information about this in the various documentation.

Thanks!

--
Hongyi Hu
MIT Lincoln Laboratory
Group 59 (Cyber System Assessments)
Ph: (781) 981-8224
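For what it's worth, here is a minimal sketch of the block-to-filename table I'm describing. It assumes the data runs have already been extracted (the file names and run tuples below are made up for illustration; a real run would come from TSK/analyzeMFT as a (start_cluster, length) pair):

```python
from collections import defaultdict

def build_block_table(files):
    """Map each cluster to the list of files whose data runs cover it.

    files: dict of filename -> list of (start_cluster, length) runs.
    """
    table = defaultdict(list)
    for name, runs in files.items():
        for start, length in runs:
            for cluster in range(start, start + length):
                table[cluster].append(name)
    return table

def find_collisions(table):
    """Return only the clusters claimed by more than one file."""
    return {c: owners for c, owners in table.items() if len(owners) > 1}

# Hypothetical example: two files whose runs overlap at cluster 102.
files = {
    "a.txt": [(100, 3)],   # clusters 100, 101, 102
    "b.txt": [(102, 2)],   # clusters 102, 103
}
table = build_block_table(files)
print(find_collisions(table))  # {102: ['a.txt', 'b.txt']}
```

With a clean image I expected `find_collisions` to return an empty dict, which is why the non-trivial number of collisions surprised me.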