Thread: [sleuthkit-users] CD Forensics
From: Nico K. <nka...@gm...> - 2006-01-12 15:12:40
|
Good morning! This doesn't pertain directly to TSK, but I thought some of you might be able to offer me some insight into this issue.

I was recently tasked with looking at a CD-R. There was only a single file of about 55KB. However, when checking the properties under Windows it showed that the used space was in the vicinity of 580MB, which equalled the capacity of the CD-R. We suspected that there was more data than just that one file. We proceeded to get a bit-level image using dcfldd with various options (conv=noerror,sync) and got to about 50MB when we started getting errors that slowed the read process down dramatically. The CD was heavily scratched, which might have caused the issue. However, when looking at the CD image we did get using hexedit and hexdump, we only saw that one file followed eventually by the file table. It appears that the CD format was UDF and that only that one file was on the disk.

I tried to replicate the scenario by writing a single file of similar size to a new CD-R using various methods native to XP as well as using Nero. But in each case the amount of used space according to the Windows properties sheet was in the KB range, which I expected based on file size + file table, and nowhere near the full capacity of the CD.

Has anybody run into this, or does anybody have any insight? Do you think there might be more data? Since it's a CD-R I really doubt that anybody would be able to write files to the CD and mess with the file table, particularly the person in question. Also, does anybody have any more info on the location of the file table? On some CD-Rs and CD-RWs I see the file table at the very beginning of the CD before the files themselves, and on others I see the files first followed eventually by the file table.

Thank you VERY much for your time and interest. Cheers! Nico |
From: Rich T. <te...@ap...> - 2006-01-12 15:29:54
|
Could it be that the session was not closed? What would the files of an open-session CD-R look like in the forensics environment? Interesting; never ran into that before. Cool problem (academically, of course). |
From: Barry J. G. <bg...@im...> - 2006-01-12 16:50:50
|
On Thu, 2006-01-12 at 10:12 -0500, Nico Kalteis wrote: > I was recently tasked with looking at a CD-R. In general, this has some pointers you might find interesting. FWIW, this whole series of articles is good stuff. http://www.agilerm.net/linux1.html -- /*************************************** Special Agent Barry J. Grundy NASA Office of Inspector General Computer Crimes Division Goddard Space Flight Center Code 190 Greenbelt Rd. Greenbelt, MD 20771 (301)286-3358 **************************************/ |
From: Nico K. <nka...@gm...> - 2006-01-12 18:18:22
|
I would like to thank everyone very much for their input. It has opened up some interesting additional avenues for exploration, which I will pursue right away.

To answer a couple of questions, I began to dd (dcfldd, actually) the whole thing but started to run into heavy I/O errors after about 55MB or so. Obtaining the image slowed to a crawl where I would get about 1MB every minute, which ended up becoming unfeasible for this particular situation. I would have loved to get the whole thing, but...oh well. So, I did all of my analysis on the 50MB chunk. I did run it through foremost, and all it found was just the one file. I was able to verify that finding fairly easily using hexdump, since it abbreviates the data with a simple asterisk if the preceding line occurs more than once. That made the 50+MB fairly easy to skim through manually.

The link to http://www.agilerm.net/linux1.html is excellent!

Thanks again for your time! Nico |
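The hexdump behaviour Nico leans on here, collapsing runs of identical lines to a single `*`, is easy to demonstrate (the temp path below is arbitrary):

```shell
# 64 zero bytes: hexdump -C prints the first 16-byte row once, then a single
# '*' standing in for the identical repeats, then the final offset. Large
# runs of filler in a disc image therefore squeeze down to almost nothing.
dd if=/dev/zero of=/tmp/zeros bs=16 count=4 2>/dev/null
hexdump -C /tmp/zeros
```

That squeeze is why skimming a 50MB image by hand was feasible: anything that isn't unique data collapses out of the listing.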
From: Nico K. <nka...@gm...> - 2006-01-12 21:12:13
|
UPDATE! OK, so I was playing around a bit more. Here is what I did with an experimental CD-R:

1) Wrote two files to the CD using the standard WinXP method (right-click and Send To D:)
2) Wrote one more file using the same method
3) Wrote one more file using the same method

At this point Windows showed four files on my CD-R. I take the CD and put it into my Linux box.

1) I run readcd dev=/dev/cdrom -fulltoc

As expected, I get output saying that there are three sessions on that CD-R:

Read speed: 5645 kB/s (CD 32x, DVD 4x).
Write speed: 1411 kB/s (CD 8x, DVD 1x).
TOC len: 180. First Session: 1 Last Session: 3.
01 14 00 A0 00 00 00 00 01 20 00
01 14 00 A1 00 00 00 00 01 00 00
01 14 00 A2 00 00 00 00 00 06 03
01 14 00 01 00 00 00 00 00 02 00
01 54 00 B0 02 24 03 02 4F 3B 4A
01 54 00 C0 A0 00 40 00 61 11 06
02 14 00 A0 00 00 00 00 02 20 00
02 14 00 A1 00 00 00 00 02 00 00
02 14 00 A2 00 00 00 00 02 2A 06
02 14 00 02 00 00 00 00 02 26 03
02 54 00 B0 04 0C 06 01 4F 3B 4A
03 14 00 A0 00 00 00 00 03 20 00
03 14 00 A1 00 00 00 00 03 00 00
03 14 00 A2 00 00 00 00 04 12 09
03 14 00 03 00 00 00 00 04 0E 06
03 54 00 B0 05 30 09 01 4F 3B 4A
Lead out 1: 303
Lead out 2: 12006
Lead out 3: 19209

2) I do a 'hexdump -C /dev/cdrom | less'

Everything goes fine for a while. I see the two files I wrote first and the TOC. Then I see a 'hexdump: /dev/hdc: Input/output error'.

3) OK, so I try dcfldd (later I also tried readcd, unsuccessfully; it produced tons of I/O errors until it finally froze up). After rebooting I look at the image file written by dcfldd (using hexdump) and it ends right where the original hexdump of /dev/cdrom ended.

So, eventually I realized that the place the hexdumps stopped and dcfldd froze up was at blocksize (2048) * 303 (Lead Out 1 from the readcd output earlier). That tells me that everything is fine while looking at the first session only. But for some reason hexdump and dcfldd won't show me the other two sessions. Any ideas?
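The lead-out figures readcd prints can be cross-checked against the raw fulltoc lines: the last three bytes of each A2 (lead-out) entry are minutes, seconds, and frames, and converting MSF to LBA (75 frames per second, minus the standard 150-frame lead-in offset) reproduces them, along with the byte offset where the reads died:

```shell
msf_to_lba() {
  # MSF -> LBA: 75 frames per second, minus the 150-frame (2 s) lead-in.
  echo $(( ($1 * 60 + $2) * 75 + $3 - 150 ))
}

# Last three bytes of each A2 (lead-out) entry above, read as hex:
LO1=$(msf_to_lba 0x00 0x06 0x03)
LO2=$(msf_to_lba 0x02 0x2A 0x06)
LO3=$(msf_to_lba 0x04 0x12 0x09)
echo "lead-outs: $LO1 $LO2 $LO3"        # matches readcd: 303 12006 19209

# Byte offset where hexdump/dcfldd stopped: 2048-byte sectors * LBA 303
echo "stop offset: $(( 2048 * LO1 ))"   # 620544
```

The stop offset landing exactly at session 1's lead-out is consistent with the drive refusing raw reads past the first session boundary rather than with media damage.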
For what it's worth, mounting the CD using -t iso9660 and doing an ls -l shows all four files.

Again, thanks and enjoy the brain teaser! Nico |
From: farmer d. <far...@ya...> - 2006-01-12 22:22:40
|
Hi Nico, Try mounting as type "cdfs" (you'll most likely need to get that driver and compile it against your kernel, or use a Linux forensic boot CD that already has it) and then viewing the file "/proc/cdfs". This will show you every session and allow you to mount each as ISO. regards, farmerdude http://www.farmerdude.com/ |
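For completeness, a route that avoids the extra cdfs module, assuming a stock Linux iso9660 driver: isofs accepts an sbsector= mount option selecting the sector a session begins at. The start sector of each later session can be derived from the track-start entries (point 02 and 03) in the fulltoc dump earlier in the thread; the device path and mount points below are hypothetical:

```shell
msf_to_lba() {
  # MSF -> LBA: 75 frames per second, minus the 150-frame (2 s) lead-in.
  echo $(( ($1 * 60 + $2) * 75 + $3 - 150 ))
}

# PMIN/PSEC/PFRAME bytes from the track-start (point 02 and 03) TOC entries:
S2=$(msf_to_lba 0x02 0x26 0x03)   # first track of session 2
S3=$(msf_to_lba 0x04 0x0E 0x06)   # first track of session 3
echo "session start sectors: $S2 $S3"

# Each session could then be mounted read-only (needs root and a real drive):
#   mount -t iso9660 -o ro,sbsector=$S2 /dev/cdrom /mnt/s2
#   mount -t iso9660 -o ro,sbsector=$S3 /dev/cdrom /mnt/s3
```

The mount lines are a sketch, not verified against this disc; but the sector arithmetic itself only depends on the TOC values readcd already reported.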