sleuthkit-users Mailing List for The Sleuth Kit (Page 46)
From: maría e. d. <dar...@gm...> - 2014-03-12 02:47:59
|
Hi: I would like to know whether it is possible to securely erase a RAID 5 with dd, or whether the first step is to break up the RAID and then securely erase the disks, and how to verify the result. -- Prof. Ing. María Elena Darahuge |
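A minimal sketch of the single-pass approach, assuming the array is still assembled and exposed as one logical block device (the /dev/md0 name and mdadm-style software RAID are assumptions, not from the message). Breaking the array and wiping each member disk separately is more thorough, since controller metadata and reallocated areas are not reachable through the logical device.

  # Overwrite the assembled array with zeros; conv=fsync flushes the final blocks.
  dd if=/dev/zero of=/dev/md0 bs=1M conv=fsync

  # Verify: cmp reports the first differing (non-zero) byte if one survives;
  # an "EOF on /dev/md0" message with no difference reported means all zeros.
  cmp /dev/zero /dev/md0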
From: Simson G. <si...@ac...> - 2014-03-11 22:04:11
|
Version 1.4's SSN recognizer only recognizes labeled SSNs. Version 1.5 allows you to specify one of three SSN recognition modes:

-S ssn_mode=0   SSNs must be labeled "SSN:". Dashes or no dashes okay.
-S ssn_mode=1   No "SSN" label required, but dashes are required.
-S ssn_mode=2   No dashes required. Allow any 9-digit number that matches the SSN allocation range.

Simson |
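A hedged example of what an invocation with one of these modes might look like; the image and output-directory names are placeholders, and the hits are expected in the pii feature file mentioned later in this thread.

  # Most permissive mode: flag any 9-digit number in the SSN allocation range.
  bulk_extractor -S ssn_mode=2 -o be_output image.E01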
From: Grundy B. J T. <Bar...@ti...> - 2014-03-11 19:33:38
|
My confusion likely stems from the fact that I could not get the scanners to pick up SSNs. I'm running 1.4.1 (the stable version that I have tested for use in our lab), and I could not find any reference to SSNs and the applicable scanners in any reference material online. The capability is mentioned in several places, particularly with respect to pii.txt, but nothing else.

I did not mean for my confusion to confuse you.

I'll check out the code in 1.5. Thanks.

/*******************************************
Barry J. Grundy
Assistant Special Agent in Charge
Digital Forensic Support Group
Electronic Crimes and Intelligence Division
Treasury Inspector General for Tax Administration
(301) 210-8741 (w)
(202) 527-5778 (c)
Bar...@ti...
********************************************\ |
From: Simson G. <si...@ac...> - 2014-03-11 19:22:12
|
Hi, Barry.

What do you consider confusing? We'll try to clear it up.

The SSN detector was cleaned up in version 1.5. You can download the code from the GitHub repo and run it with version 1.4 if you wish. We have created multiple modes for SSN detection which you can set from the command line, depending on the kind of case that you are working.

Simson |
From: Grundy B. J T. <Bar...@ti...> - 2014-03-11 19:02:38
|
Good Afternoon,

Looking for a quick answer, so I thought I'd post here for cross users.

I've been using bulk_extractor for a while, and it works well. I've come across an instance where I need to find SSNs (in the thousands) in an image. While there are other ways to do it, I really like the feature file and histogram output of BE. There's some very confusing info on the web regarding the 'accts' scanner and SSNs. Is it able to find the numbers or not? It has not worked for me on test files, but I've found a number of references that say it should (the BitCurator wiki on BE scanners, for example).

I've used the "-f [regexp]" option, but was still wondering if there is a built-in scanner.

Thanks,

/*******************************************
Barry J. Grundy
Assistant Special Agent in Charge
Digital Forensic Support Group
Electronic Crimes and Intelligence Division
Treasury Inspector General for Tax Administration
(301) 210-8741 (w)
(202) 527-5778 (c)
Bar...@ti...
********************************************\ |
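For reference, a sketch of the "-f" route mentioned above; the regex, image and output names are illustrative only (the pattern will also match plenty of non-SSN nine-digit strings), and hits from the find scanner normally land in find.txt with a matching histogram.

  # Ad-hoc search for nnn-nn-nnnn patterns across the image.
  bulk_extractor -f '[0-9]{3}-[0-9]{2}-[0-9]{4}' -o be_find image.E01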
From: Jason L. <jle...@ba...> - 2014-03-11 15:23:35
|
We're working on adding carving via Scalpel. We've had some hiccups trying to add it in as a library vs. its more traditional use as a stand-alone tool. If you are inclined, you can see the progress in the "develop" branch on GitHub (certainly experimental at this stage). We're hoping to get a release out in a couple of months that will have carving added to Autopsy.

Jason |
From: HADER C. <in...@ha...> - 2014-03-11 14:47:32
|
Jason, thank you for your reply. I am convinced that adding carving to Autopsy would be a very good improvement.
Best regards,
Joachim |
From: Grundy B. J T. <Bar...@ti...> - 2014-03-11 13:33:39
|
Are the files simply deleted, or are they images in unallocated without associated directory entries? Are the 'commercial tools' carving the files out? I'm not an Autopsy user, so I'm not sure if Autopsy either will, or has a module to, carve out files based on signature. I expect that's what's happening here. You'll need to find the files based on signature, not file system artifacts.

Does anyone know if 'carving' has been added to Autopsy? In the meantime you can augment your work with scalpel/Photorec/foremost, etc. Or for small test images you can have a really good time with sigfind and dd...

/*******************************************
Barry J. Grundy
Assistant Special Agent in Charge
Digital Forensic Support Group
Electronic Crimes and Intelligence Division
Treasury Inspector General for Tax Administration
(301) 210-8741 (w)
(202) 527-5778 (c)
Bar...@ti...
********************************************\ |
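A minimal sketch of the stop-gap described above, with hypothetical file names (the test image in this thread is the rhinohunt image): foremost carves by header/footer signature and logs each carve offset in audit.txt, while The Sleuth Kit's sigfind locates a raw header such as the JPEG 0xFFD8 marker sector by sector.

  # Carve common picture types by signature; audit.txt in the output
  # directory records the offset each file was carved from.
  foremost -t jpg,gif,png -i rhino.dd -o carved

  # Search 512-byte sectors for the JPEG SOI marker and print the hits.
  sigfind -b 512 FFD8 rhino.dd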
From: HADER C. <in...@ha...> - 2014-03-11 13:25:49
|
Barry,
thanks for the answer. The source device was formatted using a quick format (only the directory entries are deleted). The images can be carved by commercial tools and by scalpel/foremost. You are right, the files can only be found by using header info. So I miss carving capabilities in Autopsy. File carving would be a nice and useful add-on for Autopsy/Sleuthkit.
Regards
Joachim

HADER Consulting
Dipl. Ing. (FH) Joachim A. Hader
Authorized expert on IT forensics, IT systems and applications
Data protection and privacy official

Moststraße 7 | 91799 Langenaltheim | Tel: +49 151 53872750
Email: in...@ha... | WWW: http://www.hader-consulting.de

Confidentiality, neutrality and objectivity are my highest principles.
Member of the Gesellschaft für Datenschutz und Datensicherheit e.V.
Member of the Verband Europäischer Gutachter und Sachverständiger e.V. |
From: HADER C. <in...@ha...> - 2014-03-11 07:48:36
|
Hi there,
I'm running Autopsy 3.0.9 on a Windows 8 system. I have a test image for comparing commercial and open source forensic tools. The test image is called rhinohunt; perhaps somebody knows it. On this image there are some pictures which have been deleted. With Autopsy I am not able to find these files. With foremost and commercial tools (e.g. X-Ways) the files are found. What went wrong with Autopsy?
Regards
Joachim |
From: Hervé Le G. <hl...@fr...> - 2014-03-04 18:24:17
|
Hi,

Autopsy gives four timestamps for each event name it has identified: "Modified Time", "Change Time", "Access Time" and "Created Time".

I would have thought that when doing an investigation that focuses on "what happened at that particular time", the most significant timestamp would be the "Access Time". But I'm getting a bit confused now that I see, for the particular time segment I'm interested in, a relatively small number of not-so-relevant events when considering their "Access Time" (and also their "Modified Time"), while there is an extremely large number of relevant events for the same time segment when seen from their "Change Time" timestamps.

Could one kind soul please give the precise definition of these four timestamps, in the Autopsy 3.0.9 context, of course? I took a look at the Autopsy tutorials I could find online before asking the list, but couldn't find this information; apologies if I didn't spot it.

Many thanks in advance,
Herve |
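Not an authoritative answer, but the same four values can be checked straight from the file system metadata with The Sleuth Kit's istat (the image name, partition offset and metadata address below are hypothetical). On NTFS, what Autopsy reports as "Change Time" corresponds to the MFT-entry (metadata) modification time, which is updated by many operations that never touch the file content, and that would explain why it is the busiest column in a timeline.

  # istat prints Created, File Modified, MFT Modified and Accessed times
  # for an NTFS entry; "MFT Modified" is the metadata change time.
  istat -o 2048 laptop.dd 5321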
From: Brian C. <ca...@sl...> - 2014-03-03 03:12:33
|
Hi Michael,

Currently, you can't do it from within the UI. You need to export the ZIP file, extract it, and then feed the files back in so that they can be hashed, searched, etc. I'll make a feature request to track this though.

thanks,
brian |
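One way to do the manual round-trip described above, assuming 7-Zip is available on the examiner machine (the password and file names are placeholders); the extracted directory can then be added back into the case so the files get hashed and indexed.

  # Extract the protected archive outside Autopsy, supplying the password.
  7z x -pMySecret -oextracted_files exported_archive.zip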
From: Michael B. <gen...@gm...> - 2014-02-28 20:03:07
|
Hi all,

Whenever I run the archive extraction ingest module in Autopsy 3 I get the following error when a protected archive is encountered:

"No password was provided for opening protected archive."

How exactly do I provide passwords for protected archives? In EnCase it's just a matter of right-clicking on the file in table view; I can't seem to find any way to provide Autopsy 3 with a password to use for encrypted archive files.

Anyone have any advice?

- Michael |
From: Ade <adr...@nt...> - 2014-02-28 09:11:40
|
Off the top of my head...

Some libraries to support VM file formats and some code for mounting and processing them would be nice.

Processing iTunes backups (backups of iPads, iPhones, iPods).

Integrate Volatility and automagically process swap/hiberfil files.

+1 for Derrick's suggestion for creating a Python input stream into your case.

Ade |
From: Derrick K. <dk...@gm...> - 2014-02-28 06:28:14
|
Hi Alex. Indeed, xmount is a viable alternative, but unfortunately it only runs on Linux and OS X. On the Windows side you can use imdisk, FTK Imager, Mount Image Pro, etc. to expose forensic images to the system, but it would be cool to be able to do it natively within Autopsy. Python may not be the best choice for this, but I threw it out there for fun. :)

Derrick |
From: Alex N. <ajn...@cs...> - 2014-02-27 23:21:06
|
Hi Derrick,

A note on one of your suggestions: the read-only-but-writeable device bit might already be handled with xmount:
https://www.pinguin.lu/index.php (you should probably just ignore the website's self-signed cert; it is also packaged in several Linux distros, and it worked for me)

I don't think there's a need to duplicate that effort.

--Alex |
From: Derrick K. <dk...@gm...> - 2014-02-27 23:16:07
|
Awwww... I was hoping for Lisp integration but will take what I can get. ;) Yay Python!

Here are some thoughts that may or may not be useful:

o mmap larger files so that plugins can query directly against the mmap'd file for searches?
o Automatically create a de-duped set of files from an exhibit (maybe using set() or frozenset()?)
o Cross-image file testing for membership or non-membership (maybe using set() or frozenset()?)
o Expose an image back to the OS as a read-only physical device (with a temporary scratch file). It would be cool if it could expose an image that tools (i.e. virtualization) could then hook into.
o Create a Python input stream (named pipe?) so that any application can feed data directly into your Autopsy case for ingest, e.g. take your bulk_extractor output and feed it directly into Autopsy so that Autopsy can ingest it. Rather than grep'ing across all my data sources I could do it all in Autopsy where it is indexed.

Derrick |
From: Alex N. <ajn...@cs...> - 2014-02-27 22:41:06
|
Fancy matplotlib graphs! One example: get a windowed interactive graphic of your subject data's file system timeline, with but a button push. Gaps in the data would look cool.

--Alex |
From: Brian C. <ca...@sl...> - 2014-02-27 22:35:54
|
We're having our company's internal annual hack-a-thon and a team of us decided to add python bindings to Autopsy. We did it! Now, we need to win the competition with a cool demo. Anybody have any ideas of cool things that can be done in Python that would demo well? |
From: Brian C. <ca...@sl...> - 2014-02-27 03:42:56
|
To wrap up this topic, I confirmed that we incorrectly only shipped 32-bit DLLs in the last release. The MS installer that Mauro listed below should fix 3.0.9.

On Feb 19, 2014, at 10:36 AM, Brian Carrier <ca...@sl...> wrote:

> That's good news. It shouldn't be required, but it at least means that the core problem is that the MS DLLs being packaged with Autopsy are not being found.
>
> Thanks!
>
> On Feb 19, 2014, at 10:20 AM, Mauro Silva <mau...@gm...> wrote:
>
>> Hi all,
>>
>> I've just tried to install the 64-bit version of Autopsy 3.0.9 on a clean Windows 8 machine and it was having problems. I figured out that it was because I didn't have the Microsoft Visual C++ Redistributable Package installed.
>>
>> After installing Microsoft Visual C++ (http://www.microsoft.com/en-us/download/details.aspx?id=13523) and reinstalling Autopsy, everything started working.
>>
>> Thought it might help someone.
>> Best regards,
>> Mauro Silva |
From: Hervé Le G. <hl...@fr...> - 2014-02-26 21:58:44
|
Hi,

I have started investigating a laptop PC (Windows 7 Pro) that was "cleaned" before being given back to its legitimate owner, and I have a couple of questions for the cognoscenti:

1) I need to find out whether some formatting has been performed "recently" and, if yes, when that formatting occurred on this or that volume/partition.

2) Since PhotoRec finds lots of deleted files on the .dd image I created (CAINE + Guymager), it would be very useful to know when (date and time) these files were deleted. I understand Windows doesn't record the date/time files were deleted, but there must be ways to narrow down the possibilities for such a "massive" deletion to have been performed.

I imaged the hard drive (CAINE + Guymager) and used Autopsy to ingest the .dd file. When using its Timeline tool, I can definitely see two days of "intense" activity, right before the PC was given back to its owner, amidst a "desert" of no activity whatsoever in the days/weeks before and after those two days, and I guess it's during those two days that the "cleaning" was performed. I mean, it's pretty clear that for this two-year-old laptop to be totally empty of Word, Excel and PDF files, while PhotoRec digs out thousands of them, some major cleaning must have been performed; but the question of the date (and, possibly, of the kind of command or application software used to do the cleaning) is the one I'd like to address.

Help would be greatly appreciated.
Many thanks in advance,
Hervé |
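Not a full answer, but one place to look for the formatting date, under the assumption that the volume of interest is NTFS (the image name and partition offset below are hypothetical): the core metadata files such as $MFT are created when a volume is formatted, so the Created time that istat reports for metadata entry 0 is a good candidate for the format time, and fsstat confirms which file system, serial number and layout you are dealing with.

  # File system details for the partition (type, volume serial, cluster size, ...).
  fsstat -o 2048 laptop.dd

  # $MFT is metadata entry 0 on NTFS; its Created time normally reflects
  # when the volume was formatted.
  istat -o 2048 laptop.dd 0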
From: Jason W. <jwr...@gm...> - 2014-02-25 12:26:16
|
Sorry for the redundancy, I see Atila basically said the same thing. On Tue, Feb 25, 2014 at 7:25 AM, Jason Wright <jwr...@gm...> wrote: > Ewaldo, > > The original icat states a filesize of 24064, which is 47 sectors. Drop > your count to 47 in your dd and see if that works. > > R/ > > Jason > > > On Tue, Feb 25, 2014 at 12:05 AM, ewaldo simon <ewa...@gm...>wrote: > >> I'm sorry to bump this thread, but I got lost again.... >> >> here is my dd command >> >> dd if=imagename.001 of=dd_output.xls bs=512 skip=20959 count=48 >> >> >> the skip comes from the first sector that istat produce by working on inode 1571469, which is sector 20896, then I add 63 as the offset. >> >> >> here is my icat command >> >> >> icat -o63 -v imagename.001 1571469 > icat_output.xls >> >> tsk_img_open: Type: 0 NumImg: 1 Img1: imagename.001 >> Not an EWF file >> fsopen: Auto detection mode at offset 32256 >> raw_read: byte offset: 32256 len: 65536 >> raw_read: byte offset: 97792 len: 65536 >> raw_read: byte offset: 294400 len: 65536 >> iso9660_open img_info: 140321104 ftype: 2048 test: 1 >> iso_load_vol_desc: Bad volume descriptor: Magic number is not CD001 >> Trying RAW ISO9660 with 16-byte pre-block size >> fs_prepost_read: Mapped 32768 to 69904 >> iso_load_vol_desc: Bad volume descriptor: Magic number is not CD001 >> Trying RAW ISO9660 with 24-byte pre-block size >> fs_prepost_read: Mapped 32768 to 69912 >> iso_load_vol_desc: Bad volume descriptor: Magic number is not CD001 >> fatfs_inode_lookup: reading sector 105856 for inode 1571469 >> raw_read: byte offset: 54230528 len: 65536 >> raw_read: byte offset: 97280 len: 65536 >> tsk_fs_file_walk: Processing file 1571469 >> fatfs_make_data_run: Processing deleted file 1571469 in recovery mode >> raw_read: byte offset: 10731008 len: 65536 >> >> >> then I look for the md5sums of both of them. >> >> >> 3a1ef7b320ee2d675ed631dbd4bc53c3 dd_output.xls >> 1f12a90ee22e0565be9be1f9a8673688 icat_output.xls >> >> >> >> this is the output of running "diff dd_output.xls icat_output.xls": >> >> Binary files dd_output.xls and icat_output.xls differ >> >> >> why do they have diffrent md5sum? aren't they suppose to be the same file? >> >> >> >> >> On Mon, Feb 24, 2014 at 5:44 PM, ewaldo simon <ewa...@gm...>wrote: >> >>> James, >>> >>> I've read the foremost auditlog, but after trying to find inode of those >>> files that foremost carved, none of them is being pointed by the FAT >>> entries of "tagihan.xls" I've shown before. >>> >>> I've also check the metadata of those excel files that foremost carved, >>> and also no information there. to read the metadata I open them on >>> libreoffice and also right click properties on explorer. >>> >>> I guess that's it then, unless there is another suggestion, I will be >>> more than happy to oblige. >>> anyway thank you for all your help, it really helped me alot, and I have >>> much thing that I learned. >>> >>> >>> >>> >>> On Mon, Feb 24, 2014 at 11:01 AM, James Haughom <ja...@ne...> wrote: >>> >>>> Ewaldo >>>> >>>> Foremost produces an audit file which will give you the offset from >>>> which the file was recovered. You might be able to correlate the offset >>>> with data that you are extracting with istat. >>>> >>>> Good luck >>>> >>>> >>>> On 2/23/14 9:07 PM, ewaldo simon wrote: >>>> > Jason, >>>> > >>>> > Thank you very much, you're right, by running: >>>> > >>>> > ifind -o63 imagename.001 -d 20896 (and on other data unit on the same >>>> > FAT entry) >>>> > >>>> > result in diffrent inode/fat entry (which is 156282). 
>>>> > the thing is I have 6 excel files, carved by foremost that I suspected is related to those files (because the content of the file).
>>>> > FYI, "tagihan" means invoice, and those 6 files that I carved are invoices, the question is this:
>>>> > do you guys have any other mean to connect the files that I carved (with foremost) with those files that I found using fls? or how can I tell which data unit that foremost used to carved these files.
>>>> > Thank you before.
>> --
>> Regards,
>> Ewaldo Simon |
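The question quoted above (tying foremost's carves back to the names fls reports) is addressed elsewhere in the thread with ifind and foremost's audit file. A minimal sketch of that correlation, reusing the image name, the 63-sector partition offset, and the sector/inode numbers that appear in this thread; it assumes foremost's audit.txt lists each carve's byte offset from the start of the input image, and the OFFSET value below is only a placeholder, so check your own audit file before relying on it:

OFFSET=10731008                    # placeholder; substitute an offset from foremost's audit.txt
SECTOR=$(( OFFSET / 512 - 63 ))    # image byte offset -> filesystem sector (63-sector partition offset)
ifind -o 63 -d "$SECTOR" imagename.001    # which FAT directory entry points at that data unit?
ffind -o 63 imagename.001 156282          # map the returned entry to a name (156282 is the entry ifind returned earlier in the thread)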
From: Jason W. <jwr...@gm...> - 2014-02-25 12:25:24
|
Ewaldo,

The original icat states a file size of 24064, which is 47 sectors. Drop your count to 47 in your dd and see if that works.

R/

Jason

On Tue, Feb 25, 2014 at 12:05 AM, ewaldo simon <ewa...@gm...> wrote:
> I'm sorry to bump this thread, but I got lost again....
>
> here is my dd command
>
> dd if=imagename.001 of=dd_output.xls bs=512 skip=20959 count=48
>
> the skip comes from the first sector that istat produces by working on inode 1571469, which is sector 20896, then I add 63 as the offset.
>
> here is my icat command
>
> icat -o63 -v imagename.001 1571469 > icat_output.xls
>
> then I look for the md5sums of both of them.
>
> 3a1ef7b320ee2d675ed631dbd4bc53c3 dd_output.xls
> 1f12a90ee22e0565be9be1f9a8673688 icat_output.xls
>
> this is the output of running "diff dd_output.xls icat_output.xls":
>
> Binary files dd_output.xls and icat_output.xls differ
>
> why do they have different md5sums? aren't they supposed to be the same file? |
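Jason's arithmetic is easy to check: 24064 bytes is exactly 47 sectors of 512 bytes, while the earlier dd copied 48. A minimal sketch of the comparison, reusing the image name, the 63-sector offset, and directory entry 1571469 from the thread; the hashes should only agree if the run istat listed is still intact and the clusters have not been reused:

dd if=imagename.001 of=dd_output.xls bs=512 skip=$((63 + 20896)) count=47   # 20959 sectors in, 47 sectors out
icat -o 63 imagename.001 1571469 > icat_output.xls                         # logical file content, 24064 bytes
md5sum dd_output.xls icat_output.xls                                        # expect identical hashes if nothing overwrote the run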
From: Atila <ati...@dp...> - 2014-02-25 12:00:08
|
try

dd if=imagename.001 of=dd_output.xls bs=1 skip=$[20896*512+63*512] count=24064

On 25-02-2014 02:05, ewaldo simon wrote:
> I'm sorry to bump this thread, but I got lost again....
>
> here is my dd command
>
> dd if=imagename.001 of=dd_output.xls bs=512 skip=20959 count=48
>
> the skip comes from the first sector that istat produces by working on inode 1571469, which is sector 20896, then I add 63 as the offset.
>
> here is my icat command
>
> icat -o63 -v imagename.001 1571469 > icat_output.xls
>
> then I look for the md5sums of both of them.
>
> 3a1ef7b320ee2d675ed631dbd4bc53c3 dd_output.xls
> 1f12a90ee22e0565be9be1f9a8673688 icat_output.xls
>
> why do they have different md5sums? aren't they supposed to be the same file? |
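Atila's byte-granular dd starts at the same place as the earlier attempts: 20896*512 + 63*512 = 20959*512 = 10731008, the byte offset icat -v reports in its final raw_read line in Ewaldo's message. bs=1 is correct but slow on a large image; two equivalent sketches, assuming the same file names and the contiguous run shown by istat:

# copy exactly the 24064 logical bytes (47 sectors of 512 bytes)
dd if=imagename.001 of=dd_output.xls bs=512 skip=20959 count=47
# or copy all 48 allocated sectors and trim the 512 bytes of slack afterwards
dd if=imagename.001 bs=512 skip=20959 count=48 | head -c 24064 > dd_output.xls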
From: ewaldo s. <ewa...@gm...> - 2014-02-25 05:06:03
|
I'm sorry to bump this thread, but I got lost again....

here is my dd command

dd if=imagename.001 of=dd_output.xls bs=512 skip=20959 count=48

the skip comes from the first sector that istat produces by working on inode 1571469, which is sector 20896, then I add 63 as the offset.

here is my icat command

icat -o63 -v imagename.001 1571469 > icat_output.xls

tsk_img_open: Type: 0 NumImg: 1 Img1: imagename.001
Not an EWF file
fsopen: Auto detection mode at offset 32256
raw_read: byte offset: 32256 len: 65536
raw_read: byte offset: 97792 len: 65536
raw_read: byte offset: 294400 len: 65536
iso9660_open img_info: 140321104 ftype: 2048 test: 1
iso_load_vol_desc: Bad volume descriptor: Magic number is not CD001
Trying RAW ISO9660 with 16-byte pre-block size
fs_prepost_read: Mapped 32768 to 69904
iso_load_vol_desc: Bad volume descriptor: Magic number is not CD001
Trying RAW ISO9660 with 24-byte pre-block size
fs_prepost_read: Mapped 32768 to 69912
iso_load_vol_desc: Bad volume descriptor: Magic number is not CD001
fatfs_inode_lookup: reading sector 105856 for inode 1571469
raw_read: byte offset: 54230528 len: 65536
raw_read: byte offset: 97280 len: 65536
tsk_fs_file_walk: Processing file 1571469
fatfs_make_data_run: Processing deleted file 1571469 in recovery mode
raw_read: byte offset: 10731008 len: 65536

then I look for the md5sums of both of them.

3a1ef7b320ee2d675ed631dbd4bc53c3 dd_output.xls
1f12a90ee22e0565be9be1f9a8673688 icat_output.xls

this is the output of running "diff dd_output.xls icat_output.xls":

Binary files dd_output.xls and icat_output.xls differ

why do they have different md5sums? aren't they supposed to be the same file?

On Mon, Feb 24, 2014 at 5:44 PM, ewaldo simon <ewa...@gm...> wrote:
> James,
>
> I've read the foremost auditlog, but after trying to find inode of those files that foremost carved, none of them is being pointed by the FAT entries of "tagihan.xls" I've shown before.
>
> I've also check the metadata of those excel files that foremost carved, and also no information there. to read the metadata I open them on libreoffice and also right click properties on explorer.
>
> I guess that's it then, unless there is another suggestion, I will be more than happy to oblige.
> anyway thank you for all your help, it really helped me alot, and I have much thing that I learned.
>
> On Mon, Feb 24, 2014 at 11:01 AM, James Haughom <ja...@ne...> wrote:
>> Ewaldo
>>
>> Foremost produces an audit file which will give you the offset from which the file was recovered. You might be able to correlate the offset with data that you are extracting with istat.
>>
>> Good luck
>>
>> On 2/23/14 9:07 PM, ewaldo simon wrote:
>> > Jason,
>> >
>> > Thank you very much, you're right, by running:
>> >
>> > ifind -o63 imagename.001 -d 20896 (and on other data unit on the same FAT entry)
>> >
>> > result in diffrent inode/fat entry (which is 156282).
>> >
>> > the thing is I have 6 excel files, carved by foremost that I suspected is related to those files (because the content of the file).
>> > FYI, "tagihan" means invoice, and those 6 files that I carved are invoices, the question is this:
>> > do you guys have any other mean to connect the files that I carved (with foremost) with those files that I found using fls? or how can I tell which data unit that foremost used to carved these files.
>> >
>> > Thank you before.
>> >
>> > ps. I posted this also in forensicfocus.com, and if it is OK with you, I'll post this discussion result in there too.
>> >
>> > On Sun, Feb 23, 2014 at 10:34 PM, Jason Wright <jwr...@gm...> wrote:
>> >
>> > Ewaldo,
>> >
>> > It is possible the blocks have been reused since these are all deleted references. The metadata can still reference the file, but the blocks can be reused by an allocated file. If the header of the file dumped by icat isn't for an xls file then that is likely the case. Use ifind to see what inode is associated with the specific block (20896 for example).
>> >
>> > R/
>> >
>> > Jason
>> >
>> > First of all thank you for your suggestion, I've tried your suggestion and change the blocksize, It didn't work out either. I've also tried using icat on that inode (which doesn't use blocksize) it didn't work either.
>> > is it possible that the data unit that being pointed by the metadata (FAT entries) already used by another file/folder?
>> >
>> > actually I managed to save some of those (if it is true that that is the files) files needed, by using foremost, and I need to find the names of the files, so I tried using dd and/or icat.
>> >
>> > so the bottomline is I need a way to extract those excel, or found out the name that foremost carved, or make sure that some of those files shown with fls is actually the one that foremost recovered.
>> >
>> > Thank you before, and sorry for my bad english.
>> >
>> > On Sun, Feb 23, 2014 at 1:13 AM, Alex Nelson <ajn...@cs...> wrote:
>> >
>> > Hi Ewaldo,
>> >
>> > Your dd should work given the sector offsets. However, you passed a sector size of 4096 (bs=4096). I'm guessing because -o63 worked in istat, you should have passed bs=512 in dd.
>> >
>> > --Alex
>> >
>> > On Feb 22, 2014, at 01:44 , ewaldo simon <ewa...@gm...> wrote:
>> >
>> >> Dear sleuthkit user mailing list, can anyone help me with this, I am trying to recover some orphan files.
>> >>
>> >> 1. first, using fls, I've found some orphaned files:
>> >>
>> >> Code:
>> >> fls -o 63 -r F imagename.001 | grep -i file_name
>> >>
>> >> -/r * 649873: $OrphanFiles/TAGIHAN.xls
>> >> r/r * 122506: $OrphanFiles/PT8D15~1.NUG/REKAP TAGIHAN MAR'11.xls
>> >> -/r * 1212051: $OrphanFiles/TAGIHA~1.XLS
>> >> -/r * 1282702: $OrphanFiles/TAGIHA~1.XLS
>> >> -/r * 1374865: $OrphanFiles/TAGIHA~1.XLS
>> >> -/r * 1472145: $OrphanFiles/TAGIHA~1.XLS
>> >> -/r * 1519249: $OrphanFiles/TAGIHA~1.XLS
>> >> -/r * 1571469: $OrphanFiles/TAGIHA~1.XLS
>> >>
>> >> 2. then, using istat to see the metadata of the last file listed before (this is the part that I got wrong the last time)
>> >>
>> >> Code:
>> >> istat -o 63 imagename 1571469
>> >>
>> >> Directory Entry: 1571469
>> >> Not Allocated
>> >> File Attributes: File, Archive
>> >> Size: 24064
>> >> Name: TAGIHA~1.XLS
>> >>
>> >> Directory Entry Times:
>> >> Written: Mon Aug 24 14:26:16 2009
>> >> Accessed: Tue Aug 7 00:00:00 2012
>> >> Created: Tue Aug 7 09:40:58 2012
>> >>
>> >> Sectors:
>> >> 20896 20897 20898 20899 20900 20901 20902 20903
>> >> 20904 20905 20906 20907 20908 20909 20910 20911
>> >> 20912 20913 20914 20915 20916 20917 20918 20919
>> >> 20920 20921 20922 20923 20924 20925 20926 20927
>> >> 20928 20929 20930 20931 20932 20933 20934 20935
>> >> 20936 20937 20938 20939 20940 20941 20942 20943
>> >>
>> >> it means that the directory entry still points to the FAT entries and in the end points to the sectors used by that file.
>> >>
>> >> 3. now I don't get how to recover the TAGIHA~1.XLS
>> >>
>> >> I've tried using dd:
>> >> Code:
>> >> dd if=imagefile of=outputfile bs=4096 skip=20896 count=6
>> >> and
>> >> also icat
>> >>
>> >> icat -o 63 imagename.001 1571496 > TAGI~1.xls
>> >>
>> >> again to no avail.
>> >>
>> >> I've tried recovering with foremost, and it does recover some files, but I need the name of the files, that's why I'm trying to use this method.
>> >> Please correct me if I'm wrong, and give me the hint where to go from here. I really appriciate your help, thank you.
>> >>
>> >> --
>> >> Regards,
>> >> Ewaldo Simon

--
Regards,
Ewaldo Simon |
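For reference, the size difference is the most likely reason the two hashes above disagree: bs=512 count=48 copies 48 full sectors (24576 bytes), i.e. the allocated clusters including 512 bytes of file slack, whereas icat writes only the 24064 bytes recorded in the directory entry; both reads start at the same image offset (20959 * 512 = 10731008, the last raw_read in the icat -v output). A small check, assuming the same output file names and that the clusters were not reused:

ls -l dd_output.xls icat_output.xls    # expect 24576 vs 24064 bytes
head -c 24064 dd_output.xls | md5sum   # trim the dd copy to the logical file size
md5sum icat_output.xls                 # should print the same hash as the line above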