HDGraph doesn't show true disk usage.

  • Mustang

    Mustang - 2007-02-26


    I ran HDGraph on our file server from the root folder, and after several minutes of work it displayed the graph. All the disk usage amounts were incorrect. It stated that the overall usage was 152 GB, when it should be around 480 GB.

    Is there something I did wrong, or is this a bug?

    • jyl

      jyl - 2007-02-27

      HDGraph only displays the folders and files it has rights to. More precisely, if it is not allowed to access a file or a folder (a good example is the folder "c:\System Volume Information"), it counts its size as 0 bytes (the file or folder is skipped). To reduce the risk of hitting access restrictions, make sure you run HDGraph as administrator.
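      The skip-as-zero behaviour described above can be sketched as follows. This is an illustrative re-implementation in Python, not HDGraph's actual C# code:

```python
import os

def tree_size(path, skipped=None):
    """Sum file sizes under `path`, counting unreadable entries as 0 bytes.

    Entries that cannot be accessed are recorded in `skipped` instead of
    aborting the scan -- which is why a scan run without sufficient rights
    can report far less than the true disk usage.
    """
    if skipped is None:
        skipped = []
    total = 0
    try:
        entries = list(os.scandir(path))
    except OSError:
        skipped.append(path)  # no rights: the whole folder counts as 0 bytes
        return total
    for entry in entries:
        try:
            if entry.is_dir(follow_symlinks=False):
                total += tree_size(entry.path, skipped)
            elif entry.is_file(follow_symlinks=False):
                total += entry.stat(follow_symlinks=False).st_size
        except OSError:
            skipped.append(entry.path)  # unreadable entry counts as 0 bytes
    return total
```

      Everything appended to `skipped` in this sketch corresponds to what HDGraph writes to its log file.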

      For the moment, the only way to know which files and folders were skipped is to open the log file. I plan to add a more user-friendly way to see the skipped items in the next version of HDGraph.
      The log file can be found in the directory where HDGraph was installed (default: C:\Program Files\HDGraph\HDGraph.log). Please open it manually instead of using the HDGraph 'Tools' menu (which works unreliably; I'm working on it). If the log file doesn't exist, please make sure that HDGraph has write access to its own folder.

      If you think that access restrictions are not the real cause of your problem, please send me your log file. I'll check whether there are other exceptions.

      I'm really interested in this problem. Could you keep me informed about it? Thank you very much.


      • Mustang

        Mustang - 2007-02-27


        I downloaded a trial version of TreeSize Professional and got different results. It reported total usage as 207 GB.

        I then manually right-clicked on each of the root folders and totaled the folder sizes by hand. My total was also about 207 GB.

        When I ran HDGraph, it wasn't just the total size that was wrong but the subfolder sizes as well. For example, the "staff" folder, which is used by everyone in the office (so permissions are open to all), was reported to have a size of 17 GB, but it really has 64 GB.

        I do know that we have a large number of files on the file server marked as hidden. This was a bug in the file transfer when we migrated our file system to a Unix box: for some reason the ghosted copies of the files all had their hidden attribute set. If that's what is causing the problem with HDGraph, then I think HDGraph should include hidden files in its scans.

        The log file is about 24K and has lots of entries like this one, for different folders. I'd post the entire log file, but it contains information I don't think my boss would like posted on the internet. :)

        HDGraph.exe Error: 0 : Error during file analysis (\\Virgil\d1\deadline_repository\DeadlineRepository25\0%.64). Details: Could not find file '0%.64'. - Source: mscorlib - Stack:    at System.IO.__Error.WinIOError(Int32 errorCode, String maybeFullPath)
           at System.IO.FileInfo.get_Length()
           at HDGraph.MoteurGraphiqueur.ConstruireArborescence(DirectoryNode dir, Int32 maxLevel)

    • jyl

      jyl - 2007-02-27


      After several tests, I can't explain the differences you're seeing. I'm sure it's not a capacity problem (HDGraph supports folder sizes up to 9,223,372 TB), even though HDGraph hasn't been tested on network drives as large as yours...
      I'm also sure that it's not a "hidden attribute" problem: HDGraph already includes hidden files.

      The last possibilities I can think of are the following:
      - brief network interruptions, which could abort the analysis of an isolated folder and its subfolders.
      - heavy activity on the network drive, which could explain errors like "Could not find file '0%.64'": the file may have been deleted between the moment HDGraph lists the files and the moment it reads the file size.
      - some problem between the Win32 API and the Unix filesystem (my knowledge of this is very limited).
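      The second possibility is a classic race: a file returned by the directory listing can be deleted before its size is read. A sketch of how a scanner can tolerate this (Python for illustration only; HDGraph itself is written in C#):

```python
import os

def safe_size(path):
    """Return the file's size, or 0 if it vanished between the directory
    listing and the size lookup (e.g. on a busy render-farm drive where
    files are created and deleted constantly)."""
    try:
        return os.path.getsize(path)
    except OSError:
        # Deleted (or renamed) after it was listed: skip it rather than
        # abort the whole analysis with "Could not find file ...".
        return 0
```

      Treating the vanished file as 0 bytes keeps the scan running, at the cost of a snapshot that can never be perfectly consistent on an active drive.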

      ... I agree it's not very conclusive.

      The tests I made at home show that HDGraph displays the same sizes as its competitors (TreeSize, WinDirStat). So I didn't succeed in reproducing the problem.

      What do you think about this?
      Do you have any ideas?
      Did you test HDGraph more than once? Can you reproduce the problem?

      Many thanks for the time you are spending on this,


      • Mustang

        Mustang - 2007-02-28

        This network drive has about 222,716 files on it, but it changes every day because it's used as a storage drive for our render farm.

        What is a "network break"? I never have problems reading files on the server.
        I couldn't find any file under that folder with the name "0%.64", so there is something strange about the file name reported by HDGraph.
        I don't think there are any "Win32 API and Unix" problems, though I'm not sure about such things. I do know that the machine contains several RAID drives, and they are mounted on the server as one drive. It is a terabyte server that has been mirrored, so it shows up as about 550 GB of maximum storage. I think it actually contains four 250 GB drives. The reason I used HDGraph is that we've run out of disk space, and I need to figure out where the storage is being used.

        Anyway, I'm hoping that this summer we'll upgrade to a 10 terabyte server. I wonder how long it would take HDGraph to run on that thing? :)

        I'm running HDGraph again this morning and will post my results, to see if anything has changed.

      • Mustang

        Mustang - 2007-02-28

        Hi again,

        I ran HDGraph again on the drive, and got the same results. The root total is wrong, and the sub-folder totals are wrong.

        I took a screen capture of HDGraph and an example of the properties of one of the sub-folders. You can see they don't match.


        Any ideas?

    • jyl

      jyl - 2007-02-28

      Ok. I have a good clue. I found this on pinvoke.net ( http://pinvoke.net/default.aspx/kernel32/FindFirstFile.html ):

      The managed API only gives access to files/directories limited to 260 characters in length. This is a major flaw in the managed APIs, which every administrator of a file server out there should agree on. This is the common scenario:
      1. Create a folder like E:\CompanyData\Management on the server.
      2. Share this as Management.
      3. The user mounts this from his client PC to a drive letter like F:.
      4. The user creates directories and files below this F:. The user then has access to a fresh 260-character length.
      5. If the user creates a file with a name of e.g. 250 characters, or with a combined length of subdirectory names and file names of up to 260, these files will not be visible in Windows Explorer on the file server AND NOT be visible to 'FileInfo' and 'DirectoryInfo' either. In fact, those APIs blow up with an exception when encountering this situation.

      HDGraph uses the managed API of the .NET Framework and is therefore fully affected by this problem (it uses the 'FileInfo' and 'DirectoryInfo' objects).
      I'll try to correct this during the week-end by using the Win32 API directly.
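      For reference, the usual Win32 workaround is the extended-length path prefix: passing "\\?\" (or "\\?\UNC\" for network shares) to the wide-character APIs such as FindFirstFileW lifts the 260-character MAX_PATH limit. A small helper showing the string transformation only; this is an assumption about the planned fix, not HDGraph's code, and the prefix is only meaningful on Windows:

```python
def extended_length(path):
    r"""Prefix an absolute Windows path with \\?\ so the wide-char Win32
    APIs (e.g. FindFirstFileW) accept paths longer than MAX_PATH (260).
    UNC paths like \\server\share\... need the \\?\UNC\ form instead.
    Hypothetical helper illustrating the workaround, not HDGraph's code.
    """
    if path.startswith('\\\\?\\'):
        return path                      # already extended-length
    if path.startswith('\\\\'):          # UNC share: \\server\share\...
        return '\\\\?\\UNC\\' + path[2:]
    return '\\\\?\\' + path
```

      Only absolute paths may be prefixed this way; the prefix also disables the normalization (e.g. "." and "..") that the regular path forms get.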


    • jyl

      jyl - 2007-03-07

      I'm currently working on this. See bug #1676114 to stay informed. ( http://sourceforge.net/tracker/index.php?func=detail&aid=1676114&group_id=179516&atid=889463 ).

