We have several .wav files that are 1-2GB in length and follow the DS64 standard. After we use BWF MetaEdit to add metadata to these files, they are reported as truncated.
It looks like BWF MetaEdit does not update the riffSize field of the DS64 chunk if the file will be smaller than MAX_RIFF_SIZE when saved. When BWF MetaEdit later reads the modified file, it uses the stale value in the DS64 riffSize field and reports the file as truncated.
For example, the relevant chunk fields of our original file look like this:
File size: 2184161768 (as reported by Windows)
Header id: RF64
Header length: 0xFFFFFFFF (-1)
Header riff type: wave
DS64 riff size: 2184161760
DS64 datasize: 2184160788
After BWF MetaEdit modifies the file (using the command-line option "--history=testing"), this is how they look:
File size: 2184161776 (as reported by Windows)
Header id: RF64
Header length: 2184161768
Header rifftype: wave
DS64 riff size: 2184161760
DS64 datasize: 2184160788
The DS64 riff size field should be (I think) 2184161768, but it is still equal to 2184161760, which is what it was in the original file.
I'm barely fluent in C++, but it looks like this is happening at line 321 of Riff_base.cpp:
if (Block_Size>RIFF_Size_Limit)
{
    //We need RF64
    if (Chunk.Header.Level==1)
    {
        if (Global->ds64==NULL)
        {
            Chunk.Header.List=Elements::RF64;
            Global->ds64=new Riff_Base::global::chunk_ds64;
            Subs.insert(Subs.begin(), new Riff_WAVE_ds64(Global)); //First place, always
            Subs[0]->Modify();
            Block_Size=Block_Size_Get();
        }
        if (Chunk.Header.List==Elements::RF64)
        {
            Global->ds64->riffSize=Block_Size-8;
            if (Global->data)
                Global->ds64->dataSize=Global->data->Size;
        }
        Subs[0]->Modify();
    }

    //Setting default value
    Block_Size=8+0xFFFFFFFFLL; //Putting the maximum size in this chunk size
}
The only test here is whether the size is greater than RIFF_Size_Limit. If it is not, the utility updates the standard file header but does not update or remove the DS64 chunk, nor change the header id back to "RIFF".
If necessary, I can provide you with test files, etc.
Thanks for the comment. This issue was fixed in v1.3.1. Are you using 1.3.1?
Thanks for your response. We are using 1.3.1 and I am debugging against the source code for 1.3.1, which I've modified to compile in Visual Studio 2015.
There are comments in the source code (lines 184-189 of Riff_Base.cpp on sourceforge.net) that indicate a ds64 issue was fixed:
But it looks like this fix only addresses an issue with the size of the data chunk.
In our case, the size of the data chunk reported in the DS64 does not change, so it remains valid. But adding metadata increases the size of one of the other chunks. This increases the size of the file, but the riffSize field of the DS64 chunk is not updated to reflect the new size.
It looks like the issue happens at lines 479-480 of Riff_Base.h (sourceforge.net):
Again, I'm not much of a C++ programmer, but it looks like these lines set Chunk.Content.Size to the value of the riffSize field in the ds64 block minus 4 whenever the chunk being read is the ds64 chunk, regardless of whether the header length field is set to RIFF_Size_Limit.
If I modify these lines to be:
Then it seems to fix the truncation issue. With this change in place, I can, for example, run --out-core-xml against the file that was previously reported as truncated and get this XML:
But if I undo the change, the truncation error recurs.
Again, I'm not a C++ developer and I think things are a lot more complicated than this -- the ds64 chunk still has invalid data after the above change.
Finally, I've put "before" and "after" versions of the file that I'm working with in a shared box.com folder and have given Mike Casey permission to share the folder, if that helps.
Thanks yet again for taking time to look into this.