
#7 Always attempts to overburn CD

Status: open
Owner: nobody
Labels: None
Priority: 5
Updated: 2003-09-24
Created: 2003-09-24
Creator: Anonymous
Private: No

Whether or not --cdsize is supplied, cdbkup always tries to overburn the
CD, writing way past its true capacity. Here is a clip from a typical
attempt:
Performing OPC...
Starting new track at sector: 0
Track 01: 702 MB written (fifo 96%) [buf 94%] 10.1x.
/usr/bin/cdrecord: Input/output error. write_g1: scsi sendcmd: no error
CDB: 2A 00 00 05 7D 89 00 00 1F 00
status: 0x2 (CHECK CONDITION)
Sense Bytes: 70 00 05 00 00 00 00 0A 00 00 00 00 24 00 00 00
Sense Key: 0x5 Illegal Request, Segment 0
Sense Code: 0x24 Qual 0x00 (invalid field in cdb) Fru 0x0
Sense flags: Blk 0 (not valid)
cmd finished after 0.000s timeout 40s

write track data: error after 736905216 bytes
Sense Bytes: 70 00 00 00 00 00 00 0A 00 00 00 00 00 00 00 00 00 00
Writing time: 494.815s
Min drive buffer fill was 94%
Fixating...
/usr/bin/cdrecord: Input/output error. close track/session: scsi sendcmd: no error
CDB: 5B 00 02 00 00 00 00 00 00 00
status: 0x2 (CHECK CONDITION)
Sense Bytes: 70 00 05 00 00 00 00 0A 00 00 00 00 2C 04 00 00
Sense Key: 0x5 Illegal Request, Segment 0
Sense Code: 0x2C Qual 0x04 (current program area is empty) Fru 0x0
Sense flags: Blk 0 (not valid)
cmd finished after 0.000s timeout 480s
cmd finished after 0.000s timeout 480s
Fixating time: 0.001s
/usr/bin/cdrecord: fifo had 11671 puts and 11608 gets.
/usr/bin/cdrecord: fifo was 0 times empty and 10358 times full, min fill was 81%.
Error saving image 1 occured.
Error reported by recording process or moving facility.
You may be able to fixate the disk with:
/usr/bin/cdrecord -v -data dev=1,0,0 speed=10 -fix
either now in another shell or later.
What now?

Discussion

  • Marco R. Gazzetta

    I found there is a bug in the way the CD size is determined. cdsplit,
    the utility that actually collects the data from tar, reports a smaller
    size than the file it actually writes, which makes the image too big for
    the CD.

    I tried with CD sizes much smaller than the actual CD, and it seems to
    work. The offending section of code is:

    # Create the temporary file, watching out for filesize overload
    $imgsize = 0;
    open( TMPIMG, ">$tmpimg") || die "Can't open '$tmpimg': $!\n";
    while( $rc = read( INPUT, $data, $increment)) {
        $imgsize += $rc;
        print TMPIMG $data || die "Can't write data to tmp file: $!\n";
        last if( netsize($imgsize + $increment) > $maxsize);
    }

    I haven't found out why the sizes reported by read and the ones
    actually dumped by print are different. I thought it might be byte vs
    char and tried use bytes and no bytes in all combinations I could think
    of, but to no avail.
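
    A minimal, hypothetical demonstration of the mechanism (the file names
    and the :crlf layer below are made up for illustration and are not from
    cdsplit): when the output handle carries a translating I/O layer, print
    puts more bytes on disk than read reported, which is exactly the kind of
    mismatch described above, and binmode removes the layer.

    use strict;
    use warnings;

    my $data = "line\n" x 1000;        # 5000 characters, as read() would report

    open( my $cooked, ">", "cooked.img" ) or die "Can't open cooked.img: $!\n";
    binmode( $cooked, ":crlf" );       # simulate a translating layer on output
    print {$cooked} $data or die "Can't write: $!\n";
    close( $cooked );

    open( my $raw, ">", "raw.img" ) or die "Can't open raw.img: $!\n";
    binmode( $raw );                   # raw bytes, no translation
    print {$raw} $data or die "Can't write: $!\n";
    close( $raw );

    printf "cooked: %d bytes, raw: %d bytes\n", -s "cooked.img", -s "raw.img";
    # cooked.img ends up 6000 bytes, raw.img 5000: the size the program
    # tracked while reading no longer matches what lands on disk unless the
    # handle is in binary mode.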

     
  • Mike Colandro - 2004-01-04

    Workaround: --cdsize=487356803 for a 700MB CD-R. This works on my
    RedHat 8.0 system. I agree with the previous commenter about the
    location of the problem. There needs to be some way to check that $rc
    is not too different from $increment. Why not just use $increment?
    That would introduce an absolute error of at most 2048 bytes.
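
    A hedged sketch of that suggestion, reusing the variable names from the
    loop quoted above (INPUT, TMPIMG, $increment, $maxsize and netsize are
    assumed to exist exactly as in cdsplit): charging a full $increment per
    read can only overestimate the image size, so the loop stops early
    rather than overrunning the disc.

    # Count a whole $increment per read instead of the exact $rc; the
    # running total overestimates by at most one block (the final short
    # read), which errs on the safe side for the CD capacity check.
    $imgsize = 0;
    while( $rc = read( INPUT, $data, $increment)) {
        $imgsize += $increment;
        print TMPIMG $data or die "Can't write data to tmp file: $!\n";
        last if( netsize($imgsize + $increment) > $maxsize);
    }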

     
  • Marco R. Gazzetta

    The fix turns out to be really easy to implement. In cdsplit, in the
    section of code mentioned above, the file handle TMPIMG has to be set
    to binary mode:

    open( TMPIMG....) || die...;
    binmode( TMPIMG );

    Then everything is fine.
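
    For reference, a sketch of the quoted loop with that fix folded in
    (same variable names as above are assumed; "or die" is used instead of
    "|| die" so the error check binds to print rather than to $data):

    # Create the temporary file, watching out for filesize overload
    open( TMPIMG, ">$tmpimg") || die "Can't open '$tmpimg': $!\n";
    binmode( TMPIMG );                 # write raw bytes, no layer translation
    $imgsize = 0;
    while( $rc = read( INPUT, $data, $increment)) {
        $imgsize += $rc;
        print TMPIMG $data or die "Can't write data to tmp file: $!\n";
        last if( netsize($imgsize + $increment) > $maxsize);
    }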

     
