From: Michael B. <mi...@aa...> - 2004-05-01 07:05:25
Hi list,

I got the tools compiled (finally) with the help of a lexer fix I found in the list archives (Jeff's, I think?) and the file can now be read. Unfortunately, I have discovered that a couple of the fields are large objects (images, I think, judging by the program). mdb-export seems to output these as just large numbers, but not large enough to be the whole image. Do binary objects get included in an export, or is that silly to expect given that the output is a text file? Is there a way to get the data into some other format? I seem to remember reading that you could export into other database formats; probably any of them would do, as long as the binary data all comes across. (There is a rough sketch of what I was planning to try in the P.S. below.)

Also, I am getting a segfault after only a few dozen rows. I include the output below in case it helps anyone. The problem is:

Unhandled ole field flags = ffff
Segmentation fault

mdb-schema outputs the following for the table in question:

DROP TABLE Exam;
CREATE TABLE Exam
 (
	ID			Long Integer (4),
	Patient_ID		Text (40),
	TestType		Text (100),
	TemplateName		DateTime (Short) (8),
	Date			Text (2),
	Duration		Single (4),
	Eye			Single (4),
	LensSphere		Single (4),
	LensCylinder		Memo/Hyperlink,
	LensAxis		OLE,
	Comments		Long Integer (4),
	Data			OLE,
	AnnotationData		Long Integer (4)
);

mdb-export outputs:

ID,Patient_ID,TestType,TemplateName,Date,Duration,Eye,LensSphere,LensCylinder,LensAxis,Comments,Data,AnnotationData
,"","","","",,,,"",,,,-805253120
,"","","","",,,,"",,,,-805253120
,"","","","",,,,"",,,,-805253120
,"","","","",,,,"",,,,-805253120
,"","","","",,,,"",,,,-805253120
<snip: many more the same>
,"","","","",,,,"",,,,-805253120
,"","","","",,,,"",,,,-805253120
,"","","","",,,,"",,,,-805253120
,"","","","",,,,"",,,,-805253120
,"","","","",,,,"",,,,-805253120
,"","","","",,,,"",,,,-805253120
,"","","","",,,,"",,,,-805253120
Unhandled ole field flags = ffff
,"","","","",,,,"",,,,
Reading LVAL page 0002eb
row num 0 row start 10 row stop 4095
row num 10 row start 256 row stop 256
Segmentation fault

Any ideas? I realise that since I have modified that skip-deleted flag, I may be reading invalid data and really pushing sh*t uphill, but I thought there might be a way to get it to just skip the busted rows, because there are another 3000-4000 rows still to come that may have good data in them. (That's if I can get the binary data out too!)

Thank you for your patience and time; I'll donate some of the money I get from this to the project if I'm successful.

--
Have a Nice Day!

Mike
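
P.S. For the "other database formats" route, this is roughly what I was planning to try. It is completely untested, and the postgres backend name and the -I option are guesses on my part, so they may not exist in every build:

    # dump the schema in PostgreSQL dialect and load it
    mdb-schema mydb.mdb postgres > schema.sql
    createdb exams
    psql exams -f schema.sql

    # dump the rows; plain CSV presumably won't carry the OLE/binary columns
    mdb-export mydb.mdb Exam > exam.csv

    # or, if the build supports it, INSERT statements fed straight to psql
    mdb-export -I mydb.mdb Exam > exam_inserts.sql
    psql exams -f exam_inserts.sql

If there is a better-supported way to get the OLE columns out (even just raw dumps to one file per row), I'd love to hear about it.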