On an attempt to build www.wikidata.org in xowa using the online import, the bz2 file downloads fine and extracts fine, but when the database starts building, the following errors are thrown with each chunk that is read:

00.41%|Q10672|class gplx.Err 0 New_stmt_prep.fail; sql=SELECT site_id,ns_id,ns_case,ns_subpages,ns_content,ns_name,ns_canonical FROM wmf_ns WHERE site_id = ? err=class java.sql.SQLException [SQLITE_ERROR] SQL error or missing database (no such column: ns_subpages):class java.sql.SQLException [SQLITE_ERROR] SQL error or missing database (no such column: ns_subpages)
1 class java.sql.SQLException [SQLITE_ERROR] SQL error or missing database (no such column: ns_subpages) <java.sql.SQLException>
This also occurs if an offline import of the downloaded bz2 file is attempted. The errors also show if the import from list option is used. The manual method also throws these errors.
These errors occur on a Linux 2.6 platform; I do not know if they occur on Windows.
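For anyone who wants to confirm the missing columns on their own copy, the sketch below (not part of XOWA; it assumes the xerial sqlite-jdbc driver is on the classpath, and the path is only an example) lists the columns of the wmf_ns table in wmf_data.sqlite3:

import java.sql.*;

public class CheckWmfNs {
    public static void main(String[] args) throws Exception {
        // Hypothetical path: adjust to wherever your install keeps the config database.
        String url = "jdbc:sqlite:/home/your_name/xowa/bin/any/xowa/cfg/wiki/wmf_data.sqlite3";
        try (Connection conn = DriverManager.getConnection(url);
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery("PRAGMA table_info(wmf_ns)")) {
            // Print every column; a broken copy is missing ns_subpages and ns_content.
            while (rs.next()) {
                System.out.println(rs.getString("name") + " (" + rs.getString("type") + ")");
            }
        }
    }
}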
Damn, I broke this in v2.5.1. I'll fix this for either v2.5.2.2 (tomorrow: still debating) or v2.5.3.1 (next week). In the meantime, you can replace your copy with the attached wmf_data.sqlite3 at C:\xowa\bin\any\xowa\cfg\wiki\wmf_data.sqlite3 (or wherever it is installed). You can also run the SQL statements below:

ALTER TABLE wmf_ns ADD ns_subpages byte NOT NULL DEFAULT 0;
ALTER TABLE wmf_ns ADD ns_content byte NOT NULL DEFAULT 0;
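For those who prefer to script the fix rather than run the statements by hand, a minimal Java sketch along these lines would apply them; the path and the sqlite-jdbc driver are assumptions, and replacing the file with the attached copy achieves the same result:

import java.sql.*;

public class PatchWmfNs {
    public static void main(String[] args) throws Exception {
        // Hypothetical path: point this at your own wmf_data.sqlite3.
        String url = "jdbc:sqlite:/home/your_name/xowa/bin/any/xowa/cfg/wiki/wmf_data.sqlite3";
        try (Connection conn = DriverManager.getConnection(url);
             Statement stmt = conn.createStatement()) {
            // Add the two columns the importer expects; this throws "duplicate column name"
            // if the columns are already present.
            stmt.executeUpdate("ALTER TABLE wmf_ns ADD ns_subpages byte NOT NULL DEFAULT 0");
            stmt.executeUpdate("ALTER TABLE wmf_ns ADD ns_content byte NOT NULL DEFAULT 0");
            System.out.println("wmf_ns patched");
        }
    }
}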
Also, I like to credit users for finding defects in XOWA. There would be a line in the XOWA change log with text like the following: "Fix broken wikidata import {detected by anonymous}." Let me know if you want to be cited by a nickname or would rather remain anonymous.
Thanks!
You can cite my nickname; glad to be of help.
Also - is the attachment usable on Linux (you use a Windows file reference), or is it device independent and I can drop it into ./xowa/wiki/?
Yup. All the SQLite files are OS-agnostic: Linux, Mac OS X, Windows, etc.
I gave a Windows file reference just for simplicity's sake. Assuming your xowa is at /home/your_name/xowa/xowa_linux_64.jar, you would overwrite the file at /home/your_name/xowa/bin/any/xowa/cfg/wiki/wmf_data.sqlite3.
I'm going to try to release a version tonight, but it'll just include the file above.
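For reference, a copy step like the sketch below would do the overwrite on any platform; both paths are assumptions based on the locations mentioned above, so adjust them to your own install:

import java.nio.file.*;

public class ReplaceWmfData {
    public static void main(String[] args) throws Exception {
        // Hypothetical source path: wherever the attached wmf_data.sqlite3 was saved.
        Path downloaded = Paths.get("/home/your_name/Downloads/wmf_data.sqlite3");
        // Install location from the comment above; adjust if XOWA lives elsewhere.
        Path installed = Paths.get("/home/your_name/xowa/bin/any/xowa/cfg/wiki/wmf_data.sqlite3");
        Files.copy(downloaded, installed, StandardCopyOption.REPLACE_EXISTING);
        System.out.println("Replaced " + installed);
    }
}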
Cool. "Tim Weigel" it is. Thanks again for the catch!
Thanks for confirming the file is platform independent; I did find it in the file structure before reading this as well. So I'll replace the file and try again to build wikidata. If it builds, then I suspect I'll need to rebuild the other two wikis I have downloaded (I let those jobs go overnight, so I don't know what happened, but the English wiki completed really fast, so I suspect it is corrupted too).
The wikidata build failure shouldn't affect the build of the other two wikis. It will affect how the other wikis show articles (e.g., a bad wikidata may cause pages in English Wikipedia to not show the correct birthdate), but the build process for any other wiki does not rely on wikidata.
Also, as a point of reference, the English wiki takes about 2 to 3 hours to build. If it completed much faster than that, you may have a truncated dump file. This has actually happened to me once or twice before (out of a few hundred downloads). I'd check the file size against http://dumps.wikimedia.org/enwiki/
Hope this helps.
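As a rough way to make that comparison, the sketch below simply reports the local dump's size and MD5 so they can be checked against the figures published alongside the dump; the file name is an assumption, and this is a generic integrity check rather than anything XOWA-specific:

import java.io.InputStream;
import java.nio.file.*;
import java.security.MessageDigest;

public class CheckDump {
    public static void main(String[] args) throws Exception {
        // Hypothetical local dump path; substitute the file you actually downloaded.
        Path dump = Paths.get("/home/your_name/xowa/wiki/enwiki-20150403-pages-articles.xml.bz2");
        System.out.println("Size in bytes: " + Files.size(dump));

        // MD5 of the local file, for comparison with any checksum list published with the dump.
        MessageDigest md5 = MessageDigest.getInstance("MD5");
        try (InputStream in = Files.newInputStream(dump)) {
            byte[] buf = new byte[1 << 16];
            for (int n; (n = in.read(buf)) != -1; ) {
                md5.update(buf, 0, n);
            }
        }
        StringBuilder hex = new StringBuilder();
        for (byte b : md5.digest()) {
            hex.append(String.format("%02x", b));
        }
        System.out.println("MD5: " + hex);
    }
}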
The wmf_data.sqlite3 file worked; the build got further than before, but the following dialog popped up:
Where do I find that file to make the program 'happy'?
A little more info - I was building the wiki from an existing dump (the one that pulls down from the import page) and an already-decompressed XML file. It shouldn't make a difference, but I thought I'd point that out.
Ok. If you were building a wiki manually from home/wiki/Help:Import/Script, you will also need to download http://dumps.wikimedia.org/enwiki/20150403/enwiki-20150403-categorylinks.sql.gz and http://dumps.wikimedia.org/enwiki/20150403/enwiki-20150403-page_props.sql.gz . The page assumes the user will download all files manually.
Hope that helps.
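If it helps, a download step along the lines of the sketch below would fetch those two files using only the Java standard library; the target directory is an assumption, so use whatever location the Help:Import/Script page expects:

import java.io.InputStream;
import java.net.URL;
import java.nio.file.*;

public class FetchDumpExtras {
    public static void main(String[] args) throws Exception {
        String[] urls = {
            "http://dumps.wikimedia.org/enwiki/20150403/enwiki-20150403-categorylinks.sql.gz",
            "http://dumps.wikimedia.org/enwiki/20150403/enwiki-20150403-page_props.sql.gz"
        };
        // Hypothetical target directory; adjust to wherever your manual import expects the files.
        Path dir = Paths.get("/home/your_name/xowa/wiki");
        Files.createDirectories(dir);
        for (String u : urls) {
            Path target = dir.resolve(u.substring(u.lastIndexOf('/') + 1));
            try (InputStream in = new URL(u).openStream()) {
                // Stream each file straight to disk; they are large, so avoid loading into memory.
                Files.copy(in, target, StandardCopyOption.REPLACE_EXISTING);
            }
            System.out.println("Downloaded " + target);
        }
    }
}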
Will give it a try tonight. It's rough when the day job interferes with important things. :)
Understood. Let me know if there's anything else. Thanks!
We can call this issue "Solved". I used the automated import overnight last night. Wikidata built correctly and no errors showed on the command console.
Cool. Thanks for the follow-up!