From: Harold H. <ha...@ha...> - 2022-01-17 19:27:57
THANKS for all your work on this! I moved latest_ver, links, page_data, and ver_data to a backup directory, created new empty directories, then did a restore from a downloaded dump zip. All looks good so far, with the garbage files gone. I'll check in a few days to make sure they do not reappear as people try to hack the system. THANKS!

Harold

On Mon, January 17, 2022 4:04 am, Vargenau, Marc-Etienne (Nokia - FR/Paris-Saclay) wrote:
> Hi Harold,
>
> With the modification introduced in Subversion 10904, accessing a
> non-existent page should no longer create a file in the page_data
> directory. Please test that it works for you.
>
> I expect to publish PhpWiki 1.6.1 soon.
>
> Best regards,
>
> Marc-Etienne
>
> -----Original Message-----
> From: Harold Hallikainen <ha...@ha...>
> Sent: Monday, January 17, 2022 5:27 AM
> To: php...@li...
> Subject: Re: [Phpwiki-talk] Issues with Wiki Dump and Garbage in page_data
>
> Done! Actually, the method of my previous updates was pretty poor, and I
> ended up with nested versions, so it was difficult to tell which version
> was actually running. I have now just unzipped revision 10904, renamed it,
> and copied over my old config.ini after comparing it with config-dist to
> see what changes needed to be made. It appears to be working!
>
> So, back to the original question: are files still created in page_data
> and ver_data when a nonexistent page is requested? Again, my DB method is
> "file". I have a lot of trash in those two directories.
>
> I don't see an easy way of getting rid of the trash other than doing a
> dump of the wiki, deleting everything in those directories, then doing a
> restore. Is this a reasonable approach?
>
> THANKS!
>
> Harold
>
>
> On Sun, January 16, 2022 12:43 pm, Vargenau, Marc-Etienne (Nokia -
> FR/Paris-Saclay) wrote:
>> Hi Harold,
>>
>> Can you please update to Subversion 10904 and test?
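[Editor's note: a minimal sketch of the dump-then-restore cleanup described above, assuming the flat-file backend lives in a wiki_data directory containing latest_ver, links, page_data, and ver_data, and that a full dump has already been taken. All paths here are hypothetical demo paths, not part of PhpWiki itself.]

```shell
# Demo setup: fake a flat-file tree (a real install already has one).
WIKI_DATA=$(mktemp -d)/wiki_data
for d in latest_ver links page_data ver_data; do mkdir -p "$WIKI_DATA/$d"; done
touch "$WIKI_DATA/page_data/%28%29"    # stand-in for a garbage "hack" file

# The cleanup itself: move the directories aside, recreate them empty,
# then restore from the dump through the wiki's load action.
BACKUP="$WIKI_DATA.bak"
mkdir -p "$BACKUP"
for d in latest_ver links page_data ver_data; do
    mv "$WIKI_DATA/$d" "$BACKUP/"      # keep originals until restore is verified
    mkdir "$WIKI_DATA/$d"              # PhpWiki expects these directories to exist
done
echo "garbage moved aside; page_data is now empty"
```

Keeping the originals in a backup directory, rather than deleting them outright, means the old data is still there if the restore goes wrong.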
>>
>> Also, can you do an
>> https://bh.hallikainen.org/wiki/index.php/HomePage?action=upgrade&overwrite=1
>> You have quite old versions of system files like PhpWikiAdministration.
>>
>> Best regards,
>>
>> Marc-Etienne
>>
>> -----Original Message-----
>> From: Harold Hallikainen <ha...@ha...>
>> Sent: Saturday, January 15, 2022 11:43 PM
>> To: php...@li...
>> Subject: Re: [Phpwiki-talk] Issues with Wiki Dump
>>
>> Thanks for the QUICK response!
>>
>> I ran
>>
>> dnf install php-zip
>> service httpd restart
>>
>> and it works!
>>
>> I am using the flat file database. In wiki_data, I find page_data and
>> ver_data. It LOOKS like ver_data has all versions of each page while
>> page_data has just the most current. I have tried deleting the files
>> in page_data and then looking at the wiki. The latest versions of the
>> pages still show up, and are then re-added to page_data.
>>
>> Since access data is stored in the page file, every time someone tries
>> to access a page that does not exist, it appears that a file is
>> created to hold that access data. As people try to hack the system,
>> I end up with files like those below in page_data.
>>
>> %27A
>> %28%29
>> %28CASE+WHEN+%283541%3D3541%29+THEN+3541+ELSE+3541%2A%28SELECT+3541+FROM+DUAL+UNION+SELECT+2885+FROM+DUAL%29+END%29
>> %28CASE+WHEN+%284882%3D2501%29+THEN+4882+ELSE+4882%2A%28SELECT+4882+FROM+DUAL+UNION+SELECT+2501+FROM+DUAL%29+END%29
>> %28CASE+WHEN+%289302%3D9302%29+THEN+SLEEP%2832%29+ELSE+9302+END%29
>> %28CASE+WHEN+4772%3D4772+THEN+4772+ELSE+NULL+END%29
>> %28CASE+WHEN+7219%3D9319+THEN+7219+ELSE+NULL+END%29
>> %28SELECT+%28CASE+WHEN+%283521%3D3521%29+THEN+%27RCA%27+ELSE+%28SELECT+5220+UNION+SELECT+3216%29+END%29%29
>> %28SELECT+%28CASE+WHEN+%289983%3D6874%29+THEN+%27RCA%27+ELSE+%28SELECT+6874+UNION+SELECT+8819%29+END%29%29
>> %28SELECT+CONCAT%280x7162716a71%2C%28ELT%284931%3D4931%2C1%29%29%2C0x717a767871%29%29
>> %28SELECT+CONCAT%280x716a6b7071%2C%28ELT%282020%3D2020%2C1%29%29%2C0x717a786b71%29%29
>> %28SELECT+CONCAT%280x7170717171%2C%28ELT%281739%3D1739%2C1%29%29%2C0x7171627a71%29%29
>> %28SELECT+CONCAT%280x7176627171%2C%28ELT%289244%3D9244%2C1%29%29%2C0x716a717a71%29%29
>> %28SELECT+CONCAT%280x71766a7871%2C%28ELT%285835%3D5835%2C1%29%29%2C0x7178787871%29%29
>> %28SELECT+CONCAT%280x717a7a6271%2C%28ELT%284238%3D4238%2C1%29%29%2C0x71626a6b71%29%29
>> %28SELECT+CONCAT%28CONCAT%28%27qbqjq%27%2C%28CASE+WHEN+%283392%3D3392%29+THEN+%271%27+ELSE+%270%27+END%29%29%2C%27qzvxq%27%29%29
>> %28SELECT+CONCAT%28CONCAT%28%27qjkpq%27%2C%28CASE+WHEN+%284051%3D4051%29+THEN+%271%27+ELSE+%270%27+END%29%29%2C%27qzxkq%27%29%29
>> %28SELECT+CONCAT%28CONCAT%28%27qpqqq%27%2C%28CASE+WHEN+%282945%3D2945%29+THEN+%271%27+ELSE+%270%27+END%29%29%2C%27qqbzq%27%29%29
>> %28SELECT+CONCAT%28CONCAT%28%27qvbqq%27%2C%28CASE+WHEN+%281904%3D1904%29+THEN+%271%27+ELSE+%270%27+END%29%29%2C%27qjqzq%27%29%29
>> %28SELECT+CONCAT%28CONCAT%28%27qvjxq%27%2C%28CASE+WHEN+%281582%3D1582%29+THEN+%271%27+ELSE+%270%27+END%29%29%2C%27qxxxq%27%29%29
>> %28SELECT+CONCAT%28CONCAT%28%27qzzbq%27%2C%28CASE+WHEN+%282980%3D2980%29+THEN+%271%27+ELSE+%270%27+END%29%29%2C%27qbjkq%27%29%29
>>
>> When I unzip the dump, I do not see those filenames, so that's good!
>> So, how do I get rid of these "hack" files? Maybe empty the page_data
>> and ver_data directories, then do a restore?
>>
>> Also, it would be great if an attempt to access a non-existent page
>> did not create a file for it.
>>
>> THANKS!
>>
>> Harold
>>
>>
>> On Sat, January 15, 2022 2:36 pm, Vargenau, Marc-Etienne (Nokia -
>> FR/Paris-Saclay) wrote:
>>> Hi Harold,
>>>
>>> 'ZipArchive' is a standard class of PHP since PHP 5.2.0:
>>> https://www.php.net/manual/en/class.ziparchive
>>> Probably your PHP is not compiled with that class.
>>> You should check with phpinfo().
>>>
>>> Best regards,
>>>
>>> Marc-Etienne
>>>
>>> -----Original Message-----
>>> From: Harold Hallikainen <ha...@ha...>
>>> Sent: Saturday, January 15, 2022 8:23 PM
>>> To: Harold Hallikainen <ha...@ha...>
>>> Cc: Discussion on PhpWiki features, bugs, development.
>>> <php...@li...>
>>> Subject: [Phpwiki-talk] Issues with Wiki Dump
>>>
>>> I am having issues with the wiki dump.
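[Editor's note: the file names listed above are URL-encoded SQL-injection probes ("+" encodes a space, %XX a byte). A small portable decoder makes that visible; the `urldecode` helper below is purely illustrative and not part of PhpWiki.]

```shell
# Decode a URL-encoded string: '+' -> space, %XX -> the byte with hex code XX.
urldecode() {
    printf '%s\n' "$1" | awk '{
        gsub(/\+/, " ")
        hex = "0123456789abcdef"; out = ""
        for (i = 1; i <= length($0); i++) {
            c = substr($0, i, 1)
            if (c == "%" && i + 2 <= length($0)) {
                h = tolower(substr($0, i + 1, 2))
                v = (index(hex, substr(h, 1, 1)) - 1) * 16 \
                    + index(hex, substr(h, 2, 1)) - 1
                out = out sprintf("%c", v); i += 2
            } else out = out c
        }
        print out
    }'
}

urldecode '%28CASE+WHEN+%289302%3D9302%29+THEN+SLEEP%2832%29+ELSE+9302+END%29'
# -> (CASE WHEN (9302=9302) THEN SLEEP(32) ELSE 9302 END)
```

Decoded, these are classic blind SQL-injection payloads (timing probes with SLEEP, boolean CASE tests), harmless to the flat-file backend but left behind as junk file names.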
>>> Here are the PHP error messages:
>>>
>>> [15-Jan-2022 18:55:17 UTC] PHP Fatal error: Uncaught Error: Class 'ZipArchive' not found in /home/harold/public_html/org/bh/wiki/lib/loadsave.php:217
>>> Stack trace:
>>> #0 /home/harold/public_html/org/bh/wiki/lib/main.php(1273): MakeWikiZip(Object(WikiRequest))
>>> #1 /home/harold/public_html/org/bh/wiki/lib/main.php(819): WikiRequest->action_zip()
>>> #2 /home/harold/public_html/org/bh/wiki/lib/main.php(1456): WikiRequest->handleAction()
>>> #3 /home/harold/public_html/org/bh/wiki/lib/main.php(1480): main()
>>> #4 /home/harold/public_html/org/bh/wiki/index.php(60): include('/home/harold/pu...')
>>> #5 {main}
>>> thrown in /home/harold/public_html/org/bh/wiki/lib/loadsave.php on line 217
>>> [15-Jan-2022 18:55:59 UTC] PHP Notice: Undefined index: MinPage in /home/harold/public_html/fr/index.php on line 9
>>> [15-Jan-2022 18:55:59 UTC] PHP Notice: Undefined index: MaxPage in /home/harold/public_html/fr/index.php on line 10
>>> [15-Jan-2022 18:56:00 UTC] PHP Notice: Undefined index: HOME in /home/harold/public_html/org/w6iwi/script/rbn.php on line 108
>>> [15-Jan-2022 18:56:05 UTC] PHP Notice: Undefined index: MinPage in /home/harold/public_html/fr/index.php on line 9
>>> [15-Jan-2022 18:56:05 UTC] PHP Notice: Undefined index: MaxPage in /home/harold/public_html/fr/index.php on line 10
>>> [15-Jan-2022 18:56:26 UTC] PHP Notice: Undefined index: MinPage in /home/harold/public_html/fr/index.php on line 9
>>> [15-Jan-2022 18:56:26 UTC] PHP Notice: Undefined index: MaxPage in /home/harold/public_html/fr/index.php on line 10
>>> [15-Jan-2022 18:56:35 UTC] PHP Fatal error: Uncaught Error: Class 'ZipArchive' not found in /home/harold/public_html/org/bh/wiki/lib/loadsave.php:217
>>> Stack trace:
>>> #0 /home/harold/public_html/org/bh/wiki/lib/main.php(1273): MakeWikiZip(Object(WikiRequest))
>>> #1 /home/harold/public_html/org/bh/wiki/lib/main.php(819):
>>> WikiRequest->action_zip()
>>> #2 /home/harold/public_html/org/bh/wiki/lib/main.php(1456): WikiRequest->handleAction()
>>> #3 /home/harold/public_html/org/bh/wiki/lib/main.php(1480): main()
>>> #4 /home/harold/public_html/org/bh/wiki/index.php(60): include('/home/harold/pu...')
>>> #5 {main}
>>> thrown in /home/harold/public_html/org/bh/wiki/lib/loadsave.php on line 217
>>>
>>> Using Dump To Directory, a lot of files (perhaps all) show up in the
>>> specified directory, but this error message appears at the bottom of
>>> the page:
>>>
>>> Fatal PhpWiki Error: couldn't open file "/home/harold/tmp/wikidump/" for writing
>>>
>>> I just updated to phpwiki-code-r10903-trunk and the errors still appear.
>>> I am going to try a restore from the dump to directory since it looks
>>> like it may be working.
>>>
>>> THANKS!
>>>
>>> Harold
>>>
>>> _______________________________________________
>>> Phpwiki-talk mailing list
>>> Php...@li...
>>> https://lists.sourceforge.net/lists/listinfo/phpwiki-talk
>>
>> --
>> FCC Rules Updated Daily at http://www.hallikainen.com
>> Not sent from an iPhone.
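[Editor's note: Marc-Etienne's diagnosis can be verified from the command line with `php -m | grep -i zip` or `php -r 'var_dump(class_exists("ZipArchive"));'` in addition to phpinfo(). The sketch below simulates the check against a saved module list so it runs without a PHP install; the module names are illustrative only.]

```shell
# Simulated output of `php -m`; on a real host, pipe `php -m` into the grep.
modules=$(mktemp)
cat > "$modules" <<'EOF'
Core
json
pcre
zip
EOF

# -i: case-insensitive, -x: match the whole line, -q: exit status only.
if grep -qix 'zip' "$modules"; then
    echo 'zip extension loaded; action=zip dumps should work'
else
    echo 'zip missing; on Fedora/RHEL: dnf install php-zip && service httpd restart'
fi
```

On the thread's Fedora-style host, installing php-zip and restarting httpd is exactly what resolved the "Class 'ZipArchive' not found" fatal error.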
>
> --
> FCC Rules Updated Daily at http://www.hallikainen.com
> Not sent from an iPhone.