From: PChot <pc...@gm...> - 2007-12-17 10:10:32
Hi, I found that there is a problem with the conversion of content inside <math></math> tags. The content should be left unchanged, but it is not: <math>a_2 + a^2</math> is converted to a\_2 + a\^{}2, when it should stay a_2 + a^2. I found this problem yesterday; there could be more like it. Regards, Jan
--
Macs are for those who don't want to know why their computer works. Linux is for those who want to know why their computer works. DOS is for those who want to know why their computer doesn't work. Windows is for those who don't want to know why their computer doesn't work.
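The escaping Jan reports presumably happens because the wiki-to-LaTeX conversion escapes special characters such as `_` and `^` globally, including inside `<math>` blocks whose content is already LaTeX. A minimal sketch of one possible fix, splitting out the math segments before escaping (illustrative only; `escape_outside_math` is a hypothetical helper, not the actual wiki2latex.py code):

```python
import re

def escape_outside_math(text):
    """Escape LaTeX specials like _ and ^ only outside <math>...</math>.

    Sketch of one possible fix: split the text on math tags, escape the
    plain-text parts, and pass math contents through untouched, since
    they are already LaTeX.
    """
    parts = re.split(r'(<math>.*?</math>)', text, flags=re.DOTALL)
    out = []
    for part in parts:
        if part.startswith('<math>') and part.endswith('</math>'):
            # keep the inner LaTeX verbatim, wrapped for inline math
            out.append('$' + part[len('<math>'):-len('</math>')] + '$')
        else:
            out.append(part.replace('_', r'\_').replace('^', r'\^{}'))
    return ''.join(out)
```

Passing the math segments through untouched would leave `a_2 + a^2` exactly as Jan expects, while prose outside the tags is still escaped.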
From: PChot <pc...@gm...> - 2007-11-20 22:54:07
Hi all, I'm always up for a debate. Here are my arguments:

1. On our system, everyone wants to download data and pages (including with auto-downloaders), but nobody wants to add or correct anything. Let me say that the users are very selfish. Here I see print-to-PDF as a reward for the people who also write.

2. My second argument is server power. We cannot afford the high-performance server that mass PDF rendering would require: we have about 10k pages that could be converted to PDFs, and usage comes in massive peaks.

So I see WikiPDF as a tool for setting up a wiki. That's it for now. Regards, Jan

On Nov 20, 2007 11:00 PM, patrick jayet <pa...@ja...> wrote:
> Hi Jan,
>
> Sorry, I have not looked at this idea yet. Let me ask the question to
> the others, to see what they think.
>
> > 2. Is it possible, and have you thought about, making PDF generation
> > a privilege in MediaWiki? Here I mean that I would like to have a
> > special group that can generate PDFs. Something like:
> > $wgGroupPermissions['pdfgroup' ]['pdf'] = true;
>
> @Jan, @all
> What do you think about this suggestion? Would it be necessary for
> you to have that permission check? I don't exactly see why we should
> restrict the use of the PDF export if a user can access the
> information anyway. Do you have use cases where it would make sense
> or be necessary?
>
> Cheers,
> Pat
From: patrick j. <pa...@ja...> - 2007-11-20 22:01:18
Hi Jan,

Sorry, I have not looked at this idea yet. Let me ask the question to the others, to see what they think.

> 2. Is it possible, and have you thought about, making PDF generation a
> privilege in MediaWiki? Here I mean that I would like to have a special
> group that can generate PDFs. Something like:
> $wgGroupPermissions['pdfgroup' ]['pdf'] = true;

@Jan, @all
What do you think about this suggestion? Would it be necessary for you to have that permission check? I don't exactly see why we should restrict the use of the PDF export if a user can access the information anyway. Do you have use cases where it would make sense or be necessary?

Cheers,
Pat
From: PChot <pc...@gm...> - 2007-11-19 11:48:44
Hi, I'm looking at wikiPDF a little again. Patrick, did you have time to check my suggestion about the PDF permission? Can anything be done? (At the moment that is the only thing preventing its use in our production wiki.) Thanks. Regards, Jan
From: pajai <pa...@us...> - 2007-09-25 20:07:38
Update of /cvsroot/wikipdf/extension/config In directory sc8-pr-cvs9.sourceforge.net:/tmp/cvs-serv24381/config Modified Files: site.cfg Log Message: typo Index: site.cfg =================================================================== RCS file: /cvsroot/wikipdf/extension/config/site.cfg,v retrieving revision 1.3 retrieving revision 1.4 diff -C2 -d -r1.3 -r1.4 *** site.cfg 25 Sep 2007 20:02:57 -0000 1.3 --- site.cfg 25 Sep 2007 20:07:32 -0000 1.4 *************** *** 300,304 **** image = Bild special = Spezial ! baseurl = https://wiki.bsiag.com/wiki/index.php/ realname = German --- 300,304 ---- image = Bild special = Spezial ! baseurl = http://de.wikipedia.org/wiki/ realname = German |
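The per-language sections in config/site.cfg that commits like this one touch follow a simple INI-style layout. A hedged sketch of reading such a section from Python, using the [de] values as they stand after this commit (treating the file as standard INI is an assumption; the actual wiki2latex scripts may parse it differently):

```python
import configparser

# Minimal sketch: reading one per-language section of wikipdf's site.cfg.
# Keys are those visible in the diff above.
cfg = configparser.ConfigParser()
cfg.read_string("""
[de]
babel = ngerman
image = Bild
special = Spezial
baseurl = http://de.wikipedia.org/wiki/
realname = German
""")

print(cfg["de"]["babel"])    # ngerman
print(cfg["de"]["baseurl"])  # http://de.wikipedia.org/wiki/
```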
From: pajai <pa...@us...> - 2007-09-25 20:03:57
Update of /cvsroot/wikipdf/extension In directory sc8-pr-cvs9.sourceforge.net:/tmp/cvs-serv22737 Modified Files: CHANGELOG wikipdf.php Log Message: updated wikipdf.php for mw 1.11 (hook slightly changed + include adapted) Index: CHANGELOG =================================================================== RCS file: /cvsroot/wikipdf/extension/CHANGELOG,v retrieving revision 1.3 retrieving revision 1.4 diff -C2 -d -r1.3 -r1.4 *** CHANGELOG 19 Sep 2007 21:50:16 -0000 1.3 --- CHANGELOG 25 Sep 2007 20:03:51 -0000 1.4 *************** *** 1,2 **** --- 1,7 ---- + v.0.06.1: 2007 September 24 (Patrick Jayet, <pa...@ex...>) + - Adapted for Mediawiki 1.11: + - Hook mechanism slightly changed + - The file Image.php is called now ImagePage.php (for include in wiki.pdf) + v.0.05.1: 2007 June 10 (Patrick Jayet, <pa...@ex...>) - WikiPDF is now platform independant (works for a Windows and Unix/Linux server). Index: wikipdf.php =================================================================== RCS file: /cvsroot/wikipdf/extension/wikipdf.php,v retrieving revision 1.18 retrieving revision 1.19 diff -C2 -d -r1.18 -r1.19 *** wikipdf.php 19 Sep 2007 21:49:45 -0000 1.18 --- wikipdf.php 25 Sep 2007 20:03:51 -0000 1.19 *************** *** 63,69 **** </FORM> '); ! } ! ! if ($action == 'createpdf'){ global $wgRequest; global $wgLaTeXTemplate; --- 63,67 ---- </FORM> '); ! } else if ($action == 'createpdf'){ global $wgRequest; global $wgLaTeXTemplate; *************** *** 76,80 **** --- 74,84 ---- $wgOut->setSubtitle( $PDFtitle ); pdf_create( $PDFtitle, $wgArticle ); + + // else: no such action + } else { + return true; } + + return false; } *************** *** 92,96 **** global $wgLaTeXTemplate; ! require_once( 'Image.php' ); $ext_dir = "extensions".DIRECTORY_SEPARATOR."wikipdf"; --- 96,100 ---- global $wgLaTeXTemplate; ! require_once( 'ImagePage.php' ); $ext_dir = "extensions".DIRECTORY_SEPARATOR."wikipdf"; |
From: pajai <pa...@us...> - 2007-09-25 20:03:07
Update of /cvsroot/wikipdf/extension/config In directory sc8-pr-cvs9.sourceforge.net:/tmp/cvs-serv22626/config Modified Files: site.cfg Log Message: site param for german without bgdblquote and enddblquote Index: site.cfg =================================================================== RCS file: /cvsroot/wikipdf/extension/config/site.cfg,v retrieving revision 1.2 retrieving revision 1.3 diff -C2 -d -r1.2 -r1.3 *** site.cfg 6 Mar 2006 04:15:41 -0000 1.2 --- site.cfg 25 Sep 2007 20:02:57 -0000 1.3 *************** *** 297,306 **** [de] - bgdblquote = \\glqq{} babel = ngerman image = Bild special = Spezial ! baseurl = http://de.wikipedia.org/wiki/ ! enddblquote = \\grqq{} realname = German --- 297,304 ---- [de] babel = ngerman image = Bild special = Spezial ! baseurl = https://wiki.bsiag.com/wiki/index.php/ realname = German |
From: patrick j. <pa...@ja...> - 2007-09-23 12:44:24
Hi all,

In the meantime I responded to Jan, telling him the new version was worth testing and should be quite stable. I unfortunately noticed that I didn't reply to the mailing list itself. Has anybody else tested the new version? Feedback welcome.

Cheers,
Patrick
From: PChot <pc...@gm...> - 2007-09-20 09:14:31
Hi, I just saw that a new commit was pushed to SourceForge. Is WikiPDF now working to the point that it is possible to test it? Have a nice day, Jan
From: pajai <pa...@us...> - 2007-09-19 21:50:24
Update of /cvsroot/wikipdf/extension In directory sc8-pr-cvs9.sourceforge.net:/tmp/cvs-serv8045 Modified Files: INSTALL CHANGELOG README Log Message: readme files Index: README =================================================================== RCS file: /cvsroot/wikipdf/extension/README,v retrieving revision 1.2 retrieving revision 1.3 diff -C2 -d -r1.2 -r1.3 *** README 10 Mar 2006 03:53:47 -0000 1.2 --- README 19 Sep 2007 21:50:16 -0000 1.3 *************** *** 1,4 **** ! wikiPDF extension for Mediawiki v.0.04.1 (a lot based on wiki2PDF) converts Mediawiki-articles to LaTeX or PDF --- 1,4 ---- ! wikiPDF extension for Mediawiki v.0.05.1 (a lot based on wiki2PDF) converts Mediawiki-articles to LaTeX or PDF *************** *** 6,9 **** --- 6,10 ---- (c) 2004 Stephan Walter (wiki2PDF) (c) 2006 Felipe Corrêa da Silva Sanches (mediawiki extension) + (c) 2007 Patrick Jayet Licensed under the GNU General Public License (GPL) *************** *** 24,27 **** --- 25,29 ---- Lexor <lex...@my...> Andi Albrecht <and...@we...> + Patrick Jayet <pa...@ex...> December 5th 2005: *************** *** 30,33 **** --- 32,40 ---- Integrated to the mediawiki interface by Felipe Corrêa da Silva Sanches <fel...@gm...> + June 10th, 2007 + Script made platform independant (Linux/Windows) + Lots of bugfixes + Tuning of the latex templates + INSTALL instructions extended =============================================================================== Index: INSTALL =================================================================== RCS file: /cvsroot/wikipdf/extension/INSTALL,v retrieving revision 1.1.1.1 retrieving revision 1.2 diff -C2 -d -r1.1.1.1 -r1.2 *** INSTALL 4 Mar 2006 18:18:23 -0000 1.1.1.1 --- INSTALL 19 Sep 2007 21:50:16 -0000 1.2 *************** *** 1,5 **** WikiPDF is a Mediawiki extension. Unpack it on the extensions directory of your mediawiki installation. Add the folowing to the end of your LocalSettings.php: ! 
-------------------------------------- $wgWikiURL = "<put here the URL that you want to see in the PDF>"; $wgPDFMessage = "<message you want to see in the header of the PDF>"; --- 1,45 ---- + + This files describes the installation of WikiPDF, either on Linux or Windows, along with its dependencies. + + + = Dependencies = + + It is assumed that you have already mediawiki installed and thus a recent version of Apache and PHP running. + + The dependencies specifically needed by WikiPDF are the following. + + + == Imagemagick == + + For image conversion. Installation under windows through the visual installer (http://www.imagemagick.org) and under linux with your favorite package manager, e.g. using apt: + + apt-get install imagemagick + + + == Latex == + + For the Latex to PDF conversion, the pdflatex command is needed. + + On Windows, download and install MikTex. + + On Linux, install tetex: + + apt-get install tetex + + + == Python == + + On windows, download and install the python interpreter from http://www.python.org/download/ + + On linux, python should normally already be installed. If not, you can install it with your favorite package manager, e.g. for apt: + + apt-get install python + + + = WikiPDF Installation = + WikiPDF is a Mediawiki extension. Unpack it on the extensions directory of your mediawiki installation. Add the folowing to the end of your LocalSettings.php: ! --- $wgWikiURL = "<put here the URL that you want to see in the PDF>"; $wgPDFMessage = "<message you want to see in the header of the PDF>"; *************** *** 7,20 **** require_once( "extensions/wikipdf/wikipdf.php" ); ! ------------------------------------- Templates available in this package are: "article", "minimal" and "twocolumn". 
Each of these templates will give you a different formatting on your PDF - $wgWikiURL is the URL to your wiki ( $wgWikiURL = "http://aluno.no-ip.info/juca" in the case on my wiki) - It will be used in the links for other articles of your wiki that will appear in the PDF. ! $wgPDFMessage is the message that will appear in the header of the PDF created ! There are some other optional variables. Take a look at http://aluno.no-ip.info/juca/index.php/WikiPDF ! Please, report bugs to fel...@gm... --- 47,89 ---- require_once( "extensions/wikipdf/wikipdf.php" ); ! --- ! Templates available in this package are: "article", "minimal" and "twocolumn". Each of these templates will give you a different formatting on your PDF ! = Variables Explanation = ! * $wgWikiURL is the URL to your wiki ( $wgWikiURL = "http://aluno.no-ip.info/juca" in the case on my wiki). It will be used in the links for other articles of your wiki that will appear in the PDF. ! * $wgPDFMessage is the message that will appear in the header of the PDF created ! ! There are some other optional variables. Take a look at http://aluno.no-ip.info/juca/index.php/WikiPDF (fixme: links broken) ! ! ! = Tested Configuration = ! ! WikiPDF has been sucessfully tested on various Linux and Windows systems including Ubuntu Dapper and Windows Server 2003. ! ! ! = Troubleshooting = ! ! If there is a problem during PDF generation, first check the 'error log w2latex' logs to see if there has been a problem during Latex code generation. There might be a problem with a special wiki page which format is legal but not yet supported by WikiPDF. ! ! If no problem was detected during the last step, you can check the pdflatex logs, to see if there was a problem during PDF generation (An image might be missing or an illegal formatting element in the document, see below, might be the cause). Another problem might be that there is a latex package missing (not included with your latex installation). 
You can either install it for your latex distribution or simply put the corresponding file (which name appears in pdflatex logs) in the latex folder within wikipdf. ! ! ! = Problematic Formatting = ! ! The following formatting elements are known to make problems to WikiPDF: ! * Nested tables ! * Numbered or bullet lists within a table cell ! * Code block (line(s) starting with a space) within a table cell ! ! ! Please, report bugs to ! * fel...@gm... ! * pa...@ex... ! ! ! PJ / 2007 Index: CHANGELOG =================================================================== RCS file: /cvsroot/wikipdf/extension/CHANGELOG,v retrieving revision 1.2 retrieving revision 1.3 diff -C2 -d -r1.2 -r1.3 *** CHANGELOG 10 Mar 2006 03:53:47 -0000 1.2 --- CHANGELOG 19 Sep 2007 21:50:16 -0000 1.3 *************** *** 1,13 **** v.0.04.1: 2006 March 10 ! -added options to the ui. ! -EMERGENCIAL: corrected bug that caused generation of MainPage PDF even if you choose other page. v.0.04: 2006 March 9 ! -implemented user interface v. 0.03.2: 2006 12 january ! -added some language codes to the babel list ! -corrected bug that made wikipdf not work when $wgScriptPath had more than one directory. Thanks to PChott ! -added $'s to the variable names in the default messages v. 0.03.1: ! -corrected "line 21" issue. Thanks to Yaroslav Fedevych. ! --- 1,20 ---- + v.0.05.1: 2007 June 10 (Patrick Jayet, <pa...@ex...>) + - WikiPDF is now platform independant (works for a Windows and Unix/Linux server). + - Lots of small bugs fixed in the python script wiki2latex.py. + - Unicode2Latex script rewritten to be faster (was a problem with huge wiki pages) + - Tuned latex templates + latex formatting of wiki elements + v.0.04.1: 2006 March 10 ! - added options to the ui. ! - EMERGENCIAL: corrected bug that caused generation of MainPage PDF even if you choose other page. ! v.0.04: 2006 March 9 ! - implemented user interface ! v. 0.03.2: 2006 12 january ! - added some language codes to the babel list ! 
- corrected bug that made wikipdf not work when $wgScriptPath had more than one directory. Thanks to PChott ! - added $'s to the variable names in the default messages v. 0.03.1: ! - corrected "line 21" issue. Thanks to Yaroslav Fedevych. |
From: pajai <pa...@us...> - 2007-09-19 21:49:51
Update of /cvsroot/wikipdf/extension In directory sc8-pr-cvs9.sourceforge.net:/tmp/cvs-serv7673 Modified Files: wikipdf.php Log Message: wikipdf php script updated Index: wikipdf.php =================================================================== RCS file: /cvsroot/wikipdf/extension/wikipdf.php,v retrieving revision 1.17 retrieving revision 1.18 diff -C2 -d -r1.17 -r1.18 *** wikipdf.php 27 Mar 2006 17:44:23 -0000 1.17 --- wikipdf.php 19 Sep 2007 21:49:45 -0000 1.18 *************** *** 1,19 **** <?php ! function pdf_tab($content_actions) { global $wgUser; global $wgTitle; ! global $wgAllowAnonymousPDF; ! if( $wgTitle->getNamespace() != NS_SPECIAL and ($wgUser->getID() != 0 or $wgAllowAnonymousPDF) and $action != 'submit' and $action != 'edit') { ! $content_actions['pdf'] = array( 'class' => ($action == 'pdf') ? 'selected' : false, 'text' => "PDF", 'href' => $wgTitle->getLocalUrl( 'action=pdf' ) ); ! } } function unknown_action($action, $wgArticle){ global $wgLogo; global $wgSitename; --- 1,27 ---- <?php + + // Patrick Jayet <pa...@ex...>: this code is now platform independant (windows/unix), + // fixed a couple of bugs (2007), cleaned the Latex templates ! /* show a new pdf tab */ ! function pdf_tab(&$content_actions) { global $wgUser; global $wgTitle; ! global $action; ! #$wgTitle->getNamespace() != NS_SPECIAL and $wgUser->getID() != 0 and ! if($action != 'submit' and $action != 'edit') { ! $content_actions['pdf'] = array( 'class' => ($action == 'pdf') ? 'selected' : false, 'text' => "PDF", 'href' => $wgTitle->getLocalUrl( 'action=pdf' ) ); ! } + return true; } + /* handler for the additional actions */ function unknown_action($action, $wgArticle){ + global $wgLogo; global $wgSitename; *************** *** 24,67 **** global $wgLicense; global $wgScriptPath; - global $wgExtraNamespaces; if ($action == 'pdf'){ #thanks to Yaroslav Fedevych for his PHP advices $bloodyTempVar = $wgArticle->getTitle(); $title = $bloodyTempVar->GetText(); ! 
$NameSp=$wgExtraNamespaces[$bloodyTempVar->getNamespace()]; ! if ($NameSp!=''){ ! $title=$wgExtraNamespaces[$bloodyTempVar->getNamespace()].":".$title; ! } else { ! if ($bloodyTempVar->getNamespace()==4){ ! $title=$wgSitename.":".$title; ! } ! } ! global $wgOut; ! $wgOut->setPagetitle( "WikiPDF" ); ! $wgOut->setSubtitle( "From article: ".$title ); $wgOut->addHTML( '<FORM action="'.$wgScriptPath.'/index.php" method="post" > ! <INPUT type="hidden" name="title" value="'.$title.'"/><br> ! Alternative title:<br> ! <INPUT type="text" name="PDFtitle" value="'.$title.'"/><br>'); ! global $wgDisablePDFHeaderEdit; ! if (!$wgDisablePDFHeaderEdit){ ! $wgOut->addHTML('Document`s header message:<br> ! <TEXTAREA rows="3" name="PdfMsg">'.$wgPDFMessage.'</TEXTAREA><br>'); ! } ! $wgOut->addHTML('LaTeX template:<br> ! <SELECT name="template">'); ! exec('ls extensions/wikipdf/latex/ | grep .tex', $lsoutput, $return); ! foreach( $lsoutput as $a ){ ! $wgOut->addHTML('<OPTION value="'.$a.'">'.$a.'</OPTION>'); ! } ! $wgOut->addHTML(' ! </SELECT> ! <SELECT name="license"> ! <OPTION value="">No License</OPTION> ! <OPTION value="gnu-fdl">GNU FDL</OPTION> </SELECT> <INPUT type="submit" label="Create a PDF" value="createpdf" name="action"/> --- 32,62 ---- global $wgLicense; global $wgScriptPath; if ($action == 'pdf'){ #thanks to Yaroslav Fedevych for his PHP advices + //print_r($wgArticle); // ok + //echo "Content: ". $wgArticle->getContent( false ); // ok $bloodyTempVar = $wgArticle->getTitle(); $title = $bloodyTempVar->GetText(); ! global $wgOut; ! $wgOut->setPagetitle( "WikiPDF" ); ! $wgOut->setSubtitle( $title ); $wgOut->addHTML( '<FORM action="'.$wgScriptPath.'/index.php" method="post" > ! From article:<br> ! <TEXTAREA name="title">'.$title.'</TEXTAREA><br> ! Alternative title to be displayed on the document:<br> ! <TEXTAREA name="PDFtitle">'.$title.'</TEXTAREA><br> ! Message that will be displayed on the document header:<br> ! 
<TEXTAREA rows="3" name="PdfMsg">'.$wgPDFMessage.'</TEXTAREA><br> ! Select the desired LaTeX template to be used:<br> ! <SELECT name="template"> ! <!--<OPTION value="book">Book</OPTION>--> ! <OPTION value="article">Article</OPTION> ! <!--OPTION value="compact">Compact</OPTION>--> ! <OPTION value="twocolumn">Two Column</OPTION> ! <OPTION value="minimal">Minimal</OPTION> </SELECT> <INPUT type="submit" label="Create a PDF" value="createpdf" name="action"/> *************** *** 70,97 **** } - if ($action == 'createpdf'){ global $wgRequest; global $wgLaTeXTemplate; global $wgPDFMessage; - global $wgDisablePDFHeaderEdit; - global $wgLicense; - - if (!$wgDisablePDFHeaderEdit){ - $wgPDFMessage = $wgRequest->getVal( 'PdfMsg' ); - } $wgLaTeXTemplate = $wgRequest->getVal( 'template' ); $PDFtitle = $wgRequest->getVal( 'PDFtitle' ); - $wgLicense = $wgRequest->getVal( 'license' ); global $wgOut; $wgOut->setPagetitle( "WikiPDF" ); $wgOut->setSubtitle( $PDFtitle ); ! pdf_create( $PDFtitle ); } } ! function pdf_create($title){ global $wgOut; - global $wgArticle; global $wgLogo; global $wgSitename; --- 65,85 ---- } if ($action == 'createpdf'){ global $wgRequest; global $wgLaTeXTemplate; global $wgPDFMessage; $wgLaTeXTemplate = $wgRequest->getVal( 'template' ); + $wgPDFMessage = $wgRequest->getVal( 'PdfMsg' ); $PDFtitle = $wgRequest->getVal( 'PDFtitle' ); global $wgOut; $wgOut->setPagetitle( "WikiPDF" ); $wgOut->setSubtitle( $PDFtitle ); ! pdf_create( $PDFtitle, $wgArticle ); } } ! /* main function to do the wiki2latex2pdf transformation */ ! function pdf_create($title, $wgArticle){ global $wgOut; global $wgLogo; global $wgSitename; *************** *** 106,131 **** require_once( 'Image.php' ); ! $ext_dir = "extensions/wikipdf"; ! #Deleting old tempdirs ! foreach (glob($ext_dir."/tmp/tmp-*") as $tempdir){ ! $age = time() - filectime($tempdir); ! if ($age > 1200) { ! $handle = opendir($tempdir); ! while($filename = readdir($handle)) { ! if ($filename != "." && $filename != "..") { ! 
unlink($tempdir."/".$filename); } ! } ! closedir($handle); ! rmdir($tempdir); } } // make a temporary directory with an unique name ! $mytemp = $ext_dir."/tmp/tmp-".time()."-".rand(); mkdir($mytemp); chmod($mytemp, 0777); ! $code = $wgLanguageCode; $article_f = fopen($mytemp."/wiki_code",'w'); --- 94,123 ---- require_once( 'Image.php' ); ! $ext_dir = "extensions".DIRECTORY_SEPARATOR."wikipdf"; ! $ext_dir_url = "extensions/wikipdf"; ! #Deleting old tempdirs ! foreach (glob($ext_dir."/tmp/*") as $tempdir){ ! $age = time() - filectime($tempdir); ! if ($age > 1200) { ! $handle = opendir($tempdir); ! while($filename = readdir($handle)) { ! if ($filename != "." && $filename != "..") { ! unlink($tempdir."/".$filename); ! } } ! closedir($handle); ! rmdir($tempdir); } } // make a temporary directory with an unique name ! $nounce = time()."-".rand(); ! $mytemp = $ext_dir.DIRECTORY_SEPARATOR."tmp".DIRECTORY_SEPARATOR.$nounce; ! $mytempurl = $ext_dir_url."/tmp/".$nounce; ! mkdir($mytemp); chmod($mytemp, 0777); ! $code = $wgLanguageCode; $article_f = fopen($mytemp."/wiki_code",'w'); *************** *** 134,142 **** fclose($article_f); ! $latex_f = fopen($ext_dir."/latex/".$wgLaTeXTemplate, "r"); ! $latex = fread($latex_f, filesize($ext_dir."/latex/".$wgLaTeXTemplate)); fclose($latex_f); ! ! $LanguageName['ca'] = "catalan"; $LanguageName['de'] = "german"; $LanguageName['en'] = "english"; --- 126,133 ---- fclose($article_f); ! $latex_f = fopen($ext_dir."/latex/".$wgLaTeXTemplate.".tex", "r"); ! $latex = fread($latex_f, filesize($ext_dir."/latex/".$wgLaTeXTemplate.".tex")); fclose($latex_f); ! $LanguageName['de'] = "german"; $LanguageName['en'] = "english"; *************** *** 146,151 **** $LanguageName['pt'] = "portuguese"; $LanguageName['sl'] = "slovene"; ! $LanguageName['ru'] = "russian"; ! if ($wgWikiURL == ""){ $wgWikiURL = "Set \\\\$ \!wgWikiURL on LocalSettings.php"; --- 137,141 ---- $LanguageName['pt'] = "portuguese"; $LanguageName['sl'] = "slovene"; ! 
if ($wgWikiURL == ""){ $wgWikiURL = "Set \\\\$ \!wgWikiURL on LocalSettings.php"; *************** *** 162,175 **** $dotdots = $dotdots."../"; } ! $latex = preg_replace("/%!LogoFilename/", $dotdots.$wgLogo, $latex); $latex = preg_replace("/%!Sitename/", $wgSitename, $latex); $latex = preg_replace("/%!WikiLanguage/", $LanguageName[$wgLanguageCode], $latex); $latex = preg_replace("/%!WikiURL/", $wgWikiURL, $latex); $latex = preg_replace("/%!InputEnc/", $wgInputEnc, $latex); $latex = preg_replace("/%!License/", $wgLicenseLaTeX, $latex); ! // TODO: get "Image:" code from config file ! preg_match_all ("/\[\[Image:([^\[\|]*)\|*([0-9]+px)?\|?[^\[]*\]\]/", $content, $matches,PREG_SET_ORDER); foreach ( $matches as $match ){ --- 152,166 ---- $dotdots = $dotdots."../"; } ! $latex = preg_replace("/%!LogoFilename/", $dotdots.$wgLogo, $latex); $latex = preg_replace("/%!Sitename/", $wgSitename, $latex); $latex = preg_replace("/%!WikiLanguage/", $LanguageName[$wgLanguageCode], $latex); $latex = preg_replace("/%!WikiURL/", $wgWikiURL, $latex); + $latex = preg_replace("/%!PDFMessage/", $wgPDFMessage, $latex); $latex = preg_replace("/%!InputEnc/", $wgInputEnc, $latex); $latex = preg_replace("/%!License/", $wgLicenseLaTeX, $latex); ! // TODO: get "Image:" code from config file ! preg_match_all ("/\[\[Image:([^\[\|]*)\|([0-9]+px)?\|?[^\[]*\]\]/", $content, $matches,PREG_SET_ORDER); foreach ( $matches as $match ){ *************** *** 192,196 **** default: $target_type = ".".$image_split[2]; ! $do_convert = FALSE; } --- 183,189 ---- default: $target_type = ".".$image_split[2]; ! // pat: we want to copy the image anyway ! // TODO: implement a proper copy ! //$do_convert = FALSE; } *************** *** 199,245 **** if ($pixels == ""){ $system_command = "convert \"".wfImageDir($image)."/".$image."\" \"".$IP."/".$mytemp."/".$sanitized_name.$target_type."\""; ! 
} else { $system_command = "convert \"".wfImageThumbDir($image)."/".$pixels."-".$image."\" \"".$IP."/".$mytemp."/".$sanitized_name.$target_type."\""; - } ! system ($system_command); } - } ! $latex_file = fopen($mytemp."/t.tex", "w"); ! fwrite( $latex_file, $latex); fclose( $latex_file ); ! $pdfmessage_f = fopen($mytemp."/pdfmessage.aux", "w"); ! fwrite( $pdfmessage_f, $wgPDFMessage); ! fclose( $pdfmessage_f ); ! ! system("extensions/wikipdf/src/wiki2latex.py ".$code." ".$wgLaTeXTemplate." ".$mytemp." \"".$title."\" >& ".$mytemp."/erro.w2latex"); ! ! $runpdflatex = "TEXMFOUTPUT=".$mytemp." TEXINPUTS=extensions/wikipdf/latex:extensions/wikipdf/latex/includes:extensions/wikipdf/font:".$mytemp.": TEXFONTS=extensions/wikipdf/font: pdflatex --interaction nonstopmode ".$mytemp."/t.tex >".$mytemp."/t.log"; // Run pdflatex three times so that all references (TOC, etc.) are resolved. ! system($runpdflatex); ! system($runpdflatex); ! system($runpdflatex); ! system("mv ".$mytemp."/t.pdf ".$mytemp."/\"".$title.".pdf\""); ! system("mv ".$mytemp."/t.tex ".$mytemp."/\"".$title.".tex\""); ! system("mv ".$mytemp."/t.aux ".$mytemp."/\"".$title.".aux\""); ! system("mv ".$mytemp."/t.log ".$mytemp."/\"".$title.".log\""); ! // Just a hack so that non-root can delete stale tempdirs on the server foreach (glob($mytemp."/*") as $tfile) ! chmod($tfile, 0777); ! if ( filesize($mytemp."/erro.w2latex") != 0){ ! $wgOut->addHTML( '<img src="extensions/wikipdf/misc/images/Nuvola_mimetypes_core.png"><br>Ooops! Something is not working. It is probable that the parser have failed. Take a look at the <a href="'.$mytemp.'/erro.w2latex">parser`s error log</a>. If possible, report this error on <a href="http://sf.net/projects/wikipdf/">SourceForge</a>.' ); ! } elseif( filesize($mytemp.'/'.$title.'.pdf') == 0){ ! $wgOut->addHTML( '<img src="extensions/wikipdf/misc/images/Nuvola_mimetypes_core.png"><br>Ooops! Something is not working properly! This can be caused by a malformed LaTeX file. 
Or, maybe, I am just a bit confused :-) Take a look at the <a href="'.$mytemp.'/'.$title.'.log">pdflatex log</a>. If possible report this error on <a href="http://sf.net/projects/wikipdf">SourceForge</a>.' ); ! } else { ! $wgOut->addHTML( '<a href="'.$mytemp.'/'.rawurlencode($title).'.pdf"><img src="extensions/wikipdf/misc/images/Nuvola_mimetypes_pdf.png"></img></a><a href="'.$mytemp.'/'.rawurlencode($title).'.tex"><img src="extensions/wikipdf/misc/images/Nuvola_mimetypes_tex.png"></img></a><br>' ); ! } } $wgHooks['SkinTemplateContentActions'][] = 'pdf_tab'; $wgHooks['UnknownAction'][] = 'unknown_action'; --- 192,264 ---- if ($pixels == ""){ $system_command = "convert \"".wfImageDir($image)."/".$image."\" \"".$IP."/".$mytemp."/".$sanitized_name.$target_type."\""; ! }else{ $system_command = "convert \"".wfImageThumbDir($image)."/".$pixels."-".$image."\" \"".$IP."/".$mytemp."/".$sanitized_name.$target_type."\""; } ! system ($system_command); ! // pat ! // echo $system_command; } ! } ! ! // linux ! if (DIRECTORY_SEPARATOR == "/") ! $execprefix = ""; ! // windows ! else ! // pat: for windows we need this, otherwise stderr redirection does not work ! $execprefix = getenv( "COMSPEC" ) . " /C "; ! ! $com = $execprefix . "extensions".DIRECTORY_SEPARATOR."wikipdf".DIRECTORY_SEPARATOR."src".DIRECTORY_SEPARATOR."wiki2latex.py ".$code." ".$wgLaTeXTemplate." ".$mytemp." \"".$title."\" > $mytemp".DIRECTORY_SEPARATOR."erro.w2latex 2>&1"; ! // debug ! // echo $com."\n"; ! exec($com); ! ! $latex_file = fopen($mytemp."/t.tex", "r"); ! $latex_content = fread( $latex_file, filesize($mytemp."/t.tex")); fclose( $latex_file ); + + $latex = preg_replace( "/%!Content/", $latex_content, $latex); + + $latex_file = fopen($mytemp."/t.tex", "w"); + fwrite( $latex_file, $latex ); + fclose ( $latex_file ); ! // pat: on linux (with standard pdflatex) ! if (DIRECTORY_SEPARATOR == "/") ! $runpdflatex = "TEXMFOUTPUT=".$mytemp." 
TEXINPUTS=extensions/wikipdf/latex:extensions/wikipdf/font:".$mytemp.": TEXFONTS=extensions/wikipdf/font: pdflatex --interaction nonstopmode ".$mytemp."/t.tex >".$mytemp."/t.log"; ! // on windows (using pdflatex from miktex) ! else ! $runpdflatex = $execprefix . "pdflatex --interaction nonstopmode --aux-directory=".$mytemp." --output-directory=".$mytemp." --include-directory=extensions\wikipdf\latex --include-directory=extensions\wikipdf\font --include-directory=".$mytemp." ".$mytemp."\\t.tex >> $mytemp\pdflatex.err 2>&1"; ! // debug ! // echo $runpdflatex."\n"; // Run pdflatex three times so that all references (TOC, etc.) are resolved. ! exec($runpdflatex); ! exec($runpdflatex); ! exec($runpdflatex); ! // windows ! if (DIRECTORY_SEPARATOR == "\\") ! $mv = "move"; ! // unix ! else ! $mv = "mv"; ! // moves files from within the temp dir ! exec($mv." ".$mytemp.DIRECTORY_SEPARATOR."t.pdf ".$mytemp.DIRECTORY_SEPARATOR."\"".$title.".pdf\""); ! exec($mv." ".$mytemp.DIRECTORY_SEPARATOR."t.tex ".$mytemp.DIRECTORY_SEPARATOR."\"".$title.".tex\""); ! exec($mv." ".$mytemp.DIRECTORY_SEPARATOR."t.aux ".$mytemp.DIRECTORY_SEPARATOR."\"".$title.".aux\""); ! exec($mv." ".$mytemp.DIRECTORY_SEPARATOR."t.log ".$mytemp.DIRECTORY_SEPARATOR."\"".$title.".log\""); ! ! // Just a hack so that non-root can delete stale tempdirs on the server foreach (glob($mytemp."/*") as $tfile) ! chmod($tfile, 0777); ! $wgOut->addHTML( '<a href="'.$mytempurl.'/'.$title.'.pdf">PDF</a><p><small><a href="'.$mytempurl.'/'.$title.'.tex">LaTeX</a> -- <a href="'.$mytempurl.'/'.$title.'.log">pdflatex log</a></small> -- <a href="'.$mytempurl.'/erro.w2latex">error log w2latex</a></small>' ); ! } + /* register hooks for new tab and new actions */ $wgHooks['SkinTemplateContentActions'][] = 'pdf_tab'; $wgHooks['UnknownAction'][] = 'unknown_action'; |
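One housekeeping detail in the wikipdf.php diff above: before each run it sweeps extensions/wikipdf/tmp, deletes any temp directory older than 1200 seconds, then creates a fresh uniquely named one from time()."-".rand(). A rough Python equivalent of that logic (a sketch only; the extension does this in PHP with filectime/unlink/rmdir, and the path and age threshold are taken from the diff):

```python
import os
import shutil
import tempfile
import time

MAX_AGE = 1200  # seconds, as in wikipdf.php

def clean_and_make_tempdir(base):
    """Delete stale temp dirs under `base`, then create a fresh unique one."""
    os.makedirs(base, exist_ok=True)
    now = time.time()
    for name in os.listdir(base):
        path = os.path.join(base, name)
        # remove whole directories whose creation time is too old
        if os.path.isdir(path) and now - os.path.getctime(path) > MAX_AGE:
            shutil.rmtree(path, ignore_errors=True)
    # unique name from a timestamp plus a random part, like the PHP code
    return tempfile.mkdtemp(prefix="%d-" % int(now), dir=base)
```

Each request pays for the cleanup of earlier requests' leftovers, so no cron job is needed on the server.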
Update of /cvsroot/wikipdf/extension/src In directory sc8-pr-cvs9.sourceforge.net:/tmp/cvs-serv6847/src Modified Files: config.pyc unicode2latex.py ClientTable.pyc wiki2latex.py common_funcs.pyc unicode2latex.pyc Added Files: doPreformatted2.py Log Message: merged python scripts Index: wiki2latex.py =================================================================== RCS file: /cvsroot/wikipdf/extension/src/wiki2latex.py,v retrieving revision 1.8 retrieving revision 1.9 diff -C2 -d -r1.8 -r1.9 *** wiki2latex.py 29 Mar 2006 09:26:04 -0000 1.8 --- wiki2latex.py 19 Sep 2007 21:47:59 -0000 1.9 *************** *** 1,581 **** ! #!/usr/bin/env python ! #coding: iso-8859-1 ! # $Id$ ! # ! # wiki2latex ! # ! # convert Wikipedia article to latex code ! # ! # Copyright (c) 2004 by Stephan Walter <ste...@ep...> ! # Licensed under the GNU General Public License (GPL) [...1166 lines suppressed...] ! #article_name = wiki_code_file.readline() ! ! wiki_code = "" ! for linha in wiki_code_file.readlines(): ! wiki_code += linha ! ! wiki_code_file.close() ! ! tex_code = r'\beginarticle{' + unicode2latex.convert2(article_name) + '}' ! tex_code += doWiki(wiki_code) ! tex_code += "\\endarticle\n" ! ! #tail = open("extensions/wikipdf/latex/"+templ+"-tail.tex") ! #tex_code += tail.read() ! #tail.close() ! ! output = open(sys.argv[3]+"/t.tex", 'w') ! ! print >> output, tex_code ! output.close() --- NEW FILE: doPreformatted2.py --- # pat: customized version of the function using verbatimtab def doPreFormatted2(text): #JM : to allow user to put a backslash in his article #pat: this replaces backslashes of the latex syntax #text = text.replace("\\", "$\\backslash$") #JM : I wanted to have text like that : my_variable text = text.replace("_", r'\_') #text = text.replace("/", "\\/") a = string.split(text, '\n') ptext = "" tabbing = -1 for x in range(len(a)): ptext += '\n' if a[x] == '': continue # pat {\tt ... 
} prints the content in bracket using the typeface font if a[x][0] == " ": if (a[x-1] == '') or (a[x-1][0] != " "): #JM : I prefer the environment tabbing. #ptext += r'{\tt' + '\n' + r' \begin{tabbing}' + '\n' #pat: let's use verbatimtab ptext += r'\begin{verbatimtab}[2]' + '\n' tabbing = 1 #if (tabbing == 1): ##JM : to allow user to have an { in his article ##pat #new_s = a[x].replace("{", r'\{') ##JM : same as juste before ##pat #new_s = new_s.replace("}", r'\}') ##JM : to find the number of spaces before the sentence, to make tabulation, in tabbing mode. #i = max(a[x].find(a[x].strip()), 0) #i = 0.3*i #ptext += " \\hspace{" + str(i) + "cm}" #ptext += new_s ##JM : I put a newline #ptext += r'\\\\' #else: ptext += a[x] #pat: don't need this # ptext += '\n' if ((a[x][0] == " ") and (x < len(a) - 1)) and ((a[x+1] == '') or (a[x+1][0] != " ")): #ptext += '\n' + r' \end{tabbing}' + '\n}' #pat ptext += '\n' + r'\end{verbatimtab}' tabbing = -1 #if (x == len(a) - 1) and (tabbing == 1): # ptext += r'\end{tabbing}}' + '\n' # tabbing = -1 return ptext Index: unicode2latex.py =================================================================== RCS file: /cvsroot/wikipdf/extension/src/unicode2latex.py,v retrieving revision 1.2 retrieving revision 1.3 diff -C2 -d -r1.2 -r1.3 *** unicode2latex.py 6 Mar 2006 04:15:41 -0000 1.2 --- unicode2latex.py 19 Sep 2007 21:47:58 -0000 1.3 *************** *** 1,286 **** # $Id$ ! codes = { ! # 0x0009: ' ', ! # 0x000a: '\n', ! # 0x0023: '{\#}', ! # 0x0026: '{\&}', ! 0x00a0: '{~}', ! 0x00a1: '{!`}', ! 0x00a2: '{\\not{c}}', ! 0x00a3: '{\\pounds}', ! 0x00a7: '{\\S}', ! 0x00a8: '{\\"{}}', ! 0x00a9: '{\\copyright}', ! 0x00af: '{\\={}}', ! 0x00ac: '{\\neg}', ! 0x00ad: '{\\-}', ! 0x00b0: '{\\mbox{$^\\circ$}}', ! 0x00b1: '{\\mbox{$\\pm$}}', ! 0x00b2: '{\\mbox{$^2$}}', ! 0x00b3: '{\\mbox{$^3$}}', ! 0x00b4: "{\\'{}}", ! 0x00b5: '{\\mbox{$\\mu$}}', ! 0x00b6: '{\\P}', ! 0x00b7: '{\\mbox{$\\cdot$}}', ! 0x00b8: '{\\c{}}', ! 0x00b9: '{\\mbox{$^1$}}', ! 
0x00bf: '{?`}', ! 0x00c0: '{\\a`A}', ! 0x00c1: "{\\a'A}", ! 0x00c2: '{\\^A}', ! 0x00c3: '{\\~A}', ! 0x00c4: '{\\"A}', ! 0x00c5: '{\\AA}', ! 0x00c6: '{\\AE}', ! 0x00c7: '{\\c{C}}', ! 0x00c8: '{\\a`E}', ! 0x00c9: "{\\a'E}", ! 0x00ca: '{\\^E}', ! 0x00cb: '{\\"E}', ! 0x00cc: '{\\a`I}', ! 0x00cd: "{\\a'I}", ! 0x00ce: '{\\^I}', ! 0x00cf: '{\\"I}', ! 0x00d1: '{\\~N}', ! 0x00d2: '{\\a`O}', ! 0x00d3: "{\\a'O}", ! 0x00d4: '{\\^O}', ! 0x00d5: '{\\~O}', ! 0x00d6: '{\\"O}', ! 0x00d7: '{\\mbox{$\\times$}}', ! 0x00d8: '{\\O}', ! 0x00d9: '{\\a`U}', ! 0x00da: "{\\a'U}", ! 0x00db: '{\\^U}', ! 0x00dc: '{\\"U}', ! 0x00dd: "{\\a'Y}", ! 0x00df: '{\\ss}', ! 0x00e0: '{\\a`a}', ! 0x00e1: "{\\a'a}", ! 0x00e2: '{\\^a}', ! 0x00e3: '{\\~a}', ! 0x00e4: '{\\"a}', ! 0x00e5: '{\\aa}', ! 0x00e6: '{\\ae}', ! 0x00e7: '{\\c{c}}', ! 0x00e8: '{\\a`e}', ! 0x00e9: "{\\a'e}", ! 0x00ea: '{\\^e}', ! 0x00eb: '{\\"e}', ! 0x00ec: '{\\a`\\i}', ! 0x00ed: "{\\a'\\i}", ! 0x00ee: '{\\^\\i}', ! 0x00ef: '{\\"\\i}', ! 0x00f1: '{\\~n}', ! 0x00f2: '{\\a`o}', ! 0x00f3: "{\\a'o}", ! 0x00f4: '{\\^o}', ! 0x00f5: '{\\~o}', ! 0x00f6: '{\\"o}', ! 0x00f7: '{\\mbox{$\\div$}}', ! 0x00f8: '{\\o}', ! 0x00f9: '{\\a`u}', ! 0x00fa: "{\\a'u}", ! 0x00fb: '{\\^u}', ! 0x00fc: '{\\"u}', ! 0x00fd: "{\\a'y}", ! 0x00ff: '{\\"y}', ! ! 0x0100: '{\\=A}', ! 0x0101: '{\\=a}', ! 0x0102: '{\\u{A}}', ! 0x0103: '{\\u{a}}', ! 0x0104: '{\\c{A}}', ! 0x0105: '{\\c{a}}', ! 0x0106: "{\\a'C}", ! 0x0107: "{\\a'c}", ! 0x0108: "{\\^C}", ! 0x0109: "{\\^c}", ! 0x010a: "{\\.C}", ! 0x010b: "{\\.c}", ! 0x010c: "{\\v{C}}", ! 0x010d: "{\\v{c}}", ! 0x010e: "{\\v{D}}", ! 0x010f: "{\\v{d}}", ! 0x0112: '{\\=E}', ! 0x0113: '{\\=e}', ! 0x0114: '{\\u{E}}', ! 0x0115: '{\\u{e}}', ! 0x0116: '{\\.E}', ! 0x0117: '{\\.e}', ! 0x0118: '{\\c{E}}', ! 0x0119: '{\\c{e}}', ! 0x011a: "{\\v{E}}", ! 0x011b: "{\\v{e}}", ! 0x011c: '{\\^G}', ! 0x011d: '{\\^g}', ! 0x011e: '{\\u{G}}', ! 0x011f: '{\\u{g}}', ! 0x0120: '{\\.G}', ! 0x0121: '{\\.g}', ! 0x0122: '{\\c{G}}', ! 0x0123: '{\\c{g}}', ! 
0x0124: '{\\^H}', ! 0x0125: '{\\^h}', ! 0x0128: '{\\~I}', ! 0x0129: '{\\~\\i}', ! 0x012a: '{\\=I}', ! 0x012b: '{\\=\\i}', ! 0x012c: '{\\u{I}}', ! 0x012d: '{\\u\\i}', ! 0x012e: '{\\c{I}}', ! 0x012f: '{\\c{i}}', ! 0x0130: '{\\.I}', ! 0x0131: '{\\i}', ! 0x0132: '{IJ}', ! 0x0133: '{ij}', ! 0x0134: '{\\^J}', ! 0x0135: '{\\^\\j}', ! 0x0136: '{\\c{K}}', ! 0x0137: '{\\c{k}}', ! 0x0139: "{\\a'L}", ! 0x013a: "{\\a'l}", ! 0x013b: "{\\c{L}}", ! 0x013c: "{\\c{l}}", ! 0x013d: "{\\v{L}}", ! 0x013e: "{\\v{l}}", ! 0x0141: '{\\L}', ! 0x0142: '{\\l}', ! 0x0143: "{\\a'N}", ! 0x0144: "{\\a'n}", ! 0x0145: "{\\c{N}}", ! 0x0146: "{\\c{n}}", ! 0x0147: "{\\v{N}}", ! 0x0148: "{\\v{n}}", ! 0x014c: '{\\=O}', ! 0x014d: '{\\=o}', ! 0x014e: '{\\u{O}}', ! 0x014f: '{\\u{o}}', ! 0x0150: '{\\H{O}}', ! 0x0151: '{\\H{o}}', ! 0x0152: '{\\OE}', ! 0x0153: '{\\oe}', ! 0x0154: "{\\a'R}", ! 0x0155: "{\\a'r}", ! 0x0156: "{\\c{R}}", ! 0x0157: "{\\c{r}}", ! 0x0158: "{\\v{R}}", ! 0x0159: "{\\v{r}}", ! 0x015a: "{\\a'S}", ! 0x015b: "{\\a's}", ! 0x015c: "{\\^S}", ! 0x015d: "{\\^s}", ! 0x015e: "{\\c{S}}", ! 0x015f: "{\\c{s}}", ! 0x0160: "{\\v{S}}", ! 0x0161: "{\\v{s}}", ! 0x0162: "{\\c{T}}", ! 0x0163: "{\\c{t}}", ! 0x0164: "{\\v{T}}", ! 0x0165: "{\\v{t}}", ! 0x0168: "{\\~U}", ! 0x0169: "{\\~u}", ! 0x016a: "{\\=U}", ! 0x016b: "{\\=u}", ! 0x016c: "{\\u{U}}", ! 0x016d: "{\\u{u}}", ! 0x016e: "{\\r{U}}", ! 0x016f: "{\\r{u}}", ! 0x0170: "{\\H{U}}", ! 0x0171: "{\\H{u}}", ! 0x0172: "{\\c{U}}", ! 0x0173: "{\\c{u}}", ! 0x0174: "{\\^W}", ! 0x0175: "{\\^w}", ! 0x0176: "{\\^Y}", ! 0x0177: "{\\^y}", ! 0x0178: '{\\"Y}', ! 0x0179: "{\\a'Z}", ! 0x017a: "{\\a'Z}", ! 0x017b: "{\\.Z}", ! 0x017c: "{\\.Z}", ! 0x017d: "{\\v{Z}}", ! 0x017e: "{\\v{z}}", ! 0x01c4: "{D\\v{Z}}", ! 0x01c5: "{D\\v{z}}", ! 0x01c6: "{d\\v{z}}", ! 0x01c7: "{LJ}", ! 0x01c8: "{Lj}", ! 0x01c9: "{lj}", ! 0x01ca: "{NJ}", ! 0x01cb: "{Nj}", ! 0x01cc: "{nj}", ! 0x01cd: "{\\v{A}}", ! 0x01ce: "{\\v{a}}", ! 0x01cf: "{\\v{I}}", ! 0x01d0: "{\\v\\i}", ! 0x01d1: "{\\v{O}}", ! 
0x01d2: "{\\v{o}}", ! 0x01d3: "{\\v{U}}", ! 0x01d4: "{\\v{u}}", ! 0x01e6: "{\\v{G}}", ! 0x01e7: "{\\v{g}}", ! 0x01e8: "{\\v{K}}", ! 0x01e9: "{\\v{k}}", ! 0x01ea: "{\\c{O}}", ! 0x01eb: "{\\c{o}}", ! 0x01f0: "{\\v\\j}", ! 0x01f1: "{DZ}", ! 0x01f2: "{Dz}", ! 0x01f3: "{dz}", ! 0x01f4: "{\\a'G}", ! 0x01f5: "{\\a'g}", ! 0x01fc: "{\\a'\\AE}", ! 0x01fd: "{\\a'\\ae}", ! 0x01fe: "{\\a'\\O}", ! 0x01ff: "{\\a'\\o}", ! 0x02c6: '{\\^{}}', ! 0x02dc: '{\\~{}}', ! 0x02d8: '{\\u{}}', ! 0x02d9: '{\\.{}}', ! 0x02da: "{\\r{}}", ! 0x02dd: '{\\H{}}', ! 0x02db: '{\\c{}}', ! 0x02c7: '{\\v{}}', ! ! 0x03c0: '{\\mbox{$\\pi$}}', ! # consider adding more Greek here ! ! 0xfb01: '{fi}', ! 0xfb02: '{fl}', ! ! 0x2013: '{--}', ! 0x2014: '{---}', ! 0x2018: "{`}", ! 0x2019: "{'}", ! 0x201c: "{``}", ! 0x201d: "{''}", ! 0x2020: "{\\dag}", ! 0x2021: "{\\ddag}", ! 0x2122: "{\\mbox{$^\\mbox{TM}$}}", ! 0x2022: "{\\mbox{$\\bullet$}}", ! 0x2026: "{\\ldots}", ! 0x2202: "{\\mbox{$\\partial$}}", ! 0x220f: "{\\mbox{$\\prod$}}", ! 0x2211: "{\\mbox{$\\sum$}}", ! 0x221a: "{\\mbox{$\\surd$}}", ! 0x221e: "{\\mbox{$\\infty$}}", ! 0x222b: "{\\mbox{$\\int$}}", ! 0x2248: "{\\mbox{$\\approx$}}", ! 0x2260: "{\\mbox{$\\neq$}}", ! 0x2264: "{\\mbox{$\\leq$}}", ! 0x2265: "{\\mbox{$\\geq$}}", ! } - def convert(text): - t = unicode(text, 'utf8') - t1 = "" - for c in t: - if ord(c) in codes: - t1 += codes[ord(c)] + r'\allowhyphens{}' - else: - t1 += c - return t1.encode('utf8') --- 1,290 ---- # $Id$ + # + # Patrick Jayet <pa...@ex...> (rewrote the unicode to latex translation (2007) ! import re ! # pat: original implementation of convert was very slow for a huge document ! # because it was depending on the document size. This is a new implementation ! # depending on the number of replacements ! def convert2(text): ! text = unicode(text, 'utf8') ! for tuple in mapping: ! #pat: debug ! #print tuple[0].encode('utf8') + " " + tuple[1] ! text = text.replace(tuple[0],tuple[1]) ! return text.encode('utf8') ! ! mapping = [ ! 
(u'\u00a0', '{~}'), ! (u'\u00a1', '{!`}'), ! (u'\u00a2', '{\\not{c}}'), ! (u'\u00a3', '{\\pounds}'), ! (u'\u00a7', '{\\S}'), ! (u'\u00a8', '{\\"{}}'), ! (u'\u00a9', '{\\copyright}'), ! (u'\u00af', '{\\={}}'), ! (u'\u00ac', '{\\neg}'), ! (u'\u00ad', '{\\-}'), ! (u'\u00b0', '{\\mbox{$^\\circ$}}'), ! (u'\u00b1', '{\\mbox{$\\pm$}}'), ! (u'\u00b2', '{\\mbox{$^2$}}'), ! (u'\u00b3', '{\\mbox{$^3$}}'), ! (u'\u00b4', "{\\'{}}"), ! (u'\u00b5', '{\\mbox{$\\mu$}}'), ! (u'\u00b6', '{\\P}'), ! (u'\u00b7', '{\\mbox{$\\cdot$}}'), ! (u'\u00b8', '{\\c{}}'), ! (u'\u00b9', '{\\mbox{$^1$}}'), ! (u'\u00bf', '{?`}'), ! (u'\u00c0', '{\\a`A}'), ! (u'\u00c1', "{\\a'A}"), ! (u'\u00c2', '{\\^A}'), ! (u'\u00c3', '{\\~A}'), ! (u'\u00c4', '{\\"A}'), ! (u'\u00c5', '{\\AA}'), ! (u'\u00c6', '{\\AE}'), ! (u'\u00c7', '{\\c{C}}'), ! (u'\u00c8', '{\\a`E}'), ! (u'\u00c9', "{\\a'E}"), ! (u'\u00ca', '{\\^E}'), ! (u'\u00cb', '{\\"E}'), ! (u'\u00cc', '{\\a`I}'), ! (u'\u00cd', "{\\a'I}"), ! (u'\u00ce', '{\\^I}'), ! (u'\u00cf', '{\\"I}'), ! (u'\u00d1', '{\\~N}'), ! (u'\u00d2', '{\\a`O}'), ! (u'\u00d3', "{\\a'O}"), ! (u'\u00d4', '{\\^O}'), ! (u'\u00d5', '{\\~O}'), ! (u'\u00d6', '{\\"O}'), ! (u'\u00d7', '{\\mbox{$\\times$}}'), ! (u'\u00d8', '{\\O}'), ! (u'\u00d9', '{\\a`U}'), ! (u'\u00da', "{\\a'U}"), ! (u'\u00db', '{\\^U}'), ! (u'\u00dc', '{\\"U}'), ! (u'\u00dd', "{\\a'Y}"), ! (u'\u00df', '{\\ss}'), ! (u'\u00e0', '{\\a`a}'), ! (u'\u00e1', "{\\a'a}"), ! (u'\u00e2', '{\\^a}'), ! (u'\u00e3', '{\\~a}'), ! (u'\u00e4', '{\\"a}'), ! (u'\u00e5', '{\\aa}'), ! (u'\u00e6', '{\\ae}'), ! (u'\u00e7', '{\\c{c}}'), ! (u'\u00e8', '{\\a`e}'), ! (u'\u00e9', "{\\a'e}"), ! (u'\u00ea', '{\\^e}'), ! (u'\u00eb', '{\\"e}'), ! (u'\u00ec', '{\\a`\\i}'), ! (u'\u00ed', "{\\a'\\i}"), ! (u'\u00ee', '{\\^\\i}'), ! (u'\u00ef', '{\\"\\i}'), ! (u'\u00f1', '{\\~n}'), ! (u'\u00f2', '{\\a`o}'), ! (u'\u00f3', "{\\a'o}"), ! (u'\u00f4', '{\\^o}'), ! (u'\u00f5', '{\\~o}'), ! (u'\u00f6', '{\\"o}'), ! (u'\u00f7', '{\\mbox{$\\div$}}'), ! 
(u'\u00f8', '{\\o}'), ! (u'\u00f9', '{\\a`u}'), ! (u'\u00fa', "{\\a'u}"), ! (u'\u00fb', '{\\^u}'), ! (u'\u00fc', '{\\"u}'), ! (u'\u00fd', "{\\a'y}"), ! (u'\u00ff', '{\\"y}'), ! ! (u'\u0100', '{\\=A}'), ! (u'\u0101', '{\\=a}'), ! (u'\u0102', '{\\u{A}}'), ! (u'\u0103', '{\\u{a}}'), ! (u'\u0104', '{\\c{A}}'), ! (u'\u0105', '{\\c{a}}'), ! (u'\u0106', "{\\a'C}"), ! (u'\u0107', "{\\a'c}"), ! (u'\u0108', "{\\^C}"), ! (u'\u0109', "{\\^c}"), ! (u'\u010a', "{\\.C}"), ! (u'\u010b', "{\\.c}"), ! (u'\u010c', "{\\v{C}}"), ! (u'\u010d', "{\\v{c}}"), ! (u'\u010e', "{\\v{D}}"), ! (u'\u010f', "{\\v{d}}"), ! (u'\u0112', '{\\=E}'), ! (u'\u0113', '{\\=e}'), ! (u'\u0114', '{\\u{E}}'), ! (u'\u0115', '{\\u{e}}'), ! (u'\u0116', '{\\.E}'), ! (u'\u0117', '{\\.e}'), ! (u'\u0118', '{\\c{E}}'), ! (u'\u0119', '{\\c{e}}'), ! (u'\u011a', "{\\v{E}}"), ! (u'\u011b', "{\\v{e}}"), ! (u'\u011c', '{\\^G}'), ! (u'\u011d', '{\\^g}'), ! (u'\u011e', '{\\u{G}}'), ! (u'\u011f', '{\\u{g}}'), ! (u'\u0120', '{\\.G}'), ! (u'\u0121', '{\\.g}'), ! (u'\u0122', '{\\c{G}}'), ! (u'\u0123', '{\\c{g}}'), ! (u'\u0124', '{\\^H}'), ! (u'\u0125', '{\\^h}'), ! (u'\u0128', '{\\~I}'), ! (u'\u0129', '{\\~\\i}'), ! (u'\u012a', '{\\=I}'), ! (u'\u012b', '{\\=\\i}'), ! (u'\u012c', '{\\u{I}}'), ! (u'\u012d', '{\\u\\i}'), ! (u'\u012e', '{\\c{I}}'), ! (u'\u012f', '{\\c{i}}'), ! (u'\u0130', '{\\.I}'), ! (u'\u0131', '{\\i}'), ! (u'\u0132', '{IJ}'), ! (u'\u0133', '{ij}'), ! (u'\u0134', '{\\^J}'), ! (u'\u0135', '{\\^\\j}'), ! (u'\u0136', '{\\c{K}}'), ! (u'\u0137', '{\\c{k}}'), ! (u'\u0139', "{\\a'L}"), ! (u'\u013a', "{\\a'l}"), ! (u'\u013b', "{\\c{L}}"), ! (u'\u013c', "{\\c{l}}"), ! (u'\u013d', "{\\v{L}}"), ! (u'\u013e', "{\\v{l}}"), ! (u'\u0141', '{\\L}'), ! (u'\u0142', '{\\l}'), ! (u'\u0143', "{\\a'N}"), ! (u'\u0144', "{\\a'n}"), ! (u'\u0145', "{\\c{N}}"), ! (u'\u0146', "{\\c{n}}"), ! (u'\u0147', "{\\v{N}}"), ! (u'\u0148', "{\\v{n}}"), ! (u'\u014c', '{\\=O}'), ! (u'\u014d', '{\\=o}'), ! (u'\u014e', '{\\u{O}}'), ! 
(u'\u014f', '{\\u{o}}'), ! (u'\u0150', '{\\H{O}}'), ! (u'\u0151', '{\\H{o}}'), ! (u'\u0152', '{\\OE}'), ! (u'\u0153', '{\\oe}'), ! (u'\u0154', "{\\a'R}"), ! (u'\u0155', "{\\a'r}"), ! (u'\u0156', "{\\c{R}}"), ! (u'\u0157', "{\\c{r}}"), ! (u'\u0158', "{\\v{R}}"), ! (u'\u0159', "{\\v{r}}"), ! (u'\u015a', "{\\a'S}"), ! (u'\u015b', "{\\a's}"), ! (u'\u015c', "{\\^S}"), ! (u'\u015d', "{\\^s}"), ! (u'\u015e', "{\\c{S}}"), ! (u'\u015f', "{\\c{s}}"), ! (u'\u0160', "{\\v{S}}"), ! (u'\u0161', "{\\v{s}}"), ! (u'\u0162', "{\\c{T}}"), ! (u'\u0163', "{\\c{t}}"), ! (u'\u0164', "{\\v{T}}"), ! (u'\u0165', "{\\v{t}}"), ! (u'\u0168', "{\\~U}"), ! (u'\u0169', "{\\~u}"), ! (u'\u016a', "{\\=U}"), ! (u'\u016b', "{\\=u}"), ! (u'\u016c', "{\\u{U}}"), ! (u'\u016d', "{\\u{u}}"), ! (u'\u016e', "{\\r{U}}"), ! (u'\u016f', "{\\r{u}}"), ! (u'\u0170', "{\\H{U}}"), ! (u'\u0171', "{\\H{u}}"), ! (u'\u0172', "{\\c{U}}"), ! (u'\u0173', "{\\c{u}}"), ! (u'\u0174', "{\\^W}"), ! (u'\u0175', "{\\^w}"), ! (u'\u0176', "{\\^Y}"), ! (u'\u0177', "{\\^y}"), ! (u'\u0178', '{\\"Y}'), ! (u'\u0179', "{\\a'Z}"), ! (u'\u017a', "{\\a'Z}"), ! (u'\u017b', "{\\.Z}"), ! (u'\u017c', "{\\.Z}"), ! (u'\u017d', "{\\v{Z}}"), ! (u'\u017e', "{\\v{z}}"), ! ! (u'\u01c4', "{D\\v{Z}}"), ! (u'\u01c5', "{D\\v{z}}"), ! (u'\u01c6', "{d\\v{z}}"), ! (u'\u01c7', "{LJ}"), ! (u'\u01c8', "{Lj}"), ! (u'\u01c9', "{lj}"), ! (u'\u01ca', "{NJ}"), ! (u'\u01cb', "{Nj}"), ! (u'\u01cc', "{nj}"), ! (u'\u01cd', "{\\v{A}}"), ! (u'\u01ce', "{\\v{a}}"), ! (u'\u01cf', "{\\v{I}}"), ! (u'\u01d0', "{\\v\\i}"), ! (u'\u01d1', "{\\v{O}}"), ! (u'\u01d2', "{\\v{o}}"), ! (u'\u01d3', "{\\v{U}}"), ! (u'\u01d4', "{\\v{u}}"), ! (u'\u01e6', "{\\v{G}}"), ! (u'\u01e7', "{\\v{g}}"), ! (u'\u01e8', "{\\v{K}}"), ! (u'\u01e9', "{\\v{k}}"), ! (u'\u01ea', "{\\c{O}}"), ! (u'\u01eb', "{\\c{o}}"), ! (u'\u01f0', "{\\v\\j}"), ! (u'\u01f1', "{DZ}"), ! (u'\u01f2', "{Dz}"), ! (u'\u01f3', "{dz}"), ! (u'\u01f4', "{\\a'G}"), ! (u'\u01f5', "{\\a'g}"), ! (u'\u01fc', "{\\a'\\AE}"), ! 
(u'\u01fd', "{\\a'\\ae}"), ! (u'\u01fe', "{\\a'\\O}"), ! (u'\u01ff', "{\\a'\\o}"), ! ! (u'\u02c6', '{\\^{}}'), ! (u'\u02dc', '{\\~{}}'), ! (u'\u02d8', '{\\u{}}'), ! (u'\u02d9', '{\\.{}}'), ! (u'\u02da', "{\\r{}}"), ! (u'\u02dd', '{\\H{}}'), ! (u'\u02db', '{\\c{}}'), ! (u'\u02c7', '{\\v{}}'), ! ! (u'\u03c0', '{\\mbox{$\\pi$}}'), ! # consider adding more Greek here ! ! (u'\ufb01', '{fi}'), ! (u'\ufb02', '{fl}'), ! ! (u'\u2013', '{--}'), ! (u'\u2014', '{---}'), ! (u'\u2018', "{`}"), ! (u'\u2019', "{'}"), ! (u'\u201c', "{``}"), ! (u'\u201d', "{''}"), ! (u'\u2020', "{\\dag}"), ! (u'\u2021', "{\\ddag}"), ! (u'\u2122', "{\\mbox{$^\\mbox{TM}$}}"), ! (u'\u2192', "{\\mbox{$\\rightarrow$}}"), ! (u'\u2022', "{\\mbox{$\\bullet$}}"), ! (u'\u2026', "{\\ldots}"), ! (u'\u2202', "{\\mbox{$\\partial$}}"), ! (u'\u220f', "{\\mbox{$\\prod$}}"), ! (u'\u2211', "{\\mbox{$\\sum$}}"), ! (u'\u221a', "{\\mbox{$\\surd$}}"), ! (u'\u221e', "{\\mbox{$\\infty$}}"), ! (u'\u222b', "{\\mbox{$\\int$}}"), ! (u'\u2248', "{\\mbox{$\\approx$}}"), ! (u'\u2260', "{\\mbox{$\\neq$}}"), ! (u'\u2264', "{\\mbox{$\\leq$}}"), ! (u'\u2265', "{\\mbox{$\\geq$}}"), ! 
] Index: config.pyc =================================================================== RCS file: /cvsroot/wikipdf/extension/src/config.pyc,v retrieving revision 1.1.1.1 retrieving revision 1.2 diff -C2 -d -r1.1.1.1 -r1.2 Binary files /tmp/cvsHCNfj4 and /tmp/cvsbScmLH differ Index: ClientTable.pyc =================================================================== RCS file: /cvsroot/wikipdf/extension/src/ClientTable.pyc,v retrieving revision 1.1.1.1 retrieving revision 1.2 diff -C2 -d -r1.1.1.1 -r1.2 Binary files /tmp/cvspNLKRi and /tmp/cvsOUhkDY differ Index: unicode2latex.pyc =================================================================== RCS file: /cvsroot/wikipdf/extension/src/unicode2latex.pyc,v retrieving revision 1.1.1.1 retrieving revision 1.2 diff -C2 -d -r1.1.1.1 -r1.2 Binary files /tmp/cvsUJh0kA and /tmp/cvsltsTvg differ Index: common_funcs.pyc =================================================================== RCS file: /cvsroot/wikipdf/extension/src/common_funcs.pyc,v retrieving revision 1.1.1.1 retrieving revision 1.2 diff -C2 -d -r1.1.1.1 -r1.2 Binary files /tmp/cvsDjwnbT and /tmp/cvspvN7LB differ |
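The `convert2` rewrite in the diff above replaces the old per-character loop with one `str.replace` pass per mapping entry, so the Python-level work scales with the size of the translation table rather than with the size of the article (each `replace` is a single fast C-level scan). A cut-down, modern-Python sketch of the same idea — only three illustrative entries from the full table:

```python
# Ordered (unicode character, LaTeX replacement) pairs, as in unicode2latex.py.
MAPPING = [
    ("\u00e9", "{\\a'e}"),   # e acute
    ("\u00f6", '{\\"o}'),    # o umlaut
    ("\u2013", "{--}"),      # en dash
]

def convert2(text):
    # One pass over the mapping; runtime grows with len(MAPPING) rather
    # than with every character of the document visited in Python bytecode,
    # which is what made the original convert() slow on large pages.
    for src, dst in MAPPING:
        text = text.replace(src, dst)
    return text
```

One trade-off of the rewrite, visible in the diff: the old loop appended `\allowhyphens{}` after each substitution, which the replace-based version drops.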
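The new `doPreformatted2.py` above switches preformatted wiki text (lines starting with a space) from a `tabbing` environment to `verbatimtab`. Its core state machine can be sketched as follows; unlike the committed version this sketch also closes a block still open at end of input (the original handles that case separately) and omits the underscore escaping:

```python
def pre_formatted_to_latex(text):
    # Contiguous runs of space-indented wiki lines become one LaTeX
    # verbatimtab block, as in doPreFormatted2.
    out, in_block = [], False
    for line in text.split("\n"):
        if line.startswith(" "):
            if not in_block:
                out.append(r"\begin{verbatimtab}[2]")
                in_block = True
            out.append(line)
        else:
            if in_block:
                out.append(r"\end{verbatimtab}")
                in_block = False
            out.append(line)
    if in_block:  # close a block left open at end of input
        out.append(r"\end{verbatimtab}")
    return "\n".join(out)
```

`verbatimtab` (from the `moreverb` package) renders the run literally with tab stops, which is why the commented-out brace and backslash escaping from the `tabbing` approach is no longer needed.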
From: Buttay c. <cyr...@fr...> - 2007-09-01 13:54:21
Hi Patrick, patrick jayet wrote: > Hi Cyril, > > Sorry for giving no news lately. I was pretty busy at work. Now it > should be better. > > My sourceforge username is pajai. If you give me a write access for > wikipdf, I can commit my work. I'll do that right away > > If you are interested to discuss regarding wikipdf and future > developments, let me know. I would be glad to do a chat session over > the week-end about that. I have a few ideas I would like to discuss. Yes, I must say I've not done much in the past months. I'd be happy to discuss that. However, I'm leaving tomorrow for a week, so we can either discuss today or after the 10th of September. Regards Cyril > > Cheers, > > Pat |
From: patrick j. <pa...@ja...> - 2007-08-31 19:14:47
Hi Cyril, Sorry for giving no news lately. I was pretty busy at work. Now it should be better. My sourceforge username is pajai. If you give me a write access for wikipdf, I can commit my work. If you are interested to discuss regarding wikipdf and future developments, let me know. I would be glad to do a chat session over the week-end about that. I have a few ideas I would like to discuss. Cheers, Pat On Jun 11, 2007, at 11:09 PM, Buttay cyril wrote: > Hi Patrick, > > Thank you for your work. > > I haven't worked on wikipdf for the last two months, partly because > I was fairly busy with many things, but also partly because I'm a > bit stuck with the improvement I tried to implement. > > As you might have noticed, I started a new branch (wikipdf- > flexbisonparse), with a lot of changes to the core of the > application (mainly adding an intermediate xml converter). The > problem is that the wiki markup is very, very complicated, and has > never been formally defined, so the only good parser is the one in > mediawiki itself! I was trying to recode it in python, but it turns > out to be a very time-consuming task (many thousands of lines of > php...). > > I explain that because I have the impression that you worked on the > stable branch of wikipdf, which is the one I never ever touched. > Therefore, I'd be pleased if you could maintain it (and, maybe, > give me a hand in developing the new branch...). > > As I don't know much about the stable version, I think it is better > for you to commit your own changes (if possible, little by little, > so we can ensure that nothing is broken). If you tell me what your > sourceforge username is, I'll be pleased to give you cvs access. > > I'll be away for two weeks starting this Friday, but hopefully, i > should have time to work on wikipdf when I come back. 
> > Cheers > > Cyril > > patrick jayet wrote: >> Hi guys, >> >> A couple of weeks ago, I contacted Felipe and Cyril regarding >> Wikipdf asking a few questions about it and saying that I was keen >> to extend/adapt the project (in particular for my needs). >> >> In the meantime I have been able to use WikiPDF for documentation >> purpose. It is by the way very handy to convert a whole Wiki page >> into a PDF document. >> >> I have also made numerous extensions/bugfixes to the project. Here >> is a summary of them >> - I fixed a few bugs in wikipdf.php and made the php script >> platform independant (working on a server with Linux as well as >> Windows) >> - I fixed numerous bugs in the wiki2latex python script (and did a >> few adjustment about formatting of some elements) >> - I rewrote unicode2latex. It was too slow for a huge document, >> because there was a for loop in there iterating through each >> character of the input document. >> - I changed slightly the latex templates >> - I extended the INSTALL instructions. >> >> You can find my version of wikipdf in attachment. I named it with >> version 0.05.1. I don't have a cvs access to the sourceforge >> project. If you like you can commit my changes or give me a cvs >> access so I can commit them. >> >> Feel free to give me your feedback about this version. (I think it >> is going pretty well, although there are still a couple of >> problems with the conversion of some elements, as documented in >> INSTALL). >> >> Kind regards. >> Cheers, >> >> Pat >> >> |
From: Felipe S. <fel...@gm...> - 2007-08-05 17:21:00
hey guys, could you help him, please? ---------- Forwarded message ---------- From: Chinese Medicine Times <enq...@ch...> Date: Aug 5, 2007 10:47 AM Subject: Wikipdf To: fel...@gm... Hi, Thanks for a great extension. I've installed the wikipdf extension, but it's not working. Please have a look at http://www.chinesemedicinetimes.com/wiki/CMTpedia and see what i'm doing wrong. Thank you. Kind regards, Attilio D'Alberto |
From: patrick j. <pa...@ja...> - 2007-07-09 15:14:30
Hi everybody, I am now in holidays until the end of the week. But we can chat a bit on irc when I am back (next week will be quite busy, but the following one should be fine). Regards, Cheers, Pat On Sat, 07 Jul 2007 14:56:27 +0100, Buttay cyril <cyr...@fr...> wrote: > Hi Guys, > > I'm just back from a combined conference+holidays (generates less CO2, > and, as a side effect, makes the lab pay for the travel). It was > relatively productive, and very relaxing, respectively for the work and > holidays parts of the trip, but as I couldn't really access my email > during the time, I have quite a backlog to clear now. > > Regarding wikipdf, as you might have noticed, the development slowed > down in the last two ot three months, because of a busy personal > schedule, but also because I'm a bit puzzled about the route to follow > for the wiki-to-XML conversion. > > Did you manage to discuss that over the IRC? did you get some ideas? > > In any case, I'm more or less available until the end of the month > (evenings - london timezone - and weekends), so we can discuss this. > > Cheers > > Cyril > > > patrick jayet wrote: >> Oups, I misread your schedule Felipe. 10 p.m. to 04 a.m. UTC -3 is not >> so handy for me (UTC + 2 european time). If you can show up before >> that we can chat a bit. Otherwise tomorrow. >> >> @ Cyril >> Nice holidays, in case you are taking some. >> >> Cheers, >> >> Pat >> >> >> >> On Jun 15, 2007, at 2:53 PM, Felipe Sanches wrote: >> >>> i may appear at the channel during the saturday evening. I will >>> probably be online from 22 p.m. on saturday, until 3 or 4 a.m. on >>> sunday (Brazilian time = UTC - 3) >>> >>> cheers, >>> Juca >>> >>> On 6/15/07, *patrick jayet* <pa...@ja... >>> <mailto:pa...@ja...>> wrote: >>> >>> Hi everybody, >>> >>> That's a good idea. I have time this week-end (on Saturday >>> evening or Sunday). Which free slot do you have guys? >>> >>> Cya. 
>>> Cheers, >>> >>> Pat >>> >>> >>> On Jun 15, 2007, at 1:08 AM, Felipe Sanches wrote: >>> >>>> I propose that we could meet on #wikipdf at irc.freenode.net >>>> <http://irc.freenode.net> sometimes to discuss wikipdf development >>>> >>>> what do you think about that? >>>> >>>> Felipe Sanches >>> >>> >> |
From: Buttay c. <cyr...@fr...> - 2007-07-07 13:57:12
Hi Guys, I'm just back from a combined conference+holidays (generates less CO2, and, as a side effect, makes the lab pay for the travel). It was relatively productive, and very relaxing, respectively for the work and holidays parts of the trip, but as I couldn't really access my email during the time, I have quite a backlog to clear now. Regarding wikipdf, as you might have noticed, the development slowed down in the last two or three months, because of a busy personal schedule, but also because I'm a bit puzzled about the route to follow for the wiki-to-XML conversion. Did you manage to discuss that over the IRC? Did you get some ideas? In any case, I'm more or less available until the end of the month (evenings - London timezone - and weekends), so we can discuss this. Cheers Cyril patrick jayet wrote: > Oups, I misread your schedule Felipe. 10 p.m. to 04 a.m. UTC -3 is not > so handy for me (UTC + 2 european time). If you can show up before > that we can chat a bit. Otherwise tomorrow. > > @ Cyril > Nice holidays, in case you are taking some. > > Cheers, > > Pat > > > > On Jun 15, 2007, at 2:53 PM, Felipe Sanches wrote: > >> i may appear at the channel during the saturday evening. I will >> probably be online from 22 p.m. on saturday, until 3 or 4 a.m. on >> sunday (Brazilian time = UTC - 3) >> >> cheers, >> Juca >> >> On 6/15/07, *patrick jayet* <pa...@ja... >> <mailto:pa...@ja...>> wrote: >> >> Hi everybody, >> >> That's a good idea. I have time this week-end (on Saturday >> evening or Sunday). Which free slot do you have guys? >> >> Cya. >> Cheers, >> >> Pat >> >> >> On Jun 15, 2007, at 1:08 AM, Felipe Sanches wrote: >> >>> I propose that we could meet on #wikipdf at irc.freenode.net >>> <http://irc.freenode.net> sometimes to discuss wikipdf development >>> >>> what do you think about that? >>> >>> Felipe Sanches >> >> > |
From: patrick j. <pa...@ja...> - 2007-06-16 18:07:16
Oops, I misread your schedule Felipe. 10 p.m. to 04 a.m. UTC -3 is not so handy for me (UTC + 2 European time). If you can show up before that we can chat a bit. Otherwise tomorrow. @ Cyril Nice holidays, in case you are taking some. Cheers, Pat On Jun 15, 2007, at 2:53 PM, Felipe Sanches wrote: > i may appear at the channel during the saturday evening. I will > probably be online from 22 p.m. on saturday, until 3 or 4 a.m. on > sunday (Brazilian time = UTC - 3) > > cheers, > Juca > > On 6/15/07, patrick jayet <pa...@ja...> wrote: > Hi everybody, > > That's a good idea. I have time this week-end (on Saturday evening > or Sunday). Which free slot do you have guys? > > Cya. > Cheers, > > Pat > > > On Jun 15, 2007, at 1:08 AM, Felipe Sanches wrote: > >> I propose that we could meet on #wikipdf at irc.freenode.net >> sometimes to discuss wikipdf development >> >> what do you think about that? >> >> Felipe Sanches > > |
From: patrick j. <pa...@ja...> - 2007-06-15 14:09:02
Hi Felipe, Ok, I'll try to show up during that time. Cheers, Pat On Jun 15, 2007, at 2:53 PM, Felipe Sanches wrote: > i may appear at the channel during the saturday evening. I will > probably be online from 22 p.m. on saturday, until 3 or 4 a.m. on > sunday (Brazilian time = UTC - 3) > > cheers, > Juca > > On 6/15/07, patrick jayet <pa...@ja...> wrote: > Hi everybody, > > That's a good idea. I have time this week-end (on Saturday evening > or Sunday). Which free slot do you have guys? > > Cya. > Cheers, > > Pat > > > On Jun 15, 2007, at 1:08 AM, Felipe Sanches wrote: > >> I propose that we could meet on #wikipdf at irc.freenode.net >> sometimes to discuss wikipdf development >> >> what do you think about that? >> >> Felipe Sanches > > |
From: Felipe S. <fel...@gm...> - 2007-06-15 12:53:38
i may appear at the channel during the saturday evening. I will probably be online from 22 p.m. on saturday, until 3 or 4 a.m. on sunday (Brazilian time = UTC - 3) cheers, Juca On 6/15/07, patrick jayet <pa...@ja...> wrote: > > Hi everybody, > That's a good idea. I have time this week-end (on Saturday evening or > Sunday). Which free slot do you have guys? > > Cya. > Cheers, > > Pat > > > On Jun 15, 2007, at 1:08 AM, Felipe Sanches wrote: > > I propose that we could meet on #wikipdf at irc.freenode.net sometimes to > discuss wikipdf development > > what do you think about that? > > Felipe Sanches > > > |
From: patrick j. <pa...@ja...> - 2007-06-15 06:57:29
Hi everybody, That's a good idea. I have time this week-end (on Saturday evening or Sunday). Which free slot do you have guys? Cya. Cheers, Pat On Jun 15, 2007, at 1:08 AM, Felipe Sanches wrote: > I propose that we could meet on #wikipdf at irc.freenode.net > sometimes to discuss wikipdf development > > what do you think about that? > > Felipe Sanches |
From: Felipe S. <fel...@gm...> - 2007-06-14 23:08:59
I propose that we could meet on #wikipdf at irc.freenode.net sometimes to discuss wikipdf development. What do you think about that? Felipe Sanches |
From: Buttay c. <cyr...@fr...> - 2007-06-11 21:09:00
Hi Patrick, Thank you for your work. I haven't worked on wikipdf for the last two months, partly because I was fairly busy with many things, but also partly because I'm a bit stuck with the improvement I tried to implement. As you might have noticed, I started a new branch (wikipdf-flexbisonparse), with a lot of changes to the core of the application (mainly adding an intermediate xml converter). The problem is that the wiki markup is very, very complicated, and has never been formally defined, so the only good parser is the one in mediawiki itself! I was trying to recode it in python, but it turns out to be a very time-consuming task (many thousands of lines of php...). I explain that because I have the impression that you worked on the stable branch of wikipdf, which is the one I never ever touched. Therefore, I'd be pleased if you could maintain it (and, maybe, give me a hand in developing the new branch...). As I don't know much about the stable version, I think it is better for you to commit your own changes (if possible, little by little, so we can ensure that nothing is broken). If you tell me what your sourceforge username is, I'll be pleased to give you cvs access. I'll be away for two weeks starting this Friday, but hopefully, i should have time to work on wikipdf when I come back. Cheers Cyril patrick jayet wrote: > Hi guys, > > A couple of weeks ago, I contacted Felipe and Cyril regarding Wikipdf > asking a few questions about it and saying that I was keen to > extend/adapt the project (in particular for my needs). > > In the meantime I have been able to use WikiPDF for documentation > purpose. It is by the way very handy to convert a whole Wiki page into > a PDF document. > > I have also made numerous extensions/bugfixes to the project. 
Here is > a summary of them > - I fixed a few bugs in wikipdf.php and made the php script platform > independant (working on a server with Linux as well as Windows) > - I fixed numerous bugs in the wiki2latex python script (and did a few > adjustment about formatting of some elements) > - I rewrote unicode2latex. It was too slow for a huge document, > because there was a for loop in there iterating through each character > of the input document. > - I changed slightly the latex templates > - I extended the INSTALL instructions. > > You can find my version of wikipdf in attachment. I named it with > version 0.05.1. I don't have a cvs access to the sourceforge project. > If you like you can commit my changes or give me a cvs access so I can > commit them. > > Feel free to give me your feedback about this version. (I think it is > going pretty well, although there are still a couple of problems with > the conversion of some elements, as documented in INSTALL). > > Kind regards. > Cheers, > > Pat > > |
From: buttay <cyr...@us...> - 2007-04-01 21:38:50
Update of /cvsroot/wikipdf/wikipdf-flexbison/src In directory sc8-pr-cvs9.sourceforge.net:/tmp/cvs-serv17880 Modified Files: Parser.py Added Files: Title.py Log Message: some more translation from the php source of mediawiki. As things go, the scope of the translation keeps growing, and I wonder if it is possible to get something working with less than 10000 lines of code... By the way, it is not functional yet, I'm currently working on internal links conversion --- NEW FILE: Title.py --- ## /** ## * See title.txt ## * ## * @package MediaWiki ## */ ## /** */ ## require_once( 'normal/UtfNormal.php' ); ## define ( 'GAID_FOR_UPDATE', 1 ); ## # Title::newFromTitle maintains a cache to avoid ## # expensive re-normalization of commonly used titles. ## # On a batch operation this can become a memory leak ## # if not bounded. After hitting this many titles, ## # reset the cache. ## define( 'MW_TITLECACHE_MAX', 1000 ); ## /** [...2416 lines suppressed...] ## return false; ## } ## /** ## * If the Title refers to a special page alias which is not the local default, ## * returns a new Title which points to the local default. Otherwise, returns $this. ## */ ## function fixSpecialName() { ## if ( $this->getNamespace() == NS_SPECIAL ) { ## $canonicalName = SpecialPage::resolveAlias( $this->mDbkeyform ); ## if ( $canonicalName ) { ## $localName = SpecialPage::getLocalNameFor( $canonicalName ); ## if ( $localName != $this->mDbkeyform ) { ## return Title::makeTitle( NS_SPECIAL, $localName ); ## } ## } ## } ## return $this; ## } ## } Index: Parser.py =================================================================== RCS file: /cvsroot/wikipdf/wikipdf-flexbison/src/Parser.py,v retrieving revision 1.4 retrieving revision 1.5 diff -C2 -d -r1.4 -r1.5 *** Parser.py 31 Mar 2007 22:20:54 -0000 1.4 --- Parser.py 1 Apr 2007 21:38:43 -0000 1.5 *************** *** 12,16 **** import Sanitizer, StringUtils ! 
wgUrlProtocols = ['http://', 'https://', 'ftp://', --- 12,16 ---- import Sanitizer, StringUtils ! wfUrlProtocols = ['http://', 'https://', 'ftp://', *************** *** 43,46 **** --- 43,75 ---- MW_COLON_STATE_COMMENTDASHDASH = 7 + #----------------------------------------------------------------------------- + # constants copied from DefaultSettings.php + + # Allowed title characters -- regex character class + # Don't change this unless you know what you're doing + # + # Problematic punctuation: + # []{}|# Are needed for link syntax, never enable these + # % Enabled by default, minor problems with path to query rewrite rules, see below + # + Enabled by default, but doesn't work with path to query rewrite rules, corrupted by apache + # ? Enabled by default, but doesn't work with path to PATH_INFO rewrites + # + # All three of these punctuation problems can be avoided by using an alias, instead of a + # rewrite rule of either variety. + # + # The problem with % is that when using a path to query rewrite rule, URLs are + # double-unescaped: once by Apache's path conversion code, and again by PHP. So + # %253F, for example, becomes "?". Our code does not double-escape to compensate + # for this, indeed double escaping would break if the double-escaped title was + # passed in the query string rather than the path. This is a minor security issue + # because articles can be created such that they are hard to view or edit. + # + # In some rare cases you may wish to remove + for compatibility with old links. + # + # Theoretically 0x80-0x9F of ISO 8859-1 should be disallowed, but + # this breaks interlanguage links + wgLegalTitleChars = " %!\"$&'()*,\\-.\\/0-9:;=?@A-Z\\\\^_`a-z~\\x80-\\xFF+" + + class Parser: """This class implements a parser that converts raw wikitext in HTML. It is *************** *** 130,134 **** text = self.doBlockLevels(text, linestart) ## [TODO?]text = self.replaceLinkHolders(text) ! 
print text def strip(self, text, state): --- 159,163 ---- text = self.doBlockLevels(text, linestart) ## [TODO?]text = self.replaceLinkHolders(text) ! ## print text def strip(self, text, state): *************** *** 317,321 **** # by earlier parser steps, but should avoid splitting up eg # attribute values containing literal "||". ! ##[TODO] $cells = StringUtils::explodeMarkup( '||' , $line ); lines[key] = '' # Loop through each table cell --- 346,350 ---- # by earlier parser steps, but should avoid splitting up eg # attribute values containing literal "||". ! cells = StringUtils.StringUtils.explodeMarkup('||' , line) lines[key] = '' # Loop through each table cell *************** *** 402,406 **** ## } text = self.doAllQuotes(text) ! ## text = self.replaceInternalLinks(text) [TODO] not working yet! ## $text = $this->replaceExternalLinks( $text ); --- 431,435 ---- ## } text = self.doAllQuotes(text) ! text = self.replaceInternalLinks(text) # [TODO] not working yet! ## $text = $this->replaceExternalLinks( $text ); *************** *** 563,588 **** def replaceInternalLinks(self, s): ! """ Process [[ ]] wikilinks""" ! # global $wgContLang; ! tc = False # the % is needed to support urlencoded titles as well ! ## if ( !$tc ) { $tc = Title::legalChars() . '#%'; } ! tc = '#%' ! ## $sk = $this->mOptions->getSkin(); #split the entire text string on occurences of [[ ! a = s.split('[[') # [TODO] check if the space added by the php program is of any use #get the first element (all text up to first [[), and remove the space we added ! s, a = a[0], a[1:] # Match a link having the form [[namespace:link|alternate]]trail ! e1 = re.compile("^([%s]+)(?:\\|(.+?))?]](.*)\$" %tc, re.DOTALL) # [TODO] the original regexp ended in /sD, and I don't know what D is # Match cases where there is no "]]", which might still be images ! e1_img = re.compile("^([%s]+)\\|(.*)\$" %tc, re.DOTALL|re.MULTILINE) # Match the end of a line for a word that's not followed by whitespace, # e.g. 
in the case of 'The Arab al[[Razi]]', 'al' will be matched ! ## e2 = wfMsgForContent( 'linkprefix' ); #[TODO] implement this ! ## $useLinkPrefixExtension = $wgContLang->linkPrefixExtension(); # see setup.php ## if( is_null( $this->mTitle ) ) { ## throw new MWException( __METHOD__.": \$this->mTitle is null\n" ); --- 592,616 ---- def replaceInternalLinks(self, s): ! """ Process [[ ]] wikilinks""" # the % is needed to support urlencoded titles as well ! tc = wgLegalTitleChars+'#%' # this constant is defined at the beginning of the present file ! ## $sk = $this->mOptions->mgetSkin(); #split the entire text string on occurences of [[ ! s = ' '+s ! a = s.split('[[') #get the first element (all text up to first [[), and remove the space we added ! s = a[0][1:] ! a = a[1:] # Match a link having the form [[namespace:link|alternate]]trail ! e1 = re.compile("^([%s]+)(?:\\|(.+?))?]](.*)$" %tc, re.DOTALL|re.MULTILINE) # [TODO] the original regexp ended in /sD, and I don't know what D is # Match cases where there is no "]]", which might still be images ! e1_img = re.compile("^([%s]+)\\|(.*)$" %tc, re.DOTALL|re.MULTILINE) # Match the end of a line for a word that's not followed by whitespace, # e.g. in the case of 'The Arab al[[Razi]]', 'al' will be matched ! ##[TODO] e2 = wfMsgForContent( 'linkprefix' ); #[TODO] implement this ! useLinkPrefixExtension = False ##[TODO] = $wgContLang->linkPrefixExtension(); # see setup.php ## if( is_null( $this->mTitle ) ) { ## throw new MWException( __METHOD__.": \$this->mTitle is null\n" ); *************** *** 598,602 **** ## } ## } else { ! ## $prefix = ''; ## } --- 626,630 ---- ## } ## } else { ! prefix = '' ## } *************** *** 607,617 **** ## } ## $useSubpages = $this->areSubpagesAllowed(); ! if True: ! if True: ! ! # Loop for each link ! for k, line in enumerate(a): ! ## if ( $useLinkPrefixExtension ) { ! 
## if ( preg_match( $e2, $s, $m ) ) { ## $prefix = $m[2]; ## $s = $m[1]; --- 635,644 ---- ! ! # Loop for each link ! for k, line in enumerate(a): ! if useLinkPrefixExtension: ! pass ! ## [TODO] if ( preg_match( $e2, $s, $m ) ) { ## $prefix = $m[2]; ## $s = $m[1]; *************** *** 626,690 **** ! might_be_img = False ! ! temp = e1.match(line) ! if temp: ! text = m[2] ! # If we get a ] at the beginning of $m[3] that means we have a link that's something like: ! # [[Image:Foo.jpg|[http://example.com desc]]] <- having three ] in a row fucks up, ! # the real problem is with the $e1 regex ! # See bug 1300. ! # ! # Still some problems for cases where the ] is meant to be outside punctuation, ! # and no image is in sight. See bug 2095. ! # ! if text != '' and m[3][0:1] == ']' and '[' in text: ! text += ']' # so that replaceExternalLinks($text) works later ! m[3] = m[3][1:] ! # fix up urlencoded title texts ! if '%' in m[1]: ! # Should anchors '#' also be rejected? ## m[1] = str_replace( array('<', '>'), array('&lt;', '&gt;'), urldecode($m[1]) ); [TODO] ! m[1] = m[1].replace('<', '&lt;') ! m[1] = m[1].replace('>', '&gt;') ! trail = m[3] ! else: ! m = e1_img.match(line) ! if m: # Invalid, but might be an image with a link in its caption ! might_be_img = True ! text = m[2] ! if '%' in m[1]: ## $m[1] = urldecode($m[1]); # [TODO] ! pass ! trail = "" ! else: # Invalid form; output directly ! s += prefix+'[['+line ! continue ! ! # Don't allow internal links to pages containing ! # PROTO: where PROTO is a valid URL protocol; these ! # should be external links. ! re_url = re.compile('^(\b(?:%s))' %'|'.join(wfUrlProtocols)) ! if re_url.match(m[1]): ! s += prefix + '[[' + line ! continue ! # Make subpage if necessary ## if( $useSubpages ) { ## $link = $this->maybeDoSubpageLink( $m[1], $text ); ## else: ! link = m[1] ! if m[1][0] == ':': ! noforce = False ! else: ! noforce = True ! if not noforce: ! # Strip off leading ':' ! link = link[1:] !
## nt = Title::newFromText( $this->mStripState->unstripNoWiki($link) ); ! if not nt : ! s += prefix + '[[' + line ! continue ## ns = $nt->getNamespace(); --- 653,717 ---- ! might_be_img = False ! temp = e1.match(line) ! if temp: ! m = temp.group() ! text = m[2] ! # If we get a ] at the beginning of $m[3] that means we have a link that's something like: ! # [[Image:Foo.jpg|[http://example.com desc]]] <- having three ] in a row fucks up, ! # the real problem is with the $e1 regex ! # See bug 1300. ! # ! # Still some problems for cases where the ] is meant to be outside punctuation, ! # and no image is in sight. See bug 2095. ! # ! if text != '' and m[3][0:1] == ']' and '[' in text: ! text += ']' # so that replaceExternalLinks($text) works later ! m[3] = m[3][1:] ! # fix up urlencoded title texts ! if '%' in m[1]: ! # Should anchors '#' also be rejected? ## m[1] = str_replace( array('<', '>'), array('&lt;', '&gt;'), urldecode($m[1]) ); [TODO] ! m[1] = m[1].replace('<', '&lt;') ! m[1] = m[1].replace('>', '&gt;') ! trail = m[3] ! else: ! m = e1_img.match(line) ! if m: # Invalid, but might be an image with a link in its caption ! might_be_img = True ! text = m[2] ! if '%' in m[1]: ## $m[1] = urldecode($m[1]); # [TODO] ! pass ! trail = "" ! else: # Invalid form; output directly ! s += prefix+'[['+line ! continue ! ! # Don't allow internal links to pages containing ! # PROTO: where PROTO is a valid URL protocol; these ! # should be external links. ! re_url = re.compile('^(\b(?:%s))' %'|'.join(wfUrlProtocols)) ! if re_url.match(m[1]): ! s += prefix + '[[' + line ! continue ! # Make subpage if necessary ## if( $useSubpages ) { ## $link = $this->maybeDoSubpageLink( $m[1], $text ); ## else: ! link = m[1] ! if m[1][0] == ':': ! noforce = False ! else: ! noforce = True ! if not noforce: ! # Strip off leading ':' ! link = link[1:] ! nt = Title::newFromText( $this->mStripState->unstripNoWiki($link) ); ! if not nt : ! s += prefix + '[[' + line !
continue ## ns = $nt->getNamespace(); *************** *** 795,799 **** ## $s .= $this->makeLinkHolder( $nt, $text, '', $trail, $prefix ); ! return s def replaceExternalLinks(self, text): --- 822,826 ---- ## $s .= $this->makeLinkHolder( $nt, $text, '', $trail, $prefix ); ! return s def replaceExternalLinks(self, text): |
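The e1 regex in the hunks above does the heavy lifting of replaceInternalLinks. A standalone probe of that pattern shows what its three capture groups hold for a [[target|alternate]]trail link; the character class below is a simplified stand-in for the full wgLegalTitleChars string defined earlier in the diff, so this is illustrative only:

```python
import re

# Simplified stand-in for MediaWiki's legal-title character class; the real
# wgLegalTitleChars (see the diff above) admits more punctuation and 0x80-0xFF.
tc = r" %!'()*,\-.\/0-9:;=?@A-Za-z_#"

# Match a link of the form [[namespace:link|alternate]]trail -- same shape as
# the e1 regex in the patch (the '[[' has already been split off).
e1 = re.compile(r"^([%s]+)(?:\|(.+?))?\]\](.*)$" % tc, re.DOTALL)

def split_wikilink(fragment):
    """fragment is everything after a '[[' delimiter; returns the three
    capture groups (target, alternate text, trail) or None on no match."""
    m = e1.match(fragment)
    if not m:
        return None
    return m.group(1), m.group(2), m.group(3)
```

For `[[Help:Contents|the manual]]s`, group 1 is the link target, group 2 the alternate text, and group 3 the trail that MediaWiki glues onto the rendered link.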
From: Buttay c. <cyr...@fr...> - 2007-03-31 22:31:56
|
patrick jayet wrote: > In fact, I have in the meantime worked a bit on the project and > essentially fixed bugs and changed a few things in the wiki2latex > python script. If you are interested, I can send you and Cyril the > changes that I have made. I have also adapted the php code so that > it also works under Windows, because I had to find a solution for > converting the mediawiki format to pdf on a windows platform. So now, > the version I have works on unix and windows. Hi Patrick, I'm interested in your version, although I made a lot of changes to the interface of the program, so I don't expect them to work out of the box. I don't have any skills in php, so I would be pleased if you could adapt your code for the current development version of wikipdf (the branch is wikipdf-flexbison on the cvs). To give you an overview of what I do, here are the different steps wikipdf has gone through: - replacement of the wiki-to-latex engine by an existing wiki-to-xml converter (flexbisonparse), followed by a home-made xml-to-latex converter. This works relatively well, but flexbisonparse is buggy and no longer maintained. - I tried to write a wiki-to-xml converter from scratch, but I stopped, because the wiki markup has evolved into such a complicated language that I couldn't build a satisfying parser. - At the moment, I simply translate the php code for the parser into Python. This will result in a sub-optimal program, but at least it should produce correct output (provided I correctly translate the thousands of lines of the original code...). The idea is to get something as modular as possible, so any interface can be used. I'm also keen on the intermediate XML step, because it will make it possible to write another XML-to-anything converter later (for example XML to openoffice). There is plenty of work to do, and I would be pleased to get people on board, as I'm not a very experienced programmer. Cheers Cyril > > Cyril: if you like I can contribute to the project.
I think it is a > very nice one. Don't hesitate to get in touch with me regarding this. > > Cheers, :) > > Pat > > > On Mar 31, 2007, at 5:20 PM, Felipe Sanches wrote: > >> I have downloaded the last version available on sourceforge [1], but >> this version does not seem to be supported any more. >> >> >> this latest version is a very old one. We haven't released a newer one >> yet, but there is one programmer still constantly working on the >> project. But he is working on the core of the application, not the UI. >> Maybe that's why there is no new release available yet. I am not >> working on wikipdf anymore. Maybe someday I'll work on it again, but not >> today :-(... >> You can contact Cyril Buttay at cyr...@us... >> <mailto:cyr...@us...> or send a message to the >> devel mailing list: wik...@li... >> <mailto:wik...@li...> >> You can also get a CVS copy of the project, but (in the wikipdf case) >> it won't be very useful if you're not a programmer interested in >> getting their hands on the code. >> >> I hope it helps, >> Felipe > |
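Cyril's first bullet describes the pipeline design: wiki markup to an intermediate XML document, then XML to LaTeX, so that other backends (for example XML to OpenOffice) can be added later. A toy sketch of such an XML-to-LaTeX walker; the element names and mapping table here are hypothetical, since flexbisonparse's actual schema is not shown in this thread:

```python
import xml.etree.ElementTree as ET

# Hypothetical element-to-LaTeX mapping; the real intermediate schema used by
# wikipdf/flexbisonparse may use different tag names and structures.
LATEX_WRAP = {
    "bold": ("\\textbf{", "}"),
    "italic": ("\\textit{", "}"),
    "heading": ("\\section{", "}"),
}

def xml_to_latex(node):
    # Recursively emit LaTeX: wrap known elements, pass unknown ones through,
    # and keep the text/tail interleaving that ElementTree uses for mixed content.
    open_tag, close_tag = LATEX_WRAP.get(node.tag, ("", ""))
    parts = [open_tag, node.text or ""]
    for child in node:
        parts.append(xml_to_latex(child))
        parts.append(child.tail or "")
    parts.append(close_tag)
    return "".join(parts)

doc = ET.fromstring("<article>See the <bold>boiling</bold> point.</article>")
```

The appeal of this design, as Cyril notes, is that a second walker with a different mapping (say, to ODF markup) reuses the same intermediate tree untouched.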