doxygen-develop Mailing List for Doxygen (Page 8)
Brought to you by: dimitri
| Year | Jan | Feb | Mar | Apr | May | Jun | Jul | Aug | Sep | Oct | Nov | Dec |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 2001 | | | | | (4) | (4) | (29) | (8) | (8) | (17) | (34) | (6) |
| 2002 | (20) | (14) | (11) | (9) | (8) | (7) | (25) | (12) | (12) | (24) | (27) | (12) |
| 2003 | (12) | (14) | (15) | (11) | (17) | (20) | (32) | (13) | (34) | (12) | (16) | (33) |
| 2004 | (20) | (6) | (20) | (15) | (16) | (28) | (7) | (7) | (17) | (16) | (17) | (43) |
| 2005 | (15) | (5) | (14) | (4) | (3) | (8) | (17) | (16) | (7) | (17) | (1) | (7) |
| 2006 | (7) | (6) | (10) | (6) | (3) | (4) | (3) | (3) | (18) | (11) | (10) | (3) |
| 2007 | (12) | (12) | (23) | (5) | (13) | (6) | (5) | (4) | (8) | (10) | (6) | (7) |
| 2008 | (7) | (13) | (35) | (14) | (13) | (4) | (9) | (6) | (12) | (9) | (6) | (3) |
| 2009 | (2) | (2) | (2) | (15) | (1) | (2) | (7) | (3) | (4) | (1) | (2) | (1) |
| 2010 | (4) | | (5) | (1) | (5) | | (2) | (3) | (11) | (2) | (1) | (5) |
| 2011 | (12) | (3) | (28) | (4) | (3) | (4) | (15) | (12) | (2) | (3) | (6) | (3) |
| 2012 | (1) | (4) | (9) | (5) | (6) | (6) | (3) | (3) | (4) | (2) | (9) | (7) |
| 2013 | (8) | (14) | (15) | (21) | (29) | (34) | (3) | (7) | (13) | (1) | (3) | (5) |
| 2014 | | | | (10) | (2) | (4) | (2) | (2) | (4) | (4) | (4) | (2) |
| 2015 | (7) | (4) | (3) | (15) | (4) | (9) | (1) | (2) | | | (3) | (7) |
| 2016 | (1) | | | (1) | (1) | (1) | | (5) | | (1) | (1) | (1) |
| 2017 | | | | | (1) | | (9) | | | | (1) | (5) |
| 2018 | | (2) | (3) | | (7) | (1) | | (1) | | | | |
| 2019 | (4) | | (1) | | | | | | (3) | | | |
| 2020 | | | | | | | (1) | | | (1) | (1) | (1) |
| 2021 | (2) | | (2) | | | (3) | | | | | | |
| 2022 | | | | | | | | (1) | | | (1) | |
From: Trond D. <tro...@no...> - 2013-08-29 06:34:58
Hi,

I just started using Doxygen for VHDL, and I think I have found a bug. We use arrays of unconstrained elements - a VHDL-2008 feature - a lot in our design, but this triggers a bug in the VHDL scanner. Consider the following code:

--------------------------------------------------------------------------------
library ieee;
use ieee.std_logic_1164.all;

entity top is
end entity top;

architecture str of top is

  type slv_vector is array(natural range <>) of std_logic_vector;

  signal clk     : std_logic;
  signal data, q : slv_vector(0 to 10)(15 downto 0);

begin

  is_missing_from_diagram_inst : entity work.is_missing_from_diagram
    port map(clk, data, q);

end architecture str;
--------------------------------------------------------------------------------

When running this through Doxygen with HAVE_DOT and OPTIMIZE_FOR_VHDL set to yes, the class hierarchy diagram is missing. If I replace the slv_vector type with another type such as integer, everything is fine. I would guess that the definition for arrays of unconstrained elements is missing from the vhdlparser.y file, but I am no Flex and Bison wizard, so I thought I'd ask here first before diving into the code again.

--
Trond Danielsen <tro...@no...>
Project Engineer
Norbit Subsea AS <http://www.norbit.no>
+47 404 94 711

Email written on eco-friendly reusable memory
From: Slaughter, A. E <and...@in...> - 2013-08-21 20:56:51
I am working on changes to Doxygen that my boss would like involving groups, and I was hoping someone could point me to the correct spot in the code to start implementing one of my changes: breaking the documentation into two sections, a general section (no groups) and advanced sections.

I am having problems with the \name command and inheritance. Referring to the code snippet at the bottom of this message, the output for BaseClass is correct: it combines the two "Advanced Public Member Functions" groups under a single heading. But the additional advanced method added by TestClass is placed under another heading (with the exact same name). I would like to modify Doxygen to pick up on this and put everything under the same heading. Is this something that is feasible? If so, can someone please point me in the right direction within the source code. I forked the repository and have been playing around a bit with the code. I will submit a pull request when I get this up and running.

Thanks,
Andrew

--------------------------------------------------------------------------------
/**
 * A Base Class
 */
class BaseClass
{
public:
  BaseClass();

  /** General Base Method */
  virtual void generalMethod();

  /** \name Advanced Public Member Functions */
  ///@{
  /** An advanced base method */
  void advancedBaseMethod();
  ///@}

  /** \name Advanced Public Member Functions */
  ///@{
  void anotherAdvancedBaseMethod();
  ///@}

protected:
  /** A general use protected member */
  virtual void generalProtectedMethod();

  /** \name Advanced Protected Member Functions */
  ///@{
  /** An advanced protected method */
  virtual void advancedProtectedMethod();
  ///@}
};

/**
 * A Test class
 */
class TestClass : public BaseClass
{
public:
  TestClass();

  /** General method */
  void generalMethod();

  /** \name Advanced Public Member Functions */
  ///@{
  /** Advanced method */
  void advancedMethod();
  ///@}
};
From: Helmut G. <he...@su...> - 2013-08-04 20:11:17
Control: forwarded 718151 https://bugzilla.gnome.org/show_bug.cgi?id=701295
Control: tags 718151 fixed-upstream

On Sun, Aug 04, 2013 at 05:40:40PM +0200, Albert wrote:
> This looks a bit like Bug 701295 <https://bugzilla.gnome.org/show_bug.cgi?id=701295> - Doxygen 1.8.4 goes into an endless loop.
> Did you try it with the current git version as well?

Thanks for looking it up. The git version changes the relevant code in a way that fixes the observed behaviour.

Helmut
From: Albert <alb...@gm...> - 2013-08-04 15:40:48
This looks a bit like Bug 701295 <https://bugzilla.gnome.org/show_bug.cgi?id=701295> - Doxygen 1.8.4 goes into an endless loop. Did you try it with the current git version as well? If the problem is still present, please submit a bug in the bug tracker with a self-contained example (source + config file in a tar or zip) that allows us to reproduce the problem.

Albert

On Sun, Aug 4, 2013 at 2:31 PM, Helmut Grohne <he...@su...> wrote:
> Control: clone 718151 -1
> Control: reassign -1 doxygen 1.8.4-1
> Control: severity -1 normal
> Control: retitle -1 doxygen loops when passing "foo(>0)" to findParameterList
>
> On Sat, Aug 03, 2013 at 11:21:11PM +0200, Matthias Klose wrote:
> > fix it in libburn or disable building the docs. upstream did tell you that they didn't want to update that for newer doxygen versions. There is absolutely no reason to reassign this to doxygen.
>
> While the report was not overly helpful, part of the issue is with doxygen. I looked into the issue and pulled in dox...@li... for further assistance.
>
> So the issue at hand is that doxygen does not terminate when building the documentation for libburn. On interrupting doxygen after leaving it running for some time you will see a backtrace like this:
>
> #0  findParameterList (name=...) at util.cpp:1848
> #1  0x00000000005f894f in resolveRef (scName=0x0, name=0x1657d30 "burn_abort(>0)", inSeeBlock=false, resContext=0x7fff9a4e8c40, resMember=0x7fff9a4e8c48, lookForSpecialization=true, currentFile=0x1603fc0, checkScope=true) at util.cpp:4363
> #2  0x00000000006a608c in handleLinkedWord (parent=0x1c8f4c0, children=...) at docparser.cpp:1030
> #3  0x00000000006b832e in DocPara::parse (this=0x1c8f480) at docparser.cpp:6311
> #4  0x00000000006b257e in DocSimpleSect::parse (this=0x1c8f410, userTitle=false, needsSeparator=false) at docparser.cpp:4570
> #5  0x00000000006b3436 in DocPara::handleSimpleSection (this=0x16594c0, t=DocSimpleSect::Since, xmlContext=false) at docparser.cpp:4887
> #6  0x00000000006b59bf in DocPara::handleCommand (this=0x16594c0, cmdName=...) at docparser.cpp:5399
> #7  0x00000000006b8a4e in DocPara::parse (this=0x16594c0) at docparser.cpp:6478
> #8  0x00000000006b9abb in DocRoot::parse (this=0x1651080) at docparser.cpp:6843
> #9  0x00000000006ba35b in validatingParseDoc (fileName=0x164f620 ".../libburn/libburn.h", startLine=3608, ctx=0x156b7e0, md=0x183a3a0, input=0x1653f80 " Either by setting an own handler or\nby activating the built-in signal handler.\n\nA function parameter handle of NULL activates the built-in abort handler. \nDepending on mode it may cancel all drive op"..., indexWords=true, isExample=false, exampleName=0x0, singleLine=false, linkFromIndex=false) at docparser.cpp:7085
> #10 0x0000000000574224 in OutputList::generateDoc (this=0x18790e0, fileName=0x164f620 ".../libburn/libburn.h", startLine=3608, ctx=0x156b7e0, md=0x183a3a0, docStr=..., indexWords=true, isExample=false, exampleName=0x0, singleLine=false, linkFromIndex=false) at outputlist.cpp:153
> #11 0x000000000055e4a0 in MemberDef::writeDocumentation (this=0x183a3a0, ml=0x17405b0, ol=..., scName=0x1641150 "libburn.h", container=0x1603fc0, inGroup=false, showEnumValues=false, showInline=false) at memberdef.cpp:2745
> #12 0x000000000056aff5 in MemberList::writeDocumentation (this=0x17405b0, ol=..., scopeName=0x1641150 "libburn.h", container=0x1603fc0, title=0x163d660 "Function Documentation", showEnumValues=false, showInline=false) at memberlist.cpp:655
> #13 0x0000000000446904 in FileDef::writeMemberDocumentation (this=0x1603fc0, ol=..., lt=MemberListType_docFuncMembers, title=...) at filedef.cpp:1742
> #14 0x000000000044263f in FileDef::writeDocumentation (this=0x1603fc0, ol=...) at filedef.cpp:685
> #15 0x000000000042364f in generateFileDocs () at doxygen.cpp:7842
> #16 0x0000000000430ab7 in generateOutput () at doxygen.cpp:11231
> #17 0x00000000004032c6 in main (argc=2, argv=0x7fff9a4e9818) at main.cpp:38
>
> Indeed the issue lies within findParameterList. The name parameter passed to resolveRef as a C string is passed on to findParameterList as a QString, but its value still represents "burn_abort(>0)". Given this value, findParameterList goes into an infinite "do { ... } while (...)" loop. The loop iterations alternate between templateDepth=0 and templateDepth=1. On the second iteration it will take the "then" branch of the outer "if" and will set nextOpenPos=-1 and nextClosePos=-1. This causes the inner "else" branch to be selected, setting pos=-2 and thus continuing the loop.
>
> A possible fix would be to change the loop condition from "while(pos != -1)" to "while(pos >= 0)".
>
> Is this analysis correct?
>
> As for the libburn maintainers, I suggest to change the comment in the header to not include the verbatim string "burn_abort(>0)" in order to not confuse doxygen. Yeah, it's a bug in doxygen that it loops, but you'll have to work around this one for the time being.
>
> Helmut
From: Helmut G. <he...@su...> - 2013-08-04 12:58:10
Control: clone 718151 -1
Control: reassign -1 doxygen 1.8.4-1
Control: severity -1 normal
Control: retitle -1 doxygen loops when passing "foo(>0)" to findParameterList

On Sat, Aug 03, 2013 at 11:21:11PM +0200, Matthias Klose wrote:
> fix it in libburn or disable building the docs. upstream did tell you that they didn't want to update that for newer doxygen versions. There is absolutely no reason to reassign this to doxygen.

While the report was not overly helpful, part of the issue is with doxygen. I looked into the issue and pulled in dox...@li... for further assistance.

So the issue at hand is that doxygen does not terminate when building the documentation for libburn. On interrupting doxygen after leaving it running for some time you will see a backtrace like this:

#0  findParameterList (name=...) at util.cpp:1848
#1  0x00000000005f894f in resolveRef (scName=0x0, name=0x1657d30 "burn_abort(>0)", inSeeBlock=false, resContext=0x7fff9a4e8c40, resMember=0x7fff9a4e8c48, lookForSpecialization=true, currentFile=0x1603fc0, checkScope=true) at util.cpp:4363
#2  0x00000000006a608c in handleLinkedWord (parent=0x1c8f4c0, children=...) at docparser.cpp:1030
#3  0x00000000006b832e in DocPara::parse (this=0x1c8f480) at docparser.cpp:6311
#4  0x00000000006b257e in DocSimpleSect::parse (this=0x1c8f410, userTitle=false, needsSeparator=false) at docparser.cpp:4570
#5  0x00000000006b3436 in DocPara::handleSimpleSection (this=0x16594c0, t=DocSimpleSect::Since, xmlContext=false) at docparser.cpp:4887
#6  0x00000000006b59bf in DocPara::handleCommand (this=0x16594c0, cmdName=...) at docparser.cpp:5399
#7  0x00000000006b8a4e in DocPara::parse (this=0x16594c0) at docparser.cpp:6478
#8  0x00000000006b9abb in DocRoot::parse (this=0x1651080) at docparser.cpp:6843
#9  0x00000000006ba35b in validatingParseDoc (fileName=0x164f620 ".../libburn/libburn.h", startLine=3608, ctx=0x156b7e0, md=0x183a3a0, input=0x1653f80 " Either by setting an own handler or\nby activating the built-in signal handler.\n\nA function parameter handle of NULL activates the built-in abort handler. \nDepending on mode it may cancel all drive op"..., indexWords=true, isExample=false, exampleName=0x0, singleLine=false, linkFromIndex=false) at docparser.cpp:7085
#10 0x0000000000574224 in OutputList::generateDoc (this=0x18790e0, fileName=0x164f620 ".../libburn/libburn.h", startLine=3608, ctx=0x156b7e0, md=0x183a3a0, docStr=..., indexWords=true, isExample=false, exampleName=0x0, singleLine=false, linkFromIndex=false) at outputlist.cpp:153
#11 0x000000000055e4a0 in MemberDef::writeDocumentation (this=0x183a3a0, ml=0x17405b0, ol=..., scName=0x1641150 "libburn.h", container=0x1603fc0, inGroup=false, showEnumValues=false, showInline=false) at memberdef.cpp:2745
#12 0x000000000056aff5 in MemberList::writeDocumentation (this=0x17405b0, ol=..., scopeName=0x1641150 "libburn.h", container=0x1603fc0, title=0x163d660 "Function Documentation", showEnumValues=false, showInline=false) at memberlist.cpp:655
#13 0x0000000000446904 in FileDef::writeMemberDocumentation (this=0x1603fc0, ol=..., lt=MemberListType_docFuncMembers, title=...) at filedef.cpp:1742
#14 0x000000000044263f in FileDef::writeDocumentation (this=0x1603fc0, ol=...) at filedef.cpp:685
#15 0x000000000042364f in generateFileDocs () at doxygen.cpp:7842
#16 0x0000000000430ab7 in generateOutput () at doxygen.cpp:11231
#17 0x00000000004032c6 in main (argc=2, argv=0x7fff9a4e9818) at main.cpp:38

Indeed the issue lies within findParameterList. The name parameter passed to resolveRef as a C string is passed on to findParameterList as a QString, but its value still represents "burn_abort(>0)". Given this value, findParameterList goes into an infinite "do { ... } while (...)" loop. The loop iterations alternate between templateDepth=0 and templateDepth=1. On the second iteration it will take the "then" branch of the outer "if" and will set nextOpenPos=-1 and nextClosePos=-1. This causes the inner "else" branch to be selected, setting pos=-2 and thus continuing the loop.

A possible fix would be to change the loop condition from "while(pos != -1)" to "while(pos >= 0)".

Is this analysis correct?

As for the libburn maintainers, I suggest to change the comment in the header to not include the verbatim string "burn_abort(>0)" in order to not confuse doxygen. Yeah, it's a bug in doxygen that it loops, but you'll have to work around this one for the time being.

Helmut
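[Editor's note: the failure mode described above can be sketched in a few lines. This is a hypothetical reduction, not doxygen's actual findParameterList code; it only demonstrates why a scan position of -2 escapes the `pos != -1` check, while the proposed `pos >= 0` condition terminates.]

```cpp
// Hypothetical reduction of the loop discussed above (not doxygen's real
// code).  Once neither '(' nor ')' is found, pos is set to -2; the
// original condition `pos != -1` then never becomes false, while the
// proposed `pos >= 0` exits immediately.
static int scanSteps(bool useFixedCondition)
{
    int pos = 0;    // scan position inside the name being parsed
    int steps = 0;  // iteration counter, doubles as a safety brake
    do
    {
        // Simulate the failing search: no further parenthesis is found.
        int nextOpenPos = -1;
        int nextClosePos = -1;
        if (nextOpenPos == -1 && nextClosePos == -1)
        {
            pos = -2;  // the state that defeats `pos != -1`
        }
        if (++steps > 100)
        {
            return steps;  // demo-only brake instead of hanging forever
        }
    } while (useFixedCondition ? (pos >= 0) : (pos != -1));
    return steps;
}
```

With the fixed condition the loop exits after a single iteration; with the original condition only the demo's safety brake stops it.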
From: Olaf M. <o.m...@me...> - 2013-07-30 13:03:29
Hello,

I added a patch to a bug report (#703791) that left a few open questions. Would it be possible to comment on / clarify these here?

1. When documenting classes out-of-line, is the following desired behaviour (note the missing space): \class A<a.h> ? It clashes with my proposed solution for template specialisation: \class A<double> .

2. Which function to call to get the canonical class name from within commentscan.l ?

Best regards,
Olaf Mandel
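[Editor's note: a small sketch of the ambiguity Olaf describes. The classes and the header name are hypothetical; only the doxygen comments matter, and they illustrate how the same "\class A<...>" spelling could be read two ways.]

```cpp
// Illustration of the two competing readings of "\class A<...>"
// discussed above.  The classes here are hypothetical examples.

/** \class A<a.h>
 *  Out-of-line documentation for class A declared in header a.h --
 *  with the space missing, "<a.h>" is read as a header-file name.
 */

/** \class A<double>
 *  With the proposed extension, the very same syntax would instead
 *  name the template specialisation A<double>.
 */
template <typename T>
class A
{
public:
    int tag() const { return 0; }  // primary template
};

template <>
class A<double>
{
public:
    int tag() const { return 1; }  // specialisation
};
```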
From: <Eck...@t-...> - 2013-07-03 16:30:08
Hello Dimitri.

Thank you for your effort. I already posted a reply to your bug comment at https://bugzilla.gnome.org/show_bug.cgi?id=700148 . I hope that the attached zip archive is helpful for you.

In the meantime I have learned an additional lesson associated with Perl filters. In my company we tried to use the Perl version that comes along with Matlab. But this was not working, since the Windows PATH variable did not contain the Perl directory (due to the kind of Matlab installation). Thus we configured the PERL_PATH available on the "external" tab of the doxywizard with its installation path, unfortunately with the same result. Yesterday I tried out in the doxywizard *.m=C:folder\subfolder\perl m2cpp.pl" where C:folder\subfolder\ stands for the installation folder of Perl. This call of Perl works.

I assume that the configuration of PERL_PATH is used only if doxygen wants to call Perl directly. If the user calls Perl through another kind of configuration (in this situation as a filter interpreter), doxygen does not recognize that Perl should be called and simply tries to start the defined command. Once you understand that, this behaviour is OK. But it may be a good idea to add some additional lines to the description of PERL_PATH to explain this and reduce the confusion of the user.

I hope the fact that I stress these filter things is not too annoying for you. But I hope you remember that I'm the developer of a tool that is able to create Nassi-Shneiderman diagrams and, since some weeks, UML activity diagrams to be used together with doxygen. Driven by my own needs I try to support not only C/C++ and Python but Matlab and Pascal as well. Thus for me it is very important how to use doxygen with these languages also.

Best regards,
Eckard.
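[Editor's note: a hedged sketch of the two Doxyfile settings being contrasted above; the Windows paths are placeholders, not the poster's actual installation.]

```
# Doxyfile fragment (paths are placeholders).
# PERL_PATH is only consulted when doxygen itself needs to run perl; it
# is NOT prepended to filter commands.
PERL_PATH       = C:/Perl/bin/perl.exe

# A filter command is started exactly as written, so the interpreter
# must be spelled out (or be found on PATH):
FILTER_PATTERNS = *.m="C:/Perl/bin/perl m2cpp.pl"
```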
From: Hervé S. <her...@gm...> - 2013-07-01 20:34:09
|
Hello, we use Doxygen 1.8.4 (HTML Generation for PHP Classes). My use case is : We use it for web-designers who don't need to have informations about namespaces, just classes and methods. So, I found "SHOW_NAMESPACES" in the config file. After the change, it's strange to have still namespaces Objects in Classes list, etc. I don't know if you think it's normal, but for me and mainly for users (web-designers) :), it's perturbing. I propose a patch of index.cpp, I use still SHOW_NAMESPACES option, but for "Classes List", etc... : --- index.cpp 2013-07-01 22:14:07.000000000 +0200 +++ index.cpp.new 2013-07-01 22:14:07.000000000 +0200 @@ -2100,7 +2100,12 @@ Doxygen::indexList->incContentsDepth(); } FTVHelp* ftv = new FTVHelp(FALSE); - writeNamespaceTree(Doxygen::namespaceSDict,ftv,TRUE,TRUE,addToIndex); + + if(Config_getBool("SHOW_NAMESPACES")) + { + writeNamespaceTree(Doxygen::namespaceSDict,ftv,TRUE,TRUE,addToIndex); + } + writeClassTree(Doxygen::classSDict,ftv,addToIndex,TRUE); QGString outStr; FTextStream t(&outStr); Thank you, Hervé Seignole Web Architect Manager |
From: Michael S. <ms...@re...> - 2013-06-19 10:32:59
On 11/06/13 20:05, Markus Geimer wrote:
>> The only other thing I can suggest to remedy the dependencies problem is to bundle all dependency source code with Doxygen and build it along with everything else, much like you already do for libmd5. This also would include any dependencies of dependencies, transitively. This doesn't seem like it would be too big of a deal except for QT, which is pretty huge. Thoughts?
>
> I strongly recommend to *not* create such a monster tarball because of various reasons:
>
> - Most importantly, it will become a maintenance nightmare. Someone needs to keep track of new versions of all the dependency packages, and based on the changelog decide whether an update is required for doxygen (e.g., when a security issue has been fixed) or not.
>
> - Additional burden is put onto the distro package maintainers, as they will most likely patch the sources to use the system-provided libraries instead of the included ones (this is, e.g., already the case for libpng on Debian and maybe other distros).
>
> - It will unnecessarily increase build times for 99% of the users which have the common libraries already installed (not such a big deal if parallel builds are working, but Qt will be a killer).

That's all very true. There is a sort of "compromise" between bundling stuff and not bundling stuff that is used by LibreOffice: we have dozens of external libraries that are bundled, and we can build all of these during the LibreOffice build, but they are not actually included in the git repo or in source tarballs. It works like this:

1. configure has, for most externals, a --with[out]-system-foo switch to determine if it should be bundled (defaults to bundled on Windows)
2. if configure says it's bundled, then a source tarball is automatically downloaded from somewhere.libreoffice.org
3. the tarball is unpacked, some patches are applied (most things don't build out of the box on all platforms), the external thing is built and its libraries are copied to a well-known library dir

Of course Doxygen has a lot fewer dependencies and so a much smaller problem, so I wouldn't recommend copying the approach exactly, but perhaps it can provide some inspiration. Most likely it's not necessary to build bundled libraries on Linux, because package managers allow to easily install stuff there; it's more of a problem for Mac and Windows.

>>>>>> Another area to consider is the standard C/C++ libraries. I have read the info behind the libraries, and found that this is where you could run into problems, as every std library is configured differently to make system calls to the Linux kernel.
>
> Yep, libstdc++ would really be an issue. Typically, you can build a binary against a version of libstdc++ and then run it using a newer one, but not vice versa. Therefore, the best thing to do is to always use the system-installed C++ library.

Yes, if you want to distribute binaries of Doxygen that are supposed to run on any old Linux system you need to think about this; it's not just libstdc++ but also glibc, or really any system library. For LibreOffice we build the upstream release Linux binaries on a RHEL5 system (very old versions of everything) for that reason; for OpenOffice.org a similar approach was used with an NFS-mounted baseline of libraries and the GCC --sysroot (which prevents it from looking in standard system dirs). But given that there's a Doxygen package for every distro anyway and it's a developer-focused tool (so users can be expected to build it from source if needed), I'm not sure why you'd bother to do this :)
From: Bastiaan V. <Bas...@SA...> - 2013-06-18 20:25:47
Hi,

I must admit that I haven't studied this in great detail, but if most binary dependencies are available for Windows from somewhere, so that cmake would work well for 90% of the dependencies, one possibility for the remaining 10% would be to distribute binaries on the doxygen site for the sole purpose that cmake can download them from there. What are the percentages actually? It may be an obvious thing to say, but I thought I'd bring it up nonetheless.

Bastiaan.

On 18-6-2013 19:17, Robert Dailey wrote:
> Sorry for the delay in response.
>
> So basically the problem is between Windows and Linux. On Windows, I do not have a package manager. It's a huge pain in the ass to download each dependency, as well as its dependencies (transitively) and build each individually. This takes sometimes days, and is impossible to do depending on what libs you use. If anyone tried to build SVN on Windows 5 years ago, they would know what I'm talking about.
>
> However on Linux, things are much easier in this respect. It's OK to let the user install the dependencies because they just type in apt-get and they are done.
>
> find_package() is ideally the cross-platform way of handling dependencies in CMake, but the problem is just that: fulfilling the dependencies, as I noted above, is rather painful on Windows. Not only that, but CMake doesn't come pre-packaged with find modules for each and every library known to man. For any libraries that CMake does not know how to search for, you must write the corresponding find package script. Granted it's a one time thing, but it adds the occasional extra maintenance.
>
> If we go with find_package in CMake, which is the most likely approach we will have to take, then we will be required to have *compiled* versions of dependencies on the system. Whether that means the package manager on that distro already downloads the binaries *or* the additional step of building them must be taken, is outside of the control of CMake. It'll have to be something that is taken care of externally (I think this is the case already, based on the feedback so far).
>
> Can anyone think of an easier way to set up dependencies on Windows? QT has an installer which is fine, and CMake also already has a find module for it that ships with CMake. What about the others? For example, if we depended on OpenSSL, it's rather painful to build on Windows outside of Cygwin (even with Cygwin, you still have to set up the environment to point to the correct compiler & stuff, much easier to run the Visual Studio bundled command prompt scripts). In those cases, getting up and running on Windows can still be a pain :(
>
> On Wed, Jun 12, 2013 at 1:10 AM, Torbjörn Klatt <ope...@to...> wrote:
>> Dear all
>>
>> On 12.06.2013 04:47, Anthony Foiani wrote:
>>> Robert, Markus, all --
>>>
>>> On Tue, Jun 11, 2013 at 12:05 PM, Markus Geimer <mg...@we...> wrote:
>>>> Robert,
>>>>
>>>>> The only other thing I can suggest to remedy the dependencies problem is to bundle all dependency source code with Doxygen and build it along with everything else, much like you already do for libmd5. This also would include any dependencies of dependencies, transitively. This doesn't seem like it would be too big of a deal except for QT, which is pretty huge. Thoughts?
>>
>> Only ever do this for libraries which do not provide pre-built Windows binaries or where most linux distributions do not provide packages (which is very very rare). And even then you really need to know what you are doing and should prefer to let the user resolve these dependencies on his own. He/she knows the system far better than you ever do via CMake.
>>
>> CMake's purpose is exactly to overcome the need to create such monstrosities of bundles and maintenance nightmares.
>>
>>>> I strongly recommend to *not* create such a monster tarball because of various reasons:
>>>
>>> I very strongly concur.
>>>
>>> Robert's proposal (create private copies of everything) was the route taken by Chromium. Getting it accepted into Fedora has been delayed for years because the original packagers did not use the libraries already available on the platform.
>>>
>>> (Not to mention the fact that, with the "monster tarball" approach, you also need to keep up with security patches for all of those programs you copied...)
>>>
>>> More info on this particular case here: http://ostatic.com/blog/making-projects-easier-to-package-why-chromium-isnt-in-fedora
>>>
>>>> My recommendation would be to stick to proven best practices: Let the configure step try to determine whether an appropriate version of a dependency library is installed on the system, use it if available, and complain (or disable optional functionality) otherwise.
>>>
>>> +1.
>>>
>>> I can't believe that CMake doesn't have this capability already, even if it's in the guise of "is this symbol available from any known library on the system?"
>>
>> CMake does have such possibilities. It's called
>>
>> FIND_PACKAGE(mypackage REQUIRED)
>>
>> This will abort the whole configuration step if 'mypackage' is not available on the system (read: not found in system's paths).
>>
>> E.g. to require the Boost system and regex libraries and headers of version 1.50 and later, one writes
>>
>> FIND_PACKAGE(Boost 1.50 REQUIRED COMPONENTS system regex)
>>
>> For a list of somehow inbuilt supported packages to search for, see [1] and all items starting with 'Find'. In case the desired package is not in this list, one usually finds such via Google (or GitHub) easily.
>>
>> If you ever encounter a library where no FindX module is available, you should try detecting existence on the system via the low-level CMake commands FIND_LIBRARY and FIND_PROGRAM, which are used by the FindX modules.
>>
>> [1] http://www.cmake.org/cmake/help/v2.8.11/cmake.html#section_StandardCMakeModules
>>
>>> Either way, I do wish Robert luck; CMake has the potential to improve cross-platform projects quite a bit, but I agree with Markus that bringing in tarballs of dependencies is not the right way.
>>
>> I sign this.
>>
>>> Best regards, Anthony Foiani
>>
>> Cheers,
>> Torbjörn Klatt
From: Robert D. <rcd...@gm...> - 2013-06-18 17:17:20
Sorry for the delay in response.

So basically the problem is between Windows and Linux. On Windows, I do not have a package manager. It's a huge pain in the ass to download each dependency, as well as its dependencies (transitively) and build each individually. This takes sometimes days, and is impossible to do depending on what libs you use. If anyone tried to build SVN on Windows 5 years ago, they would know what I'm talking about.

However on Linux, things are much easier in this respect. It's OK to let the user install the dependencies because they just type in apt-get and they are done.

find_package() is ideally the cross-platform way of handling dependencies in CMake, but the problem is just that: fulfilling the dependencies, as I noted above, is rather painful on Windows. Not only that, but CMake doesn't come pre-packaged with find modules for each and every library known to man. For any libraries that CMake does not know how to search for, you must write the corresponding find package script. Granted it's a one time thing, but it adds the occasional extra maintenance.

If we go with find_package in CMake, which is the most likely approach we will have to take, then we will be required to have *compiled* versions of dependencies on the system. Whether that means the package manager on that distro already downloads the binaries *or* the additional step of building them must be taken, is outside of the control of CMake. It'll have to be something that is taken care of externally (I think this is the case already, based on the feedback so far).

Can anyone think of an easier way to set up dependencies on Windows? QT has an installer which is fine, and CMake also already has a find module for it that ships with CMake. What about the others? For example, if we depended on OpenSSL, it's rather painful to build on Windows outside of Cygwin (even with Cygwin, you still have to set up the environment to point to the correct compiler & stuff, much easier to run the Visual Studio bundled command prompt scripts). In those cases, getting up and running on Windows can still be a pain :(

On Wed, Jun 12, 2013 at 1:10 AM, Torbjörn Klatt <ope...@to...> wrote:
> Dear all
>
> On 12.06.2013 04:47, Anthony Foiani wrote:
>> Robert, Markus, all --
>>
>> On Tue, Jun 11, 2013 at 12:05 PM, Markus Geimer <mg...@we...> wrote:
>>> Robert,
>>>
>>>> The only other thing I can suggest to remedy the dependencies problem is to bundle all dependency source code with Doxygen and build it along with everything else, much like you already do for libmd5. This also would include any dependencies of dependencies, transitively. This doesn't seem like it would be too big of a deal except for QT, which is pretty huge. Thoughts?
>
> Only ever do this for libraries which do not provide pre-built Windows binaries or where most linux distributions do not provide packages (which is very very rare). And even then you really need to know what you are doing and should prefer to let the user resolve these dependencies on his own. He/she knows the system far better than you ever do via CMake.
>
> CMake's purpose is exactly to overcome the need to create such monstrosities of bundles and maintenance nightmares.
>
>>> I strongly recommend to *not* create such a monster tarball because of various reasons:
>>
>> I very strongly concur.
>>
>> Robert's proposal (create private copies of everything) was the route taken by Chromium. Getting it accepted into Fedora has been delayed for years because the original packagers did not use the libraries already available on the platform.
>>
>> (Not to mention the fact that, with the "monster tarball" approach, you also need to keep up with security patches for all of those programs you copied...)
>>
>> More info on this particular case here: http://ostatic.com/blog/making-projects-easier-to-package-why-chromium-isnt-in-fedora
>>
>>> My recommendation would be to stick to proven best practices: Let the configure step try to determine whether an appropriate version of a dependency library is installed on the system, use it if available, and complain (or disable optional functionality) otherwise.
>>
>> +1.
>>
>> I can't believe that CMake doesn't have this capability already, even if it's in the guise of "is this symbol available from any known library on the system?"
>
> CMake does have such possibilities. It's called
>
> FIND_PACKAGE(mypackage REQUIRED)
>
> This will abort the whole configuration step if 'mypackage' is not available on the system (read: not found in system's paths).
>
> E.g. to require the Boost system and regex libraries and headers of version 1.50 and later, one writes
>
> FIND_PACKAGE(Boost 1.50 REQUIRED COMPONENTS system regex)
>
> For a list of somehow inbuilt supported packages to search for, see [1] and all items starting with 'Find'. In case the desired package is not in this list, one usually finds such via Google (or GitHub) easily.
>
> If you ever encounter a library where no FindX module is available, you should try detecting existence on the system via the low-level CMake commands FIND_LIBRARY and FIND_PROGRAM, which are used by the FindX modules.
>
> [1] http://www.cmake.org/cmake/help/v2.8.11/cmake.html#section_StandardCMakeModules
>
>> Either way, I do wish Robert luck; CMake has the potential to improve cross-platform projects quite a bit, but I agree with Markus that bringing in tarballs of dependencies is not the right way.
>
> I sign this.
> >> Best regards, Anthony Foiani > > Cheers, > Torbjörn Klatt |
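Robert's find_package() concern above — that someone still has to put compiled dependencies where CMake can see them — can be sketched in a few lines. This is only an illustration: LibXml2 stands in for an arbitrary dependency and is not necessarily one Doxygen uses.

```cmake
# Hypothetical consumer of a pre-built dependency. On Windows one would
# point CMake at the unpacked binaries, e.g.:
#   cmake -DCMAKE_PREFIX_PATH=C:/deps ..
cmake_minimum_required(VERSION 2.8)
project(example C)

# FindLibXml2 ships with CMake; REQUIRED makes the configure step fail
# if the library cannot be located in the prefix/system paths.
find_package(LibXml2 REQUIRED)

include_directories(${LIBXML2_INCLUDE_DIR})
add_executable(app main.c)
target_link_libraries(app ${LIBXML2_LIBRARIES})
```

The key piece for the Windows workflow is `CMAKE_PREFIX_PATH`: however the binaries got onto the machine (installer, manual build, downloaded archive), one cache variable tells every find module where to look.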
From: <Eck...@t-...> - 2013-06-17 17:42:48
|
Hello Dimitri, In the meantime I updated my doxygen installation to 1.8.4 and tried out the behaviour of the FILTER_PATTERNS configuration. It is still the same as described in the bug report [Bug 700148] "FILTER_PATTERNS is working with additional " at the end". It doesn't matter whether the filter is a script for which I also have to specify the script interpreter (first example, a Perl-based filter for MATLAB) or a standalone exe file only (second example, an exe filter for Pascal). You mentioned that using a batch script is the solution. Could you use the two examples I added to my bug report to create examples of how to use both filters with a Windows batch file? Will this batch file also be usable for INPUT_FILTER? Best regards, Eckard Klotz. |
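For what it's worth, the batch-file workaround referred to above is usually a thin wrapper that bakes the interpreter invocation into a single program Doxygen can call. The file names and paths below are invented for illustration; the real script is the one attached to the bug report.

```
# Doxyfile (sketch; hypothetical wrapper path)
FILTER_PATTERNS = *.m="C:\filters\mfilter.bat"
```

```bat
@echo off
rem C:\filters\mfilter.bat - wraps "perl mfilter.pl <file>" so that
rem Doxygen only has to start one executable, with no embedded quotes.
perl.exe C:\filters\mfilter.pl %1
```

The same wrapper should work for INPUT_FILTER as well, since both options simply name a command that receives a file name argument and writes the filtered source to stdout.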
From: Torbjörn K. <ope...@to...> - 2013-06-12 06:29:40
|
Dear all, On 12.06.2013 04:47, Anthony Foiani wrote: > Robert, Markus, all -- > > On Tue, Jun 11, 2013 at 12:05 PM, Markus Geimer <mg...@we...> > wrote: >> Robert, >> >>> The only other thing I can suggest to remedy the dependencies >>> problem is to bundle all dependency source code with Doxygen >>> and build it along with everything else, much like you already >>> do for libmd5. This also would include any dependencies of >>> dependencies, transitively. This doesn't seem like it would be >>> too big of a deal except for QT, which is pretty huge. >>> Thoughts? Only ever do this for libraries which do not provide pre-built Windows binaries or where most Linux distributions do not provide packages (which is very rare). And even then you really need to know what you are doing, and you should prefer to let the user resolve these dependencies on their own. They know the system far better than you ever will via CMake. CMake's purpose is exactly to overcome the need to create such monstrosities of bundles and maintenance nightmares. >> >> I strongly recommend to *not* create such a monster tarball >> because of various reasons: > > I very strongly concur. > > Robert's proposal (create private copies of everything) was the > route taken by Chromium. Getting it accepted into Fedora has been > delayed for years because the original packagers did not use the > libraries already available on the platform. > > (Not to mention the fact that, with the "monster tarball" > approach, you also need to keep up with security patches for all of > those programs you copied...) 
> > More info on this particular case here: > http://ostatic.com/blog/making-projects-easier-to-package-why-chromium-isnt-in-fedora > > >> My recommendation would be to stick to proven best practices: Let >> the configure step try to determine whether an appropriate >> version of a dependency library is installed on the system, use >> it if available, and complain (or disable optional functionality) >> otherwise. > > +1. > > I can't believe that CMake doesn't have this capability already, > even if it's in the guise of "is this symbol available from any > known library on the system?" CMake does have such possibilities. It's called FIND_PACKAGE(mypackage REQUIRED) This will abort the whole configuration step if 'mypackage' is not available on the system (read: not found in the system's paths). E.g., to require the Boost system and regex libraries and headers of version 1.50 or later, one writes FIND_PACKAGE(Boost 1.50 REQUIRED COMPONENTS system regex) For a list of packages with built-in support, see [1] — all items starting with 'Find'. In case the desired package is not in this list, one usually finds a module easily via Google (or GitHub). If you ever encounter a library for which no FindX module is available, you should try detecting its existence on the system via the low-level CMake commands FIND_LIBRARY and FIND_PROGRAM, which are what the FindX modules themselves use. [1] http://www.cmake.org/cmake/help/v2.8.11/cmake.html#section_StandardCMakeModules > > Either way, I do wish Robert luck; CMake has the potential to > improve cross-platform projects quite a bit, but I agree with > Markus that bringing in tarballs of dependencies is not the right > way. I sign this. 
> > Best regards, Anthony Foiani Cheers, Torbjörn Klatt -----BEGIN PGP SIGNATURE----- Version: GnuPG v2.0.18 (GNU/Linux) Comment: Using GnuPG with Thunderbird - http://www.enigmail.net/ iQEcBAEBAgAGBQJRuBDpAAoJENyw9v81DsTG5PwIAKSjp8gClLd5gOGF8OIeo0QO lzc8G7Kvn6wwSpklJqiVKFE3FA3y3y1bnnOOj4nSCycLgJFUFUhaWekCYHYM0yWJ CCjj2EExFGs6J1gKkopInEmEhj5REhKJwTZzZjs88kGqVCbiZgE6LWkDTJd4PjE7 dSl+sLdJZRWyadzgyDpG6kWBs0eE0EKtT+Nql94nXrWTP1nDoSp/fQZevpyBP5++ fq2D9UufgpBkossXfEQN2MxwzEZduX4MmvUBd5PBKwleI+PXQAM86l+t09A5Hpiu I28+YUQQYStFD9ztgu91ps/Vs7dJxzlAvqs56PcdgNCnZTdjwt9JML8IjNzUiW8= =UNdQ -----END PGP SIGNATURE----- |
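The last point in the message above — building your own FindX module out of the low-level search commands — typically only takes a handful of lines. A minimal sketch for a hypothetical library "foo" (all names invented for illustration):

```cmake
# FindFoo.cmake - locate a hypothetical "foo" library the way the
# stock Find modules do, using the low-level search commands.
find_path(FOO_INCLUDE_DIR foo.h)        # header search
find_library(FOO_LIBRARY NAMES foo)     # libfoo.so / foo.lib search

include(FindPackageHandleStandardArgs)
# Sets FOO_FOUND, prints a status line, and errors out when the caller
# used find_package(Foo REQUIRED) and something is missing.
find_package_handle_standard_args(Foo DEFAULT_MSG
                                  FOO_LIBRARY FOO_INCLUDE_DIR)

mark_as_advanced(FOO_INCLUDE_DIR FOO_LIBRARY)
```

Dropped into a directory on the project's CMAKE_MODULE_PATH, this makes `find_package(Foo)` behave like any built-in module.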
From: Anthony F. <ant...@gm...> - 2013-06-12 02:48:04
|
Robert, Markus, all -- On Tue, Jun 11, 2013 at 12:05 PM, Markus Geimer <mg...@we...> wrote: > Robert, > >> The only other thing I can suggest to remedy the dependencies problem >> is to bundle all dependency source code with Doxygen and build it >> along with everything else, much like you already do for libmd5. This >> also would include any dependencies of dependencies, transitively. >> This doesn't seem like it would be too big of a deal except for QT, >> which is pretty huge. Thoughts? > > I strongly recommend to *not* create such a monster tarball because of > various reasons: I very strongly concur. Robert's proposal (create private copies of everything) was the route taken by Chromium. Getting it accepted into Fedora has been delayed for years because the original packagers did not use the libraries already available on the platform. (Not to mention the fact that, with the "monster tarball" approach, you also need to keep up with security patches for all of those programs you copied...) More info on this particular case here: http://ostatic.com/blog/making-projects-easier-to-package-why-chromium-isnt-in-fedora > My recommendation would be to stick to proven best practices: Let the > configure step try to determine whether an appropriate version of a > dependency library is installed on the system, use it if available, > and complain (or disable optional functionality) otherwise. +1. I can't believe that CMake doesn't have this capability already, even if it's in the guise of "is this symbol available from any known library on the system?" Either way, I do wish Robert luck; CMake has the potential to improve cross-platform projects quite a bit, but I agree with Markus that bringing in tarballs of dependencies is not the right way. Best regards, Anthony Foiani |
From: Markus G. <mg...@we...> - 2013-06-11 18:06:31
|
Robert, > The only other thing I can suggest to remedy the dependencies problem > is to bundle all dependency source code with Doxygen and build it > along with everything else, much like you already do for libmd5. This > also would include any dependencies of dependencies, transitively. > This doesn't seem like it would be too big of a deal except for QT, > which is pretty huge. Thoughts? I strongly recommend to *not* create such a monster tarball because of various reasons: - Most importantly, it will become a maintenance nightmare. Someone needs to keep track of new versions of all the dependency packages and, based on the changelog, decide whether an update is required for doxygen (e.g., when a security issue has been fixed) or not. - An additional burden is put onto the distro package maintainers, as they will most likely patch the sources to use the system-provided libraries instead of the included ones (this is, e.g., already the case for libpng on Debian and maybe other distros). - It will unnecessarily increase build times for the 99% of users who have the common libraries already installed (not such a big deal if parallel builds are working, but Qt will be a killer). >>>> I know that most linux distros have very different package managers. >>>> Syntax is different, and some download binaries while others download >>>> source. The source case will be difficult, as this custom script will >>>> not only need to invoke the platform's package manager to download the >>>> source but also build each package the moment it is downloaded in some >>>> uniform way. Does this seem feasible? Builds are normally done as an ordinary user, while installation of packages is done as root. So trying to install packages using a package manager at doxygen's build time sounds crazy to me. >>>>> Another area to consider is the standard C/C++ libraries. 
>>>>> I have read the info behind the libraries, and found that this is where you >>>>> could run into problems, as every std library is configured differently >>>>> to make system calls to the Linux kernel. Yep, libstdc++ would really be an issue. Typically, you can build a binary against a version of libstdc++ and then run it using a newer one, but not vice versa. Therefore, the best thing to do is to always use the system-installed C++ library. >>>>> It is much better to compile dependencies during the actual building >>>>> process (like the QTools package in doxygen's source code). Hmm... I disagree. Often, the dependencies are used by multiple packages, so you don't want to compile/install them multiple times. And as soon as they are shared libraries, you might end up in a big mess... My recommendation would be to stick to proven best practices: Let the configure step try to determine whether an appropriate version of a dependency library is installed on the system, use it if available, and complain (or disable optional functionality) otherwise. Of course, there should also be a way for the user to specify an alternate installation path to override the auto-detection, and a list of all dependencies including URLs in the INSTALL file (which I think is already the case). Best, Markus |
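The "use it if available, otherwise complain or disable the optional functionality" policy recommended above maps directly onto a non-REQUIRED find_package() call. A sketch, using Qt and the doxywizard GUI as the optional piece (the version and directory name are illustrative, not taken from Doxygen's actual build files):

```cmake
# Optional-feature pattern: look for Qt4 but do not abort when absent.
find_package(Qt4 4.6 COMPONENTS QtCore QtGui)

if(QT4_FOUND)
  add_subdirectory(addon/doxywizard)  # build the optional GUI front-end
else()
  message(STATUS "Qt4 not found - doxywizard will not be built")
endif()
```

The override path Markus asks for comes for free: a user can seed the cache to steer auto-detection, e.g. `cmake -DQT_QMAKE_EXECUTABLE=/opt/qt4/bin/qmake ..`.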
From: Robert D. <rcd...@gm...> - 2013-06-10 17:13:30
|
The only other thing I can suggest to remedy the dependencies problem is to bundle all dependency source code with Doxygen and build it along with everything else, much like you already do for libmd5. This also would include any dependencies of dependencies, transitively. This doesn't seem like it would be too big of a deal except for QT, which is pretty huge. Thoughts? On Sun, Jun 9, 2013 at 4:52 PM, Dimitri van Heesch <do...@gm...> wrote: > Hi Robert, > > On Jun 9, 2013, at 23:13 , Robert Dailey <rcd...@gm...> wrote: > >> The problem with depending on linux package managers is that you won't >> be able to use those libraries on Windows or MacOS. So, as annoying as >> it is, maintaining the packages yourself is the most portable >> solution. > > Then CMake doesn't offer a solution for those. So it seems the solution > will mostly be for Windows (MacOS can be treated as a Unix flavor as well). > > So maybe we should rethink what the problem is that we are trying to > solve and if CMake is indeed the solution, or if adding some extra rules > to the Doxygen.vcproj file will do just as well. > >> >> What linux platforms/distros does Doxygen need to support? > > Well there are many Linux distro's that now bundle doxygen > (Ubuntu, Fedora, RedHat, Mint, Debian, Arch, Gentoo, etc.) and > then there is BSD (Free, Net, Open, DragonFly flavors) and Solaris (Sparc/x86) > and maybe some more exotic Unix flavors. > > I do not provide packages for any of those, but package maintainers > should be able to build doxygen using the other packages. > >> Which third party libraries are we referring to? > > For building: perl, python, flex, bison, sed > For compiling: Qt4 (for Doxywizard), Xapian (for doxysearch), libclang (for clang support), libmd5 (now bundled with doxygen). > Install time: LaTeX for generating the manual. > Runtime (optional): Graphviz, mscgen, bibtex, epstopdf, dvips > > Note that a number of these package depend on other packages themselves. 
> > Regards, > Dimitri > >> >> On Sun, Jun 9, 2013 at 2:18 PM, Kevin McBride <dol...@ai...> wrote: >>> Robert, >>> >>> That sounds like a good idea. I will leave the final decision to Dimitri, >>> as he is the one who will have to commit the changes to GIT. >>> >>> >>> Kevin McBride >>> dol...@ai... >>> >>> >>> -----Original Message----- >>> From: Robert Dailey <rcd...@gm...> >>> To: Kevin McBride <dol...@ai...> >>> Cc: Doxygen <do...@gm...>; Doxygen Developers >>> <dox...@li...> >>> Sent: Sun, Jun 9, 2013 1:45 pm >>> Subject: Re: [Doxygen-develop] CMake >>> >>> I think what I will do is redesign my framework to take a hybrid >>> approach. I have a manifest file where you define the third party >>> libraries plus their versions. I will change this so you define >>> whether or not to download from a custom repository or download from >>> package manager. In the latter case, you will need to define a script >>> that takes a library name and version. This script will execute the >>> platform's package manager to make sure that the appropriate includes >>> and binaries are downloaded (or compiled) as part of the process CMake >>> goes through to prepare third party libs for usage. >>> >>> I know that most linux distros have very different package managers. >>> Syntax is different, and some download binaries while others download >>> source. The source case will be difficult, as this custom script will >>> not only need to invoke the platform's package manager to download the >>> source but also build each package the moment it is downloaded in some >>> uniform way. Does this seem feasible? >>> >>> On Sat, Jun 8, 2013 at 6:42 PM, Kevin McBride <dol...@ai...> wrote: >>>> >>>> Another area to consider is the standard C/C++ libraries. 
I have >>> >>> read the >>>> >>>> info behind the libraries, and found that this is where you could run >>> >>> into >>>> >>>> problems, as every std library is configured differently to make >>> >>> system >>>> >>>> calls to the Linux kernel. It is much better to compile dependencies >>> >>> during >>>> >>>> the actual building process (like the QTools package in doxygen's >>> >>> source >>>> >>>> code). >>>> >>>> When I used to compile RPMs (I was the one who came up with the `make >>> >>> rpm' >>>> >>>> command to the makefiles of Doxygen) I compiled under the Fedora >>> >>> distro, >>>> >>>> which had a dynamic libpng installed. I always had the linking >>> >>> process >>>> >>>> dynamically link to the libpng in the Fedora distro. The libpng >>> >>> dynamic >>>> >>>> linking enhancement was not included in the master repository because >>>> Dimitri and I found only a small speed difference when using libpng >>> >>> that >>>> >>>> some distros provided. >>>> >>>> Can cmake do a configure process just like what is currently done to >>> >>> compile >>>> >>>> the doxygen source code? I really do think it is best to link >>> >>> dynamically >>>> >>>> to the common libraries that almost every distro has. You run into >>> >>> less >>>> >>>> problems this way, especially considering the example that the binary >>> >>> form >>>> >>>> of the standard C/C++ libraries do differ from distro to distro. >>>> >>>> >>>> Kevin McBride >>>> dol...@ai... >>>> >>>> >>>> -----Original Message----- >>>> From: Robert Dailey <rcd...@gm...> >>>> To: Kevin McBride <dol...@ai...> >>>> Cc: Doxygen <do...@gm...>; Doxygen Developers >>>> <dox...@li...> >>>> Sent: Sat, Jun 8, 2013 4:43 pm >>>> Subject: Re: [Doxygen-develop] CMake >>>> >>>> Suppose you have a library on linux called D. 
The dependency tree is >>> >>> as >>>> >>>> follows: >>>> >>>> D -> B, C >>>> B -> A >>>> >>>> So basically: D depends on libraries B & C, while library B depends on >>>> library A >>>> >>>> In this case, you'd compile all 4 libraries on your target toolchain >>>> and architecture (GCC + Intel) and put that in your CMake repository. >>>> Would these 4 binaries not be usable on multiple distros? As long as >>>> the kernel is the same (or at least compatible), it should work fine. >>>> The only edge case I can think of is if the kernel is vastly different >>>> on each distro, meaning things like the memory manager changes and >>>> thus libraries would need to be recompiled for each kernel. >>>> >>>> On Sat, Jun 8, 2013 at 3:10 PM, Kevin McBride <dol...@ai...> >>> >>> wrote: >>>>> >>>>> >>>>> Robert, >>>>> >>>>> Unlike Windows, Linux is written in a way that allows many different >>>>> "distros" to be written. Some people prefer not to compile packages >>>> >>>> >>>> from >>>>> >>>>> >>>>> sources (Fedora is good for these people). Others that prefer to >>>> >>>> >>>> compile >>>>> >>>>> >>>>> from sources would typically use a distro that is more friendly to >>>>> developers. >>>>> >>>>> With such differences, it is not wise to use one binary for all Linux >>>>> distros. In fact, new stuff that gets compiled would be replaced >>>> >>>> >>>> with old >>>>> >>>>> >>>>> dependencies, resulting in chaos for developers as they try to stick >>>> >>>> >>>> with >>>>> >>>>> >>>>> their preferred versions of libraries and programs they have >>> >>> compiled. >>>>> >>>>> >>>>> I once thought the autotools were good for doxygen, but the autotools >>>> >>>> >>>> are >>>>> >>>>> >>>>> good only on *nix platforms. They do not perform well on Windows. >>>>> >>>>> Hope this brief explanation about Linux helps. >>>>> >>>>> Kevin McBride >>>>> dol...@ai... 
>>>>> >>>>> >>>>> >>>>> -----Original Message----- >>>>> From: Robert Dailey <rcd...@gm...> >>>>> To: Dimitri van Heesch <do...@gm...> >>>>> Cc: Doxygen Developers <dox...@li...> >>>>> Sent: Sat, Jun 8, 2013 3:46 pm >>>>> Subject: Re: [Doxygen-develop] CMake >>>>> >>>>> I'd have to rewrite the framework to handle the special case package >>>>> handling, which would be significant work for what might be little >>>>> gain. The system would have to be used for all platforms as it >>>>> currently is. I'm not a Unix developer, so I'm not sure why Unix >>> >>> would >>>>> >>>>> be more difficult than windows. For example, pretty much every linux >>>>> distro I know supports GCC with Intel architecture. Even if you need >>>>> to support GCC + ARM, couldn't you easily maintain 2 sets of packages >>>>> on Unix (That would be: GCC+x86 and GCC+ARM)? >>>>> >>>>> Another reason why I prefer this approach on linux is because when >>> >>> you >>>>> >>>>> download libs through package managers on Linux, your build system >>>>> can't really control what version you have. With this approach, we >>> >>> are >>>>> >>>>> in control of the packages, so we can guarantee the versions of our >>>>> libs we use are those we have tested with. When we upgrade a library, >>>>> we can perform regression testing and update the package system on >>> >>> the >>>>> >>>>> CMake side to use the new version. >>>>> >>>>> At this point I'd just like a bit of education on where the >>> >>> complexity >>>>> >>>>> lies on the unix side. Teach me a little and I might be able to come >>>>> up with some ideas for you :) >>>>> >>>>> Thanks. 
>>>>> >>>>> On Sat, Jun 8, 2013 at 4:59 AM, Dimitri van Heesch >>> >>> <do...@gm...> >>>>> >>>>> wrote: >>>>>> >>>>>> >>>>>> >>>>>> Hi Robert, >>>>>> >>>>>> On Jun 7, 2013, at 2:21 , Robert Dailey <rcd...@gm...> >>>>> >>>>> >>>>> >>>>> wrote: >>>>>> >>>>>> >>>>>> >>>>>> >>>>>>> On Thu, Jun 6, 2013 at 4:03 PM, Robert Dailey >>>>> >>>>> >>>>> >>>>> <rcd...@gm...> wrote: >>>>>>>> >>>>>>>> >>>>>>>> >>>>>>>> Starting a new discussion thread here in the dev mailing list for >>>>>>>> CMake support. I'll be working on this over on my github fork: >>>>>>>> https://github.com/rcdailey/doxygen >>>>>>>> >>>>>>>> I'll be spending my spare time on this so please forgive any slow >>>>> >>>>> >>>>> >>>>> progress :) >>>>>>> >>>>>>> >>>>>>> >>>>>>> >>>>>>> Concerning third party dependencies that you do not build yourself >>>> >>>> >>>> as >>>>>>> >>>>>>> >>>>>>> part of your make command, would you be able to maintain your own >>>>>>> binaries for these in a repository? >>>>>>> >>>>>>> I already have a CMake framework that I can drop into doxygen and >>>>> >>>>> >>>>> >>>>> use. >>>>>>> >>>>>>> >>>>>>> >>>>>>> To be the most platform agnostic, I have set it up to download >>>>>>> archives of precompiled binaries for third party libraries from an >>>>>>> FTP/HTTP/Windows file share of your choosing (configurable in CMake >>>>>>> cache). Basically for every platform or toolchain you plan to build >>>>>>> doxygen on or with, you will need to have include files + binaries >>>> >>>> >>>> in >>>>>>> >>>>>>> >>>>>>> an archive. Those will sit in a repository and the CMake scripts >>>> >>>> >>>> will >>>>>>> >>>>>>> >>>>>>> download them, extract them, and automatically setup include >>>>>>> directories and dependencies for you. >>>>>>> >>>>>>> There are a couple of benefits to having this approach: >>>>>>> 1. No need to search for these libraries on the system. 
The CMake >>>>>>> scripts will always be able to guarantee that they are on the >>>> >>>> >>>> system >>>>>>> >>>>>>> >>>>>>> since it will be downloading them from a repository you maintain. >>>>>>> 2. Easier for new developers to just pick up the code and start >>>>>>> building, since they do not have to spend time building libraries. >>>>>>> 3. Windows doesn't have an apt-get (unless you use cygwin, which is >>>>>>> just another dependency to worry about) mechanism, so this makes up >>>>>>> for that and makes it super easy to get a build started on Windows. >>>>>>> >>>>>>> The downside, of course, is that this can become a maintenance >>>>> >>>>> >>>>> >>>>> problem >>>>>>> >>>>>>> >>>>>>> >>>>>>> if you have a ton of libraries and/or platforms or toolchains to >>>>>>> support. >>>>>>> >>>>>>> Let me know how you want to approach this, as it will deeply impact >>>>>>> the work. Personally I suggest we take this approach, assuming you >>>>> >>>>> >>>>> >>>>> can >>>>>>> >>>>>>> >>>>>>> >>>>>>> setup an FTP/HTTP server somewhere to pull down the archives. I >>> >>> will >>>>>>> >>>>>>> also post this in the dev mailing list, as I have created a >>>> >>>> >>>> dedicated >>>>>>> >>>>>>> >>>>>>> thread there for CMake discussion. Join me there! >>>>>> >>>>>> >>>>>> >>>>>> >>>>>> It is not problem for me to host the packages, and I do need them >>>>> >>>>> >>>>> >>>>> myself when I >>>>>> >>>>>> >>>>>> >>>>>> build a doxygen release. So for Windows (32bit/64bit + debug/release >>>>> >>>>> >>>>> >>>>> flavors) and >>>>>> >>>>>> >>>>>> >>>>>> MacOSX (32bit+64bit intel fat binaries for OSX 10.5+) this seems >>> >>> like >>>>> >>>>> >>>>> >>>>> a good approach. >>>>>> >>>>>> >>>>>> >>>>>> For Linux, however, it would be better to depend on the packages >>> >>> that >>>>> >>>>> >>>>> >>>>> come with >>>>>> >>>>>> >>>>>> >>>>>> a distribution (there are too many distros to support). >>>>>> >>>>>> Can CMake be configured like that? 
or is it one approach or the other >>>>>> for all platforms. >>>>>> >>>>>> Regards, >>>>>> Dimitri |
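The libpng question in the quoted thread — prefer the distro's shared library, fall back to a private copy only when it is missing — can also be expressed without a monster tarball. A sketch; the `bundled_png` target and `third_party/libpng` directory are hypothetical:

```cmake
# Prefer the system libpng; compile a bundled copy only as a fallback.
find_package(PNG)

if(PNG_FOUND)
  include_directories(${PNG_INCLUDE_DIRS})
  set(DOXY_PNG_LIBS ${PNG_LIBRARIES})   # dynamic link against the distro's copy
else()
  add_subdirectory(third_party/libpng)  # hypothetical bundled sources
  set(DOXY_PNG_LIBS bundled_png)
endif()

# Targets then link against ${DOXY_PNG_LIBS} either way.
```

This keeps the per-distro choice at configure time rather than baking one answer into the source tree.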
From: Dimitri v. H. <do...@gm...> - 2013-06-09 21:53:00
|
Hi Robert, On Jun 9, 2013, at 23:13 , Robert Dailey <rcd...@gm...> wrote: > The problem with depending on linux package managers is that you won't > be able to use those libraries on Windows or MacOS. So, as annoying as > it is, maintaining the packages yourself is the most portable > solution. Then CMake doesn't offer a solution for those. So it seems the solution will mostly be for Windows (MacOS can be treated as a Unix flavor as well). So maybe we should rethink what the problem is that we are trying to solve and if CMake is indeed the solution, or if adding some extra rules to the Doxygen.vcproj file will do just as well. > > What linux platforms/distros does Doxygen need to support? Well there are many Linux distro's that now bundle doxygen (Ubuntu, Fedora, RedHat, Mint, Debian, Arch, Gentoo, etc.) and then there is BSD (Free, Net, Open, DragonFly flavors) and Solaris (Sparc/x86) and maybe some more exotic Unix flavors. I do not provide packages for any of those, but package maintainers should be able to build doxygen using the other packages. > Which third party libraries are we referring to? For building: perl, python, flex, bison, sed For compiling: Qt4 (for Doxywizard), Xapian (for doxysearch), libclang (for clang support), libmd5 (now bundled with doxygen). Install time: LaTeX for generating the manual. Runtime (optional): Graphviz, mscgen, bibtex, epstopdf, dvips Note that a number of these package depend on other packages themselves. Regards, Dimitri > > On Sun, Jun 9, 2013 at 2:18 PM, Kevin McBride <dol...@ai...> wrote: >> Robert, >> >> That sounds like a good idea. I will leave the final decision to Dimitri, >> as he is the one who will have to commit the changes to GIT. >> >> >> Kevin McBride >> dol...@ai... 
>> >> >> -----Original Message----- >> From: Robert Dailey <rcd...@gm...> >> To: Kevin McBride <dol...@ai...> >> Cc: Doxygen <do...@gm...>; Doxygen Developers >> <dox...@li...> >> Sent: Sun, Jun 9, 2013 1:45 pm >> Subject: Re: [Doxygen-develop] CMake >> >> I think what I will do is redesign my framework to take a hybrid >> approach. I have a manifest file where you define the third party >> libraries plus their versions. I will change this so you define >> whether or not to download from a custom repository or download from >> package manager. In the latter case, you will need to define a script >> that takes a library name and version. This script will execute the >> platform's package manager to make sure that the appropriate includes >> and binaries are downloaded (or compiled) as part of the process CMake >> goes through to prepare third party libs for usage. >> >> I know that most linux distros have very different package managers. >> Syntax is different, and some download binaries while others download >> source. The source case will be difficult, as this custom script will >> not only need to invoke the platform's package manager to download the >> source but also build each package the moment it is downloaded in some >> uniform way. Does this seem feasible? >> >> On Sat, Jun 8, 2013 at 6:42 PM, Kevin McBride <dol...@ai...> wrote: >>> >>> Another area to consider is the standard C/C++ libraries. I have >> >> read the >>> >>> info behind the libraries, and found that this is where you could run >> >> into >>> >>> problems, as every std library is configured differently to make >> >> system >>> >>> calls to the Linux kernel. It is much better to compile dependencies >> >> during >>> >>> the actual building process (like the QTools package in doxygen's >> >> source >>> >>> code). 
>>> >>> When I used to compile RPMs (I was the one who came up with the `make >> >> rpm' >>> >>> command to the makefiles of Doxygen) I compiled under the Fedora >> >> distro, >>> >>> which had a dynamic libpng installed. I always had the linking >> >> process >>> >>> dynamically link to the libpng in the Fedora distro. The libpng >> >> dynamic >>> >>> linking enhancement was not included in the master repository because >>> Dimitri and I found only a small speed difference when using libpng >> >> that >>> >>> some distros provided. >>> >>> Can cmake do a configure process just like what is currently done to >> >> compile >>> >>> the doxygen source code? I really do think it is best to link >> >> dynamically >>> >>> to the common libraries that almost every distro has. You run into >> >> less >>> >>> problems this way, especially considering the example that the binary >> >> form >>> >>> of the standard C/C++ libraries do differ from distro to distro. >>> >>> >>> Kevin McBride >>> dol...@ai... >>> >>> >>> -----Original Message----- >>> From: Robert Dailey <rcd...@gm...> >>> To: Kevin McBride <dol...@ai...> >>> Cc: Doxygen <do...@gm...>; Doxygen Developers >>> <dox...@li...> >>> Sent: Sat, Jun 8, 2013 4:43 pm >>> Subject: Re: [Doxygen-develop] CMake >>> >>> Suppose you have a library on linux called D. The dependency tree is >> >> as >>> >>> follows: >>> >>> D -> B, C >>> B -> A >>> >>> So basically: D depends on libraries B & C, while library B depends on >>> library A >>> >>> In this case, you'd compile all 4 libraries on your target toolchain >>> and architecture (GCC + Intel) and put that in your CMake repository. >>> Would these 4 binaries not be usable on multiple distros? As long as >>> the kernel is the same (or at least compatible), it should work fine. 
>>> The only edge case I can think of is if the kernel is vastly different >>> on each distro, meaning things like the memory manager changes and >>> thus libraries would need to be recompiled for each kernel. >>> >>> On Sat, Jun 8, 2013 at 3:10 PM, Kevin McBride <dol...@ai...> >> >> wrote: >>>> >>>> >>>> Robert, >>>> >>>> Unlike Windows, Linux is written in a way that allows many different >>>> "distros" to be written. Some people prefer not to compile packages >>> >>> >>> from >>>> >>>> >>>> sources (Fedora is good for these people). Others that prefer to >>> >>> >>> compile >>>> >>>> >>>> from sources would typically use a distro that is more friendly to >>>> developers. >>>> >>>> With such differences, it is not wise to use one binary for all Linux >>>> distros. In fact, new stuff that gets compiled would be replaced >>> >>> >>> with old >>>> >>>> >>>> dependencies, resulting in chaos for developers as they try to stick >>> >>> >>> with >>>> >>>> >>>> their preferred versions of libraries and programs they have >> >> compiled. >>>> >>>> >>>> I once thought the autotools were good for doxygen, but the autotools >>> >>> >>> are >>>> >>>> >>>> good only on *nix platforms. They do not perform well on Windows. >>>> >>>> Hope this brief explanation about Linux helps. >>>> >>>> Kevin McBride >>>> dol...@ai... >>>> >>>> >>>> >>>> -----Original Message----- >>>> From: Robert Dailey <rcd...@gm...> >>>> To: Dimitri van Heesch <do...@gm...> >>>> Cc: Doxygen Developers <dox...@li...> >>>> Sent: Sat, Jun 8, 2013 3:46 pm >>>> Subject: Re: [Doxygen-develop] CMake >>>> >>>> I'd have to rewrite the framework to handle the special case package >>>> handling, which would be significant work for what might be little >>>> gain. The system would have to be used for all platforms as it >>>> currently is. I'm not a Unix developer, so I'm not sure why Unix >> >> would >>>> >>>> be more difficult than windows. 
For example, pretty much every linux >>>> distro I know supports GCC with Intel architecture. Even if you need >>>> to support GCC + ARM, couldn't you easily maintain 2 sets of packages >>>> on Unix (That would be: GCC+x86 and GCC+ARM)? >>>> >>>> Another reason why I prefer this approach on linux is because when >> >> you >>>> >>>> download libs through package managers on Linux, your build system >>>> can't really control what version you have. With this approach, we >> >> are >>>> >>>> in control of the packages, so we can guarantee the versions of our >>>> libs we use are those we have tested with. When we upgrade a library, >>>> we can perform regression testing and update the package system on >> >> the >>>> >>>> CMake side to use the new version. >>>> >>>> At this point I'd just like a bit of education on where the >> >> complexity >>>> >>>> lies on the unix side. Teach me a little and I might be able to come >>>> up with some ideas for you :) >>>> >>>> Thanks. >>>> >>>> On Sat, Jun 8, 2013 at 4:59 AM, Dimitri van Heesch >> >> <do...@gm...> >>>> >>>> wrote: >>>>> >>>>> >>>>> >>>>> Hi Robert, >>>>> >>>>> On Jun 7, 2013, at 2:21 , Robert Dailey <rcd...@gm...> >>>> >>>> >>>> >>>> wrote: >>>>> >>>>> >>>>> >>>>> >>>>>> On Thu, Jun 6, 2013 at 4:03 PM, Robert Dailey >>>> >>>> >>>> >>>> <rcd...@gm...> wrote: >>>>>>> >>>>>>> >>>>>>> >>>>>>> Starting a new discussion thread here in the dev mailing list for >>>>>>> CMake support. I'll be working on this over on my github fork: >>>>>>> https://github.com/rcdailey/doxygen >>>>>>> >>>>>>> I'll be spending my spare time on this so please forgive any slow >>>> >>>> >>>> >>>> progress :) >>>>>> >>>>>> >>>>>> >>>>>> >>>>>> Concerning third party dependencies that you do not build yourself >>> >>> >>> as >>>>>> >>>>>> >>>>>> part of your make command, would you be able to maintain your own >>>>>> binaries for these in a repository? 
>>>>>> >>>>>> I already have a CMake framework that I can drop into doxygen and >>>> >>>> >>>> >>>> use. >>>>>> >>>>>> >>>>>> >>>>>> To be the most platform agnostic, I have set it up to download >>>>>> archives of precompiled binaries for third party libraries from an >>>>>> FTP/HTTP/Windows file share of your choosing (configurable in CMake >>>>>> cache). Basically for every platform or toolchain you plan to build >>>>>> doxygen on or with, you will need to have include files + binaries >>> >>> >>> in >>>>>> >>>>>> >>>>>> an archive. Those will sit in a repository and the CMake scripts >>> >>> >>> will >>>>>> >>>>>> >>>>>> download them, extract them, and automatically setup include >>>>>> directories and dependencies for you. >>>>>> >>>>>> There are a couple of benefits to having this approach: >>>>>> 1. No need to search for these libraries on the system. The CMake >>>>>> scripts will always be able to guarantee that they are on the >>> >>> >>> system >>>>>> >>>>>> >>>>>> since it will be downloading them from a repository you maintain. >>>>>> 2. Easier for new developers to just pick up the code and start >>>>>> building, since they do not have to spend time building libraries. >>>>>> 3. Windows doesn't have an apt-get (unless you use cygwin, which is >>>>>> just another dependency to worry about) mechanism, so this makes up >>>>>> for that and makes it super easy to get a build started on Windows. >>>>>> >>>>>> The downside, of course, is that this can become a maintenance >>>> >>>> >>>> >>>> problem >>>>>> >>>>>> >>>>>> >>>>>> if you have a ton of libraries and/or platforms or toolchains to >>>>>> support. >>>>>> >>>>>> Let me know how you want to approach this, as it will deeply impact >>>>>> the work. Personally I suggest we take this approach, assuming you >>>> >>>> >>>> >>>> can >>>>>> >>>>>> >>>>>> >>>>>> setup an FTP/HTTP server somewhere to pull down the archives. 
I >> >> will >>>>>> >>>>>> also post this in the dev mailing list, as I have created a >>> >>> >>> dedicated >>>>>> >>>>>> >>>>>> thread there for CMake discussion. Join me there! >>>>> >>>>> >>>>> >>>>> >>>>> It is not problem for me to host the packages, and I do need them >>>> >>>> >>>> >>>> myself when I >>>>> >>>>> >>>>> >>>>> build a doxygen release. So for Windows (32bit/64bit + debug/release >>>> >>>> >>>> >>>> flavors) and >>>>> >>>>> >>>>> >>>>> MacOSX (32bit+64bit intel fat binaries for OSX 10.5+) this seems >> >> like >>>> >>>> >>>> >>>> a good approach. >>>>> >>>>> >>>>> >>>>> For Linux, however, it would be better to depend on the packages >> >> that >>>> >>>> >>>> >>>> come with >>>>> >>>>> >>>>> >>>>> a distribution (there are too many distros to support). >>>>> >>>>> Can CMake be configured like that? or is it one approach or the >> >> other >>>> >>>> >>>> >>>> for all platforms. >>>>> >>>>> >>>>> >>>>> >>>>> Regards, >>>>> Dimitri >>>>> >>>> >>>> >>> >> ------------------------------------------------------------------------- >> >>> >>>> ----- >>>> How ServiceNow helps IT people transform IT departments: >>>> 1. A cloud service to automate IT design, transition and operations >>>> 2. Dashboards that offer high-level views of enterprise services >>>> 3. A single system of record for all IT processes >>>> http://p.sf.net/sfu/servicenow-d2d-j >>>> _______________________________________________ >>>> Doxygen-develop mailing list >>>> Dox...@li... >>>> https://lists.sourceforge.net/lists/listinfo/doxygen-develop >>>> >>>> >>>> >>> >>> >>> >> >> >> |
From: Robert D. <rcd...@gm...> - 2013-06-09 21:13:09

The problem with depending on linux package managers is that you won't be able to use those libraries on Windows or MacOS. So, as annoying as it is, maintaining the packages yourself is the most portable solution. What linux platforms/distros does Doxygen need to support? Which third party libraries are we referring to? On Sun, Jun 9, 2013 at 2:18 PM, Kevin McBride <dol...@ai...> wrote: > Robert, > > That sounds like a good idea. I will leave the final decision to Dimitri, > as he is the one who will have to commit the changes to GIT. > > > Kevin McBride > dol...@ai... > > > -----Original Message----- > From: Robert Dailey <rcd...@gm...> > To: Kevin McBride <dol...@ai...> > Cc: Doxygen <do...@gm...>; Doxygen Developers > <dox...@li...> > Sent: Sun, Jun 9, 2013 1:45 pm > Subject: Re: [Doxygen-develop] CMake > > I think what I will do is redesign my framework to take a hybrid > approach. I have a manifest file where you define the third party > libraries plus their versions. I will change this so you define > whether or not to download from a custom repository or download from > package manager. In the latter case, you will need to define a script > that takes a library name and version. This script will execute the > platform's package manager to make sure that the appropriate includes > and binaries are downloaded (or compiled) as part of the process CMake > goes through to prepare third party libs for usage. > > I know that most linux distros have very different package managers. > Syntax is different, and some download binaries while others download > source. The source case will be difficult, as this custom script will > not only need to invoke the platform's package manager to download the > source but also build each package the moment it is downloaded in some > uniform way. Does this seem feasible? > > On Sat, Jun 8, 2013 at 6:42 PM, Kevin McBride <dol...@ai...> wrote: >> >> Another area to consider is the standard C/C++ libraries. 
I have > > read the >> >> info behind the libraries, and found that this is where you could run > > into >> >> problems, as every std library is configured differently to make > > system >> >> calls to the Linux kernel. It is much better to compile dependencies > > during >> >> the actual building process (like the QTools package in doxygen's > > source >> >> code). >> >> When I used to compile RPMs (I was the one who came up with the `make > > rpm' >> >> command to the makefiles of Doxygen) I compiled under the Fedora > > distro, >> >> which had a dynamic libpng installed. I always had the linking > > process >> >> dynamically link to the libpng in the Fedora distro. The libpng > > dynamic >> >> linking enhancement was not included in the master repository because >> Dimitri and I found only a small speed difference when using libpng > > that >> >> some distros provided. >> >> Can cmake do a configure process just like what is currently done to > > compile >> >> the doxygen source code? I really do think it is best to link > > dynamically >> >> to the common libraries that almost every distro has. You run into > > less >> >> problems this way, especially considering the example that the binary > > form >> >> of the standard C/C++ libraries do differ from distro to distro. >> >> >> Kevin McBride >> dol...@ai... >> >> >> -----Original Message----- >> From: Robert Dailey <rcd...@gm...> >> To: Kevin McBride <dol...@ai...> >> Cc: Doxygen <do...@gm...>; Doxygen Developers >> <dox...@li...> >> Sent: Sat, Jun 8, 2013 4:43 pm >> Subject: Re: [Doxygen-develop] CMake >> >> Suppose you have a library on linux called D. The dependency tree is > > as >> >> follows: >> >> D -> B, C >> B -> A >> >> So basically: D depends on libraries B & C, while library B depends on >> library A >> >> In this case, you'd compile all 4 libraries on your target toolchain >> and architecture (GCC + Intel) and put that in your CMake repository. 
>> Would these 4 binaries not be usable on multiple distros? As long as >> the kernel is the same (or at least compatible), it should work fine. >> The only edge case I can think of is if the kernel is vastly different >> on each distro, meaning things like the memory manager changes and >> thus libraries would need to be recompiled for each kernel. >> >> On Sat, Jun 8, 2013 at 3:10 PM, Kevin McBride <dol...@ai...> > > wrote: >>> >>> >>> Robert, >>> >>> Unlike Windows, Linux is written in a way that allows many different >>> "distros" to be written. Some people prefer not to compile packages >> >> >> from >>> >>> >>> sources (Fedora is good for these people). Others that prefer to >> >> >> compile >>> >>> >>> from sources would typically use a distro that is more friendly to >>> developers. >>> >>> With such differences, it is not wise to use one binary for all Linux >>> distros. In fact, new stuff that gets compiled would be replaced >> >> >> with old >>> >>> >>> dependencies, resulting in chaos for developers as they try to stick >> >> >> with >>> >>> >>> their preferred versions of libraries and programs they have > > compiled. >>> >>> >>> I once thought the autotools were good for doxygen, but the autotools >> >> >> are >>> >>> >>> good only on *nix platforms. They do not perform well on Windows. >>> >>> Hope this brief explanation about Linux helps. >>> >>> Kevin McBride >>> dol...@ai... >>> >>> >>> >>> -----Original Message----- >>> From: Robert Dailey <rcd...@gm...> >>> To: Dimitri van Heesch <do...@gm...> >>> Cc: Doxygen Developers <dox...@li...> >>> Sent: Sat, Jun 8, 2013 3:46 pm >>> Subject: Re: [Doxygen-develop] CMake >>> >>> I'd have to rewrite the framework to handle the special case package >>> handling, which would be significant work for what might be little >>> gain. The system would have to be used for all platforms as it >>> currently is. I'm not a Unix developer, so I'm not sure why Unix > > would >>> >>> be more difficult than windows. 
For example, pretty much every linux >>> distro I know supports GCC with Intel architecture. Even if you need >>> to support GCC + ARM, couldn't you easily maintain 2 sets of packages >>> on Unix (That would be: GCC+x86 and GCC+ARM)? >>> >>> Another reason why I prefer this approach on linux is because when > > you >>> >>> download libs through package managers on Linux, your build system >>> can't really control what version you have. With this approach, we > > are >>> >>> in control of the packages, so we can guarantee the versions of our >>> libs we use are those we have tested with. When we upgrade a library, >>> we can perform regression testing and update the package system on > > the >>> >>> CMake side to use the new version. >>> >>> At this point I'd just like a bit of education on where the > > complexity >>> >>> lies on the unix side. Teach me a little and I might be able to come >>> up with some ideas for you :) >>> >>> Thanks. >>> >>> On Sat, Jun 8, 2013 at 4:59 AM, Dimitri van Heesch > > <do...@gm...> >>> >>> wrote: >>>> >>>> >>>> >>>> Hi Robert, >>>> >>>> On Jun 7, 2013, at 2:21 , Robert Dailey <rcd...@gm...> >>> >>> >>> >>> wrote: >>>> >>>> >>>> >>>> >>>>> On Thu, Jun 6, 2013 at 4:03 PM, Robert Dailey >>> >>> >>> >>> <rcd...@gm...> wrote: >>>>>> >>>>>> >>>>>> >>>>>> Starting a new discussion thread here in the dev mailing list for >>>>>> CMake support. I'll be working on this over on my github fork: >>>>>> https://github.com/rcdailey/doxygen >>>>>> >>>>>> I'll be spending my spare time on this so please forgive any slow >>> >>> >>> >>> progress :) >>>>> >>>>> >>>>> >>>>> >>>>> Concerning third party dependencies that you do not build yourself >> >> >> as >>>>> >>>>> >>>>> part of your make command, would you be able to maintain your own >>>>> binaries for these in a repository? >>>>> >>>>> I already have a CMake framework that I can drop into doxygen and >>> >>> >>> >>> use. 
>>>>> >>>>> >>>>> >>>>> To be the most platform agnostic, I have set it up to download >>>>> archives of precompiled binaries for third party libraries from an >>>>> FTP/HTTP/Windows file share of your choosing (configurable in CMake >>>>> cache). Basically for every platform or toolchain you plan to build >>>>> doxygen on or with, you will need to have include files + binaries >> >> >> in >>>>> >>>>> >>>>> an archive. Those will sit in a repository and the CMake scripts >> >> >> will >>>>> >>>>> >>>>> download them, extract them, and automatically setup include >>>>> directories and dependencies for you. >>>>> >>>>> There are a couple of benefits to having this approach: >>>>> 1. No need to search for these libraries on the system. The CMake >>>>> scripts will always be able to guarantee that they are on the >> >> >> system >>>>> >>>>> >>>>> since it will be downloading them from a repository you maintain. >>>>> 2. Easier for new developers to just pick up the code and start >>>>> building, since they do not have to spend time building libraries. >>>>> 3. Windows doesn't have an apt-get (unless you use cygwin, which is >>>>> just another dependency to worry about) mechanism, so this makes up >>>>> for that and makes it super easy to get a build started on Windows. >>>>> >>>>> The downside, of course, is that this can become a maintenance >>> >>> >>> >>> problem >>>>> >>>>> >>>>> >>>>> if you have a ton of libraries and/or platforms or toolchains to >>>>> support. >>>>> >>>>> Let me know how you want to approach this, as it will deeply impact >>>>> the work. Personally I suggest we take this approach, assuming you >>> >>> >>> >>> can >>>>> >>>>> >>>>> >>>>> setup an FTP/HTTP server somewhere to pull down the archives. I > > will >>>>> >>>>> also post this in the dev mailing list, as I have created a >> >> >> dedicated >>>>> >>>>> >>>>> thread there for CMake discussion. Join me there! 
>>>> >>>> >>>> >>>> >>>> It is not problem for me to host the packages, and I do need them >>> >>> >>> >>> myself when I >>>> >>>> >>>> >>>> build a doxygen release. So for Windows (32bit/64bit + debug/release >>> >>> >>> >>> flavors) and >>>> >>>> >>>> >>>> MacOSX (32bit+64bit intel fat binaries for OSX 10.5+) this seems > > like >>> >>> >>> >>> a good approach. >>>> >>>> >>>> >>>> For Linux, however, it would be better to depend on the packages > > that >>> >>> >>> >>> come with >>>> >>>> >>>> >>>> a distribution (there are too many distros to support). >>>> >>>> Can CMake be configured like that? or is it one approach or the > > other >>> >>> >>> >>> for all platforms. >>>> >>>> >>>> >>>> >>>> Regards, >>>> Dimitri >>>> >>> >>> >> > ------------------------------------------------------------------------- > >> >>> ----- >>> How ServiceNow helps IT people transform IT departments: >>> 1. A cloud service to automate IT design, transition and operations >>> 2. Dashboards that offer high-level views of enterprise services >>> 3. A single system of record for all IT processes >>> http://p.sf.net/sfu/servicenow-d2d-j >>> _______________________________________________ >>> Doxygen-develop mailing list >>> Dox...@li... >>> https://lists.sourceforge.net/lists/listinfo/doxygen-develop >>> >>> >>> >> >> >> > > > |
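[Editor's note: Robert's hybrid idea — prebuilt archives for Windows/MacOS, system packages on Linux — maps naturally onto a single CMake cache switch. A hedged sketch; the URL, archive name, and directory layout are placeholders invented for illustration:]

```cmake
# Sketch of the hybrid scheme discussed above: one cache option decides
# whether a third-party library comes from the system package manager
# (Linux/BSD, as Dimitri prefers) or from a prebuilt archive in a
# repository we maintain (Windows/MacOS, as Robert proposes).
option(USE_SYSTEM_LIBS "Take third-party libraries from the system" ${UNIX})

if(USE_SYSTEM_LIBS)
  # Trust the distro's development packages.
  find_package(PNG REQUIRED)
else()
  # Fetch and unpack a prebuilt archive; example.org is a placeholder,
  # not a real package server.
  set(_pkg "libpng-1.5.13-${CMAKE_SYSTEM_NAME}.tar.gz")
  file(DOWNLOAD "http://example.org/thirdparty/${_pkg}"
       "${CMAKE_BINARY_DIR}/${_pkg}" STATUS _dl_status)
  execute_process(
    COMMAND ${CMAKE_COMMAND} -E tar xzf "${CMAKE_BINARY_DIR}/${_pkg}"
    WORKING_DIRECTORY "${CMAKE_BINARY_DIR}/thirdparty")
  set(PNG_INCLUDE_DIRS "${CMAKE_BINARY_DIR}/thirdparty/include")
  set(PNG_LIBRARIES    "${CMAKE_BINARY_DIR}/thirdparty/lib/libpng.a")
endif()
```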
From: Kevin M. <dol...@ai...> - 2013-06-09 19:18:50
Robert, That sounds like a good idea. I will leave the final decision to Dimitri, as he is the one who will have to commit the changes to GIT. Kevin McBride dol...@ai... -----Original Message----- From: Robert Dailey <rcd...@gm...> To: Kevin McBride <dol...@ai...> Cc: Doxygen <do...@gm...>; Doxygen Developers <dox...@li...> Sent: Sun, Jun 9, 2013 1:45 pm Subject: Re: [Doxygen-develop] CMake I think what I will do is redesign my framework to take a hybrid approach. I have a manifest file where you define the third party libraries plus their versions. I will change this so you define whether or not to download from a custom repository or download from package manager. In the latter case, you will need to define a script that takes a library name and version. This script will execute the platform's package manager to make sure that the appropriate includes and binaries are downloaded (or compiled) as part of the process CMake goes through to prepare third party libs for usage. I know that most linux distros have very different package managers. Syntax is different, and some download binaries while others download source. The source case will be difficult, as this custom script will not only need to invoke the platform's package manager to download the source but also build each package the moment it is downloaded in some uniform way. Does this seem feasible? On Sat, Jun 8, 2013 at 6:42 PM, Kevin McBride <dol...@ai...> wrote: > Another area to consider is the standard C/C++ libraries. I have read the > info behind the libraries, and found that this is where you could run into > problems, as every std library is configured differently to make system > calls to the Linux kernel. It is much better to compile dependencies during > the actual building process (like the QTools package in doxygen's source > code). 
> > When I used to compile RPMs (I was the one who came up with the `make rpm' > command to the makefiles of Doxygen) I compiled under the Fedora distro, > which had a dynamic libpng installed. I always had the linking process > dynamically link to the libpng in the Fedora distro. The libpng dynamic > linking enhancement was not included in the master repository because > Dimitri and I found only a small speed difference when using libpng that > some distros provided. > > Can cmake do a configure process just like what is currently done to compile > the doxygen source code? I really do think it is best to link dynamically > to the common libraries that almost every distro has. You run into less > problems this way, especially considering the example that the binary form > of the standard C/C++ libraries do differ from distro to distro. > > > Kevin McBride > dol...@ai... > > > -----Original Message----- > From: Robert Dailey <rcd...@gm...> > To: Kevin McBride <dol...@ai...> > Cc: Doxygen <do...@gm...>; Doxygen Developers > <dox...@li...> > Sent: Sat, Jun 8, 2013 4:43 pm > Subject: Re: [Doxygen-develop] CMake > > Suppose you have a library on linux called D. The dependency tree is as > follows: > > D -> B, C > B -> A > > So basically: D depends on libraries B & C, while library B depends on > library A > > In this case, you'd compile all 4 libraries on your target toolchain > and architecture (GCC + Intel) and put that in your CMake repository. > Would these 4 binaries not be usable on multiple distros? As long as > the kernel is the same (or at least compatible), it should work fine. > The only edge case I can think of is if the kernel is vastly different > on each distro, meaning things like the memory manager changes and > thus libraries would need to be recompiled for each kernel. 
> > On Sat, Jun 8, 2013 at 3:10 PM, Kevin McBride <dol...@ai...> wrote: >> >> Robert, >> >> Unlike Windows, Linux is written in a way that allows many different >> "distros" to be written. Some people prefer not to compile packages > > from >> >> sources (Fedora is good for these people). Others that prefer to > > compile >> >> from sources would typically use a distro that is more friendly to >> developers. >> >> With such differences, it is not wise to use one binary for all Linux >> distros. In fact, new stuff that gets compiled would be replaced > > with old >> >> dependencies, resulting in chaos for developers as they try to stick > > with >> >> their preferred versions of libraries and programs they have compiled. >> >> I once thought the autotools were good for doxygen, but the autotools > > are >> >> good only on *nix platforms. They do not perform well on Windows. >> >> Hope this brief explanation about Linux helps. >> >> Kevin McBride >> dol...@ai... >> >> >> >> -----Original Message----- >> From: Robert Dailey <rcd...@gm...> >> To: Dimitri van Heesch <do...@gm...> >> Cc: Doxygen Developers <dox...@li...> >> Sent: Sat, Jun 8, 2013 3:46 pm >> Subject: Re: [Doxygen-develop] CMake >> >> I'd have to rewrite the framework to handle the special case package >> handling, which would be significant work for what might be little >> gain. The system would have to be used for all platforms as it >> currently is. I'm not a Unix developer, so I'm not sure why Unix would >> be more difficult than windows. For example, pretty much every linux >> distro I know supports GCC with Intel architecture. Even if you need >> to support GCC + ARM, couldn't you easily maintain 2 sets of packages >> on Unix (That would be: GCC+x86 and GCC+ARM)? >> >> Another reason why I prefer this approach on linux is because when you >> download libs through package managers on Linux, your build system >> can't really control what version you have. 
With this approach, we are >> in control of the packages, so we can guarantee the versions of our >> libs we use are those we have tested with. When we upgrade a library, >> we can perform regression testing and update the package system on the >> CMake side to use the new version. >> >> At this point I'd just like a bit of education on where the complexity >> lies on the unix side. Teach me a little and I might be able to come >> up with some ideas for you :) >> >> Thanks. >> >> On Sat, Jun 8, 2013 at 4:59 AM, Dimitri van Heesch <do...@gm...> >> wrote: >>> >>> >>> Hi Robert, >>> >>> On Jun 7, 2013, at 2:21 , Robert Dailey <rcd...@gm...> >> >> >> wrote: >>> >>> >>> >>>> On Thu, Jun 6, 2013 at 4:03 PM, Robert Dailey >> >> >> <rcd...@gm...> wrote: >>>>> >>>>> >>>>> Starting a new discussion thread here in the dev mailing list for >>>>> CMake support. I'll be working on this over on my github fork: >>>>> https://github.com/rcdailey/doxygen >>>>> >>>>> I'll be spending my spare time on this so please forgive any slow >> >> >> progress :) >>>> >>>> >>>> >>>> Concerning third party dependencies that you do not build yourself > > as >>>> >>>> part of your make command, would you be able to maintain your own >>>> binaries for these in a repository? >>>> >>>> I already have a CMake framework that I can drop into doxygen and >> >> >> use. >>>> >>>> >>>> To be the most platform agnostic, I have set it up to download >>>> archives of precompiled binaries for third party libraries from an >>>> FTP/HTTP/Windows file share of your choosing (configurable in CMake >>>> cache). Basically for every platform or toolchain you plan to build >>>> doxygen on or with, you will need to have include files + binaries > > in >>>> >>>> an archive. Those will sit in a repository and the CMake scripts > > will >>>> >>>> download them, extract them, and automatically setup include >>>> directories and dependencies for you. >>>> >>>> There are a couple of benefits to having this approach: >>>> 1. 
No need to search for these libraries on the system. The CMake >>>> scripts will always be able to guarantee that they are on the > > system >>>> >>>> since it will be downloading them from a repository you maintain. >>>> 2. Easier for new developers to just pick up the code and start >>>> building, since they do not have to spend time building libraries. >>>> 3. Windows doesn't have an apt-get (unless you use cygwin, which is >>>> just another dependency to worry about) mechanism, so this makes up >>>> for that and makes it super easy to get a build started on Windows. >>>> >>>> The downside, of course, is that this can become a maintenance >> >> >> problem >>>> >>>> >>>> if you have a ton of libraries and/or platforms or toolchains to >>>> support. >>>> >>>> Let me know how you want to approach this, as it will deeply impact >>>> the work. Personally I suggest we take this approach, assuming you >> >> >> can >>>> >>>> >>>> setup an FTP/HTTP server somewhere to pull down the archives. I will >>>> also post this in the dev mailing list, as I have created a > > dedicated >>>> >>>> thread there for CMake discussion. Join me there! >>> >>> >>> >>> It is not problem for me to host the packages, and I do need them >> >> >> myself when I >>> >>> >>> build a doxygen release. So for Windows (32bit/64bit + debug/release >> >> >> flavors) and >>> >>> >>> MacOSX (32bit+64bit intel fat binaries for OSX 10.5+) this seems like >> >> >> a good approach. >>> >>> >>> For Linux, however, it would be better to depend on the packages that >> >> >> come with >>> >>> >>> a distribution (there are too many distros to support). >>> >>> Can CMake be configured like that? or is it one approach or the other >> >> >> for all platforms. >>> >>> >>> >>> Regards, >>> Dimitri >>> >> >> > ------------------------------------------------------------------------- > >> ----- >> How ServiceNow helps IT people transform IT departments: >> 1. 
A cloud service to automate IT design, transition and operations >> 2. Dashboards that offer high-level views of enterprise services >> 3. A single system of record for all IT processes >> http://p.sf.net/sfu/servicenow-d2d-j >> _______________________________________________ >> Doxygen-develop mailing list >> Dox...@li... >> https://lists.sourceforge.net/lists/listinfo/doxygen-develop >> >> >> > > > |
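[Editor's note: Kevin's preference — dynamically link the common libraries almost every distro ships, and build a bundled copy only as a fallback (the way doxygen already bundles qtools) — is the standard system-versus-bundled pattern in CMake. A sketch with assumed names: the `doxygen` target and the `bundled_png` subdirectory are hypothetical:]

```cmake
# Prefer the distro's shared libpng, as Kevin suggests; fall back to a
# bundled static copy only when the system library is missing.
find_package(PNG)   # stock module; sets PNG_FOUND, PNG_INCLUDE_DIRS, PNG_LIBRARIES
if(PNG_FOUND)
  include_directories(${PNG_INCLUDE_DIRS})
  target_link_libraries(doxygen ${PNG_LIBRARIES})  # dynamic link to system libpng
else()
  add_subdirectory(bundled_png)                    # build the bundled copy
  target_link_libraries(doxygen bundled_png)
endif()
```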
From: Robert D. <rcd...@gm...> - 2013-06-09 17:45:24
I think what I will do is redesign my framework to take a hybrid approach. I have a manifest file where you define the third-party libraries plus their versions. I will change this so that each entry also defines whether to download from a custom repository or from the platform's package manager. In the latter case, you will need to define a script that takes a library name and version. This script will execute the platform's package manager to make sure that the appropriate headers and binaries are downloaded (or compiled) as part of the process CMake goes through to prepare third-party libs for use. I know that most Linux distros have very different package managers: the syntax differs, and some download binaries while others download source. The source case will be difficult, as this custom script will not only need to invoke the platform's package manager to download the source, but also build each package in some uniform way the moment it is downloaded. Does this seem feasible?

On Sat, Jun 8, 2013 at 6:42 PM, Kevin McBride <dol...@ai...> wrote:
> [snip]
|
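The hybrid manifest described in the message above might look something like the following CMake sketch. All names here are illustrative assumptions, not part of any actual doxygen build: the `third_party_lib` function, the `THIRD_PARTY_REPO_URL` and `THIRD_PARTY_PKG_SCRIPT` variables, the `pkg-install.sh` wrapper script, and the manifest entries are all hypothetical.

```cmake
# Hypothetical sketch of the hybrid manifest: each entry names a library,
# a version, and a source -- either the maintained binary repository or the
# platform's package manager via a user-supplied wrapper script.
set(THIRD_PARTY_PKG_SCRIPT "${CMAKE_SOURCE_DIR}/scripts/pkg-install.sh"
    CACHE FILEPATH "Script taking <name> <version>; invokes the distro's package manager")

function(third_party_lib NAME VERSION SOURCE)
  if(SOURCE STREQUAL "repository")
    # Fetch a prebuilt archive from the maintained repository and unpack it.
    file(DOWNLOAD "${THIRD_PARTY_REPO_URL}/${NAME}-${VERSION}.tar.gz"
         "${CMAKE_BINARY_DIR}/${NAME}-${VERSION}.tar.gz")
    execute_process(COMMAND ${CMAKE_COMMAND} -E tar xzf
                    "${CMAKE_BINARY_DIR}/${NAME}-${VERSION}.tar.gz"
                    WORKING_DIRECTORY "${CMAKE_BINARY_DIR}")
  else()
    # Delegate to the per-platform script (apt-get, yum, ...).
    execute_process(COMMAND "${THIRD_PARTY_PKG_SCRIPT}" "${NAME}" "${VERSION}"
                    RESULT_VARIABLE rv)
    if(NOT rv EQUAL 0)
      message(FATAL_ERROR "package manager failed for ${NAME} ${VERSION}")
    endif()
  endif()
endfunction()

# Hypothetical manifest entries:
third_party_lib(libpng 1.5.13 package-manager)
third_party_lib(qtools 1.0    repository)
```

The uniform-build problem for source packages mentioned above would live inside the wrapper script, which is exactly why it is the hard part of this design.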
From: Kevin M. <dol...@ai...> - 2013-06-08 23:55:04
|
Another area to consider is the standard C/C++ libraries. I have read up on these libraries, and found that this is where you could run into problems, as every standard library is configured differently in how it makes system calls to the Linux kernel. It is much better to compile dependencies during the actual build process (like the QTools package in doxygen's source code).

When I used to compile RPMs (I was the one who came up with the `make rpm' command in Doxygen's makefiles), I compiled under the Fedora distro, which had a dynamic libpng installed, and I always had the linking process link dynamically against the libpng in the Fedora distro. The libpng dynamic-linking enhancement was not included in the master repository because Dimitri and I found only a small speed difference when using the libpng that some distros provided.

Can CMake do a configure step just like what is currently done to compile the doxygen source code? I really do think it is best to link dynamically against the common libraries that almost every distro has. You run into fewer problems this way, especially considering that the binary form of the standard C/C++ libraries does differ from distro to distro.

Kevin McBride
dol...@ai...

-----Original Message-----
From: Robert Dailey <rcd...@gm...>
To: Kevin McBride <dol...@ai...>
Cc: Doxygen <do...@gm...>; Doxygen Developers <dox...@li...>
Sent: Sat, Jun 8, 2013 4:43 pm
Subject: Re: [Doxygen-develop] CMake

[snip]
|
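Kevin's preference for linking dynamically against the distro's libpng is something stock CMake already supports through its bundled FindPNG module; a minimal sketch of that configure-style step (the `doxygen` target and `DOXYGEN_SOURCES` variable are placeholders, not the project's real build files):

```cmake
# Sketch: locate the distribution-provided libpng and link against it
# dynamically, instead of shipping a prebuilt copy. FindPNG ships with
# CMake itself and fails the configure step if the library is absent.
find_package(PNG REQUIRED)   # sets PNG_FOUND, PNG_INCLUDE_DIRS, PNG_LIBRARIES

include_directories(${PNG_INCLUDE_DIRS})
add_definitions(${PNG_DEFINITIONS})

add_executable(doxygen ${DOXYGEN_SOURCES})   # DOXYGEN_SOURCES: placeholder
target_link_libraries(doxygen ${PNG_LIBRARIES})
```

This is the CMake analogue of an autotools configure check: the distro's package satisfies the dependency, so each distribution links against its own binary-compatible copy.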
From: Robert D. <rcd...@gm...> - 2013-06-08 20:43:18
|
Suppose you have a library on Linux called D. The dependency tree is as follows:

D -> B, C
B -> A

So basically: D depends on libraries B and C, while library B depends on library A.

In this case, you'd compile all four libraries on your target toolchain and architecture (GCC + Intel) and put them in your CMake repository. Would these four binaries not be usable on multiple distros? As long as the kernel is the same (or at least compatible), it should work fine. The only edge case I can think of is if the kernel is vastly different on each distro, meaning things like the memory manager change and thus the libraries would need to be recompiled for each kernel.

On Sat, Jun 8, 2013 at 3:10 PM, Kevin McBride <dol...@ai...> wrote:
> [snip]
|
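The D -> B, C / B -> A chain above could be modeled with CMake imported targets, so that anyone linking the prebuilt binaries gets the transitive link order automatically. This is a sketch under stated assumptions: `PREBUILT_DIR` and the library file names are hypothetical, and the `INTERFACE_LINK_LIBRARIES` property requires CMake 2.8.12 or newer.

```cmake
# Hypothetical: expose the prebuilt binaries downloaded from the CMake
# repository as imported targets, mirroring the dependency tree
#   D -> B, C   and   B -> A
foreach(lib A B C D)
  add_library(${lib} STATIC IMPORTED)
  set_target_properties(${lib} PROPERTIES
    IMPORTED_LOCATION "${PREBUILT_DIR}/lib${lib}.a")
endforeach()

# Record the inter-library dependencies on the targets themselves.
set_target_properties(B PROPERTIES INTERFACE_LINK_LIBRARIES A)
set_target_properties(D PROPERTIES INTERFACE_LINK_LIBRARIES "B;C")

# A consumer now only names D; B, C, and A follow in the right order:
# target_link_libraries(myapp D)
```

The kernel-compatibility question is separate: these targets fix the link order, but whether one set of `.a`/`.so` files works across distros still depends on the ABI of the system libraries they were built against.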
From: Kevin M. <dol...@ai...> - 2013-06-08 20:32:28
|
Robert,

Unlike Windows, Linux is written in a way that allows many different "distros" to be created. Some people prefer not to compile packages from source (Fedora is good for these people). Others who prefer to compile from source would typically use a distro that is more friendly to developers.

With such differences, it is not wise to use one binary for all Linux distros. In fact, newly compiled packages could be replaced with old dependencies, resulting in chaos for developers as they try to stick with their preferred versions of the libraries and programs they have compiled.

I once thought the autotools were good for doxygen, but the autotools are good only on *nix platforms. They do not perform well on Windows.

Hope this brief explanation about Linux helps.

Kevin McBride
dol...@ai...

-----Original Message-----
From: Robert Dailey <rcd...@gm...>
To: Dimitri van Heesch <do...@gm...>
Cc: Doxygen Developers <dox...@li...>
Sent: Sat, Jun 8, 2013 3:46 pm
Subject: Re: [Doxygen-develop] CMake

[snip]

------------------------------------------------------------------------------
How ServiceNow helps IT people transform IT departments:
1. A cloud service to automate IT design, transition and operations
2. Dashboards that offer high-level views of enterprise services
3. A single system of record for all IT processes
http://p.sf.net/sfu/servicenow-d2d-j
_______________________________________________
Doxygen-develop mailing list
Dox...@li...
https://lists.sourceforge.net/lists/listinfo/doxygen-develop
|
From: Robert D. <rcd...@gm...> - 2013-06-08 19:46:17
|
I'd have to rewrite the framework to handle the special-case package handling, which would be significant work for what might be little gain. The system would have to be used for all platforms as it currently is. I'm not a Unix developer, so I'm not sure why Unix would be more difficult than Windows. For example, pretty much every Linux distro I know supports GCC with Intel architecture. Even if you need to support GCC + ARM, couldn't you easily maintain two sets of packages on Unix (that would be: GCC+x86 and GCC+ARM)?

Another reason why I prefer this approach on Linux is that when you download libs through package managers on Linux, your build system can't really control what version you have. With this approach, we are in control of the packages, so we can guarantee the versions of the libs we use are those we have tested with. When we upgrade a library, we can perform regression testing and update the package system on the CMake side to use the new version.

At this point I'd just like a bit of education on where the complexity lies on the Unix side. Teach me a little and I might be able to come up with some ideas for you :)

Thanks.

On Sat, Jun 8, 2013 at 4:59 AM, Dimitri van Heesch <do...@gm...> wrote:
> Hi Robert,
>
> On Jun 7, 2013, at 2:21 , Robert Dailey <rcd...@gm...> wrote:
>
>> On Thu, Jun 6, 2013 at 4:03 PM, Robert Dailey <rcd...@gm...> wrote:
>>> Starting a new discussion thread here in the dev mailing list for
>>> CMake support. I'll be working on this over on my github fork:
>>> https://github.com/rcdailey/doxygen
>>>
>>> I'll be spending my spare time on this so please forgive any slow progress :)
>>
>> Concerning third party dependencies that you do not build yourself as
>> part of your make command, would you be able to maintain your own
>> binaries for these in a repository?
>>
>> I already have a CMake framework that I can drop into doxygen and use.
>> To be the most platform agnostic, I have set it up to download
>> archives of precompiled binaries for third party libraries from an
>> FTP/HTTP/Windows file share of your choosing (configurable in the CMake
>> cache). Basically for every platform or toolchain you plan to build
>> doxygen on or with, you will need to have include files + binaries in
>> an archive. Those will sit in a repository and the CMake scripts will
>> download them, extract them, and automatically set up include
>> directories and dependencies for you.
>>
>> There are a couple of benefits to this approach:
>> 1. No need to search for these libraries on the system. The CMake
>> scripts will always be able to guarantee that they are on the system,
>> since it will be downloading them from a repository you maintain.
>> 2. Easier for new developers to just pick up the code and start
>> building, since they do not have to spend time building libraries.
>> 3. Windows doesn't have an apt-get mechanism (unless you use cygwin,
>> which is just another dependency to worry about), so this makes up
>> for that and makes it super easy to get a build started on Windows.
>>
>> The downside, of course, is that this can become a maintenance problem
>> if you have a ton of libraries and/or platforms or toolchains to
>> support.
>>
>> Let me know how you want to approach this, as it will deeply impact
>> the work. Personally I suggest we take this approach, assuming you can
>> set up an FTP/HTTP server somewhere to pull down the archives. I will
>> also post this in the dev mailing list, as I have created a dedicated
>> thread there for CMake discussion. Join me there!
>
> It is no problem for me to host the packages, and I do need them myself when I
> build a doxygen release. So for Windows (32bit/64bit + debug/release flavors) and
> MacOSX (32bit+64bit intel fat binaries for OSX 10.5+) this seems like a good approach.
> For Linux, however, it would be better to depend on the packages that come with
> a distribution (there are too many distros to support).
>
> Can CMake be configured like that? Or is it one approach or the other for all platforms?
>
> Regards,
> Dimitri
>
|
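Dimitri's closing question, whether CMake can mix the two approaches rather than use one for all platforms, can be answered affirmatively with an ordinary platform switch. A minimal sketch: `DEP_REPO_URL` and the unpacked archive layout are hypothetical assumptions, while `find_package(PNG)` uses CMake's stock module.

```cmake
# Sketch: prebuilt archives on Windows/MacOSX, the distribution's own
# packages on Linux -- both inside the same build system.
if(WIN32 OR APPLE)
  # Download and unpack a prebuilt archive from the maintained repository.
  file(DOWNLOAD "${DEP_REPO_URL}/libpng-${CMAKE_SYSTEM_NAME}.tar.gz"
       "${CMAKE_BINARY_DIR}/libpng.tar.gz")
  execute_process(COMMAND ${CMAKE_COMMAND} -E tar xzf libpng.tar.gz
                  WORKING_DIRECTORY "${CMAKE_BINARY_DIR}")
  # Point the usual variables at the unpacked archive (layout assumed).
  set(PNG_INCLUDE_DIRS "${CMAKE_BINARY_DIR}/libpng/include")
  set(PNG_LIBRARIES    "${CMAKE_BINARY_DIR}/libpng/lib/png")
else()
  # On Linux, rely on whatever libpng the distribution provides.
  find_package(PNG REQUIRED)
endif()

include_directories(${PNG_INCLUDE_DIRS})
```

So the repository approach and the distro-package approach are not mutually exclusive; the cost is maintaining both code paths per dependency.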
From: Dimitri v. H. <do...@gm...> - 2013-06-08 09:59:28
|
Hi Robert,

On Jun 7, 2013, at 2:21, Robert Dailey <rcd...@gm...> wrote:

> On Thu, Jun 6, 2013 at 4:03 PM, Robert Dailey <rcd...@gm...> wrote:
>> Starting a new discussion thread here in the dev mailing list for
>> CMake support. I'll be working on this over on my github fork:
>> https://github.com/rcdailey/doxygen
>>
>> I'll be spending my spare time on this, so please forgive any slow
>> progress :)
>
> Concerning third-party dependencies that you do not build yourself as
> part of your make command, would you be able to maintain your own
> binaries for these in a repository?
>
> I already have a CMake framework that I can drop into doxygen and use.
> To be the most platform agnostic, I have set it up to download
> archives of precompiled binaries for third-party libraries from an
> FTP/HTTP/Windows file share of your choosing (configurable in the
> CMake cache). Basically, for every platform or toolchain you plan to
> build doxygen on or with, you will need to have include files +
> binaries in an archive. Those will sit in a repository, and the CMake
> scripts will download them, extract them, and automatically set up
> include directories and dependencies for you.
>
> There are a couple of benefits to this approach:
> 1. No need to search for these libraries on the system. The CMake
> scripts can always guarantee that they are present, since they
> download them from a repository you maintain.
> 2. It is easier for new developers to just pick up the code and start
> building, since they do not have to spend time building libraries.
> 3. Windows has no apt-get mechanism (unless you use Cygwin, which is
> just another dependency to worry about), so this makes up for that
> and makes it very easy to get a build started on Windows.
>
> The downside, of course, is that this can become a maintenance
> problem if you have a lot of libraries and/or platforms or toolchains
> to support.
>
> Let me know how you want to approach this, as it will deeply impact
> the work. Personally, I suggest we take this approach, assuming you
> can set up an FTP/HTTP server somewhere to pull down the archives. I
> will also post this on the dev mailing list, as I have created a
> dedicated thread there for CMake discussion. Join me there!

It is no problem for me to host the packages, and I do need them myself
when I build a doxygen release. So for Windows (32-bit/64-bit +
debug/release flavors) and Mac OS X (32-bit + 64-bit Intel fat binaries
for OS X 10.5+) this seems like a good approach.

For Linux, however, it would be better to depend on the packages that
come with a distribution (there are too many distros to support).

Can CMake be configured like that? Or is it one approach or the other
for all platforms?

Regards,
Dimitri
|
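[Editor's note: Dimitri's question — whether CMake can use prebuilt archives on some platforms but distribution packages on Linux — has a straightforward answer: yes, CMake scripts can branch on the built-in `WIN32`/`APPLE`/`UNIX` variables. The sketch below is a minimal illustration of that idea only; the repository URL, archive names, and directory layout are hypothetical, and it is not the scheme doxygen actually adopted.]

```cmake
# Hypothetical sketch: mix per-platform dependency strategies in one
# CMake script. DEPS_REPO_URL and the libpng archive layout are made up
# for illustration.
if(WIN32 OR APPLE)
  # Windows / Mac OS X: fetch a prebuilt archive from a maintained
  # repository, then wire up its include and library directories.
  set(DEPS_REPO_URL "http://example.org/doxygen-deps" CACHE STRING
      "Base URL of the prebuilt third-party library repository")
  set(_archive "${CMAKE_BINARY_DIR}/libpng-${CMAKE_SYSTEM_NAME}.tar.gz")
  file(DOWNLOAD
       "${DEPS_REPO_URL}/libpng-${CMAKE_SYSTEM_NAME}.tar.gz"
       "${_archive}" STATUS _dl_status)
  file(MAKE_DIRECTORY "${CMAKE_BINARY_DIR}/deps")
  execute_process(COMMAND ${CMAKE_COMMAND} -E tar xzf "${_archive}"
                  WORKING_DIRECTORY "${CMAKE_BINARY_DIR}/deps")
  include_directories("${CMAKE_BINARY_DIR}/deps/include")
  link_directories("${CMAKE_BINARY_DIR}/deps/lib")
else()
  # Linux: rely on the distribution's packages via the standard
  # find module, so each distro supplies its own libraries.
  find_package(PNG REQUIRED)
  include_directories(${PNG_INCLUDE_DIRS})
endif()
```

So it need not be one approach for all platforms: the branch condition and even the cache variable controlling it (e.g. an option to force system packages everywhere) are entirely under the script author's control.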