Thread: [Doxygen-develop] CMake
From: Robert D. <rcd...@gm...> - 2013-06-06 21:04:00
Starting a new discussion thread here in the dev mailing list for CMake support. I'll be working on this over on my github fork: https://github.com/rcdailey/doxygen

I'll be spending my spare time on this so please forgive any slow progress :)
From: Robert D. <rcd...@gm...> - 2013-06-07 00:21:56
Concerning third party dependencies that you do not build yourself as part of your make command, would you be able to maintain your own binaries for these in a repository?

I already have a CMake framework that I can drop into doxygen and use. To be as platform agnostic as possible, I have set it up to download archives of precompiled binaries for third party libraries from an FTP/HTTP/Windows file share of your choosing (configurable in the CMake cache). Basically, for every platform or toolchain you plan to build doxygen on or with, you will need to have include files + binaries in an archive. Those will sit in a repository, and the CMake scripts will download them, extract them, and automatically set up include directories and dependencies for you.

There are a few benefits to this approach:

1. No need to search for these libraries on the system. The CMake scripts can always guarantee that they are present, since they will be downloaded from a repository you maintain.
2. Easier for new developers to just pick up the code and start building, since they do not have to spend time building libraries.
3. Windows doesn't have an apt-get mechanism (unless you use cygwin, which is just another dependency to worry about), so this makes up for that and makes it very easy to get a build started on Windows.

The downside, of course, is that this can become a maintenance problem if you have a ton of libraries and/or platforms or toolchains to support.

Let me know how you want to approach this, as it will deeply impact the work. Personally I suggest we take this approach, assuming you can set up an FTP/HTTP server somewhere to pull down the archives. I will also post this in the dev mailing list, as I have created a dedicated thread there for CMake discussion. Join me there!
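A minimal sketch of what such a download-and-extract step could look like in CMake; the repository URL, cache variable, helper name, and package version below are hypothetical placeholders, not part of the actual framework being described:

    # Hypothetical sketch: fetch a prebuilt third-party archive from a
    # configurable repository, extract it, and expose its headers and
    # libraries to the build. All names are placeholders.
    set(THIRDPARTY_REPO_URL "http://example.org/doxygen-deps"
        CACHE STRING "Base URL of the prebuilt third-party repository")

    function(fetch_prebuilt_package name version)
      set(archive "${name}-${version}-${CMAKE_SYSTEM_NAME}.tar.gz")
      set(dest    "${CMAKE_BINARY_DIR}/thirdparty/${name}")
      if(NOT EXISTS "${dest}")
        # Download the archive for the current platform/toolchain...
        file(DOWNLOAD "${THIRDPARTY_REPO_URL}/${archive}"
             "${CMAKE_BINARY_DIR}/${archive}")
        # ...and unpack it into the build tree.
        file(MAKE_DIRECTORY "${dest}")
        execute_process(COMMAND "${CMAKE_COMMAND}" -E tar xzf
                                "${CMAKE_BINARY_DIR}/${archive}"
                        WORKING_DIRECTORY "${dest}")
      endif()
      # Make the extracted headers and libraries visible to targets.
      include_directories("${dest}/include")
      link_directories("${dest}/lib")
    endfunction()

    fetch_prebuilt_package(libpng 1.5.14)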
From: Dimitri v. H. <do...@gm...> - 2013-06-08 09:59:28
Hi Robert,

It is no problem for me to host the packages, and I do need them myself when I build a doxygen release. So for Windows (32bit/64bit + debug/release flavors) and MacOSX (32bit+64bit intel fat binaries for OSX 10.5+) this seems like a good approach. For Linux, however, it would be better to depend on the packages that come with a distribution (there are too many distros to support).

Can CMake be configured like that, or is it one approach or the other for all platforms?

Regards,
Dimitri
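One way to get the mixed behaviour Dimitri asks about would be a configuration switch that prefers distribution packages on Unix and falls back to downloaded archives elsewhere. A rough sketch, assuming the option name is hypothetical and fetch_prebuilt_package is the helper sketched above:

    # Prefer system-provided packages on Unix, prebuilt archives elsewhere.
    # (${UNIX} is true on Linux/MacOSX, so the option defaults to ON there.)
    option(USE_SYSTEM_LIBS "Use packages provided by the distribution" "${UNIX}")

    if(USE_SYSTEM_LIBS)
      find_package(PNG)      # stock module: sets PNG_FOUND and PNG_LIBRARIES
    endif()

    if(NOT PNG_FOUND)
      # Fall back to the prebuilt archive from the maintained repository.
      fetch_prebuilt_package(libpng 1.5.14)
    endif()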
From: Robert D. <rcd...@gm...> - 2013-06-08 19:46:17
I'd have to rewrite the framework to handle special-case package handling, which would be significant work for what might be little gain. The system would have to be used for all platforms as it currently is. I'm not a Unix developer, so I'm not sure why Unix would be more difficult than Windows. For example, pretty much every linux distro I know supports GCC with Intel architecture. Even if you need to support GCC + ARM, couldn't you easily maintain two sets of packages on Unix (that would be: GCC+x86 and GCC+ARM)?

Another reason why I prefer this approach on linux is that when you download libs through package managers on Linux, your build system can't really control what version you have. With this approach, we are in control of the packages, so we can guarantee that the versions of the libs we use are those we have tested with. When we upgrade a library, we can perform regression testing and update the package system on the CMake side to use the new version.

At this point I'd just like a bit of education on where the complexity lies on the unix side. Teach me a little and I might be able to come up with some ideas for you :)

Thanks.
From: Kevin M. <dol...@ai...> - 2013-06-08 20:32:28
Robert,

Unlike Windows, Linux is written in a way that allows many different "distros" to exist. Some people prefer not to compile packages from sources (Fedora is good for these people). Others that prefer to compile from sources would typically use a distro that is more friendly to developers.

With such differences, it is not wise to use one binary for all Linux distros. In fact, newly compiled binaries could end up paired with older dependencies, resulting in chaos for developers as they try to stick with their preferred versions of libraries and programs they have compiled.

I once thought the autotools were good for doxygen, but the autotools are good only on *nix platforms. They do not perform well on Windows.

Hope this brief explanation about Linux helps.

Kevin McBride
dol...@ai...
From: Robert D. <rcd...@gm...> - 2013-06-08 20:43:18
Suppose you have a library on linux called D. The dependency tree is as follows:

    D -> B, C
    B -> A

So basically: D depends on libraries B & C, while library B depends on library A.

In this case, you'd compile all 4 libraries on your target toolchain and architecture (GCC + Intel) and put them in your CMake repository. Would these 4 binaries not be usable on multiple distros? As long as the kernel is the same (or at least compatible), it should work fine. The only edge case I can think of is if the kernel is vastly different on each distro, meaning things like the memory manager change and thus libraries would need to be recompiled for each kernel.
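For what it's worth, once such libraries are declared as CMake targets the tree above is handled transitively by CMake itself. A hedged sketch, with hypothetical source file names:

    # Targets mirroring the tree above: D depends on B and C, B depends on A.
    # CMake propagates the transitive link order (D ends up linking A too).
    add_library(A STATIC a.c)
    add_library(B STATIC b.c)
    add_library(C STATIC c.c)
    add_library(D STATIC d.c)

    target_link_libraries(B A)
    target_link_libraries(D B C)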
From: Kevin M. <dol...@ai...> - 2013-06-08 23:55:04
Another area to consider is the standard C/C++ libraries. I have read the info behind these libraries, and found that this is where you could run into problems, as every std library is configured differently to make system calls to the Linux kernel. It is much better to compile dependencies during the actual building process (like the QTools package in doxygen's source code).

When I used to compile RPMs (I was the one who came up with the `make rpm' command in the makefiles of Doxygen) I compiled under the Fedora distro, which had a dynamic libpng installed. I always had the linking process dynamically link to the libpng in the Fedora distro. The libpng dynamic linking enhancement was not included in the master repository because Dimitri and I found only a small speed difference when using the libpng that some distros provide.

Can cmake do a configure process just like what is currently done to compile the doxygen source code? I really do think it is best to link dynamically to the common libraries that almost every distro has. You run into fewer problems this way, especially considering that the binary form of the standard C/C++ libraries differs from distro to distro.

Kevin McBride
dol...@ai...
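A rough sketch of the two models Kevin mentions, building the bundled qtools sources as part of the build while dynamically linking against the distro's libpng when it is found; the variable names are illustrative, not an existing build file:

    # Build the bundled dependency from source, like doxygen's existing
    # qtools package.
    add_subdirectory(qtools)

    # Prefer the dynamic libpng shipped by the distribution, if present.
    find_package(PNG)
    if(PNG_FOUND)
      include_directories(${PNG_INCLUDE_DIR})
      set(EXTRA_LIBS ${EXTRA_LIBS} ${PNG_LIBRARIES})
    endif()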
From: Robert D. <rcd...@gm...> - 2013-06-09 17:45:24
I think what I will do is redesign my framework to take a hybrid approach. I have a manifest file where you define the third party libraries plus their versions. I will change this so you also define whether to download from a custom repository or from the package manager. In the latter case, you will need to define a script that takes a library name and version. This script will execute the platform's package manager to make sure that the appropriate includes and binaries are downloaded (or compiled) as part of the process CMake goes through to prepare third party libs for usage.

I know that most linux distros have very different package managers. Syntax is different, and some download binaries while others download source. The source case will be difficult, as this custom script will not only need to invoke the platform's package manager to download the source but also build each package the moment it is downloaded in some uniform way.

Does this seem feasible?
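A sketch of what such a manifest and its dispatch could look like in CMake; the entry format, script path, and package names are hypothetical, and fetch_prebuilt_package refers to the helper sketched earlier in the thread:

    # Hypothetical manifest: each entry is "name|version|source", where the
    # source is either the prebuilt repository or the platform's package
    # manager (handled by a builder-supplied script).
    set(DOXY_DEPS
      "libpng|1.5.14|repository"
      "zlib|1.2.8|system"
    )

    foreach(entry IN LISTS DOXY_DEPS)
      string(REPLACE "|" ";" fields "${entry}")
      list(GET fields 0 name)
      list(GET fields 1 version)
      list(GET fields 2 source)
      if(source STREQUAL "system")
        # Delegate to a per-platform script (apt-get, yum, ports, ...) that
        # installs or builds the requested library version.
        execute_process(COMMAND "${CMAKE_SOURCE_DIR}/scripts/install_dep.sh"
                                "${name}" "${version}")
      else()
        fetch_prebuilt_package("${name}" "${version}")
      endif()
    endforeach()

Using a separator other than ";" inside each entry keeps the three fields together, since CMake would otherwise split them into separate list elements.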
From: Kevin M. <dol...@ai...> - 2013-06-09 19:18:50
Robert,

That sounds like a good idea. I will leave the final decision to Dimitri, as he is the one who will have to commit the changes to GIT.

Kevin McBride
dol...@ai...
From: Robert D. <rcd...@gm...> - 2013-06-09 21:13:09
|
The problem with depending on linux package managers is that you won't be able to use those libraries on Windows or MacOS. So, as annoying as it is, maintaining the packages yourself is the most portable solution. What linux platforms/distros does Doxygen need to support? Which third party libraries are we referring to? On Sun, Jun 9, 2013 at 2:18 PM, Kevin McBride <dol...@ai...> wrote: > Robert, > > That sounds like a good idea. I will leave the final decision to Dimitri, > as he is the one who will have to commit the changes to GIT. > > > Kevin McBride > dol...@ai... > > > -----Original Message----- > From: Robert Dailey <rcd...@gm...> > To: Kevin McBride <dol...@ai...> > Cc: Doxygen <do...@gm...>; Doxygen Developers > <dox...@li...> > Sent: Sun, Jun 9, 2013 1:45 pm > Subject: Re: [Doxygen-develop] CMake > > I think what I will do is redesign my framework to take a hybrid > approach. I have a manifest file where you define the third party > libraries plus their versions. I will change this so you define > whether or not to download from a custom repository or download from > package manager. In the latter case, you will need to define a script > that takes a library name and version. This script will execute the > platform's package manager to make sure that the appropriate includes > and binaries are downloaded (or compiled) as part of the process CMake > goes through to prepare third party libs for usage. > > I know that most linux distros have very different package managers. > Syntax is different, and some download binaries while others download > source. The source case will be difficult, as this custom script will > not only need to invoke the platform's package manager to download the > source but also build each package the moment it is downloaded in some > uniform way. Does this seem feasible? > > On Sat, Jun 8, 2013 at 6:42 PM, Kevin McBride <dol...@ai...> wrote: >> >> Another area to consider is the standard C/C++ libraries. I have > > read the >> >> info behind the libraries, and found that this is where you could run > > into >> >> problems, as every std library is configured differently to make > > system >> >> calls to the Linux kernel. It is much better to compile dependencies > > during >> >> the actual building process (like the QTools package in doxygen's > > source >> >> code). >> >> When I used to compile RPMs (I was the one who came up with the `make > > rpm' >> >> command to the makefiles of Doxygen) I compiled under the Fedora > > distro, >> >> which had a dynamic libpng installed. I always had the linking > > process >> >> dynamically link to the libpng in the Fedora distro. The libpng > > dynamic >> >> linking enhancement was not included in the master repository because >> Dimitri and I found only a small speed difference when using libpng > > that >> >> some distros provided. >> >> Can cmake do a configure process just like what is currently done to > > compile >> >> the doxygen source code? I really do think it is best to link > > dynamically >> >> to the common libraries that almost every distro has. You run into > > less >> >> problems this way, especially considering the example that the binary > > form >> >> of the standard C/C++ libraries do differ from distro to distro. >> >> >> Kevin McBride >> dol...@ai... 
>> >> >> -----Original Message----- >> From: Robert Dailey <rcd...@gm...> >> To: Kevin McBride <dol...@ai...> >> Cc: Doxygen <do...@gm...>; Doxygen Developers >> <dox...@li...> >> Sent: Sat, Jun 8, 2013 4:43 pm >> Subject: Re: [Doxygen-develop] CMake >> >> Suppose you have a library on linux called D. The dependency tree is > > as >> >> follows: >> >> D -> B, C >> B -> A >> >> So basically: D depends on libraries B & C, while library B depends on >> library A >> >> In this case, you'd compile all 4 libraries on your target toolchain >> and architecture (GCC + Intel) and put that in your CMake repository. >> Would these 4 binaries not be usable on multiple distros? As long as >> the kernel is the same (or at least compatible), it should work fine. >> The only edge case I can think of is if the kernel is vastly different >> on each distro, meaning things like the memory manager changes and >> thus libraries would need to be recompiled for each kernel. >> >> On Sat, Jun 8, 2013 at 3:10 PM, Kevin McBride <dol...@ai...> > > wrote: >>> >>> >>> Robert, >>> >>> Unlike Windows, Linux is written in a way that allows many different >>> "distros" to be written. Some people prefer not to compile packages >> >> >> from >>> >>> >>> sources (Fedora is good for these people). Others that prefer to >> >> >> compile >>> >>> >>> from sources would typically use a distro that is more friendly to >>> developers. >>> >>> With such differences, it is not wise to use one binary for all Linux >>> distros. In fact, new stuff that gets compiled would be replaced >> >> >> with old >>> >>> >>> dependencies, resulting in chaos for developers as they try to stick >> >> >> with >>> >>> >>> their preferred versions of libraries and programs they have > > compiled. >>> >>> >>> I once thought the autotools were good for doxygen, but the autotools >> >> >> are >>> >>> >>> good only on *nix platforms. They do not perform well on Windows. >>> >>> Hope this brief explanation about Linux helps. >>> >>> Kevin McBride >>> dol...@ai... >>> >>> >>> >>> -----Original Message----- >>> From: Robert Dailey <rcd...@gm...> >>> To: Dimitri van Heesch <do...@gm...> >>> Cc: Doxygen Developers <dox...@li...> >>> Sent: Sat, Jun 8, 2013 3:46 pm >>> Subject: Re: [Doxygen-develop] CMake >>> >>> I'd have to rewrite the framework to handle the special case package >>> handling, which would be significant work for what might be little >>> gain. The system would have to be used for all platforms as it >>> currently is. I'm not a Unix developer, so I'm not sure why Unix > > would >>> >>> be more difficult than windows. For example, pretty much every linux >>> distro I know supports GCC with Intel architecture. Even if you need >>> to support GCC + ARM, couldn't you easily maintain 2 sets of packages >>> on Unix (That would be: GCC+x86 and GCC+ARM)? >>> >>> Another reason why I prefer this approach on linux is because when > > you >>> >>> download libs through package managers on Linux, your build system >>> can't really control what version you have. With this approach, we > > are >>> >>> in control of the packages, so we can guarantee the versions of our >>> libs we use are those we have tested with. When we upgrade a library, >>> we can perform regression testing and update the package system on > > the >>> >>> CMake side to use the new version. >>> >>> At this point I'd just like a bit of education on where the > > complexity >>> >>> lies on the unix side. 
Teach me a little and I might be able to come >>> up with some ideas for you :) >>> >>> Thanks. >>> >>> On Sat, Jun 8, 2013 at 4:59 AM, Dimitri van Heesch > > <do...@gm...> >>> >>> wrote: >>>> >>>> >>>> >>>> Hi Robert, >>>> >>>> On Jun 7, 2013, at 2:21 , Robert Dailey <rcd...@gm...> >>> >>> >>> >>> wrote: >>>> >>>> >>>> >>>> >>>>> On Thu, Jun 6, 2013 at 4:03 PM, Robert Dailey >>> >>> >>> >>> <rcd...@gm...> wrote: >>>>>> >>>>>> >>>>>> >>>>>> Starting a new discussion thread here in the dev mailing list for >>>>>> CMake support. I'll be working on this over on my github fork: >>>>>> https://github.com/rcdailey/doxygen >>>>>> >>>>>> I'll be spending my spare time on this so please forgive any slow >>> >>> >>> >>> progress :) >>>>> >>>>> >>>>> >>>>> >>>>> Concerning third party dependencies that you do not build yourself >> >> >> as >>>>> >>>>> >>>>> part of your make command, would you be able to maintain your own >>>>> binaries for these in a repository? >>>>> >>>>> I already have a CMake framework that I can drop into doxygen and >>> >>> >>> >>> use. >>>>> >>>>> >>>>> >>>>> To be the most platform agnostic, I have set it up to download >>>>> archives of precompiled binaries for third party libraries from an >>>>> FTP/HTTP/Windows file share of your choosing (configurable in CMake >>>>> cache). Basically for every platform or toolchain you plan to build >>>>> doxygen on or with, you will need to have include files + binaries >> >> >> in >>>>> >>>>> >>>>> an archive. Those will sit in a repository and the CMake scripts >> >> >> will >>>>> >>>>> >>>>> download them, extract them, and automatically setup include >>>>> directories and dependencies for you. >>>>> >>>>> There are a couple of benefits to having this approach: >>>>> 1. No need to search for these libraries on the system. The CMake >>>>> scripts will always be able to guarantee that they are on the >> >> >> system >>>>> >>>>> >>>>> since it will be downloading them from a repository you maintain. >>>>> 2. Easier for new developers to just pick up the code and start >>>>> building, since they do not have to spend time building libraries. >>>>> 3. Windows doesn't have an apt-get (unless you use cygwin, which is >>>>> just another dependency to worry about) mechanism, so this makes up >>>>> for that and makes it super easy to get a build started on Windows. >>>>> >>>>> The downside, of course, is that this can become a maintenance >>> >>> >>> >>> problem >>>>> >>>>> >>>>> >>>>> if you have a ton of libraries and/or platforms or toolchains to >>>>> support. >>>>> >>>>> Let me know how you want to approach this, as it will deeply impact >>>>> the work. Personally I suggest we take this approach, assuming you >>> >>> >>> >>> can >>>>> >>>>> >>>>> >>>>> setup an FTP/HTTP server somewhere to pull down the archives. I > > will >>>>> >>>>> also post this in the dev mailing list, as I have created a >> >> >> dedicated >>>>> >>>>> >>>>> thread there for CMake discussion. Join me there! >>>> >>>> >>>> >>>> >>>> It is not problem for me to host the packages, and I do need them >>> >>> >>> >>> myself when I >>>> >>>> >>>> >>>> build a doxygen release. So for Windows (32bit/64bit + debug/release >>> >>> >>> >>> flavors) and >>>> >>>> >>>> >>>> MacOSX (32bit+64bit intel fat binaries for OSX 10.5+) this seems > > like >>> >>> >>> >>> a good approach. >>>> >>>> >>>> >>>> For Linux, however, it would be better to depend on the packages > > that >>> >>> >>> >>> come with >>>> >>>> >>>> >>>> a distribution (there are too many distros to support). 
|
From: Dimitri v. H. <do...@gm...> - 2013-06-09 21:53:00
|
Hi Robert,

On Jun 9, 2013, at 23:13, Robert Dailey <rcd...@gm...> wrote:

> The problem with depending on linux package managers is that you won't be able to use those libraries on Windows or MacOS. So, as annoying as it is, maintaining the packages yourself is the most portable solution.

Then CMake doesn't offer a solution for those. So it seems the solution will mostly be for Windows (MacOS can be treated as a Unix flavor as well).

So maybe we should rethink what the problem is that we are trying to solve and if CMake is indeed the solution, or if adding some extra rules to the Doxygen.vcproj file will do just as well.

> What linux platforms/distros does Doxygen need to support?

Well, there are many Linux distros that now bundle doxygen (Ubuntu, Fedora, RedHat, Mint, Debian, Arch, Gentoo, etc.) and then there is BSD (Free, Net, Open, DragonFly flavors) and Solaris (Sparc/x86) and maybe some more exotic Unix flavors.

I do not provide packages for any of those, but package maintainers should be able to build doxygen using the other packages.

> Which third party libraries are we referring to?

For building: perl, python, flex, bison, sed
For compiling: Qt4 (for Doxywizard), Xapian (for doxysearch), libclang (for clang support), libmd5 (now bundled with doxygen).
Install time: LaTeX for generating the manual.
Runtime (optional): Graphviz, mscgen, bibtex, epstopdf, dvips

Note that a number of these packages depend on other packages themselves.

Regards,
Dimitri

> On Sun, Jun 9, 2013 at 2:18 PM, Kevin McBride <dol...@ai...> wrote:
>> Robert,
>>
>> That sounds like a good idea. I will leave the final decision to Dimitri, as he is the one who will have to commit the changes to GIT.
>>
>> Kevin McBride
>> dol...@ai...
>>
>> -----Original Message-----
>> From: Robert Dailey <rcd...@gm...>
>> To: Kevin McBride <dol...@ai...>
>> Cc: Doxygen <do...@gm...>; Doxygen Developers <dox...@li...>
>> Sent: Sun, Jun 9, 2013 1:45 pm
>> Subject: Re: [Doxygen-develop] CMake
>>
>> I think what I will do is redesign my framework to take a hybrid approach. I have a manifest file where you define the third party libraries plus their versions. I will change this so you define whether or not to download from a custom repository or download from package manager. In the latter case, you will need to define a script that takes a library name and version. This script will execute the platform's package manager to make sure that the appropriate includes and binaries are downloaded (or compiled) as part of the process CMake goes through to prepare third party libs for usage.
>>
>> I know that most linux distros have very different package managers. Syntax is different, and some download binaries while others download source. The source case will be difficult, as this custom script will not only need to invoke the platform's package manager to download the source but also build each package the moment it is downloaded in some uniform way. Does this seem feasible?
>>
>> On Sat, Jun 8, 2013 at 6:42 PM, Kevin McBride <dol...@ai...> wrote:
>>>
>>> Another area to consider is the standard C/C++ libraries. I have read the info behind the libraries, and found that this is where you could run into problems, as every std library is configured differently to make system calls to the Linux kernel. It is much better to compile dependencies during the actual building process (like the QTools package in doxygen's source code).
>>>
>>> When I used to compile RPMs (I was the one who came up with the `make rpm' command to the makefiles of Doxygen) I compiled under the Fedora distro, which had a dynamic libpng installed. I always had the linking process dynamically link to the libpng in the Fedora distro. The libpng dynamic linking enhancement was not included in the master repository because Dimitri and I found only a small speed difference when using libpng that some distros provided.
>>>
>>> Can cmake do a configure process just like what is currently done to compile the doxygen source code? I really do think it is best to link dynamically to the common libraries that almost every distro has. You run into less problems this way, especially considering the example that the binary form of the standard C/C++ libraries do differ from distro to distro.
>>>
>>> Kevin McBride
>>> dol...@ai...
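For the build-time tools Dimitri lists above (perl, python, flex, bison, sed), CMake already ships stock Find modules, so a rough configure-time sketch could look like the lines below; the scanner/parser file names are invented purely for illustration:

    find_package(Perl REQUIRED)
    find_package(PythonInterp REQUIRED)
    find_package(FLEX REQUIRED)
    find_package(BISON REQUIRED)
    find_program(SED_EXECUTABLE sed)   # no stock Find module for sed; find_program() suffices
    # Hypothetical scanner/parser inputs, just to show the stock FLEX/BISON macros:
    FLEX_TARGET(doxy_scanner scanner.l ${CMAKE_CURRENT_BINARY_DIR}/scanner.cpp)
    BISON_TARGET(doxy_parser parser.y ${CMAKE_CURRENT_BINARY_DIR}/parser.cpp)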
|
From: Robert D. <rcd...@gm...> - 2013-06-10 17:13:30
|
The only other thing I can suggest to remedy the dependencies problem is to bundle all dependency source code with Doxygen and build it along with everything else, much like you already do for libmd5. This would also include any dependencies of dependencies, transitively. This doesn't seem like it would be too big of a deal except for Qt, which is pretty huge. Thoughts?
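A minimal sketch of what such bundling would look like on the CMake side, following the libmd5 example; the target and directory names are assumptions, not the actual doxygen layout:

    add_subdirectory(libmd5)                      # bundled copy living in the source tree
    add_executable(doxygen ${DOXYGEN_SOURCES})    # DOXYGEN_SOURCES assumed to be set elsewhere
    target_link_libraries(doxygen md5)            # 'md5' target name assumed from libmd5/CMakeLists.txt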
|
From: Markus G. <mg...@we...> - 2013-06-11 18:06:31
|
Robert,

> The only other thing I can suggest to remedy the dependencies problem
> is to bundle all dependency source code with Doxygen and build it
> along with everything else, much like you already do for libmd5. This
> also would include any dependencies of dependencies, transitively.
> This doesn't seem like it would be too big of a deal except for QT,
> which is pretty huge. Thoughts?

I strongly recommend to *not* create such a monster tarball because of various reasons:

- Most importantly, it will become a maintenance nightmare. Someone needs to keep track of new versions of all the dependency packages, and based on the changelog decide whether an update is required for doxygen (e.g., when a security issue has been fixed) or not.

- Additional burden is put onto the distro package maintainers, as they will most likely patch the sources to use the system-provided libraries instead of the included ones (this is, e.g., already the case for libpng on Debian and maybe other distros).

- It will unnecessarily increase build times for the 99% of users who have the common libraries already installed (not such a big deal if parallel builds are working, but Qt will be a killer).

>>>> I know that most linux distros have very different package managers.
>>>> Syntax is different, and some download binaries while others download
>>>> source. The source case will be difficult, as this custom script will
>>>> not only need to invoke the platform's package manager to download the
>>>> source but also build each package the moment it is downloaded in some
>>>> uniform way. Does this seem feasible?

Builds are normally done as an ordinary user, while installation of packages is done as root. So, trying to install packages using a package manager at build time of doxygen sounds crazy to me.

>>>>> Another area to consider is the standard C/C++ libraries. I have
>>>>> read the info behind the libraries, and found that this is where you
>>>>> could run into problems, as every std library is configured differently
>>>>> to make system calls to the Linux kernel.

Yep, libstdc++ would really be an issue. Typically, you can build a binary against a version of libstdc++ and then run it using a newer one, but not vice versa. Therefore, the best thing to do is to always use the system-installed C++ library.

>>>>> It is much better to compile dependencies during the actual building
>>>>> process (like the QTools package in doxygen's source code).

Hmm... I disagree. Often, the dependencies are used by multiple packages, i.e., you don't want to compile/install them multiple times. And as soon as they are shared libraries, you might end up in a big mess...

My recommendation would be to stick to proven best practices: Let the configure step try to determine whether an appropriate version of a dependency library is installed on the system, use it if available, and complain (or disable optional functionality) otherwise. Of course, there should also be a way for the user to specify an alternate installation path to override the auto-detection, and a list of all dependencies including URLs in the INSTALL file (which I think is already the case).

Best,
Markus
|
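A rough CMake sketch of the detect-and-fall-back pattern Markus describes; Xapian/doxysearch serve only as an example of an optional feature, and the module, source file, and variable names are assumptions:

    find_package(Xapian)   # assumes a FindXapian.cmake module shipped in the source tree
    if(XAPIAN_FOUND)
      include_directories(${XAPIAN_INCLUDE_DIR})
      add_executable(doxysearch doxysearch.cpp)          # source file name is an assumption
      target_link_libraries(doxysearch ${XAPIAN_LIBRARIES})
    else()
      message(STATUS "Xapian not found - doxysearch will not be built")
    endif()
    # The user can override the auto-detection, e.g.: cmake -DCMAKE_PREFIX_PATH=/opt/xapian ..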
From: Michael S. <ms...@re...> - 2013-06-19 10:32:59
|
On 11/06/13 20:05, Markus Geimer wrote:
>> The only other thing I can suggest to remedy the dependencies problem
>> is to bundle all dependency source code with Doxygen and build it
>> along with everything else, much like you already do for libmd5. This
>> also would include any dependencies of dependencies, transitively.
>> This doesn't seem like it would be too big of a deal except for QT,
>> which is pretty huge. Thoughts?
>
> I strongly recommend to *not* create such a monster tarball because of
> various reasons:
>
> - Most importantly, it will become a maintenance nightmare. Someone
>   needs to keep track of new versions of all the dependency packages,
>   and based on the changelog decide whether an update is required for
>   doxygen (e.g., when a security issue has been fixed) or not.
>
> - Additional burden is put onto the distro package maintainers, as
>   they will most likely patch the sources to use the system-provided
>   libraries instead of the included ones (this is, e.g., already the
>   case for libpng on Debian and maybe other distros).
>
> - It will unnecessarily increase build times for 99% of the users
>   which have the common libraries already installed (not such a big
>   deal if parallel builds are working, but Qt will be a killer).

That's all very true. There is a sort of "compromise" between bundling stuff and not bundling stuff that is used by LibreOffice: we have dozens of external libraries that are bundled, and we can build all of these during the LibreOffice build, but they are not actually included in the git repo or in source tarballs. It works like this:

1. configure has for most externals a --with[out]-system-foo switch to determine if it should be bundled (defaults to bundled on Windows)
2. if configure says it's bundled, then a source tarball is automatically downloaded from somewhere.libreoffice.org
3. the tarball is unpacked, some patches are applied (most things don't build out of the box on all platforms), the external thing is built and its libraries copied to a well-known library dir

Of course Doxygen has a lot fewer dependencies and so a much smaller problem, so I wouldn't recommend copying the approach exactly, but perhaps it can provide some inspiration. Most likely it's not necessary to build bundled libraries on Linux, because package managers allow you to easily install stuff there; it's more of a problem for Mac and Windows.

>>>>>> Another area to consider is the standard C/C++ libraries. I have
>>>>>> read the info behind the libraries, and found that this is where you
>>>>>> could run into problems, as every std library is configured differently
>>>>>> to make system calls to the Linux kernel.
>
> Yep, libstdc++ would really be an issue. Typically, you can build
> a binary against a version of libstdc++ and then run it using a
> newer one, but not vice versa. Therefore, the best thing to do is
> to always use the system-installed C++ library.

Yes, if you want to distribute binaries of Doxygen that are supposed to run on any old Linux system you need to think about this; it's not just libstdc++ but also glibc, or really any system library. For LibreOffice we build the upstream release Linux binaries on a RHEL5 system (very old versions of everything) for that reason; for OpenOffice.org a similar approach was used with an NFS-mounted baseline of libraries and the GCC --sysroot (which prevents it from looking in standard system dirs).

But given that there's a Doxygen package for every distro anyway and it's a developer-focused tool (so users can be expected to build it from source if needed), I'm not sure why you'd bother to do this :)
|
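Translated to CMake terms, the system-vs-bundled switch described in the message above could look roughly like this sketch; libpng, the download URL, and the Unix-style configure/make build are placeholders, not an actual LibreOffice or doxygen setup:

    option(USE_SYSTEM_PNG "Use the system-provided libpng instead of a bundled copy" ON)
    if(USE_SYSTEM_PNG)
      find_package(PNG REQUIRED)            # FindPNG ships with CMake
    else()
      include(ExternalProject)
      ExternalProject_Add(bundled_png
        URL http://example.org/external/libpng-1.6.2.tar.gz   # placeholder URL
        PREFIX ${CMAKE_BINARY_DIR}/external
        CONFIGURE_COMMAND <SOURCE_DIR>/configure --prefix=<INSTALL_DIR>
        BUILD_COMMAND make
        INSTALL_COMMAND make install)
      set(PNG_INCLUDE_DIRS ${CMAKE_BINARY_DIR}/external/include)
      set(PNG_LIBRARIES ${CMAKE_BINARY_DIR}/external/lib/libpng.a)
    endif()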
From: Anthony F. <ant...@gm...> - 2013-06-12 02:48:04
|
Robert, Markus, all --

On Tue, Jun 11, 2013 at 12:05 PM, Markus Geimer <mg...@we...> wrote:
> Robert,
>
>> The only other thing I can suggest to remedy the dependencies problem
>> is to bundle all dependency source code with Doxygen and build it
>> along with everything else, much like you already do for libmd5. This
>> also would include any dependencies of dependencies, transitively.
>> This doesn't seem like it would be too big of a deal except for QT,
>> which is pretty huge. Thoughts?
>
> I strongly recommend to *not* create such a monster tarball because of
> various reasons:

I very strongly concur.

Robert's proposal (create private copies of everything) was the route taken by Chromium. Getting it accepted into Fedora has been delayed for years because the original packagers did not use the libraries already available on the platform.

(Not to mention the fact that, with the "monster tarball" approach, you also need to keep up with security patches for all of those programs you copied...)

More info on this particular case here:
http://ostatic.com/blog/making-projects-easier-to-package-why-chromium-isnt-in-fedora

> My recommendation would be to stick to proven best practices: Let the
> configure step try to determine whether an appropriate version of a
> dependency library is installed on the system, use it if available,
> and complain (or disable optional functionality) otherwise.

+1.

I can't believe that CMake doesn't have this capability already, even if it's in the guise of "is this symbol available from any known library on the system?"

Either way, I do wish Robert luck; CMake has the potential to improve cross-platform projects quite a bit, but I agree with Markus that bringing in tarballs of dependencies is not the right way.

Best regards,
Anthony Foiani
|
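CMake does cover the symbol-level probing asked about above, through its CheckSymbolExists and CheckLibraryExists modules; a small sketch, with iconv chosen only as an arbitrary example:

    include(CheckSymbolExists)
    include(CheckLibraryExists)
    # Is iconv_open available via the default libraries?
    check_symbol_exists(iconv_open "iconv.h" HAVE_ICONV_BUILTIN)
    if(NOT HAVE_ICONV_BUILTIN)
      # Otherwise, is there a separate libiconv providing it?
      check_library_exists(iconv iconv_open "" HAVE_LIBICONV)
    endif()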
From: Torbjörn K. <ope...@to...> - 2013-06-12 06:29:40
|
Dear all,

On 12.06.2013 04:47, Anthony Foiani wrote:
> Robert, Markus, all --
>
> On Tue, Jun 11, 2013 at 12:05 PM, Markus Geimer <mg...@we...> wrote:
>> Robert,
>>
>>> The only other thing I can suggest to remedy the dependencies
>>> problem is to bundle all dependency source code with Doxygen
>>> and build it along with everything else, much like you already
>>> do for libmd5. This also would include any dependencies of
>>> dependencies, transitively. This doesn't seem like it would be
>>> too big of a deal except for QT, which is pretty huge.
>>> Thoughts?

Only ever do this for libraries which do not provide pre-built Windows binaries or where most linux distributions do not provide packages (which is very, very rare). And even then you really need to know what you are doing, and should prefer to let the user resolve these dependencies on his own. He/she knows the system much better than you ever will via CMake.

CMake's purpose is exactly to overcome the need to create such monstrosities of bundles and maintenance nightmares.

>> I strongly recommend to *not* create such a monster tarball
>> because of various reasons:
>
> I very strongly concur.
>
> Robert's proposal (create private copies of everything) was the
> route taken by Chromium. Getting it accepted into Fedora has been
> delayed for years because the original packagers did not use the
> libraries already available on the platform.
>
> (Not to mention the fact that, with the "monster tarball"
> approach, you also need to keep up with security patches for all of
> those programs you copied...)
>
> More info on this particular case here:
> http://ostatic.com/blog/making-projects-easier-to-package-why-chromium-isnt-in-fedora
>
>> My recommendation would be to stick to proven best practices: Let
>> the configure step try to determine whether an appropriate
>> version of a dependency library is installed on the system, use
>> it if available, and complain (or disable optional functionality)
>> otherwise.
>
> +1.
>
> I can't believe that CMake doesn't have this capability already,
> even if it's in the guise of "is this symbol available from any
> known library on the system?"

CMake does have such possibilities. It's called

FIND_PACKAGE(mypackage REQUIRED)

This will abort the whole configuration step if 'mypackage' is not available on the system (read: not found in the system's paths).

E.g. to require the Boost system and regex libraries and headers of version 1.50 and later one writes

FIND_PACKAGE(Boost 1.50 REQUIRED COMPONENTS system regex)

For a list of the packages with built-in search support, see [1] and all items starting with 'Find'. In case the desired package is not in this list, one usually finds such a module easily via Google (or GitHub).

If you ever encounter a library where no FindX module is available, you should try detecting its existence on the system via the low-level CMake commands FIND_LIBRARY and FIND_PROGRAM, which are used by the FindX modules.

[1] http://www.cmake.org/cmake/help/v2.8.11/cmake.html#section_StandardCMakeModules

> Either way, I do wish Robert luck; CMake has the potential to
> improve cross-platform projects quite a bit, but I agree with
> Markus that bringing in tarballs of dependencies is not the right
> way.

I sign this.

> Best regards, Anthony Foiani

Cheers,
Torbjörn Klatt
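As an illustration of the FIND_PATH/FIND_LIBRARY fallback mentioned above, a minimal hand-written Find module might look like the following sketch; Xapian is just an example of a library without a stock FindX module, and the variable names are assumptions:

    # FindXapian.cmake - minimal sketch
    find_path(XAPIAN_INCLUDE_DIR xapian.h)
    find_library(XAPIAN_LIBRARY NAMES xapian)

    include(FindPackageHandleStandardArgs)
    find_package_handle_standard_args(Xapian DEFAULT_MSG
      XAPIAN_LIBRARY XAPIAN_INCLUDE_DIR)

    mark_as_advanced(XAPIAN_INCLUDE_DIR XAPIAN_LIBRARY)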
|
From: Robert D. <rcd...@gm...> - 2013-06-18 17:17:20
|
Sorry for the delay in response.

So basically the problem is between Windows and Linux. On Windows, I do not have a package manager. It's a huge pain in the ass to download each dependency, as well as its dependencies (transitively), and build each individually. This sometimes takes days, and can be impossible depending on what libs you use. If anyone tried to build SVN on Windows 5 years ago, they would know what I'm talking about.

However, on Linux things are much easier in this respect. It's OK to let the user install the dependencies because they just type in apt-get and they are done.

find_package() is ideally the cross-platform way of handling dependencies in CMake, but the problem is just that: fulfilling the dependencies, as I noted above, is rather painful on Windows. Not only that, but CMake doesn't come pre-packaged with find modules for each and every library known to man. For any libraries that CMake does not know how to search for, you must write the corresponding find package script. Granted, it's a one time thing, but it adds the occasional extra maintenance.

If we go with find_package in CMake, which is the most likely approach we will have to take, then we will be required to have *compiled* versions of dependencies on the system. Whether that means the package manager on that distro already downloads the binaries *or* the additional step of building them must be taken, is outside of the control of CMake. It'll have to be something that is taken care of externally (I think this is the case already, based on the feedback so far).

Can anyone think of an easier way to set up dependencies on Windows? Qt has an installer, which is fine, and CMake also already has a find module for it that ships with CMake. What about the others? For example, if we depended on OpenSSL, it's rather painful to build on Windows outside of Cygwin (even with Cygwin, you still have to set up the environment to point to the correct compiler & stuff; it is much easier to run the Visual Studio bundled command prompt scripts). In those cases, getting up and running on Windows can still be a pain :(
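One partial mitigation on Windows, sketched under the assumption that prebuilt dependencies have been unpacked somewhere, is to point find_package() at them via CMAKE_PREFIX_PATH; the paths and the FindXapian module below are invented for illustration:

    # e.g. cmake -DCMAKE_PREFIX_PATH="C:/doxygen-deps/qt4;C:/doxygen-deps/xapian" ..
    # or hinted from the top-level CMakeLists.txt:
    list(APPEND CMAKE_PREFIX_PATH "${CMAKE_SOURCE_DIR}/winbuild/deps")   # hypothetical layout
    find_package(Qt4 COMPONENTS QtCore QtGui)   # FindQt4 ships with CMake
    find_package(Xapian)                        # assumes a hand-written FindXapian.cmake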
|
From: Bastiaan V. <Bas...@SA...> - 2013-06-18 20:25:47
|
Hi,

I must admit that I haven't studied this in great detail, but if most binary dependencies are available for Windows from somewhere, so that CMake would work well for 90% of the dependencies, one possibility for the remaining 10% would be to distribute binaries on the doxygen site for the sole purpose that CMake can download them from there. What are the percentages, actually?

It may be an obvious thing to say, but I thought I'd bring it up nonetheless.

Bastiaan.
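A rough sketch of how CMake could fetch such a prebuilt archive at configure time; the URL, archive name, and directory layout are purely hypothetical:

    set(DEPS_URL "http://www.doxygen.org/files/deps")               # hypothetical download area
    set(XAPIAN_PKG "xapian-1.2.15-win32-msvc10.tar.gz")             # hypothetical archive name
    file(MAKE_DIRECTORY "${CMAKE_BINARY_DIR}/deps")
    if(WIN32 AND NOT EXISTS "${CMAKE_BINARY_DIR}/deps/${XAPIAN_PKG}")
      file(DOWNLOAD "${DEPS_URL}/${XAPIAN_PKG}"
           "${CMAKE_BINARY_DIR}/deps/${XAPIAN_PKG}" SHOW_PROGRESS)
      execute_process(COMMAND ${CMAKE_COMMAND} -E tar xzf ${XAPIAN_PKG}
                      WORKING_DIRECTORY "${CMAKE_BINARY_DIR}/deps")
    endif()
    list(APPEND CMAKE_PREFIX_PATH "${CMAKE_BINARY_DIR}/deps/xapian")  # then find_package() as usual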
|