I can see that that would make things easier for packagers. With most "normal" users of PyPI using the .whl files, in which I don't want to include the tests, I consider it less problematic to include tests in the .tar.gz version. If I can easily (i.e. automatically) differentiate the two with my development tools, I might consider doing that, but I am also considering moving all tests to sub-packages and I am not sure I want to support both.
Please be aware that my development tool, when pushing a new version, runs several checks and tox, and will barf when something doesn't run as expected. That of course does not find any issues in setup.py caused by backward incompatible changes in libraries used by that package, and unfortunately I have to work around limitations/bugs in the setup tools of Python using some, obviously non-stable, internals. (Since the setup.py is shared between packages, driven by the data in the individual __init__.py files, I knew about this issue, but have been too lazy to pro-actively update all packages on PyPI, sorry for that.)
(originally posted on 2019-07-12 at 07:40:05)
Apart from using setup.py as a template, my philosophy is to minimize the “crap” in my Python projects, i.e. to reduce the number of files needed in a module by generating them on the fly from data loaded (not eval-ed) from the one file you need to have anyway (``__init__.py``), and to avoid subdirectories called ``src`` or ``lib`` that prevent you from having a normal source tree on which you can use standard Python imports.
My initial reaction is that adding a `mobanfile` to the projects would clash with that.
Apart from that, please realise that less than 1/3 of the packages that I build and maintain are open source; any changes will have to work for the non-open-source projects I work on for clients as well. That doesn’t mean I don’t want to look into things, but if something is going to be changed on all my projects and that cannot be automated, you only see the tip of the iceberg of what needs to be done.
(originally posted on 2019-07-17 at 09:25:39)
As for the download problems, it is probably easier if I make a “full” tar file, including tests, automatically and put that on my server for you to download. I would give you a secret, non-guessable, base URL, and when you see a new version on PyPI (or GitHub or whatever way you currently use to know when to rebuild your RPM), you get ruamel.yaml.cmd-X.Y.Z.tar.xz (and the other relevant packages) from there. If I put that in my develop command, depending on __init__.py, that file will be automatically available in time.
(originally posted on 2019-07-17 at 09:38:19)
Why not use BitBucket "downloads"? IIRC you can upload custom tarballs there.
That avoids the ruamel-yaml-25737c624a8c directory names inside the tarball that BitBucket creates, as it is customary to have the directory name inside the tarball be ruamel-yaml-%version
re moban, I usually have a .moban.yaml inside the repo, but then that is all that is needed, and then I can regenerate all files to ensure the repo is in sync, and also generate other per-repo files which other tools might need. Those can be added to .*ignore, because I don't want to version their per-repo files in every repo while I am tracking the file (or a template for the file) in a central repo. I also use moban with private repos for work.

There is one 'scale' type problem I have been meaning to fix, which I think you have touched on: how to propagate changes to the .moban.yaml in each repo from the central repo. I currently solve this (for work only, where scale is important; not yet used in OSS) by having a template for .moban.yaml, but then moban needs to be run twice if that template changes - once to update the control file and then again to use the control file - and sometimes that "run twice" can make the template logic messy. That is https://github.com/moremoban/moban/issues/127

If you are willing to accept a .moban.yaml in each repo, I'll get that issue solved. If you can't handle a .moban.yaml in each repo, I'll create some issues to allow syncing without any control file, as I am fairly sure the tool could work out 99% of the syncing required automatically, especially as you don't have many files which need syncing because you've essentially made */__init__.py your control file - PON? We could even add support for the control file to be PON or .py, and have it guess which */__init__.py is the master control file.
(originally posted on 2019-07-17 at 10:05:05 by John Vandenberg <John Vandenberg@bitbucket>)
I know how to work around the problem, but the problem remains. The .spec should use standard rpm macros for %setup, otherwise it increases the maintenance costs for each distro.
If you don't want to upload tarballs that match %name-%version.tar.gz, since BitBucket doesn't do this, the approach I will take is
Also note that the URL of the download needs to be in the .spec, and there are tools which check that the stored tarball is the same as the online version, so it can't be a private online resource, and it can't be fiddled with before uploading to the rpm build workers.
Ok, questions:

1. How do I name the tar file? ruamel-yaml-convert-0.3.1.tar (convert all dots to dashes), ruamel-yaml.convert-0.3.1.tar (i.e. user - repo name), or ruamel.yaml.convert-0.3.1.tar (tar file named as on PyPI; this has my preference because of consistency, but I am not sure that works for you)? I can adapt the top-level directory to match any of those.
2. I have no idea how many versions I can (and want to) keep around, so I want to do some pruning of old uploads. I am thinking of rules like
   - keep any file that is newer than N months
   - keep the latest micro version (X.Y.latest) for every X.Y for M months
   - allow entries to be explicitly pinned (via ``__init__.py>_package_data``? That would allow pinning via a PR)
3. Do you expect .tar.bz2? Would .tar.xz be OK? I remember SuSE as being slow to adopt better compression schemes, but that was for data in the RPMs, and maybe it does not apply to source material.
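A minimal sketch of how pruning rules of this kind could work, assuming the tarballs sit in one directory. The `prune_old` name, the `pinned.txt` pin list, and the 30-days-per-month approximation are all illustrative, and the keep-latest-micro rule is left out:

```shell
#!/bin/sh
# List tarballs older than N months that are not explicitly pinned.
# pinned.txt (one file name per line) is a hypothetical pin list.
prune_old() {
    dir=$1
    months=$2
    find "$dir" -maxdepth 1 -name '*.tar.xz' -mtime +"$((months * 30))" |
    while IFS= read -r f; do
        base=$(basename "$f")
        # pinned entries are never pruned
        if [ -f "$dir/pinned.txt" ] && grep -qxF "$base" "$dir/pinned.txt"; then
            continue
        fi
        echo "would prune: $base"
    done
}

# demo against a throwaway directory with one old and one fresh tarball
d=$(mktemp -d)
touch -d '400 days ago' "$d/ruamel.yaml-0.1.0.tar.xz"
touch "$d/ruamel.yaml-0.2.0.tar.xz"
prune_old "$d" 6
# → would prune: ruamel.yaml-0.1.0.tar.xz
```

A real run would `rm` instead of `echo`, but printing first makes the rule set easy to inspect before anything disappears.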
(originally posted on 2019-07-19 at 08:54:53)
The %name part is quite flexible, and you probably cannot make everyone happy there.
openSUSE rpm names are strictly PyPI names; i.e. ruamel.yaml.convert-0.3.1.tar is their preference.
But other distros have different naming conventions, and it doesn't matter much due to %setup -n.
It is common to use %setup -n tar-ball-name-%version with a literal in the name part, because it can't be derived from the rpm name. GitHub tarballs often require this, as the repo name is used in the tarball name and the repo name often isn't the PyPI package name.
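For illustration, a fragment of a hypothetical .spec where the tarball's directory name cannot be derived from the rpm name (the URL and names here are made up):

```spec
Name:           python-ruamel.yaml.convert
Version:        0.3.1
# The upstream tarball unpacks into yaml.convert-0.3.1, which cannot
# be derived from %{name}, so %setup needs an explicit -n argument:
Source0:        https://example.org/downloads/yaml.convert-%{version}.tar.xz

%prep
%setup -q -n yaml.convert-%{version}
```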
rpmbuild and the .deb equivalent tools on most distros now support any of .gz/.bz2/.xz. openSUSE definitely does on all versions of the OS that they still do builds for, and .xz is their preferred format for custom built tarballs.
Tarballs are uploaded into the openSUSE build system, and into the Fedora equivalent, so I think you are safe to prune the old ones whenever you feel they are not relevant any more, especially earlier point releases. Ultimately anyone wanting to get an old release can still use the tag system provided by BitBucket. The tarball checks done by build bots may start screaming at maintainers if your old tarballs disappear, but the tarball will have been verified when it was first pushed into the build systems, so they know that the tarball is trusted even if it stops being validated. I'm sure they have some way to turn off the tarball hash verification - you won't be the first package tarball host that has tarballs disappearing. As long as the latest point releases are still available, the maintainers have an obvious solution - update the package.
(originally posted on 2019-07-19 at 12:00:54 by John Vandenberg <John Vandenberg@bitbucket>)
On CentOS 7.0 the same directory structure seems to be used, and there it works if the .so is in /usr/lib64/python2.7/site-packages and the rest of ruamel.yaml is under /usr/lib/python2.7/site-packages. Splitting the binary off into a subpackage is going to make my building for PyPI much simpler, as the C source doesn’t change that often and I don’t have to regenerate all the wheels for the different platforms.
(originally posted on 2019-07-22 at 11:12:53)
I split off the .so generation from `ruamel.yaml` into `ruamel.yaml.clib`, which makes a .whl that has no .pth or .py files. At some point I have to look at how to install from the .tar.gz and not get those files.
ruamel.yaml is dependent on ruamel.yaml.clib for CPython 2.7/3.5/3.6/3.7. Currently a 3.8 wheel cannot be made yet by manylinux, and because installing ruamel.yaml.clib from the tar.gz gets you the .pth (and __init__.py) file, that doesn’t work (unless you remove the .pth by hand).
I have a mechanised routine to check out a tagged version from Bitbucket and upload the .tar.xz to the download area. It is not part of the normal “push” process I invoke on release, so there will be some delay in getting that file up (more if I forget to start that routine). I am considering changing this to work from my local copy (checking out cleanly based on the tag).
Can you let me know if that works for you? If not, what needs to change?
My current push is: test for sanity; run tox; commit; tag; push to my backup server; push to bitbucket; make distribution files; push distribution files to PyPI.
Do you automatically trigger your RPM builds? If so, what triggers them? I can obviously not generate a tar.xz before I tag, but I could do so before pushing to Bitbucket (from my local repo).
(originally posted on 2019-07-26 at 07:08:31)
There is no automation between a PyPI release and the openSUSE package being updated.
I have a script which searches for new PyPI releases, but I run that manually, and only when I have time or reason to do some openSUSE packaging.
I'm taking a look at the new releases now.
(originally posted on 2019-07-28 at 11:14:51 by John Vandenberg <John Vandenberg@bitbucket>)
Those new tarballs worked wonderfully. Thanks so much. I'll let you know when the openSUSE packages have been updated in the main repos.
Note that the .convert tarball with tests hasn't been created yet, but that is not so important, as .cmd has tests which mostly cover .convert. Feel free to close this if you don't intend to provide separate tarballs for .convert.
(originally posted on 2019-07-28 at 15:43:37 by John Vandenberg <John Vandenberg@bitbucket>)
I generated the tar.xz for ruamel.yaml.convert and ruamel.yaml.jinja2.
The tar.xz generation is now also part of the add-tag-to-local-repo step, before pushing the repo to Bitbucket and certainly before uploading to PyPI. So if things change on PyPI, there should be a new .tar.xz.
(originally posted on 2019-07-28 at 16:47:25)
No rush. I can grab tarballs from Bitbucket with the tests, but a Bitbucket URL in a .spec doesn't look as good as it used to.
And https://bitbucket.org/ruamel/yaml.convert/get/0.3.1.tar.gz has a tarball which expands to ruamel-yaml.convert-9400b6b23cda/*, which isn't as nice as GitHub's archives, and makes the .spec a little less standardised. Maintainers are less likely to want to have to find the new tarball here.
On all your templates and setuptools hackery: I would be keen to import all your setup.py voodoo into the templates at https://github.com/moremoban/pypi-mobans (i.e. https://github.com/moremoban/pypi-mobans/blob/dev/templates/setup.py.jj2), which means you could use moban to assist in the repo management, and you would get a few features in return. We use it a lot at coala and pyexcel due to lots of interacting packages. Your approach to setup.py is quite similar. coala and pyexcel each have their own templates as well, e.g. https://gitlab.com/coala/mobans/ and https://github.com/pyexcel/pyexcel-mobans. If this interests you at all, I can do a sample of what it might look like for you, so you can better evaluate it.
(originally posted on 2019-07-12 at 08:23:26 by John Vandenberg <John Vandenberg@bitbucket>)
Bitbucket downloads, now that is a novel idea ( https://bitbucket.org/ruamel/winpysetup/downloads/ ).
The doctor said something about …heimer, I have forgotten exactly what it was. :-)
(originally posted on 2019-07-17 at 10:21:41)
If you’re extracting the downloaded tar file anyway, you can create the target directory yourself and strip the leading path component while extracting, to get things extracted directly into a directory ruamel-yaml-0.15.100, with the `_test` subdir etc.
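A sketch of such an extraction, assuming GNU tar; the archive contents and names below are illustrative stand-ins for a BitBucket download:

```shell
# Build a stand-in for a BitBucket download: its top-level directory
# carries a commit hash, not the version.
work=$(mktemp -d) && cd "$work"
mkdir -p ruamel-yaml-25737c624a8c/_test
echo data > ruamel-yaml-25737c624a8c/_test/t.txt
tar -czf 0.15.100.tar.gz ruamel-yaml-25737c624a8c

# The technique: pre-create a predictably named directory and drop
# the archive's own first path component while extracting.
mkdir ruamel-yaml-0.15.100
tar -xzf 0.15.100.tar.gz -C ruamel-yaml-0.15.100 --strip-components=1
ls ruamel-yaml-0.15.100
# → _test
```

This works without knowing the hash-suffixed directory name in advance, which is the advantage over rewriting names with --transform.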
You can probably achieve the same with GNU tar’s --transform, but then you would need to know the name of the initial path segment.
(originally posted on 2019-07-17 at 14:07:50)
BitBucket bug closed as inactive https://bitbucket.org/site/master/issues/6007/download-filenames-should-include-the
(originally posted on 2019-07-19 at 03:29:44 by John Vandenberg <John Vandenberg@bitbucket>)
Can you check if https://bitbucket.org/ruamel/yaml/downloads/?tab=downloads contains what you need (for ruamel.yaml, version 0.15.100) and https://bitbucket.org/ruamel/yaml.cmd/downloads/?tab=downloads for ruamel.yaml.cmd version 0.5.4 ?
(originally posted on 2019-07-19 at 15:23:39)