On Wed, Dec 5, 2012 at 10:23 PM, Paul Ivanov <pivanov314@...> wrote:
> On Wed, Dec 5, 2012 at 1:52 PM, Nathaniel Smith <njs@...> wrote:
>> If you're defining your own warning class, you might consider using
>> FutureWarning instead of UserWarning.
>> We had a discussion about this issue for numpy recently:
>> What we eventually ended up with:
> Thanks for the pointers, Nathaniel. Though I think I disagree with
> continuing to use DeprecationWarnings for features that will go away
> and just break code - shouldn't users be given ample warning of
> coming changes, rather than having to find out when their code breaks
> at a future release?
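For concreteness, the suggestion in the quoted text - defining your own warning class as a FutureWarning subclass, since FutureWarning is shown by default while DeprecationWarning is hidden by default - might look like the sketch below. The class and function names here are hypothetical, not actual matplotlib API:

```python
import warnings

# Hypothetical project-specific warning class, subclassing FutureWarning
# so it is visible to end users without any filter configuration.
class ChangedBehaviorWarning(FutureWarning):
    """Behavior of an API will change in a future release."""

def load_data(path):
    # Hypothetical API whose return type is slated to change.
    warnings.warn(
        "load_data() will return a dict rather than a list in a future release",
        ChangedBehaviorWarning,
        stacklevel=2,  # attribute the warning to the caller, not this function
    )
    return []

# Capture the warning to show which category was emitted.
with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    load_data("example.csv")

print(caught[0].category.__name__)  # ChangedBehaviorWarning
```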
Yeah, there aren't any perfect solutions here. That's why I didn't
express an opinion on what you ought to do :-).
Basically, what the debate comes down to is this: deprecation warnings
are useful to developers, but annoying and scary to users. (And users can
easily end up seeing them, e.g. if they use a package which depends on
matplotlib, and then upgrade matplotlib, their existing package may
suddenly start spewing scary warnings, and that package's developers
can't do anything about this because this version of matplotlib is
newer than anything that existed when they released their package.)
This problem becomes worse the lower your package is in the stack, and
the more widely used it is by third-party packages.
It's easier to tell developers how to turn on deprecation warnings
than it is to tell users how to turn them off, so that's why the
Python stdlib turned them off by default, and similarly numpy.
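To illustrate the two sides of that trade-off, here's a small sketch using the stdlib warnings machinery; `call_old_api` is a hypothetical stand-in for a deprecated library call:

```python
import warnings

def call_old_api():
    # Hypothetical stand-in for a deprecated library function.
    warnings.warn("old_api is deprecated", DeprecationWarning, stacklevel=2)

# Developer side: opt in, the way a test suite would, by restoring the
# "default" filter for DeprecationWarning.
with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("default", DeprecationWarning)
    call_old_api()
print(len(caught))  # 1: the warning is now visible

# User side: opt back out by ignoring the category entirely.
with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("ignore", DeprecationWarning)
    call_old_api()
print(len(caught))  # 0: the warning is suppressed
```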
The main thing I took from this personally is that I went and added
'export PYTHONWARNINGS=default' to all my packages' test scripts, to
ensure deprecation warnings would be enabled...
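A test-script snippet along those lines might look like this (the test-runner invocation is hypothetical; the PYTHONWARNINGS variable itself is real, added in Python 2.7/3.2):

```shell
# Restore the "default" warning filter so DeprecationWarning and friends
# are shown while the tests run.
export PYTHONWARNINGS=default

# Sanity check: a deprecation warning now reaches stderr.
python -c 'import warnings; warnings.warn("old API", DeprecationWarning)' 2>&1 \
  | grep -c DeprecationWarning
```

The equivalent one-off form is `python -W default ...`, which sets the same filter for a single invocation.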