From: Travis O. <oli...@ee...> - 2006-10-05 22:19:29

A. M. Archibald wrote:
> On 05/10/06, Greg Willden <gre...@gm...> wrote:
>> On 10/5/06, Travis Oliphant <oli...@ee...> wrote:
>>> Perhaps that is the best way to move forward along with the work on a
>>> "pylab" super-package.
>>
>> That is exactly what I want.
>
> What is unsatisfactory about installing numpy+scipy+matplotlib? I've
> found they're generally pretty complete (except where no decent python
> alternative exists).
>
>> In the end I want a nice collection of functions, logically organized, that
>> let me analyze/filter/plot etc. etc. etc.
>>
>> The key for me is "logically organized".

There is a structure to it, but it's more organic because of the multiple contributors. weave should be in NumPy, but nobody was willing to step up to maintain it a year ago. I may be willing to step up at this point. I would like to see weave in NumPy (maybe not the blitz libraries, though...). I think a hybrid of weave / f2py / ctypes that allows "inlining in multiple languages", as well as automatic extension module generation for "already-written" code, is in order.

-Travis
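Travis's last point — wrapping "already-written" code — is what the ctypes leg of that hybrid would cover. A minimal sketch, assuming a POSIX system where the C math library can be located (the library and function here are just stand-ins for whatever compiled code you want to call; no wrapper-generation or compilation step is needed):

```python
import ctypes
import ctypes.util

# Locate and load an existing shared library -- here the C math library.
# (find_library may return None on some platforms; this sketch assumes POSIX.)
libm = ctypes.CDLL(ctypes.util.find_library("m"))

# Declare the signature so ctypes converts arguments and results correctly.
libm.cos.argtypes = [ctypes.c_double]
libm.cos.restype = ctypes.c_double

print(libm.cos(0.0))  # 1.0
```

The trade-off versus weave/f2py is that ctypes does everything at runtime: nothing is compiled, but you must spell out each signature by hand.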
From: Travis O. <oli...@ee...> - 2006-10-05 22:11:35

Martin Wiechert wrote:
> Hi list,
>
> when I try to assign a sequence as an element of an object array via flat
> indexing, only the first element of the sequence is assigned:
>
> >>> import numpy
> >>> numpy.version.version
> '1.0rc1.dev3171'
> >>> from numpy import *
> >>> a = ndarray((2, 2), object)
> >>> a.flat[2] = (1, 2, 3)
> >>> a.flat[2]
> 1
> >>> a
> array([[None, None],
>        [1, None]], dtype=object)
>
> Is this a feature? Wouldn't a naive user like me expect
> a.flat[2] == (1, 2, 3)?

You are probably right. This should be changed.

-Travis
From: Matthew B. <mat...@gm...> - 2006-10-05 21:30:39

Hi,

On 10/5/06, Martin Wiechert <mar...@gm...> wrote:
> Hi list,
>
> when I try to assign a sequence as an element of an object array via flat
> indexing only the first element of the sequence is assigned:

I've also been having trouble with flat on object arrays. Is this intended?

In [1]: from numpy import *
In [2]: a = arange(2)
In [3]: a[1]
Out[3]: 1
In [4]: a.flat[1]
Out[4]: 1
In [5]: b = array([a], dtype=object)
In [6]: b[1]
---------------------------------------------------------------------------
exceptions.IndexError                     Traceback (most recent call last)
/home/mb312/devel_trees/scipy/Lib/io/<ipython console>
IndexError: index out of bounds
In [7]: b.flat[1]
Out[7]: 1

Best,

Matthew
From: Christopher B. <Chr...@no...> - 2006-10-05 20:23:00

The situation is confusing, we all know that, and we all want to move toward a better way. Key to that is that SciPy needs to be easier to build and install -- that's happening, but I don't know that it's there yet. Maybe it can be built on Fedora Core 4 now, but last I tried, it couldn't be. Anyway, a few thoughts on comments made here:

> Matplotlib plots the spectrogram

Here's a key problem -- matplotlib includes WAY too much. There are reasons, and there is history, but as a goal, I think matplotlib should be just what it says it is -- a plotting library. I don't know that MPL has been declared the "official" plotting package for SciPy, but it would be nice if it were. SciPy has suffered for a very long time without a full-featured, cross-platform, "official" plotting package. As far as I know, MPL comes the closest (except it doesn't do 3-d -- darn!).

A. M. Archibald wrote:
> Frankly, I tend to prefer the other approach to solving all these
> issues: distribute numpy, scipy, and matplotlib as one bundle.

This is really the only way to set things up for someone who wants what could be used as a "matlab replacement". If we ever get setuptools working just right, we should be able to do:

    easy_install scipy

and have it all! Woo hoo!

However, as we work toward that goal, I do think it makes sense that numpy and matplotlib be packages unto themselves -- developed separately, etc. In fact, SciPy should be a collection of distinct packages as well. I think there is a real benefit to being able to install just what you need. Not every user of numpy and/or MPL is looking for a full scientific software package. I'm a big advocate of people using numpy arrays for all sorts of things that fit well into an n-d array but have nothing to do with scientific programming. Matplotlib is also very useful for people who need a quick plot for a web site or something. These people don't want to install the entirety of SciPy.

> * routines and objects can be in the package in which they make the
> most semantic sense

Exactly. If it's plotting (but not computing stuff to plot), put it in MPL; if it's generic computation (if you can't understand it with high school math, it's not generic), put it in numpy. Of course, these aren't clear definitions, but they can still be useful as guidelines.

> * documentation can be cross-referenced between packages (so the
> Matrix class can tell people to look in scipy.linalg for inverses, for
> example)

If it were me, I'd probably have the Matrix package depend on a linear algebra package, and have only one of those.

Travis Oliphant wrote:
> 3) some kind of problem-domain hierarchy

+1

> Do we want to pull scipy apart into two components: one that needs
> Fortran to build and another that doesn't?

Yup -- I don't like it, but the state of Fortran compilers really gives little choice.

-Chris

--
Christopher Barker, Ph.D.
Oceanographer
NOAA/OR&R/HAZMAT
(206) 526-6959 voice / (206) 526-6329 fax / (206) 526-6317 main reception
7600 Sand Point Way NE, Seattle, WA 98115
Chr...@no...
From: Tim H. <tim...@ie...> - 2006-10-05 20:21:10

Travis Oliphant wrote:
> Tim Hochberg wrote:
>>> That would be easy to do. Right now the opcodes should work correctly
>>> on data that is spaced in multiples of the itemsize on the last axis.
>>> Other arrays are copied (no opcode required, it's embedded at the top
>>> of interp_body lines 64-80). The record array case apparently slips
>>> through the cracks when we're checking whether an array is suitable to
>>> be used correctly (interpreter.c 1086-1103). It would certainly not be
>>> any harder to only allow contiguous arrays than to correctly deal with
>>> record arrays. The only question I have is whether the extra copy will
>>> overwhelm the savings that operating on contiguous data gives. The
>>> thing to do is probably try it and see what happens.
>>
>> OK, I've checked in a fix for this that makes a copy when the array is
>> not strided in an even multiple of the itemsize. I first tried copying
>> for all discontiguous arrays, but this resulted in a large speed hit for
>> vanilla strided arrays (a=arange(10)[::2], etc.), so I was more frugal
>> with my copying. I'm not entirely certain that I caught all of the
>> problematic cases, so let me know if you run into any more issues like this.
>
> There is an ElementStrides check and similar requirement flag you can
> use to make sure that you have an array whose strides are multiples of
> its itemsize.

Thanks Travis, I'll make a note; next time I look at this code I'll see if that can be used to simplify the code in question.

-tim
From: Albert S. <fu...@gm...> - 2006-10-05 20:19:44

Hello all

Some comments from a Windows user's perspective.

On Thu, 05 Oct 2006, Travis Oliphant wrote:
> John Hunter wrote:
>> Robert> IMO, I'd rather see this and similar functions go into
>> Robert> scipy. New functions that apply semantics to arrays (in
>> Robert> this case, treating them as time series), I think should
>> Robert> go into scipy. New functions that treat arrays simply as
>> Robert> arrays and are generally useful can probably go into
>> Robert> numpy.
>>
>> I prefer Perry's longstanding suggestion: things that do not add to
>> distribution complexity should go into numpy. If it compiles as
>> easily as numpy itself, it should go into numpy where sensible.
>
> I don't think this is as feasible as it sounds at first. Some people
> complain that NumPy is too big already.

I agree here. Focus NumPy on doing array library things.

> SciPy is very easy to install on Windows (there is a binary available).
> The only major platform that still gives some trouble is Intel Mac (due
> to the fortran compiler situation). But, all you need is one person
> who can build it and distribute a binary.

So far, I've been shying away from SciPy, because, if I encounter a problem, I have no easy way of building from SVN on Windows. I don't think I'm the only one: few Windows users have a proper Fortran compiler. Sure, there's MinGW, but that breaks all my other tools, most notably the Visual Studio debugger and other useful things like profilers (e.g. IBM Rational Quantify).

That being said, Enthought's nightly builds obviate the need of most Windows users to build from source. (Enthought rocks.)

Two feature requests at this point, which would make NumPy/SciPy/Matplotlib dead easy to use on Windows, even if you're trying to stay close to trunk:

1. Please consider setting up a buildbot(*) that builds and runs the tests on every checkin. I've set up a buildbot for NumPy on my own machine; it takes a matter of minutes. Probably they already have something like this in place.

   (*) http://buildbot.sourceforge.net/

2. Please consider doing separate builds per CPU with ATLAS 3.7.11, Intel MKL and ACML. By all means, make a generic build available that runs everywhere. This will require some reading of the MKL license agreement, but I think Enthought should be able to distribute Windows builds based on MKL without problems.

Why go to this trouble? MATLAB R2006b uses Intel MKL on my CPU, and it is much faster than NumPy with ATLAS 3.6.0. Core Duo users also have the option of enabling OpenMP, to spread calculations to multiple cores.

> I think a better long-term solution is to understand how to package
> things better by working with people at Enthought so that when you
> advertise to the ex-Matlab user you point him to a "super-package" that
> installs a bunch of other small packages. This is a more maintainable
> solution as long as we set standards for
>
> 1) documentation
> 2) tests
> 3) some kind of problem-domain hierarchy

Agreed. If Enthought is willing to provide the resources, Enthon could be the perfect solution to many of the issues that we currently encounter to get decent builds on Windows.

> The idea of just lumping more and more things into NumPy itself is not a
> good idea. What most users want is something that installs easily (like
> Enthon). How it is packaged is not as important. What developers need
> is a sane multi-namespace system that can be maintained separately if
> needed.
>
> Do we want to pull scipy apart into two components: one that needs
> Fortran to build and another that doesn't?

Maybe. Maybe not. On Linux it doesn't make much difference to me whether I check out 3 projects or 10 -- builds are easy. On Windows, getting the support libraries, build tools and configuration right is much harder. Hard enough that I don't think anybody wants to do it regularly.

> Perhaps that is the best way to move forward along with the work on a
> "pylab" super-package.

Yes, please.

Regards,

Albert
From: Martin W. <mar...@gm...> - 2006-10-05 20:18:43

Hi list,

when I try to assign a sequence as an element of an object array via flat indexing, only the first element of the sequence is assigned:

>>> import numpy
>>> numpy.version.version
'1.0rc1.dev3171'
>>> from numpy import *
>>> a = ndarray((2, 2), object)
>>> a.flat[2] = (1, 2, 3)
>>> a.flat[2]
1
>>> a
array([[None, None],
       [1, None]], dtype=object)

Is this a feature? Wouldn't a naive user like me expect a.flat[2] == (1, 2, 3)?

Thanks in advance,

Martin Wiechert
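For contrast, a small sketch of the behavior the report expects (using the modern `np.empty` spelling rather than the `ndarray` constructor; `.flat` assignment itself varied by NumPy version, so only direct indexing is shown): assigning through a single-element index stores the whole tuple as the element.

```python
import numpy as np

# A 2x2 array of Python objects, initially filled with None.
a = np.empty((2, 2), dtype=object)

# Single-element assignment stores the tuple itself -- the behavior
# the report argues a.flat[2] should have as well.
a[1, 0] = (1, 2, 3)
print(a[1, 0])  # (1, 2, 3)
```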
From: A. M. A. <per...@gm...> - 2006-10-05 20:13:14

On 05/10/06, Greg Willden <gre...@gm...> wrote:
> On 10/5/06, Travis Oliphant <oli...@ee...> wrote:
>> Perhaps that is the best way to move forward along with the work on a
>> "pylab" super-package.
>
> That is exactly what I want.

What is unsatisfactory about installing numpy+scipy+matplotlib? I've found they're generally pretty complete (except where no decent python alternative exists).

> In the end I want a nice collection of functions, logically organized, that
> let me analyze/filter/plot etc. etc. etc.
>
> The key for me is "logically organized".

For the most part, the organization is pretty logical:

* Basic array and matrix operations are in numpy
* Linear algebra, differential equation, interpolation, etc. tools are in scipy, each in their own subpackage
* weave is mysteriously in scipy
* Plotting tools are in matplotlib

There are a few historical quirks, like window functions in numpy (they really belong in scipy), and some of the less-used scipy subpackages are a bit of a mess internally (scipy.interpolate, for example), but otherwise I'm not sure what you want to be different.

> And right now that means "So a newbie can find the function I need and the
> function I need is there"
>
> I'm not criticising. I'd like to help get there.

Install all three major packages. Use the window functions from scipy in matplotlib. Task-oriented documentation is so far a bit scant, although the scipy cookbook (http://www.scipy.org/Cookbook) and the numpy examples list (http://www.scipy.org/Numpy_Example_List) are a good start.

A. M. Archibald
From: Alan G I. <ai...@am...> - 2006-10-05 20:11:45

> On 10/5/06, John Hunter <jdh...@ac...> wrote:
>> It would be nice to have as much as possible in the most
>> widely distributed package IMO.

On Thu, 5 Oct 2006, Greg Willden apparently wrote:
> That is a much better policy in my view.

A user's perspective: well yes, all else equal, I'd like to have as much as possible in the easiest-to-install package, BUT my top priority is that numpy be completely bulletproof. Do these goals conflict?

Cheers,
Alan Isaac
From: Robert K. <rob...@gm...> - 2006-10-05 20:06:14

Greg Willden wrote:
> On 10/5/06, Robert Kern <rob...@gm...> wrote:
>> Greg Willden wrote:
>>> From my view as a newbie to numpy/scipy/matplotlib it isn't clear where
>>> I should look for what functionality. Matplotlib plots the spectrogram
>>> but it only supports two or three window functions. Numpy supports 4 or
>>> 5 window functions and Scipy apparently supports more, but Matplotlib
>>> doesn't support Scipy.
>>
>> That's not true. specgram() takes a windowing function. It doesn't matter
>> where that function comes from.
>
> The next sentence (that you snipped) affirms what you said.
>
> <quote>
> Of course this is a minor example and I could just write the window
> function myself and then use it in Matplotlib
> </quote>
>
> The details of my off-the-cuff example would probably be better
> addressed by the Matplotlib folks (i.e. why they don't have built-in
> functions for more windows).
>
> You can see how it's all very confusing to someone new.
> "Matplotlib can plot a spectrogram but I need to use a window function
> from SciPy because Matplotlib only supports NumPy and NumPy doesn't have
> the one I want?"
>
> You get the idea.

No, I'm afraid I don't.

--
Robert Kern

"I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth." -- Umberto Eco
From: Greg W. <gre...@gm...> - 2006-10-05 20:04:58

On 10/5/06, Travis Oliphant <oli...@ee...> wrote:
> Perhaps that is the best way to move forward along with the work on a
> "pylab" super-package.

That is exactly what I want.

In the end I want a nice collection of functions, logically organized, that let me analyze/filter/plot etc. etc. etc.

The key for me is "logically organized". And right now that means "so a newbie can find the function I need, and the function I need is there".

I'm not criticising. I'd like to help get there.

Greg

--
Linux. Because rebooting is for adding hardware.
From: Greg W. <gre...@gm...> - 2006-10-05 19:56:41

On 10/5/06, Robert Kern <rob...@gm...> wrote:
> Greg Willden wrote:
>> From my view as a newbie to numpy/scipy/matplotlib it isn't clear where
>> I should look for what functionality. Matplotlib plots the spectrogram
>> but it only supports two or three window functions. Numpy supports 4 or
>> 5 window functions and Scipy apparently supports more, but Matplotlib
>> doesn't support Scipy.
>
> That's not true. specgram() takes a windowing function. It doesn't matter
> where that function comes from.

The next sentence (that you snipped) affirms what you said.

<quote>
Of course this is a minor example and I could just write the window function myself and then use it in Matplotlib
</quote>

The details of my off-the-cuff example would probably be better addressed by the Matplotlib folks (i.e. why they don't have built-in functions for more windows).

You can see how it's all very confusing to someone new. "Matplotlib can plot a spectrogram, but I need to use a window function from SciPy, because Matplotlib only supports NumPy and NumPy doesn't have the one I want?"

You get the idea.

Greg

--
Linux. Because rebooting is for adding hardware.
From: Travis O. <oli...@ee...> - 2006-10-05 19:54:08

John Hunter wrote:
> Robert> IMO, I'd rather see this and similar functions go into
> Robert> scipy. New functions that apply semantics to arrays (in
> Robert> this case, treating them as time series), I think should
> Robert> go into scipy. New functions that treat arrays simply as
> Robert> arrays and are generally useful can probably go into
> Robert> numpy.
>
> I prefer Perry's longstanding suggestion: things that do not add to
> distribution complexity should go into numpy. If it compiles as
> easily as numpy itself, it should go into numpy where sensible.

I don't think this is as feasible as it sounds at first. Some people complain that NumPy is too big already.

SciPy is very easy to install on Windows (there is a binary available). The only major platform that still gives some trouble is Intel Mac (due to the fortran compiler situation). But, all you need is one person who can build it and distribute a binary.

I think a better long-term solution is to understand how to package things better by working with people at Enthought, so that when you advertise to the ex-Matlab user you point him to a "super-package" that installs a bunch of other small packages. This is a more maintainable solution as long as we set standards for

1) documentation
2) tests
3) some kind of problem-domain hierarchy

The idea of just lumping more and more things into NumPy itself is not a good idea. What most users want is something that installs easily (like Enthon). How it is packaged is not as important. What developers need is a sane multi-namespace system that can be maintained separately if needed.

I think we decided a while ago that the package approach should contain indicators as to whether or not a fortran compiler was needed to build the system, so that dependency on those things could be eliminated if needed.

Do we want to pull scipy apart into two components: one that needs Fortran to build and another that doesn't?

Perhaps that is the best way to move forward along with the work on a "pylab" super-package.

-Travis
From: A. M. A. <per...@gm...> - 2006-10-05 19:46:51

On 05/10/06, Greg Willden <gre...@gm...> wrote:
> That is a much better policy in my view.
>
> I (gently) encourage this group (Travis?) to make this the policy for
> Numpy/Scipy.
>
> From my view as a newbie to numpy/scipy/matplotlib it isn't clear where I
> should look for what functionality. Matplotlib plots the spectrogram but it
> only supports two or three window functions. Numpy supports 4 or 5 window
> functions and Scipy apparently supports more but Matplotlib doesn't support
> Scipy. Of course this is a minor example and I could just write the window
> function myself and then use it in Matplotlib but I want to give back so
> that the project can grow. I'd really like to be able to leave Matlab
> behind and encourage everyone else to do the same but there are still these
> annoyances that need to be worked out.

Unfortunately, that policy (put it in numpy if it doesn't make the build dependencies any worse) makes it even harder for the user to figure out what is where. Say I want a fast matrix product. Do I look in numpy or scipy? It'll run faster if it uses a tuned BLAS, so it ought to have external requirements, so I'd look in scipy, but maybe there's a simple non-tuned implementation in numpy instead...

Frankly, I tend to prefer the other approach to solving all these issues: distribute numpy, scipy, and matplotlib as one bundle. The requirements for scipy are not particularly onerous, particularly if it comes as part of your distribution. There are currently some problems successfully finding optimized versions of LAPACK and BLAS, but to me the benefits of bundling the packages together outweigh the difficulties:

* routines and objects can be in the package in which they make the most semantic sense, rather than the one with the correct external dependencies (how is a user supposed to know whether convolution uses an external library or not?)
* documentation can be cross-referenced between packages (so the Matrix class can tell people to look in scipy.linalg for inverses, for example)
* users can more easily find what's available rather than rewriting from scratch
* derived packages (ipython, IDEs, MATLAB-alikes) have many fewer possibilities to support

I'm not arguing that the development processes need to be connected, just that making the primary distributed object a package containing all the components will make life easier for everyone involved.

A. M. Archibald
From: Robert K. <rob...@gm...> - 2006-10-05 19:40:25

Greg Willden wrote:
> From my view as a newbie to numpy/scipy/matplotlib it isn't clear where
> I should look for what functionality. Matplotlib plots the spectrogram
> but it only supports two or three window functions. Numpy supports 4 or
> 5 window functions and Scipy apparently supports more but Matplotlib
> doesn't support Scipy.

That's not true. specgram() takes a windowing function. It doesn't matter where that function comes from.

--
Robert Kern

"I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth." -- Umberto Eco
From: Greg W. <gre...@gm...> - 2006-10-05 19:25:19

On 10/5/06, John Hunter <jdh...@ac...> wrote:
> I prefer Perry's longstanding suggestion: things that do not add to
> distribution complexity should go into numpy. If it compiles as
> easily as numpy itself, it should go into numpy where sensible. It
> remains a fact of life that numpy gets a wider distribution than
> scipy, and some packages are hesitant to require scipy as a prereq
> because of the additional complexity of building fortran, etc. It
> would be nice to have as much as possible in the most widely
> distributed package IMO.

That is a much better policy in my view.

I (gently) encourage this group (Travis?) to make this the policy for Numpy/Scipy.

From my view as a newbie to numpy/scipy/matplotlib it isn't clear where I should look for what functionality. Matplotlib plots the spectrogram but it only supports two or three window functions. Numpy supports 4 or 5 window functions and Scipy apparently supports more, but Matplotlib doesn't support Scipy. Of course this is a minor example and I could just write the window function myself and then use it in Matplotlib, but I want to give back so that the project can grow. I'd really like to be able to leave Matlab behind and encourage everyone else to do the same, but there are still these annoyances that need to be worked out.

Thanks,
Greg

--
Linux. Because rebooting is for adding hardware.
From: Bryce H. <bhe...@en...> - 2006-10-05 18:47:56

Robert Kern wrote:
>> Perhaps the best solution is to complain to the setuptools list, I'm
>> just looking for a quick fix for now.
>
> Patch setup.py in our build system, I would think.

That's what I did, but patching working copies of files has been troublesome in the past, when svn conflicts have wiggled their way into shipped builds.

> What revision of numpy and what version of setuptools are you using? setuptools
> 0.7a1 and numpy r3261 correctly recognizes numpy as not zip-safe.

0.6c3. I think 0.6c1 did _not_ tag it as zip safe either. Guess I'll just update setuptools on our build system.

Bryce
From: Robert K. <rob...@gm...> - 2006-10-05 18:28:42

Bryce Hendrix wrote:
> Robert Kern wrote:
>> It is not zip-safe if you want to compile against the headers. That keyword
>> can't be added to the setup() call in the trunk's setup.py because numpy cannot
>> depend on setuptools, at the moment.
>
> Adding the keyword does not break builds not using setuptools, the build
> just prints a warning that it's not a valid keyword. I just discovered
> this is also an issue for scipy eggs when building with weave.

A warning is not really acceptable in the trunk, either. We've found that warnings during the build process tend to make people think that something went wrong.

> Perhaps the best solution is to complain to the setuptools list, I'm
> just looking for a quick fix for now.

Patch setup.py in our build system, I would think.

What revision of numpy and what version of setuptools are you using? setuptools 0.7a1 and numpy r3261 correctly recognizes numpy as not zip-safe:

    zip_safe flag not set; analyzing archive contents...
    numpy._import_tools: module references __file__
    numpy._import_tools: module references __path__
    numpy.version: module references __file__
    numpy.core.generate_array_api: module references __file__
    numpy.core.setup: module references __file__
    numpy.distutils.exec_command: module references __file__
    numpy.distutils.misc_util: module references __file__
    numpy.distutils.system_info: module references __file__
    numpy.distutils.command.build_src: module references __file__
    numpy.f2py.diagnose: module references __file__
    numpy.f2py.f2py2e: module references __file__
    numpy.f2py.setup: module references __file__
    numpy.f2py.lib.wrapper_base: module references __file__
    numpy.lib.utils: module MAY be using inspect.getsource
    numpy.lib.utils: module MAY be using inspect.getsourcefile
    numpy.numarray.util: module references __file__
    numpy.testing.numpytest: module references __file__

--
Robert Kern

"I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth." -- Umberto Eco
From: Bryce H. <bhe...@en...> - 2006-10-05 17:48:29

Robert Kern wrote:
> It is not zip-safe if you want to compile against the headers. That keyword
> can't be added to the setup() call in the trunk's setup.py because numpy cannot
> depend on setuptools, at the moment.

Adding the keyword does not break builds not using setuptools; the build just prints a warning that it's not a valid keyword. I just discovered this is also an issue for scipy eggs when building with weave.

Perhaps the best solution is to complain to the setuptools list; I'm just looking for a quick fix for now.

Bryce
From: Robert K. <rob...@gm...> - 2006-10-05 17:44:25

Bryce Hendrix wrote:
> Is anyone using the numpy egg compressed? Recent versions of setuptools
> seem to think it is zip safe, but this causes builds to fail which
> compile with the numpy headers. Can we change the setup.py to default to
> not zip safe (by adding zip_safe=False as a keyword arg to the setup
> function call)?

It is not zip-safe if you want to compile against the headers. That keyword can't be added to the setup() call in the trunk's setup.py because numpy cannot depend on setuptools, at the moment.

--
Robert Kern

"I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth." -- Umberto Eco
From: Bryce H. <bhe...@en...> - 2006-10-05 17:30:31

Is anyone using the numpy egg compressed? Recent versions of setuptools seem to think it is zip-safe, but this causes builds which compile with the numpy headers to fail. Can we change the setup.py to default to not zip-safe (by adding zip_safe=False as a keyword arg to the setup() function call)?

Bryce
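For reference, the change being requested would look something like this sketch (not numpy's actual setup.py — the name, version, and package list are illustrative only, and as noted elsewhere in the thread, adding the keyword to the trunk was considered unacceptable at the time because numpy could not depend on setuptools):

```python
# Hypothetical setup.py fragment illustrating the zip_safe flag.
from setuptools import setup

setup(
    name="numpy",
    version="1.0rc1",   # illustrative version only
    packages=["numpy"],
    # Install unpacked so C headers exist as real files on disk that
    # other packages can compile against; a zipped egg would hide them.
    zip_safe=False,
)
```

With plain distutils, unknown keywords like zip_safe are ignored with a warning rather than an error, which is why the flag is harmless (if noisy) for non-setuptools builds.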
From: Tim H. <tim...@ie...> - 2006-10-05 16:52:52

Ivan Vilata i Balaguer wrote:
> En/na Tim Hochberg ha escrit::
>> Ivan Vilata i Balaguer wrote:
>>> It seemed that discontiguous arrays worked OK in Numexpr since r1977 or
>>> so, but I have come across some alignment or striding problems which can
>>> be seen with the following code::
>>
>> I looked at this just a little bit, and clearly this bit from interp_body
>> cannot work in the presence of record arrays:
>>
>>     //....
>>     intp sf1 = sb1 / sizeof(double);
>>     //...
>>     #define f1 ((double *)x1)[j*sf1]
>>
>> There are clearly some assumptions that sb1 is evenly divisible by
>> sizeof(double). [...]
>
> I noticed something strange in those statements when implementing
> support for strings, and I must confess that I didn't grasp their
> meaning, so I implemented it a little differently for strings::
>
>     #define s1 ((char *)x1 + j*params.memsteps[arg1])

I believe that these approaches are the same as long as memstep is a multiple of itemsize. I chose the indexing rather than the pointer foo version[1] because there's a rumor that compilers will sometimes generate faster code for it. One additional potential slowdown in the above is that the compiler may not be able to tell that memsteps[arg1] is constant and may do that lookup repeatedly. Or maybe not; I try not to second-guess compilers too much.

-tim

[1] I'm pretty sure I used the pointer foo version at least for a while, and may have gone back and forth several times.

> That seemed to work, but it might not be right (though I tested a bit),
> and certainly it may not be efficient enough. Here you have my previous
> patches if you want to have a look at how I (try to) do it:
>
> 1. http://www.mail-archive.com/numpy-discussion%40lists.sourceforge.net/msg01551.html
> 2. http://www.mail-archive.com/numpy-discussion%40lists.sourceforge.net/msg02261.html
> 3. http://www.mail-archive.com/numpy-discussion%40lists.sourceforge.net/msg02644.html
From: John H. <jdh...@ac...> - 2006-10-05 16:33:33

>>>>> "Robert" == Robert Kern <rob...@gm...> writes:

Robert> IMO, I'd rather see this and similar functions go into
Robert> scipy. New functions that apply semantics to arrays (in
Robert> this case, treating them as time series), I think should
Robert> go into scipy. New functions that treat arrays simply as
Robert> arrays and are generally useful can probably go into
Robert> numpy.

I prefer Perry's longstanding suggestion: things that do not add to distribution complexity should go into numpy. If it compiles as easily as numpy itself, it should go into numpy where sensible. It remains a fact of life that numpy gets a wider distribution than scipy, and some packages are hesitant to require scipy as a prereq because of the additional complexity of building fortran, etc. It would be nice to have as much as possible in the most widely distributed package IMO.

JDH
From: Ivan V. i B. <iv...@ca...> - 2006-10-05 16:00:37

En/na Tim Hochberg ha escrit::

> Tim Hochberg wrote:
>>>> Ivan Vilata i Balaguer wrote:
>>>>> It seemed that discontiguous arrays worked OK in Numexpr since r1977 or
>>>>> so, but I have come across some alignment or striding problems which can
>>>>> be seen with the following code::
>
> OK, I've checked in a fix for this that makes a copy when the array is
> not strided in an even multiple of the itemsize. I first tried copying
> for all discontiguous arrays, but this resulted in a large speed hit for
> vanilla strided arrays (a=arange(10)[::2], etc.), so I was more frugal
> with my copying. I'm not entirely certain that I caught all of the
> problematic cases, so let me know if you run into any more issues like this.

Great! For the moment I can just say that the problems I had with that have disappeared, but I will keep you up to date if I find something else. Thank you very much!

::

Ivan Vilata i Balaguer  >qo<  http://www.carabos.com/
Cárabos Coop. V.         V V  Enjoy Data
                          ""
From: Greg W. <gre...@gm...> - 2006-10-05 15:58:19

On 10/5/06, Robert Kern <rob...@gm...> wrote:
> IMO, I'd rather see this and similar functions go into scipy. New
> functions that apply semantics to arrays (in this case, treating them as
> time series), I think should go into scipy. New functions that treat
> arrays simply as arrays and are generally useful can probably go into
> numpy.

Okay, I'll take a look at the Scipy parts tonight. So do you cancel that ticket I created, or do I attach a new patch against scipy, or what?

Thanks,
Greg

--
Linux. Because rebooting is for adding hardware.