From: Charles R H. <cha...@gm...> - 2006-08-01 19:49:04
|
Hi David, I often have several thousand nodes in a graph, sometimes clustered into connected components. I suspect that using an adjacency matrix is an inefficient representation for graphs of that size while for smaller graphs the overhead of more complicated structures wouldn't be noticeable. Have you looked at the boost graph library? I don't like all their stuff but it is a good start with lots of code and a suitable license. Chuck On 8/1/06, David Grant <dav...@gm...> wrote: > > I have written my own graph class, it doesn't really do much, just has a > few methods, it might do more later. Up until now it has just had one piece > of data, an adjacency matrix, so it looks something like this: > > class Graph: > def __init__(self, Adj): > self.Adj = Adj > > I had the idea of changing Graph to inherit numpy.ndarray instead, so then > I can just access itself directly rather than having to type self.Adj. Is > this the right way to go about it? To inherit from numpy.ndarray? > > The reason I'm using a numpy array to store the graph by the way is the > following: > -Memory is not a concern (yet) so I don't need to use a sparse structure > like a sparse array or a dictionary > -I run a lot of sums on it, argmin, blanking out of certain rows and > columns using fancy indexing, grabbing subgraphs using vector indexing > > -- > David Grant > http://www.davidgrant.ca > > ------------------------------------------------------------------------- > Take Surveys. Earn Cash. Influence the Future of IT > Join SourceForge.net's Techsay panel and you'll get the chance to share > your > opinions on IT & business topics through brief surveys -- and earn cash > http://www.techsay.com/default.php?page=join.php&p=sourceforge&CID=DEVDEV > > _______________________________________________ > Numpy-discussion mailing list > Num...@li... > https://lists.sourceforge.net/lists/listinfo/numpy-discussion > > > |
From: Bill B. <wb...@gm...> - 2006-08-01 18:41:11
|
Hi David, For a graph, the fact that it's stored as a matrix, or stored as linked nodes, or dicts, etc, is an implementation detail. So from a classical OO point of view, inheritance is not what you want. Inheritance says "this is a kind of that". But a graph is not a kind of matrix. A matrix is merely one possible way to represent a graph. Many matrix operations don't even make sense on a graph (although a lot of them do...). Also you say "memory is not a concern (yet)", but maybe it will be later, and then you'll want to change the underlying representation. Ideally you will be able to do this in such a way that all your graph-using code works completely without modification. This will be harder to do if you derive from ndarray: to prevent existing code from breaking you would have to duplicate ndarray's interface exactly, because you don't know which ndarray methods are being used by all existing Graph-using code. On the other hand, in the short term it's probably easier to derive from ndarray directly if all you need is something quick and dirty. But maybe then you don't even need to make a graph class. All you need is Graph = ndarray. I've seen plenty of Matlab code that just uses raw matrices to represent graphs without introducing any new type or class. It may be that's good enough for what you want to do. Python is not really a "Classical OO" language, in the sense that there's no real data hiding, etc. Python's philosophy seems to be more like "whatever makes your life the easiest". So do what you think will make your life easiest based on the totality of your circumstances (including need for future maintenance). If memory is your only concern, then if/when it becomes an issue, a switch to a scipy.sparse matrix shouldn't be too bad if you want to just use the ndarray interface. --bill On 8/2/06, David Grant <dav...@gm...> wrote: > I have written my own graph class, it doesn't really do much, just has a few > methods, it might do more later. 
Up until now it has just had one piece of > data, an adjacency matrix, so it looks something like this: > > class Graph: > def __init__(self, Adj): > self.Adj = Adj > > I had the idea of changing Graph to inherit numpy.ndarray instead, so then I > can just access itself directly rather than having to type self.Adj. Is this > the right way to go about it? To inherit from numpy.ndarray? > > The reason I'm using a numpy array to store the graph by the way is the > following: > -Memory is not a concern (yet) so I don't need to use a sparse structure > like a sparse array or a dictionary > -I run a lot of sums on it, argmin, blanking out of certain rows and columns > using fancy indexing, grabbing subgraphs using vector indexing > > -- > David Grant > http://www.davidgrant.ca > ------------------------------------------------------------------------- > Take Surveys. Earn Cash. Influence the Future of IT > Join SourceForge.net's Techsay panel and you'll get the chance to share your > opinions on IT & business topics through brief surveys -- and earn cash > http://www.techsay.com/default.php?page=join.php&p=sourceforge&CID=DEVDEV > > _______________________________________________ > Numpy-discussion mailing list > Num...@li... > https://lists.sourceforge.net/lists/listinfo/numpy-discussion > > > |
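[Editor's note: a minimal sketch of the composition approach Bill argues for. This is hypothetical illustration code, not code from the thread; the method names `neighbors` and `degree` are made up for the example.]

```python
# Composition instead of inheritance: Graph *owns* an adjacency
# structure rather than *being* one, so the representation can later
# be swapped (e.g. for a numpy array or a sparse structure) without
# breaking any Graph-using code.
class Graph:
    def __init__(self, adj):
        # adj: any 2-D sequence of edge weights/flags
        self.adj = [list(row) for row in adj]

    def neighbors(self, node):
        """Indices of nodes adjacent to `node`."""
        return [j for j, w in enumerate(self.adj[node]) if w]

    def degree(self, node):
        return len(self.neighbors(node))

g = Graph([[0, 1, 1],
           [1, 0, 0],
           [1, 0, 0]])
print(g.neighbors(0))  # [1, 2]
print(g.degree(1))     # 1
```

Because callers only touch the Graph methods, changing `self.adj` to a scipy.sparse matrix later, as Bill suggests, would not require touching the calling code.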
From: David G. <dav...@gm...> - 2006-08-01 17:40:41
|
I have written my own graph class; it doesn't really do much, just has a few methods, and it might do more later. Up until now it has just had one piece of data, an adjacency matrix, so it looks something like this:

class Graph:
    def __init__(self, Adj):
        self.Adj = Adj

I had the idea of changing Graph to inherit from numpy.ndarray instead, so then I can just access itself directly rather than having to type self.Adj. Is this the right way to go about it? To inherit from numpy.ndarray? The reason I'm using a numpy array to store the graph, by the way, is the following:
- Memory is not a concern (yet), so I don't need to use a sparse structure like a sparse array or a dictionary
- I run a lot of sums on it, argmin, blanking out of certain rows and columns using fancy indexing, grabbing subgraphs using vector indexing
-- David Grant http://www.davidgrant.ca |
From: David G. <dav...@gm...> - 2006-08-01 16:56:19
|
I also couldn't reproduce it on my 0.9.8 on Linux. DG On 8/1/06, David L Goldsmith <Dav...@no...> wrote: > > Hi, Hanno. I ran your sample session in numpy 0.9.8 (on a Mac, just so > you know; I don't yet have numpy installed on my Windows platform, and I > don't have immediate access to a *nix box) and could not reproduce the > problem, i.e., it does appear to have been fixed in 0.9.8. > > DG > > Hanno Klemm wrote: > > Hello, > > > > numpy.var exhibits a rather dangereous behviour, as I have just > > noticed. In some cases, numpy.var calculates the variance, and in some > > cases the standard deviation (=square root of variance). Is this > > intended? I have to admit that I use numpy 0.9.6 at the moment. Has > > this been changed in more recent versions? > > > > Below a sample session > > > > > > Python 2.4.3 (#1, May 8 2006, 18:35:42) > > [GCC 3.2.3 20030502 (Red Hat Linux 3.2.3-52)] on linux2 > > Type "help", "copyright", "credits" or "license" for more information. > > > >>>> import numpy > >>>> a = [1,2,3,4,5] > >>>> numpy.var(a) > >>>> > > 2.5 > > > >>>> numpy.std(a) > >>>> > > 1.5811388300841898 > > > >>>> numpy.sqrt(2.5) > >>>> > > 1.5811388300841898 > > > >>>> a1 = numpy.array([[1],[2],[3],[4],[5]]) > >>>> a1 > >>>> > > array([[1], > > [2], > > [3], > > [4], > > [5]]) > > > >>>> numpy.var(a1) > >>>> > > array([ 1.58113883]) > > > >>>> numpy.std(a1) > >>>> > > array([ 1.58113883]) > > > >>>> a =numpy.array([1,2,3,4,5]) > >>>> numpy.std(a) > >>>> > > 1.5811388300841898 > > > >>>> numpy.var(a) > >>>> > > 1.5811388300841898 > > > >>>> numpy.__version__ > >>>> > > '0.9.6' > > > > > > > > Hanno > > > > > > > -- > HMRD/ORR/NOS/NOAA <http://response.restoration.noaa.gov/emergencyresponse/ > > > > ------------------------------------------------------------------------- > Take Surveys. Earn Cash. 
Influence the Future of IT > Join SourceForge.net's Techsay panel and you'll get the chance to share > your > opinions on IT & business topics through brief surveys -- and earn cash > http://www.techsay.com/default.php?page=join.php&p=sourceforge&CID=DEVDEV > _______________________________________________ > Numpy-discussion mailing list > Num...@li... > https://lists.sourceforge.net/lists/listinfo/numpy-discussion > -- David Grant http://www.davidgrant.ca |
From: Sasha <nd...@ma...> - 2006-08-01 16:07:35
|
I cannot reproduce your results, but I wonder if the following is right:

>>> a = array([1,2,3,4,5])
>>> var(a[newaxis,:])
array([ 0., 0., 0., 0., 0.])
>>> a[newaxis,:].var()
2.0
>>> a[newaxis,:].var(axis=0)
array([ 0., 0., 0., 0., 0.])

Are the method and function supposed to have different defaults? It looks like the method defaults to variance over all axes while the function defaults to axis=0.

>>> __version__
'1.0b2.dev2192'

On 8/1/06, Hanno Klemm <kl...@ph...> wrote: > > Hello, > > numpy.var exhibits a rather dangereous behviour, as I have just > noticed. In some cases, numpy.var calculates the variance, and in some > cases the standard deviation (=square root of variance). Is this > intended? I have to admit that I use numpy 0.9.6 at the moment. Has > this been changed in more recent versions? > > Below a sample session > > > Python 2.4.3 (#1, May 8 2006, 18:35:42) > [GCC 3.2.3 20030502 (Red Hat Linux 3.2.3-52)] on linux2 > Type "help", "copyright", "credits" or "license" for more information. > >>> import numpy > >>> a = [1,2,3,4,5] > >>> numpy.var(a) > 2.5 > >>> numpy.std(a) > 1.5811388300841898 > >>> numpy.sqrt(2.5) > 1.5811388300841898 > >>> a1 = numpy.array([[1],[2],[3],[4],[5]]) > >>> a1 > array([[1], > [2], > [3], > [4], > [5]]) > >>> numpy.var(a1) > array([ 1.58113883]) > >>> numpy.std(a1) > array([ 1.58113883]) > >>> a =numpy.array([1,2,3,4,5]) > >>> numpy.std(a) > 1.5811388300841898 > >>> numpy.var(a) > 1.5811388300841898 > >>> numpy.__version__ > '0.9.6' > > > > Hanno > > -- > Hanno Klemm > kl...@ph... > > > > ------------------------------------------------------------------------- > Take Surveys. Earn Cash. 
Influence the Future of IT > Join SourceForge.net's Techsay panel and you'll get the chance to share your > opinions on IT & business topics through brief surveys -- and earn cash > http://www.techsay.com/default.php?page=join.php&p=sourceforge&CID=DEVDEV > _______________________________________________ > Numpy-discussion mailing list > Num...@li... > https://lists.sourceforge.net/lists/listinfo/numpy-discussion > |
From: Ivan V. i B. <iv...@ca...> - 2006-08-01 16:02:34
|
Hi all, I'm attaching some patches that enable the current version of numexpr (r2142) to:

1. Handle int64 integers in addition to int32 (constants, variables and arrays). Python int objects are considered int32 if they fit in 32 bits. Python long objects and int objects that don't fit in 32 bits (for 64-bit platforms) are considered int64.
2. Handle string constants, variables and arrays (not Unicode), with support for comparison operators (==, !=, <, <=, >=, >). (This brings the old ``memsizes`` variable back.) String temporaries (necessary for other kinds of operations) are not supported.

The patches also include test cases and some minor corrections (e.g. removing odd carriage returns in some lines in compile.py). There are three patches to ease their individual review:

* numexpr-int64.diff only contains the changes for int64 support.
* numexpr-str.diff only contains the changes for string support.
* numexpr-int64str.diff contains all changes.

The task has been somewhat difficult, but I think the result fits quite well in numexpr. So, what's your opinion about the patches? Are they worth integrating into the main branch? Thanks! :: Ivan Vilata i Balaguer >qo< http://www.carabos.com/ Cárabos Coop. V. V V Enjoy Data "" |
From: David L G. <Dav...@no...> - 2006-08-01 15:59:25
|
Hi, Hanno. I ran your sample session in numpy 0.9.8 (on a Mac, just so you know; I don't yet have numpy installed on my Windows platform, and I don't have immediate access to a *nix box) and could not reproduce the problem, i.e., it does appear to have been fixed in 0.9.8. DG Hanno Klemm wrote: > Hello, > > numpy.var exhibits a rather dangereous behviour, as I have just > noticed. In some cases, numpy.var calculates the variance, and in some > cases the standard deviation (=square root of variance). Is this > intended? I have to admit that I use numpy 0.9.6 at the moment. Has > this been changed in more recent versions? > > Below a sample session > > > Python 2.4.3 (#1, May 8 2006, 18:35:42) > [GCC 3.2.3 20030502 (Red Hat Linux 3.2.3-52)] on linux2 > Type "help", "copyright", "credits" or "license" for more information. > >>>> import numpy >>>> a = [1,2,3,4,5] >>>> numpy.var(a) >>>> > 2.5 > >>>> numpy.std(a) >>>> > 1.5811388300841898 > >>>> numpy.sqrt(2.5) >>>> > 1.5811388300841898 > >>>> a1 = numpy.array([[1],[2],[3],[4],[5]]) >>>> a1 >>>> > array([[1], > [2], > [3], > [4], > [5]]) > >>>> numpy.var(a1) >>>> > array([ 1.58113883]) > >>>> numpy.std(a1) >>>> > array([ 1.58113883]) > >>>> a =numpy.array([1,2,3,4,5]) >>>> numpy.std(a) >>>> > 1.5811388300841898 > >>>> numpy.var(a) >>>> > 1.5811388300841898 > >>>> numpy.__version__ >>>> > '0.9.6' > > > > Hanno > > -- HMRD/ORR/NOS/NOAA <http://response.restoration.noaa.gov/emergencyresponse/> |
From: Hanno K. <kl...@ph...> - 2006-08-01 11:53:29
|
Hello, numpy.var exhibits a rather dangerous behaviour, as I have just noticed. In some cases, numpy.var calculates the variance, and in some cases the standard deviation (= square root of the variance). Is this intended? I have to admit that I use numpy 0.9.6 at the moment. Has this been changed in more recent versions? Below is a sample session:

Python 2.4.3 (#1, May 8 2006, 18:35:42)
[GCC 3.2.3 20030502 (Red Hat Linux 3.2.3-52)] on linux2
Type "help", "copyright", "credits" or "license" for more information.
>>> import numpy
>>> a = [1,2,3,4,5]
>>> numpy.var(a)
2.5
>>> numpy.std(a)
1.5811388300841898
>>> numpy.sqrt(2.5)
1.5811388300841898
>>> a1 = numpy.array([[1],[2],[3],[4],[5]])
>>> a1
array([[1],
       [2],
       [3],
       [4],
       [5]])
>>> numpy.var(a1)
array([ 1.58113883])
>>> numpy.std(a1)
array([ 1.58113883])
>>> a = numpy.array([1,2,3,4,5])
>>> numpy.std(a)
1.5811388300841898
>>> numpy.var(a)
1.5811388300841898
>>> numpy.__version__
'0.9.6'

Hanno
--
Hanno Klemm kl...@ph... |
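[Editor's note: the invariant this session shows being broken, that the standard deviation is the square root of the variance, can be illustrated with the standard library's statistics module alone. A sketch, not part of the original report; it uses the sample (N-1) divisor, which matches the 2.5 in the session above.]

```python
import math
import statistics

a = [1, 2, 3, 4, 5]
var = statistics.variance(a)  # sample variance (N-1 divisor)
std = statistics.stdev(a)     # sample standard deviation

print(var)  # 2.5
print(std)  # 1.5811388300841898

# std must always equal sqrt(var); numpy.var(a) returning 1.58...
# in the session above violates exactly this relation.
assert math.isclose(std, math.sqrt(var))
```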
From: Louis C. <lco...@po...> - 2006-08-01 08:12:08
|
> I listened to this and it looks like Sergio Ray is giving an intro class > on scientific computing with Python and has some concepts confused. We > should take this as a sign that we need to keep doing a good job of > educating people. I'm on UTC+02:00, so I only just saw there have been a few posts. Basically my issue was with numarray going to replace NumPy, and that the recording was only a few months old, sitting on the web where newcomers to Python will undoubtedly find it. I thought the proper thing to do would be to ask the 411 site to just append a footnote explaining that some of the info is out-dated. I just didn't want to do it without getting the group's opinion first. Regards, Louis. -- Louis Cordier <lco...@po...> cell: +27721472305 Point45 Entertainment (Pty) Ltd. http://www.point45.org |
From: Travis O. <oli...@ie...> - 2006-08-01 06:55:01
|
Angus McMorland wrote: > Hi people who know what's going on, > > I'm getting an install failure with the latest numpy from svn (revision > 2940). This is on an amd64 machine running python 2.4.4c0. > This was my fault. Revision 2931 contained a mistaken deletion of a line from arrayobject.h that should not have happened which affected only 64-bit builds. This problem is corrected in revision 2941. -Travis |
From: Nils W. <nw...@ia...> - 2006-08-01 06:24:44
|
Angus McMorland wrote: > Hi people who know what's going on, > > I'm getting an install failure with the latest numpy from svn (revision > 2940). This is on an amd64 machine running python 2.4.4c0. > > The build halts at: > > compile options: '-Ibuild/src.linux-x86_64-2.4/numpy/core/src > -Inumpy/core/include -Ibuild/src.linux-x86_64-2.4/numpy/core > -Inumpy/core/src -Inumpy/core/include -I/usr/include/python2.4 -c' > gcc: numpy/core/src/multiarraymodule.c > In file included from numpy/core/src/arrayobject.c:508, > from numpy/core/src/multiarraymodule.c:64: > numpy/core/src/arraytypes.inc.src: In function 'set_typeinfo': > numpy/core/src/arraytypes.inc.src:2139: error: 'PyIntpArrType_Type' > undeclared (first use in this function) > numpy/core/src/arraytypes.inc.src:2139: error: (Each undeclared > identifier is reported only once > numpy/core/src/arraytypes.inc.src:2139: error: for each function it > appears in.) > In file included from numpy/core/src/arrayobject.c:508, > from numpy/core/src/multiarraymodule.c:64: > numpy/core/src/arraytypes.inc.src: In function 'set_typeinfo': > numpy/core/src/arraytypes.inc.src:2139: error: 'PyIntpArrType_Type' > undeclared (first use in this function) > numpy/core/src/arraytypes.inc.src:2139: error: (Each undeclared > identifier is reported only once > numpy/core/src/arraytypes.inc.src:2139: error: for each function it > appears in.) > error: Command "gcc -pthread -fno-strict-aliasing -DNDEBUG -g -O2 -Wall > -Wstrict-prototypes -fPIC -Ibuild/src.linux-x86_64-2.4/numpy/core/src > -Inumpy/core/include -Ibuild/src.linux-x86_64-2.4/numpy/core > -Inumpy/core/src -Inumpy/core/include -I/usr/include/python2.4 -c > numpy/core/src/multiarraymodule.c -o > build/temp.linux-x86_64-2.4/numpy/core/src/multiarraymodule.o" failed > with exit status 1 > > Am I missing something or might this be a bug? > > Cheers, > > Angus. I can build numpy on a 32-bit machine but it fails on a 64-bit machine. 
Travis, please can you have a look at this issue. In file included from numpy/core/src/arrayobject.c:508, from numpy/core/src/multiarraymodule.c:64: numpy/core/src/arraytypes.inc.src: In function 'set_typeinfo': numpy/core/src/arraytypes.inc.src:2139: error: 'PyIntpArrType_Type' undeclared (first use in this function) numpy/core/src/arraytypes.inc.src:2139: error: (Each undeclared identifier is reported only once numpy/core/src/arraytypes.inc.src:2139: error: for each function it appears in.) In file included from numpy/core/src/arrayobject.c:508, from numpy/core/src/multiarraymodule.c:64: numpy/core/src/arraytypes.inc.src: In function 'set_typeinfo': numpy/core/src/arraytypes.inc.src:2139: error: 'PyIntpArrType_Type' undeclared (first use in this function) numpy/core/src/arraytypes.inc.src:2139: error: (Each undeclared identifier is reported only once numpy/core/src/arraytypes.inc.src:2139: error: for each function it appears in.) error: Command "gcc -pthread -fno-strict-aliasing -DNDEBUG -O2 -fmessage-length=0 -Wall -D_FORTIFY_SOURCE=2 -g -fPIC -Ibuild/src.linux-x86_64-2.4/numpy/core/src -Inumpy/core/include -Ibuild/src.linux-x86_64-2.4/numpy/core -Inumpy/core/src -Inumpy/core/include -I/usr/include/python2.4 -c numpy/core/src/multiarraymodule.c -o build/temp.linux-x86_64-2.4/numpy/core/src/multiarraymodule.o" failed with exit status 1 Nils |
From: Tom L. <lo...@as...> - 2006-08-01 03:15:15
|
Hi Travis et al.- > Release Notes are forthcoming. But, probably won't be available for > awhile. Thanks for the alert, and for the subsequent clarifications. > The "problem" is that backward compatibility is not accomplished simply > by using import numpy instead of import Numeric. You have to use import > numpy.oldnumeric to get it. My problem is not with backward incompatibility. I am quite satisfied that all the changes are sensible, and I've tried to keep pace with them. I do skim the various digests fairly regularly. When I see an issue discussed and resolved, it is clear to me that the decisions are being made by better heads than mine. I'm +1 on all the changes, and grateful to all the developers who are thinking this stuff through so carefully. My problem is with wanting to keep up with at least most of the changes. I want to use numpy, not numpy.oldnumeric. I have a 10k+ line codebase I'm maintaining (with another 10k+ of wrapped Fortran and C) that I want to release when numpy-1.0 settles down. But there's nowhere I can go to find out what the changes are. My digest skimming is not turning up all of them. I have recent copies of the Guide (I'm not sure if these are regularly emailed out or if I always have to ask for an update), and though I find some of the changes when I stumble upon them, even in the Guide there is no section of "recent changes" or something like that. To find the changes, you just have to remember the old way and catch the new way. So now we have a "beta" out, but who are the beta testers supposed to be? Just the developers, who know about all the changes already? I'm not a numpy novice, but I'm stumbling with this beta more than I'm used to with betas, not because it's buggy, but simply because I don't know what's changing from release to release. 
I was tempted to start a Wiki page about it, but this discussion already reveals I'm not the one to do it, as some of what I thought I figured out about the changes turns out to be in error. > I was assuming adopters were willing to move with us until NumPy 1.0 > where backward compatibility would be made an important issue. I have been trying to move along with the changes (I had to roll back the last micro version or two of 0.9 because mpl wasn't keeping pace across my two main platforms). But without something like release notes or a "what's changed" page, it's sometimes been hard to move with you, and it's esp. hard with the 1.0 jump. > > * In the C API, ContiguousFromObject is now ContiguousFromAny. > > I am surprised that my libraries compile with no errors; I > > only get a runtime error. Shouldn't I be warned about this > > at compile-time? > > > > I'm not sure what the problem you are getting is. Please give your > runtime error. This should work. ContiguousFromObject is still available. Okay, I'm at home now on my Mac and it all works fine here. The failures were on FC3 linux. I'll have to undo my patches to mtrand to get the bugs back. More when I'm back at that machine. -Tom ------------------------------------------------- This mail sent through IMP: http://horde.org/imp/ |
From: Angus M. <a.m...@au...> - 2006-08-01 01:59:56
|
Hi people who know what's going on, I'm getting an install failure with the latest numpy from svn (revision 2940). This is on an amd64 machine running python 2.4.4c0. The build halts at:

compile options: '-Ibuild/src.linux-x86_64-2.4/numpy/core/src -Inumpy/core/include -Ibuild/src.linux-x86_64-2.4/numpy/core -Inumpy/core/src -Inumpy/core/include -I/usr/include/python2.4 -c'
gcc: numpy/core/src/multiarraymodule.c
In file included from numpy/core/src/arrayobject.c:508,
                 from numpy/core/src/multiarraymodule.c:64:
numpy/core/src/arraytypes.inc.src: In function 'set_typeinfo':
numpy/core/src/arraytypes.inc.src:2139: error: 'PyIntpArrType_Type' undeclared (first use in this function)
numpy/core/src/arraytypes.inc.src:2139: error: (Each undeclared identifier is reported only once
numpy/core/src/arraytypes.inc.src:2139: error: for each function it appears in.)
In file included from numpy/core/src/arrayobject.c:508,
                 from numpy/core/src/multiarraymodule.c:64:
numpy/core/src/arraytypes.inc.src: In function 'set_typeinfo':
numpy/core/src/arraytypes.inc.src:2139: error: 'PyIntpArrType_Type' undeclared (first use in this function)
numpy/core/src/arraytypes.inc.src:2139: error: (Each undeclared identifier is reported only once
numpy/core/src/arraytypes.inc.src:2139: error: for each function it appears in.)
error: Command "gcc -pthread -fno-strict-aliasing -DNDEBUG -g -O2 -Wall -Wstrict-prototypes -fPIC -Ibuild/src.linux-x86_64-2.4/numpy/core/src -Inumpy/core/include -Ibuild/src.linux-x86_64-2.4/numpy/core -Inumpy/core/src -Inumpy/core/include -I/usr/include/python2.4 -c numpy/core/src/multiarraymodule.c -o build/temp.linux-x86_64-2.4/numpy/core/src/multiarraymodule.o" failed with exit status 1

Am I missing something or might this be a bug? Cheers, Angus. -- Angus McMorland email a.m...@au... 
mobile +64-21-155-4906 PhD Student, Neurophysiology / Multiphoton & Confocal Imaging Physiology, University of Auckland phone +64-9-3737-599 x89707 Armourer, Auckland University Fencing Secretary, Fencing North Inc. |
From: Robert K. <rob...@gm...> - 2006-07-31 22:57:08
|
David Grant wrote: > I find myself needing the set operations provided by python 2.4 such as > intersection, difference, or even just the advantages of the data > strucure itself, like that fact that I can try adding something to it > and if it's already there, it won't get added again. Will my decision to > use of the python 'set' datastructure come back to haunt me later by > being too slow? If you are adding stuff few items at a time to large sets, it is likely that set() may be better for you O()-wise. However, the only way to know which method will be faster would be to try it yourself with your data. > Is there anything equivalent in scipy or numpy that I > can use? I find myself going between numpy arrays and sets a lot because > I sometimes need to treat it like an array to use some of the array > functions. Robert Cimrman wrote a number of set operations (union, intersection, difference) for arrays in numpy.lib.arraysetops . There have been some recent discussions on improving them, especially in the face of inf, nan, and other such floating point beasties. > Sorry for cross-posting to scipy and numpy... is that a bad idea? Yes. Please reserve cross-posting for announcements and other such things that don't require follow-up discussions. Cross-posted discussions can get a bit hairy. For questions like these ("Is there something in numpy or scipy to do foo?"), there is enough cross-readership that it really doesn't matter if you only ask one list. -- Robert Kern "I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth." -- Umberto Eco |
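[Editor's note: a sketch of the numpy.lib.arraysetops routines Robert mentions, using the names exposed in numpy's top-level namespace; details of the 2006-era implementation may differ.]

```python
import numpy as np

a = np.array([1, 2, 3, 4])
b = np.array([3, 4, 5])

print(np.intersect1d(a, b))  # [3 4]
print(np.union1d(a, b))      # [1 2 3 4 5]
print(np.setdiff1d(a, b))    # [1 2]
```

All three return sorted arrays of unique values, which is also why the nan/inf discussions mentioned above matter: comparing and sorting ill-behaved floats is where the edge cases live.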
From: David G. <dav...@gm...> - 2006-07-31 22:38:09
|
I find myself needing the set operations provided by python 2.4, such as intersection, difference, or even just the advantages of the data structure itself, like the fact that I can try adding something to it and if it's already there, it won't get added again. Will my decision to use the python 'set' data structure come back to haunt me later by being too slow? Is there anything equivalent in scipy or numpy that I can use? I find myself going between numpy arrays and sets a lot because I sometimes need to treat it like an array to use some of the array functions. Sorry for cross-posting to scipy and numpy... is that a bad idea? -- David Grant http://www.davidgrant.ca |
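[Editor's note: for concreteness, the built-in set behaviour being relied on here looks like this; a sketch, not code from the original mail.]

```python
a = {1, 2, 3}
b = {2, 3, 4}

print(a & b)   # intersection: {2, 3}
print(a - b)   # difference: {1}

a.add(2)       # 2 is already a member, so the set is unchanged
print(len(a))  # still 3
```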
From: Gennan C. <gn...@co...> - 2006-07-31 22:02:18
|
Hi! Is there a new version of official numpybok? Mine dated at March 15, 2006. Gen-Nan Chen, PhD Chief Scientist Research and Development Group CorTechs Labs Inc (www.cortechs.net) 1020 Prospect St., #304, La Jolla, CA, 92037 Tel: 1-858-459-9700 ext 16 Fax: 1-858-459-9705 Email: gn...@co... |
From: Travis O. <oli...@ie...> - 2006-07-31 21:58:52
|
Daniel Poitras wrote: > Hello, > > I tried different arcsin functions on a complex number (z=0.52+0j) and obtained > the following results: > > cmath.asin(z) gives (0.54685095069594414+0j) #okay > > -1j*log(1j*z+sqrt(1-z**2)) gives (0.54685095069594414+0j) #okay, by definition > > numarray.arcsin(z) gives (0.54685095069594414+0j) #still okay > > but > > numpy.arcsin(z) gives (0.54685095069594414+0.54685095069594414j) #bug?? > > > Is it really a bug in numpy, or is there another explanation? > It's a bug. I seem to remember fixing this one before, but I guess it crept back in (or the bug I fixed was a similar one in another piece of code). Should be fixed in SVN now. -Travis |
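[Editor's note: the identity Daniel checks against, arcsin(z) = -i*log(i*z + sqrt(1 - z**2)), can be verified with the standard library's cmath module alone; a sketch restating the thread's own numbers.]

```python
import cmath

z = 0.52 + 0j
by_asin = cmath.asin(z)
by_log = -1j * cmath.log(1j * z + cmath.sqrt(1 - z ** 2))

# Both routes agree, and the result is purely real for real z in [-1, 1],
# so the nonzero imaginary part reported for numpy.arcsin was indeed a bug.
print(by_asin)
assert cmath.isclose(by_asin, by_log)
assert abs(by_asin.imag) < 1e-12
```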
From: Tim H. <tim...@ie...> - 2006-07-31 18:04:14
|
David L Goldsmith wrote: > All I can say is, if someone that confused about basic facts is being > cited as an authority and teaching a podcast class, I'm glad I have > someone on-site at my work who actually knows what they're talking about > and not relying on the Net for my numpy education. > The numpy == Numeric confusion is understandable. Numeric Python (AKA Numeric) was typically referred to as NumPy even though the name of the module was actually Numeric. So, in a sense numarray was a replacement for (the old) NumPy, and numpy is a replacement for both the old NumPy and numarray. At least I sure hope so! -tim > DG > > Travis Oliphant wrote: > >> Louis Cordier wrote: >> >> >>> Hmmm, I think people are spreading "fud" (lower case)... >>> http://media.libsyn.com/media/awaretek/Python411_060530_Numeric.mp3 >>> >>> >>> >>> >> I listened to this and it looks like Sergio Ray is giving an intro class >> on scientific computing with Python and has some concepts confused. We >> should take this as a sign that we need to keep doing a good job of >> educating people. >> >> Here are some things he has confused: >> >> NumPy == Numeric >> >> He tells his class that numarray is going to replace NumPy >> >> >> SciPy == Enthon >> >> He thinks that the Enthon distribution *is* SciPy. So, he tells his >> class that SciPy is "hard to use" on Unix because the Enthon >> distribution isn't available for Unix. >> >> >> I'm not sure when this recording was done but it was released on Python >> 411 at the end of May and he mentions this years SciPy conference with >> Guido as the key-note speaker so it was within the past few months. >> >> Other than that it has some good introductory material on what the Num* >> packages provide to Python. >> >> >> -Travis >> >> >> >> >> >> >> >> >> ------------------------------------------------------------------------- >> Take Surveys. Earn Cash. 
Influence the Future of IT >> Join SourceForge.net's Techsay panel and you'll get the chance to share your >> opinions on IT & business topics through brief surveys -- and earn cash >> http://www.techsay.com/default.php?page=join.php&p=sourceforge&CID=DEVDEV >> _______________________________________________ >> Numpy-discussion mailing list >> Num...@li... >> https://lists.sourceforge.net/lists/listinfo/numpy-discussion >> >> > > > |
From: Matthew B. <mat...@gm...> - 2006-07-31 17:55:42
|
Oh dear, I hope my friends remind me never to let anyone record what I say when I give a class! |
From: David L G. <Dav...@no...> - 2006-07-31 17:51:22
|
All I can say is, if someone that confused about basic facts is being cited as an authority and teaching a podcast class, I'm glad I have someone on-site at my work who actually knows what they're talking about and not relying on the Net for my numpy education. DG Travis Oliphant wrote: > Louis Cordier wrote: > >> Hmmm, I think people are spreading "fud" (lower case)... >> http://media.libsyn.com/media/awaretek/Python411_060530_Numeric.mp3 >> >> >> > I listened to this and it looks like Sergio Ray is giving an intro class > on scientific computing with Python and has some concepts confused. We > should take this as a sign that we need to keep doing a good job of > educating people. > > Here are some things he has confused: > > NumPy == Numeric > > He tells his class that numarray is going to replace NumPy > > > SciPy == Enthon > > He thinks that the Enthon distribution *is* SciPy. So, he tells his > class that SciPy is "hard to use" on Unix because the Enthon > distribution isn't available for Unix. > > > I'm not sure when this recording was done but it was released on Python > 411 at the end of May and he mentions this years SciPy conference with > Guido as the key-note speaker so it was within the past few months. > > Other than that it has some good introductory material on what the Num* > packages provide to Python. > > > -Travis > > > > > > > > > ------------------------------------------------------------------------- > Take Surveys. Earn Cash. Influence the Future of IT > Join SourceForge.net's Techsay panel and you'll get the chance to share your > opinions on IT & business topics through brief surveys -- and earn cash > http://www.techsay.com/default.php?page=join.php&p=sourceforge&CID=DEVDEV > _______________________________________________ > Numpy-discussion mailing list > Num...@li... > https://lists.sourceforge.net/lists/listinfo/numpy-discussion > -- HMRD/ORR/NOS/NOAA <http://response.restoration.noaa.gov/emergencyresponse/> |
From: Travis O. <oli...@ie...> - 2006-07-31 17:33:42
|
Louis Cordier wrote:
> Hmmm, I think people are spreading "fud" (lower case)...
> http://media.libsyn.com/media/awaretek/Python411_060530_Numeric.mp3

I listened to this and it looks like Sergio Ray is giving an intro class on scientific computing with Python and has some concepts confused. We should take this as a sign that we need to keep doing a good job of educating people.

Here are some things he has confused:

NumPy == Numeric
He tells his class that numarray is going to replace NumPy.

SciPy == Enthon
He thinks that the Enthon distribution *is* SciPy. So, he tells his class that SciPy is "hard to use" on Unix because the Enthon distribution isn't available for Unix.

I'm not sure when this recording was done, but it was released on Python 411 at the end of May and he mentions this year's SciPy conference with Guido as the keynote speaker, so it was within the past few months.

Other than that, it has some good introductory material on what the Num* packages provide to Python.

-Travis
From: Travis O. <oli...@ie...> - 2006-07-31 17:33:39
|
Bill Baxter wrote:
> When you have a chance, could the powers that be make some comment on
> the r_ and c_ situation?

r_ and c_ were in SciPy and have been there for several years. For NumPy, c_ has been deprecated (but not removed, because it is used in SciPy). The functionality of c_ is in r_, so it doesn't add anything. The purpose of r_ is as a "convenience function" so you can build arrays from the command line very quickly (which is easier in MATLAB). They were added while I was teaching a course in Signal Processing and was porting some MATLAB-written labs to SciPy. There is going to be overlap with long-name functions because of this.

I have not had time to review Bill's suggestions yet --- were they filed as a ticket? A ticket is the best way to keep track of issues at this point.

-Travis
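For readers following the thread, a minimal sketch of the convenience functions under discussion, using the `numpy` namespace (both `r_` and `c_` remained available in NumPy despite the deprecation talk above):

```python
import numpy as np

# r_ concatenates its arguments along the first axis, letting you
# build arrays quickly at the interactive prompt, MATLAB-style
a = np.r_[1, 2, 3, np.array([4, 5])]   # -> array([1, 2, 3, 4, 5])

# c_ stacks 1-D inputs as columns, producing a 2-D array
b = np.c_[[1, 2, 3], [4, 5, 6]]        # shape (3, 2)

print(a)
print(b)
```

Both are index-expression objects (hence the bracket syntax), not ordinary functions, which is what makes them terse enough for command-line use.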
From: Matthew B. <mat...@gm...> - 2006-07-31 17:30:27
|
On 7/31/06, Robert Kern <rob...@gm...> wrote:
> Louis Cordier wrote:
>> Hmmm, I think people are spreading "fud" (lower case)...
>> http://media.libsyn.com/media/awaretek/Python411_060530_Numeric.mp3
>
> Can you give us a synopsis? Or point us to when exactly in the clip we're
> supposed to listen?

Er, I didn't listen to the whole thing, but there are some references to numpy being difficult to install on platforms other than Windows (arguable, I guess). I got the impression the speaker was conflating the Enthon distribution with SciPy, and he says at one point that numarray is the successor to numpy. Not fud exactly, but a bit misleading.

Matthew
From: David L G. <Dav...@no...> - 2006-07-31 17:21:22
|
Louis Cordier wrote:
> Hmmm, I think people are spreading "fud" (lower case)...
> http://media.libsyn.com/media/awaretek/Python411_060530_Numeric.mp3

I googled "fud" and found two definitions: "fear, uncertainty and doubt" and "frustrate, undermine, and demoralize" -- which is intended?

DG

--
HMRD/ORR/NOS/NOAA <http://response.restoration.noaa.gov/emergencyresponse/>
From: Robert K. <rob...@gm...> - 2006-07-31 17:00:29
|
Louis Cordier wrote:
> Hmmm, I think people are spreading "fud" (lower case)...
> http://media.libsyn.com/media/awaretek/Python411_060530_Numeric.mp3

Can you give us a synopsis? Or point us to when exactly in the clip we're supposed to listen?

--
Robert Kern

"I have come to believe that the whole world is an enigma, a harmless
enigma that is made terrible by our own mad attempt to interpret it as
though it had an underlying truth."
  -- Umberto Eco