From: Jonathan T. <jon...@st...> - 2006-06-30 22:46:01

In some earlier code (at least one of) the following worked fine. I just
want to get a new type that is a byteswap of, say, float64, because I want
to memmap an array with a non-native byte order. Any suggestions?

Thanks,
Jonathan

------------------------------------------
Python 2.4.3 (#2, Apr 27 2006, 14:43:58)
[GCC 4.0.3 (Ubuntu 4.0.3-1ubuntu5)] on linux2
Type "help", "copyright", "credits" or "license" for more information.
>>> import numpy
>>> numpy.__version__
'0.9.9.2716'
>>> d=numpy.float64
>>> swapped=d.newbyteorder('big')
Traceback (most recent call last):
  File "<stdin>", line 1, in ?
TypeError: descriptor 'newbyteorder' requires a 'genericscalar' object but received a 'str'
>>> swapped=d.newbyteorder('>')
Traceback (most recent call last):
  File "<stdin>", line 1, in ?
TypeError: descriptor 'newbyteorder' requires a 'genericscalar' object but received a 'str'

--
Jonathan Taylor                    Tel: 650.723.9230
Dept. of Statistics                Fax: 650.725.8977
Sequoia Hall, 137                  www-stat.stanford.edu/~jtaylo
390 Serra Mall
Stanford, CA 94305
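The TypeError above arises because numpy.float64 is the scalar *type*
itself, not a dtype instance, so newbyteorder here is an unbound
descriptor that rejects the string argument. A minimal sketch of the
dtype-based route, assuming the dtype-level newbyteorder that current
numpy exposes (the memmap file name is hypothetical, for illustration
only):

import numpy

# Build a big-endian float64 dtype *instance*; the two spellings below
# are intended to be equivalent.
swapped = numpy.dtype(numpy.float64).newbyteorder('>')
swapped = numpy.dtype('>f8')

# 'data.bin' is a hypothetical file name.
a = numpy.memmap('data.bin', dtype=swapped, mode='r')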
From: Sasha <nd...@ma...> - 2006-06-30 22:42:02

On 6/30/06, Travis Oliphant <oli...@ee...> wrote:
> ... I still need to write the
> convert-script code that inserts dtype=int
> in routines that use old defaults: *plea* anybody want to write that??

I will try to do it at some time over the long weekend. I was bitten by
the fact that the current convert-script changes anything that resembles
an old typecode such as 'b' regardless of context. (I was unlucky to have
database columns called 'b'!) Fixing that is very similar to the problem
at hand.
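A hedged sketch of the kind of context-sensitive rewrite Sasha is
describing: match a quoted typecode only where it appears as the trailing
argument of a known array constructor, so unrelated strings such as
database column names are left alone. The constructor list, typecode map,
and sample lines are illustrative assumptions, not the actual
convert-script:

import re

# Map a few old Numeric typecodes to spelled-out numpy dtypes (assumed).
TYPECODES = {'b': 'numpy.uint8', 'l': 'numpy.int_', 'd': 'numpy.float64'}

# Match a quoted typecode only as the last argument of a known
# constructor, so a database column named 'b' is untouched.
PATTERN = re.compile(
    r"(\b(?:zeros|ones|empty|array)\s*\([^'\n]*?)'([bld])'(\s*\))")

def convert_line(line):
    # Rewrite e.g. zeros((3, 4), 'b') -> zeros((3, 4), numpy.uint8)
    return PATTERN.sub(
        lambda m: m.group(1) + TYPECODES[m.group(2)] + m.group(3), line)

print(convert_line("x = zeros((3, 4), 'b')"))   # typecode: rewritten
print(convert_line("row = fetch(table, 'b')"))  # column name: untouched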
From: Sasha <nd...@ma...> - 2006-06-30 22:31:48

On 6/30/06, Travis Oliphant <oli...@ee...> wrote:
> This is great.  How did you generate [the coverage statistic]?

It was really a hack. I've configured python using

$ ./configure --enable-debug CC="gcc -fprofile-arcs -ftest-coverage" \
      CXX="c++ -fprofile-arcs -ftest-coverage"

(I hate distutils!) Then I installed numpy and ran numpy.test(). Some
linalg-related tests failed; those should be fixed by figuring out how to
pass the -fprofile-arcs -ftest-coverage options to the fortran compiler.

The only non-obvious step in using gcov was that I had to tell it where
to find the object files:

$ gcov -o build/temp.linux-x86_64-2.4/numpy/core/src numpy/core/src/*.c

> ...
> What happens if you run the scipy test suite?

I don't know because I don't use scipy. Sorry.
From: Sasha <nd...@ma...> - 2006-06-30 22:21:25

"Software developers also use coverage testing in concert with
testsuites, to make sure software is actually good enough for a
release." -- Gcov Manual

I think if we can improve the test coverage, it will speak volumes about
the quality of numpy.

Does anyone know if it is possible to instrument the numpy libraries
without having to instrument python itself?

It would be nice to make the coverage reports easily available, either by
including a generating script with the source distribution or by
publishing the reports for the releases.

On 6/30/06, Sasha <nd...@ma...> wrote:
> It is not as bad as I thought, but there is certainly room for
> improvement.
>
> File `numpy/core/src/multiarraymodule.c'
> Lines executed:63.56% of 3290
>
> File `numpy/core/src/arrayobject.c'
> Lines executed:59.70% of 5280
>
> File `numpy/core/src/scalartypes.inc.src'
> Lines executed:31.67% of 963
>
> File `numpy/core/src/arraytypes.inc.src'
> Lines executed:47.35% of 868
>
> File `numpy/core/src/arraymethods.c'
> Lines executed:57.65% of 739
>
> On 6/30/06, Sasha <nd...@ma...> wrote:
> > As soon as I sent out my 10% estimate, I realized that someone would
> > challenge it with python-level coverage statistics. My main concern
> > is not what fraction of numpy functions is called by unit tests, but
> > what fraction of special cases in the C code is exercised.
> > [...]
> >
> > On 6/30/06, David M. Cooke <co...@ph...> wrote:
> > > A very quick application of the coverage module gives me 41%:
> > > [coverage table snipped -- see David M. Cooke's message below]
From: Travis O. <oli...@ee...> - 2006-06-30 22:20:53

Sasha wrote:
> It is not as bad as I thought, but there is certainly room for
> improvement.
>
> File `numpy/core/src/multiarraymodule.c'
> Lines executed:63.56% of 3290
>
> File `numpy/core/src/arrayobject.c'
> Lines executed:59.70% of 5280
>
> File `numpy/core/src/scalartypes.inc.src'
> Lines executed:31.67% of 963
>
> File `numpy/core/src/arraytypes.inc.src'
> Lines executed:47.35% of 868
>
> File `numpy/core/src/arraymethods.c'
> Lines executed:57.65% of 739

This is great.  How did you generate that? This is exactly the kind of
thing we need to be doing for the beta release cycle. I would like these
numbers to be very close to 100% by the time 1.0 final comes out at the
end of August / first of September. But we need help to write the unit
tests.

What happens if you run the scipy test suite?

-Travis
From: Sasha <nd...@ma...> - 2006-06-30 22:10:12

It is not as bad as I thought, but there is certainly room for
improvement.

File `numpy/core/src/multiarraymodule.c'
Lines executed:63.56% of 3290

File `numpy/core/src/arrayobject.c'
Lines executed:59.70% of 5280

File `numpy/core/src/scalartypes.inc.src'
Lines executed:31.67% of 963

File `numpy/core/src/arraytypes.inc.src'
Lines executed:47.35% of 868

File `numpy/core/src/arraymethods.c'
Lines executed:57.65% of 739

On 6/30/06, Sasha <nd...@ma...> wrote:
> As soon as I sent out my 10% estimate, I realized that someone would
> challenge it with python-level coverage statistics. My main concern
> is not what fraction of numpy functions is called by unit tests, but
> what fraction of special cases in the C code is exercised. I am not
> sure that David's statistics even answers the first question - I would
> guess it only counts statements in the pure python methods and
> ignores methods implemented in C.
>
> Can someone post C-level statistics from gcov
> <http://gcc.gnu.org/onlinedocs/gcc/Gcov.html> or a similar tool?
>
> On 6/30/06, David M. Cooke <co...@ph...> wrote:
> > On Fri, 30 Jun 2006 12:35:35 -0400
> > Sasha <nd...@ma...> wrote:
> >
> > > Is this a joke? Did anyone ever measure the coverage of the numpy
> > > unittests? I would be surprised if it was more than 10%.
> >
> > A very quick application of the coverage module, available at
> > http://www.garethrees.org/2001/12/04/python-coverage/
> > gives me 41%:
> > [coverage table snipped -- see David M. Cooke's message below]
> >
> > (I filtered out all the *.tests.* modules.) Note that you have to
> > import numpy after starting the coverage, because we use a lot of
> > module-level code that wouldn't be caught otherwise.
From: Travis N. V. <tr...@en...> - 2006-06-30 20:59:31

The *SciPy 2006 Conference* is scheduled for Thursday and Friday, August
17-18, 2006 at CalTech, with Sprints and Tutorials Monday-Wednesday,
August 14-16. Conference details are at http://www.scipy.org/SciPy2006

The deadlines for submitting abstracts and for early registration are
approaching...

Call for Presenters
-------------------
If you are interested in presenting at the conference, you may submit an
abstract in Plain Text, PDF or MS Word format to abstracts at scipy.org;
the deadline for abstract submission is July 7, 2006. Papers and/or
presentation slides are acceptable and are due by August 4, 2006.

Registration
------------
Early registration ($100.00) is still available through July 14. You may
register online at http://www.enthought.com/scipy06. Registration
includes breakfast and lunch Thursday & Friday and a very nice dinner
Thursday night. After July 14, 2006, registration will cost $150.00.

Tutorials and Sprints
---------------------
This year the Sprints (Monday and Tuesday, August 14-15) and Tutorials
(Wednesday, August 16) are at no additional charge (you're on your own
for food on those days, though). Remember to include these days in your
travel plans. The following topics are presented as Tutorials Wednesday
(more info here: http://www.scipy.org/SciPy2006/TutorialSessions):

- "3D visualization in Python using tvtk and MayaVi"
- "Scientific Data Analysis and Visualization using IPython and
  Matplotlib"
- "Building Scientific Applications using the Enthought Tool Suite
  (Envisage, Traits, Chaco, etc.)"
- "NumPy (migration from Numarray & Numeric, overview of NumPy)"

The Sprint topics are under discussion here:
http://www.scipy.org/SciPy2006/CodingSprints

See you in August!

Travis
From: Keith G. <kwg...@gm...> - 2006-06-30 20:56:14

On 6/30/06, David M. Cooke <co...@ph...> wrote:
> On Fri, 30 Jun 2006 13:37:01 -0700
> "Keith Goodman" <kwg...@gm...> wrote:
>
> > When an array is printed, the numbers line up in nice columns (if
> > you're using a fixed-width font):
> >
> > array([[0, 0],
> >        [0, 0]])
> >
> > But for matrices the columns do not line up:
> >
> > matrix([[0, 0],
> >         [0, 0]])
>
> Fixed in SVN.

Thank you! All of the recent improvements to matrices will eventually
bring many new numpy users.
From: Sasha <nd...@ma...> - 2006-06-30 20:49:56

As soon as I sent out my 10% estimate, I realized that someone would
challenge it with python-level coverage statistics. My main concern is
not what fraction of numpy functions is called by unit tests, but what
fraction of special cases in the C code is exercised. I am not sure that
David's statistics even answers the first question - I would guess it
only counts statements in the pure python methods and ignores methods
implemented in C.

Can someone post C-level statistics from gcov
<http://gcc.gnu.org/onlinedocs/gcc/Gcov.html> or a similar tool?

On 6/30/06, David M. Cooke <co...@ph...> wrote:
> On Fri, 30 Jun 2006 12:35:35 -0400
> Sasha <nd...@ma...> wrote:
>
> > On 6/30/06, Fernando Perez <fpe...@gm...> wrote:
> > > ...
> > > Besides, decent unit tests will catch these problems. We all know
> > > that every scientific code in existence is unit tested to the
> > > smallest routine, so this shouldn't be a problem for anyone.
> >
> > Is this a joke? Did anyone ever measure the coverage of the numpy
> > unittests? I would be surprised if it was more than 10%.
>
> A very quick application of the coverage module, available at
> http://www.garethrees.org/2001/12/04/python-coverage/
> gives me 41%:
> [coverage table snipped -- see David M. Cooke's message below]
>
> (I filtered out all the *.tests.* modules.) Note that you have to
> import numpy after starting the coverage, because we use a lot of
> module-level code that wouldn't be caught otherwise.
From: David M. C. <co...@ph...> - 2006-06-30 20:46:22

On Fri, 30 Jun 2006 13:37:01 -0700
"Keith Goodman" <kwg...@gm...> wrote:

> When an array is printed, the numbers line up in nice columns (if
> you're using a fixed-width font):
>
> array([[0, 0],
>        [0, 0]])
>
> But for matrices the columns do not line up:
>
> matrix([[0, 0],
>         [0, 0]])

Fixed in SVN.

--
|>|\/|<
/--------------------------------------------------------------------------\
|David M. Cooke                      http://arbutus.physics.mcmaster.ca/dmc/
|co...@ph...
From: Christopher B. <Chr...@no...> - 2006-06-30 20:40:47

Robert Kern wrote:
> It's arange(0.0, 1.0, 0.1) that I think causes the most problems with
> arange and floats.

Actually, much to my surprise:

>>> import numpy as N
>>> N.arange(0.0, 1.0, 0.1)
array([ 0. ,  0.1,  0.2,  0.3,  0.4,  0.5,  0.6,  0.7,  0.8,  0.9])

But I'm sure there are other examples that don't work out.

-Chris

--
Christopher Barker, Ph.D.
Oceanographer

NOAA/OR&R/HAZMAT            (206) 526-6959   voice
7600 Sand Point Way NE      (206) 526-6329   fax
Seattle, WA  98115          (206) 526-6317   main reception

Chr...@no...
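One concrete case of the kind Chris alludes to, sketched here under the
assumption of ordinary IEEE-754 double rounding (exact boundary cases can
vary by platform): arange picks its element count from
ceil((stop - start) / step), and rounding can push that ratio just past
an integer, so the nominally excluded endpoint appears in the result.

import numpy as N

# (0.8 - 0.5) / 0.1 evaluates to just over 3.0 in doubles, so the
# computed length is ceil(3.0000000000000004) == 4 and the endpoint
# 0.8 shows up even though arange is documented as half-open.
print(N.arange(0.5, 0.8, 0.1))   # [ 0.5  0.6  0.7  0.8] on most platforms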
From: David M. C. <co...@ph...> - 2006-06-30 20:38:48

On Fri, 30 Jun 2006 12:35:35 -0400
Sasha <nd...@ma...> wrote:

> On 6/30/06, Fernando Perez <fpe...@gm...> wrote:
> > ...
> > Besides, decent unit tests will catch these problems. We all know
> > that every scientific code in existence is unit tested to the
> > smallest routine, so this shouldn't be a problem for anyone.
>
> Is this a joke? Did anyone ever measure the coverage of the numpy
> unittests? I would be surprised if it was more than 10%.

A very quick application of the coverage module, available at
http://www.garethrees.org/2001/12/04/python-coverage/
gives me 41%:

Name                            Stmts   Exec  Cover
---------------------------------------------------
numpy                              25     20    80%
numpy._import_tools               235    175    74%
numpy.add_newdocs                   2      2   100%
numpy.core                         28     26    92%
numpy.core.__svn_version__          1      1   100%
numpy.core._internal               99     48    48%
numpy.core.arrayprint             251     92    36%
numpy.core.defchararray           221     58    26%
numpy.core.defmatrix              259    186    71%
numpy.core.fromnumeric            319    153    47%
numpy.core.info                     3      3   100%
numpy.core.ma                    1612   1145    71%
numpy.core.memmap                  64     14    21%
numpy.core.numeric                323    138    42%
numpy.core.numerictypes           236    204    86%
numpy.core.records                272     32    11%
numpy.dft                           6      4    66%
numpy.dft.fftpack                 128     31    24%
numpy.dft.helper                   35     32    91%
numpy.dft.info                      3      3   100%
numpy.distutils                    13      9    69%
numpy.distutils.__version__         4      4   100%
numpy.distutils.ccompiler         296     49    16%
numpy.distutils.exec_command      409     27     6%
numpy.distutils.info                2      2   100%
numpy.distutils.log                37     18    48%
numpy.distutils.misc_util         945    174    18%
numpy.distutils.unixccompiler      34     11    32%
numpy.dual                         41     27    65%
numpy.f2py.info                     2      2   100%
numpy.lib                          30     28    93%
numpy.lib.arraysetops             121     59    48%
numpy.lib.function_base           501     70    13%
numpy.lib.getlimits                76     61    80%
numpy.lib.index_tricks            223     56    25%
numpy.lib.info                      4      4   100%
numpy.lib.machar                  174    154    88%
numpy.lib.polynomial              357     52    14%
numpy.lib.scimath                  51     19    37%
numpy.lib.shape_base              220     24    10%
numpy.lib.twodim_base              77     51    66%
numpy.lib.type_check              110     75    68%
numpy.lib.ufunclike                37     24    64%
numpy.lib.utils                    42     23    54%
numpy.linalg                        5      3    60%
numpy.linalg.info                   2      2   100%
numpy.linalg.linalg               440     71    16%
numpy.random                       10      6    60%
numpy.random.info                   4      4   100%
numpy.testing                       3      3   100%
numpy.testing.info                  2      2   100%
numpy.testing.numpytest           430    214    49%
numpy.testing.utils               151     62    41%
numpy.version                       7      7   100%
---------------------------------------------------
TOTAL                            8982   3764    41%

(I filtered out all the *.tests.* modules.) Note that you have to import
numpy after starting the coverage, because we use a lot of module-level
code that wouldn't be caught otherwise.

--
|>|\/|<
/--------------------------------------------------------------------------\
|David M. Cooke                      http://arbutus.physics.mcmaster.ca/dmc/
|co...@ph...
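For anyone wanting to reproduce David's numbers, here is a minimal sketch
of the procedure he describes. The module-level start()/stop()/report()
calls are an assumption based on that coverage module's documented
interface (adjust to your copy); the key detail is his note that numpy
must be imported only after tracing has started.

import coverage

coverage.start()      # begin tracing *before* numpy is imported, so
import numpy          # module-level statements are counted too
numpy.test()
coverage.stop()

# Report on the non-test numpy modules imported during the run.
import sys
mods = [m for name, m in sys.modules.items()
        if name.startswith('numpy') and m is not None
        and '.tests' not in name]
coverage.report(mods)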
From: Keith G. <kwg...@gm...> - 2006-06-30 20:37:03

When an array is printed, the numbers line up in nice columns (if you're
using a fixed-width font):

array([[0, 0],
       [0, 0]])

But for matrices the columns do not line up:

matrix([[0, 0],
        [0, 0]])
From: Mark H. <ma...@mi...> - 2006-06-30 20:17:19

FYI, posted Sunday on comp.lang.python:

"...even if the hard-core numeric-python people are all evangelizing for
migration to numpy (for reasons that are of course quite defensible!), I
think it's quite OK to stick with good old Numeric for the moment (and
that's exactly what I do for my own personal use!)"

"...Numeric has pretty good documentation (numpy's is probably even
better, but it is not available for free, so I don't know!), and if you
don't find that documentation sufficient you might want to have a look at
my book "Python in a Nutshell", which devotes a chapter to Numeric..."

http://groups.google.com/group/comp.lang.python/tree/browse_frm/thread/e5479dac51b6e481/fc475de9fd1b9669?rnum=1&q=martelli&_done=%2Fgroup%2Fcomp.lang.python%2Fbrowse_frm%2Fthread%2Fe5479dac51b6e481%2Fe282e6e2c9d4fc77%3Fq%3Dmartelli%26rnum%3D6%26#doc_55e0c696cb4aea87

Mark
From: Travis O. <oli...@ee...> - 2006-06-30 20:11:23

Scott Ransom wrote:
> On Fri, Jun 30, 2006 at 01:25:23PM -0600, Travis Oliphant wrote:
>
>> Robert Kern wrote:
>>
>>> Whatever else you do, leave arange() alone. It should never have
>>> accepted floats in the first place.
>>
>> Actually, Robert makes a good point. arange with floats is
>> problematic. We should direct people to linspace instead of changing
>> the default of arange. Most new users will probably expect arange to
>> return a type similar to Python's range, which is int.
>
> ...
>
>> So, I think from both a pragmatic and an idealized standpoint, arange
>> should stay with the default of ints. People who want arange to
>> return floats should be directed to linspace.

I should have worded this as: "People who want arange to return floats
*as a default* should be directed to linspace."

So, basically, arange is not going to change. Because of this, shifting
over was a cinch. I still need to write the convert-script code that
inserts dtype=int in routines that use old defaults: *plea* anybody want
to write that??

-Travis
From: Bryce H. <bhe...@en...> - 2006-06-30 20:05:28

David M. Cooke wrote:
>>> [Really, distutils sucks. I think (besides refactoring) it needs its
>>> API documented better, or at least good conventions on where to hook
>>> into. setuptools and numpy.distutils do their best, but there's only
>>> so much you can do before everything goes fragile and breaks in
>>> unexpected ways.]
>>
>> I do hate distutils, having fought it for a long time. Its piss-poor
>> dependency checking is one of its /many/ annoyances. For a package
>> with as long a compile time as scipy, it really sucks not to be able
>> to just modify random source files and trust that it will really
>> recompile what's needed (no more, no less).
>>
>> Anyway, thanks for heeding this one. Hopefully one day somebody will
>> do the (painful) work of replacing distutils with something that
>> actually works (perhaps using scons for the build engine...) Until
>> then, we'll trod along with massively unnecessary rebuilds :)
>
> I've tried using SCons -- still don't like it. It's python, but it's
> too unpythonic for me. (The build engine itself is probably fine,
> though.)

Agreed. The last time I used it was almost a year ago, so it might have
changed, but SCons does a quasi-two-stage build that is very unnatural:
if you have python code nested between two build events, the python code
is executed immediately while the build events are queued. BUT, its
dependency management is great.

Distutils suffers from two major problems as far as I am concerned:
setup.py files often contain way too much business logic and verbiage
for casual python developers, and it has worst-in-class dependency
checking.

I've been considering moving all Enthought projects to SCons. If another
large project, such as scipy, were to go that way, it would make my
decision much simpler.

Bryce
From: Robert K. <rob...@gm...> - 2006-06-30 20:02:57

Christopher Barker wrote:
> Robert Kern wrote:
>> Whatever else you do, leave arange() alone. It should never have
>> accepted floats in the first place.
>
> Just to make sure we're clear: Because one should use linspace() for
> that?

More or less. Depending on the step and endpoint that you choose, it can
be nearly impossible for the programmer to predict how many elements are
going to be generated.

> If so, this would be the time to raise an error (or at least a
> deprecation warning) when arange() is called with floats.
>
> I have a LOT of code that does that! In fact, I posted a question here
> recently and got a lot of answers and suggested code, and not one
> person suggested that I shouldn't use arange() with floats.

I should have been more specific, but I did express disapproval in the
code sample I gave:

  x = arange(minx, maxx+step, step)   # oy.

Since your question wasn't about that specifically, I used the technique
that your original sample did.

> Did Numeric have linspace()? It doesn't look like it to me.

It doesn't. It was originally contributed to SciPy by Fernando, IIRC.
It's small, so it is easy to copy if you need to maintain support for
Numeric, still.

--
Robert Kern

"I have come to believe that the whole world is an enigma, a harmless
enigma that is made terrible by our own mad attempt to interpret it as
though it had an underlying truth."
 -- Umberto Eco
From: Robert K. <rob...@gm...> - 2006-06-30 19:54:53

Scott Ransom wrote:
> On Fri, Jun 30, 2006 at 01:25:23PM -0600, Travis Oliphant wrote:
>> Robert Kern wrote:
>>
>>> Whatever else you do, leave arange() alone. It should never have
>>> accepted floats in the first place.
>>
>> Actually, Robert makes a good point. arange with floats is
>> problematic. We should direct people to linspace instead of changing
>> the default of arange. Most new users will probably expect arange to
>> return a type similar to Python's range, which is int.
> ...
>> So, I think from both a pragmatic and an idealized standpoint, arange
>> should stay with the default of ints. People who want arange to
>> return floats should be directed to linspace.
>
> I agree that arange with floats is problematic. However, if you want,
> for example, arange(10.0) (as I often do), you have to do
> linspace(0.0, 9.0, 10), which is very un-pythonic and not at all what
> a new user would expect...
>
> I think of linspace as a convenience function, not as a replacement
> for arange with floats.

I don't mind arange(10.0) so much, now that it exists. I would mind, a
lot, about arange(10) returning a float64 array. Similarity to the
builtin range() is much more important in my mind than an arbitrary
"consistency" with ones() and zeros().

It's arange(0.0, 1.0, 0.1) that I think causes the most problems with
arange and floats.

--
Robert Kern

"I have come to believe that the whole world is an enigma, a harmless
enigma that is made terrible by our own mad attempt to interpret it as
though it had an underlying truth."
 -- Umberto Eco
From: Eric J. <jo...@MI...> - 2006-06-30 19:45:46

On Fri, 2006-06-30 at 12:35 -0400, Sasha wrote:
> > Besides, decent unit tests will catch these problems. We all know
> > that every scientific code in existence is unit tested to the
> > smallest routine, so this shouldn't be a problem for anyone.
>
> Is this a joke? Did anyone ever measure the coverage of the numpy
> unittests? I would be surprised if it was more than 10%.

Given that the coverage is so low, how can people help by contributing
unit tests? Are there obvious areas with poor coverage? Travis, do you
have any opinions on this?

...Eric
From: Scott R. <sr...@nr...> - 2006-06-30 19:45:07

On Fri, Jun 30, 2006 at 01:25:23PM -0600, Travis Oliphant wrote:
> Robert Kern wrote:
>
> > Whatever else you do, leave arange() alone. It should never have
> > accepted floats in the first place.
>
> Actually, Robert makes a good point. arange with floats is
> problematic. We should direct people to linspace instead of changing
> the default of arange. Most new users will probably expect arange to
> return a type similar to Python's range, which is int.
...
> So, I think from both a pragmatic and an idealized standpoint, arange
> should stay with the default of ints. People who want arange to return
> floats should be directed to linspace.

I agree that arange with floats is problematic. However, if you want,
for example, arange(10.0) (as I often do), you have to do
linspace(0.0, 9.0, 10), which is very un-pythonic and not at all what a
new user would expect...

I think of linspace as a convenience function, not as a replacement for
arange with floats.

Scott

--
Scott M. Ransom            Address:  NRAO
Phone:  (434) 296-0320               520 Edgemont Rd.
email:  sr...@nr...                  Charlottesville, VA 22903 USA
GPG Fingerprint: 06A9 9553 78BE 16DB 407B  FFCA 9BFA B6FF FFD3 2989
From: Travis O. <oli...@ee...> - 2006-06-30 19:25:39

Robert Kern wrote:
> Travis Oliphant wrote:
>
> > Comments?
>
> Whatever else you do, leave arange() alone. It should never have
> accepted floats in the first place.

Actually, Robert makes a good point. arange with floats is problematic.
We should direct people to linspace instead of changing the default of
arange. Most new users will probably expect arange to return a type
similar to Python's range, which is int.

Also: keeping arange as ints reduces the number of errors from the
change in the unit tests to

  2 in NumPy
  3 in SciPy

So, I think from both a pragmatic and an idealized standpoint, arange
should stay with the default of ints. People who want arange to return
floats should be directed to linspace.

-Travis
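To make the linspace recommendation concrete, a short sketch of the two
idioms being contrasted (element counts are exact; the printed float
formatting may differ by numpy version):

import numpy

print(numpy.arange(10))              # ints, like builtin range(): 0..9
print(numpy.linspace(0.0, 9.0, 10))  # ten floats, 0.0 through 9.0 inclusive

# A float-step arange such as arange(0.0, 1.0, 0.1) can instead be
# written with an explicit element count, avoiding endpoint surprises:
print(numpy.linspace(0.0, 1.0, 11)[:-1])   # 0.0, 0.1, ..., 0.9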
From: Christopher B. <Chr...@no...> - 2006-06-30 19:24:06

Robert Kern wrote:
> Whatever else you do, leave arange() alone. It should never have
> accepted floats in the first place.

Just to make sure we're clear: Because one should use linspace() for
that?

If so, this would be the time to raise an error (or at least a
deprecation warning) when arange() is called with floats.

I have a LOT of code that does that! In fact, I posted a question here
recently and got a lot of answers and suggested code, and not one person
suggested that I shouldn't use arange() with floats.

Did Numeric have linspace()? It doesn't look like it to me.

-Chris

--
Christopher Barker, Ph.D.
Oceanographer

NOAA/OR&R/HAZMAT            (206) 526-6959   voice
7600 Sand Point Way NE      (206) 526-6329   fax
Seattle, WA  98115          (206) 526-6317   main reception

Chr...@no...
From: David M. C. <co...@ph...> - 2006-06-30 19:19:31

On Wed, 28 Jun 2006 13:46:07 -0600
"Fernando Perez" <fpe...@gm...> wrote:

> On 6/28/06, David M. Cooke <co...@ph...> wrote:
>
> > [Really, distutils sucks. I think (besides refactoring) it needs its
> > API documented better, or at least good conventions on where to hook
> > into. setuptools and numpy.distutils do their best, but there's only
> > so much you can do before everything goes fragile and breaks in
> > unexpected ways.]
>
> I do hate distutils, having fought it for a long time. Its piss-poor
> dependency checking is one of its /many/ annoyances. For a package
> with as long a compile time as scipy, it really sucks not to be able
> to just modify random source files and trust that it will really
> recompile what's needed (no more, no less).
>
> Anyway, thanks for heeding this one. Hopefully one day somebody will
> do the (painful) work of replacing distutils with something that
> actually works (perhaps using scons for the build engine...) Until
> then, we'll trod along with massively unnecessary rebuilds :)

I've tried using SCons -- still don't like it. It's python, but it's too
unpythonic for me. (The build engine itself is probably fine, though.)

A complete replacement for distutils isn't needed: bits and pieces can be
replaced one at a time (it gets harder if you've got two packages like
setuptools and numpy.distutils trying to improve it, though). For
instance, the CCompiler class could be replaced in whole by a rewrite
that keeps what could be considered the public API. I've done this before
with a version of UnixCCompiler that let me specify a "toolchain": which
C compiler and C++ compiler work together, which linker to use for them,
and the associated flags.

I'm working (slowly) on a rewrite of command/build_ext.py in
numpy.distutils that should keep track of source dependencies better, for
instance.

--
|>|\/|<
/--------------------------------------------------------------------------\
|David M. Cooke                      http://arbutus.physics.mcmaster.ca/dmc/
|co...@ph...
From: Christopher B. <Chr...@no...> - 2006-06-30 19:17:22

Tim Hochberg wrote:
> The number one priority for numpy should be to unify the three
> disparate Python numeric packages.

I think the number one priority should be to make numpy the best it can
be. As someone said, two (or ten) years from now, there will be more new
users than users migrating from the older packages.

> Personally, given no other constraints, I would probably just get rid
> of the defaults altogether and make the user choose.

I like that too, and it would keep the incompatibility from causing
silent errors.

-Chris

--
Christopher Barker, Ph.D.
Oceanographer

NOAA/OR&R/HAZMAT            (206) 526-6959   voice
7600 Sand Point Way NE      (206) 526-6329   fax
Seattle, WA  98115          (206) 526-6317   main reception

Chr...@no...
From: Robert K. <rob...@gm...> - 2006-06-30 19:04:19

Travis Oliphant wrote:
> Comments?

Whatever else you do, leave arange() alone. It should never have accepted
floats in the first place.

--
Robert Kern

"I have come to believe that the whole world is an enigma, a harmless
enigma that is made terrible by our own mad attempt to interpret it as
though it had an underlying truth."
 -- Umberto Eco