From: Pablo B. K. <pb...@em...> - 2000-05-05 16:01:35
Morten Kjeldgaard wrote:
> Hi,
>
> What's up with the NumPy documentation? It seems not to be in the CVS
> tree. The old documentation is still pretty incomplete IMHO. For example,
> one of the most important routines "matrixmultiply" is not documented. And
> the description of "indices" makes your head explode... :-)
>
> Also lacking is a description (examples) of how to add Ufuncs in an
> extension module.
>
> Are there any plans of distributing the manual in an editable format (TeX,
> DocBook...) instead of PDF?? That would make it easier for people to
> contribute.

And I offered myself to contribute some months ago but nobody answered.
... I felt so tiny... Nobody heard me... Oh, cruel world! ;^)

Cheers!

--
Pablo Bleyer Kocik |
 pbleyer           | "Rintrah roars & shakes his fires in the burdend air;
 @embedded.cl      |  Hungry clouds swag on the deep" William Blake
From: Morten K. <mo...@im...> - 2000-05-05 15:16:41
Hi,

What's up with the NumPy documentation? It seems not to be in the CVS
tree. The old documentation is still pretty incomplete IMHO. For example,
one of the most important routines "matrixmultiply" is not documented. And
the description of "indices" makes your head explode... :-)

Also lacking is a description (examples) of how to add Ufuncs in an
extension module.

Are there any plans of distributing the manual in an editable format (TeX,
DocBook...) instead of PDF?? That would make it easier for people to
contribute.

Cheers,
/Morten

--
Morten Kjeldgaard <mo...@im...>             | Phone : +45 89 42 50 26
Institute of Molecular and Structural Biology    | Fax   : +45 86 12 31 78
Aarhus University                                | Home  : +45 86 18 81 80
Gustav Wieds Vej 10 C, DK-8000 Aarhus C, Denmark | icq   : 27224900
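Since "matrixmultiply" comes up here as the canonical undocumented routine, a minimal usage sketch may help. This is written against the Numeric module of this era; with modern NumPy the equivalent is numpy.matmul(a, b) or simply a @ b.

    # Minimal usage sketch for Numeric's matrixmultiply (the routine the
    # thread says is undocumented). Assumes the era's Numeric module.
    import Numeric

    a = Numeric.array([[1., 2.],
                       [3., 4.]])
    b = Numeric.array([[5., 6.],
                       [7., 8.]])

    c = Numeric.matrixmultiply(a, b)   # ordinary matrix product
    print(c)                           # [[ 19.  22.] [ 43.  50.]]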
From: Vanroose W. <van...@ru...> - 2000-05-04 09:28:41
Dear Travis Oliphant,

The pointer result->data is of type (char *). So I still have to do a cast:

    data = (Py_complex *)result->data;

Best Wishes

Wim Vanroose

Travis Oliphant wrote:
> Yes, you can create the array, point a pointer to "result->data" and fill
> it in. Internally, when the array is created an equivalent malloc is
> performed.
>
> I've changed your code below to eliminate the unnecessary copying.
>
> > static PyObject* matrix(PyObject *self, PyObject *args){
> >     double x,y;
> >     int size;
> >     int M,n,m;
> >     PyArrayObject *result;
> >     int dimensions[2];
> >     Py_complex *data;
> >     Py_complex p;
> >     if(!PyArg_ParseTuple(args,"idd",&M,&x,&y))
> >         return NULL;
> >     dimensions[0] = M;
> >     dimensions[1] = M;
> >
> >     result = (PyArrayObject *)PyArray_FromDims(2,dimensions,PyArray_CDOUBLE);
> >
> >     data = result->data;
> >     for(n=0;n < M;n++){
> >         for(m=0; m<M; m++){
> >             p.real=x;
> >             p.imag=y;
> >             data[n*M+m] = p;
> >         }
> >     }
> >
> >     return PyArray_Return(result);
> >     free(data);
> > }
>
> Welcome to the wonderful world of NumPy.
>
> -Travis
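Pulling the whole thread together, here is what the finished function might look like with Travis's advice (fill result->data directly, no calloc/memcpy), Wim's cast applied, and the stray free() removed. This is a sketch against the old Numeric C API used above (PyArray_FromDims, PyArray_CDOUBLE), not a verbatim version from the thread; modern NumPy replaces those calls with PyArray_SimpleNew and NPY_CDOUBLE.

    /* Sketch of the corrected extension function, combining Travis's
     * advice with Wim's cast. Uses the old Numeric C API from this
     * thread; error checking kept minimal. */
    #include "Python.h"
    #include "arrayobject.h"

    static PyObject *matrix(PyObject *self, PyObject *args)
    {
        double x, y;
        int M, n, m;
        int dimensions[2];
        PyArrayObject *result;
        Py_complex *data;

        if (!PyArg_ParseTuple(args, "idd", &M, &x, &y))
            return NULL;
        dimensions[0] = M;
        dimensions[1] = M;

        result = (PyArrayObject *)PyArray_FromDims(2, dimensions,
                                                   PyArray_CDOUBLE);
        if (result == NULL)
            return NULL;

        /* result->data is declared char *, hence the explicit cast. */
        data = (Py_complex *)result->data;
        for (n = 0; n < M; n++) {
            for (m = 0; m < M; m++) {
                data[n * M + m].real = x;
                data[n * M + m].imag = y;
            }
        }
        /* No free(): the array object owns this buffer now. */
        return PyArray_Return(result);
    }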
From: Travis O. <Oli...@ma...> - 2000-05-03 20:03:06
> Dear Numerical Python users,
>
> I am new to numerical python and extending python in C. I am making a
> module that involves the construction of complex matrices. But first, I
> really want to understand how these matrices are constructed.
>
> Here is an example function that constructs a matrix of size M and puts
> on each position a complex number "x+I y". x and y are arguments when
> you call the function in python.
>
> My question: Is this the right way of organising the construction of a
> complex matrix? Are there easier ways? Can I construct the matrix
> directly in "result->data"??

Yes, you can create the array, point a pointer to "result->data" and fill
it in. Internally, when the array is created an equivalent malloc is
performed.

I've changed your code below to eliminate the unnecessary copying.

> static PyObject* matrix(PyObject *self, PyObject *args){
>     double x,y;
>     int size;
>     int M,n,m;
>     PyArrayObject *result;
>     int dimensions[2];
>     Py_complex *data;
>     Py_complex p;
>     if(!PyArg_ParseTuple(args,"idd",&M,&x,&y))
>         return NULL;
>     dimensions[0] = M;
>     dimensions[1] = M;
>     result = (PyArrayObject *)PyArray_FromDims(2,dimensions,PyArray_CDOUBLE);
>     data = result->data;
>     for(n=0;n < M;n++){
>         for(m=0; m<M; m++){
>             p.real=x;
>             p.imag=y;
>             data[n*M+m] = p;
>         }
>     }
>     return PyArray_Return(result);return PyComplex_FromDoubles(data[1].real,data[1].imag);
>     free(data);
> }

Welcome to the wonderful world of NumPy.

-Travis
From: Vanroose W. <van...@ru...> - 2000-05-03 15:30:19
Dear Numerical Python users,

A small mistake when I cut and pasted the program: here is the program again.

static PyObject* matrix(PyObject *self, PyObject *args){
    double x,y;
    int size;
    int M,n,m;
    PyArrayObject *result;
    int dimensions[2];
    Py_complex *data;
    Py_complex p;
    if(!PyArg_ParseTuple(args,"idd",&M,&x,&y))
        return NULL;
    dimensions[0] = M;
    dimensions[1] = M;
    data = calloc(M*M+1,sizeof(Py_complex));
    for(n=0;n < M;n++){
        for(m=0; m<M; m++){
            p.real=x;
            p.imag=y;
            data[n*M+m] = p;
        }
    }
    result = (PyArrayObject*)PyArray_FromDims(2,dimensions,PyArray_CDOUBLE);
    memcpy(result->data,data,M*M*sizeof(Py_complex));
    return PyArray_Return(result);
    free(data);
}
From: Vanroose <van...@ru...> - 2000-05-03 15:25:57
Dear Numerical Python users,

I am new to numerical python and extending python in C. I am making a
module that involves the construction of complex matrices. But first, I
really want to understand how these matrices are constructed.

Here is an example function that constructs a matrix of size M and puts
on each position a complex number "x+I y". x and y are arguments when you
call the function in python.

My question: Is this the right way of organising the construction of a
complex matrix? Are there easier ways? Can I construct the matrix
directly in "result->data"??

Wim Vanroose

static PyObject* matrix(PyObject *self, PyObject *args){
    double x,y;
    int size;
    int M,n,m;
    PyArrayObject *result;
    int dimensions[2];
    Py_complex *data;
    Py_complex p;
    if(!PyArg_ParseTuple(args,"idd",&M,&x,&y))
        return NULL;
    dimensions[0] = M;
    dimensions[1] = M;
    data = calloc(M*M+1,sizeof(Py_complex));
    for(n=0;n < M;n++){
        for(m=0; m<M; m++){
            p.real=x;
            p.imag=y;
            data[n*M+m] = p;
        }
    }
    result = (PyArrayObject *)PyArray_FromDims(2,dimensions,PyArray_CDOUBLE);
    memcpy(result->data,data,M*M*sizeof(Py_complex));
    return PyArray_Return(result);return PyComplex_FromDoubles(data[1].real,data[1].imag);
    free(data);
}

Wim Vanroose
From: Tim C. <tc...@bi...> - 2000-04-29 23:24:43
Vanroose Wim wrote:
>
> Dear Madam, Sir,
>
> I recently started to use the GSLibrary from
> http://soureware.cygnus.com/gsl/ They have an interesting collection of
> special functions, and I started to wrap several of them into my python
> programs. Does anybody have experience with GSL??
>
> Wouldn't it be beautiful to produce a python module based
> on the GSL special functions. Did somebody already do it???
>
> Wim Vanroose

Wim,

Have a look at
http://gestalt-system.sourceforge.net/gestalt_manifesto_exp_097.html
and search for the string "GSL". As you will see, we originally proposed
to wrap at least the statistical functions of GSL as User-Defined
Functions and/or User-Defined Procedures for MySQL. (Note that the GNU
Goose library which we mention is no longer under separate, active
development, having been rolled back into the GNOME Guppy project, it
seems.)

We still hope to wrap GSL for use directly in MySQL, but this now has a
lower priority after experiencing how fast and memory-efficient NumPy is
for basic exploratory statistics when used in conjunction with Gary
Strangman's stats.py package.

Nevertheless, it would be useful to have the GSL library available in
Python - is it feasible to make it work with NumPy arrays as well as
other Python sequences? We are most interested in the statistical aspects
of the library, but all the functions are potentially useful. My C skills
are not up to the task, but perhaps someone else on the GS-discuss
mailing list might be able to assist?

Regards,
Tim Churches
From: Janko H. <jh...@if...> - 2000-04-29 16:00:21
A very complete set of special functions is already wrapped by Travis
Oliphant; look for the cephes module. But there are numerous other
functions in GSL which would be worthwhile to connect to NumPy.

One benefit of using such a general library covering different areas is
that with one form of interface a whole slew of functions can be wrapped,
which also makes the packaging a lot easier. Also, I think the library is
designed with wrapping to other languages in mind.

Just to mention another library with a similar scope, but which is older
and perhaps more mature: there is also SLATEC from netlib.

__Janko
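For readers hunting for that module, a speculative usage sketch of the cephes wrapper Janko mentions: it exposed Cephes special functions as ufuncs over Numeric arrays and was later absorbed into scipy.special. The exact function names used below (gamma, jv) follow Cephes naming but should be treated as assumptions.

    # Speculative sketch of the cephes module; names are assumptions.
    import Numeric
    import cephes

    x = Numeric.arange(1.0, 5.0)
    print(cephes.gamma(x))   # ufunc: applies elementwise over the array
    print(cephes.jv(0, x))   # Bessel function of the first kind, order 0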
From: Vanroose W. <van...@ru...> - 2000-04-29 13:09:13
Dear Madam, Sir,

I recently started to use the GSLibrary from
http://soureware.cygnus.com/gsl/ They have an interesting collection of
special functions, and I started to wrap several of them into my python
programs. Does anybody have experience with GSL??

Wouldn't it be beautiful to produce a python module based on the GSL
special functions. Did somebody already do it???

Wim Vanroose
From: Tim C. <tc...@bi...> - 2000-04-28 22:54:29
Andy Dustman wrote:
[...snip...]
>
> Okay, I think I know what you mean here. You are wanting to return each
> column as a (vertical) vector, whereas I am thinking along the lines of
> returning the result set as a matrix. Is that correct?

Yes, exactly.

> Since it appears
> you can efficiently slice out column vectors as a[:,n], is my idea
> acceptable? i.e.
>
> >>> a=Numeric.multiarray.zeros( (2,2),'d')
> >>> a[1,1]=2
> >>> a[0,1]=-1
> >>> a[1,0]=-3
> >>> a
> array([[ 0., -1.],
>        [-3.,  2.]])
> >>> a[:,0]
> array([ 0., -3.])
> >>> a[:,1]
> array([-1.,  2.])

The only problem is that NumPy arrays must be homogeneous wrt type, which
means that, say, a categorical column containing just a few distinct
values stored as an integer would have to be upcast to a double in the
NumPy matrix if it was part of a query which also returned a float.

Would it be possible to extend your idea of passing in an array to the
query? Perhaps the user could pass in a list of pre-existing, pre-sized
sequence objects (which might be rank-1 NumPy arrays of various
appropriate data types or Python tuples) which correspond to the columns
which are to be returned by the SQL query. It would be up to the user to
determine the correct type for each NumPy array and to size the array or
tuples correctly. The reason for needing tuples as well as NumPy arrays
is that, as you mention, NumPy arrays only support numbers.

The intention would be for all of this to be wrapped in a class which may
issue a number of small queries to determine the number of rows to be
returned and the data types of the columns, so the user is shielded from
having to work out these details. The only bit that has to be written in
C is the function which takes the sequence of sequences (NumPy arrays or
Python tuples) in which to store the query results, column-wise, and
stuffs the value for each column for each row of the result set into the
appropriate passed-in sequence object.

I would be more than happy to assist with the Python code, testing and
documentation, but my C skills aren't up to helping with the guts of it.
In other words, making this part of the low-level _mysql interface would
be sufficient.

Cheers,
Tim C

> --
> andy dustman | programmer/analyst | comstar.net, inc.
> telephone: 770.485.6025 / 706.549.7689 | icq: 32922760 | pgp: 0xc72f3f1d
> "Therefore, sweet knights, if you may doubt your strength or courage,
> come no further, for death awaits you all, with nasty, big, pointy teeth!"
From: Andy D. <adu...@co...> - 2000-04-24 20:27:59
On Fri, 14 Apr 2000, Tim Churches wrote:

> Yes, but the problem with mysql_store_result() is the large amount of
> memory required to store the result set. Couldn't the user be
> responsible for predetermining the size of the array via a query such as
> "select count(*) from sometable where...." and then pass this value as a
> parameter to the executeNumPy() method? In MySQL at least such count(*)
> queries are resolved very quickly, so such an approach wouldn't take
> twice the time. Then mysql_use_result() could be used to populate the
> initialised NumPy array with data row by row, so there is only ever one
> complete copy of the data in memory, and that copy is in the NumPy
> array.

After some more thought on this subject, and some poking around at NumPy,
I came to the following conclusions: Since NumPy arrays are fixed-size,
but otherwise sequences (in the multi-dimensional case, sequences of
sequences), the best approach would be for the user to pass in a
pre-sized array (i.e. from zeros(), and btw, the docstring for zeros is
way wrong), and _mysql would simply access it through the Sequence object
protocol, and update as many values as it could: If you passed a 100-row
array, it would fill 100 rows or as many as were in the result set,
whichever is less. Since this requires no special knowledge of NumPy, it
could be a standard addition (no conditional compilation required). This
method (tentatively _mysql.fetch_rows_into_array(array)) would return the
array argument as the result. IndexError would likely be raised if the
array was too narrow (too many columns in result set).

Probably this would not be a MySQLdb.Cursor method, but perhaps I can
have a separate module with a cursor subclass which returns NumPy arrays.

> > Question: Would it be adequate to put all columns returned into the array?
> > If label columns need to be returned, this could pose a problem. They may
> > have to be returned as a separate query. Or else non-numeric columns would
> > be excluded and returned in a list of tuples (this would be harder).
>
> Yes, more thought needed here - my initial thought was one NumPy array
> per column, particularly since NumPy arrays must be homogeneous wrt data
> type. Each NumPy array could be named the same as the column from which
> it is derived.

Okay, I think I know what you mean here. You are wanting to return each
column as a (vertical) vector, whereas I am thinking along the lines of
returning the result set as a matrix. Is that correct? Since it appears
you can efficiently slice out column vectors as a[:,n], is my idea
acceptable? i.e.

>>> a=Numeric.multiarray.zeros( (2,2),'d')
>>> a[1,1]=2
>>> a[0,1]=-1
>>> a[1,0]=-3
>>> a
array([[ 0., -1.],
       [-3.,  2.]])
>>> a[:,0]
array([ 0., -3.])
>>> a[:,1]
array([-1.,  2.])

--
andy dustman | programmer/analyst | comstar.net, inc.
telephone: 770.485.6025 / 706.549.7689 | icq: 32922760 | pgp: 0xc72f3f1d
"Therefore, sweet knights, if you may doubt your strength or courage,
come no further, for death awaits you all, with nasty, big, pointy teeth!"
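The design the two are converging on is easier to see in code. Below is a sketch of the workflow: a count(*) query to size the array (Tim's idea), a pre-allocated matrix, and column slicing afterwards. The plain-Python fill loop stands in for Andy's tentative _mysql.fetch_rows_into_array(), which did not exist at this point; the table and column names are invented for illustration.

    # Sketch of the proposed MySQL-to-NumPy workflow from this thread.
    # Table/column names are invented; the fill loop stands in for the
    # hypothetical C helper _mysql.fetch_rows_into_array().
    import Numeric   # the era's Numeric module; modern code would use numpy
    import MySQLdb

    conn = MySQLdb.connect(db="test")
    cur = conn.cursor()

    # 1. Cheap count(*) query to predetermine the result size.
    cur.execute("SELECT COUNT(*) FROM measurements")
    nrows = int(cur.fetchone()[0])

    # 2. Pre-size a homogeneous matrix of doubles.
    a = Numeric.zeros((nrows, 2), 'd')

    # 3. Fill it row by row, so only one full copy of the data exists.
    cur.execute("SELECT height, weight FROM measurements")
    for i in range(nrows):
        a[i, 0], a[i, 1] = cur.fetchone()

    # 4. Slice out column vectors as in Andy's session above.
    heights, weights = a[:, 0], a[:, 1]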
From: Konrad H. <hi...@cn...> - 2000-04-21 12:56:25
I have just put ScientificPython 2.0.1 and 2.1.0 on my FTP server,

   ftp://dirac.cnrs-orleans.fr/pub/ScientificPython/

while Starship is recovering. Version 2.0.1 is mostly a bugfix release,
with only minor additions. 2.1.0 is identical to 2.0.1 except for the
addition of an MPI interface module. I have tested this on only one
platform (Linux/Intel running MPICH), so I would be very interested in
feedback from people running different MPI platforms.

MPI support in ScientificPython is still very basic; there are probably
more complete MPI interfaces out there. The strong point of
ScientificPython's interface is the integration into Python: communicators
are Python objects, all communication happens via methods defined on
communicator objects, support is provided for sending and receiving both
string and NumPy array objects. Moreover, Python scripts can rather easily
be written in such a way that they work both with and without MPI support,
of course using only a single processor when no MPI is available.

Finally, there is a full C API as well, which means that other C modules
can make use of MPI without having to link to the MPI library, which is
particularly useful for dynamic library modules. It also facilitates
packaging of MPI-based code, which doesn't need to know anything at all
about the MPI library.

Happy Easter,
Konrad.
--
-------------------------------------------------------------------------------
Konrad Hinsen                            | E-Mail: hi...@cn...
Centre de Biophysique Moleculaire (CNRS) | Tel.: +33-2.38.25.55.69
Rue Charles Sadron                       | Fax: +33-2.38.63.15.17
45071 Orleans Cedex 2                    | Deutsch/Esperanto/English/
France                                   | Nederlands/Francais
-------------------------------------------------------------------------------
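The announcement's "works both with and without MPI" point is worth a sketch. The module path and attribute names below (Scientific.MPI, world, rank, size) are assumptions inferred from the description, not verified API, and the fallback branch is what "using only a single processor" would look like in practice.

    # Sketch of the "runs with or without MPI" pattern Konrad describes.
    # Scientific.MPI attribute names are assumptions, not verified API.
    import Numeric

    try:
        from Scientific import MPI
        rank = MPI.world.rank
        size = MPI.world.size
    except ImportError:
        rank, size = 0, 1   # no MPI: fall back to a single processor

    data = Numeric.arange(10000.)
    chunk = data[rank::size]                   # each process takes a stride
    partial = Numeric.add.reduce(chunk * chunk)
    print("process %d of %d: partial sum = %f" % (rank, size, partial))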
From: <hi...@di...> - 2000-04-19 13:53:06
> Meanwhile I wrote QR in python. I'll change to that interface to get
> some speed improvement. Just to satisfy my curiosity, is it a design
> decision to keep LinearAlgebra small, or just waiting for someone to
> contribute more bindings?

Not speaking for the NumPy maintainers, but I am sure it's the latter. I
don't see what harm could be done by having more operations in
LinearAlgebra. On the other hand, if I wanted to support everything in
LAPACK, I would use several modules or, better yet, a package.

Konrad.
--
-------------------------------------------------------------------------------
Konrad Hinsen                            | E-Mail: hi...@cn...
Centre de Biophysique Moleculaire (CNRS) | Tel.: +33-2.38.25.55.69
Rue Charles Sadron                       | Fax: +33-2.38.63.15.17
45071 Orleans Cedex 2                    | Deutsch/Esperanto/English/
France                                   | Nederlands/Francais
-------------------------------------------------------------------------------
From: Joe V. A. <van...@at...> - 2000-04-18 16:07:48
sourceforge.net shows the latest release of Numeric Python as v 15.2,
dated 1/19/2000.

Could I encourage the Numeric Python developers to release a more recent
version of Numeric Python? I know the latest is always available from
CVS, but I'm sure that there are people who aren't ready to deal with
CVS, just to get a current version of Numeric.

--
Joe VanAndel
National Center for Atmospheric Research
http://www.atd.ucar.edu/~vanandel/
Internet: van...@uc...
From: Pedro V. L. <eq...@eq...> - 2000-04-18 15:43:17
Konrad Hinsen wrote:
> > Can someone give me a hand? I'm porting some code and I need to do
> > QR decomposition. I couldn't find such a function in Numpy.
> > As I remember Lapack has one, isn't it part of the python interface?
>
> There is a lot more in LAPACK than is covered by the high-level Python
> interface (Module LinearAlgebra). There is, however, a complete low-level
> interface to all of LAPACK and BLAS, written eons ago by Doug Heisterkamp.
> You can pick up an updated copy at
> ftp://dirac.cnrs-orleans.fr/pub/PyLapack.tar.gz
>
> Konrad.

Thanks,

Meanwhile I wrote QR in python. I'll change to that interface to get some
speed improvement. Just to satisfy my curiosity, is it a design decision
to keep LinearAlgebra small, or just waiting for someone to contribute
more bindings?

pedro

--
Pedro Vale Lima
University of Coimbra
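Pedro's pure-Python QR is not shown in the thread, but a minimal version via classical Gram-Schmidt looks something like the sketch below. It is written against the era's Numeric module and assumes a full-column-rank input; modern NumPy users would simply call numpy.linalg.qr(a).

    # Minimal pure-Python QR via classical Gram-Schmidt, in the spirit
    # of what Pedro describes (his actual code is not shown here).
    import math
    import Numeric

    def qr(a):
        """Factor an m x n matrix (m >= n) as q * r, with orthonormal
        columns in q and upper-triangular r."""
        a = Numeric.array(a, 'd')
        m, n = a.shape
        q = Numeric.zeros((m, n), 'd')
        r = Numeric.zeros((n, n), 'd')
        for j in range(n):
            v = Numeric.array(a[:, j])        # copy of the j-th column
            for i in range(j):
                r[i, j] = Numeric.dot(q[:, i], a[:, j])
                v = v - r[i, j] * q[:, i]     # remove earlier directions
            r[j, j] = math.sqrt(Numeric.dot(v, v))
            q[:, j] = v / r[j, j]
        return q, r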
From: Konrad H. <hi...@cn...> - 2000-04-18 15:26:50
> Can someone give me a hand? I'm porting some code and I need to do
> QR decomposition. I couldn't find such a function in Numpy.
> As I remember Lapack has one, isn't it part of the python interface?

There is a lot more in LAPACK than is covered by the high-level Python
interface (Module LinearAlgebra). There is, however, a complete low-level
interface to all of LAPACK and BLAS, written eons ago by Doug
Heisterkamp. You can pick up an updated copy at

   ftp://dirac.cnrs-orleans.fr/pub/PyLapack.tar.gz

Konrad.
--
-------------------------------------------------------------------------------
Konrad Hinsen                            | E-Mail: hi...@cn...
Centre de Biophysique Moleculaire (CNRS) | Tel.: +33-2.38.25.55.69
Rue Charles Sadron                       | Fax: +33-2.38.63.15.17
45071 Orleans Cedex 2                    | Deutsch/Esperanto/English/
France                                   | Nederlands/Francais
-------------------------------------------------------------------------------
From: Pedro V. L. <eq...@eq...> - 2000-04-18 13:35:49
Hello,

Can someone give me a hand? I'm porting some code and I need to do QR
decomposition. I couldn't find such a function in Numpy. As I remember
Lapack has one, isn't it part of the python interface?

thanks

pedro vale lima

--
University of Coimbra, Portugal
eq...@eq...
From: Scott M. R. <ra...@cf...> - 2000-04-14 17:00:51
Konrad Hinsen wrote:
>
> NumPy uses only one processor, and I am not even sure I'd want to
> change that. I use biprocessor machines as well and I have adapted my
> time-critical code to them (parallelization via threading), but
> the parallelization is almost always at a higher level than
> NumPy operations. In other words, I give one NumPy operation to each
> processor rather than have both work on the same NumPy operation.

I do the same thing. And I agree about not wanting it the other way
(although an option for this might be nice...)

> I'd prefer to build a parallelizing general-purpose library on top
> of NumPy, ideally supporting message passing as well. Would anyone
> else be interested in this? I have a nicely packaged MPI support
> module for Python (to be released next week in a new version of
> ScientificPython), so that part is already done.

I am certainly interested. In fact, I have also written an MPI support
module. Maybe when I see yours I will be able to add some stuff... I'm
making the assumption that yours is probably more flexible than mine...

> Which reminds me: there once was a parallelization project mailing
> list on the old Starship, which disappeared during the move due to a
> minor accident. Is there interest to revive it? I now have a
> cluster of 20 biprocessors to feed, and I'd like to provide it with
> only the best: Python code ;-)

Once again, I'm in...

Scott

--
Scott M. Ransom
Address: Harvard-Smithsonian CfA, 60 Garden St. MS 10, Cambridge, MA 02138
Phone: (617) 495-4142
email: ra...@cf...
PGP Fingerprint: D2 0E D0 10 CD 95 06 DA EF 78 FE 2B CB 3A D3 53
From: Konrad H. <hi...@cn...> - 2000-04-14 16:52:13
> I just put a second processor in my computer and it seems NumPy doesn't
> use it.
>
> Is NumPy able to use 2 processors? From which version does it work?

NumPy uses only one processor, and I am not even sure I'd want to change
that. I use biprocessor machines as well and I have adapted my
time-critical code to them (parallelization via threading), but the
parallelization is almost always at a higher level than NumPy operations.
In other words, I give one NumPy operation to each processor rather than
have both work on the same NumPy operation.

I'd prefer to build a parallelizing general-purpose library on top of
NumPy, ideally supporting message passing as well. Would anyone else be
interested in this? I have a nicely packaged MPI support module for
Python (to be released next week in a new version of ScientificPython),
so that part is already done.

Which reminds me: there once was a parallelization project mailing list
on the old Starship, which disappeared during the move due to a minor
accident. Is there interest to revive it? I now have a cluster of 20
biprocessors to feed, and I'd like to provide it with only the best:
Python code ;-)

Konrad.
--
-------------------------------------------------------------------------------
Konrad Hinsen                            | E-Mail: hi...@cn...
Centre de Biophysique Moleculaire (CNRS) | Tel.: +33-2.38.25.55.69
Rue Charles Sadron                       | Fax: +33-2.38.63.15.17
45071 Orleans Cedex 2                    | Deutsch/Esperanto/English/
France                                   | Nederlands/Francais
-------------------------------------------------------------------------------
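To make the "one NumPy operation per processor" idea concrete, a minimal threading sketch follows. It is illustrative only: in this era Numeric operations generally hold the global interpreter lock, so threads only truly overlap when the extension releases it, as Konrad's adapted code presumably does.

    # Sketch of the higher-level parallelism Konrad describes: one whole
    # array operation per thread, rather than splitting a single
    # operation across processors. Illustrative only (see GIL caveat).
    import threading
    import Numeric

    def worker(a, b, results, slot):
        # One big array operation per thread/processor.
        results[slot] = Numeric.matrixmultiply(a, b)

    a1 = Numeric.ones((200, 200), 'd')
    a2 = 2.0 * Numeric.ones((200, 200), 'd')
    results = [None, None]

    threads = [threading.Thread(target=worker, args=(a1, a1, results, 0)),
               threading.Thread(target=worker, args=(a2, a2, results, 1))]
    for t in threads:
        t.start()
    for t in threads:
        t.join()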
From: Tim C. <tc...@bi...> - 2000-04-14 09:49:39
Andy Dustman wrote:
>
> On Sun, 9 Apr 2000, Tim Churches wrote:
>
> > I've been experimenting with pulling quantitative data out of a MySQL
> > table into NumPy arrays via Andy Dustman's excellent MySQLdb module and
> > then calculating various statistics from the data using Gary Strangman's
> > excellent stats.py functions, which when operating on NumPy arrays are
> > lightning-fast.
> [...snip...]
>
> It might be possible to do something like this. I would prefer that such a
> feature work as a separate module (i.e. I don't think it is generally
> applicable to MySQLdb/_mysql). Or perhaps it could be a compile-time
> option for _mysql (-DUSE_NUMPY).

The latter sounds good. I agree that most users of MySQLdb would not need
it, so they shouldn't be burdened with it.

> The object that you want to mess with is the _mysql result object. It
> contains an attribute MYSQL_RES *result, which is a pointer to the actual
> MySQL structure. I don't remember if NumPy arrays are extensible or not,
> i.e. can rows be appended?

No they can't. I suspect that is the price to be paid for the efficient
storage offered by NumPy arrays.

> That would affect the design. If they are not
> extensible, then you are probably limited to using mysql_store_result()
> (result set stored on the client side), as opposed to mysql_use_result()
> (result set stored on the server side). mysql_store_result is probably
> preferable in this case anyway, so extensibility doesn't matter, as we can
> find the size of the result set in advance with mysql_num_rows(). Then we
> know the full size of the array.

Yes, but the problem with mysql_store_result() is the large amount of
memory required to store the result set. Couldn't the user be responsible
for predetermining the size of the array via a query such as "select
count(*) from sometable where...." and then pass this value as a
parameter to the executeNumPy() method? In MySQL at least such count(*)
queries are resolved very quickly, so such an approach wouldn't take
twice the time. Then mysql_use_result() could be used to populate the
initialised NumPy array with data row by row, so there is only ever one
complete copy of the data in memory, and that copy is in the NumPy array.

> However, with very large result sets, it may be necessary to use
> mysql_use_result(), in which case the array will need to be extended,
> possibly row-by-row.
>
> I could do this, but I need to know how to create and assign values to a
> NumPy array from within C. Or perhaps an initial (empty) array with the
> correct number of columns can be passed. I am pretty sure NumPy arrays
> look like sequences (of sequences), so assignment should not be a big
> problem. Easiest solution (for me, and puts least bloat in _mysql) would
> be for the user to pass in a NumPy array.

I'll look at the NumPy docs re this. Can any of the NumPy developers give
some clues re this?

> Question: Would it be adequate to put all columns returned into the array?
> If label columns need to be returned, this could pose a problem. They may
> have to be returned as a separate query. Or else non-numeric columns would
> be excluded and returned in a list of tuples (this would be harder).

Yes, more thought needed here - my initial thought was one NumPy array
per column, particularly since NumPy arrays must be homogeneous wrt data
type. Each NumPy array could be named the same as the column from which
it is derived.

Cheers,
Tim C
From: Jean-Bernard A. <jb...@ph...> - 2000-04-13 18:07:54
Hey Numpy people!

I just put a second processor in my computer and it seems NumPy doesn't
use it.

Is NumPy able to use 2 processors? From which version does it work?

Jean-Bernard
From: Robert K. <ke...@it...> - 2000-04-12 22:27:15
On Wed, 12 Apr 2000, Les Schaffer wrote:

> i need to compile a simple Python extension, which uses NumPy/C API,
> on WinXX.
>
> does anyone have (bad/good) experience compiling NumPy extension with
> cygwin or mingw32 compilers? i am trying to decide whether i need to
> purchase VC++ or not.

http://starship.python.net/crew/kernr/mingw32/Notes.html

Don't bother buying anything.

> many thanks
>
> les schaffer

--
Robert Kern
ke...@ca...

"In the fields of hell where the grass grows high
 Are the graves of dreams allowed to die."
  -- Richard Harter
From: Les S. <god...@ne...> - 2000-04-12 20:01:49
i need to compile a simple Python extension, which uses NumPy/C API, on
WinXX.

does anyone have (bad/good) experience compiling NumPy extension with
cygwin or mingw32 compilers? i am trying to decide whether i need to
purchase VC++ or not.

many thanks

les schaffer
From: Robert K. <ke...@it...> - 2000-04-11 23:25:33
Hello,

I'm working on a Multipack module to use ODRPACK for non-linear
regression problems. ODRPACK is available from NETLIB if you want
information:

   http://www.netlib.org/odrpack/index.html

I'm in the debugging phase right now, and I want to be able to test my
interfaces to most if not all of ODRPACK's capabilities. Consequently, I
need datasets with some of the following properties:

* multiple inputs (or a vector of inputs)
* multiple responses (or a vector of responses)
* errors/weights on either the responses, inputs, or both
* covariance matrices for the errors on responses/inputs/both (in the
  case of multiple inputs/responses)
* any differentiable functional form that I can make a Python function
  compute using Numpy and SpecialFuncs ufuncs (and maybe a few others)
* problems where it is sensible to fix particular parameters and estimate
  the others
* problems where it is sensible to fix particular datapoints (e.g.
  boundary conditions)
* problems where some datapoints should be removed

I would be much obliged if any of you could send me datasets that have
some of these characteristics. I would prefer them to be in something
parsable by Python, either a simple plaintext format or even NetCDF to
goad me into learning how to use Konrad's NetCDF interface. A description
of the function to fit to is necessary, and a brief description of the
problem and perhaps even the expected answers would be nice.

*** Please e-mail these to me and not the list. ***

If you would like more information or clarification, please e-mail me.
Many thanks for your time and possible contribution.

--
Robert Kern
ke...@ca...

"In the fields of hell where the grass grows high
 Are the graves of dreams allowed to die."
  -- Richard Harter
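For anyone preparing such a dataset, a sketch of the kind of plaintext layout that would be easy to parse from Python. The file name, column layout, and example model are invented for illustration; this is not a format ODRPACK or Multipack prescribes.

    # Sketch: parse a whitespace-separated dataset of the kind Robert
    # asks for, with columns x, y, weight and '#' comments. All names
    # and the example model are invented for illustration.
    import Numeric

    rows = []
    for line in open("example_fit_data.txt"):
        line = line.split("#")[0].strip()   # drop comments and whitespace
        if line:
            rows.append([float(v) for v in line.split()])

    data = Numeric.array(rows)              # shape (npoints, 3)
    x, y, w = data[:, 0], data[:, 1], data[:, 2]

    # The accompanying description would state the model to fit, e.g.
    # y = B0 * exp(B1 * x), plus expected parameter estimates if known.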