From: Guido v. R. <gu...@py...> - 2001-12-17 16:19:17

> I downloaded and installed 2.2c1 and built the Numeric installers.
> However, when I run them they all fail when the installation begins
> (after one clicks the final click to install) with an access violation.
> I removed any previous Numeric and it still happened.
>
> Building and installing with setup.py install works ok.

I don't know anything about your installers, but could it be that you were
trying to install without Administrator permissions? That used to crash the
previous Python installer too. (The new one doesn't, but it's a commercial
product so we can't share it.)

--Guido van Rossum (home page: http://www.python.org/~guido/)
From: Paul F. D. <pa...@pf...> - 2001-12-17 16:12:19

I downloaded and installed 2.2c1 and built the Numeric installers. However,
when I run them they all fail when the installation begins (after one
clicks the final click to install) with an access violation. I removed any
previous Numeric and it still happened.

Building and installing with setup.py install works ok.

-----Original Message-----
From: num...@li... [mailto:num...@li...] On Behalf Of Ray Drew
Sent: Monday, December 17, 2001 2:45 AM
To: num...@li...
Subject: [Numpy-discussion] Python 2.2

Does anyone know when a release for Python 2.2 on Windows will be
available?

regards,
Ray Drew
From: Ray D. <ray...@ya...> - 2001-12-17 10:47:41

Does anyone know when a release for Python 2.2 on Windows will be
available?

regards,
Ray Drew
From: Rob <ro...@py...> - 2001-12-16 21:04:47

I have a number of these routines in some EM code. I've tried to Numpyize
them, but end up with code that runs even slower. Here is the old indexed
routine:

    for JJ in range(0, TotExtMetalEdgeNum):
        McVector += Ccc[0:TotExtMetalEdgeNum, JJ] * VcVector[JJ]

Here is the Numpy version:

    McVector = add.reduce(transpose(Ccc[...] * VcVector[...]))

I wonder if there is another faster way to do this?

Thanks, Rob.

--
The Numeric Python EM Project
www.pythonemproject.com
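The column-accumulation loop above is just a matrix-vector product, so
(assuming the shapes line up as the loop implies) it can be done in one
`matrixmultiply`/`dot` call rather than a Python loop or a reduce over a
transposed temporary. A minimal sketch in plain Python with small stand-in
data (the values here are hypothetical; only the names come from the post):

```python
# Stand-in 3x3 matrix Ccc and vector VcVector (hypothetical values;
# the real arrays come from the EM code in the post).
Ccc = [[1.0, 2.0, 3.0],
       [4.0, 5.0, 6.0],
       [7.0, 8.0, 9.0]]
VcVector = [1.0, 0.5, 2.0]
n = 3

# The original loop: accumulate column JJ of Ccc scaled by VcVector[JJ].
McVector = [0.0] * n
for JJ in range(n):
    for i in range(n):
        McVector[i] += Ccc[i][JJ] * VcVector[JJ]

# The same result as a single matrix-vector product -- this is what
# Numeric.matrixmultiply(Ccc, VcVector) (numpy.dot today) computes in C.
matvec = [sum(Ccc[i][j] * VcVector[j] for j in range(n)) for i in range(n)]

print(McVector)            # [8.0, 18.5, 29.0]
print(matvec == McVector)  # True
```

Pushing the summation into one library call avoids both the Python-level
loop and the large `Ccc * VcVector` temporary that the `add.reduce`
version builds.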
From: Achim G. <Ach...@un...> - 2001-12-14 10:26:54

Hi! I've checked in my source, so it is available at
http://pygsl.sourceforge.net/ . A mailing list for discussion, needs and
support has been created; please mail to: pyg...@li... .

pygsl now provides:
    module pygsl.sf:    200 special functions
    module pygsl.const: 23 often-used mathematical constants
    module pygsl.ieee:  access to the IEEE-arithmetic layer of GSL
    module pygsl.rng:   several random number generators and different
                        probability densities

It can be used with gsl-0.* and with gsl-1.* (with a small change in
setup.py). Version 0.0.3 is tested for python-2.0. The CVS version can be
used with python-2.1 and python-2.2.

Help is needed and code is accepted with thanks. Just visit our CVS
repository at SourceForge: http://sourceforge.net/cvs/?group_id=34743

Yours, Achim
From: Chad N. <cn...@ma...> - 2001-12-13 21:11:55

> Date: Thu, 13 Dec 2001 12:39:32 +0100
> From: Nils Wagner <nw...@me...>
> Subject: [Numpy-discussion] Forth-Order Runge-Kutta
>
> I am looking for an implementation of the fourth-order Runge-Kutta
> method in Numpy.

Here is one I wrote (based on a similar one I had seen in lorentz.py in
the PyOpenGL demos, I believe) that I used for chaotic dynamics
visualizations.

    # A simple, non-stepsize-adaptive fourth-order Runge-Kutta
    # integration estimator method.
    def fourth_order_runge_kutta(self, xyz, dt):
        derivative = self.differentiator
        hdt = 0.5 * dt
        xyz = asarray(xyz)  # Force tuple or list to an array
        k1 = array(derivative(xyz))
        k2 = array(derivative(xyz + k1 * hdt))
        k3 = array(derivative(xyz + k2 * hdt))
        k4 = array(derivative(xyz + k3 * dt))
        new_xyz = xyz + (k1 + k4) * (dt / 6.0) + (k2 + k3) * (dt / 3.0)
        return new_xyz

where self.differentiator is a function that takes an x,y,z coordinate
tuple and returns the derivative as an x,y,z coordinate tuple. I wrote
this when I was much more naive about Runge-Kutta and Python Numeric, so
don't use it without some looking over. It is at least a good starting
point.

--
Chad Netzer
cn...@ma...
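The same classical RK4 update can be checked against a known solution
without Numeric at all. The sketch below (the function names are mine, not
from the post) applies one scalar RK4 step repeatedly to dx/dt = -x and
compares the result with the exact solution exp(-t):

```python
import math

def rk4_step(f, x, dt):
    """One classical fourth-order Runge-Kutta step for dx/dt = f(x)."""
    k1 = f(x)
    k2 = f(x + 0.5 * dt * k1)
    k3 = f(x + 0.5 * dt * k2)
    k4 = f(x + dt * k3)
    return x + (dt / 6.0) * (k1 + 2.0 * k2 + 2.0 * k3 + k4)

# Integrate dx/dt = -x from x(0) = 1 up to t = 1; exact answer is exp(-1).
x, dt = 1.0, 0.01
for _ in range(100):
    x = rk4_step(lambda v: -v, x, dt)

print(abs(x - math.exp(-1.0)))  # tiny: the method is fourth-order accurate
```

Note the (k1 + 2*k2 + 2*k3 + k4)/6 weighting here is algebraically the
same as the (k1 + k4)*(dt/6) + (k2 + k3)*(dt/3) grouping in the posted
code.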
From: Nils W. <nw...@me...> - 2001-12-13 10:38:07

Hi, I am looking for an implementation of the fourth-order Runge-Kutta
method in Numpy. Has it already been done by someone? Thanks in advance.

Nils
From: Greg K. <gp...@be...> - 2001-12-11 20:53:20

Spacesaver doesn't propagate through matrix multiplies:

    >>> import Numeric
    >>> x = Numeric.zeros((3,3), Numeric.Float32)
    >>> x.savespace(1)
    >>> x
    array([[ 0., 0., 0.],
           [ 0., 0., 0.],
           [ 0., 0., 0.]],'f')
    >>> x.spacesaver()
    1
    >>> y = Numeric.matrixmultiply(x, x)
    >>> y
    array([[ 0., 0., 0.],
           [ 0., 0., 0.],
           [ 0., 0., 0.]],'f')
    >>> y.spacesaver()
    0
From: Jonathan M. G. <jon...@va...> - 2001-12-10 23:04:04

I don't know too much. From what I have read, it seems that the money
dried up with the .com crash: the $800,000 allocated for development never
materialized, and the project head left for greener pastures. Some of the
projects that won the SC design competition have moved to SourceForge,
where they show some life (http://sourceforge.net/projects/roundup/,
http://sourceforge.net/projects/scons/).

The big thing, from what I know (which is not much), is that the promise
of almost a million dollars of federal funding for this project seems to
have evaporated.

At 02:32 PM 12/10/01, Joe Harrington wrote:
> Could you let us know what happened to Software Carpentry? The web
> site isn't very revealing. Who pulled the plug and why? Were they
> making expected progress? Are they dead or merely negotiating some
> changes with this or another sponsor?
From: Martin S. <sz...@ai...> - 2001-12-10 23:03:25

Hi all,

What is the recommended way of doing sparse matrix manipulations in NumPy?
I read about Travis Oliphant's SparsePy (http://pylab.sourceforge.net/),
but on that web page the links are broken (the tarball download fails).
Also, this is version 0.1 and has not been updated in a long time. Are
there any others?

If anybody has a copy of the SparsePy source, could you make it available
to me? (It would also be good if somebody could repair Travis Oliphant's
page; I tried to notify him, but I don't think I had the correct email.)

Has anybody tried compiling SparsePy under Windows? It seems tricky, given
that it needs a Fortran compiler. I was thinking of trying Cygwin, but I'm
not sure I can get it to link with the ActiveState Python compiled with
Microsoft tools.

Thanks,
-- Martin
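In the absence of a maintained sparse package, the core operation is easy
to sketch by hand. Below is a minimal coordinate-format (COO)
matrix-vector product in plain Python; it is purely illustrative and bears
no relation to SparsePy's actual API:

```python
# Coordinate (COO) storage: keep only the nonzero entries of a matrix
# as (row, col, value) triples.
def sparse_matvec(triples, x, nrows):
    """Compute y = A x where A is given as (row, col, value) triples."""
    y = [0.0] * nrows
    for i, j, v in triples:
        y[i] += v * x[j]
    return y

# A 3x3 matrix with three nonzeros:
#   [[2, 0, 0],
#    [0, 0, 5],
#    [1, 0, 0]]
A = [(0, 0, 2.0), (1, 2, 5.0), (2, 0, 1.0)]
x = [1.0, 4.0, 3.0]

print(sparse_matvec(A, x, 3))  # [2.0, 15.0, 1.0]
```

The work is proportional to the number of nonzeros, which is the whole
point of a sparse representation; a real package adds compressed formats
and C-level loops on top of this idea.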
From: John J. L. <jj...@po...> - 2001-12-10 22:48:14

On Mon, 10 Dec 2001, Joe Harrington wrote:
> Could you let us know what happened to Software Carpentry? The web
> site isn't very revealing. Who pulled the plug and why? Were they
> making expected progress? Are they dead or merely negotiating some
> changes with this or another sponsor?

Don't know about the official project / contest / whatever it was, but
some of the tools proposed have been implemented to some extent. For
example, SCcons --> scons (IIRC), and Roundup, the winning bug-tracking
system design (Ka-Ping Yee, I think), is being implemented by someone --
there was an announcement on the python announce list the other day.

John
From: Joe H. <jh...@oo...> - 2001-12-10 20:32:22

Could you let us know what happened to Software Carpentry? The web site
isn't very revealing. Who pulled the plug and why? Were they making
expected progress? Are they dead or merely negotiating some changes with
this or another sponsor?

Thanks,
--jh--
From: Jonathan M. G. <jon...@va...> - 2001-12-10 19:55:44

To follow up on Paul Dubois's wise comments about history, I would cite a
more recent example: Software Carpentry. In 1999, with great fanfare, DOE
(LANL) and CodeSourcery announced:

> The aim of the Software Carpentry project is to create a new generation
> of easy-to-use software engineering tools, and to document both those
> tools and the working practices they are meant to support. The Advanced
> Computing Laboratory at Los Alamos National Laboratory is providing
> $860,000 of funding for Software Carpentry, which is being administered
> by Code Sourcery, LLC. All of the project's designs, tools, test suites,
> and documentation will be generally available under the terms of an Open
> Source license.

With announcements to the scientific community and an article in Dr.
Dobb's Journal, this thing looked like a project aimed at development
tools quite similar to what you are envisioning for numerical tools, right
down to the python-centricity. If you go to the web site
(http://www.software-carpentry.com) today, you will find that a year into
the project, the plug was pulled. This does not bode well for big open or
free projects financed by the major scientific agencies. Curiously, the
private sector has done much better in this regard (e.g., VTK, spun out of
General Electric; Data Explorer, from IBM; etc.).

At 01:53 AM 12/7/01, Christos Siopis <si...@um...> wrote:
> In essence, what i am 'proposing' is for a big umbrella organization
> (NSF, NASA and IEEE come to mind) to sponsor the development of this
> uber-library for numerical scientific and engineering applications. This
> would be 'sold' as an infrastructure project: creating the essential
> functionality that is needed in order to build most kinds of scientific
> and engineering applications. It would save lots of duplication effort
> and improve productivity and quality at government labs, academia and
> the private sector alike. The end product would have some sort of
> open-source license (this can be a thorny issue, but i am sure a
> mutually satisfactory solution can be found). Alternatively (or in
> addition), it might be better to specify the API and leave the
> implementation part (open source and/or commercial) to others.

=============================================================================
Jonathan M. Gilligan                                   jon...@va...
The Robert T. Lagemann Assistant Professor   www.vanderbilt.edu/lsp/gilligan
  of Living State Physics                          Office: 615 343-6252
Dept. of Physics and Astronomy, Box 1807-B         Lab (X-Ray): 343-7574
6823 Stevenson Center                              Fax: 343-7263
Vanderbilt University, Nashville, TN 37235         Dep't Office: 322-2828
From: Phillip D. <phi...@xo...> - 2001-12-08 08:32:32

On Thursday 06 December 2001 06:39 pm, you wrote:
> We've (well Travis O. mostly) been working to divide SciPy into multiple
> "levels," and think this provides a solution for a wide range of users.
> There are 3 levels now, and there is likely to be a 4th "sumo" level
> added later.
>
> Level 1 is the core routines. Level 2 includes most Numeric algorithms.
> Level 3 includes graphics and perhaps a few other things. The sumo
> package would be a large all inclusive package with
> Python/Numeric/SciPy/VTK/wxPython(maybe)/PyCrust/MayaVi(maybe) and
> possibly others.

I think this sounds GREAT! I've been looking for such a package to provide
my company with a migration path from IDL, and such a package just might
do it. If you're looking for other suggestions to add to sumo, I'd like to
suggest ReportLab (or some sort of PDF generation tool) and as many
scientific data formats (such as HDF, netCDF, and about a million others)
as are sufficiently mature to be included. I've signed up to do an HDF5
port, and have been given a really nice start by Robert Kern. However,
this one's not yet ready for inclusion.

Phillip
From: Paul F. D. <pa...@pf...> - 2001-12-07 23:40:12

There is a bug report about fromstring. This bug report is correct. The
docstring differs from the documentation, which only mentions the
two-argument form (string, typecode); Numeric itself uses the latter form.
The implementation has the 'count' argument, but implemented incorrectly,
without allowing for keyword arguments.

In order to reconcile this with the least damage to the most people, I
have changed the signature to:

    fromstring(string, typecode='l', count=-1)

and allowed keyword arguments. This does not break the arguments as
advertised in the documentation, but it does break any explicitly
three-argument call in the old order. I will commit to CVS and this will
be in 2.3 unless someone mean yells at me.
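The semantics of the new signature -- decode `count` items of the given
typecode from the front of the byte string, or all of it when count is -1
-- can be sketched with the stdlib struct module. This emulation is the
editor's illustration, not Numeric's implementation:

```python
import struct

# A sketch (not Numeric's code) of fromstring(string, typecode, count):
# unpack `count` items of `typecode` from a byte string; count == -1
# means "as many items as fit".
def fromstring_sketch(data, typecode='l', count=-1):
    itemsize = struct.calcsize(typecode)
    if count == -1:
        count = len(data) // itemsize
    fmt = '%d%s' % (count, typecode)
    return list(struct.unpack(fmt, data[:count * itemsize]))

raw = struct.pack('4i', 10, 20, 30, 40)
print(fromstring_sketch(raw, 'i'))           # [10, 20, 30, 40]
print(fromstring_sketch(raw, 'i', count=2))  # [10, 20]
```

With count as a keyword argument, a call like the second one keeps working
even though the old positional three-argument order is gone.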
From: Joe H. <jh...@oo...> - 2001-12-07 21:01:27

Something that everyone should be aware of is that right now we *may* have
an opportunity to get significant support. Kodak has acquired RSI, makers
of IDL. Most of the planetary astronomy community uses IDL, as do many
geophysicists and medical imaging people. Kodak is dramatically raising
prices, and has killed support for Mac OS X. The IDL site license just
arranged for the group at NASA Ames is over $200k, making site licenses
more expensive than individual licenses were just a few years ago, on a
per-license basis. At the Division for Planetary Sciences meeting last
week, I was frequently approached by colleagues who said, "Joe, what do I
do?", and from the more savvy, "Is Python ready yet?"

I discussed the possibility of producing an OSS or PD analysis system with
a NASA program manager. He sees the money going out of his own programs to
Kodak, and is concerned. However, his programs cannot fund such an effort
as it is out of his scope. The right program is probably Applied
Information Systems Research, but he is checking around NASA HQ to see
whether this is the case. He was very positive about the idea. I suspect
that a proposal will be more likely to fly this year than next, as there
is a sense of concern right now, whereas next year people will already
have adjusted. Depending on how my '01 grant proposals turn out, I may be
proposing this to NASA in '02. Paul Barrett and I proposed it once before,
in 1996 I think, but to the wrong program.

Supporting parts of the effort from different sources would be wise. Paul
Dubois makes the excellent point that such efforts generally peter out. It
would be important to set this up as an OSS project with many
contributors, some of whom are paid full-time to design and build the
core. Good foundational documents and designs, and community reviews
solicited from savvy non-participants, would help ensure that progress
continued as sources of funding appeared and disappeared... and that there
is enough wider-community support to keep it going until it produces
something.

NASA's immediate objective will be a complete data-analysis system to
replace IDL, in short order, including an IDL-to-python converter program.
That shouldn't be hard, as IDL is a simple language, and PyDL may have
solved much of that problem.

So, at this point I'm still assessing what to do and whether/how to do it.
Should we put together proposals to the various funding agencies to
support SciPy? Should we create something new? What efforts exist in other
communities, particularly the numerical analysis community? How much can
we rely on savvy users to contribute code, and will that code be any good?
My feeling is that there is actually a lot of money available for this,
but it will require a few people to give up their jobs and pursue it
full-time. And there, as they say, is the rub.

--jh--
From: Paul F. D. <pa...@pf...> - 2001-12-07 16:53:08

Chris wrote in part:

-----Original Message-----
From: num...@li... [mailto:num...@li...] On Behalf Of Christos Siopis
<si...@um...>
Sent: Thursday, December 06, 2001 11:53 PM
To: num...@li...
Subject: Re: [Numpy-discussion] Meta: too many numerical libraries doing
the same thing?

<snip>

In essence, what i am 'proposing' is for a big umbrella organization (NSF,
NASA and IEEE come to mind) to sponsor the development of this
uber-library for numerical scientific and engineering applications. This
would be 'sold' as an infrastructure project: creating the essential
functionality that is needed in order to build most kinds of scientific
and engineering applications. It would save lots of duplication effort and
improve productivity and quality at government labs, academia and the
private sector alike. The end product would have some sort of open-source
license (this can be a thorny issue, but i am sure a mutually satisfactory
solution can be found).

-----------

Those who do not know history, etc. LLNL, LANL, and Sandia had such a
project in the 70s called the SLATEC library for mathematical software. It
was pretty successful for the Fortran era. However, the funding agencies
are unable to maintain interest in infrastructure very long.

If there came a day when the vast majority of scientific programmers
shared a platform and a language, there is now the communications
infrastructure so that they could do a good open-source library, given
someone to lead it with some vision. Linus Mathguy.
From: Christos S. <si...@um...> - 2001-12-07 07:53:26

I am reviving the 'too many numerical libraries doing the same thing?'
thread, with apologies for the long delay in responding to the list's
comments. Thanks to Joe Harrington, Paul Dubois, Konrad Hinsen and Chris
Barker, who took the time to respond to my posting. The funny thing is
that, as i see it, even though they were mostly trying to provide reasons
why such a project has not, and probably will not, be done, their reasons
sound to me like reasons why it *should* be done!

In essence, what i am 'proposing' is for a big umbrella organization (NSF,
NASA and IEEE come to mind) to sponsor the development of this
uber-library for numerical scientific and engineering applications. This
would be 'sold' as an infrastructure project: creating the essential
functionality that is needed in order to build most kinds of scientific
and engineering applications. It would save lots of duplication of effort
and improve productivity and quality at government labs, academia and the
private sector alike. The end product would have some sort of open-source
license (this can be a thorny issue, but i am sure a mutually satisfactory
solution can be found). Alternatively (or in addition), it might be better
to specify the API and leave the implementation part (open source and/or
commercial) to others.

Below i give some reasons, based mostly on the feedback from the list, why
i think it is appropriate that this effort be pursued at such a high level
(both bureaucratically and talent-wise).

- The task is too complex to be carried out by a single group of people
  (e.g., by a single government lab):

  * Not only does it involve 'too much work', but the expertise and the
    priorities of the group won't necessarily reflect those of the
    scientific and engineering community at large.

  * The resources of a single lab/group do not suffice. Even if they did,
    project managers primarily care to produce what is needed for their
    project. If it has beneficial side-effects for the community, so much
    the better, but this is not a priority. Alas, since the proposed task
    is too general to fit the needs of any single research group, no one
    will probably ever carry it out.

- The end product should be something that the 'average' numerical
  scientist will feel compelled to use. For instance, if the majority of
  the community does not feel at home with object-oriented techniques,
  there should be a layer (if technically possible) that would allow
  access to the library via more traditional languages (C/Fortran), all
  the while helping people to make the transition to OO, when appropriate
  --i can hear Paul saying "but it's *always* appropriate" :)

- There is a number of issues for which it is not presently clear that
  there is a good answer. Paul brought up some of these problems. For
  example, is it possible to build the library in layers such that it can
  be accessed at different levels -- as object components, as objects, and
  as 'usual', non-OO functions -- all at the same time? As Paul said,
  "Eiffel or C++ versions of some NAG routines typically have methods with
  one or two arguments while the C or Fortran ones have 15 or more". Does
  NAG use the same code base for both its non-OO and OO products?

- It is hard to find people who are good scientists, good numerical
  scientists, good programmers, have good coordination skills and can work
  in a team. This was presented as a reason why a project such as the one
  proposed cannot be done. But in fact i think these are all arguments why
  a widely-publicized and funded project such as this would have more
  chances to succeed than small-scale, individual efforts. I would also
  imagine that numerical analysts might feel more compelled to 'publicize'
  their work to the masses if there were a single, respected, quality
  resource that provides the context in which they can deposit their work
  (or a draft thereof, anyway).

- Some companies and managers are hesitant to use open-source products
  due to requirements for 'fiduciary responsibility', as Paul put it. I
  think that a product created under the conditions that i described would
  probably pass this requirement --not in the sense that there will be an
  entity to sue, but in the sense that it would be a reputable product
  sponsored by some of the top research organizations or even the
  government.

Christos Siopis                         Tel: 734-615-1585
Postdoctoral Research Fellow            Fax: 734-763-6317
Department of Astronomy                 E-mail: si...@um...
University of Michigan
Ann Arbor, MI 48109-1090, U.S.A.
http://www.astro.lsa.umich.edu/People/siopis.html
From: Konrad H. <hi...@cn...> - 2001-12-07 07:40:04

"eric" <er...@en...> writes:
> as a widely used tool. If you require multiple installations, you lose a
> huge number of potential users.

In the early days of NumPy, I had a similar problem because many people
didn't succeed in installing it properly. I ended up preparing a specially
packaged Python interpreter with NumPy plus my own scientific modules.
With increasingly frequent releases of NumPy and Python itself, this
became impossible to keep up. And since we have had distutils,
installation questions have become really rare, being mostly about
compatibility of my code with this or that version of NumPy (due to the
change in the C API export).

On the other hand, I work in a community dominated by Unix users, who come
in two varieties: Linux people, who just get RPMs, and others, who
typically have a competent systems administrator. The Windows world might
have different requirements.

Konrad.
--
Konrad Hinsen                              E-Mail: hi...@cn...
Centre de Biophysique Moleculaire (CNRS)   Tel.: +33-2.38.25.56.24
Rue Charles Sadron                         Fax: +33-2.38.63.15.17
45071 Orleans Cedex 2, France              Deutsch/Esperanto/English/
                                           Nederlands/Francais
From: eric <er...@en...> - 2001-12-07 03:38:19

Hey Joe,

You're right on many counts. A single monolithic package sacrifices the
ability to embed SciPy in other apps. This is unacceptable, as it is one
of the major powers of Python. On the other hand, a single monolithic
(single download) package is exactly what is needed to introduce the full
capabilities of Python and SciPy to a large number of novice users. I'm
hoping the choice between these two isn't an "either/or" decision.

We've (well Travis O. mostly) been working to divide SciPy into multiple
"levels," and think this provides a solution for a wide range of users.
There are 3 levels now, and there is likely to be a 4th "sumo" level added
later.

Level 1 is the core routines. Level 2 includes most Numeric algorithms.
Level 3 includes graphics and perhaps a few other things. The sumo package
would be a large all-inclusive package with
Python/Numeric/SciPy/VTK/wxPython(maybe)/PyCrust/MayaVi(maybe) and
possibly others. Its form hasn't been defined yet, but it is meant to be a
click-install package for the large population of potential users who are
looking for a scientific programming/exploration environment.

The goal of SciPy from the beginning has been to make Python an appealing
platform to the scientific masses -- not just the computer savvy. I firmly
believe that something like the sumo package is the only way that SciPy
will ever take hold as a widely used tool. If you require multiple
installations, you lose a huge number of potential users.

The other thing to remember is that there are really two major platforms
-- Linux and Windows (and many others hopefully supported). Linux users
are generally much more savvy and do not mind multiple installs. But the
Windows world remains a huge portion of the target audience, and these
people generally have a very different set of packaging expectations. If
it ain't click/install/run, they'll simply choose a different package.

We've considered releasing maybe 3 different packages -- say level 1/2,
level 1/2/3, and sumo -- so that people can pick the one they wish to use.
On the good side, this gives the experts the core algorithms packaged up
without much cruft, so that they can use them with their current Python,
and the novice an easy way to try out all the cool features of
Python/SciPy/visualization. On the bad side, it introduces maintenance
headaches and might lead to version issues. Still, I'm really hoping a
good compromise can be found here.

> If SciPy or any other scientific package includes a Python
> interpreter, it should have a special name, like "scipy", and not be
> "python" to the command line. Frankly, I prefer the layered approach,
> so long as everyone works to make the layers "just work" together.
> This is quite practical with modern package managers.

I guess the sumo version might have the interpreter renamed scipy -- we'll
wait till at least a few planks have been laid before we try and cross
that bridge... On the layering side, I think SciPy is moving in the
general direction that you're suggesting.

see ya,
eric

----- Original Message -----
From: "Joe Harrington" <jh...@oo...>
To: <num...@li...>
Cc: <jh...@oo...>
Sent: Thursday, December 06, 2001 6:33 PM
Subject: Re: [Numpy-discussion] Meta: too many numerical libraries doing
the same thing?

> Sorry for the late hit, I have been away at a conference.
>
> Regarding the issue of SciPy including a Python interpreter and being
> a complete distribution, I think this is a *poor* idea. Most Linux
> distributions include Python for OS use. I ran into trouble when I
> supported a separate Python for science use. There was confusion
> (path issues, etc.) about which Python people were getting, why the
> need for a second version, etc. Then, the science version got out of
> date as I updated my OS and wanted to keep my data analysis version
> stable. That led to confusion about what features were broken/fixed
> in what version. If SciPy includes Python, it has to make a
> commitment to release a new, tested version of the whole package just
> as soon as the new Python is available. That will complicate the
> release schedule, as Python releases won't be in sync with the SciPy
> development cycle. There is also the issue, if OSs start including
> both mainstream Python and SciPy, of their inherently coming with two
> different versions of Python, and thereby causing headaches for the OS
> distributor. The likely result is that the distributor would drop
> SciPy. Further, they will have to to decide which version to install
> in /usr/bin (SciPy will lose that one).
>
> If SciPy or any other scientific package includes a Python
> interpreter, it should have a special name, like "scipy", and not be
> "python" to the command line. Frankly, I prefer the layered approach,
> so long as everyone works to make the layers "just work" together.
> This is quite practical with modern package managers.
>
> --jh--
From: Joe H. <jh...@oo...> - 2001-12-06 23:33:40

Sorry for the late hit, I have been away at a conference.

Regarding the issue of SciPy including a Python interpreter and being a
complete distribution, I think this is a *poor* idea. Most Linux
distributions include Python for OS use. I ran into trouble when I
supported a separate Python for science use. There was confusion (path
issues, etc.) about which Python people were getting, why the need for a
second version, etc. Then, the science version got out of date as I
updated my OS and wanted to keep my data analysis version stable. That led
to confusion about what features were broken/fixed in what version.

If SciPy includes Python, it has to make a commitment to release a new,
tested version of the whole package just as soon as the new Python is
available. That will complicate the release schedule, as Python releases
won't be in sync with the SciPy development cycle. There is also the
issue, if OSs start including both mainstream Python and SciPy, of their
inherently coming with two different versions of Python, and thereby
causing headaches for the OS distributor. The likely result is that the
distributor would drop SciPy. Further, they will have to decide which
version to install in /usr/bin (SciPy will lose that one).

If SciPy or any other scientific package includes a Python interpreter, it
should have a special name, like "scipy", and not be "python" to the
command line. Frankly, I prefer the layered approach, so long as everyone
works to make the layers "just work" together. This is quite practical
with modern package managers.

--jh--
From: <sa...@hy...> - 2001-12-06 20:56:55

Well, never mind. I figured out that my test variables silently went to
double precision, and that this is a case of precision issues, not
rounding error. So, I guess I will have to force my arrays to double
precision. Sorry to bother the list.

sue
From: <sa...@hy...> - 2001-12-06 20:29:41

I have been using add.reduce on some arrays with single precision data in
them. At some point, the reduction seems to be producing incorrect values,
caused, I presume, by floating point rounding errors. The values are
correct if the same array is created as double precision. The odd thing is
that I can do a straight summing of those values into a long, single or
double variable and get correct answers. Is this a bug or what? If the
difference is due to rounding error, I would expect the same errors to
show up in the cases of summing the individual values.

The code below produces the following output. Note that the add.reduce
from the single precision array is different from all the others. It
doesn't matter what the rank or size of the array is, just so the sum of
values gets to a certain size.

    accumulated sums
        float   75151440.0      long   75151440
    accumulated from raveled array
        float   75151440.0      double 75151440.0
    add.reduce
        single  75150288.0      double 75151440.0

--- code ---

    import MA

    # this causes the same problem if the array is 1d of
    # [amax*bmax*cmax] in len
    amax = 4
    bmax = 31
    cmax = 12
    farr = MA.zeros((amax, bmax, cmax), 'f')  # single float
    darr = MA.zeros((amax, bmax, cmax), 'd')  # double float
    sum = 0.0
    lsum = 0
    value = 50505  # reducing this can cause all values to agree
    for a in range(0, amax):
        for b in range(0, bmax):
            for c in range(0, cmax):
                farr[a, b, c] = value
                darr[a, b, c] = value
                sum = sum + value
                lsum = lsum + value

    fflat = MA.ravel(farr)
    dflat = MA.ravel(darr)
    fsum = dsum = 0.0
    for value in fflat:
        fsum = fsum + value
    for value in dflat:
        dsum = dsum + value

    freduce = MA.add.reduce(fflat)
    dreduce = MA.add.reduce(dflat)

    print "accumulated sums"
    print "\tfloat\t", sum, "\tlong ", lsum
    print "accumulated from raveled array"
    print "\tfloat\t", fsum, "\tdouble", dsum
    print "add.reduce"
    print "\tsingle\t", freduce, "\tdouble", dreduce
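The discrepancy above is sequential single-precision accumulation at work:
the Python loops sum into a Python float (a double), while add.reduce on a
'f' array keeps its running total in single precision, which cannot
represent every integer once the total passes 2**24. A sketch that
reproduces the effect without Numeric, by rounding the running total to
float32 with the stdlib struct module (the 4*31*12 count and the 50505
value come from the code in the post):

```python
import struct

def to_f32(x):
    # Round a Python float (a double) to the nearest IEEE single
    # precision value, as storing into a float32 array element would.
    return struct.unpack('f', struct.pack('f', x))[0]

n, value = 4 * 31 * 12, 50505.0  # 1488 entries, as in the post
exact = n * value                # 75151440.0

total = 0.0
for _ in range(n):
    total = to_f32(total + value)  # accumulate in single precision

print(exact)           # 75151440.0
print(total)           # differs: each add past 2**24 must round
print(total == exact)  # False, matching the add.reduce result above
```

So it is not a bug in add.reduce; the reduction simply inherits the
array's precision, which is why casting the array to double fixes it.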
From: Jeff W. <js...@cd...> - 2001-12-06 19:04:30

Oli: Works fine for me. I've contributed a numeric python package to the
fink distribution (fink.sf.net); it saves you the trouble of porting it
yourself.

-Jeff

On Thu, 6 Dec 2001, oli wrote:
> Hi,
> I have tried to compile Numpy for MacOsX but it doesn't compile the C
> modules... Has anyone else had this problem?
> Thanks
> oli

--
Jeffrey S. Whitaker                 Phone: (303) 497-6313
Meteorologist                       Fax:   (303) 497-6449
NOAA/OAR/CDC R/CDC1                 Email: js...@cd...
325 Broadway                        Web:   www.cdc.noaa.gov/~jsw
Boulder, CO, USA 80303-3328         Office: Skaggs Research Cntr 1D-124
From: oli <Oli...@ep...> - 2001-12-06 18:27:12

Hi,

I have tried to compile Numpy for MacOsX, but it doesn't compile the C
modules... Has anyone else had this problem?

Thanks,
oli