When you execute "from visual import *" the following code in
__init__.py is executed:
if 1:
    # Names defined for backward compatibility with Visual 3:
    import sys, time
    true = True
    false = False
    crayola = color
    from cvisual import vector_array, scalar_array
With the break represented by the Python 2 -> 3 transition, I
propose deleting "import sys, time" and "from cvisual import
vector_array, scalar_array".
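Under that proposal the backward-compatibility block would shrink to something like the sketch below. (This is only an illustration, not the actual file; the placeholder class stands in for Visual's real color module, which of course is defined elsewhere in the package.)

```python
# Sketch of the reduced backward-compatibility block in __init__.py.
class color:                 # placeholder for Visual's actual color module
    red = (1, 0, 0)

true = True                  # added to Visual before Python had True
false = False                # added to Visual before Python had False
crayola = color              # very old alias for the color module

# Old programs that use these names keep working:
assert true is True and false is False
assert crayola.red == (1, 0, 0)
```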
It seems harmless to retain the definitions of true and false (which
were added to Visual very early, before Python offered True and
False), and the definition of crayola. All of these now unneeded
entities are embedded in very old programs that people expect to
continue working. A real-life example is an instructor using some old
demo program in class, with some old or new version of Python. The
instructor may know little more about VPython than to double-click on
an inherited program, and if there are no serious consequences for the
future it makes sense to keep old programs working. (Of course old
print statements will fail on Python 3, and there's no solution for
that.) I don't see any serious consequences: If you use "true" when
you intend "True" it's fine, and if you assign "true = 17" you
override Visual's definition.
Probably very few people were even aware that sys and time were being
imported, and I strongly suspect that no one exploited these imports
after the very early period in the development of Visual (starting in
2000). I searched in various places for "sys." and "time." without
finding any instances other than programs that specifically imported
sys or time themselves.
The case of vector_array and scalar_array is more complex. In August
2003 Jonathan Brandmeyer created these objects with an eye to
eliminating Visual's dependence on Numeric (the predecessor of
numpy), which is used for some attributes of the array
objects (curve, convex, faces, points), including pos, color, and
normal (for faces). It seemed possible to use vector_array for these
arrays of vectors. However, in December 2003 he reverted to using
Numeric arrays for these attributes when it became apparent that there
were a number of existing interesting programs that needed the
capabilities of Numeric arrays; examples include drape.py, wave.py,
and toroid_drag.py.
In March 2004 he released VPython 2.9, which retained the use of
Numeric for array object attributes but announced this:
"New classes vector_array and scalar_array: these have an interface
similar to that provided by Numeric, but it provides
3D-vector-specific operations that are significantly faster than
Numeric and are more readable than the equivalent Numeric code."
These new classes were never described in the Visual Help and seem
never to have really been used. When I tried some things with
vector_array this week I found for example that subtracting one
vector_array from another yielded their sum, due to a typo in the
code. There is no slicing machinery. You can't take the cosine of such
an array. And so on. It doesn't make sense to try to extend
vector_array to do all the things that numpy does, which is a huge
package.
The example program crystal.py, originally written by me but rewritten
by Jonathan, does use vector_array, but that's the only example I've
run across (and I've succeeded, with some effort, in rewriting it to use
numpy). It is true that it runs faster with more readable code using
vector_array, but I'm afraid that it just isn't sufficiently general.
Here is the vector_array code from crystal.py, where a.nearpos is a
list of the sphere.pos locations of those atoms that are closest
neighbors to atom a (so a.nearpos is essentially a list of pointers):
r = vector_array(a.nearpos) - a.pos
a.p += k_dt * (r.norm()*(r.mag()-L)).sum()
Here is the equivalent numpy array code from crystal.py:
r = array(a.nearpos) - a.pos
rmag = sqrt(sum(square(r), -1)).reshape(-1, 1)
a.p += k_dt * sum((1 - L/rmag)*r, 0)
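To see that the two versions compute the same thing, note that the numpy expression (1 - L/rmag)*r equals (r/rmag)*(rmag - L), i.e. the unit vector times (magnitude minus L), which is exactly r.norm()*(r.mag()-L). Here is a minimal self-contained numpy sketch that checks this with made-up sample data (the values of L, k_dt, the atom position, and the neighbor positions are invented for illustration; in crystal.py they come from the simulation):

```python
import numpy as np

# Invented example values, just for the check:
L = 1.0                                  # natural spring length
k_dt = 0.01                              # spring constant times time step
a_pos = np.array([0.0, 0.0, 0.0])        # position of atom a
nearpos = [[1.5, 0.0, 0.0],              # positions of nearest neighbors
           [0.0, 2.0, 0.0],
           [0.0, 0.0, 0.5]]

r = np.array(nearpos) - a_pos                            # displacement vectors
rmag = np.sqrt(np.sum(np.square(r), -1)).reshape(-1, 1)  # per-vector magnitudes
dp = k_dt * np.sum((1 - L/rmag)*r, 0)                    # momentum change, summed over neighbors

# Same thing written as unit vector times (magnitude - L), as in the
# vector_array version:
dp_check = k_dt * np.sum((r/rmag)*(rmag - L), 0)
assert np.allclose(dp, dp_check)
```

The reshape(-1, 1) is what lets the (N, 1) column of magnitudes broadcast against the (N, 3) array of vectors.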
This whole episode, however, does raise an interesting question. The
numpy package has major subpackages such as fft and linalg. As far as
I can tell, nowhere in numpy are there functions that are specifically
attuned to the 3D vectors that are central to the tasks for which
VPython is designed; it would be great if there were such a package. A
big advantage of numpy is its generality, handling arrays that are M
by N by .... But for our purposes it is a disadvantage that nothing
in numpy is tuned specifically to 3D geometry. For
example, there is a function ("norm") in numpy.linalg, but I have been
unable to figure out how to get it to take the magnitudes of an array
of 3D vectors; it seems capable solely of taking the square root of
the sum of the squares of ALL the components of ALL the 3D vectors. I think
the main reason why the numpy version of crystal.py is slower than the
vector_array version is the necessity of making several more function
calls.
Comments?
Bruce