Hi Jess,
If it's always what you want, you can add an alias to your .cshrc file to
make it the default:
alias psize "$APBSTOOLS/psize.py --GMEMCEIL=2000"
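Incidentally, the memory estimates in the output below look consistent with the new psize assuming roughly 200 bytes per fine-grid point (the 0.3.2 output suggests it used ~160), so you can sanity-check whether a grid will fit under a given ceiling before running anything. A quick sketch — the function names are mine, not part of the APBS tools:

```python
# Rough memory estimate matching psize 0.4.0's apparent rule of thumb:
# ~200 bytes per fine-grid point.  For the 321 x 161 x 129 grid in the
# output below this gives ~1271.6 MB, matching the reported 1271.601 MB.
# Function names here are my own, not from psize.py itself.

def estimate_mb(nx, ny, nz, bytes_per_point=200.0):
    """Estimated sequential-solve memory (MB) for an nx x ny x nz grid."""
    return bytes_per_point * nx * ny * nz / 1024.0 / 1024.0

def fits_ceiling(nx, ny, nz, gmemceil=400.0):
    """True if a sequential solve should fit under the ceiling (in MB)."""
    return estimate_mb(nx, ny, nz) <= gmemceil

print(estimate_mb(321, 161, 129))  # ~1271.6 MB, so GMEMCEIL=2000 suffices
```

So for Jessica's grid, any ceiling above ~1272 MB (e.g. the suggested 2000) should let psize report a sequential solve.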
David
On 1/26/06, Todd Dolinsky <todd@...> wrote:
>
> Hi Jessica,
>
> The change was originally part of a bug fix. In the past we had more than
> a handful of users try to use finely-spaced sequential grids on machines
> that lacked the available memory, so we switched to only displaying grids
> that would fit on a standard machine (with a default memory ceiling of
> 400 MB).
>
> With that in mind, you should be able to generate sequential-grid spacings
> by increasing the memory ceiling with the --GMEMCEIL flag. Since it looks
> like the suggested memory ceiling for your PQR file was ~1300 MB, passing
> in the --GMEMCEIL flag like
>
> python psize.py --SPACE=0.10 --GMEMCEIL=2000 cramhelix1abt.pqr
>
> should allow you to make a sequential grid.
>
> Similarly, if you use inputgen.py with the --METHOD=auto flag, it will
> ensure that the example input file is sequential and not parallel. See the
> inputgen help (python inputgen.py --help) for more info.
>
> I think that answers your question; if not, please let me know.
>
> Todd
>
> On 1/26/06, Jessica MJ Swanson <jswanson@...> wrote:
> >
> > Hi all,
> >
> > Just wondering if the updated version of psize.py could use a fix. The
> > previous version gave the mesh spacing for the single (non-parallel
> > execution) grid, but the new version only gives spacings for the grids
> > parceled out in a parallel run (see example below). Both would be ideal.
> > Can we either change it back or add the single to the parallel?
> >
> > Thanks,
> > Jessica
> >
> >
> > version 0.4.0
> > $APBSTOOLS/psize.py --SPACE=0.10 cramhelix1abt.pqr
> >
> > ############## GENERAL CALCULATION INFO #############
> > Coarse grid dims = 33.109 x 16.199 x 15.609 A
> > Fine grid dims = 33.109 x 16.199 x 15.609 A
> > Num. fine grid pts. = 321 x 161 x 129
> > Parallel solve required (1271.601 MB > 400.000 MB)
> > Total processors required = 4
> > Proc. grid = 2 x 2 x 1
> > Grid pts. on each proc. = 97 x 129 x 129
> > Fine mesh spacing = 0.205647 x 0.0760531 x 0.121948 A
> > Estimated mem. required for parallel solve = 307.880 MB/proc.
> > Number of focusing operations = 2
> >
> > version 0.3.2
> > $APBSTOOLS/psize.py --SPACE=0.10 cramhelix1abt.pqr
> >
> > ################# CALCULATION INFO ####################
> > Coarse grid dims = 33.109 x 16.199 x 15.609 A
> > Fine grid dims = 33.109 x 16.199 x 15.609 A
> > Num. fine grid pts. = 321 x 161 x 129
> > Fine mesh spacing = 0.103466 x 0.101246 x 0.121948 A
> > Estimated mem. required for sequential solve = 1017.280 MB
> > Parallel solve required (1017.280 MB > 400.000 MB)
> > Proc. grid = 2 x 2 x 1
> > Grid pts. on each proc. = 129 x 129 x 129
> > Estimated mem. required for parallel solve = 327.559 MB/proc.
> > Number of focusing operations = 2
> >
> >
> >
> > _______________________________________________
> > apbsusers mailing list
> > apbsusers@...
> > https://lists.sourceforge.net/lists/listinfo/apbsusers
> >
>
>
