
On 23 Jun, 2010, at 10:41 AM, Tom Browder wrote:

On Wed, Jun 23, 2010 at 07:06, Christopher Sean Morrison wrote:
> On Jun 23, 2010, at 6:58 AM, Tom Browder wrote:
>> How much efficiency is gained (or lost) by writing the complete series
>> of commands to a script first, and then feeding it to mged in one
>> chunk?
> Batching the segments together in sets of 200 (i.e., 1800 commands at
> a time) takes LESS overall runtime than invoking MGED one segment at
> a time for just 10 segments (i.e., 9 commands at a time, 90 total
> commands).
Then wouldn't it be better still to do it in one script fed to mged?

Ah!  Yes, of course!  I misread your original message, particularly the "in one chunk" part about feeding it as one script.  Invoking mged only once will be the clear winner and fully minimizes the overhead.  I can't think of a situation where it wouldn't be the fastest.

The script was originally implemented to create one segment at a time, which was clearly non-optimal with several minutes of runtime.  Batching into larger groups was a simple compromise that takes only 12s here.  I didn't want to deviate from the original script too much and was intentionally avoiding writing out a script file for simplicity.  Unfortunately, the full set of commands is too long to be run as a single command line (about 260k characters), and I didn't want to jump through the hoops needed to feed mged's standard input from within Perl.
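For what it's worth, feeding the whole batch to one process on standard input takes less hoop-jumping than it sounds.  Here's a minimal sketch in Python rather than Perl; the segment-command names are placeholders (not real mged syntax), and `cat` stands in for the real `mged -c db.g` invocation so the example runs anywhere:

```python
import subprocess

def build_script(n_segments, cmds_per_segment=9):
    """Collect every command into one newline-terminated script string.

    The "make seg..." lines are placeholder commands, not real mged
    geometry commands -- the point is only the batching shape.
    """
    lines = []
    for i in range(n_segments):
        for j in range(cmds_per_segment):
            lines.append(f"make seg{i}_{j}")
    return "\n".join(lines) + "\n"

def run_once(script, tool=("cat",)):
    # One invocation, whole script on stdin.  Against the real tool this
    # would be something like tool=("mged", "-c", "db.g").
    result = subprocess.run(list(tool), input=script,
                            capture_output=True, text=True, check=True)
    return result.stdout

script = build_script(200)   # 200 segments x 9 commands = 1800 lines
output = run_once(script)    # single process start-up, no per-segment cost
```

The same shape works from Perl with a piped `open` to the tool's stdin; either way the process starts exactly once, which is where the runtime win comes from.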

Cheers= !