The problem seems to be that you have a LaTeX document which contains Reduce code wrapped in \begin{reduce}-\end{reduce} blocks, and the goal is to use the same file for both LaTeX and Reduce in order to avoid duplication. I suppose that you want to split the code into discontiguous blocks (otherwise you could simply write a .red file and include it with the listings package, http://en.wikibooks.org/wiki/LaTeX/Packages/Listings).
An easy solution to this problem would be to extract the Reduce code from the .tex file. This may be enough:

vim -e -s -c 'g/\\begin{reduce}/+1,/\\end{reduce}/-1 p' -cq texfile.tex > reducefile.red

(adapted from http://www.commandlinefu.com/commands/view/3202/display-a-block-of-text-with-vim-with-offset-like-with-awk)

I have tried it on a handmade file, and it seems to work. You only have to take care not to put any Reduce code on the lines on which the \begin{reduce} and \end{reduce} statements appear.
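
For instance, with a (made-up) texfile.tex laid out along these lines (assuming the reduce environment is defined somewhere in the preamble, so that the document itself still compiles):

    \documentclass{article}
    \begin{document}
    The derivative of $(a+b)^2$ with respect to $a$:
    \begin{reduce}
    u := (a+b)^2;
    df(u, a);
    \end{reduce}
    \end{document}

the command above should leave only the two lines between \begin{reduce} and \end{reduce} in reducefile.red:

    u := (a+b)^2;
    df(u, a);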


2011/11/6 Arthur Norman <acn1@cam.ac.uk>
I observe that others (probably including myself) feel happier with
putting TeX within Reduce comments rather than sort of the other way
around - but my overall stance is that different people can legitimately
wish to do things their own way with files that are theirs. So without any
suggestion that it is something that Reduce as a whole wants to adopt, I
have hacked in some code!

The string of characters "\end{reduce}" should introduce a comment (as far
as Reduce is concerned) that is ended by "\begin{reduce}". These are
detected without regard to whether they are at the start of a line, but
probably not within Reduce comments or strings. I hope very much that the
string "\end{reduce}" could not legitimately arise as valid Reduce input
otherwise, and am encouraged in that belief by the fact that "\" is only
used in limited places and "end" is a reserved word.
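
To illustrate (this is just a made-up fragment), input along the lines of

    u := (a+b)^2;  \end{reduce}
    Some TeX prose here, which Reduce should skip.
    \begin{reduce}
    v := df(u, a);

ought to be read by Reduce as just the two assignments, with everything
from "\end{reduce}" up to the next "\begin{reduce}" treated as comment.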

Alongside the command "in" that reads one or more files there is now an
"in_tex". The way that works is that it sticks the string "\end{reduce}"
to be processed as if it was there just before the file you read. The
effect should be that if you have TeX stuff in your file that only the
bits from \begin{reduce} to \end{reduce} get processed by Reduce. There is
no attempt to look at file contents to see if this is wanted, so you need
to know whether you are wanting to read in one of these special files or
not.
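
As a sketch of how that might look in use (the file name talk.tex is made
up, and the calling syntax here just mirrors that of "in"), with a file
such as

    \documentclass{article}
    % preamble that defines the reduce environment, for LaTeX's benefit only
    \begin{document}
    Some explanatory TeX prose.
    \begin{reduce}
    f := (x+y)^3;
    g := df(f, x);
    \end{reduce}
    More TeX prose.
    \end{document}

the command

    in_tex "talk.tex";

should execute just the two assignments, with all the surrounding TeX
treated as comment.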

The implementation is crude - but as far as anybody not using it is
concerned it merely adds a few lines of code that are not executed
unless somebody says "\end", plus an extra test when a "\" occurs in the
input.

Anybody selecting new Reduce files to parse from should
beware if there is any chance of wanting to switch inputs when
the existing tokeniser has just seen, say, "\end{reducX": at that
stage it will have just returned the "\" and pushed back the "end{reducX"
into a look-ahead buffer that would need saving and restoring. Indeed the
input "\end" leads to that sort of behaviour, however the special status
and treatment of the word "end" means, I hope, that no valid cases of that
can ever arise!

If another developer hates this being present they may remove it! I rather
hate my code - if only because rlisp/tok.red has to be coded in a meagre
subset of rlisp with loads of goto statements because of its use early in
bootstrapping.

If this is close to what was asked about, maybe it will help, and I could
perhaps tinker around the edges. But maybe the scheme that embeds TeX
after "%" comment markers within Reduce code is also useful. I am not
using either at present!!!

              Arthur




--
Diego Esteban Alonso Blas.
Predoctoral Fellow, UCM.
Departamento de Sistemas Informáticos y Computación. 
Facultad de Informática. Universidad Complutense de Madrid.