I use Perl a lot to create XML out of flat files when each row can produce its part of the tree with no dependencies on what came before or after.  Generally this data comes from a relational database...
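That row-at-a-time approach can be sketched as follows (in Python rather than Perl; the delimiter, field names, and sample row are illustrative assumptions, not from the thread):

```python
import xml.etree.ElementTree as ET

def row_to_xml(line, field_names, sep="|"):
    """Turn one delimited flat-file row into a standalone XML fragment.

    Each row is independent of the rows before and after it, so the
    whole file can be converted in a single streaming pass with
    constant memory.
    """
    record = ET.Element("record")
    for name, value in zip(field_names, line.rstrip("\n").split(sep)):
        ET.SubElement(record, name).text = value
    return ET.tostring(record, encoding="unicode")

# Hypothetical sample row; the real bank format is not shown in the thread.
print(row_to_xml("1001|ACME|2500.00", ["id", "name", "amount"]))
# → <record><id>1001</id><name>ACME</name><amount>2500.00</amount></record>
```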

On Dec 17, 2007 11:28 PM, Tatu Saloranta <cowtowncoder@yahoo.com> wrote:
--- Dimitre Novatchev <dnovatchev@gmail.com> wrote:

> On Dec 17, 2007 3:25 AM, Julio de la Vega
> <juvepo@gmail.com> wrote:
> > I am working for a bank on this issue. They
> > generate huge flat files that I have to
> > transform to XML. I estimate that my output
> > files will be between 2 and 3 GB. I will review
> > your information and try to find the best solution.
> Then your application should not just transform the
> huge text input file into a single XML file. You are
> in full control and can produce many XML files from
> the huge text file.

Besides, if the input is text, not XML, why even bother
with XSLT in the first place?
If you have to parse/splice textual content and
produce XML output, it's likely rather simple to
use any old XML writer for the output side. The
result could be written as multiple smaller files, or as
one huge file, depending on who has to process them.
-+ Tatu +-
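Tatu's suggestion, a plain XML writer plus output splitting, might look roughly like this (file naming, chunk size, and element names are illustrative assumptions):

```python
from xml.sax.saxutils import escape

def split_to_xml_files(lines, rows_per_file=100_000, prefix="part"):
    """Write delimited rows as a series of small well-formed XML files
    instead of one multi-gigabyte document.

    Each output file gets its own declaration and root element, so any
    one of them can be processed independently downstream.
    """
    paths = []
    for i in range(0, len(lines), rows_per_file):
        path = f"{prefix}{i // rows_per_file:04d}.xml"
        with open(path, "w", encoding="utf-8") as out:
            out.write('<?xml version="1.0" encoding="UTF-8"?>\n<records>\n')
            for line in lines[i:i + rows_per_file]:
                # escape() guards the markup against &, <, > in the raw data
                out.write(f"  <row>{escape(line)}</row>\n")
            out.write("</records>\n")
        paths.append(path)
    return paths
```

For truly huge inputs, `lines` would be an iterator over the file rather than a list, but the splitting logic is the same.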


saxon-help mailing list