In some cases, such as when parsing a Turtle file in my scenario, there are edge cases that result in very large tokens, even gigabytes in size. So, could you implement some type of BLOB token type that allows for streaming?
Please specify the API of 'blob' and how it differs from the accumulator.
Inge Eivind Henriksen
A client may want to stream a 1 GB BLOB over a network connection. To avoid exhausting internal memory resources, I would like to stream a single token that takes up 1 GB of space in chunks of 4 KB; is this possible? I have looked at all the Quex accumulator examples I could find, to no avail. Thank you.
Perhaps I could use Direct Buffer Access to achieve this kind of token streaming? http://quex.sourceforge.net/doc/html/buffer-access/intro.html
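Whether it is done via Direct Buffer Access or from a match action, the core idea is the same: hand the huge lexeme to a consumer in fixed-size chunks instead of accumulating it. The sketch below illustrates that idea in plain C++; the function and callback names are illustrative assumptions, not part of the Quex API.

```cpp
#include <algorithm>
#include <cstddef>
#include <functional>
#include <string>

// Illustrative sketch only: none of these names come from Quex.
// The caller never holds more than one chunk of the BLOB at a time.
constexpr std::size_t kChunkSize = 4096;

// Drain 'blob' through 'consume' in pieces of at most kChunkSize bytes
// (e.g. 'consume' could write each piece to a socket). Returns the total
// number of bytes handed to the consumer.
std::size_t stream_blob(const std::string& blob,
                        const std::function<void(const char*, std::size_t)>& consume) {
    std::size_t total = 0;
    while (total < blob.size()) {
        const std::size_t n = std::min(kChunkSize, blob.size() - total);
        consume(blob.data() + total, n);
        total += n;
    }
    return total;
}
```

In a real integration, the lexer (or the buffer-filling loop, with Direct Buffer Access) would invoke the consumer each time another 4 KB of the BLOB token has been recognized, so the full 1 GB token never resides in memory at once.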
Can you have something like a 'sub-lexeme with maximum size 4096' defined
in your grammar? Then you might define a mode for it. Instead of writing
it into the accumulator, you might stream it through a socket. Or, do you
mean the START/PAUSE/RESUME/STOP API of streaming servers?
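The suggestion above might look roughly like the following mode definition. This is a hypothetical sketch only; the pattern, mode name, and actions are assumptions and should be checked against the Quex documentation before use.

```
// Hypothetical sketch, not verified Quex syntax.
// A dedicated mode matches the blob body in bounded pieces, so each
// match action sees at most 4096 characters of the lexeme at a time.
mode BLOB_MODE :
{
    [^"]{1,4096}   { /* stream the (<= 4096 char) Lexeme to a socket
                        here, instead of adding it to the accumulator */ }
    "\""           => GOTO(MAIN);   // end of blob, back to normal lexing
}
```

The point of the bounded repetition is that the engine emits many small matches rather than one gigabyte-sized lexeme, so the match action can forward each piece immediately.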