From: Pushkar R. P. <top...@gm...> - 2013-07-19 00:27:47
Thanks. I will try it out and post any findings.

Pushkar

On Thu, Jul 18, 2013 at 12:36 AM, Andreas Hilboll <li...@hi...> wrote:
>
> You could use pandas_ and the read_table function. There, you have nrows
> and skiprows parameters with which you can easily do your own 'streaming'.
>
> .. _pandas: http://pandas.pydata.org/

On Thu, Jul 18, 2013 at 1:00 AM, Antonio Valentino <ant...@ti...> wrote:
> Hi Pushkar,
>
> On 18/07/2013 08:45, Pushkar Raj Pande wrote:
> > Both loadtxt and genfromtxt read the entire data into memory, which is
> > not desirable. Is there a way to achieve streaming writes?
>
> OK, probably fromfile [1] can help you cook up something that works
> without loading the entire file into memory (and without too many
> iterations over the file).
>
> Anyway, I strongly recommend that you not perform read/write cycles on
> single lines; rather, define a reasonable data block size (a number of
> rows) and process the file in chunks.
>
> If you find a reasonably simple solution, it would be nice to include it
> in our documentation as an example or a "recipe" [2].
>
> [1] http://docs.scipy.org/doc/numpy/reference/generated/numpy.fromfile.html#numpy.fromfile
> [2] http://pytables.github.io/latest/cookbook/index.html
>
> best regards
>
> antonio