[Torch5-devel] apply() and torch5-devel mailing list
From: Ronan C. <ro...@co...> - 2007-11-20 21:26:17
Hey,

---- disclaimer

I created a torch5-devel list, so if you want to discuss something about the development of Torch, write here. Newly added features should also be advertised here. I will not send any mail personally anymore, so I strongly encourage you to subscribe to the list if you are interested in this kind of news.

https://lists.sourceforge.net/lists/listinfo/torch5-devel

---- end of disclaimer

I added the lab.apply() function to the lab package. Suppose you have a 1000x10000 random tensor and you want to apply a tanh to every element. You can:

    x = lab.rand(1000,10000)

    -- superfast, takes 0.7s
    z = lab.tanh(x)

    -- superslow, takes 44.7s
    z = torch.Tensor():resizeAs(x)
    for i=1,x:size(1) do
       for j=1,x:size(2) do
          z[i][j] = math.tanh(x[i][j])
       end
    end

    -- superslow, takes 15.5s
    -- we can do it this way here because we know the tensor is contiguous in memory
    z = torch.Tensor():resizeAs(x)
    local xs = x:storage()
    local zs = z:storage()
    for i=1,xs:size() do
       zs[i] = math.tanh(xs[i])
    end

    -- supercool, takes 1.4s
    z = lab.apply(x, math.tanh)

So this lab.apply() function is extremely efficient (considering it makes Lua calls underneath) and should be used instead of explicit loops whenever you can -- except, of course, when an optimized C function already exists, as it does for tanh().

Ronan.
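To illustrate the idea behind the fast variants, here is a minimal sketch in plain Lua of an element-wise apply over a flat array. This is an assumption-laden toy, not the torch implementation: a plain Lua table stands in for a tensor's contiguous storage, and the `apply` name here is just a local helper.

    -- Sketch only: mimics applying a function over a contiguous storage
    -- with a single flat loop, using a plain Lua table instead of a
    -- torch storage (hypothetical stand-in for illustration).
    local function apply(storage, f)
       local out = {}
       for i = 1, #storage do
          out[i] = f(storage[i])
       end
       return out
    end

    local x = {0, 1, -1}
    local z = apply(x, math.tanh)
    -- z[1] is 0; z[2] and z[3] are approximately 0.7616 and -0.7616

The single flat loop avoids the repeated sub-tensor creation that makes the nested z[i][j] indexing so slow; the real lab.apply() pushes that loop down into C, which is where the remaining speedup comes from.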