From: dmg <dmg...@uv...> - 2006-05-14 22:23:45
|
hi everybody,

I have not been able to connect to the CVS repository in the last 4 or 5 days. Does anybody have the same problem? I just get a timeout from cvs.sourceforge.net

-daniel

--
Daniel M. German
"If debugging is the art of removing bugs, then programming
 must be the art of inserting them."  -- Anonymous
http://turingmachine.org/
http://silvernegative.com/
dmg (at) uvic (dot) ca
replace (at) with @ and (dot) with .
|
From: Bruno P. <br...@po...> - 2006-05-14 22:45:17
|
On Sun 14-May-2006 at 15:24 -0700, dmg wrote:
>
> I have not been able to connect to the CVS repository in the last 4 or
> 5 days. Does anybody have the same problem? I just get a timeout from
> cvs.sourceforge.net

The CVS service has been rebuilt and now uses a different hostname, you have to do a fresh checkout, see the attached email. I haven't had time to try it yet myself.

--
Bruno
|
From: Daniel M. G. <dmg...@uv...> - 2006-05-16 15:14:47
|
It finally worked for me. I made small changes that fixed the compilation warnings in resample.

daniel

Bruno> On Sun 14-May-2006 at 15:24 -0700, dmg wrote:
>> I have not been able to connect to the CVS repository in the last 4 or
>> 5 days. Does anybody have the same problem? I just get a timeout from
>> cvs.sourceforge.net

Bruno> The CVS service has been rebuilt and now uses a different hostname,
Bruno> you have to do a fresh checkout, see the attached email.
Bruno> I haven't had time to try it yet myself.

--
Daniel M. German
"It is a kind of spiritual snobbery to think
 one can be happy without money."  -- Albert Camus
http://turingmachine.org/
http://silvernegative.com/
dmg (at) uvic (dot) ca
replace (at) with @ and (dot) with .
|
From: Daniel M. G. <dmg...@uv...> - 2006-05-16 20:14:40
|
Hi Max,

I am CC-ing the list, as I think that it is useful information to keep stored in the list archives and to open the discussion to others. I hope you don't mind.

Max> In any case, I thought I'd send you this e-mail to let you know
Max> about the changes I've made to PTMender and Pano Tools. I've
Max> fixed a number of issues that prevented it from compiling on my
Max> Win32/MinGW system, and have also managed to get the cropped
Max> TIFF output working with the fast transform code. This is
Max> exciting to me because PTMender can now output tiff_m files
Max> extremely quickly for very large projects. My list of changes
Max> is below.

Max> The tiff_m format is the simplest format that Pano Tools outputs. It doesn't
Max> need any feathering, mask creation, alpha channel addition, file flattening and
Max> so on. However, I think that it would be possible to adopt the approach used
Max> by tiff_m (i.e. only iterate over the region of interest of the output file
Max> covered by each input file) for other formats as well. For example, PTMender
Max> could use tiff_m as the intermediate format for flattened JPG output.

Yes, I think so. One thing that will greatly simplify PTmender is to encapsulate access to the pixels via a macro or a function. Unfortunately PTmender (and this is inherited from PTstitcher) uses a pointer to the area of interest and then increments this pointer as needed. This approach has two major problems: it requires a lot of memory for large panoramas (such as 360x180, in which most areas are black), and it promotes cloning of code (one copy for each resolution/type).

If I understand correctly, the main bottleneck (in terms of performance) was the mapping of the original images (which you have already updated to use cropped TIFFs). The other steps are not that computationally expensive. The feathering/flattening/masking steps work on the entire image space too.
One quick way to improve the memory use of PTmender is to encapsulate the access to a pixel in a function that encapsulates the image data (where it will only store the area of interest). This will save lots of memory, but it might result in a slight penalty per processed pixel, as one function call will be made per pixel processed (which I think is a small cost). In terms of memory usage this extra cost will probably be worth it if we avoid memory swapping. I think it is worth pursuing, but I'd like to hear opinions and alternatives before we make major changes. Another big incentive is that less disk space will be used (see below) and less data is read/written to disk. I think the final result might be faster post-processing, just by implementing the encapsulation routines.

This approach will be easier to implement than an entire rewrite of the feathering, masking, and flattening algorithms. And the best part is that these algorithms can be improved (over time) so that they do not need to request the empty pixels (outside the area of interest).

Max> covered by each input file) for other formats as well. For example, PTMender
Max> could use tiff_m as the intermediate format for flattened JPG output. The code

This is actually the way it currently works. PTmender first creates tiff_m and then flattens them for either PSD or JPG output (or other formats).

Max> that does the flattening into the final file would need to be modified so that
Max> it could work with cropped TIFF files and produce a final file the size of the
Max> entire output image, but this shouldn't be too hard. Similar adjustments would
Max> have to be made to the code that deals with masks/alpha channels/PSD creation
Max> if we want to extend this approach to all other file formats.

It actually needs to be done in all the steps.
This is the actual process to generate a JPG:

* Create TIFF_m of the mapped photographs (creates one TIFF per image)
* Compute Stitching Masks (creates two TIFFs per image, deletes two TIFFs per image, including the ones from the previous step)
* Feather (creates one TIFF per image, deletes one from the previous step)
* Flatten (deletes all previous TIFFs, generates one). At this point we have only one TIFF
* Convert this TIFF to whatever format is required (like JPG), delete the original TIFF if no longer required.

--
Daniel M. German
"Do not confuse luck with skill."  -- The Replacement Killers
http://turingmachine.org/
http://silvernegative.com/
dmg (at) uvic (dot) ca
replace (at) with @ and (dot) with .
|
From: Bruno P. <br...@po...> - 2006-05-16 21:03:48
|
On Tue 16-May-2006 at 12:54 -0700, Daniel M. German wrote:
>
> The feathering/flattening/masking steps work on the entire image space
> too. One quick way to improve the memory use of PTmender is to
> encapsulate the access to a pixel in a function that encapsulates the
> image data (where it will only store the area of interest).

I don't have any suggestions for improving the PTmender internals, but why bother doing feathering/masking/flattening at all? enblend does this so much better than PTStitcher.

This workflow looks good to me:

`PTmender` -> cropped TIFF files -> `PTblender` -> colour-corrected cropped TIFF files -> `enblend` -> feathered flattened TIFF file

There are things that could be improved:

* PTblender doesn't do LZW compressed TIFF (I think)
* PTblender doesn't work with cropped TIFF
* enblend doesn't adjust the mask position like smartblend

--
Bruno
|
From: Daniel M. G. <dmg...@uv...> - 2006-05-16 22:14:11
|
Bruno Postle twisted the bytes to say:

Bruno> On Tue 16-May-2006 at 12:54 -0700, Daniel M. German wrote:
>> The feathering/flattening/masking steps work on the entire image space
>> too. One quick way to improve the memory use of PTmender is to
>> encapsulate the access to a pixel in a function that encapsulates the
>> image data (where it will only store the area of interest).

Bruno> I don't have any suggestions for improving the PTmender internals,
Bruno> but why bother doing feathering/masking/flattening at all? enblend
Bruno> does this so much better than PTStitcher.

I don't think the feathering/masking algorithms are very good at all. But I do like the ability to maintain the layers. Will it be possible to modify enblend to output different layers (when no color correction is requested)? The main reason I don't use enblend is that it is too important for me to keep the layers separate so I can mask the regions myself (to avoid ghosts and the like).

The routines used in PTblender and PTmender are the same. But I agree, I have decided there is no point in doing feathering in PTmender (which means it will never be a full replacement for PTstitcher).

daniel

Bruno> This workflow looks good to me:
Bruno> `PTmender` -> cropped TIFF files -> `PTblender` -> colour-corrected cropped TIFF files -> `enblend` -> feathered flattened TIFF file
Bruno> There are things that could be improved:
Bruno> * PTblender doesn't do LZW compressed TIFF (I think)
Bruno> * PTblender doesn't work with cropped TIFF
Bruno> * enblend doesn't adjust the mask position like smartblend

--
Daniel M. German
"Do not confuse luck with skill."  -- The Replacement Killers
http://turingmachine.org/
http://silvernegative.com/
dmg (at) uvic (dot) ca
replace (at) with @ and (dot) with .
|
From: Bruno P. <br...@po...> - 2006-05-16 23:00:12
|
On Tue 16-May-2006 at 14:43 -0700, Daniel M. German wrote:
>
> I don't think the feathering/masking algorithms are very good at all.
> But I do like the ability to maintain the layers. Will it be possible
> to modify enblend to output different layers (when no color correction
> is requested)?

According to Andrew this isn't possible.

> The main reason I don't use enblend is that it is too
> important for me to keep the layers separate so I can mask the
> regions myself (to avoid ghosts and the like).

You can manually adjust the alpha channel before passing the images to enblend, which has the effect of moving the mask lines. This actually works very well:

http://www.panotools.info/mediawiki/index.php?title=How_to_use_enblend_for_patching_zenith_and_nadir_images

--
Bruno
|
From: Daniel M. G. <dmg...@uv...> - 2006-05-17 00:49:45
|
>> The main reason I don't use enblend is that it is too
>> important for me to keep the layers separate so I can mask the
>> regions myself (to avoid ghosts and the like).

Bruno> You can manually adjust the alpha channel before passing the images
Bruno> to enblend, which has the effect of moving the mask lines. This
Bruno> actually works very well:
Bruno> http://www.panotools.info/mediawiki/index.php?title=How_to_use_enblend_for_patching_zenith_and_nadir_images

I tried that approach but it did not work very well for me (mine were pictures with lots of parallax errors, as they were shot handheld). I needed the seams to be _exactly_ where I wanted them, and it does not appear to be possible to have that level of control. That frustration, in fact, was what got me to work on PTmender :) Perhaps I did not prepare the masks properly. I'll give it a try again.

--
Daniel M. German
"Beware of bugs in the above code; I have only proved it
 correct, not tried it."  -- Donald Knuth
http://turingmachine.org/
http://silvernegative.com/
dmg (at) uvic (dot) ca
replace (at) with @ and (dot) with .
|
From: Max L. <max...@ve...> - 2006-05-17 03:20:09
|
Daniel:

> One thing that will greatly simplify PTmender is to encapsulate access
> to the pixels via a macro or a function. Unfortunately PTmender (and
> this is inherited from PTstitcher) uses a pointer to the area of
> interest and then increments this pointer as needed. This approach has
> two major problems: it requires a lot of memory for large
> panoramas (such as 360x180, in which most areas are black), and it
> promotes cloning of code (one copy for each resolution/type).
>
> ...If we think in terms of
> memory usage this extra cost will probably be worth it if we avoid
> memory swapping. I think it is worth pursuing, but I'd like to hear
> opinions and alternatives before we make major changes.

The main pixel remapping routine (MyTransform) works on arbitrarily sized regions of each input image. As such, it is very memory efficient. As you note, the routines that do the blending, masking, alpha channel creation (and perhaps feathering) are less efficient, requiring each remapped image to be held in memory. I agree that there are probably more efficient ways to do these things.

Because the TIFF files used by Pano Tools are written in strips, they can be quite easily processed one row at a time. However, at least some of the routines in PTStitcher/PTMender operate one column at a time (see ComputeStitchingMask8bits, for example). Reading a TIFF file one column at a time would involve an awful lot of disk I/O.

In any case, I'm not sure how important it is to enhance all these routines. For me, at least, and I suspect many others, tiff_m output is really the most important format, as it can be fed into Enblend or other blending software. And this is the simplest format to generate. It is already working. So, I agree with your conclusion that adding feathering and support for other formats might be nice, and would be required for a complete replacement for PTStitcher, but is probably not crucial.

> Max> covered by each input file) for other formats as well.
> Max> For example, PTMender could use tiff_m as the intermediate
> Max> format for flattened JPG output. The code
>
> This is actually the way it currently works. PTmender creates first
> tiff_m and then flattens them for either PSD or JPG output (or other
> formats).
>
> It actually needs to be done in all the steps. This is the actual
> process to generate a JPG:
>
> * Create TIFF_m of the mapped photographs (creates one TIFF per image)
> * Compute Stitching Masks (creates two TIFFs per image, deletes two
>   TIFFs per image, including the ones from the previous step)
> * Feather (creates one TIFF per image, deletes one from previous step)
> * Flatten (deletes all previous TIFFs, generates one)
>
> At this point we have only one TIFF
>
> * Convert this TIFF to whatever format is required (like JPG), delete
>   original TIFF if no longer required.

Right. What I had meant to say is that it would be possible to modify PTMender to use cropped TIFFs as the intermediate format, rather than full-size TIFFs. It should be possible to rework the functions above to work with cropped TIFF files and just "pad" the intermediate cropped TIFF files as needed.

For example, the other "important" formats (from my point of view, of course) are flattened JPG and layered PSD. When writing flattened JPG files this would be during the blending stage (BlendLayers8Bit), and for layered PSD format this would be in the functions that write the PSD file (writeChannelData in file.c would be the most likely candidate).

I believe that working with cropped TIFF_m as the intermediate format would produce the most "bang for the buck" in terms of improving speed, reducing memory and storage requirements, and would probably solve 90% of all the problems people have with crashes in PTStitcher (frequently a result of running out of memory when trying to generate layered PSD files).
You asked in another thread if it would "be possible to modify Enblend to output different layers (when no color correction is requested)". I've written my own blending software based on the same multi-resolution splining algorithm used in Enblend, and can say that the answer is no. The multi-resolution splining algorithm doesn't adjust the colors or brightness in individual layers... it decomposes overlapping images into frequency bands, blends those, and then reconstitutes a final image by adding the blended frequency bands. As such, there isn't really any adjustment to the original layers. The Burt/Adelson paper explains all of this quite nicely:

http://web.mit.edu/persci/people/adelson/pub_pdfs/spline83.pdf

Having said this, I find that it is usually possible to produce a blended output image, and if there is a need for correction, I just layer the unblended layers on top of the blended output in Photoshop and manually fix any problem areas.

Lastly... thanks for all the hard work on PTMender. After years of being "stuck" with PTStitcher.exe, but no corresponding source code, it is a real pleasure to be able to work on this project. If Helmut Dersch is reading, I hope he is able to take some satisfaction from the continued evolution of his original software.

Max
|
From: Daniel M. G. <dmg...@uv...> - 2006-05-17 22:03:26
|
Max> In any case, I'm not sure how important it is to enhance all
Max> these routines. For me, at least, and I suspect many others,
Max> tiff_m output is really the most important format as it can be
Max> fed into Enblend or other blending software. And, this is the
Max> simplest format to generate. It is already working. So, I
Max> agree with your conclusion that adding feathering and support
Max> for other formats might be nice, and would be required for a
Max> complete replacement for PTStitcher, but probably not crucial.

Last night I was fixing parallax on an 8 Mbyte picture (it required a fair amount of correction as it was shot upwards at an angle of approx 45 degrees, and it was taken with an EF-S 10-22 at 10mm). I was very surprised when my machine started to run out of disk (due to swapping) and started thrashing like crazy (I had lots of apps open, but I have 3 GB of memory). To my surprise the generated TIFF was almost 1 GByte. It was 1 image, 16 bit. Only 1! Of course I did not use compression nor cropping.

Perhaps we should modify the default behaviour to use compression and potentially cropping.

--
Daniel M. German
"A first-rate laboratory is one in which mediocre scientists
 can produce outstanding work"  -- Patrick Maynard Stuart Blackett
http://turingmachine.org/
http://silvernegative.com/
dmg (at) uvic (dot) ca
replace (at) with @ and (dot) with .
|
From: Max L. <max...@ve...> - 2006-05-18 12:53:10
|
> Last night I was fixing parallax on an 8Mbyte picture... To my surprise
> the generated TIFF was almost 1 GByte. It was 1 image, 16 bit. Only 1!
> Of course I did not use compression nor cropping.
>
> Perhaps we should modify the default behaviour to use compression and
> potentially cropping.

I think that this would be a very good way to speed things up and reduce memory/storage requirements. However, I'm puzzled why an 8 megabyte (did you mean megapixel?) image could get blown up to a 1000 megabyte image. Even using 16 bits per channel, and 4 channels per pixel, that still means you had a 125 megapixel image. Why was it so large?

Max
|
From: Max L. <max...@ve...> - 2006-05-19 06:22:36
|
> > I believe that working with cropped TIFF_m as the intermediate format would
> > produce the most "bang for the buck" in terms of improving speed, reducing
> > memory and storage requirements and would probably solve 90% of all the
> > problems people have with crashes in PTStitcher (frequently as a result of
> > running out of memory when trying to generate layered PSD files).
>
> Using cropped TIFF end to end would be a tremendous help in speed and memory.

I've got a working version of PTMender that generates layered PSD format but uses cropped TIFF as its intermediate file format. It runs much quicker using cropped TIFF "end to end". I've still got a bit more code cleanup to do, but the results are encouraging so far. I hope to be finished soon.

Max
|
From: Jim W. <jwa...@ph...> - 2006-05-17 13:31:32
|
On Tue, May 16, 2006 11:19 pm, Max Lyons said:
> Because the TIFF files used by Pano Tools are written in strips,
> they can be quite easily processed one row at a time. However, at least some
> of the routines in PTStitcher/PTMender operate one column at a time (see
> ComputeStitchingMask8bits, for example). To read a TIFF file one column at a
> time will involve an awful lot of disk I/O.

If memory serves me correctly, many functions have already been rewritten from column to row processing. PTMender should follow and change to rows.

> In any case, I'm not sure how important it is to enhance all these routines.
> For me, at least, and I suspect many others, tiff_m output is really the most
> important format as it can be fed into Enblend or other blending software.
> And, this is the simplest format to generate. It is already working. So, I
> agree with your conclusion that adding feathering and support for other formats
> might be nice, and would be required for a complete replacement for PTStitcher,
> but probably not crucial.

We do need a flattened result for the low-res quick preview before committing to the final result. For this case I would rather have no feathering and blending, but have the seams in the center of the overlap so I can better see the alignment. So we still need to create masks and flatten. A fast feathering method may or may not blend very well; leave the smooth blending to Enblend.

> I believe that working with cropped TIFF_m as the intermediate format would
> produce the most "bang for the buck" in terms of improving speed, reducing
> memory and storage requirements and would probably solve 90% of all the
> problems people have with crashes in PTStitcher (frequently as a result of
> running out of memory when trying to generate layered PSD files).

Using cropped TIFF end to end would be a tremendous help in speed and memory.
The crashes with PTStitcher that I investigated were the result of a memory allocation that happened in pano12.dll and was then freed in PTStitcher. If both are compiled with the same libraries there would not be a problem, but that cannot always be controlled. What I propose is to add a function to pano12.dll to free memory that it allocated. The function in question was the one that reads the script file and passes back a pointer to a char array. There may be others. This is the one thing I have been hoping to add ever since a replacement for PTStitcher was started.

Jim Watters
|
From: Daniel M. G. <dmg...@uv...> - 2006-05-17 22:03:23
|
Jim> Using cropped TIFF end to end would be a tremendous help in speed and memory.

Jim> The crashes with PTStitcher that I investigated were the result
Jim> of a memory allocation that happened in pano12.dll and then was
Jim> freed in PTStitcher. If both are compiled with the same
Jim> libraries there would not be a problem. But that can not always
Jim> be controlled. What I propose is to add a function to
Jim> pano12.dll to free memory that it allocated. The function in
Jim> question was the one that reads the script file and passed back
Jim> a pointer to a char array. There may be others. This is the
Jim> one thing I was hoping to add ever since a replacement to
Jim> PTStitcher was started.

LoadScript is probably the function you are referring to. PTmender frees it. I think I discovered one or two unfreed pointers when I was working on these routines, but they now release the memory.

The memory has to be released by the caller, otherwise we never know when to release it (LoadScript returns a pointer). See LoadScript in main (PTmender.c).

I don't think this is a major problem, though. It is called twice, and it does not request too much memory. In the worst case you have two pointers that are never freed, but the program should not crash.

Perhaps I am misunderstanding the problem.

Jim> Jim Watters

--
Daniel M. German
"Technology now more often arouses apocalyptic ecstasies or visions of
 the kingdom of God than rational reflection"  -- Jacques Ellul
http://turingmachine.org/
http://silvernegative.com/
dmg (at) uvic (dot) ca
replace (at) with @ and (dot) with .
|
From: Jim W. <jwa...@ph...> - 2006-05-17 22:57:21
|
On Wed, May 17, 2006 3:39 pm, Daniel M. German said:
>
> Jim> Using cropped TIFF end to end would be a tremendous help in speed and memory.
>
> Jim> The crashes with PTStitcher that I investigated were the result
> Jim> of a memory allocation that happened in pano12.dll and then was
> Jim> freed in PTStitcher. If both are compiled with the same
> Jim> libraries there would not be a problem. But that can not always
> Jim> be controlled. What I propose is to add a function to
> Jim> pano12.dll to free memory that it allocated. The function in
> Jim> question was the one that reads the script file and passed back
> Jim> a pointer to a char array. There may be others. This is the
> Jim> one thing I was hoping to add ever since a replacement to
> Jim> PTStitcher was started.
>
> LoadScript is probably the function you are referring to. PTmender
> frees it. I think I discovered one or two unfreed pointers when I was
> working on these routines, but they now release the memory.
>
> The memory has to be released by the caller, otherwise we never know
> when to release it (LoadScript returns a pointer). See LoadScript in
> main (PTmender.c).
>
> I don't think this is a major problem, though. It is called twice, and
> it does not request too much memory. In the worst case you have two
> pointers that are never freed, but the program should not crash.
>
> Perhaps I am misunderstanding the problem.
>
> Daniel M. German

Yes, LoadScript is the function. The problem is when two different compilers are used, one for PTStitcher and a different one for pano12. If the different compilers use totally different allocation and deallocation routines it can cause a crash. What I propose is a function in pano12 that would take a pointer to memory that it previously allocated, so the memory can be deallocated by its sister function.

This is what caused some people problems with the pano12.dll that I compiled with MSVS.net. Although it runs up to 50% faster, sometimes when PTStitcher (which was compiled with MinGW) deallocates the memory it would crash. The allocation and deallocation functions need to match. When there are multiple applications and libraries all trying to work together, it is not possible to control what compiler is used for each one. Pano12 already has pairs of functions for allocating and freeing handles; we just need to add the freeing of pointers.

Jim Watters
|