## panotools-devel — Panorama Tools software developers


Showing 15 results of 15

[PanoTools-devel] Inner rectangle of an image From: dmg - 2006-10-27 22:22:09

```
Max mentioned that the problem was more difficult than it looks.
Unfortunately he wasn't willing to share his solution.

I spent some time thinking about it today. The main problem is that,
for a given polygon, the number of rectangles bound by the perimeter of
the polygon could be infinite (in the continuous world). So it is
necessary to add constraints. The simplest two are:

1. maximize the rectangle's area, and
2. align the rectangle with the axes

This problem, I just learnt, is typical of many industries, including
the metal one. I found this paper that solves it in the general domain:

http://citeseer.ist.psu.edu/daniels97finding.html

@article{255874,
  author    = {Karen Daniels and Victor Milenkovic and Dan Roth},
  title     = {Finding the largest area axis-parallel rectangle in a polygon},
  journal   = {Comput. Geom. Theory Appl.},
  volume    = {7},
  number    = {1-2},
  year      = {1997},
  issn      = {0925-7721},
  pages     = {125--148},
  doi       = {http://dx.doi.org/10.1016/0925-7721(95)00041-0},
  publisher = {Elsevier Science Publishers B. V.},
  address   = {Amsterdam, The Netherlands},
}

It also cites a reference for the orthogonal variant, O(n log^3 n),
which is the one we are facing.

--
Daniel M. German
http://turingmachine.org/
http://silvernegative.com/
dmg (at) uvic (dot) ca
replace (at) with @ and (dot) with .
```
Re: [PanoTools-devel] Q From: Daniel M. German - 2006-10-27 21:21:22

```
JD> Option B sounds good. If I were to completely gut PTblender, I'd:

Great ideas. JD, we have a plan. I am going to add your description to
the TODO.

I suggest the following: let us get the first 3 points done first (I am
duplicating them below). They can be done incrementally with the
current code base, so I suggest writing this functionality first as the
goal. It will not be required for 3.0.0.

Please check the Apache coding standards:

http://httpd.apache.org/dev/styleguide.html

Function names are camel case, with "pano" always as a prefix (so we
know they are our routines) and next the module they belong to
(Colour). For instance, ReadHistograms should be renamed
panoColourHistogramsRead. Local identifiers are also camel based, but
there is no real restriction except that they be self-describing. Types
are pano_whatever. The current code does not obey these rules, but new
code should.

dmg

----------------------------------------------------------------------
- Pre-compute which images could actually overlap from their TIFF
  offsets, adding only these to a linked list of pairs. Might as well
  support cropped TIFFs where possible. This will really help people
  who do >20 image multi-row sphericals (since the current algorithm
  loops over all pixels in the image N^2 times). For such panos, it may
  even be worth calling PTcrop (when it exists) first on the uncropped
  images.

- Replace the two inner nested loops in ReadHistogram with one loop
  over the linked list of "possible match" images, and invert the order
  of the loops:

  for (each row) {
     read_row_from_images(row, &row_buffer);  // careful with crop
     for (each match in matching_images_list) {
        if (row intersects both image boundaries) {
           for (each pix in row) {
              if (pixel_include(row, pix, im1, im2, trim))
                 add_to_histogram(pix, match);
           }
        }
     }
  }

- Factor out the code which decides whether to use a given pixel in the
  histogram into a separate function (pixel_include() above), and pass
  it an options structure which gives it what it needs to know (the
  optional trim factors, etc., called 'trim' above). This is also where
  separate mask data could be used, but the "graymask" method currently
  employed may obviate that.
----------------------------------------------------------------------

--
Daniel M. German
"Science can be esoteric; technology has to be pragmatic" -- The Economist
http://turingmachine.org/
http://silvernegative.com/
dmg (at) uvic (dot) ca
replace (at) with @ and (dot) with .
```
Re: [PanoTools-devel] Q From: JD Smith - 2006-10-27 20:13:59

```
On Thu, 26 Oct 2006 15:54:21 -0700, Daniel M. German wrote:

> JD Smith twisted the bytes to say:
>
> JD> On Thu, 2006-10-26 at 15:14 -0700, Daniel M. German wrote:
> >>
> >> Ok, let us put the code in, but in a more generalizable way.
> >>
> >> Instead of adding 2 parms, let us add only one (a pointer to a
> >> struct).
> >>
> >> The struct will contain:
> >>
> >> * Any info needed (in this case the 2 thresholds)
> >>
> >> * A pointer to a function to call to determine if one should use
> >>   the points. This function should accept:
> >>
> >> What do you think?
>
> JD> Sounds reasonable. The problem is, what does that function take as
> JD> input, RGB/HSV for both pixels? Pre-compute them all before
> JD> deciding whether to proceed? Would slow things down a bit.
>
> This is an important design decision, I think. This is where C++
> would help.
>
> Why don't we discuss it until we are all happy:
>
> Proposal A:
>
> * Refactor a function for the computation of the histograms for a
>   given line of the images (it is currently embedded in the middle).
>
> * Create one function to call for each method.
>
> * Add a switch statement that determines which method to call.
>
> Proposal B:
>
> * Replace the predicate of the if that determines whether the method
>   is to be used with a function call. This function call will take 2
>   parameters:
>
>   1. A struct with information about what type of method to use (this
>      is the one passed to ReadHistograms).
>
>   2. A struct with info about the pixels (RGB, or HSV, and perhaps
>      more data, such as the neighborhood).
>
>   This might also require a switch to decide what type of information
>   needs to be computed (if one wants it optimized for speed).
>
> I don't think A and B are mutually exclusive. I think we can start
> with B.
>
> I'd like to hear what you think.

Option B sounds good.
If I were to completely gut PTblender, I'd:

- Pre-compute which images could actually overlap from their TIFF
  offsets, adding only these to a linked list of pairs. Might as well
  support cropped TIFFs where possible. This will really help people
  who do >20 image multi-row sphericals (since the current algorithm
  loops over all pixels in the image N^2 times). For such panos, it may
  even be worth calling PTcrop (when it exists) first on the uncropped
  images.

- Replace the two inner nested loops in ReadHistogram with one loop
  over the linked list of "possible match" images, and invert the order
  of the loops:

  for (each row) {
     read_row_from_images(row, &row_buffer);  // careful with crop
     for (each match in matching_images_list) {
        if (row intersects both image boundaries) {
           for (each pix in row) {
              if (pixel_include(row, pix, im1, im2, trim))
                 add_to_histogram(pix, match);
           }
        }
     }
  }

- Factor out the code which decides whether to use a given pixel in the
  histogram into a separate function (pixel_include() above), and pass
  it an options structure which gives it what it needs to know (the
  optional trim factors, etc., called 'trim' above). This is also where
  separate mask data could be used, but the "graymask" method currently
  employed may obviate that.

- Simplify the actual histogram remapping and subsequent color
  correction code:

  1. Always match all three histograms, RGB. Impose "brightness only"
     or other constraints on the mapping functions at the very end (see
     below). No HSV computations are ever performed.

  2. Use a single routine to compute a mapping function (table) from
     histogram 1 (source) to histogram 2 (target). This routine will
     simply:

     a. Form cumulative totals of both histograms.
     b. Create the 256-element floating point mapping function z which
        maps between them (one for each of RGB). This function will be
        called many times, so it needs to be short and sweet.

  3. Build a ragged array of length n_images, with each element holding
     a linked list of all other images to which it matches, keeping
     track of pixel overlap count, and omitting matches without enough
     pixels in overlap.

  4. Compute the floating point mapping functions z for all pairs in
     the ragged array. There is one z per pair for each of RGB.

  5. "Anneal" the (potentially long list of) mapping functions z over
     the entire image:

     a. For each image, compute a master mapping function m for the
        image, from the overlapping-pixel-count-weighted average of all
        the modified sub-functions to all neighbors.
     b. The modified sub-function z' to a neighbor will depend on i)
        the mapping function between the two, z, and ii) the master
        function m of the neighbor, as: z' = m^-1 z. The inverse of a
        mapping function m is that function which, when m is run
        through it, produces the unit vector (0..255). In the first
        round, all master functions are set to the unit vector
        (0..255), and z' = z.
     c. Repeatedly iterate over all images in this way until all master
        mapping functions converge. Convergence can progress
        non-uniformly (image by image); each image is marked as
        converged once its master function converges.

     Note that a reference image is no longer needed... the average
     best mapping to make all images compatible is automatically
     developed (e.g. for a range of brightnesses, the "average"
     brightness will be targeted). If a reference image is desired, it
     is marked "converged" before the first round of annealing, and
     everything proceeds in the same manner.

  6. Normalize each image's annealed master mapping function, subject
     to the (optional) user constraints:

     -t 1 (brightness only): m(r) = m(g) = m(b) (one average table).
     -t 2 (color only):      m(r) + m(g) + m(b) = I (unit vector)

  7. Convert all normalized, annealed master mapping tables to byte, by
     rounding, perhaps with some care taken to avoid banding caused by
     large gaps.

  8. Run each image's data through its master mapping table and write
     out to the output image.

- No flattening (separate tool).

- Add an optional debug switch to enable all that Debug.txt output
  (just to stdout).

JD
```
Re: [PanoTools-devel] Preparing for version 3.0.0: a roadmap From: Daniel M. German - 2006-10-27 16:13:13

```
Max Lyons twisted the bytes to say:

>> Daniel wrote:
>> So PTmender is deprecated, PTremap will take its place.

Max> When you say deprecated, what does that mean?

>> Are you talking as the open source developer or as a tawbaware
>> developer? I am paraphrasing you.
>> In my opinion, if you want PTstitcher...
>> If you really want a PTmender, would you be willing to maintain it?
>> Open source tools languish only when no open source developer cares
>> for them. Here is where you can make a difference. You can continue...
>> Testing PTmender has always been a pain to maintain, and almost
>> nobody (including you) seems to care to contribute...

Max> Yes.

Max> However, I'm not sure I saw an answer to my original question in
Max> there.

Hi Max,

Neither did you answer my questions.

Max> Would it be fair to say that the answer to what you had in mind
Max> when you declared that PTMender would be "deprecated" is that you
Max> won't rework PTMender into something different than it is now
Max> (i.e. remove existing features), but you will stop further
Max> development on it in favor of other tools (PTRoller, PTRemap,
Max> etc.)?

As of today, PTmender does not do any post-processing. It will be
modified to print a notice directing users to PTremap. I will continue
the development of the rest of the tools. In terms of features, the
only cumulative loss is that PTtools will not generate JPEGs or PNGs.

If you (or tawbaware) are serious about maintaining PTmender, create a
branch in SVN and add the code there (or hire somebody to do it--not
me). Create a framework that supports testing of all the combinations
(not necessarily tests for it) and then we will discuss its inclusion
back into the tools. It would also be nice if you added fisheye support
and feathering too (not that difficult, really).

dmg

--
Daniel M. German
"...there is nothing more to know; but in a scientific pursuit there is
continual food for discovery and wonder." -- Mary Shelley (Frankenstein)
http://turingmachine.org/
http://silvernegative.com/
dmg (at) uvic (dot) ca
replace (at) with @ and (dot) with .
```
Re: [PanoTools-devel] Preparing for version 3.0.0: a roadmap From: Jim Watters - 2006-10-27 13:33:05

```
Max Lyons wrote:
> Would it be fair to say that the answer to what you had in mind when
> you declared that PTMender would be "deprecated" is that you won't
> rework PTMender into something different than it is now (i.e. remove
> existing features), but you will stop further development on it in
> favor of other tools (PTRoller, PTRemap, etc.)?
>
> Thanks,
>
> Max

I have been out of the loop of things for too long.

I do like the idea of individual tools to do things. But we also need
the PTStitcher replacement. When the replacement was started I was
imagining something not much more complicated than PTOptimizer: it just
takes the script and calls the right functions in the pano12 library.

I do not want to see duplicated code everywhere for similar tools. I
would like to see most of the code in the library. I have not had a
chance to look at the code in a long time, and this may already be the
case. The difference between a tool that knows how to do one thing and
a tool that can do many things in sequence should not be that great.

I would also like all the tools released statically linked to
PanoTools. If this is the case then it should be fairly easy to
maintain all the tools.

It is true that the PanoTools library may need some of the code
separated into a different library for file handling, but this is a
different discussion.

--
Jim Watters
jwatters @ photocreations . ca
http://photocreations.ca
```
Re: [PanoTools-devel] Preparing for version 3.0.0: a roadmap From: Max Lyons - 2006-10-27 12:51:12

```
> >> Daniel wrote:
> >> So PTmender is deprecated, PTremap will take its place.
>
> Max> When you say deprecated, what does that mean?
>
> Are you talking as the open source developer or as a tawbaware
> developer? I am paraphrasing you.
> In my opinion, if you want PTstitcher...
> If you really want a PTmender, would you be willing to maintain it?
> Open source tools languish only when no open source developer cares
> for them. Here is where you can make a difference. You can continue...
> Testing PTmender has always been a pain to maintain, and almost
> nobody (including you) seems to care to contribute...

Yes.

However, I'm not sure I saw an answer to my original question in there.
Would it be fair to say that the answer to what you had in mind when
you declared that PTMender would be "deprecated" is that you won't
rework PTMender into something different than it is now (i.e. remove
existing features), but you will stop further development on it in
favor of other tools (PTRoller, PTRemap, etc.)?

Thanks,

Max
```
[PanoTools-devel] More changes... metadata and PSDs From: dmg - 2006-10-27 09:39:14

```
Hi Everybody,

* PTtools now maintain the script in the ImageDescription field of the
  TIFF. Each TIFF also maintains the image number and total number of
  images (as in the script) in the PAGENUMBER TIFF field.

* PStiff2psd now supports specifying the blending mode for each of the
  layers.

I have reduced the size of the TODO list a lot (it is now in the main
directory of panotools). Hopefully I can finish everything that I plan
for 3.0.0 before the end of the week. After that we will move into a
bug fixing period until we are certain the tools are robust.

--
dmg

--
Daniel M. German
http://turingmachine.org/
http://silvernegative.com/
dmg (at) uvic (dot) ca
replace (at) with @ and (dot) with .
```
[PanoTools-devel] PTuncrop From: dmg - 2006-10-27 06:42:37

```
Hi Bruno, and everybody else interested:

PTuncrop and all the other PTtools are now working with cropped TIFFs
without full size information (such as those output by nona). The full
size is assumed to be: = +

(Version 2.8.5pre11)

--
Daniel M. German
"One thing I have learned in a long life: that all our science,
measured against reality, is primitive and childlike--and yet it is the
most precious thing we have." -- Albert Einstein
http://turingmachine.org/
http://silvernegative.com/
dmg (at) uvic (dot) ca
replace (at) with @ and (dot) with .
```
Re: [PanoTools-devel] Preparing for version 3.0.0: a roadmap From: Daniel M. German - 2006-10-27 05:10:39

```
Max Lyons twisted the bytes to say:

>> Daniel wrote:
>> So PTmender is deprecated, PTremap will take its place.

Max> Is this the final consensus that we've reached, or is this still
Max> open for discussion?

It is still open for discussion.

Max> When you say deprecated, what does that mean? Will it be removed,
Max> and I should get ready to rewrite all the documentation and
Max> PTAssembler code to deal with some other program?

Are you talking as the open source developer or as a tawbaware
developer? I am paraphrasing you. Perhaps it will be important to
clarify.

In my opinion, if you want PTstitcher compatibility there is nona. If
you want features found in PTblender then call nona, then PTblender. If
you really need PTremap, well, you can call it too.

If you really want a PTmender, would you be willing to maintain it?

Max> Or, will it just languish in a state of neglect until it becomes
Max> so outdated as to be useless/incompatible with newer code?

Open source tools languish only when no open source developer cares for
them. Here is where you can make a difference. You can continue to
maintain a tool for as long as you (or tawbaware) want. pano12 or
PTmender will not languish if you keep maintaining them, for instance.

Max> Take the logic in PTCommon and break it into two functions...one
Max> that does the remapping, and one that does the image finalization
Max> (flattening, feathering, etc.). Create two new programs (PTRemap
Max> and PTFinalize) that call each of these two functions. Then,
Max> restructure PTMender so that it calls the remapping and the
Max> finalizing code in sequence. This way,

Testing PTmender has always been a pain to maintain, and almost nobody
(including you) seems to care to contribute to the creation of a test
suite. The number of potential execution traces grows dramatically as
we add more features to it.

dmg

--
Daniel M. German
http://turingmachine.org/
http://silvernegative.com/
dmg (at) uvic (dot) ca
replace (at) with @ and (dot) with .
```
Re: [PanoTools-devel] Preparing for version 3.0.0: a roadmap From: Daniel M. German - 2006-10-27 05:10:39

```
Hi Max,

Max> In fact, I've been thinking along these lines, and already created
Max> a "PTFinalize" program (mentioned in my posting yesterday) that
Max> takes the TIFF_m format and converts it to whatever format (jpg,
Max> psd, png, tif, etc.) the user specifies. I did this by reworking
Max> the panoCreatePanorama function in PTCommon.c into two functions
Max> and writing a new program to take command line arguments and call
Max> the new function that converts the TIFF_m images into their final
Max> format. I'm happy to contribute this to the repository if folks
Max> will find it useful. If this won't fit into the general plan, then
Max> I won't contribute it.

PTroller (extracted from PTblender) will take a set of TIFFs and create
a single TIFF, so PTfinalize should not duplicate this functionality. I
think it is a good idea to have a PTfinalize that would:

* Take one single TIFF image and convert it to a desired format: JPEG,
  PSD, PNG or any obscure format. Hopefully one day we can generate
  QTVRs. But I'd like to see panotools metadata being preserved in
  JPEGs. Otherwise ImageMagick does a better job at creating JPEGs from
  TIFFs, and we don't need to waste our time recreating what others do.

Max> On a more general level, I'd suggest trying to make as few
Max> external changes as possible. Not only does this end up confusing
Max> folks, but it probably lessens the chances of adoption. Programs
Max> that change names, remove features (and so on) probably don't end
Max> up endearing themselves to the average user. In my opinion, of
Max> course.

This is one of the reasons that I want to do this before the major
release.

--
Daniel M. German
"When asked whether or not we are Marxists, our position is the same as
that of a physicist or a biologist who is asked if he is a 'Newtonian',
or if he is a 'Pasteurian'" -- Che Guevara
http://turingmachine.org/
http://silvernegative.com/
dmg (at) uvic (dot) ca
replace (at) with @ and (dot) with .
```
Re: [PanoTools-devel] Preparing for version 3.0.0: a roadmap From: Max Lyons - 2006-10-27 04:12:58

```
> Max wrote:
> I think that the idea of creating small utility programs for subsets
> of the stitching process is a reasonable idea, but I'd suggest leaving
> PTMender as the "superset" program. There is about a year's worth of
> written material on internet forums, web sites, discussion groups,
> e-mail lists (etc.) describing PTMender as a replacement for
> PTStitcher. Stripping out a big chunk of its feature set at this point
> would end up confusing a lot of people.

> Pablo wrote:
> Hmm, as Max has noticed, changing the interface of the tools
> extensively has to be well thought out. While I personally don't have
> a problem with a bare bones PTmender, some people might be surprised.

> Daniel wrote:
> So PTmender is deprecated, PTremap will take its place.

Is this the final consensus that we've reached, or is this still open
for discussion?

When you say deprecated, what does that mean? Will it be removed, and I
should get ready to rewrite all the documentation and PTAssembler code
to deal with some other program? Or, will it just languish in a state
of neglect until it becomes so outdated as to be useless/incompatible
with newer code? Or, will PTMender continue to exist as the superset
program that calls PTRemap and PTFinalize (or whatever the individual
utility programs are called)? Or, something else?

My own suggestion for a "roadmap" is as follows: take the logic in
PTCommon and break it into two functions...one that does the remapping,
and one that does the image finalization (flattening, feathering,
etc.). Create two new programs (PTRemap and PTFinalize) that call each
of these two functions. Then, restructure PTMender so that it calls the
remapping and the finalizing code in sequence. This way, PTMender would
retain its existing features (and not confuse anyone or break existing
code that depends on it), and we'd have two new utility programs as
suggested in Daniel's earlier message.

Max
```
[PanoTools-devel] LZW processing From: dmg - 2006-10-27 01:25:22

```
I have just updated the TIFF reading routine so it does not require a
predictor field for LZW compressed files. Can people who have been
having this problem try it and see if the problem is, indeed, solved?

The version of PTtools should be 2.8.5pre9 (currently in SVN).

--
Daniel M. German
"Science can be esoteric; technology has to be pragmatic" -- The Economist
http://turingmachine.org/
http://silvernegative.com/
dmg (at) uvic (dot) ca
replace (at) with @ and (dot) with .
```
Re: [PanoTools-devel] Q From: JD Smith - 2006-10-27 01:20:09

```
On Wed, 25 Oct 2006 14:40:18 -0400, Jim Watters wrote:

> Daniel M. German wrote:
>> JD Smith twisted the bytes to say:
>>
>> JD> 0     ignore data (e.g. for flattening)
>> JD> 255   process data
>> JD> other process data, but exclude pixels from histogram correction
>> JD>       estimate
>>
>> I just committed the change to support this feature in PTblender
>> (version 2.8.5pre7):
>>
>> --------------------------------------------------------------------
>> 2006-10-25  dmg
>>
>>     * version.h (VERSION), configure.ac: Upgraded to version
>>       2.8.5pre7
>>
>>     * ColourBrightness.c (ReadHistograms): Compute histograms only
>>       when mask == 255. Ignore otherwise.
>>
> Do we need another tool or option that would compare the overlap of
> the two images and create a mask where the difference is greater than
> some cutoff value? The purpose of this would be to eliminate objects
> in the frame that have moved or are missing in the other. I am
> thinking mostly of people with brightly colored clothing moving
> around. These masks would only be used in calculating the histogram
> correction and not for blending the final pano. Something similar
> could be used to determine where the blending seams should be.

That's essentially what my -u and -i options do for you (see
contributed patch), and the "gray-masking" capability Daniel added does
as well for hand painting such areas: values not 0 or 255 constitute
such a secondary mask, and luckily enblend considers any number >= 1 to
be "on" for blending.

I'd be happy to hear of people's experience with hue trimming (in
particular) for cases where color shifts are a problem.

JD
```
Re: [PanoTools-devel] Preparing for version 3.0.0: a roadmap From: Daniel M. German - 2006-10-27 00:47:03

```
Daniel> I agree, for that reason I think that using nona is good enough
Daniel> for non-case users. PTtools is really intended for powerusers.

Sorry, I meant to say: "nona is good enough for casual users".

--
dmg
```
Re: [PanoTools-devel] Q From: JD Smith - 2006-10-27 00:05:15

```
> Max comments that I like to "beautify code" while maintaining (and
> sometimes reducing) functionality. It is true. Because at the end of
> the day I like to work with good, professional code. Panotools is not
> there yet (and I am also responsible for that), but I don't want to
> move backwards. That is why I have been working on improving the code
> base.

Total agreement from this quarter. The knock-on effect of beautiful,
well-documented, well-factored code is that the barrier to entry for
new programming talent is much lower, and thus the code improves
faster, lives longer and has a better chance of being kept up to date.

> If you look at the past of panotools, it was on life support. Most
> changes in the previous years were minor and designed to keep it
> afloat. Part of the reason is there is very little interest among its
> users in improving it.

Well, in reality PanoTools may have become a liability for some
developers; witness e.g. how Joost has moved most pano functionality
into the closed part of PTGui. Hugin has also reimplemented much
PanoTools functionality in a modern OOP generic programming framework,
which has real advantages (though a low barrier to entry for the
average programmer isn't one).

If this code is to remain relevant, it needs people such as yourself
who think long term and big picture. Thanks for your work.

JD
```
