A friend of mine asked whether he could use a monitor screen to display a test target. He likes the idea because the screen is perfectly flat, air humidity does not affect it, and any test image can be shown at different scales. I told him that this probably should not be done, but that I would ask the author of the program for a definitive answer, since perhaps it can work from a long distance, so that the test target on the monitor screen is of the same quality as one printed on a printer.
Vlad
I think it would be fine to use a monitor as a test chart, with the same general rules regarding absolute vs relative measurements, meaning as long as you test at the same distance from the chart with different lenses, then your relative results should be consistent and comparable.
Note that the pixels of the monitor will generally be noticeably larger than the dots of a printed test chart, but on the other hand you can display the test charts with a bit of anti-aliasing on the edges, using the gray levels of the monitor to produce a less-jagged edge. So I see this going two ways:
1. If you use a strict black/white rendering of the test chart on the monitor, then you will see more contrast at higher frequencies (above Nyquist, I expect), meaning it will look a bit like noise in your measured SFR. But using a black/white rendering of the chart will probably give you the highest possible resolution values using a given monitor.
2. If you allow some gray-scale smoothing (anti-aliasing) in the rendering of the edges, then you should obtain "cleaner" SFR curves that produce more repeatable / robust results, but your absolute resolution values will be lower.
(I think most software, especially PDF viewers, will display the MTF Mapper test charts with some anti-aliasing along the edges, i.e., option 2 above, but I think you can use Inkscape or GIMP to control this if you want to try option 1.)
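As a rough illustration, here is a quick numpy/PIL sketch (my own, not part of MTF Mapper) that renders a slanted edge both ways, by supersampling and then either averaging down (option 2) or re-quantising to pure black/white (option 1):

```python
import numpy as np
from PIL import Image

def slanted_edge(width=400, height=300, angle_deg=5.0, oversample=8):
    """Render a slanted black/white edge; return (option 1, option 2) images."""
    y, x = np.mgrid[0:height * oversample, 0:width * oversample]
    t = np.tan(np.radians(angle_deg))
    # White on one side of a line tilted angle_deg from vertical, black on the other.
    sub = np.where((x - 0.5 * width * oversample) > t * (y - 0.5 * height * oversample),
                   255.0, 0.0)
    # Option 2: box-average the oversampled rendering -> gray anti-aliased edge.
    aa = sub.reshape(height, oversample, width, oversample).mean(axis=(1, 3))
    # Option 1: re-quantise to pure black/white (each pixel takes the majority value).
    hard = np.where(aa > 127.5, 255, 0)
    return Image.fromarray(hard.astype(np.uint8)), Image.fromarray(aa.astype(np.uint8))

hard, aa = slanted_edge()
hard.save("edge_hard.png")  # option 1: jagged stair-step edge
aa.save("edge_aa.png")      # option 2: smoothed edge using the monitor's gray levels
```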
Of course everything depends on the effective magnification factor in the final image, and the resolution of the monitor (a 4k monitor would be preferable over a 1080p monitor, for example), but overall I think it is worth trying a monitor. I would offer to do some experiments on this myself, but I have a significant backlog of work on the software itself, so I cannot promise such a comparison right now.
Regards,
Frans
"I think it would be fine to use a monitor as a test chart, with the same general rules regarding absolute vs relative measurements, meaning as long as you test at the same distance from the chart with different lenses, then your relative results should be consistent and comparable."
Yes, you are right. We tested 3 lenses using a monitor screen as a test target, with constant test conditions (light, distance). The resulting test charts ranked the lenses in the same order as shooting in the real world: the good lens took first place and the bad lens took last place. Even the differences between the test charts closely matched the differences seen in real-world shooting.
We even looked at field curvature: one lens had it, and in the tests we saw the field curvature in the same part of the image.
Absolute measurements are of course very good, but usually the photographer just wants to know whether a new lens is better or worse than the old one he already has. Here again, relative testing is very helpful.
We opened the *.pdf test target in Adobe Photoshop, rotated the image to horizontal orientation, and saved it as a TIFF at the monitor's resolution. All this was done without making any other adjustments to the image. We then opened the target image full screen in FastStone Image Viewer, and this image on the monitor screen was photographed in compliance with the usual rules.
Your recommendations 1 and 2 are very interesting. I think the anti-aliasing can be done in Adobe Photoshop; the main thing is to choose the required amount of anti-aliasing.
Of course, the size and resolution of the monitor matter a lot, and 4K is much better than 1080p, which gives many more options, for example when testing telephoto lenses (long focal lengths).
It seems to me that using a monitor makes testing faster and more versatile, subject of course to limitations, since the test target is still a monitor screen, not a sheet of paper.
It would also be interesting to check autofocus operation, and how well the monitor screen performs there. I hope that all the tests will pass successfully.
"I would offer to do some experiments on this myself, but I have a significant backlog of work on the software itself, so I cannot promise such a comparison right now."
I completely agree with you: of course, the program is more important. But maybe someday you will try this method and give your even more valuable comments on it.
Regards,
Vlad
I tried this test with a 35mm lens; the results are in the screenshot. A monitor screen was used as the test target, and I think a monitor screen can be used for this. I wanted to show you the results in case you have any comments or suggestions. This is not a detailed test, just a trial run to evaluate the general feasibility of testing autofocus this way.
Regards,
Vlad
Thanks for doing the experiment! Would it be possible to share a full size sample image (preferably a raw camera file) of this example?
You can email it directly to fvdbergh@gmail.com , or share a link / dropbox / google drive or something along those lines.
I would like to take a look at the edge quality, and the potential impact on the SFR curves, in more detail.
Regards,
Frans
Yes, of course, I will be glad to be helpful in your work. I will try to send you the file by email, though it is rather large at 25 MB; otherwise I will share a link to a drive if it cannot be sent by email.
Regards,
Vlad
A small quote from the documentation of the Imatest program.
"An image of a horizontal or vertical edge on an LCD monitor (desktop or laptop) may also be used as a target if the monitor does not flicker (many do). The camera should be tilted with respect to the monitor. The disadvantage of this technique is that you have only one edge to work with; you can’t easily create a map of lens performance. Imatest Screen Patterns can display suitable LCD test images. You can set contrast, which should be in the range of 4:1 to 10:1 to minimize clipping. To minimize artifacts (Moire, etc.) the camera must be far enough from the monitor so the sensor pixel “frequency” (1/(2*pixel spacing) at the image sensor) is at least 30% above the Nyquist frequency of the LCD screen. A rule of thumb is that the LCD screen image should take up no more than 1/3 to 1/4 the image width (1/9 to 1/16 the area)."
Interesting find!
I suppose that larger screens tend to have a larger pixel pitch, so for a given era (1080p, 4K, 8K, ...) the optimal size of the monitor image relative to the camera frame would be relatively constant. But you can use their 30% above Nyquist guideline to calculate the optimal size for your chosen image sensor, magnification, and monitor pixel pitch combination.
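As a worked example of such a calculation (this is my reading of their guideline, using the 0.27 mm monitor pitch and 0.00392 mm sensor pitch that come up later in this thread; note that their 1/3-to-1/4 rule of thumb falls out of the same numbers):

```python
F = 35.0          # focal length, mm
P_SEN = 0.00392   # sensor pixel pitch, mm
P_MON = 0.27      # monitor pixel pitch, mm
SENSOR_W = 23.5   # sensor width, mm
MON_W = 1920 * P_MON  # monitor active width, mm

# Reading of the guideline: stay far enough away that one sensor pixel spans
# at least 1.3 monitor pixels, i.e. the projected monitor pitch P_MON * m
# (with magnification m = F / (d - F)) is at most P_SEN / 1.3.
m_max = P_SEN / (1.3 * P_MON)
d_min = F + F / m_max
frac = MON_W * m_max / SENSOR_W
print(f"minimum distance: about {d_min / 1000:.2f} m")          # ~3.17 m
print(f"monitor image: at most {frac:.0%} of the frame width")  # ~25% (~1/4)
```
-F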
I would like to see an example of such a calculation from the author. I don't quite understand how to compute all of this, given that the target can be placed at a 45-degree angle, especially at a close distance of about 1 m 50 cm.
It would be nice to read your recommendations at least for the Lensgrid test target (it is parallel to the camera's sensor).
The initial data are as follows:
1) Lensgrid_a4 (1697x1200)
2) LCD 24 inch. The resolution is 1920x1200. Monitor pixel pitch 0.27 mm.
3) Nikon AF-S NIKKOR 35mm 1.8G
4) The camera sensor size is 23.5x15.6 mm. The sensor pixel pitch is 0.00392 mm.
Regards,
Vlad
The correct way to approach this problem is to perform the usual camera transformation, similar to the steps in a typical camera calibration process (as illustrated here).
But we can take some shortcuts to simplify the problem. For starters, we can assume that the optical axis of the lens is pointed right at the centre of the test chart (e.g., MTF Mapper's "focus" style chart). We simplify the problem by only looking along a line running horizontally across the monitor, passing straight through the centre of the chart. With this assumption, we can ignore the vertical component, and model the problem from a top-down view.
We define the centre point on the chart (and monitor) as C=(0,0) in our top-down view. The chart is rotated 45 degrees relative to the optical axis, so the rotation matrix to transform the chart coordinates into the camera coordinate frame is just [ cos(theta) sin(theta); -sin(theta) cos(theta)]. With theta=45 degrees, this becomes R = [q q; -q q] where q = sqrt(0.5). Note that our top-down view means our coordinates are [x; z].
Next, we have to translate relative to the camera Centre Of Projection (COP), which is located at K=(0, 1500) relative to the chart centre (in millimetres). So a point P = [px, pz] on the monitor (along that horizontal line passing through the chart centre) is transformed into the camera frame by the transformation R * P - K. For example, our camera frame x coordinate becomes tx = px*q + pz*q - 0, and z becomes tz = px*(-q) + pz*q - 1500. We can also use the fact that pz = 0 for all points on the monitor to simplify things even further, so tx = px*q, and tz = -px*q - 1500, where q = sqrt(0.5).
The last two steps are to apply the perspective division, and the scaling to the sensor size. Since our virtual sensor plane is at a distance of 35 mm (the focal length) in front of the COP, we use similar triangles to compute the perspective-projected [sx; sz] coordinates. This means sx = 35 * tx / tz, and sz = 35 * tz / tz = 35 (all the points fall exactly on the sensor plane after perspective projection, so we can ignore sz from here on). Note that sx is measured in millimetre units at this point, and we can transform it to pixel units by dividing sx by 0.00392 mm if we want.
So the final formula for calculating the sensor pixel coordinates of a point on the monitor is:
cx = ( 35 * (px*q) / (-1500 - px*q) ) / 0.00392
For a pixel right next to the centre of the monitor we know that px = -0.27 mm, and pz = 0 (recall that by definition pz = 0 for all points on the monitor). Using the above equation, we see that
px = -0.27 mm -> cx = 1.136 pixels.
(keep in mind that cx = 0 represents the centre of the sensor, so the range of cx is approximately -3000 to 3000 for your sensor)
For a point towards the front end of the monitor:
px = -0.27 * 1697 / 2 -> cx = 1081 pixels
For a point towards the rear end (furthest end) of the monitor:
px = 0.27 * 1697 / 2 -> cx = -870.27 pixels
I may have been a bit sloppy with the sign conventions here (for example, is the monitor rotated left or right, and what is the correct sign of the z-value of K above), so there is some ambiguity here. It looks like my convention means the monitor was rotated so that the left edge is further from the camera.
So you can see that the perspective transformation has a significant impact on the number of pixels spanned by a single monitor pixel. We already saw that near the centre, one monitor pixel projects to 1.136 sensor pixels, but on the furthest end of the monitor, a single monitor pixel only spans 0.93 pixels. I calculated this from the difference between px = 0.27 * (1697/2) and px = 0.27 * (1697/2-1).
Similarly, one monitor pixel projects to 1.4 sensor pixels on the close end.
You did not specify what the apparent size of the monitor image was in your photo, but I did cross-check my values above with a field-of-view calculator, and my calculations appear to be in the right ballpark.
I just re-read your question, and I see that you also asked about the case where the chart/monitor is parallel to the sensor. That would reduce to R = [cos(0) sin(0); -sin(0) cos(0)], so that tx = px, and tz = -1500, or eventually cx = (-35 * px / 1500) / 0.00392, which gives you a constant ratio of 1.6 sensor pixels for each monitor pixel. This is easy to check in real life by direct measurement, but it will reveal some of the finer details (e.g., where exactly is the camera's centre of projection relative to the focal plane marker on the camera body).
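If it helps, here is the whole calculation as a small Python sketch (a direct transcription of the formulas above, with the same sign conventions, so the same caveats apply):

```python
import math

F = 35.0            # focal length, mm
D = 1500.0          # distance from chart centre to the COP, mm
PIX = 0.00392       # sensor pixel pitch, mm
Q = math.sqrt(0.5)  # sin(45 deg) = cos(45 deg)

def cx_slanted(px):
    """Sensor x coordinate (in pixels) of a monitor point px mm from the
    chart centre, along the horizontal centre line, with a 45-degree tilt."""
    tx = px * Q          # rotate into the camera frame
    tz = -px * Q - D     # rotate, then translate by the COP position
    return (F * tx / tz) / PIX  # perspective division, then mm -> pixels

def cx_parallel(px):
    """Same, but with the monitor parallel to the sensor (theta = 0)."""
    return (-F * px / D) / PIX

half = 0.27 * 1697 / 2                               # half the chart width, mm
print(cx_slanted(-0.27))                             # ~1.136 (centre)
print(cx_slanted(-half))                             # ~1081 (near end)
print(cx_slanted(half))                              # ~-870.27 (far end)
print(cx_slanted(-half) - cx_slanted(-half + 0.27))  # ~1.4 pixels per monitor pixel
print(cx_slanted(half) - cx_slanted(half - 0.27))    # ~-0.93 pixels per monitor pixel
print(cx_parallel(0.27))                             # ~-1.6 (parallel case)
```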
Hope this is enough to get you started.
Regards,
Frans
Thanks a lot for the clarification. As far as I understand, the distortion issue is common to any target, whether printed or displayed on a monitor screen, since the distortion depends on the lens.
As for a test target displayed directly on the monitor screen, the ratio of the pixel sizes of the monitor and the sensor is of course of great importance. When printing at high resolution, this is not as critical as with an LCD. It is probably no accident that you and a number of other authors used physical objects (knife blades, razors) for your tests; high-resolution printing comes second, and the LCD comes last.
I agree with your calculations. However, I would like to clarify two more questions:
1) The ratio of the monitor pixel size to the camera pixel size. I have not found a consensus on this issue, but various sources say that the ratio should not be 1:1, but rather 1:2, 1:2.5, 1:3, or even 1:4, and many arguments are given (phase shift, debayering, etc.). The camera and lens are not perfect: if we place the sensor at such a distance from the LCD that one monitor pixel is projected onto one sensor pixel, then the sensor will not be able to reproduce that pixel adequately.
2) Taking into account your calculations, the physical requirements of the camera and lens regarding the ratio of sensor and LCD pixels, and the accuracy requirements of the program itself, what should the ratio of sensor pixels to LCD pixels be for the program to show acceptable accuracy? (If, of course, such a ratio can be determined for the program.)
Regards,
Vlad
You are right about lens distortion (like radial distortion, e.g., "barrel" distortion), but I omitted that from the discussion of the camera model for brevity. At the scale we are talking about here, you can safely ignore the impact of moderate distortion, of course while keeping in mind that my model above only applies to rectilinear lenses.
To answer your specific questions:
1) Correct. 1:1 sampling is a terrible idea. Normally, one thinks in terms of sampling relative to the image sensor, but here we want to achieve the opposite: we do not want to sample the monitor pixels. This means that the smaller the monitor pixels relative to the sensor pixels, the better. If I had to hand-wave a bit here I would say that the minimum should be 1:2.5 for a monochrome camera sensor, and that you can safely double that for a Bayer sensor without fear that you are overdoing it.
2) I cannot give you a concrete answer (yet). The best way to approach this is to experiment. Perhaps you can use the same methodology used on printed test charts: using a razor blade imaged at the same focus distance, you can compare the razor-blade measurement to the monitor measurement. At high ratios (e.g., 1:5 sensor:monitor pixels) this should give you a decent measurement of the lower frequencies of the monitor MTF. I think this method may fail at lower ratios as it becomes harder to obtain good slanted-edge measurements, though. Remember that the slanted-edge method only works as long as the edge can reasonably be modelled as a straight line: if the edge is a jagged stair-step, then you violate the assumptions of the slanted-edge method. It may look like you are still getting SFR measurements, but how trustworthy are they?
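To make the comparison concrete, here is a minimal sketch (the two curves below are hypothetical placeholder values standing in for SFR data exported from your measurements; only the division step matters):

```python
import numpy as np

# Placeholder (frequency, SFR) pairs, frequency in cycles/pixel: one curve
# measured off a razor-blade edge, one off the monitor edge, same focus distance.
razor = np.array([[0.0, 1.00], [0.1, 0.85], [0.2, 0.62], [0.3, 0.40], [0.4, 0.22], [0.5, 0.10]])
monitor = np.array([[0.0, 1.00], [0.1, 0.78], [0.2, 0.48], [0.3, 0.25], [0.4, 0.10], [0.5, 0.03]])

freq = np.linspace(0.0, 0.5, 11)
sfr_razor = np.interp(freq, razor[:, 0], razor[:, 1])
sfr_monitor = np.interp(freq, monitor[:, 0], monitor[:, 1])

# The razor edge is (nearly) ideal, so the ratio estimates the monitor MTF.
# Only trust the ratio where the razor SFR is well above the noise floor.
mtf_monitor = np.where(sfr_razor > 0.05, sfr_monitor / sfr_razor, np.nan)
for f, m in zip(freq, mtf_monitor):
    print(f"{f:.2f} cy/pix: monitor MTF ~ {m:.2f}")
```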
Regards,
Frans
Thanks for your comments. Of course, this question is not a simple one. For example, as we increase the number of monitor pixels per sensor pixel, we inevitably face an increase in the distance to the test target.
35mm lens:
1:1 distance about 2 m 45 cm
1:2 distance about 4 m 86 cm
1:3 distance about 7 m 27 cm
1:4 distance about 9 m 68 cm
1:5 distance about 12 m 9 cm
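These distances follow from the thin-lens shortcut m = f/(d - f); a small Python check, using the same pitch values as above:

```python
F = 35.0         # focal length, mm
P_MON = 0.27     # monitor pixel pitch, mm
P_SEN = 0.00392  # sensor pixel pitch, mm

for n in range(1, 6):
    # One sensor pixel should span n monitor pixels: P_MON * m = P_SEN / n,
    # with magnification m = F / (d - F), so d = F + F * P_MON * n / P_SEN.
    d = F + F * P_MON * n / P_SEN
    print(f"1:{n} -> about {d / 1000:.2f} m")
```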
In addition, the area that falls within the viewfinder frame inevitably increases.
35mm lens (estimated size):
3 m: 10.8 x 6 cm
4 m: 14.4 x 8 cm
7 m: 25.2 x 14 cm
9 m: 32.4 x 18 cm
12 m: 43.2 x 24 cm
All this would require a monitor of incredible size!
Regards,
Vlad