I’d like to obtain the “real” dimensions of an object from a picture.
My first task is to find the right resolution for my image. Am I reasoning correctly?
The minimum dimension I need to discriminate is 0.01 mm on a picture of an object measuring 80 mm x 30 mm in real life (that is, 2,400 mm²).
If I want to discriminate 0.01 mm, then this should be the dimension of 1 px.
Then I need 100 px/mm, i.e. 100² = 10^4 px per mm².
Giving 2,400 x 10^4 = 24 x 10^6 px, a 24-megapixel image. Right?
A picture normally has a resolution in pixels per inch.
Do you need to know how many pixels per mm that is, or something else?
The width should be 80 mm / 0.01 mm = 8000 pixels, the height 30 mm / 0.01 mm = 3000 pixels. Assuming that the 80 x 30 mm object fills the picture completely.
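The calculation above can be sketched in a few lines of Python. This is just a sketch of the arithmetic; the function name and the assumption that the object fills the frame exactly are mine.

```python
# Sketch: required image resolution so that one pixel spans the target precision.
# Assumes the object fills the frame completely (as stated above).

def required_pixels(width_mm: float, height_mm: float, precision_mm: float):
    """Return (width_px, height_px, total_px) so one pixel spans precision_mm."""
    w = round(width_mm / precision_mm)
    h = round(height_mm / precision_mm)
    return w, h, w * h

w, h, total = required_pixels(80, 30, 0.01)
print(w, h, total)  # 8000 3000 24000000 -> a 24-megapixel image
```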
And assuming of course that you don’t have any lens aberrations (or corrected for it) and that the image was taken with a lens with fixed focal length and distance to the object, so that the measurements can be calibrated.
Does your picture include a scale? If it includes a scale, then you can use image registration to dig out the scale and then use it to compare with what it is that you’re trying to measure.
And before you ask me about image registration, don’t… I don’t understand enough to do it myself and have wasted a lot of time not getting it to work…
Another way, again if your image contains a scale, is to let the user set the start and end points of the scale. While this allows for a lot of human error, it will be far easier to calculate.
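The user-set-endpoints approach boils down to one division. Here is a minimal sketch; the coordinates and the 50 mm scale length are illustrative assumptions, not values from the thread.

```python
# Sketch: derive a mm-per-pixel factor from two user-clicked endpoints
# of a scale of known length (coordinates and length are made up).
import math

def mm_per_pixel(p1, p2, scale_length_mm: float) -> float:
    """p1, p2: (x, y) pixel coordinates of the scale's endpoints."""
    px_dist = math.dist(p1, p2)  # Euclidean distance in pixels
    return scale_length_mm / px_dist

# A 50 mm scale whose endpoints are 1000 px apart:
factor = mm_per_pixel((100, 200), (1100, 200), 50.0)
print(factor)  # 0.05 mm per pixel
```

Any measured pixel distance can then be multiplied by this factor to get millimetres, subject to the human error mentioned above.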
But like Robert says, there are far too many variables involved to get a measurement that precise if the picture was taken with a digital camera. Adding to Robert’s list: moiré reduction, noise reduction, angle, quality of the glass, whether it is a Bayer sensor or X-Trans, the sensor size, and even the curvature of the micro-lenses on the sensor.
Regardless of the dpi, you need to know the size in pixels of an object on the picture, and the real world size.
For instance, the Eiffel Tower is pretty big, but in my holiday snaps, it only takes up 1/4 of the photo.
If I know that the tower stretches from y = 1200 to y = 8300 pixels, then I know it has a height of 7100 pixels.
The tower in real ‘life’ is 324 m.
Now, knowing that, we can work out the size of a building IF IT IS RIGHT NEXT TO IT, by taking the size of the other building in pixels (say 3000) and multiplying through
3000 / 7100 * 324 ≈ 136.9 = height in metres.
But if we apply the same process to a dog which is 8 metres from the camera, the dog will be measured as a giant.
So this process only works if all the objects in the picture lie on a fixed plane… no perspective at all.
Or have I missed the point somewhere?
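The proportional scaling described above can be sketched like this. The numbers are the ones from the Eiffel Tower example; the function name is mine.

```python
# Sketch of the proportional-scaling example above. Only valid if both
# objects lie in the same plane at the same distance from the camera.

TOWER_HEIGHT_M = 324.0     # real-world height used in the example
TOWER_PX = 8300 - 1200     # 7100 px tall in the photo

def height_from_pixels(object_px: float) -> float:
    """Scale a pixel height to metres using the known reference object."""
    return object_px / TOWER_PX * TOWER_HEIGHT_M

print(round(height_from_pixels(3000), 1))  # ~136.9 m for the 3000 px building
```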
If you need the real size of an object, you need a point of reference near the object to give a scale. That’s why in police photography there is a ruler near the object.
Actually, it’s 312 metres when it w
[quote=47799:@Eric de La Rochette]
Actually, it’s 312 metres when it w
Damn keyboard! And I wasn’t able to edit my previous post.
Actually it was 312 metres when it was built because there wasn’t any antennae.
This is in addition to what has already been said:
There is a formula to convert a size in pixels to inches and/or centimetres using the dpi.
What you will also learn from the Wikipedia page is that the default dpi varies from 72 (OS X) to 96 (Windows).
Lastly, you want to measure an object inside a photo? Sorry, I cannot help.
But if you want to know the size of a scanned magazine, the above Wikipedia page can help (if the image scan file is honest); for a Macintosh scan at 72 dpi:
Width in pixels / 72 = value in inches
Value in inches * 2.54 = value in cm
(720 / 72) x 2.54 = 25.4 cm
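The dpi conversion above, as a small Python sketch (function name is mine; 72 dpi is the Macintosh default mentioned above):

```python
# Sketch of the dpi-to-centimetre conversion in the worked example.

def pixels_to_cm(width_px: float, dpi: float = 72) -> float:
    inches = width_px / dpi   # width in pixels / dpi = value in inches
    return inches * 2.54      # 1 inch = 2.54 cm

print(round(pixels_to_cm(720), 2))  # 25.4 cm, matching the example
```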
Sorry guys, I was out for some time and extremely busy.
1 - I understand that I have to “calibrate” the image using a known reference in the picture to make a correspondence between a px and a distance in the real world (that’s the easy bit).
2 - Regarding the optical bits (i.e. Robert’s and Sam’s answers), hmm, that sounds less straightforward than I thought.
I need to investigate.