How to get screen DPI?

This is just the sort of thing I can’t resist… so here’s something.

Define the CGPoint and CGSize structures as follows:

CGPoint: x As CGFloat, y As CGFloat

CGSize: width As CGFloat, height As CGFloat
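
Structures can’t be typed into a method; you add them through the IDE’s Insert menu. Written out, the two of them would look like this:

[code]Structure CGPoint
  x As CGFloat
  y As CGFloat
End Structure

Structure CGSize
  width As CGFloat
  height As CGFloat
End Structure
[/code]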

And this method:

[code]Function DPIForDisplayAtXY(x as integer, y as integer) As xojo.Core.Size
  declare function CGGetDisplaysWithPoint lib "CoreGraphics" (point as CGPoint, maxDisplays as uint32, byref displays as ptr, byref matchingDisplayCount as uint32) as integer
  declare function CGDisplayScreenSize lib "CoreGraphics" (display as ptr) as CGSize
  declare function CGDisplayPixelsWide lib "CoreGraphics" (display as ptr) as UInteger
  declare function CGDisplayPixelsHigh lib "CoreGraphics" (display as ptr) as UInteger
  
  dim p As CGPoint
  p.x = x
  p.y = y
  
  // find the display under the given point
  dim cnt as uint32
  dim d as ptr
  call CGGetDisplaysWithPoint(p, 1, d, cnt)
  
  // CGDisplayScreenSize returns the physical size in millimeters
  dim screenSize as CGSize
  screenSize = CGDisplayScreenSize(d)
  
  // pixels / millimeters * 25.4 = pixels per inch
  Return new xojo.Core.Size(CGDisplayPixelsWide(d)/screenSize.width*25.4, CGDisplayPixelsHigh(d)/screenSize.height*25.4)
End Function
[/code]

Will return the horizontal and vertical DPI of the screen at the X and Y provided.
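
A quick usage sketch, assuming the method and the structures above have been added to a window:

[code]' call from a window so Self.Left/Self.Top give the point to look up
Dim dpi As xojo.Core.Size = DPIForDisplayAtXY(Self.Left, Self.Top)
MsgBox "Horizontal: " + Str(dpi.Width) + " ppi" + EndOfLine + "Vertical: " + Str(dpi.Height) + " ppi"
[/code]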

Keep in mind I put this together in a few minutes, so some error checking, and maybe a check that the display supports “Extended Display Identification Data”, may be in order (if not, the documentation says 72 dpi is assumed)…

My quick tests here return reasonable values for my Cinema Display at various resolutions and for my MacBook display.

Nice Jim!
I really need to start learning to work with declares. So cool.

I have an app in which precise on-screen measurement would be great. However, I gave up, as the notion of DPI has become completely imprecise as new screens keep coming out.

An iPhone 6 Plus, for instance, has a 5.5" screen with 2208x1242 pixels. So in fact the actual DPI is 401.45, NOT 216 as one would believe from a Retina 3x.

Even an older machine like the mid-2011 21.5-inch iMac I use does not provide 72 pixels per inch. In fact, with a screen 18.5" wide and 1980 pixels across, the actual number of pixels per inch is 107.02.

I believe getting down to the millimeter is possible for a given machine, provided one does that bit of research. For instance, on the iMac above, a width of 469.9 millimeters divided by 1980 pixels gives 0.2373232 millimeter per pixel.

The only way I can think of, which I left aside because of the potential for endless support requests, would be to ask the user to actually measure on screen.

One could imagine displaying a series of bars and asking the user to select which one measures, for instance, exactly 7 inches, or some number of millimeters.
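
The math behind that would be trivial; a rough sketch (the function name and parameters are made up purely for illustration):

[code]' linePixels: how long the on-screen reference bar is drawn, in pixels
' measuredInches: what the user reads off a physical ruler or tape measure
Function CalibratedPPI(linePixels As Integer, measuredInches As Double) As Double
  If measuredInches <= 0 Then Return 72 ' fall back to the classic assumption
  Return linePixels / measuredInches
End Function
[/code]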

Michel is right on the ball here, this has been a hotly debated topic for years and his suggestion of displaying a ruler for calibration is probably the most feasible/accurate. If the user is serious about getting dimensions right when using your app, they will take a few minutes to set this up.

On Windows, calls to GetDeviceCaps will not return the correct values for HORZSIZE and VERTSIZE, as those only work for printing.
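
To illustrate, here is a rough Windows-only sketch (untested here; the constants are the standard GDI values). The call itself is easy enough, it just doesn’t tell you anything physically accurate:

[code]Soft Declare Function GetDC Lib "user32" (hwnd As Ptr) As Ptr
Soft Declare Function ReleaseDC Lib "user32" (hwnd As Ptr, hdc As Ptr) As Int32
Soft Declare Function GetDeviceCaps Lib "gdi32" (hdc As Ptr, index As Int32) As Int32

Const HORZSIZE = 4      ' "physical" width in mm - nominal for screens
Const HORZRES = 8       ' width in pixels
Const LOGPIXELSX = 88   ' logical pixels per inch (typically 96, 120, ...)

Dim screenDC As Ptr = GetDC(Nil)
Dim widthMM As Int32 = GetDeviceCaps(screenDC, HORZSIZE)
Dim widthPx As Int32 = GetDeviceCaps(screenDC, HORZRES)
Dim logicalDPI As Int32 = GetDeviceCaps(screenDC, LOGPIXELSX)
Call ReleaseDC(Nil, screenDC)

' widthPx / (widthMM / 25.4) tends to simply reproduce logicalDPI
' rather than the true physical pixels per inch of the panel.
[/code]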

Reading data right out of the EDID will only be as accurate as the data put in there when the display was manufactured (see link below).

See here for an answer to all of your “what if you do this” or “what if you do that” questions: https://lists.fedoraproject.org/pipermail/devel/2011-October/157671.html

One word sums all this up… nightmare :)

Even Photoshop has a setting for this; if they can’t work out how to do it, what hope do we have ;)

Jim’s code works spot on for me. Maybe EDID can be considered reliable for Mac displays, but have a manual entry in case of them lying vendors :)

Fact is, the notion of “dot per inch” has over the years become pure fantasy. The age-old 72 dots/pixels per inch remained as a relic of the older CRT monitors, but even that was a fallacy. Since the very beginning of the Apple II “high res graphics” at 280x192 pixels, it would have taken a 4-inch-wide screen to truly be 72 dpi.

In print, though, the notion of dots per inch remains meaningful, as most printer manufacturers make great efforts to achieve precise rendition in spite of varying resolutions.

Well, I hate to contradict, but my ruler says the screen is 18.5 inches; divided by 1980, that gives 107.02 pixels per inch, not 103 as reported by Jim’s code. At 1280x800 the declare reports 68, while measurement gives 69.18. I love declares, but I tend to trust the real world more. Declares cannot be used for precision work.

To be fair, given the relatively small number of different models in the Apple range, it would be possible to keep a database of values and detect the machine.

Since this thread is in General, and beyond the fact that the declare works only on OS X, we have to envision a cross-platform environment, which means hundreds of different machines with hundreds of different screens.

A good ruler or measure tape can be found for a few dollars at any supply store.

OK… so I was bored, and I took the code supplied by Jim and wrapped it into a tiny app that analyzes whatever Mac it is run on…

www.rdS.com/screen_ppi.xojo_xml_project.zip

and makes a display like this

The code needs to be fixed for Retina (I think)… when I put my iMac into “fake retina” mode, it says it is 1280x720 (OK, that looks right), but at 54.5 ppi, where it should be 218, shouldn’t it?

I think that’s correct, as it’s measuring points rather than pixels. In fake retina mode, the pixels don’t change; the points are just pushed farther apart, so it should be half of the normal-mode value. On a real Retina display it should be similar to a normal display since, again, it’s measuring points rather than pixels. I don’t have a real Retina screen to test that, though.
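
If what you actually want is physical pixels per inch rather than points per inch, a rough sketch (assuming a HiDPI build where the window exposes a ScaleFactor property) would be:

[code]' points per inch from the DPIForDisplayAtXY function above
Dim ppi As xojo.Core.Size = DPIForDisplayAtXY(Self.Left, Self.Top)
' ScaleFactor would be 1 on a normal display and 2 on a Retina display
Dim pixelsPerInchX As Double = ppi.Width * Self.ScaleFactor
Dim pixelsPerInchY As Double = ppi.Height * Self.ScaleFactor
[/code]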

points/pixels… on a non-Retina display it’s the same thing… would be interested to see what my little app says on a real Retina

Here’s a quick ruler drawing function using the code I posted above. Add this paint event to a canvas and try different screen resolutions (be sure to resize the canvas or it won’t redraw with the new values). Seems pretty accurate here. I’d love to hear what a real Retina does with it.

[code]Sub Paint(g As Graphics, areas() As REALbasic.Rect)
  // points per inch for the display the window is currently on
  dim ppi as xojo.Core.Size = DPIForDisplayAtXY(window.left, window.top)
  
  dim nextx, nexty as Double
  dim count as integer
  
  // horizontal ruler along the top edge
  g.DrawLine(0, 0, g.Width, 0)
  
  While nextx < me.Width
    g.DrawLine(nextx, 0, nextx, 10) ' long tick at each whole inch
    g.DrawLine(nextx+ppi.Width*.25, 0, nextx+ppi.Width*.25, 3) ' quarter-inch ticks
    g.DrawLine(nextx+ppi.Width*.5, 0, nextx+ppi.Width*.5, 6)
    g.DrawLine(nextx+ppi.Width*.75, 0, nextx+ppi.Width*.75, 3)
    if count > 0 then g.DrawString(count.ToText, nextx-3, 25)
    count = count + 1
    nextx = nextx + ppi.Width
  wend
  
  // vertical ruler along the left edge
  g.DrawLine(0, 0, 0, g.Height)
  count = 0
  
  While nexty < me.Height
    g.DrawLine(0, nexty, 10, nexty)
    g.DrawLine(0, nexty+ppi.Height*.25, 3, nexty+ppi.Height*.25)
    g.DrawLine(0, nexty+ppi.Height*.5, 6, nexty+ppi.Height*.5)
    g.DrawLine(0, nexty+ppi.Height*.75, 3, nexty+ppi.Height*.75)
    if count > 0 then g.DrawString(count.ToText, 20, nexty+5)
    count = count + 1
    nexty = nexty + ppi.Height
  wend
End Sub
[/code]

If I were writing a print preview that needed to be accurate, I would do a calibration like Sam mentioned, but use something like the above as a starting point to be verified and adjusted by the user.
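
Purely as illustration (the names and the number here are made up), folding a user calibration into that could look like:

[code]' what the declare reports for this window's display
Dim reported As xojo.Core.Size = DPIForDisplayAtXY(Self.Left, Self.Top)

' the user measures one on-screen "inch" with a real ruler;
' 1.03 means the preview drew it 3% too long
Dim measuredInch As Double = 1.03 ' would come from a text field in a real app
Dim correction As Double = 1 / measuredInch

Dim effectivePPIX As Double = reported.Width * correction
Dim effectivePPIY As Double = reported.Height * correction
[/code]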

Nope. As I measured on my non-Retina screen, this appears to be a legend: the pixel = point thing mostly never actually existed.

On the 21.5" Mac, the only time it comes close is at 1344x756, and yet, at exactly 72 pixels per inch, it should be 1332x756.

On the MacBook 13" at 1280x800 I get 125.04 points per inch.

Comparing to my tape measure, it looks pretty accurate on my Retina and non-Retina displays.

[quote=263500:@Michel Bujardet]Nope. As I measured on my non-Retina screen, this appears to be a legend: the pixel = point thing mostly never actually existed.

On the 21.5" Mac, the only time it comes close is at 1344x756, and yet, at exactly 72 pixels per inch, it should be 1332x756.[/quote]

I think when you refer to POINTS and video displays in the same sentence, the “72 points per inch” definition no longer applies. The smallest unit on any display is a pixel; on a Retina display one POINT = 2x2 pixels (on an iPhone 6+ it is 3x3)…

Indeed, I refer to points rather than pixels because on Retina displays they are not the same thing. However, a point on a Retina display does not equate to a 1/72nd-of-an-inch typographical point. It is unfortunate Apple decided to use the word “point” for a combination of pixels that has no relationship with the typographical unit. Hence a deplorable confusion.

But I also want to dispel the common illusion that one pixel equals one typographical point (1/72nd of an inch) on non-Retina displays.

Unless terminology is precise, and actual measurement replaces a belief system, no 1:1 screen display is possible.

The very notion of the typographical point has been quite independent from resolution in print for decades. The actual resolution on paper is about 2400 dots per inch for newspapers, and that has nothing to do with the typographical point, which remains a unit of measurement. Likewise, the advent of higher screen resolutions means there is no relationship between points and pixels, if there ever was one.

In common communication, there is this bizarre preconception that non-Retina displays have 72 pixels per inch, and screen specs are sometimes expressed in dots per inch (dpi). In practice, that is inaccurate. But hey, who am I to go against decades of marketing spin?

In my comments I was simply referring to screen coordinates as points. Typographic points could easily be calculated once you know the screen (coordinate) points per inch.

Using the Graphics class’s new ScaleX/ScaleY properties, you can get some very accurate results!

[code]g.ScaleX = (DPIForDisplayAtXY(Left, Top).Width / 72) * ScaleFactor
g.ScaleY = (DPIForDisplayAtXY(Left, Top).Height / 72) * ScaleFactor
g.TextSize = 35
g.DrawString("abcdefghijklmnopqrstuvwxyz", 0, 30)
[/code]

The text is 6.25 inches long at all resolutions, including fake Retina, on my machine.