Best font type and size for multiplatform?

I normally develop desktop programs on a Windows 10 machine and labels and text fields look okay. But then when I look at it on OS X, it looks cropped.
I think there are layout rules for OS X, so I could just use those best practices on Windows as well and wouldn't need conditional compiles.

My usual settings are Font = System, Size = 12 Point and I set a height of 22.
I’m thinking of upping the Size to 14 Point for better readability, but that will mean adjusting the height of the objects.
Also, is there any advantage or reason to choose Points, Pixels (or, for that matter, Inches or Millimeters)?

I plan to compile for Ubuntu too, so thanks in advance for any advice and tips.

  • Joe

This topic has come up a lot lately… not “which” font or size to use, but the fact that they look different on OSX vs Windows, and the major reason is Pixels (PX) vs Points (PT)…

OSX defaults to Points
WIN defaults to Pixels

so, assuming for the sake of argument that there are 96 pixels per inch and a point is 1/72 of an inch,
then a size of “12” would be 12 pt (16 px) on OSX, but only 12 px on WIN…
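The arithmetic behind that can be sketched in a few lines of Python. This is just an illustration of the math, not anything framework-specific; the 96 DPI figure is the assumption stated above, and the helper name is made up:

```python
# A point is 1/72 of an inch; Windows traditionally assumes 96 pixels per inch.
# Hypothetical helper to show how the same "size 12" diverges.

def points_to_pixels(points, dpi=96):
    """Convert a font size in points to pixels at the given screen DPI."""
    return points * dpi / 72.0

# "12" interpreted as points (the OS X default) at 96 DPI:
print(points_to_pixels(12))           # 16.0 px
# "12" interpreted where DPI equals the point base (no scaling):
print(points_to_pixels(12, dpi=72))   # 12.0 px
```

So the same numeric setting renders about a third larger on OS X, which is why text that fits on Windows can end up cropped there.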

you might try setting the TextUnit property, but if I recall correctly, my testing indicated it was not consistent either

Hi Dave,
Thanks for the quick reply and for explaining Points vs. Pixels; I learned something new today! I will have to do a little more research, but you’ve pointed me in the right direction.

In most cases, you’ll just want to use:
Font = System, Size = 0 Point

Those are the defaults in 2016r2.1, I believe, and they seem to work fine between Windows and OS X.

With the advent of HiDPI (Retina) displays, where one logical point is backed by multiple physical pixels, using Points is the better choice.

Ubuntu can vary a bit depending on the theme. This blog post shows how you can determine what works best at run-time: