Why optimise for retina display

Why do people worry about optimising for Retina displays? What difference does it make? How does it work? Is it because pixels are considerably smaller on Retina displays?


Mostly because text looks dramatically better.

Why is it not like this by default? Does it resize the text for Retina, or apply extra anti-aliasing that only looks good with smaller pixels? I understand that a Retina display can give you the same-size screen as an average monitor, but with much smaller pixels.


Basically, if your application is not Retina ready, it will look wrong on a Retina machine. If your competition is already Retina ready, that makes your app look bad; if your competition isn’t Retina ready, it makes your app look better.

It’s like watching HD movies and TV, then going back to low-resolution content.

If you want to sell on the App Store, get Retina ready: it will give you a chance of being promoted, and if Apple promotes your application, you’re set!

Because Retina-optimized apps require additional bitmaps. Apps default to pixel-doubling, so they perform exactly as they would on traditional screens. You can enable HiDPI resolutions on any Mac to see what happens. On a 1x screen, every element will appear twice as large. On a 2x screen, because the pixels are half the size, the elements appear to the user at the same size. The result is a dramatic increase in sharpness.
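A rough sketch of the pixel-doubling arithmetic (the names here are made up for illustration, not a real API):

[code]// Hypothetical sketch of pixel-doubling arithmetic.
Const kScale = 2.0 // backing scale factor: 2.0 on Retina, 1.0 on a traditional screen

Dim widthInPoints As Integer = 100 // the size your app works in
Dim widthInPixels As Integer = widthInPoints * kScale // 200 physical pixels at 2x

// The element occupies the same physical size on screen either way, but at 2x
// it is drawn with four times as many pixels, hence the jump in sharpness.[/code]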

Retina is not enabled by default simply for legacy reasons. If we had spent the last 20 years designing Retina apps, the default would be different. Instead, you need to tell the system that your app is Retina-aware.

If you don’t have a Retina display, don’t bother trying to develop a Retina app. Just like you shouldn’t build a Windows app without Windows to test on, Retina needs proper testing.

Kind of. To this day, many people cannot tell the difference between HD and SD on a TV, yet most people can see the difference between a Retina and non-Retina display. This is mostly down to proximity: TVs are traditionally across the room, where the higher resolution is less noticeable, while computers are only a couple of feet away, so the difference is much more noticeable.

Hmmm… Have to rethink how I explain Retina to everyone as I can’t stand SD even on a 32" TV, 10’ away from me! Maybe I’ve been working with Retina too much :slight_smile:

I will agree with this; even testing on ‘fake’ Retina does not actually show the real-world performance of a Retina display.

You can use vector PDFs instead of bitmaps, which decreases file size and improves scalability (but limits what you can use as in-app graphics).

If you have an iMac, is it possible to buy an external retina display?

No. You need something that can drive a 4K display, and of course the display itself. For a monitor, you’d want around 21.5" for something of Retina calibre. IIRC, only the new Mac Pro can handle a 4K display, but I could be wrong.

Is it possible to check directly whether the user’s screen is Retina?


[code]If Retina Then
  MsgBox("Retina")
Else
  MsgBox("Not Retina")
End If[/code]

Using something like this is not recommended, because the application can be moved between screens with different scaling factors, and then your application would look wrong.

I know that it’s been difficult to integrate the RetinaKit into your workflow, but it really is the Apple-recommended way of doing it.

Anyhow, if you wish to continue down this path, the function you need is as follows (there are others, but they are not recommended by Apple).

[code]Declare Function CGContextConvertRectToDeviceSpace Lib "AppKit" (context As Integer, rect As CGRect) As CGRect

Dim userRect As CGRect = CGRectMake(x, y, w, h)
Dim repRect As CGRect = CGContextConvertRectToDeviceSpace(g.Handle(g.HandleTypeCGContextRef), userRect)[/code]

You’ll need a CGRect structure in your project. Basically, this function takes a rectangle and returns a converted rectangle, from which you can then calculate the scaling factor.
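Since the conversion is linear, dividing the converted width by the original width yields the scale factor. A sketch (assuming your CGRect structure exposes a w field, matching the CGRectMake call above):

[code]// repRect and userRect as passed/returned above.
Dim scale As Double = repRect.w / userRect.w
// scale is 1.0 on a standard screen, 2.0 on a Retina screen.[/code]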