

#48 OS X HiDPI support

Ed Ropple

OS X has supported HiDPI ("Retina") displays since 10.7. This works with NSOpenGLView (much like the iOS version of Allegro), but you have to explicitly opt in to enable it. It's very easy: you just implement prepareOpenGL in ALOpenGLView and invoke the following:

[self setWantsBestResolutionOpenGLSurface:YES];

It's not hard and I started in on a patch, but I'm going to hold off because I'm not sure on how you folks would like this to be exposed. Obviously you don't want this to be on by default, but I'm not familiar enough with Allegro's flags to figure out exactly how you'd like to go about this.
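For reference, here's a minimal sketch of what such an override might look like (assuming an NSOpenGLView subclass along the lines of Allegro's ALOpenGLView; prepareOpenGL and setWantsBestResolutionOpenGLSurface: are AppKit methods available since 10.7):

```objectivec
// Sketch only: an NSOpenGLView subclass (such as Allegro's ALOpenGLView)
// opting in to a pixel-for-pixel backing surface on HiDPI displays.
- (void)prepareOpenGL
{
    [super prepareOpenGL];
    // Without this, the view renders at point resolution and the OS
    // upscales it; with it, the GL surface matches the display's pixels.
    [self setWantsBestResolutionOpenGLSurface:YES];
}
```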

For fullscreen, I think (not 100%) that you just ask for the pixel resolution that you really want (so 2880x1800 for a retina MacBook Pro), but fullscreen is apparently broken at the moment so I can't say for sure. =)


  • Peter Wang

    What happens if we enable it by default? I don't understand what is special about a higher resolution display.

  • Ed Ropple

    It's not quite a higher-resolution display as you might see elsewhere: some of the Cocoa APIs behave differently in HiDPI mode, whereas OpenGL calls always act in pixels. Those interactions can be unpredictable (you have to adjust for the difference when doing hit testing, for example), and it seemed to me that the principle of least surprise might suggest making it a flag. I'm sure you have a better idea than I do about how to present this to an end user, though.

    More information: http://developer.apple.com/library/mac/#documentation/GraphicsAnimation/Conceptual/HighResolutionOSX/CapturingScreenContents/CapturingScreenContents.html
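    To illustrate the hit-testing point, here's a sketch of the kind of conversion you'd need (convertPointToBacking: is the AppKit call for point-to-pixel conversion, available since 10.7; the surrounding method is just an example):

```objectivec
// Sketch: converting a mouse location from point coordinates to
// backing-store (pixel) coordinates before hit testing against the
// pixel-sized OpenGL surface.
- (void)mouseDown:(NSEvent *)event
{
    NSPoint inView = [self convertPoint:[event locationInWindow] fromView:nil];
    NSPoint inPixels = [self convertPointToBacking:inView];
    // ... hit-test using inPixels, which is scaled by the backing
    // scale factor (2.0 on a Retina display, 1.0 otherwise) ...
}
```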

    Last edit: Ed Ropple 2013-01-20
  • Evert Glebbeek

    On 20 Jan 2013, at 8:36, Peter Wang tjaden@users.sf.net wrote:

    What happens if we enable it by default? I don't understand what is special about a higher resolution display.

    There wouldn't be anything special if it were simply treated as a high-resolution display. In general, though, it isn't.
    To get an idea of how it's used, imagine replacing your display with one that has 4x the resolution of your current display (double in each dimension) but the same physical size. Running your desktop at that higher resolution is probably not very useful (because everything becomes very small), but if you make all UI elements 4x as large (using high-resolution versions of the artwork, or vector graphics) and increase the size of your fonts, then your display will have the same apparent size as before, but at a higher resolution, so it looks much better.
    That's how Retina displays are supposed to be used: my display is really 2560x1600, but except for things like images (including text) and video it mostly behaves as 1280x800. Older applications that don't support the Retina display render as though the display is 1280x800 and are upscaled by the OS (and generally look like crap).
    The point of Retina displays is not that the display is high resolution, but that it's high DPI.

    I agree, though, that Allegro should probably give you the full resolution by default, but you do need a way to detect high-DPI displays: 10 pt text may be fine on a high-resolution, low-DPI screen, but it'll be far too small on a Retina display.

    To make things interesting, the scaling of the display can be configured. What I said above (2560x1600, behaving like 1280x800) is the default setting.


  • Peter Wang

    Yes, I should have read up a bit before responding.

    It seems like the way to go is to expose the full resolution to the user, but provide a way to get the DPI information (or an approximation); then the user can decide what to do. We already have transforms; I don't think exposing another concept, 'points', adds anything.
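    As a sketch of what "a way to get the DPI information" could look like on the OS X side (approximate_dpi is a hypothetical helper, not existing Allegro API; backingScaleFactor and CGDisplayScreenSize are real AppKit/CoreGraphics calls):

```objectivec
#import <Cocoa/Cocoa.h>

/* Sketch only: a hypothetical helper -- not existing Allegro API --
 * that estimates a screen's DPI from its physical size and its
 * backing scale factor. */
static CGFloat approximate_dpi(NSScreen *screen)
{
    NSNumber *num = [[screen deviceDescription] objectForKey:@"NSScreenNumber"];
    CGDirectDisplayID display = (CGDirectDisplayID)[num unsignedIntValue];
    CGSize size_mm = CGDisplayScreenSize(display);   /* physical size in mm */
    CGFloat scale = [screen backingScaleFactor];     /* 2.0 on Retina */
    CGFloat pixels_wide = [screen frame].size.width * scale;
    return pixels_wide / (size_mm.width / 25.4);     /* pixels per inch */
}
```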