[OpenGL] What's your MacBook OpenGL version?

Hello,

Just a little survey here, as, sadly, OpenGL on Mac OS is always behind Windows/Linux.

I have a MacBook Pro bought last year. My OpenGL version is 2.1.

I wonder what your OpenGL version is if you have a MacBook up to, say, 5 years old.

What is the OpenGL version of this year's (the latest) MacBook Pro?

What about the MacBook Air?

In short, what are your MacBook Pro/Air year and OpenGL version?

Thanks in advance for your feedback.

Cheers,
Guy.

One of the rare situations where Apple documentation is excellent:

https://developer.apple.com/graphicsimaging/opengl/capabilities/

Thanks for the link, Michael,

I was aware of this chart, but it seems reality is far from it, unless I’m missing something (I’m no Mac expert).

As an example, I have a MacBook Pro with an NVIDIA GeForce GT 650M, running OS X 10.9.2.

According to this chart, it should support OpenGL v4.1. But when I query the card through OpenGL I get version 2.1 (and shading language version 1.20), which is far below 4.1 (and shading language version 4.10).

That’s why I’m asking people directly to check on their own MacBook, just to learn the ‘real’ supported OpenGL version.

Cheers,
Guy.

That table has two sections, “Core” and “Legacy”: https://developer.apple.com/graphicsimaging/opengl/capabilities/GLInfo_1090.html
I’m not sure of the difference, but under Legacy the version is indicated as 2.1, which seems correct.

Ahh, see here: https://developer.apple.com/library/mac/documentation/graphicsimaging/conceptual/opengl-macprogguide/opengl_pixelformats/opengl_pixelformats.html

Apparently your app can request which feature set (legacy vs. core) it’s run under.

Ah yes, I was not aware of the Legacy chart.

That explains my results indeed.

Thanks a lot Michael.

Cheers,
Guy.

Any ideas how to make the request?

This versioning is something that has always confused me. The Xojo docs imply its OpenGL module implements v1.1, but I find calls in there for most of the point sprite functions, which are v1.5.

[quote=73941:@Will Shank]Any ideas how to make the request?

This versioning is something that has always confused me. The Xojo docs imply its OpenGL module implements v1.1, but I find calls in there for most of the point sprite functions, which are v1.5.[/quote]
This is independent of Xojo.

Choosing an OpenGL version and a profile requires, in this order (a rough sketch of the Windows side follows this list):

  1. to first retrieve platform-specific OpenGL utility functions and possibly some extensions (WGL on Windows, GLX on Linux, CGL/Cocoa on Mac).
  2. to create a ‘surface’ with a specific ‘Pixel Format’ (color bits, acceleration, sRGB, multisampling, etc.).
  3. to create an OpenGL ‘Context’ on this surface with a specific ‘Profile’ (OpenGL version, Legacy/Core, Debug, …).
  4. to bind the OpenGL functions up to a specific version from this context (Windows/Linux). On Mac - from what I understand - you don’t need to ‘bind’ the functions, just use Declares, but this is precisely what I’m working on as I’m experimenting with OpenGL on Mac for the first time.
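
To make steps 2 and 3 concrete, here is a rough Windows-only sketch using raw Win32 declares (a sketch, not a definitive implementation; it assumes it runs in a Window method so Self.Handle is a valid HWND, and the constants follow the Win32 headers):

[code]' Rough Windows-only sketch of steps 2 and 3, assuming this runs in a
' Window method (so Self.Handle is a valid HWND). Error handling omitted.
Soft Declare Function GetDC Lib "user32" (hwnd As Integer) As Ptr
Soft Declare Function ChoosePixelFormat Lib "gdi32" (hdc As Ptr, ppfd As Ptr) As Int32
Soft Declare Function SetPixelFormat Lib "gdi32" (hdc As Ptr, fmt As Int32, ppfd As Ptr) As Boolean
Soft Declare Function wglCreateContext Lib "opengl32" (hdc As Ptr) As Ptr
Soft Declare Function wglMakeCurrent Lib "opengl32" (hdc As Ptr, hglrc As Ptr) As Boolean

' Step 2: describe the surface with a PIXELFORMATDESCRIPTOR (40 bytes).
Dim pfd As New MemoryBlock(40)
pfd.UInt16Value(0) = 40   ' nSize
pfd.UInt16Value(2) = 1    ' nVersion
pfd.UInt32Value(4) = &h25 ' PFD_DRAW_TO_WINDOW Or PFD_SUPPORT_OPENGL Or PFD_DOUBLEBUFFER
pfd.UInt8Value(8) = 0     ' iPixelType = PFD_TYPE_RGBA
pfd.UInt8Value(9) = 32    ' cColorBits
pfd.UInt8Value(23) = 24   ' cDepthBits

Dim hdc As Ptr = GetDC(Self.Handle)
Dim fmt As Int32 = ChoosePixelFormat(hdc, pfd)
Call SetPixelFormat(hdc, fmt, pfd)

' Step 3: a 'basic' (legacy) context; a Core Profile needs the
' 'extended' WGL functions (wglCreateContextAttribsARB & co).
Dim ctx As Ptr = wglCreateContext(hdc)
Call wglMakeCurrent(hdc, ctx)[/code]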

The Xojo OpenGL Module has its own function bindings, and while it is supposed to support even the most basic version, it lacks OpenGL.glGetString for instance, which is part of OpenGL v1.0. Personally I don’t rely on this module and I’ve re-created bindings for all OpenGL versions.
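
For example, a minimal Declare for the missing glGetString could look like this (the per-platform library names are my assumption, and a current OpenGL context is required before calling it):

[code]' Minimal sketch: declaring the missing glGetString yourself.
' Library names per platform are assumptions; a current OpenGL
' context is required before calling it.
#If TargetMacOS Then
  Soft Declare Function glGetString Lib "OpenGL.framework" (name As UInt32) As CString
#ElseIf TargetWin32 Then
  Soft Declare Function glGetString Lib "opengl32" (name As UInt32) As CString
#Else
  Soft Declare Function glGetString Lib "libGL.so.1" (name As UInt32) As CString
#EndIf

Const GL_VERSION = &h1F02
System.DebugLog("OpenGL version: " + glGetString(GL_VERSION))[/code]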

Cheers,
Guy.

Thanks Guy, this has clarified a lot.

I’ve been using the Xojo Module and OpenGLSurface on Mac with a few declares for glGetString, glPointParameteri, gluLookAt and the gluQuadrics. The version reported by glGetString is 2.1, which matches the Legacy value in that chart. I’m guessing the declares work because they’re in the legacy set and that’s the kind of Context Xojo is creating (but only partially exposing).

I’ve never used OpenGLSurface.Configure before. I think it’s used for #2, setting the Pixel Format; can it also do #3, setting the Profile to Core? Michael’s link to Buffer Attributes (which I’m still processing) indicates passing kCGLOGLPVersion_3_2_Core will set the Core Profile, but I haven’t got that working.

Even if I can set the Profile to Core and get OpenGL v4.1, it seems the functions there are totally different: the fixed pipeline is gone and it’s all shader based. So the Xojo OpenGL Module wouldn’t work with it anyway; it’d have to be all declares, right?

Any plans to release or sell this?

Thanks again. That outline and Michael’s links are filling in the big picture big time.

When I run System Information > Frameworks on my 15" MBPro I get:

OpenGL:

Version: 9.6.0
Obtained from: Apple
Last Modified: 8/02/2014 9:30 pm
Kind: Intel
64-Bit (Intel): Yes
Signed by: Software Signing, Apple Code Signing Certification Authority, Apple Root CA
Get Info String: OpenGL 9.6.0.0.0
Location: /System/Library/Frameworks/OpenGL.framework
Private: No

Weird huh?

@David:
Actually, this is the version of the Apple OpenGL software renderer.

To get a better view of the different supported versions, there is GLView, available for free on the App Store:
https://itunes.apple.com/app/opengl-extensions-viewer/id444052073?mt=12#

It indeed gives all the correct versions: 2.1 for Legacy and 4.1 for Core on my MBPro.

With all this information (and the charts, thanks again Michael) I can now summarize:

  1. It is safe to assume OpenGL Legacy v2.1 is available on all MacBooks sold nowadays.
  2. It is safe to assume OpenGL Core of at least v3.3 is available on all MacBooks sold nowadays.
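
If you prefer to verify at runtime rather than trust the charts, a quick sanity check on the version string could look like this (a sketch, Mac declare shown; GL_VERSION strings start with ‘major.minor’ and a current context is required):

[code]Soft Declare Function glGetString Lib "OpenGL.framework" (name As UInt32) As CString

Dim version As String = glGetString(&h1F02) // GL_VERSION
Dim parts() As String = Split(version, ".")
Dim major As Integer = Val(parts(0)) // first number = major version
If major >= 3 Then
  System.DebugLog("Core Profile context: " + version)
Else
  System.DebugLog("Legacy context: " + version)
End If[/code]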

@Will:
I’m working on a product I plan to sell in the near future. All bindings to OpenGL will be included, but I may release those bindings separately as well, for free.

Regarding OpenGLSurface.Configure, it won’t suffice to set OpenGL Profiles. It only gives the opportunity to (slightly) adjust the ‘basic’ pixel format used by ‘basic’ OpenGL contexts. To go one step further, you use this ‘basic’ context to retrieve the OpenGL extensions (in fact the WGL/GLX extensions on Windows/Linux; for Mac I have yet to understand precisely how it works) needed to set up advanced Pixel Formats and Profiles.

The kCGLOGLPVersion_3_2_Core constant you mention has to be used with the Profile setup, not the Pixel Format (but again, on Mac I’m trying to figure out how this works, as it seems to be a different approach than on Windows/Linux).

Then you are right: using a Core Profile is a totally different story than using Legacy OpenGL, and you need to retrieve all the functions by yourself.

Declare won’t help though, at least on Windows/Linux (again, I have to figure out how it works on Mac :wink: ), because those OpenGL extensions have to be queried at runtime (with wglGetProcAddress() for instance on Windows), and only once you have set up your OpenGL context with the right Pixel Format and Profile. This is very important, as a function pointer may differ from one profile to another for the same function. That explains why you can’t use Declare (except perhaps on Mac, but again… :wink: ).
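
To illustrate, here is a sketch of that runtime binding with a Xojo Delegate on Windows (the delegate type name is mine; it would be declared at module scope, and the context must already be current):

[code]' At module scope:
'   Delegate Sub GlUseProgramFunc(program As UInt32)

Soft Declare Function wglGetProcAddress Lib "opengl32" (name As CString) As Ptr

Dim p As Ptr = wglGetProcAddress("glUseProgram")
If p <> Nil Then
  ' Convert the raw function pointer into a callable Xojo delegate.
  Dim glUseProgram As GlUseProgramFunc = GlUseProgramFunc(p)
  glUseProgram.Invoke(0) ' 0 = unbind any shader program
End If[/code]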

Cheers,
Guy.

Looking forward to it :slight_smile:

I’ve managed to get v4.1 using both Configure and pure declares. I went the declare route first, and it came down to getting the NSOpenGLPixelFormat attributes right.

Following examples here, I created an NSOpenGLView, initing it with a pixel format, which creates the context. Then in drawRect it just clears the color buffer; a call to glFlush is needed to see the drawing, and glGetString reports 4.1.

Seeing that working, I copied the same attributes MemoryBlock to an OpenGLSurface’s Configure and voilà, it worked. At first I didn’t think so, but after adding glFlush it showed up. I guess OpenGLSurface stops flushing for you if you configure?!

Here’s code for the Configure route. Drop an OpenGLSurface on a window and implement these events; the debug log will show the version.

[code]Function Configure() As MemoryBlock

dim attribs As new MemoryBlock(100)
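// remaining bytes stay zero, which terminates the attribute list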
attribs.UInt32Value(0) = 99 //NSOpenGLPFAOpenGLProfile
attribs.UInt32Value(4) = &h3200 //NSOpenGLProfileVersion3_2Core
return attribs

End Function

Function Render() As Boolean

OpenGL.glClearColor(1, 0, 1, 0)
OpenGL.glClear(OpenGL.GL_COLOR_BUFFER_BIT)
OpenGL.glFlush

soft declare function glGetString lib "OpenGL.framework" (enumName As integer) As CString
System.DebugLog("Version: " + glGetString(OpenGL.GL_VERSION))

End Function[/code]

This is as far as I’ve got; using pure declares just sets up the same thing, I believe. A few things still work with the 4.1 context, like clearing and scissoring, but drawing a line with glVertex doesn’t, of course; it appears to just be ignored.

Oh, you have to add the Double Buffer attribute and then glFlush is automatically called…

add
attribs.UInt32Value(8) = 5 //NSOpenGLPFADoubleBuffer

remove
OpenGL.glFlush

Ah, thanks Will, it’s nice to see you can configure Profiles from the Configure event on Mac OS X. (In the documentation there is no information regarding this attribute list, whether it is used with CGL or NSOpenGL on Mac. Seems to be NSOpenGL then.)

Unfortunately, this is not the case on Windows, as the PIXELFORMATDESCRIPTOR is not enough to set up advanced OpenGL settings and Profiles; for this we also have to create an attribute list and use ‘extended’ WGL functions (so a ‘basic’ context must be created first, after Configure, in order to retrieve these functions).
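
For reference, that extended path revolves around wglCreateContextAttribsARB, which you retrieve from a ‘basic’ current context and feed with an attribute list (a sketch; the delegate type name is mine and the constants come from the WGL_ARB_create_context extension):

[code]' At module scope:
'   Delegate Function CreateContextAttribsARB(hdc As Ptr, share As Ptr, attribs As Ptr) As Ptr

Soft Declare Function GetDC Lib "user32" (hwnd As Integer) As Ptr
Soft Declare Function wglGetProcAddress Lib "opengl32" (name As CString) As Ptr

' Attribute list: (name, value) pairs terminated by 0; the last four
' bytes of the MemoryBlock stay zero.
Dim attribs As New MemoryBlock(28)
attribs.Int32Value(0) = &h2091  ' WGL_CONTEXT_MAJOR_VERSION_ARB
attribs.Int32Value(4) = 3
attribs.Int32Value(8) = &h2092  ' WGL_CONTEXT_MINOR_VERSION_ARB
attribs.Int32Value(12) = 2
attribs.Int32Value(16) = &h9126 ' WGL_CONTEXT_PROFILE_MASK_ARB
attribs.Int32Value(20) = 1      ' WGL_CONTEXT_CORE_PROFILE_BIT_ARB

Dim hdc As Ptr = GetDC(Self.Handle) ' assumes a Window method
Dim p As Ptr = wglGetProcAddress("wglCreateContextAttribsARB")
If p <> Nil Then
  Dim create As CreateContextAttribsARB = CreateContextAttribsARB(p)
  Dim coreCtx As Ptr = create.Invoke(hdc, Nil, attribs)
  ' make coreCtx current with wglMakeCurrent as usual
End If[/code]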

Personally I’ve chosen a consistent way of doing things across platforms, and for Mac I’ve decided to use CGL over NSOpenGL, CGL being closer to the ‘usual’ APIs of the other platforms.

I’ve also managed to retrieve OpenGL extensions through the use of dlopen/dlsym, which is the way recommended by Apple (instead of using NSOpenGL for this).
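
A minimal sketch of that route (the framework path is the standard system location; RTLD_LAZY = 1):

[code]Soft Declare Function dlopen Lib "libSystem.dylib" (path As CString, mode As Int32) As Ptr
Soft Declare Function dlsym Lib "libSystem.dylib" (handle As Ptr, symbol As CString) As Ptr

Dim gl As Ptr = dlopen("/System/Library/Frameworks/OpenGL.framework/OpenGL", 1)
If gl <> Nil Then
  Dim p As Ptr = dlsym(gl, "glUseProgram")
  If p <> Nil Then
    System.DebugLog("glUseProgram resolved, ready to bind to a Delegate")
  End If
End If[/code]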

Yes, forget about glVertex with OpenGL Core Profiles :slight_smile:

You now have to send your geometry through buffers, use vertex shaders to transform it (and do all the transforms yourself), and use fragment shaders to display it… Although this leads to a much more powerful pipeline, for simple things it’s not really ideal.

Cheers,
Guy.

OK, thanks Guy. I think I understand, but it’s somewhat over my head, I mean the distinction between CGL/NSOpenGL, linking extensions, etc. Whew. As far as getting a 4.1 context on Mac, it appears the choice lives in the pixel format thingy though. I guess this is unusual, because I remember reading someone complain about how the pixel format is a poor place for this setting.

If it helps, this code creates a 4.1 CGL context by having the profile attribute in the pixel format. At least I think that’s what happens! Going to step back to OpenGL 2.1 now, learn shaders and wait for your framework :slight_smile:

[code]Sub Action()
const ogl = "OpenGL.framework"
soft declare function CGLChoosePixelFormat lib ogl (pattribs As Ptr, ppix As Ptr, pnpix As Ptr) As UInt32
soft declare function CGLCreateContext lib ogl (ppix As Ptr, pshare As Ptr, pctx As Ptr) As UInt32
soft declare function CGLSetCurrentContext lib ogl (pctx As Ptr) As UInt32
soft declare function glGetString lib ogl (enumName As integer) As CString
dim e0, e1, e2 As UInt32

dim attribs As new MemoryBlock(12)
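// last four bytes stay zero, terminating the attribute list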
attribs.UInt32Value(0) = 99 //kCGLPFAOpenGLProfile
attribs.UInt32Value(4) = &h3200 //kCGLOGLPVersion_3_2_Core
'attribs.UInt32Value(4) = &h1000 //kCGLOGLPVersion_Legacy

dim pix As new MemoryBlock(8)
dim npix As new MemoryBlock(8)
dim ctx As new MemoryBlock(8)

e0 = CGLChoosePixelFormat(attribs, pix, npix)
e1 = CGLCreateContext(pix.Ptr(0), nil, ctx)
e2 = CGLSetCurrentContext(ctx.Ptr(0))
System.DebugLog("cgl Version: " + glGetString(OpenGL.GL_VERSION))

End Sub[/code]

Thanks, Will, for all the Mac OpenGL related information.

After further tests, it seems sufficient - on Mac OS X - to indeed set up the attribute list in the Configure event to enable any wanted OpenGL version / profile. The rest can be done with standard OpenGL functions. So I’ve dropped my calls to CGL and only use the Configure event now. It seems to work perfectly well.

On Windows it is another story as - indeed - the pixel format descriptor is not enough to set up OpenGL versions / profiles. But I’ve got it covered on this platform, and everything is working perfectly well there too. I still have to run more tests to check that everything works correctly on Linux as well, as the procedure is pretty similar to the Windows one.

So far, so good then.

Regarding any ‘framework’, what I’m working on is a little tool to generate Xojo OpenGL ‘headers’ (in fact Modules) from the Khronos OpenGL registry, which is available in XML format. This tool will be able to directly download this file, parse it, and let you choose up to which OpenGL version you want the bindings and for which profile (core or compatibility), and it will generate a Xojo XML Module with all the needed Declares (for up to OpenGL v1.1) and Delegates (OpenGL v1.2 and up). This tool will be available commercially for an insignificant price, but the generated Modules will be free to redistribute.
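
Purely as an illustration of the shape such a generated Module could take (this is a guess at the output, not the actual generated code):

[code]' OpenGL v1.1 and below: a plain Declare (Mac shown).
Soft Declare Sub glBindTexture Lib "OpenGL.framework" (target As UInt32, texture As UInt32)

' OpenGL v1.2 and up (here v1.3's glActiveTexture): a Delegate,
' declared at module scope...
'   Delegate Sub GlActiveTextureFunc(texture As UInt32)
' ...and bound once a context with the right profile is current, via a
' hypothetical per-platform resolver (wglGetProcAddress, glXGetProcAddress
' or dlsym under the hood):
'   glActiveTexture = GlActiveTextureFunc(LookupEntryPoint("glActiveTexture"))[/code]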

This is just a start for other upcoming products. The roadmap is to have a solid OpenGL basis in Xojo to build upon. Next planned is a hardware-accelerated (through OpenGL) 2D drawing library based on OpenVG, which will in turn be used to create a hardware-accelerated Xojo Graphics object, similar to the Xojo/Canvas Graphics one but usable on a specialized OpenGLSurface we will release as well.

And this will be another ‘basis’ on which to build an entire new Xojo hardware-accelerated GUI system, with several Layout systems and, of course, Controls.

This will then give us the means to build the applications we have in mind. But I won’t talk about those yet.

Cheers,
Guy.