Well I tried the hopefully correct variant:
context = glXCreateContext(display, visual, None, GL_TRUE);
if (!context || glGetError() != GL_NO_ERROR)
    context = glXCreateContext(display, visual, None, GL_FALSE);
but on a Linux box in our department this does not work: after the
first line bloody OpenGL claims the context is fine and that there are
no errors, but then proceeds to fall over later. So I'm going to have
to put an #ifdef in. I imagine this will not work in all circumstances
either, so a warning to everyone: for some BadMatch errors, swapping
GL_FALSE to GL_TRUE or vice versa will fix the problem:
#ifdef LINUX
context = glXCreateContext(display, visual, None, GL_FALSE);
#else
context = glXCreateContext(display, visual, None, GL_TRUE);
#endif
On the Mac GL_TRUE seems to be required, and from what Bruce Ray says
the same holds on Solaris 7; on Solaris 9 both variants seem to work
(though I am displaying remotely, which might make the difference). So
far on Linux GL_FALSE seems to work. GL_TRUE requests direct rendering,
so it ought to be faster where it works, but even that does not seem to
hold reliably.
Wayne
On Fri, 26 Nov 2004, Wayne Boucher wrote:
> Hello,
>
> A month or two ago we changed a boolean flag in one of the glX calls to
> FALSE from TRUE (because some Linux was happy somehow or other, I forget
> now, it might have been over a network). Well it turns out that in OSX
> (at least on the machine I was trying) it needs to be TRUE, otherwise you
> get an X BadMatch error. Lines 401-404 in
> ccpnmr/ccpnmr1.0/c/ccpnmr/global/gl_handler.c currently say:
>
> /*
> context = glXCreateContext(display, visual, None, GL_TRUE);
> */
> context = glXCreateContext(display, visual, None, GL_FALSE);
>
> and on the Mac we need that second GL_FALSE to be GL_TRUE. I think I'm
> going to change that code to first try GL_TRUE (praying that it returns
> NULL if that doesn't work) and then try GL_FALSE. Hopefully that will
> keep all parties happy.
>
> Wayne
>