Technical: Depth buffer in OpenGL
I just thought I’d post this up here, in case other people had the same problem.
The Megastrata project started out like many first-time projects: cobbled together from examples found on the internet. One long-standing development tradition is the so-called “hack-n-slash” technique: smash the code around until it works.
Megastrata started life as the NoiseViewer, which was, true to its name, simply a 2D noise pattern viewer. Hence, there was originally no need for a depth buffer.
When the leap was made to 3D, I got it working on Windows, but the OS X version didn’t render the same way. Wolfgang put a lot of effort into debugging it, and after sending screenshots back and forth, we figured out that the problem was with the depth buffer.
I saw something suspicious when I looked up the OpenGL FAQ on the subject. Step one reads:
1. Ask for a depth buffer when you create your window.
Evidently, Windows provides a depth buffer automatically when the window is initialized, whereas OS X does not do this by default.
We changed the line in the code that read:
glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB);
to:
glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB | GLUT_DEPTH);
and the problem was solved.
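For anyone running into the same symptom, here is a minimal GLUT sketch (not the actual Megastrata code; the window title and callback are placeholders) showing the three pieces that have to line up: requesting GLUT_DEPTH at init, enabling depth testing with glEnable(GL_DEPTH_TEST), and clearing GL_DEPTH_BUFFER_BIT every frame.

/* Minimal sketch: requesting and using a depth buffer with GLUT.
   On OS X the header is <GLUT/glut.h>; elsewhere it is <GL/glut.h>. */
#include <GL/glut.h>

static void display(void)
{
    /* Clear the depth buffer along with the color buffer each frame. */
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

    /* ... draw 3D geometry here ... */

    glutSwapBuffers();
}

int main(int argc, char **argv)
{
    glutInit(&argc, argv);

    /* Ask for a depth buffer explicitly; OS X will not hand you one otherwise. */
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB | GLUT_DEPTH);

    glutInitWindowSize(800, 600);
    glutCreateWindow("depth buffer demo");

    /* Requesting the buffer is not enough: depth testing must also be enabled. */
    glEnable(GL_DEPTH_TEST);

    glutDisplayFunc(display);
    glutMainLoop();
    return 0;
}

If any one of the three is missing, you get the classic symptom: geometry drawn in submission order rather than by depth, which looks fine on one platform and wrong on another.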