OpenGL ES 2.0 wrong depth buffer bits

I downloaded Apple's GLEssentials sample code. I wanted to do some experiments with the depth buffer, so the first thing I decided to check was the depth buffer's bit depth.

I added the following code to the -initWithDefaultFBO method in OpenGLRenderer.m:

// code from sample
NSLog(@"%s %s\n", glGetString(GL_RENDERER), glGetString(GL_VERSION));

// buffer bits check
GLint depthBits;
glGetIntegerv(GL_DEPTH_BITS, &depthBits);
printf("depthBits: %d\n", depthBits);

I got the following output:

 GLEssentials[3630:112826] Apple Software Renderer OpenGL ES 2.0 APPLE-12.4.2 
 depthBits: 24
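
Side note: GL_DEPTH_BITS describes the currently bound framebuffer, and the same glGetIntegerv query covers the other bit-depth values too. A minimal sketch, not part of the sample:

// check the remaining bit depths of the currently bound framebuffer
GLint redBits, greenBits, blueBits, alphaBits, stencilBits;
glGetIntegerv(GL_RED_BITS, &redBits);
glGetIntegerv(GL_GREEN_BITS, &greenBits);
glGetIntegerv(GL_BLUE_BITS, &blueBits);
glGetIntegerv(GL_ALPHA_BITS, &alphaBits);
glGetIntegerv(GL_STENCIL_BITS, &stencilBits);
printf("color bits: %d/%d/%d/%d, stencil bits: %d\n",
       redBits, greenBits, blueBits, alphaBits, stencilBits);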

But in ES2Renderer.m I see the following line:

// using 16-bit depth buffer
glRenderbufferStorage(GL_RENDERBUFFER, GL_DEPTH_COMPONENT16, backingWidth, backingHeight);
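
For context, that call is one step of the usual ES2 depth renderbuffer setup. A minimal sketch of the surrounding calls, with hypothetical variable names (backingWidth and backingHeight come from the sample):

// create a renderbuffer, request 16-bit depth storage for it,
// and attach it as the framebuffer's depth buffer
GLuint depthRenderbuffer;
glGenRenderbuffers(1, &depthRenderbuffer);
glBindRenderbuffer(GL_RENDERBUFFER, depthRenderbuffer);
glRenderbufferStorage(GL_RENDERBUFFER, GL_DEPTH_COMPONENT16, backingWidth, backingHeight);
glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT, GL_RENDERBUFFER, depthRenderbuffer);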

Why does the query report 24 bits when only 16 were requested? Is it a bug?

PS: I have only tested on the iOS Simulator, since I don't own an iOS device.

The spec says:

An OpenGL ES implementation may vary its allocation of internal component resolution based on any RenderbufferStorage parameter (except target), but the allocation and chosen internal format must not be a function of any other state and cannot be changed once they are established. The actual resolution in bits of each component of the allocated image can be queried with GetRenderbufferParameteriv.

Basically, an OpenGL ES implementation is allowed to choose a different bit depth from the one you requested.
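
As the quoted paragraph says, the resolution that was actually allocated can be read back with GetRenderbufferParameteriv. A minimal sketch, assuming a hypothetical depthRenderbuffer handle for the depth renderbuffer:

// query the depth resolution the implementation actually allocated
GLint allocatedDepthBits;
glBindRenderbuffer(GL_RENDERBUFFER, depthRenderbuffer); // hypothetical handle
glGetRenderbufferParameteriv(GL_RENDERBUFFER, GL_RENDERBUFFER_DEPTH_SIZE, &allocatedDepthBits);
printf("allocated depth bits: %d\n", allocatedDepthBits);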

I suspect that on an actual device a real 16-bit depth buffer would be used; the query above would confirm it.