
OpenGL ES texture combines instead of replaces (works on device, not on Simulator)

https://www.devze.com 2023-03-28 10:20 Source: web

I'm repeatedly rendering a UIView to an OpenGL texture. Things work well on the device (the texture updates as expected). On the Simulator the texture is initially correct (correct alpha and colour); however, subsequent updates seem to combine with the existing texture (as if 'pasted' on top of it) instead of replacing it, gradually producing an ugly mess.

Some (possibly) relevant context:

  • I'm using OpenGL ES 1.1
  • I'm running Xcode 4.0.2 (Build 4A2002a) on OS X 10.6.8 on a 2007 MacBook Pro (Radeon X1600 video)
  • The project uses iOS SDK 4.3 and the deployment target is iOS 4.0

Here is the code which renders the view to the texture (the same code is responsible for the initial render and subsequent updates).

// render UIView to pixel buffer
GLubyte *pixelBuffer = (GLubyte *)malloc(4 * width * height);
CGColorSpaceRef colourSpace = CGColorSpaceCreateDeviceRGB();
CGContextRef context = CGBitmapContextCreate(pixelBuffer, width, height, 8, 4 * width, colourSpace, kCGImageAlphaPremultipliedLast | kCGBitmapByteOrder32Big);
CGColorSpaceRelease(colourSpace);
[view.layer renderInContext:context];
CGContextRelease(context);

// replace OpenGL texture with pixelBuffer data
glBindTexture(GL_TEXTURE_2D, textureId);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0, GL_RGBA, GL_UNSIGNED_BYTE, pixelBuffer);
free(pixelBuffer);

Initially I was not worried about the difference between Simulator and device; however, I now need to make instructional videos using the Simulator.

(Interestingly, the overwriting shows more RGB noise when the simulated device is set to iPhone than when it is set to iPhone (Retina).)


I came across a similar problem myself. I don't know why the implementation differs between the Simulator and the device, but what worked for me was making sure the pixel buffer was zeroed before using it. If the texture I was loading had completely transparent pixels, the Simulator wasn't bothering to set the values for those pixels!

So, try using calloc instead of malloc, which initializes the memory to zeros, i.e. something like...

GLubyte *pixelBuffer = (GLubyte *)calloc(4 * width * height, sizeof(GLubyte));

...or memset...

GLubyte *pixelBuffer = (GLubyte *)malloc(4 * width * height);
memset(pixelBuffer, 0, 4 * width * height);
