How to load OpenGL texture from ARGB NSImage without swizzling?

I'm writing an app for Mac OS >= 10.6 that creates OpenGL textures from images loaded from disk.

First, I load the image into an NSImage. Then I get the NSBitmapImageRep from the image and load the pixel data into a texture using glTexImage2D.

For RGB or RGBA images, it works perfectly. I can pass in either 3 bytes/pixel of RGB, or 4 bytes of RGBA, and create a 4-byte/pixel RGBA texture.

However, I just had a tester send me a JPEG image (shot on a Canon EOS 50D, not sure how it was imported) that seems to have ARGB byte ordering.

I found a post in this thread (http://www.cocoabuilder.com/archive/cocoa/12782-coregraphics-over-opengl.html) that suggests specifying a format parameter of GL_BGRA to glTexImage2D, and a type of GL_UNSIGNED_INT_8_8_8_8_REV.

That seems logical, and seems like it should work, but it doesn't. I get different, but still wrong, color values.

I wrote "swizzling" (manual byte-swapping) code that shuffles the ARGB image data into a new RGBA buffer, but this byte-by-byte swizzling is going to be slow for large images.
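
That per-pixel shuffle looks roughly like this (just a sketch of what I described; the buffer names and function are placeholders, and both buffers are width * height * 4 bytes):

#include <stddef.h>

/* Sketch of the byte-by-byte shuffle described above: copy ARGB pixels
   into a separate RGBA buffer. Hypothetical names and signature. */
static void argb_to_rgba(const unsigned char *argb, unsigned char *rgba,
                         size_t width, size_t height)
{
    for (size_t i = 0; i < width * height; i++) {
        const unsigned char *s = argb + i * 4;  /* A, R, G, B */
        unsigned char       *d = rgba + i * 4;  /* R, G, B, A */
        d[0] = s[1];  /* R */
        d[1] = s[2];  /* G */
        d[2] = s[3];  /* B */
        d[3] = s[0];  /* A */
    }
}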

I would also like to understand how to make this work "the right way".

What is the trick to loading ARGB data into an RGBA OpenGL texture?

My current call to glTexImage2D looks like this:

glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, newWidth, newHeight, 0, format, GL_UNSIGNED_BYTE, pixelBuffer);

where format is either GL_RGB or GL_RGBA.

I tried using:

glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, newWidth, newHeight, 0, GL_BGRA, GL_UNSIGNED_INT_8_8_8_8_REV, pixelBuffer);

when my image rep reports that it is in "alpha first" order.

As a second question, I've also read that most graphics cards' "native" format is GL_BGRA, so creating a texture in that format results in faster texture drawing. The speed of texture drawing is more important than the speed of loading the texture, so "swizzling" the data to BGRA format up-front would be worth it. I tried asking OpenGL to create a BGRA texture by specifying an "internalformat" of GL_RGBA, but that results in a completely black image. My interpretation of the docs makes me expect that glTexImage2D would byte-swap the data as it reads it if the source and internal formats are different, but instead I get an OpenGL error 0x500 (GL_INVALID_ENUM) when I try to specify an "internalformat" of GL_RGBA. What am I missing?


I'm not aware of a way to load the ARGB data directly into the texture, but there is a better workaround than doing the swizzle on the CPU. You can do it very efficiently on the GPU instead:

  1. Load the ARGB data into a temporary RGBA texture.
  2. Render into the target texture by drawing a full-screen quad with this texture and a simple pixel shader.
  3. Continue loading other resources; there is no need to stall the GPU pipeline.

Example pixel shader:

#version 130
uniform sampler2DRect unit_in;  // the temporary texture holding the ARGB bytes
void main() {
    // stored channels are (A,R,G,B); .gbar reorders them into (R,G,B,A)
    gl_FragColor = texture( unit_in, gl_FragCoord.xy ).gbar;
}
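
A rough sketch of step 2, the render-into-the-target-texture pass (everything here is illustrative: swizzle_program would be a program built from the shader above, draw_fullscreen_quad() stands in for whatever quad helper you already have, and a 10.6-era context may need the EXT/ARB framebuffer entry points and GL_TEXTURE_RECTANGLE_ARB instead):

static void swizzle_on_gpu(GLuint argb_tex, GLuint target_tex,
                           GLsizei width, GLsizei height,
                           GLuint swizzle_program)
{
    GLuint fbo = 0;
    glGenFramebuffers(1, &fbo);
    glBindFramebuffer(GL_FRAMEBUFFER, fbo);
    glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                           GL_TEXTURE_2D, target_tex, 0);

    if (glCheckFramebufferStatus(GL_FRAMEBUFFER) == GL_FRAMEBUFFER_COMPLETE) {
        glViewport(0, 0, width, height);
        glUseProgram(swizzle_program);
        glActiveTexture(GL_TEXTURE0);
        glBindTexture(GL_TEXTURE_RECTANGLE, argb_tex);  /* matches sampler2DRect */
        glUniform1i(glGetUniformLocation(swizzle_program, "unit_in"), 0);
        draw_fullscreen_quad();  /* hypothetical helper */
        glUseProgram(0);
    }

    glBindFramebuffer(GL_FRAMEBUFFER, 0);
    glDeleteFramebuffers(1, &fbo);
}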


You're rendering it with OpenGL, right? If you want to do it the easy way, you can have your pixel shader swizzle the colors in realtime. This is no problem at all for graphics cards; they're made to do far more complicated stuff :).

You can use a shader like this:

uniform sampler2D image;
void main()
{
    // sampler2D takes normalized coordinates, so use the interpolated
    // texture coordinate; .gbar reorders the stored (A,R,G,B) into (R,G,B,A)
    gl_FragColor = texture2D(image, gl_TexCoord[0].st).gbar;
}
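
If it helps, here is a minimal sketch of building such a program and pointing its "image" sampler at texture unit 0 (error checking omitted; the function name is made up):

static GLuint make_swizzle_program(const char *fragment_src)
{
    GLuint fs = glCreateShader(GL_FRAGMENT_SHADER);
    glShaderSource(fs, 1, &fragment_src, NULL);
    glCompileShader(fs);

    GLuint program = glCreateProgram();
    glAttachShader(program, fs);
    glLinkProgram(program);
    glDeleteShader(fs);

    /* Bind the program before drawing; the texture on unit 0 feeds "image". */
    glUseProgram(program);
    glUniform1i(glGetUniformLocation(program, "image"), 0);
    return program;
}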

If you don't know about shaders, read this tutorial: http://www.lighthouse3d.com/opengl/glsl/


This question is old, but in case anyone else is looking for this, I found a not-strictly-safe but effective solution. The problem is that each 32-bit pixel value has A as the first byte rather than the last.

NSBitmapImageRep's bitmapData gives you a pointer to that first byte, which you give to OpenGL as the pointer to its pixels. Simply add 1 to that pointer and you point at the RGB values in the right order, with the A of the next pixel at the end.

The problems with this are that the last pixel will take its A value from one byte beyond the end of the image, and that the A values are all off by one pixel. But like the asker, I get this while loading a JPEG, so alpha is irrelevant anyway. This doesn't appear to cause a problem, but I wouldn't claim that it's "safe".
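
In code, the trick looks roughly like this (a sketch; bitmapData is the pointer returned by the image rep, and newWidth/newHeight are the same placeholder names used in the question):

/* Hand OpenGL the bitmap data starting one byte in, so each texel reads
   R,G,B followed by the next pixel's A (the last texel reads one byte
   past the end of the buffer, as noted above). */
const GLubyte *argb = bitmapData;                  /* A,R,G,B, A,R,G,B, ... */
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, newWidth, newHeight, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, argb + 1); /* start at the first R */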


The name of a texture whose data is in ARGB format.

GLuint argb_texture;

An array of tokens to set ARGB swizzle in one function call.

static const GLenum argb_swizzle[] =
{
   GL_GREEN, GL_BLUE, GL_ALPHA, GL_RED
};

Bind the ARGB texture

glBindTexture(GL_TEXTURE_2D, argb_texture);

Set all four swizzle parameters in one call to glTexParameteriv

glTexParameteriv(GL_TEXTURE_2D, GL_TEXTURE_SWIZZLE_RGBA, argb_swizzle);

I know this works, but I am not sure if argb_swizzle is in the right order. Please correct me if this is not right; I am not very clear on how GL_GREEN, GL_BLUE, GL_ALPHA, and GL_RED are determined in argb_swizzle.

As The OpenGL Programming Guide suggested:

...which is a mechanism that allows you to rearrange the component order of texture data on the fly as it is read by the graphics hardware.
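
Putting the pieces together, a minimal sketch of the whole path (GL_TEXTURE_SWIZZLE_RGBA needs OpenGL 3.3 or ARB_texture_swizzle; pixelBuffer, newWidth, and newHeight are the question's placeholder names). If I have the reasoning right: the ARGB bytes are uploaded as if they were RGBA, so the stored R,G,B,A channels actually hold A,R,G,B, and the swizzle tells the sampler to return stored (G, B, A, R), which is the original R,G,B,A.

static GLuint create_argb_texture(const GLvoid *pixelBuffer,
                                  GLsizei newWidth, GLsizei newHeight)
{
    /* output R <- stored G, G <- stored B, B <- stored A, A <- stored R */
    static const GLint argb_swizzle[] = { GL_GREEN, GL_BLUE, GL_ALPHA, GL_RED };

    GLuint argb_texture;
    glGenTextures(1, &argb_texture);
    glBindTexture(GL_TEXTURE_2D, argb_texture);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, newWidth, newHeight, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, pixelBuffer);  /* raw ARGB bytes, no CPU shuffle */
    glTexParameteriv(GL_TEXTURE_2D, GL_TEXTURE_SWIZZLE_RGBA, argb_swizzle);
    return argb_texture;
}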
