Ensure GLSL compatibility

How can I ensure that my GLSL shaders are compatible with most modern cards?

I've got a piece of software where I use GLSL code from here. But even though I've added #version 120 to the beginning of my final shader and made sure it compiles, some users get shader compilation errors on their computers (even though they support OpenGL 3.2).

Is there any tool I can use to "validate" my shaders, or to try compiling them with different "shader compilers"?
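For reference, this is roughly how I check compilation today (a minimal sketch; compile_or_report is just an illustrative name, and I'm assuming GLEW or a similar loader has already set up a GL 2.0+ context):

```c
#include <stdio.h>
#include <GL/glew.h>   /* assumes GLEW, or any loader exposing GL 2.0 */

/* Compile one shader stage and print the driver's info log on failure.
 * Returns the shader object, or 0 if compilation failed. */
static GLuint compile_or_report(GLenum stage, const char *source)
{
    GLuint shader = glCreateShader(stage);
    glShaderSource(shader, 1, &source, NULL);
    glCompileShader(shader);

    GLint ok = GL_FALSE;
    glGetShaderiv(shader, GL_COMPILE_STATUS, &ok);
    if (!ok) {
        char log[4096];
        glGetShaderInfoLog(shader, sizeof log, NULL, log);
        fprintf(stderr, "shader compile error:\n%s\n", log);
        glDeleteShader(shader);
        return 0;
    }
    return shader;
}
```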


There is no tool for validating a shader. Even if there were, it wouldn't be much use to you: what good is a shader that is "valid" if it doesn't run on the hardware you want? You can be as technically correct as you like, but if the hardware rejects your shader, it still won't run.

If a shader of a particular version compiles on one target (call it A) and not on another (call it B), this could be due to one of the following problems:

  1. Target A does not properly implement GLSL. It allowed you to compile something that the spec does not allow. The more compliant target B (or at least, differently non-compliant) rejects your shader, because your shader does not follow the specification.
  2. Target B is non-compliant with the specification. You are feeding it a legitimate shader and it is rejecting it.
  3. Target B does not support the version of GLSL your shader uses (this is unlikely), except when:
  4. Target B is using the OpenGL core specification, version 3.2 or greater. GLSL 1.20 shaders cannot be run on a core-profile 3.2 OpenGL implementation (see the sketch after this list).
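To make point 4 concrete, here is the same trivial fragment shader written against both versions (texCoord and fragColor are just illustrative names). A conforming core-profile 3.2 context will reject the first form, because gl_FragColor, varying, and texture2D were removed from core GLSL 1.50:

```c
/* The same trivial fragment shader, written for GLSL 1.20 and for
 * GLSL 1.50 core.  A core-profile context rejects the first. */
static const char *frag_120 =
    "#version 120\n"
    "varying vec2 texCoord;\n"
    "uniform sampler2D tex;\n"
    "void main() {\n"
    "    gl_FragColor = texture2D(tex, texCoord);\n"
    "}\n";

static const char *frag_150 =
    "#version 150 core\n"
    "in vec2 texCoord;\n"
    "uniform sampler2D tex;\n"
    "out vec4 fragColor;\n"
    "void main() {\n"
    "    fragColor = texture(tex, texCoord);\n"
    "}\n";
```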

#1 is more likely to happen if you develop solely on NVIDIA hardware. NVIDIA plays a bit fast-and-loose with the OpenGL specification: it takes a few liberties here and there, smoothing over some of the unpleasant things the specification says. That makes for a smoother developer experience, but it also helps keep developers on NVIDIA hardware when their shaders don't run on competitors' cards ;)
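A commonly reported example of this leniency (my illustration, not from the answer above): GLSL 1.10 has no implicit int-to-float conversion (that arrived in 1.20), so a strict compiler rejects the initializer below, while NVIDIA's compiler has historically accepted it:

```c
/* Sloppy code that strict GLSL 1.10 compilers reject but lenient
 * ones have been known to accept. */
static const char *sloppy_110 =
    "#version 110\n"
    "void main() {\n"
    "    float brightness = 1;  /* int literal: no implicit conversion in 1.10 */\n"
    "    gl_FragColor = vec4(brightness);\n"
    "}\n";
```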

#3 is pretty much non-existent, with the noted exception. You linked to a Photoshop shader, so I gather that you are not in control of the creation and management of the OpenGL context. Even so, I highly doubt Photoshop would use a core context; they have too many shaders that need backwards compatibility.

The best way to deal with this is to test on both AMD and NVIDIA hardware (and Intel, if you need to run there). You don't need to test every possible combination of systems; just pick a Radeon HD card and a GeForce 200-series or better. They don't even have to be high-end.
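When a user does report a failure, it also helps to know exactly which driver compiled the shader. Logging the context strings is cheap (a sketch; these queries exist in every OpenGL version, though the GLSL query needs a GL 2.0 header or loader):

```c
#include <stdio.h>
#include <GL/glew.h>

/* Log which vendor/driver actually compiled the shaders, so failure
 * reports from users can be matched to a compiler implementation. */
static void log_gl_strings(void)
{
    printf("GL_VENDOR:   %s\n", (const char *)glGetString(GL_VENDOR));
    printf("GL_RENDERER: %s\n", (const char *)glGetString(GL_RENDERER));
    printf("GL_VERSION:  %s\n", (const char *)glGetString(GL_VERSION));
    printf("GLSL:        %s\n",
           (const char *)glGetString(GL_SHADING_LANGUAGE_VERSION));
}
```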


The problem is that every hardware vendor writes its own GLSL compiler as part of its driver. Even though the language is quite well defined, this leads to inconsistencies in shader parsing. For instance, NVIDIA's drivers often "forgive" certain small mistakes that ATI's compilers catch.

A reasonable solution, I think, is to use NVIDIA's Cg instead (http://developer.nvidia.com/cg-toolkit): this way you make your shaders hardware-agnostic and ensure they run on all hardware.
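For reference, loading a shader through the Cg runtime looks roughly like this (a sketch based on the Cg 2.x/3.x runtime API; "shader.cg" and the entry point main_vp are placeholders):

```c
#include <stdio.h>
#include <Cg/cg.h>
#include <Cg/cgGL.h>

/* Minimal Cg runtime setup: compile a vertex program for the best
 * profile the current GPU supports, then bind it for rendering. */
static int load_cg_vertex_program(const char *path)
{
    CGcontext ctx = cgCreateContext();
    CGprofile profile = cgGLGetLatestProfile(CG_GL_VERTEX);
    cgGLSetOptimalOptions(profile);

    CGprogram prog = cgCreateProgramFromFile(
        ctx, CG_SOURCE, path, profile, "main_vp", NULL);
    if (!prog) {
        fprintf(stderr, "Cg error: %s\n", cgGetErrorString(cgGetError()));
        return -1;
    }
    cgGLLoadProgram(prog);
    cgGLEnableProfile(profile);
    cgGLBindProgram(prog);
    return 0;
}
```

The appeal of this design is that the Cg front end is the same everywhere, so parsing quirks no longer vary per vendor; the trade-off is a dependency on NVIDIA's toolkit.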
