I do not understand why the following two shaders produce different results when I render a vertex buffer whose x coordinates are all zero:
First:
attribute vec3 position;
void main() {
    gl_Position = vec4(position.x, position.y, 0.0, 1.0);
}
Second:
attribute vec3 position;
void main() {
    gl_Position = vec4(0.0, position.y, 0.0, 1.0);
}
The result of the first is a line of nine dots. The result of the second is a single dot.
I'm drawing the following vertex array as GL_POINTS:
0.0, -1.00, 0.0,
0.0, -0.75, 0.0,
0.0, -0.50, 0.0,
0.0, -0.25, 0.0,
0.0, 0.00, 0.0,
0.0, 0.25, 0.0,
0.0, 0.50, 0.0,
0.0, 0.75, 0.0,
0.0, 1.00, 0.0
Here are the VBO preparation calls:
var a = new Float32Array([
0.0, -1.00, 0.0,
0.0, -0.75, 0.0,
0.0, -0.50, 0.0,
0.0, -0.25, 0.0,
0.0, 0.00, 0.0,
0.0, 0.25, 0.0,
0.0, 0.50, 0.0,
0.0, 0.75, 0.0,
0.0, 1.00, 0.0
]);
gl.bindBuffer(gl.ARRAY_BUFFER, b);
gl.bufferData(gl.ARRAY_BUFFER, a.byteLength, gl.STATIC_DRAW);
gl.bufferSubData(gl.ARRAY_BUFFER, 0, a);
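(As an aside, this allocate-then-fill upload should be equivalent to a single gl.bufferData call that takes the typed array directly; a minimal sketch, assuming the same gl context and buffer b:
// One-step upload: bufferData with a typed array both allocates
// the buffer store and copies the data into it.
gl.bindBuffer(gl.ARRAY_BUFFER, b);
gl.bufferData(gl.ARRAY_BUFFER, a, gl.STATIC_DRAW);
Either way the buffer ends up with the same contents.)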
Here are the draw calls:
gl.bindBuffer(gl.ARRAY_BUFFER, b);
gl.vertexAttribPointer(p.position, 3, gl.FLOAT, false, 0, 0);
gl.enableVertexAttribArray(p.position);
gl.drawArrays(gl.POINTS, 0, 9);
This is a known problem (sorry, no link for now).
Basically, the GLSL compiler decides that the attribute 'position' is unused, because its first component is never read, and optimizes the attribute away. The vertex data is then never fed to the shader, so every point falls back to the default attribute value and all nine points land on the same spot, which is why you see a single dot. To check this, try the following test:
gl_Position = vec4(0.0, position.y, 0.0, 1.0) + position.x*0.001;
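If the attribute really was optimized away, you can also confirm it from JavaScript: gl.getAttribLocation returns -1 for attributes that are not active in the linked program. A minimal sketch, assuming program is your linked WebGLProgram object (your snippets only show the cached p.position, so the exact variable name is an assumption):
// If the compiler removed 'position', it is no longer an active
// attribute, and getAttribLocation reports -1 for it.
var loc = gl.getAttribLocation(program, 'position');
if (loc === -1) {
    console.log("'position' was optimized out of the program");
}
Passing that -1 on to vertexAttribPointer and enableVertexAttribArray generates an INVALID_VALUE error and leaves the attribute at its default value, which matches the collapsed single-dot result.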
Once you confirm the bug, you can either update your video driver or use an obvious workaround, such as the sketch below.
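For example, a minimal sketch of such a workaround for the second shader: keep a reference to position.x alive with a factor too small to change the output (the 0.0001 factor here is an arbitrary choice):
attribute vec3 position;
void main() {
    // Referencing position.x keeps the attribute active so the
    // compiler cannot optimize it away; the tiny factor leaves
    // the rendered position effectively unchanged.
    gl_Position = vec4(position.x * 0.0001, position.y, 0.0, 1.0);
}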