Moving texture OpenGL ES 2.0
I am trying to implement a sprite sheet with 8 columns and 8 rows in OpenGL ES 2.0.
I got the first image working, but I can't figure out how to translate the texture matrix in OpenGL ES 2.0. The OpenGL ES 1.0 code I am looking for an equivalent of is:
gl.glMatrixMode(GL10.GL_TEXTURE);
gl.glLoadIdentity();
gl.glPushMatrix();
gl.glTranslatef(0.0f, 0.2f, 0f);
gl.glPopMatrix();
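(As a point of reference: each frame of an 8 x 8 sheet covers 1/8 of the texture in each direction, so selecting frame (col, row) amounts to offsetting the texture coordinates by (col/8, row/8). A minimal sketch of that bookkeeping; the class and method names are mine, not from the code in this question:)
// Minimal sketch: map a frame index of an 8x8 sprite sheet to the
// texture-coordinate offset of that frame. Names are illustrative only.
public class SpriteSheet {
    public static final int COLS = 8;
    public static final int ROWS = 8;

    /** Returns {sOffset, tOffset} for frame 0..63, counted left to right, top to bottom. */
    public static float[] frameOffset(int frameIndex) {
        int col = frameIndex % COLS;
        int row = frameIndex / COLS;
        // Each frame spans 1/COLS of the texture along s and 1/ROWS along t.
        return new float[] { col / (float) COLS, row / (float) ROWS };
    }
}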
These are the matrices I am using at the moment:
/**
* Store the model matrix. This matrix is used to move models from object space (where each model can be thought
* of being located at the center of the universe) to world space.
*/
private float[] mModelMatrix = new float[16];
/**
* Store the view matrix. This can be thought of as our camera. This matrix transforms world space to eye space;
* it positions things relative to our eye.
*/
private float[] mViewMatrix = new float[16];
/** Store the projection matrix. This is used to project the scene onto a 2D viewport. */
private float[] mProjectionMatrix = new float[16];
/** Allocate storage for the final combined matrix. This will be passed into the shader program. */
private float[] mMVPMatrix = new float[16];
/**
* Stores a copy of the model matrix specifically for the light position.
*/
private float[] mLightModelMatrix = new float[16];
My vertex shader:
uniform mat4 u_MVPMatrix; // A constant representing the combined model/view/projection matrix.
uniform mat4 u_MVMatrix; // A constant representing the combined model/view matrix.
attribute vec4 a_Position; // Per-vertex position information we will pass in.
attribute vec3 a_Normal; // Per-vertex normal information we will pass in.
attribute vec2 a_TexCoordinate; // Per-vertex texture coordinate information we will pass in.
varying vec3 v_Position; // This will be passed into the fragment shader.
varying vec3 v_Normal; // This will be passed into the fragment shader.
varying vec2 v_TexCoordinate; // This will be passed into the fragment shader.
// The entry point for our vertex shader.
void main()
{
// Transform the vertex into eye space.
v_Position = vec3(u_MVMatrix * a_Position);
// Pass through the texture coordinate.
v_TexCoordinate = a_TexCoordinate;
// Transform the normal's orientation into eye space.
v_Normal = vec3(u_MVMatrix * vec4(a_Normal, 0.0));
// gl_Position is a special variable used to store the final position.
// Multiply the vertex by the matrix to get the final point in normalized screen coordinates.
gl_Position = u_MVPMatrix * a_Position;
}
My fragment shader:
precision mediump float; // Set the default precision to medium. We don't need as high of a
// precision in the fragment shader.
uniform vec3 u_LightPos; // The position of the light in eye space.
uniform sampler2D u_Texture; // The input texture.
varying vec3 v_Position; // Interpolated position for this fragment.
varying vec3 v_Normal; // Interpolated normal for this fragment.
varying vec2 v_TexCoordinate; // Interpolated texture coordinate per fragment.
// The entry point for our fragment shader.
void main()
{
// Will be used for attenuation.
float distance = length(u_LightPos - v_Position);
// Get a lighting direction vector from the light to the vertex.
vec3 lightVector = normalize(u_LightPos - v_Position);
// Calculate the dot product of the light vector and vertex normal. If the normal and light vector are
// pointing in the same direction then it will get max illumination.
float diffuse = max(dot(v_Normal, lightVector), 0.0);
// Add attenuation.
diffuse = diffuse * (1.0 / (1.0 + (0.25 * distance)));
// Add ambient lighting
diffuse = diffuse + 0.7;
// Multiply the color by the diffuse illumination level and texture value to get final output color.
gl_FragColor = (diffuse * texture2D(u_Texture, v_TexCoordinate));
}
You need to transform the texture coordinates yourself, and you can do so in one of four places:
- Apply the transform to the original model data.
- Apply the transform on the CPU (not recommended unless you have a good reason, since this is what vertex shaders are for; a CPU-side sketch follows this list).
- Apply the transform in the vertex shader (recommended).
- Apply the transform in the fragment shader.
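For completeness, the CPU-side variant just means rebuilding the texture-coordinate buffer with the frame's offset applied before drawing. A minimal sketch, assuming a single quad whose base coordinates cover the top-left frame of the 8 x 8 sheet (the class and field names are mine):
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.FloatBuffer;

// Minimal sketch: offset a quad's base texture coordinates on the CPU to
// select one frame of an 8x8 sheet. Names are illustrative only.
public class CpuTexCoords {
    // Base (s, t) pairs for one frame at the top-left corner of the sheet.
    private static final float[] BASE = {
        0.0f,   0.0f,
        0.125f, 0.0f,
        0.0f,   0.125f,
        0.125f, 0.125f,
    };

    /** Builds a direct FloatBuffer with the (col, row) frame offset applied. */
    public static FloatBuffer forFrame(int col, int row) {
        float sOffset = col / 8.0f;
        float tOffset = row / 8.0f;
        FloatBuffer buffer = ByteBuffer.allocateDirect(BASE.length * 4)
                .order(ByteOrder.nativeOrder())
                .asFloatBuffer();
        for (int i = 0; i < BASE.length; i += 2) {
            buffer.put(BASE[i] + sOffset);      // s
            buffer.put(BASE[i + 1] + tOffset);  // t
        }
        buffer.position(0);
        return buffer;
    }
}
The buffer would then be handed to glVertexAttribPointer for a_TexCoordinate every time the frame changes, which is exactly the per-frame CPU work the recommended option avoids.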
If you want to apply a translation to the texture coordinates, the most flexible approach is to create a translation matrix with your math library and pass the new matrix to your vertex shader as a uniform (just as you already pass mMVPMatrix and mLightModelMatrix).
You can then multiply the texture coordinates by that translation matrix in the vertex shader and output the result as a varying vector.
Vertex shader:
texture_coordinate_varying = (texture_matrix_uniform * vec4(texture_coordinate_attribute, 0.0, 1.0)).xy;
Fragment shader:
gl_FragColor = texture2D(texture_sampler, texture_coordinate_varying);
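On the Java side, building such a matrix and uploading it could look like the sketch below, using android.opengl.Matrix and GLES20. The uniform name u_TexMatrix and the helper names are assumptions on my part, not part of your original code:
import android.opengl.GLES20;
import android.opengl.Matrix;

// Minimal sketch, assuming the vertex shader gained a "u_TexMatrix" uniform
// that is multiplied against the texture coordinate as shown above. All
// names here are illustrative, not from the original code.
public class TextureMatrixHelper {
    private final float[] textureMatrix = new float[16];

    /** Builds a translation selecting frame (col, row) of an 8x8 sheet and uploads it. */
    public void uploadTextureMatrix(int programHandle, int col, int row) {
        int handle = GLES20.glGetUniformLocation(programHandle, "u_TexMatrix");
        Matrix.setIdentityM(textureMatrix, 0);
        // Each frame is 1/8 of the texture in s and t, so translate by col/8 and row/8.
        Matrix.translateM(textureMatrix, 0, col / 8.0f, row / 8.0f, 0.0f);
        // The program must already be bound with glUseProgram before this call.
        GLES20.glUniformMatrix4fv(handle, 1, false, textureMatrix, 0);
    }
}
With this in place, the pass-through line v_TexCoordinate = a_TexCoordinate; in your vertex shader becomes the multiply shown in the vertex shader snippet above.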
Please note: your GLES 1.0 code does not actually perform the translation, because you surrounded the glTranslatef call with a push and a pop.