Projecting and offsetting vertices in shaders in OpenGL 3.3 (OpenTK)
I am trying to build a polygon class for OpenGL using OpenTK. The goal is to create an instance of the class, pass it an array of vertices in pixel coordinates, and have it drawn to the screen correctly.
The way I intend to achieve this is with a projection matrix (defined as a public variable in the Screen class), which the shader should use to rescale everything from pixel coordinates to NDC. I also intend to pass an offset vector to the shader, which gets added to each position.
The projection is computed with:
Matrix4.CreateOrthographicOffCenter(0.0f, width, 0.0f, height, -100.0f, 100.0f);
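As a sanity check of what this matrix is expected to do, the mapping of an off-center orthographic projection can be reproduced with a few lines of NumPy. This is the standard glOrtho formula, purely illustrative math rather than OpenTK's implementation, and the 800x600 window size is an assumption:

```python
import numpy as np

def ortho_off_center(left, right, bottom, top, near, far):
    """Standard glOrtho matrix, laid out for M @ v with column vectors."""
    m = np.identity(4)
    m[0, 0] = 2.0 / (right - left)
    m[1, 1] = 2.0 / (top - bottom)
    m[2, 2] = -2.0 / (far - near)
    m[0, 3] = -(right + left) / (right - left)
    m[1, 3] = -(top + bottom) / (top - bottom)
    m[2, 3] = -(far + near) / (far - near)
    return m

width, height = 800, 600  # assumed window size
proj = ortho_off_center(0.0, width, 0.0, height, -100.0, 100.0)

# Pixel (0, 0) maps to the NDC corner (-1, -1); (width, height) maps to (1, 1).
print(proj @ np.array([0.0, 0.0, 0.0, 1.0]))       # -> [-1. -1.  0.  1.]
print(proj @ np.array([width, height, 0.0, 1.0]))  # -> [ 1.  1.  0.  1.]
```

So pixel coordinates inside the window land inside the NDC cube, as intended.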
The vertices used for the polygon are:
float[] vertices = new float[]
{
    0.0f, 100f, 0f,
    0.0f, 0.0f, 0f,
    100f, 0.0f, 0f,
    100f, 100f, 0f,
};
This is what my vertex shader looks like:
#version 330 core
layout (location = 0) in vec3 position;

uniform mat4 projection;
uniform vec3 offset;

void main()
{
    gl_Position = projection * vec4(position + offset, 1.0);
}
This is the Polygon class:
using System;
using System.Collections.Generic;
using System.Diagnostics;
using System.Text;
using OpenTK;
using OpenTK.Graphics;
using OpenTK.Graphics.OpenGL;
using Shaders;
using Engine;

namespace Engine.Engine.Shape
{
    class Polygon
    {
        float[] vertices;
        List<uint> indexes;
        uint[] indices;
        int VBO, VAO, EBO;
        Shader shader;
        Matrix4 projection;
        Vector3 offset;

        public Polygon(float[] vertices)
        {
            this.vertices = vertices;
            this.projection = Screen.projection;
            offset = new Vector3(10, 10, 0);
            // This creates an index array for the EBO to use
            indexes = new List<uint>();
            for (uint curr_vert = 1; curr_vert < vertices.Length / 3 - 1; curr_vert++)
            {
                // For each triangle of the polygon, I select a fixed vertex (index 0), the current vertex and the next vertex
                uint[] triangle = new uint[] { 0, curr_vert, curr_vert + 1 };
                foreach (uint x in triangle)
                {
                    // I use this list to make life easier
                    indexes.Add(x);
                }
            }
            // Now that the list is complete I convert it into an array
            indices = indexes.ToArray();
        }

        public void Load()
        {
            string path = "../../../Shaders/";
            VBO = GL.GenBuffer();
            GL.BindBuffer(BufferTarget.ArrayBuffer, VBO);
            GL.BufferData(BufferTarget.ArrayBuffer, vertices.Length * sizeof(float), vertices, BufferUsageHint.StaticDraw);
            VAO = GL.GenVertexArray();
            GL.BindVertexArray(VAO);
            GL.VertexAttribPointer(0, 3, VertexAttribPointerType.Float, false, 3 * sizeof(float), 0);
            GL.EnableVertexAttribArray(0);
            // We create/bind the Element Buffer Object EBO the same way as the VBO, except there is a major difference here which can be REALLY confusing.
            // The binding spot for ElementArrayBuffer is not actually a global binding spot like ArrayBuffer is.
            // Instead it's actually a property of the currently bound VertexArrayObject, and binding an EBO with no VAO is undefined behaviour.
            // This also means that if you bind another VAO, the current ElementArrayBuffer is going to change with it.
            // Another sneaky part is that you don't need to unbind the buffer in ElementArrayBuffer as unbinding the VAO is going to do this,
            // and unbinding the EBO will remove it from the VAO instead of unbinding it like you would for VBOs or VAOs.
            EBO = GL.GenBuffer();
            GL.BindBuffer(BufferTarget.ElementArrayBuffer, EBO);
            // We also upload data to the EBO the same way as we did with VBOs.
            GL.BufferData(BufferTarget.ElementArrayBuffer, indices.Length * sizeof(uint), indices, BufferUsageHint.StaticDraw);
            // The EBO has now been properly setup. Go to the Render function to see how we draw our rectangle now!
            shader = new Shader(path + "shader.vert", path + "shader.frag");
            shader.Use();
        }

        public void Show()
        {
            shader.Use();
            shader.SetMatrix4("projection", projection);
            shader.SetVector3("offset", offset);
            GL.BindVertexArray(VAO);
            GL.DrawElements(PrimitiveType.Triangles, indices.Length, DrawElementsType.UnsignedInt, 0);
        }
    }
}
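The index-generation loop in the constructor is a triangle fan, and it can be checked in isolation. A minimal sketch mirroring the same logic (illustrative Python, not part of the project):

```python
def fan_indices(vertex_count):
    """Triangle-fan indices: fixed vertex 0, the current vertex, the next vertex."""
    indices = []
    for curr in range(1, vertex_count - 1):
        indices.extend([0, curr, curr + 1])
    return indices

# For the 4-vertex quad above this yields the two expected triangles.
print(fan_indices(4))  # -> [0, 1, 2, 0, 2, 3]
```

So the index data itself is fine for a convex polygon; the problem must lie elsewhere.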
The expected result is a window with a gray background and a white polygon in the top-left corner, offset 10 pixels from the corner along each axis.
However, as soon as I try to apply the offset or the projection matrix in the shader, no polygon appears. Since GLSL cannot print to the console, I cannot debug the shader itself; I can only look at the result and change the inputs. What am I doing wrong? This is the way I have seen everyone else do it.
The Shader class used in this code is the following (it can be found here):
using System;
using System.IO;
using System.Text;
using System.Collections.Generic;
using OpenTK.Graphics.OpenGL;
using OpenTK;

namespace Shaders
{
    // A simple class meant to help create shaders.
    public class Shader
    {
        public readonly int Handle;
        private readonly Dictionary<string, int> _uniformLocations;

        // This is how you create a simple shader.
        // Shaders are written in GLSL, which is a language very similar to C in its semantics.
        // The GLSL source is compiled *at runtime*, so it can optimize itself for the graphics card it's currently being used on.
        // A commented example of GLSL can be found in shader.vert
        public Shader(string vertPath, string fragPath)
        {
            // There are several different types of shaders, but the only two you need for basic rendering are the vertex and fragment shaders.
            // The vertex shader is responsible for moving around vertices, and uploading that data to the fragment shader.
            // The vertex shader won't be too important here, but they'll be more important later.
            // The fragment shader is responsible for then converting the vertices to "fragments", which represent all the data OpenGL needs to draw a pixel.
            // The fragment shader is what we'll be using the most here.

            // Load vertex shader and compile
            var shaderSource = File.ReadAllText(vertPath);
            // GL.CreateShader will create an empty shader (obviously). The ShaderType enum denotes which type of shader will be created.
            var vertexShader = GL.CreateShader(ShaderType.VertexShader);
            // Now, bind the GLSL source code
            GL.ShaderSource(vertexShader, shaderSource);
            // And then compile
            CompileShader(vertexShader);

            // We do the same for the fragment shader
            shaderSource = File.ReadAllText(fragPath);
            var fragmentShader = GL.CreateShader(ShaderType.FragmentShader);
            GL.ShaderSource(fragmentShader, shaderSource);
            CompileShader(fragmentShader);

            // These two shaders must then be merged into a shader program, which can then be used by OpenGL.
            // To do this, create a program...
            Handle = GL.CreateProgram();
            // Attach both shaders...
            GL.AttachShader(Handle, vertexShader);
            GL.AttachShader(Handle, fragmentShader);
            // And then link them together.
            LinkProgram(Handle);

            // When the shader program is linked, it no longer needs the individual shaders attached to it; the compiled code is copied into the shader program.
            // Detach them, and then delete them.
            GL.DetachShader(Handle, vertexShader);
            GL.DetachShader(Handle, fragmentShader);
            GL.DeleteShader(fragmentShader);
            GL.DeleteShader(vertexShader);

            // The shader is now ready to go, but first, we're going to cache all the shader uniform locations.
            // Querying this from the shader is very slow, so we do it once on initialization and reuse those values
            // later.

            // First, we have to get the number of active uniforms in the shader.
            GL.GetProgram(Handle, GetProgramParameterName.ActiveUniforms, out var numberOfUniforms);
            // Next, allocate the dictionary to hold the locations.
            _uniformLocations = new Dictionary<string, int>();
            // Loop over all the uniforms,
            for (var i = 0; i < numberOfUniforms; i++)
            {
                // get the name of this uniform,
                var key = GL.GetActiveUniform(Handle, i, out _, out _);
                // get the location,
                var location = GL.GetUniformLocation(Handle, key);
                // and then add it to the dictionary.
                _uniformLocations.Add(key, location);
            }
        }

        private static void CompileShader(int shader)
        {
            // Try to compile the shader
            GL.CompileShader(shader);
            // Check for compilation errors
            GL.GetShader(shader, ShaderParameter.CompileStatus, out var code);
            if (code != (int)All.True)
            {
                // We can use `GL.GetShaderInfoLog(shader)` to get information about the error.
                var infoLog = GL.GetShaderInfoLog(shader);
                throw new Exception($"Error occurred whilst compiling Shader({shader}).\n\n{infoLog}");
            }
        }

        private static void LinkProgram(int program)
        {
            // We link the program
            GL.LinkProgram(program);
            // Check for linking errors
            GL.GetProgram(program, GetProgramParameterName.LinkStatus, out var code);
            if (code != (int)All.True)
            {
                // We can use `GL.GetProgramInfoLog(program)` to get information about the error.
                throw new Exception($"Error occurred whilst linking Program({program})");
            }
        }

        // A wrapper function that enables the shader program.
        public void Use()
        {
            GL.UseProgram(Handle);
        }

        // The shader sources provided with this project use hardcoded layout(location)-s. If you want to do it dynamically,
        // you can omit the layout(location=X) lines in the vertex shader, and use this in VertexAttribPointer instead of the hardcoded values.
        public int GetAttribLocation(string attribName)
        {
            return GL.GetAttribLocation(Handle, attribName);
        }

        // Uniform setters
        // Uniforms are variables that can be set by user code, instead of reading them from the VBO.
        // You use VBOs for vertex-related data, and uniforms for almost everything else.
        // Setting a uniform is almost always the exact same, so I'll explain it here once, instead of in every method:
        //     1. Bind the program you want to set the uniform on
        //     2. Get a handle to the location of the uniform with GL.GetUniformLocation.
        //     3. Use the appropriate GL.Uniform* function to set the uniform.

        /// <summary>
        /// Set a uniform int on this shader.
        /// </summary>
        /// <param name="name">The name of the uniform</param>
        /// <param name="data">The data to set</param>
        public void SetInt(string name, int data)
        {
            GL.UseProgram(Handle);
            GL.Uniform1(_uniformLocations[name], data);
        }

        /// <summary>
        /// Set a uniform float on this shader.
        /// </summary>
        /// <param name="name">The name of the uniform</param>
        /// <param name="data">The data to set</param>
        public void SetFloat(string name, float data)
        {
            GL.UseProgram(Handle);
            GL.Uniform1(_uniformLocations[name], data);
        }

        /// <summary>
        /// Set a uniform Matrix4 on this shader
        /// </summary>
        /// <param name="name">The name of the uniform</param>
        /// <param name="data">The data to set</param>
        /// <remarks>
        /// <para>
        /// The matrix is transposed before being sent to the shader.
        /// </para>
        /// </remarks>
        public void SetMatrix4(string name, Matrix4 data)
        {
            GL.UseProgram(Handle);
            GL.UniformMatrix4(_uniformLocations[name], true, ref data);
        }

        /// <summary>
        /// Set a uniform Vector3 on this shader.
        /// </summary>
        /// <param name="name">The name of the uniform</param>
        /// <param name="data">The data to set</param>
        public void SetVector3(string name, Vector3 data)
        {
            GL.UseProgram(Handle);
            GL.Uniform3(_uniformLocations[name], data);
        }
    }
}
The rest of the code can be found here.
The projection matrix defines the viewing volume. Any geometry outside this volume is clipped. Your geometry is clipped because it does not lie between the near and far planes of the volume defined by the orthographic projection: the geometry's z coordinate is 0, but the distance to the near plane is 0.1 and the distance to the far plane is 100.
Change the z coordinate of the geometry, shifting it along the negative z axis:
offset = new Vector3(10, 10, 0);
offset = new Vector3(10, 10, -10);
Or change the near plane of the orthographic projection:
Matrix4.CreateOrthographicOffCenter(0.0f, width, 0.0f, height, 0.1f, 100.0f);
Matrix4.CreateOrthographicOffCenter(0.0f, width, 0.0f, height, -100.0f, 100.0f);
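Both fixes can be verified numerically with the depth component of the standard glOrtho mapping, z_ndc = -2z/(far-near) - (far+near)/(far-near); a vertex is clipped when z_ndc falls outside [-1, 1]. This is an illustrative sketch of the math, not OpenTK code:

```python
def ndc_z(z_eye, near, far):
    """Depth component of the standard glOrtho mapping."""
    return -2.0 * z_eye / (far - near) - (far + near) / (far - near)

# Original setup: z = 0 with near = 0.1 lies in front of the near plane -> clipped.
print(ndc_z(0.0, 0.1, 100.0))     # ~ -1.002, outside [-1, 1]
# Fix 1: shift the geometry to z = -10, keeping near = 0.1.
print(ndc_z(-10.0, 0.1, 100.0))   # ~ -0.802, inside
# Fix 2: move the near plane to -100 and keep z = 0.
print(ndc_z(0.0, -100.0, 100.0))  # 0.0, inside
```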
In the shader code, the matrix is multiplied by the vector from the left (which is the common convention). Therefore you must not transpose the matrix:
gl_Position = projection * vec4(position + offset, 1.0);
GL.UniformMatrix4(_uniformLocations[name], true, ref data)
GL.UniformMatrix4(_uniformLocations[name], false, ref data);
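The transpose flag and the multiplication order in GLSL are two sides of the same identity: v * M equals transpose(M) * v. A quick numeric check (illustrative NumPy, with random stand-ins for the real matrix and vector):

```python
import numpy as np

rng = np.random.default_rng(0)
M = rng.standard_normal((4, 4))  # stand-in for the projection matrix
v = rng.standard_normal(4)       # stand-in for vec4(position + offset, 1.0)

# Multiplying a row vector from the left equals multiplying the transposed
# matrix by a column vector. This is why EITHER transposing on upload OR
# writing `vector * matrix` in the shader works, but doing both cancels out.
assert np.allclose(v @ M, M.T @ v)
assert np.allclose(M @ v, v @ M.T)
```

Hence pick one convention: upload with transpose = false and keep `projection * vec4(...)` in the shader.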
I found the problem. As it turns out, the order of the factors does change the product.
By changing
gl_Position = projection * vec4(position + offset, 1.0);
to
gl_Position = vec4(position + offset, 1.0) * projection;
everything works.
Edit: the matrix was being passed transposed; disabling the transpose and undoing the change explained here also works.
No. Don't change the shader; instead, stop transposing the matrix (GL.UniformMatrix4(_uniformLocations[name], false, ref data);). See the main answer.