I have the following issue with a program I'm writing to learn more about graphics programming: disabling the MSVC AddressSanitizer breaks my program, while the version compiled with MinGW works fine without ASan.
I wrote a small renderer in C using OpenGL as my graphics API, and I have gotten to the point where I can render a sample cube from an OBJ file (using my own OBJ parser). Since I mainly use Windows and MinGW has no AddressSanitizer available, I moved my code base and build system to MSVC so that I can make use of the AddressSanitizer while still checking with MinGW that everything works across compilers.
My problem is that the executables generated by each compiler behave very differently with respect to the data that gets copied to my Element Buffer Object (EBO): MinGW is able to render something akin to a cube, while MSVC presents me with a blank window. The difference in behavior started after I disabled the address sanitizer for MSVC.
To debug the issue I ran both executables through RenderDoc to see what could be causing my vertices not to be rendered. When I ran the MSVC executable through RenderDoc (without ASan, since it causes issues when used with RenderDoc) I noticed the following: the data in the EBO shown in RenderDoc is incorrect. The indices do not match the indices I read from the OBJ file, with values in the thousands when I expect them to be single-digit values.
Buffer values from the MSVC executable
RenderDoc also presents the buffer with a strange name ("Client-memory pointer data") instead of Buffer <number> like it does with the GCC/MinGW build.
Buffer values from the MinGW executable
I also checked the following:
- I'm linking with the correct compiler-specific version of GLFW for each executable
- I re-enabled ASan to see if I could get the vertices to show up (this worked)
- I enabled more warnings to check whether there was a pointer warning that was not being reported
The expected result is this.
My attempt at rendering a cube
My rendering function is very similar to what is available at learnopengl.com:
void renderLoop(GLFWwindow* win)
{
    VG_3D_ENTITY* cube = {0};
    uint32_t VBO, VAO, EBO;

    cube = loadModelFromObj("./res/cube.obj");
    if(cube == NULL) return;

    glUseProgram(shaderProgram);

    glGenBuffers(1, &EBO);
    glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, EBO);
    //glBufferData(GL_ELEMENT_ARRAY_BUFFER, cube->attribs.numFaces * sizeof(uint32_t) * 3, cube->faceIndices, GL_STATIC_DRAW);
    glBufferData(GL_ELEMENT_ARRAY_BUFFER, sizeof(indices), indices, GL_STATIC_DRAW);
    //uint32_t *debug_ptr = malloc(cube->attribs.numFaces * sizeof(uint32_t) * 3);
    uint32_t *buffer_pttr = (uint32_t *) glMapBufferRange(GL_ELEMENT_ARRAY_BUFFER, 0, cube->attribs.numFaces * sizeof(uint32_t) * 3, GL_MAP_READ_BIT);

    glGenBuffers(1, &VBO);
    glBindBuffer(GL_ARRAY_BUFFER, VBO);
    glBufferData(GL_ARRAY_BUFFER, sizeof(vertices), vertices, GL_STATIC_DRAW);
    //glBufferData(GL_ARRAY_BUFFER, sizeof(vec3) * cube->attribs.numVertices, cube->vertexArray, GL_STATIC_DRAW);

    glGenVertexArrays(3, &VAO);
    glBindVertexArray(VAO);
    glVertexAttribPointer(0, 3, GL_FLOAT, GL_FALSE, 3 * sizeof(float), (void*)0);
    glEnableVertexAttribArray(0);
    glVertexAttribPointer(1, 3, GL_FLOAT, GL_FALSE, 3 * sizeof(float), (void*)3);
    glEnableVertexAttribArray(1);
    glVertexAttribPointer(2, 2, GL_FLOAT, GL_FALSE, 2 * sizeof(float), (void*)5);
    glEnableVertexAttribArray(2);

    /*Temporary triangle transform*/
    mat4 triangleTransform = GLM_MAT4_IDENTITY_INIT;
    GLint triangleTransformLoc = glGetUniformLocation(shaderProgram, "model");
    /*Temporary triangle transform*/

    /*Temporary camera transform*/
    VG_PLAYER_CAMERA cam = cameraSetup();
    GLint cameraTransformLoc = glGetUniformLocation(shaderProgram, "view");
    /*Temporary camera transform*/

    /*Temporary projection transform*/
    mat4 projection = GLM_MAT4_IDENTITY_INIT;
    glm_perspective(glm_rad(45), 16/9, 0.1f, 100.0f, projection);
    GLint projectionTransformLoc = glGetUniformLocation(shaderProgram, "projection");
    /*Temporary projection transform*/

    glfwSwapInterval(1);
    glBindVertexArray(VAO);

    while(!glfwWindowShouldClose(win))
    {
        glfwPollEvents();
        glEnable(GL_DEPTH_TEST);
        glClearColor(0.70f, 0.83f, 0.69f, 1.0f);
        glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

        moveCamera(&cam, win);
        glm_spinned(triangleTransform, glm_rad(3), cam.__cameraUp);
        glUniformMatrix4fv(triangleTransformLoc, 1, GL_FALSE, (float *) triangleTransform);
        glUniformMatrix4fv(projectionTransformLoc, 1, GL_FALSE, (float *) projection);
        glUniformMatrix4fv(cameraTransformLoc, 1, GL_FALSE, (float *) cam.lookat);

        glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, EBO);
        glDrawElements(GL_TRIANGLES, cube->attribs.numFaces * 3, GL_UNSIGNED_INT, 0);
        glfwSwapBuffers(win);
    }
}
The 3D model is stored in these structures:
typedef struct{
    uint32_t numVertices;
    uint32_t numFaces;
    uint32_t numNormals;
}VG_3D_MODEL_ATTRIBUTES;

typedef struct{
    vec3* vertexArray;
    int32_t* faceIndices;
    VG_3D_MODEL_ATTRIBUTES attribs;
}VG_3D_ENTITY;
I am at a loss as to what to try next. The address sanitizer does not signal any memory issues when I read my model, and I tried UBSan on Linux but nothing gets flagged either. The code base can be found in my repo at https://github.com/JoseAFRibeiro/vertigal/tree/obj