Hey guys and girls,
I am currently working on a small 3D game in GameMaker. So far I had no problems with performance, but I am now generating meshes at runtime, and the algorithm I use for that (greedy meshing) is pretty slow. So I started looking into improving overall performance to shave off valuable milliseconds of processing time per frame.

The first thing I tried was enabling backface culling, and that's where my problem starts. For some reason my graphics card decides that all triangles are doomed for removal; it even removes all triangles in the example for the d3d_set_culling function. I have already searched this forum, looked on Google, and searched some OpenGL forums for help, but it seems like I am the only person encountering this problem. I currently believe the problem might lie in my hardware: the laptop I am using has an Intel HD 5500 graphics chip. Hopefully some of you guys and girls know an answer to this problem.
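In case it helps to reproduce it, here is roughly what I am doing (a minimal sketch, not my actual mesh code, and the winding comments are my own assumption: as far as I understand, GameMaker's legacy 3D runs on Direct3D, where triangles wound clockwise as seen by the camera count as front-facing by default):

```gml
// Assumption: with culling enabled, winding order decides which
// triangles survive (clockwise as seen from the camera = front face).
d3d_set_culling(true);

d3d_primitive_begin(pr_trianglelist);
// One triangle wound clockwise when viewed from the camera side:
// this one should stay visible.
d3d_vertex(0,  0, 0);
d3d_vertex(0, 32, 0);
d3d_vertex(32, 0, 0);
// The same triangle with reversed (counter-clockwise) winding:
// this one should be culled.
d3d_vertex(0,  0, 0);
d3d_vertex(32, 0, 0);
d3d_vertex(0, 32, 0);
d3d_primitive_end();
```

On my machine both triangles disappear, which is what makes me suspect the driver or hardware rather than my winding order.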
Zekkan