Hi everyone!
Over the past few days, I've been investigating my game's performance, specifically with regard to keeping a steady 60 fps. The debugger and profiler have been pretty helpful, but I have a few questions about some problems I've run into.
How much are memory leaks tied to performance issues / frame drops? I'm making a game with procedurally generated rooms, and I've recently discovered that the way I'm doing this is flawed.
Basically, I'm using a bunch of room_add() calls and then filling the new rooms in with objects from predefined templates. I only just realized I have no way of removing these rooms, even when the game is reset, so the number of rooms, and thus the memory use, will climb indefinitely. I usually accumulate another 1-2 MB per new room generated; does that seem reasonable? The rooms are also persistent, and I assume GameMaker needs to store information about them somewhere, so that will also add to the memory usage.
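For context, my generation code follows this general pattern (heavily simplified; the object names and sizes are placeholders, not my real templates):

```gml
// Simplified version of what I'm doing: every generated area gets a
// brand-new room asset at runtime. As far as I can tell GML has no
// corresponding room_delete(), so these rooms (and their persistent
// instance data) stick around until the game closes.
var _rm = room_add();
room_set_width(_rm, 1024);
room_set_height(_rm, 768);
room_set_persistent(_rm, true);

// Fill from a template: one call like this per object in the template
// (obj_wall is a hypothetical example object).
room_instance_add(_rm, 64, 64, obj_wall);
```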
My idea is to stop using built-in persistence and use my own system that stores room contents to a file. I'll only use one room, say rm_proc_gen, and when the player swaps rooms, I'll switch to a temporary room, rm_waiting. At that point, I'll resize rm_proc_gen and read its contents from a file, then switch back from rm_waiting to rm_proc_gen. This should cut down on my memory usage.
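The save/load half of that plan could look roughly like this. This is only a sketch: the function names are placeholders, it assumes an "Instances" layer exists, it only records object type and position, and string_split() needs a 2022.11+ runtime (older versions would need a manual parse):

```gml
// Sketch: serialize each instance's object name and position to a text
// file before leaving rm_proc_gen, then respawn everything on re-entry.

function save_room_to_file(_fname) {
    var _f = file_text_open_write(_fname);
    with (all) {
        // One comma-separated line per instance: name,x,y
        file_text_write_string(_f,
            object_get_name(object_index) + "," + string(x) + "," + string(y));
        file_text_writeln(_f);
    }
    file_text_close(_f);
}

function load_room_from_file(_fname) {
    // Called from a controller's Room Start event in rm_proc_gen.
    var _f = file_text_open_read(_fname);
    while (!file_text_eof(_f)) {
        var _parts = string_split(file_text_read_string(_f), ",");
        file_text_readln(_f);
        instance_create_layer(real(_parts[1]), real(_parts[2]),
            "Instances", asset_get_index(_parts[0]));
    }
    file_text_close(_f);
}
```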
My question is: is this likely the source of my frame drops and performance issues? As the game approaches 50-100 MB of memory usage, the framerate declines. I know it's important to fix either way, but I would've thought my 8 GB of RAM should be more than enough to handle that kind of load.
I've also been using the show_debug_overlay() option, and I found something interesting. As my framerate issues start to crop up, I notice a consistent frame drop when I create and destroy objects. I noticed it when my player attacked, as a hitbox object is created and then destroyed a few frames later. I tried simply creating and destroying a dummy object that does nothing, and the issue happened there as well. The debug overlay shows a big gray bar, which, according to the docs, is "The time required to clear screen each draw step." I couldn't really find any other info about what that means or what I can do about it. It doesn't happen when the game starts, but after the game runs for a bit (and my memory usage climbs), the gray bar in show_debug_overlay() starts to get big.
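For reference, my dummy test is about as minimal as it gets; something like this (object and layer names are placeholders from my setup) is enough to trigger the drop once memory usage has climbed:

```gml
// Hypothetical repro, run from a key-press event of a controller object.
// obj_dummy has no sprite, visible = false, and no events except Alarm 0,
// which just calls instance_destroy().
var _d = instance_create_layer(x, y, "Instances", obj_dummy);
_d.alarm[0] = 5; // destroy itself a few frames later
```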
I should also mention I've been using the profiler quite a bit. Usually the most usage I see is about 10% from my input/controller-reading code, and maybe 10% from enemy collision checks. I'm not really sure at what threshold something should be deemed "too much" or a performance bottleneck. I've noticed that fps_real is quite a bit above 60 (usually several hundred), but fps is dropping regardless. From what I understand, that indicates an issue with the draw pipeline, but my hitbox isn't drawing anything! Its visible flag is set to false and it has no Draw event. I'll also mention my texture swaps and vertex batches usually sit at around 6 and 8 respectively.
So, to summarize:
- What's a reasonable amount of memory usage? Will usage of, say, 50-100 MB cause a decline in performance, even on hardware with several GB of memory?
- What numbers or percentages should I be on the lookout for in the profiler? Is 10% bad?
- As my game runs for a while, the debug overlay shows a big gray bar when I create & destroy even a simple dummy object. The gray bar indicates "The time required to clear screen each draw step." How would I optimize that for an object that isn't visible and doesn't draw anything? Or is something else probably afoot?