Performance Issues

ehddev

Member
Hi everyone!

Over the past few days, I've been doing a lot of investigation into my game's performance, specifically with regard to keeping a steady 60 fps. The debugger & profiler have been pretty helpful, but I have a few questions about some problems I've been having.

How much are memory leaks tied to performance issues / frame drops? I'm making a game with procedurally generated rooms, and I've recently discovered that the way I'm doing this is flawed.

Basically, I'm using a bunch of room_add() calls and then filling them in with objects from predefined templates. I only just realized I have no way of removing these rooms, even when the game is reset, so the number of rooms, and thus the memory use, will climb indefinitely. I usually accumulate another 1-2 MB per new room generated; does that seem reasonable? The rooms are also persistent, and I assume that Game Maker needs to store information about these rooms somewhere, so that will also lead to increased memory usage.

My idea is to stop using built-in persistence and use my own system that involves storing room contents to a file. I'll only use one room, say rm_proc_gen, and when the player swaps rooms, I'll switch to a temporary room, rm_waiting. At this point, I'll resize rm_proc_gen and read its contents from a file, and then switch back from rm_waiting to rm_proc_gen. This should cut down on my memory usage.
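Roughly, I'm picturing the save/load half of it like this (just a sketch to show what I mean, assuming GMS 2.3+ script functions; the names save_room_state/load_room_state and the bare object/x/y format are made up, and I'd still need to filter out things like controllers and save any per-instance state):

    /// Sketch: write each instance's object index and position to a file before leaving.
    function save_room_state(_filename) {
        var _buf = buffer_create(1024, buffer_grow, 1);
        buffer_write(_buf, buffer_u32, instance_count);
        for (var i = 0; i < instance_count; i++) {
            var _inst = instance_find(all, i); // i-th active instance in the room
            buffer_write(_buf, buffer_u32, _inst.object_index);
            buffer_write(_buf, buffer_f32, _inst.x);
            buffer_write(_buf, buffer_f32, _inst.y);
        }
        buffer_save(_buf, _filename);
        buffer_delete(_buf);
    }

    /// Sketch: recreate the saved instances after switching back into rm_proc_gen.
    function load_room_state(_filename) {
        var _buf = buffer_load(_filename);
        var _count = buffer_read(_buf, buffer_u32);
        for (var i = 0; i < _count; i++) {
            var _obj = buffer_read(_buf, buffer_u32);
            var _x = buffer_read(_buf, buffer_f32);
            var _y = buffer_read(_buf, buffer_f32);
            instance_create_depth(_x, _y, 0, _obj);
        }
        buffer_delete(_buf);
    }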

My question is, is this likely the source of my frame drops and performance issues? As the game approaches 50-100MB of memory usage, the framerate declines. I know it's important to fix either way, but I would've thought that my 8GB of memory should be more than enough to handle that kind of load.

I've also been using the show_debug_overlay() option, and I found something interesting. As my framerate issues start to crop up, I notice a consistent frame drop when I create & destroy objects. I noticed it when my player attacked, as a hitbox object is created and then destroyed a few frames later. I tried simply creating and destroying a dummy object that does nothing, and the issue happened there as well. The debug overlay shows a big gray bar, which, according to the docs, is "The time required to clear screen each draw step." I couldn't really find any other info about what that means or what I can do about it. It doesn't happen when the game starts, but as the game runs for a bit (and my memory usage climbs), the gray bar on my show_debug_overlay() starts to get big.
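For context, the dummy test was basically nothing more than this, an empty object spawned and then destroyed a few frames later (the object name and key are made up):

    // Spawn the do-nothing dummy object on a key press.
    if (keyboard_check_pressed(vk_space)) {
        var _d = instance_create_depth(x, y, 0, obj_dummy);
        with (_d) alarm[0] = 5; // obj_dummy's Alarm 0 event only calls instance_destroy()
    }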

I should also mention I've been using the profiler quite a bit. Usually the most usage I see is about 10% coming from my input/controller reading code, and maybe 10% from enemy collision checks. I'm not really sure at what threshold something should be deemed "too much" or a performance bottleneck. I've noticed that fps_real is quite a bit above 60 (several hundred, usually), but fps is dropping regardless. From what I understand, that indicates an issue with the draw pipeline, but my hitbox isn't drawing anything! Visible is set to false and there's no draw event. I'll also mention my texture swaps and vertex batches usually sit at around 6 and 8.

So, to summarize:
  • What's a reasonable amount of memory usage? Will usage of, say, 50-100 MB cause a decline in performance, even on hardware with several GB of memory?
  • What numbers or percents should I be on the lookout for in the profiler? Is 10% bad?
  • As my game runs for a while, the debug overlay shows a big gray bar when I create & destroy even a simple dummy object. The gray bar indicates "The time required to clear screen each draw step." I'm curious about how to optimize this for an object that isn't visible or drawing anything. Or is something else probably afoot?
Thanks! Any help would really be appreciated :oops:
 

Joe Ellis

Member
I think something else probably is afoot, because all the stuff you've explained doesn't sound like it'd be a problem.

RAM/memory usage doesn't really affect performance by itself. It's usually either the CPU or the GPU doing more than it can handle. The CPU side is all the code you've written and the variables changing, and the GPU side is all the stuff you've told it to draw.

I know that you could create a buffer that's 500 MB, or even 2 GB, and the performance won't drop as long as the computer has more RAM than this.
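If you want to convince yourself, you can try something like this in a Create event and watch the fps stay flat (the size and variable name are just examples):

    // Allocate ~500 MB of raw memory that nothing ever touches.
    // It sits in RAM but costs no CPU or GPU time per step.
    big_buffer = buffer_create(500 * 1024 * 1024, buffer_fixed, 1);
    // Free it with buffer_delete(big_buffer) when you're done testing.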

If you're accidentally creating instances very often that never get destroyed/deleted, the CPU usage will keep increasing, plus the RAM for storing their variables and the stuff the engine uses for handling them.

If the fps is above 60, or even 1000+, but the game is lagging, that's usually a GPU issue, because the fps counter measures the CPU usage: how long it took to process all the stuff it needed to do that step, and how many more times it could do that within the frame time (roughly 16,700 microseconds at 60 fps). But this is calculated just before it starts the GPU stuff, so none of the time the GPU takes is included. That explains why the fps can say everything was done within the 60 fps requirement while the GPU work adds a huge amount of processing time and lag, sometimes even 10 seconds or more, at which point I think the application has a safety mechanism that closes it to prevent the hardware overheating etc.
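A rough way to see that gap for yourself is to compare fps_real against delta_time, the actual elapsed time between frames (in microseconds), with something like this in a Step event (the 20 ms threshold is just an example):

    // Log any frame that ran noticeably over the 60 fps budget (~16,700 microseconds),
    // even when fps_real still reports hundreds of steps per second.
    if (delta_time > 20000) {
        show_debug_message("long frame: " + string(delta_time / 1000) + " ms, fps_real: " + string(fps_real));
    }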

Other than that, from what you've said I don't know what the problem could be. Do other games run fine on your computer?
 

ehddev

Member
Thanks for the well-thought-out reply!

That sounds reasonable; it didn't seem like that kind of memory usage would give me these issues. I guess another possibility is that the problem just correlates with increased memory usage.

I think my instance count may be fairly high. Depending on the room, the counts vary from ~200 to 1000 in some of the bigger rooms. Assuming all of these rooms are kept persistent, could that be what's slowing me down? I wouldn't say I'm creating them often, just once when the room is first entered and then a few hitboxes now and then. But there are a lot of them being created.

So the flow goes something like this: player enters new room -> room is populated with solids, enemies, etc. according to a predetermined schema -> memory usage increases by a few MB.

Your gpu explanation makes a lot of sense, thanks for that. It's hard to imagine what I might be trying to do graphically that's so intense, aside from the simple act of drawing a large number of instances. I've tried reducing my number of "batch breaks" - I'm not really drawing any primitives or calling draw_set_font/alpha or anything like that.

Other games seem to run just fine. It's not a top-of-the-line PC, but it was built for gaming about 4 years ago; it's got an i5 and a Radeon R9 390X. I also tried on a Dell Inspiron gaming laptop and ran into the same issues.
 

Joe Ellis

Member
Ah, yeah, it might be that the rooms are persistent, because I think that keeps all the instances in them active, so that could well be the problem.
I see you need the rooms to be persistent so they stay the same when you come back to them, so I think you need to deactivate all the instances in each room when you leave it using instance_deactivate_all, just before the code that changes room, and use instance_activate_all just after you've entered the new room. I think this should keep it under control, because when they're deactivated they basically turn into structs and don't have any CPU processing; they just sit in RAM with all their variables.
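Roughly like this (just a sketch; where exactly the calls go and what target_room is depends on how your room-change code is set up):

    // Just before changing room, e.g. in the controller object that calls room_goto:
    instance_deactivate_all(true); // true = keep the calling instance active
    room_goto(target_room);

    // Then in that controller's Room Start event, wake everything back up:
    instance_activate_all();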
 

ehddev

Member
Ah, should've known there'd be consequences for keeping so many persistent rooms. I didn't realize objects in other rooms would continue to impact performance in that way.

These rooms are all generated dynamically (it's a roguelite-esque game), so I've been continually using room_add() and setting them all to persistent. I think this room_add() approach will be problematic though, especially as the player dies and more rooms are generated, because I have no way to remove them. So I'll probably need to ditch the persistent rooms idea altogether and come up with my own way of saving & switching room states rather than creating a bunch of rooms.

I'll report back once I have a more robust system in place, but some preliminary testing with just the instance deactivating on room end seems to be helping. Thanks a lot!
 