So, do I want high GPU? (shader.. surface)

Joh

Member
Hi, I started using shaders and I've noticed they lead to high GPU usage (which I'm fairly sure is by design).
I'm using the Xor blur shader, and it essentially doubles my GPU usage, from 25% to 50%.
At first I thought that was bad and I had to fix it, but after reading up on it, it seems like it's a good thing? I hadn't seen any GM-specific topic on this, so I'm asking to be sure.

Is high GPU usage OK, or even good?
I'm noticing it has a positive effect on the real FPS (it's weird, though, because the room FPS is unaffected).

As for what I want to do, in case it's worth explaining anyway:
I'm blurring my background, but I do it at full screen (straight from the Draw event).
I realize it could be more efficient to draw it to a surface, blur the surface, and draw that surface afterwards. I could even scale down the background, blur that small version, and draw the surface scaled back up.
The downside is that I have to manage surfaces, which I prefer to avoid, especially if everything is fine as is (apart from the GPU).
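Roughly what I have in mind, just to make it concrete (shd_blur, bg_sprite and blur_surf are placeholder names, and a real blur shader would also need its uniforms set):

/// Create event
blur_surf = -1; // surface handle, created lazily in Draw

/// Draw event -- downscaled-surface blur sketch
var scale = 0.25; // render the background at quarter resolution

// (Re)create the small surface if it doesn't exist or was freed
if (!surface_exists(blur_surf))
{
    blur_surf = surface_create(room_width * scale, room_height * scale);
}

// Draw the background into the small surface
surface_set_target(blur_surf);
draw_sprite_stretched(bg_sprite, 0, 0, 0, room_width * scale, room_height * scale);
surface_reset_target();

// Blur while drawing the surface scaled back up; the upscale
// itself softens the image further, basically for free
shader_set(shd_blur);
draw_surface_ext(blur_surf, 0, 0, 1 / scale, 1 / scale, 0, c_white, 1);
shader_reset();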

Side note: should I prioritize the shader version of most effects (shadows, outlines, etc.)?
Thank you!
 

Niels

Member
Xor's blur shader is really badly optimized (I think most blur shaders are), tbh.
Basically it copies the layer and pastes it a few times with a slight offset.
That's why it's quite heavy on the GPU.
 

Nocturne

Friendly Tyrant
Forum Staff
Admin
In general, you want anything that can run on the GPU to run on the GPU, as that's what it's there for! Increased GPU usage is fine for shaders and to be expected, as they are specifically designed to run on the GPU and free up CPU cycles.

I'm noticing it has a positive effect on the real FPS (it's weird, though, because the room FPS is unaffected).
This is because the room FPS is a fixed value that GameMaker will always attempt to maintain to keep the game speed stable and smooth, while the real FPS value is the number of frames you COULD be running per second if the room FPS wasn't capped.
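You can watch the two values diverge yourself by drawing them in a Draw GUI event:

/// Draw GUI event -- compare capped and uncapped frame rates
draw_text(8, 8,  "room fps: " + string(fps));      // capped at the game speed
draw_text(8, 24, "real fps: " + string(fps_real)); // frames you COULD render per second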
 

Yal

🐧 *penguin noises*
GMC Elder
Whether high GPU usage is a problem depends a lot on how good or bad your GPU is compared to your expected players'... if you have an i7 9999+++ with 500 cores and watercooling, and you make a game that runs at 95% GPU load, little Johnny, who uses a cheap integrated GPU in his laptop, might not even be able to run the game at 1 frame per second. The same goes for CPU stats: high usage only becomes a problem if your hardware is above average, because then average players will see an even higher load.

The best solution to this issue? Let the player turn off effects in the options, and offer hints in the menus that tell them which effects have the biggest impact on CPU/GPU usage.
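Even a single global flag gets you most of the way there (the names here are made up for the example):

// Options menu: flip the flag when the player toggles the entry
global.fancy_blur = !global.fancy_blur;

// Draw event: only pay the GPU cost when the effect is on
if (global.fancy_blur)
{
    shader_set(shd_blur);
    draw_self();
    shader_reset();
}
else
{
    draw_self();
}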
 

Joh

Member
Xor's blur shader is really badly optimized (I think most blur shaders are), tbh.
Basically it copies the layer and pastes it a few times with a slight offset.
That's why it's quite heavy on the GPU.
Seems to me like it just averages the pixel colors around each pixel. It's true that it ran awfully as is; it dropped me to 45 FPS. But I reduced the directions and quality settings and now it works fine.
Redrawing the same sprite with offsets (at low alpha) was the non-shader approach I was thinking of using; I think it works, and I don't think it should have that heavy a cost.
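For reference, what I mean is roughly this (offsets and alpha picked arbitrarily):

/// Draw event -- fake blur by layering offset, low-alpha copies
for (var dx = -2; dx <= 2; dx++)
{
    for (var dy = -2; dy <= 2; dy++)
    {
        draw_sprite_ext(bg_sprite, 0, x + dx, y + dy, 1, 1, 0, c_white, 0.1);
    }
}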
Any tips on potential optimization?
In general, you want anything that can run on the GPU to run on the GPU, as that's what it's there for! Increased GPU usage is fine for shaders and to be expected, as they are specifically designed to run on the GPU and free up CPU cycles.


This is because the room FPS is a fixed value that GameMaker will always attempt to maintain to keep the game speed stable and smooth, while the real FPS value is the number of frames you COULD be running per second if the room FPS wasn't capped.
Ahh, makes sense.
My thing with the FPS is that with the full-quality blur it ran at a fixed 45 FPS (with the cap still at 60), even though the real FPS was much higher and stable. It was odd, as if 45 were a new FPS ceiling that didn't reflect the actual potential.

Whether high GPU usage is a problem depends a lot on how good or bad your GPU is compared to your expected players'... if you have an i7 9999+++ with 500 cores and watercooling, and you make a game that runs at 95% GPU load, little Johnny, who uses a cheap integrated GPU in his laptop, might not even be able to run the game at 1 frame per second. The same goes for CPU stats: high usage only becomes a problem if your hardware is above average, because then average players will see an even higher load.

The best solution to this issue? Let the player turn off effects in the options, and offer hints in the menus that tell them which effects have the biggest impact on CPU/GPU usage.
Oh, thanks! I have an i5; I think that's fairly average, but I should probably avoid pushing things too hard. And yeah, I'll keep the effects menu option in mind.

Thanks, everyone!
 

Yal

🐧 *penguin noises*
GMC Elder
There's a whole bunch of i5s with wildly varying specs (including some with better performance than i7s), just so you know :p It's slightly more helpful than "I have an Intel GPU, that should be average", but not a whole lot.
 