Consistently low frame rate

I've been working on a game on my PC for a while and it runs smoothly at 60 FPS, but when I tried to play it on my laptop it runs at a consistent 20 or so frames. I was wondering if anyone had any ideas I could try. I tried messing with the sleep margin a bit, but I didn't really know what I was doing. I have the majority of my texture pages sorted, and I don't think I have any memory leaks.

I'm not sure what information is helpful, but my PC has an i5-4670 and a GTX 980 Ti and runs Windows 7.
My laptop has an i7-4710HQ @ 2.5 GHz and a GTX 860M and runs Windows 8.
My laptop can run other games just fine.

If you want to look at the game yourself to see if there are any glaring problems, I have the game up on itch.io. If you search "eye strain demo", it's the one with the skeleton thumbnail; it should be the first result.
P.S. The frame rate seems fine on the title screen, and you need a gamepad to play.

I'm new to posting in the forums, so I'm sorry if I'm an idiot about anything.
 

GMWolf

aka fel666
There seems to be an issue with GameMaker and Intel GPUs.
Try running your game using the NVIDIA graphics card (right-click -> Run with graphics processor -> High-performance NVIDIA graphics card).
If the game then runs fine, that is most likely the issue; unfortunately, there isn't much you can do about it until YYG get round to issuing a fix.
It would also be nice to know what sort of project you are working on, to help isolate the kinds of projects causing the problem:
are you using surfaces? Are you using vertex buffers? A lot of draw calls? Etc.
 
I gave that a try and it runs great! I looked around at a bunch of different posts, but I don't think I ever saw that come up. Thank you so much!

There is a lot of drawing: I have 8 moving backgrounds and a bunch of different layers of tiles. The majority of walls and other objects aren't drawn when they're off screen. I tried having the tiles not be drawn when they're off screen, but I didn't notice a difference in frame rate.
I'm using 2 different surfaces that are 1024x512; one is for drawing ripples in the water and the other is for lighting.
I'm not using vertex buffers.
 

GMWolf

aka fel666
I have a strong suspicion that the issue with Intel GPUs is the use of surfaces.
Would you mind trying to run your game on the Intel GPU without using surfaces (just disable your lighting and ripples) so we can determine whether that is the problem?
That way we can learn to avoid it, and YYG can maybe fix it.
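One way to make that test easy to flip on and off is to gate all the surface code behind a single flag. A rough GML sketch (the `use_surfaces` and `light_surf` names are just placeholders for whatever your project uses):

```gml
// Controller object, Create event:
global.use_surfaces = false; // set to false for the Intel GPU test

// Draw event:
if (global.use_surfaces) {
    // surfaces can be destroyed at any time, so recreate if needed
    if (!surface_exists(light_surf)) light_surf = surface_create(1024, 512);
    surface_set_target(light_surf);
    draw_clear_alpha(c_black, 0);
    // ... draw the lighting / ripples here ...
    surface_reset_target();
    draw_surface(light_surf, 0, 0);
}
// else: skip the effects entirely and see what the frame rate does
```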
 

MilesThatch

Member
I have a strong suspicion that the issue with Intel GPUs is the use of surfaces.
Would you mind trying to run your game on the Intel GPU without using surfaces (just disable your lighting and ripples) so we can determine whether that is the problem?
That way we can learn to avoid it, and YYG can maybe fix it.
You know, when I was testing stuff out way back, before I got you the key to test RTAG on your laptop, getting rid of surfaces indeed made the frame rate issue go away. I'm going to have to run the test one more time to verify, but if my memory is correct, surfaces were just the thing that caused the frame rate issues. The thing is, though, that the surfaces I use are just the screen size (not room size).
 

GMWolf

aka fel666
You know, when I was testing stuff out way back, before I got you the key to test RTAG on your laptop, getting rid of surfaces indeed made the frame rate issue go away. I'm going to have to run the test one more time to verify, but if my memory is correct, surfaces were just the thing that caused the frame rate issues. The thing is, though, that the surfaces I use are just the screen size (not room size).
I'm not saying the use of surfaces is wrong; surfaces are really useful.
I'm saying YYG's implementation clearly has an issue with Intel GPUs.
 

sylvain_l

Member
I'm saying YYG's implementation clearly has an issue with Intel GPUs.
A surface, for me, is just working directly with the VRAM of the GPU.

Also, Intel integrated GPUs are often less powerful than most dedicated GPUs and, most importantly, as far as I know they generally don't have dedicated VRAM; they just use system RAM. That's often a big leap in memory frequencies, and it must clearly be slower when it comes to allocation and manipulation.

For once, I don't see how YoYoGames could have a way of changing that!
 

GMWolf

aka fel666
A surface, for me, is just working directly with the VRAM of the GPU.

Also, Intel integrated GPUs are often less powerful than most dedicated GPUs and, most importantly, as far as I know they generally don't have dedicated VRAM; they just use system RAM. That's often a big leap in memory frequencies, and it must clearly be slower when it comes to allocation and manipulation.

For once, I don't see how YoYoGames could have a way of changing that!
Well, when working in OpenGL, I have no issues using "surfaces" on my Intel chip.

There is a lot that goes into "surfaces", or framebuffers in OpenGL.
The format of the texture and the buffer storage flags are both important here.
For instance, the buffers could be mapped persistently, or coherently, etc., which can have a significant impact on performance.

It's entirely possible the specifics they chose don't play nicely with Intel GPUs.

I'm aware GM uses DirectX on Windows, but I would be very surprised if DX didn't offer similar options when building framebuffers.
 

sylvain_l

Member
It's entirely possible the specifics they chose don't play nicely with Intel GPUs.
I've never worked with OpenGL directly, only through game engines,
so I don't know what can be optimized/adapted at that level.

But it feels logical that if they had to choose a spec/optimization that could only work well for one side, integrated GPUs or dedicated GPUs, then favoring dedicated GPUs for a game engine doesn't seem wrong to me. Of course, I suppose that with more work it should be possible to handle both situations well.
 

GMWolf

aka fel666
I've never worked with OpenGL directly, only through game engines,
so I don't know what can be optimized/adapted at that level.

But it feels logical that if they had to choose a spec/optimization that could only work well for one side, integrated GPUs or dedicated GPUs, then favoring dedicated GPUs for a game engine doesn't seem wrong to me. Of course, I suppose that with more work it should be possible to handle both situations well.
It's entirely possible to detect which GPU the user is running and change the implementation based on that.
It's not even that hard: just get the vendor string and switch based on that. It's quite common...
 
I have a strong suspicion that the issue with Intel GPUs is the use of surfaces.
Would you mind trying to run your game on the Intel GPU without using surfaces (just disable your lighting and ripples) so we can determine whether that is the problem?
That way we can learn to avoid it, and YYG can maybe fix it.
So I tried taking out both of the surfaces, but it didn't seem to change the frame rate much. I took out all the backgrounds and that seemed to help a bit, but it still wasn't running at 60. I tried taking even more out, but it didn't seem to help any further. I'll take another look at it another day and get back to you if I find out what it is.

I also tried using fps_real, and it was saying I was getting about 1000 FPS, but it was definitely running at around 30 or less. (I don't know if that's an indicator of another problem.)
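For what it's worth, as I understand it, fps_real reports how fast the game could run if it never waited (it ignores the room_speed cap, vsync, and the sleep margin), while fps is capped at room_speed. So a high fps_real alongside a low actual frame rate usually points at timing/sync rather than raw drawing cost. A quick sketch for watching both at once in-game:

```gml
// Draw GUI event: show both counters so drops are visible while playing.
draw_set_colour(c_white);
draw_text(8, 8, "fps: " + string(fps) + "  fps_real: " + string(fps_real));
```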
 

TheouAegis

Member
Make sure "Use synchronization to avoid tearing" is disabled (or however they've labeled it now). I had a project that ran fine in GM8 until it hit some custom room-transition code. It ran hella slow, but only for me! I was scratching my head for a month until it occurred to me to check the game settings, and I noticed I had accidentally enabled vsync.
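If you want to rule vsync out without touching the global game settings, you can also toggle it from code. In GM:S the call is display_reset(aa_level, vsync), so something like:

```gml
// Re-initialise the display with 0x antialiasing and vsync disabled.
display_reset(0, false);
```

Note that this resets the display, so it's best done once at startup rather than every step.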
 