surface_set_target_ext failing?

Binsk

Howdy! This is a new one for me, and I'm reaching out to see if anyone with more technical knowledge can help me out.

I am working on a project that targets multiple systems, and on one of them (the Nintendo Switch) surface_set_target_ext is failing and returning 0. I haven't the slightest idea what could be causing this. I use more than one shader that uses MRTs, and it is only failing in this one case.

The shader only uses 2 MRTs, and it takes in 4 samplers (counting GameMaker's default). I wrote a test project with a very simple shader that imitated these inputs, and it ran fine. What kinds of cases might cause this to fail? I realize I'm taking a bit of a shot in the dark, but if anyone has extra know-how with the Switch, that would be great.
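For reference, the test project did roughly the following (the shd_mrt_test / u_tex* / surf_* names below are placeholders, not my actual identifiers): bind three extra samplers on top of GameMaker's default one, set two render targets, then draw.

GML:
// Placeholder names; the structure mirrors the real setup (4 samplers, 2 MRTs).
shader_set(shd_mrt_test);
texture_set_stage(shader_get_sampler_index(shd_mrt_test, "u_tex1"), surface_get_texture(surf_a));
texture_set_stage(shader_get_sampler_index(shd_mrt_test, "u_tex2"), surface_get_texture(surf_b));
texture_set_stage(shader_get_sampler_index(shd_mrt_test, "u_tex3"), surface_get_texture(surf_c));

if (surface_set_target_ext(0, surf_out0) <= 0) show_debug_message("MRT 0 failed to bind");
if (surface_set_target_ext(1, surf_out1) <= 0) show_debug_message("MRT 1 failed to bind");

draw_surface(surf_base, 0, 0); // feeds gm_BaseTexture, the default sampler
shader_reset();
surface_reset_target();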

I have to run tests through a proxy, so this is becoming a bit of a nightmare to debug. The game is only using about 500-800 MB of RAM, and the surface sizes at absolute worst would be 2048x2048. As a note, when I set the MRTs, the first surface succeeds, but the second one (of equal size) fails.

The relevant code looks like this, though I'm not sure how much help it will be:
GML:
gpu_set_texfilter(false);
gpu_set_blendmode_ext(bm_one, bm_zero);
shader_set(shd_depth);

if (surface_set_target_ext(0, surface) <= 0)
    show_error("[DEBUG ERROR / MERGE RENDER] Failed to set albedo surface!", true);
if (surface_set_target_ext(1, render_tbuffer[5]) <= 0)
    show_error("[DEBUG ERROR / MERGE RENDER] Failed to set depth surface!", true);

// Set the surfaces to merge:
texture_set_stage(shd_depth_ustop, surface_get_texture(render_tbuffer[not render_tbuffer_active]));            // Old albedo buffer
texture_set_stage(shd_depth_udepth1, surface_get_texture(render_tbuffer[render_tbuffer_active + 3]));        // New depth buffer
texture_set_stage(shd_depth_udepth2, surface_get_texture(render_tbuffer[not render_tbuffer_active + 3]));    // Old depth buffer
shader_set_uniform_f(shd_depth_uspos, // Relative position on our surface
                     ((cpos_o.x + render_tbuffer_hpadding - ssize.x * 0.5 * sscale)) / surface_get_width(surface),
                     ((cpos_o.y + render_tbuffer_vpadding - ssize.y * 0.5 * sscale)) / surface_get_height(surface));
shader_set_uniform_f(shd_depth_ussize, sscale); // Relative size to our surface

draw_surface(render_tbuffer[render_tbuffer_active], 0, 0);                                                        // New albedo buffer
shader_reset();
surface_reset_target();
The gist of the shader: it takes an old depth/albedo buffer (with adjusted scale/position) and a new depth/albedo buffer and merges them into a new depth/albedo pair, using the depth values to decide which color wins so that blending is handled properly. The second MRT is the new, merged depth buffer.
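For completeness, this is the kind of quick check I can drop in right before the two surface_set_target_ext calls above, just to rule out a lost or freed surface (same variables as in the snippet):

GML:
// Log the state of both targets just before binding them.
if (!surface_exists(surface)) show_debug_message("Albedo target is missing!");
else show_debug_message("Albedo target: " + string(surface_get_width(surface)) + "x" + string(surface_get_height(surface)));

if (!surface_exists(render_tbuffer[5])) show_debug_message("Depth target is missing!");
else show_debug_message("Depth target: " + string(surface_get_width(render_tbuffer[5])) + "x" + string(surface_get_height(render_tbuffer[5])));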
 

Binsk

Still trying to figure this one out. Some new info:

I have tried swapping the order in which the MRTs are set and which surfaces go into which slots. It is always slot 1 that fails, regardless of order or surface ID. Again, no idea why, since I use MRTs elsewhere without issue.

Edit: It looks like I can have MRTs or I can have samplers, but not both. As soon as I try to use both in the same shader, the second (and any later) MRT fails to bind.
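Distilled down, the pattern that breaks in the actual project is just this (placeholder names again):

GML:
// Placeholder names. Remove either the texture_set_stage call or the second
// target and everything binds; with both present, slot 1 comes back as 0.
shader_set(shd_mrt_test);
texture_set_stage(shader_get_sampler_index(shd_mrt_test, "u_extra"), surface_get_texture(surf_extra));
surface_set_target_ext(0, surf_out0); // succeeds
surface_set_target_ext(1, surf_out1); // fails on the Switch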
 