How to correctly use buffer_set_surface?

Alice

Darts addict
Forum Staff
Moderator
Well, I'm really stumped here. All I want to do is generate a bitmap from scratch (in a buffer, since the option is there) and copy it to a surface for further processing. However, whenever I actually try doing that, I somehow end up clearing the surface instead.

If someone has some basic code that e.g. generates an entirely white bitmap and copies it to a surface (a buffer_set_surface "hello world" of sorts), I'd really appreciate it.

Generally, here is some code I used for testing (in the Draw event). Feel free to copy it to a dummy object if you want to test it:
Code:
// source surface
var src = surface_create(64, 64);
surface_set_target(src);
draw_clear(c_black);
surface_reset_target();

// buffer
var b = buffer_create(64 * 64 * 4, buffer_fixed, 4);
buffer_get_surface(b, src, 0, 0, 0); // copy the source surface's pixels into the buffer
buffer_fill(b, 0, buffer_u32, $ffffffff, buffer_get_size(b)); // then overwrite the whole buffer with white

// destination surface
var dest = surface_create(64, 64);
surface_set_target(dest);
draw_clear(c_red);
surface_reset_target();

//b = buffer_create(64 * 64 * 4, buffer_fixed, 4);

// copying data to destination and drawing the surface
buffer_set_surface(b, dest, 0, 0, 0);
draw_surface(dest, x, y);

// cleaning up
buffer_delete(b);
surface_free(src);
surface_free(dest);
The behaviour I observed is confusing, to say the least.
  • if I comment out the buffer_set_surface line, the drawn surface is red, which is actually kinda obvious (the destination surface is initialised as red, and without buffer_set_surface it isn't affected any further)
  • if I comment out the buffer_get_surface line, the drawn surface is transparent, as if after first being painted red it gets replaced with all #00000000 pixels (kind of like the buffer were empty)
  • if I don't comment out the buffer_get_surface line, the drawn surface is black instead, as if the buffer was successfully filled with #FF000000 pixels from the source surface
  • buffer_fill doesn't change the drawn surface; I'd expect it to make the represented bitmap all white, but instead all I get is the initially copied surface
  • uncommenting the later buffer_create line doesn't change anything either; if the buffer_get_surface line is executed, the drawn surface will be black like the source surface, even though that was copied into a buffer I'm not even referencing anymore

Does anyone have an idea how to correctly generate surfaces from buffers? That behaviour is unfathomable to me. I later tried adding more buffers, and it seems like the surface most recently retrieved via buffer_get_surface is the one actually drawn. It's all the more confusing since I've heard from some people that they used buffer_set_surface without issues...

Kind of like the program looks at the buffer, makes sure it's valid and not undersized (otherwise it apparently fails silently instead of drawing from that buffer), then decides to draw whichever surface was last passed to buffer_get_surface. Kinda reminds me of that quote:
Every time the Tardis materialises in a new location, within the first nanosecond of landing, it analyses its surroundings, calculates a 12-dimensional data map of everything within a thousand mile radius, and determines which outer shell would blend in best with the environment. And then it disguises itself as a police telephone box from 1963.
Seriously, like the actual data from the buffer isn't used at all. ^^'
(and yet, buffer_get_surface seems to work just fine, populating the buffer and stuff; it's just buffer_set_surface that works in mysterious, incomprehensible ways...)
 

GMWolf

aka fel666
I think these are broken. I have tried multiple times since the feature came out and never got it to work.
I think buffer_set_surface is the one that isn't doing its job correctly.

I iterate through the buffer instead, setting the pixels myself.
I got that working a couple of times, but I can't remember the byte order (RGBA or ABGR; could even be ARGB).
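Roughly along these lines (a sketch, not my exact code; the byte order is assumed to be BGRA here, which turns out to be right per the post below, and per-pixel alpha gets dropped since draw_point_colour can't write it):
Code:
// Per-pixel fallback: read the buffer back one pixel at a time
// and plot each pixel onto the surface yourself.
var w = 64;
var h = 64;
var buf = buffer_create(w * h * 4, buffer_fixed, 1);
// ... fill buf with w * h * 4 bytes of pixel data here ...
var surf = surface_create(w, h);
buffer_seek(buf, buffer_seek_start, 0);
surface_set_target(surf);
for (var yy = 0; yy < h; yy += 1)
{
    for (var xx = 0; xx < w; xx += 1)
    {
        var bb = buffer_read(buf, buffer_u8); // blue (assumed order)
        var gg = buffer_read(buf, buffer_u8); // green
        var rr = buffer_read(buf, buffer_u8); // red
        var aa = buffer_read(buf, buffer_u8); // alpha -- read but unused
        draw_point_colour(xx, yy, make_colour_rgb(rr, gg, bb));
    }
}
surface_reset_target();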
 

rui.rosario

Guest
Got it, check out this example.

The thing is, the byte order is BGRA, and in the example I only write to the buffer byte by byte and then set the surface (although it has commented-out code in which I use buffer_get_surface to find out the byte order).
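That byte-order check worked roughly like this (a sketch from memory: draw one known colour, pull it back with buffer_get_surface, and see which byte holds which channel):
Code:
// Draw pure red onto a 1x1 surface, copy the pixel into a buffer,
// then inspect the four bytes to learn the channel order.
var surf = surface_create(1, 1);
surface_set_target(surf);
draw_clear(c_red); // R = 255, G = 0, B = 0, A = 255
surface_reset_target();

var buf = buffer_create(4, buffer_fixed, 1);
buffer_get_surface(buf, surf, 0, 0, 0);

// With BGRA order this prints "0, 0, 255, 255"
show_debug_message(string(buffer_peek(buf, 0, buffer_u8)) + ", "
                 + string(buffer_peek(buf, 1, buffer_u8)) + ", "
                 + string(buffer_peek(buf, 2, buffer_u8)) + ", "
                 + string(buffer_peek(buf, 3, buffer_u8)));

buffer_delete(buf);
surface_free(surf);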

EDIT:
Building on the process:
The buffer needs to be populated on a row basis, with the colors in BGRA order (when writing byte by byte). I used this a long time ago, and back then I had problems when trying to do everything directly with 32-bit writes; I don't know if that's still the case, I just prefer to write byte by byte.
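For instance, a minimal all-white example of that process would look something like this (a sketch, and it assumes buffer_set_surface itself behaves on your setup; see the rest of the thread):
Code:
// Fill the buffer row by row, one byte per channel, in BGRA order,
// then push the finished buffer onto a surface in a single call.
var w = 64;
var h = 64;
var buf = buffer_create(w * h * 4, buffer_fixed, 1);
for (var yy = 0; yy < h; yy += 1) // row by row
{
    for (var xx = 0; xx < w; xx += 1)
    {
        buffer_write(buf, buffer_u8, $FF); // blue
        buffer_write(buf, buffer_u8, $FF); // green
        buffer_write(buf, buffer_u8, $FF); // red
        buffer_write(buf, buffer_u8, $FF); // alpha
    }
}
var surf = surface_create(w, h);
buffer_set_surface(buf, surf, 0, 0, 0);
draw_surface(surf, 0, 0); // assumes this runs in a Draw event
surface_free(surf);
buffer_delete(buf);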
 

Alice

Darts addict
Forum Staff
Moderator
@rui.rosario: I've run it, after first removing the test sprite drawing. Nothing shows, as if the sprite isn't actually drawn. It seems buffer_set_surface is indeed broken. >.<

Well, anyway, I played around with a workaround which may well be far superior to what I originally planned, so I guess it's not that urgent anymore. Thanks for your help, people. ^^'

(still leaving it open in case some new solution appears, or an official confirmation that buffer_set_surface isn't working correctly; I've yet to get into Wraithious's example, too, but not now...)
 

rui.rosario

Guest
@rui.rosario: I've run it, after first removing the test sprite drawing. Nothing shows, as if the sprite isn't actually drawn. It seems buffer_set_surface is indeed broken. >.<
What version are you running it on? :eek: I'm running on 1.4.1757 and it works.
 

GMWolf

aka fel666
What version are you running it on? :eek: I'm running on 1.4.1757 and it works.
I'm starting to think this is system-dependent.
I'll try running it on both my Intel chip and my NVidia card to see the differences...
My guess is it will work better on a dedicated card.
 

rui.rosario

Guest
I'm starting to think this is system-dependent.
I'll try running it on both my Intel chip and my NVidia card to see the differences...
My guess is it will work better on a dedicated card.
The computer I ran it on only has the Intel chip (something for you to factor into your tests).
 

Wraithious

Guest
I know this is an old thread, but this problem stillll isn't fixed, even after 2 YEARS.

create event:
Code:
sprite1=sprTest;
sprite2=spr_replace;
alarm[0] = 25; // Give time to see the error
alarm0 event:
Code:
var buffer = buffer_create(16, buffer_fixed, 1); // Test 1
//var buffer = buffer_create(16, buffer_fast, 1); // Test 2
//var buffer = buffer_create(16, buffer_grow, 1); // Test 3

buffer_write(buffer, buffer_u8, $00); // blue
buffer_write(buffer, buffer_u8, $00); // green
buffer_write(buffer, buffer_u8, $FF); // red
buffer_write(buffer, buffer_u8, $FF); // alpha

buffer_write(buffer, buffer_u8, $00); // blue
buffer_write(buffer, buffer_u8, $FF); // green
buffer_write(buffer, buffer_u8, $00); // red
buffer_write(buffer, buffer_u8, $FF); // alpha

buffer_write(buffer, buffer_u8, $FF); // blue
buffer_write(buffer, buffer_u8, $00); // green
buffer_write(buffer, buffer_u8, $00); // red
buffer_write(buffer, buffer_u8, $FF); // alpha

buffer_write(buffer, buffer_u8, $00); // blue
buffer_write(buffer, buffer_u8, $00); // green
buffer_write(buffer, buffer_u8, $00); // red
buffer_write(buffer, buffer_u8, $FF); // alpha

var sur = surface_create(2, 2);
surface_set_target(sur);
buffer_set_surface(buffer, sur, 0, 0, 0);
surface_reset_target();
sprite2 = sprite_create_from_surface(sur, 0, 0, 2, 2, false, false, 0, 0);
surface_free(sur);
buffer_delete(buffer);
draw event:
Code:
///draw known sprite (left) & modified sprite (right)
draw_sprite_ext(sprite1, 0, 0, 0, 50, 50, 0, -1, 1.0);
draw_sprite_ext(sprite2, 0, 110, 0, 50, 50, 0, -1, 1.0);
@Alice you said you had some sort of workaround for this, can you explain how you did it?
 

YellowAfterlife

ᴏɴʟɪɴᴇ ᴍᴜʟᴛɪᴘʟᴀʏᴇʀ
Forum Staff
Moderator
I can't open that link, do you know any other method I can use? My goal is getting an image into a buffer and creating sprites from that; the reason I have to do it this specific way is so I can chunk 16384x16384 images to load into my project.
Odd, I can open it from an incognito tab just fine.

In short, this only happens in GMS1 and only on Intel GPUs, as it's a DX9 issue. I guess the workaround would be to detect this on game start and revert to going through a file instead.
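A rough sketch of such a start-up check (assuming, per the reports above, that on affected systems the pixel written by buffer_set_surface simply never arrives, while buffer_get_surface keeps working):
Code:
// Round-trip test: write one white pixel with buffer_set_surface,
// read it back with buffer_get_surface, and compare.
var surf = surface_create(1, 1);
surface_set_target(surf);
draw_clear_alpha(c_black, 1); // known starting pixel
surface_reset_target();

var src = buffer_create(4, buffer_fixed, 1);
buffer_fill(src, 0, buffer_u32, $FFFFFFFF, 4); // one white pixel
buffer_set_surface(src, surf, 0, 0, 0);

var chk = buffer_create(4, buffer_fixed, 1);
buffer_get_surface(chk, surf, 0, 0, 0);
global.set_surface_ok = (buffer_peek(chk, 0, buffer_u32) == $FFFFFFFF);

buffer_delete(src);
buffer_delete(chk);
surface_free(surf);
When the check fails, the fallback would be the file route: save the pixel data out as an image (or draw it per-pixel onto a surface and surface_save it), then sprite_add the file back in.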
 

Wraithious

Guest
Odd, I can open it from an incognito tab just fine.

In short, this only happens in GMS1 and only on Intel GPUs, as it's a DX9 issue. I guess the workaround would be to detect this on game start and revert to going through a file instead.
Hmmm, the thing is my GPU is NVidia.
My card's specs: NVidia GeForce 1050, 4 GB GDDR5 (128-bit).
But my processor is a 7th-gen Intel Core i7 (4 cores).

So is it actually a processor issue?
 

YellowAfterlife

ᴏɴʟɪɴᴇ ᴍᴜʟᴛɪᴘʟᴀʏᴇʀ
Forum Staff
Moderator
Hmmm, the thing is my GPU is NVidia.
My card's specs: NVidia GeForce 1050, 4 GB GDDR5 (128-bit).
But my processor is a 7th-gen Intel Core i7 (4 cores).

So is it actually a processor issue?
I don't think so - on my end it specifically happens when the game is run on the Intel GPU, and does not happen when run on the NVidia GPU. Others reported similar results.
 

Wraithious

Guest
I don't think so - on my end it specifically happens when the game is run on the Intel GPU, and does not happen when run on the NVidia GPU. Others reported similar results.
That's really bizarre that it won't work on my system then; I have a fairly good NVidia card :(
 

YellowAfterlife

ᴏɴʟɪɴᴇ ᴍᴜʟᴛɪᴘʟᴀʏᴇʀ
Forum Staff
Moderator
That's really bizarre that it won't work on my system then; I have a fairly good NVidia card :(
Here's the example from the bug report (executable version). If you are getting one square, it's somehow busted on your system. If you are getting two, you just aren't using the function right (it doesn't throw errors when something's wrong, aside from the surface itself being invalid).
 

Wraithious

Guest
Here's the example from the bug report (executable version). If you are getting one square, it's somehow busted on your system. If you are getting two, you just aren't using the function right (it doesn't throw errors when something's wrong, aside from the surface itself being invalid).
Yep, just getting 1 square :(
[screenshot: the test showing only one square]

My version of GameMaker is 1.4.1804, if that might make any difference?

EDIT: I guess not - I just fired up my old early access version 1.99.551 and got the same results.
 

Wraithious

Guest
@YellowAfterlife Thanks man, you are a lifesaver!! I checked my graphics card settings and saw this ridiculousness:
[screenshot: graphics settings defaulting to the integrated Intel card]

So I simply changed it to what it should have been from the start. I mean really, why build a laptop with a nice graphics card and ship it with the stock card enabled in the first place???

[screenshot: graphics settings switched to the NVidia card]

Anyways it's fixed now so thank you again.

I should report this to NVidia though. The settings were on 'auto', so wouldn't you think that if the graphics card was receiving data it couldn't do anything with, it would detect that and 'auto'-switch to the other card? For all the hype about AI these days, they sure don't implement it where they should...
 