Oof, the legend himself! Been following your work for years - I actually learnt how to work with 3D projection and collision in GM8 from your olde FPS example!
Thanks for the tip, that'd surely help a lot. I should really plan out how I want to finish this project - I'm currently designing around 30 FPS as the default, with an unlocked framerate available for anyone with killer single-core performance (or whatever it takes to actually render at 60 FPS on the Delphi runner). Just saying that out loud makes me want to port to GMS for the new runner alone.
BTW, have you encountered the joystick FPS bug in Studio at all? I can't tell whether it's still an issue in 1.4 or whether it only shows up when the joystick_* functions are used.
Aww, you're flattering me~ (Especially considering I haven't really gotten any better at 3D since those days - I've spent the last 5 hours bashing my head against shaders in a futile attempt to make one of my old GM8 projects look good.)
GMS1 has native support for XInput (Xbox / PS3+ gamepads), and I've never encountered any lag issues in any of my GMS1 games (that is to say, all of my games).
I suppose those might be related: if Windows now tries force-feeding XInput into all apps and only hands them the raw DirectInput (old "joystick" API) data once they time out trying to process it, then GMS1 games supporting XInput natively would mean they never have to fall back to the slower default path. Or something. ...Then again, I just realized the games were slower even without a gamepad plugged in, so my entire logic was flawed. Oh well.
The Game Maker XInput API also has a bunch of advantages over the old DirectInput API: buttons are guaranteed to always be in the same spots, so you can refer to the "left face button" or "right shoulder button" and know the button is there and comfortable to press. With the joystick API, buttons are just numbered, and the joystick manufacturer has no real rules to follow for placement.
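Roughly, the difference looks like this (do_jump / do_shoot are just example variables here, and the joystick button number is made up):
Code:
// GMS1 gamepad API: the button constants map to physical positions
if (gamepad_button_check(0, gp_face1))     do_jump  = true; // bottom face button (A on an Xbox pad)
if (gamepad_button_check(0, gp_shoulderr)) do_shoot = true; // right shoulder button

// old joystick API: buttons are just numbers, placement is up to the manufacturer
if (joystick_check_button(1, 3)) do_shoot = true; // "button 3" could be anywhere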
In GMS1 and onwards, you can get a lot more performance by using "vertex buffers"... and it's a fairly straightforward change. First define a vertex format with the same data as the most detailed d3d_vertex_* function you wanna use, create a convenience script that adds the same data in the same order, and then it's just a big text-replace party.
Code:
///model_vertexify_ff(buffer,x,y,z,nx,ny,nz,u,v,col,alpha)
vertex_position_3d(argument0,argument1,argument2,argument3)
vertex_normal(argument0,argument4,argument5,argument6)
vertex_texcoord(argument0,argument7,argument8)
vertex_colour(argument0,argument9,argument10)
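And for reference, a rough sketch of the rest of the plumbing to go with that script - the global.vformat / global.vb_level / bg_level_texture names are just placeholders:
Code:
///(sketch) format + buffer setup to go with the script above
vertex_format_begin();
vertex_format_add_position_3d();  // must match the script's order exactly
vertex_format_add_normal();
vertex_format_add_textcoord();    // (spelled "textcoord" in the 1.4 manual, iirc)
vertex_format_add_colour();
global.vformat = vertex_format_end();

global.vb_level = vertex_create_buffer();
vertex_begin(global.vb_level, global.vformat);
// ...one model_vertexify_ff(global.vb_level, ...) call per vertex goes here...
vertex_end(global.vb_level);
vertex_freeze(global.vb_level);   // optional, helps once the buffer won't change anymore

// then in the Draw event:
vertex_submit(global.vb_level, pr_trianglelist, background_get_texture(bg_level_texture));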
Using a single vertex buffer for the entire level, I made my 3D game get like a 50x overall performance boost... and drawing the level went from 70% to 4% resource usage. Basically, a single vertex buffer takes about the same time to draw no matter how many vertices are in it.
(a "vertex buffer" is just a fancy name for a raw 3D model, btw)
There's another caveat you need to keep in mind: you now provide the primitive type (triangle list / fan, etc.) when submitting the vertex buffer for drawing, so all polygons you add to the buffer need to use the same primitive type. (And this usually means you have to convert any triangle fans into triangle lists.) It gets a bit involved, and I've consumed a large amount of sketch paper drawing out what order you need to add the vertices in to create the same triangles... visualizing this stuff helps a LOT with getting the maths right.
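The fan-to-list pattern roughly boils down to this (n and scr_add_fan_vertex() are placeholders for your own vertex count and vertex-writing script):
Code:
///(sketch) fan -> list: a fan over vertices 0..n-1 becomes the triangles (0, i, i+1)
// "n" is the fan's vertex count; scr_add_fan_vertex(buffer, index) stands in for
// whatever writes fan vertex number "index" into the buffer
for (var i = 1; i <= n - 2; i += 1) {
    scr_add_fan_vertex(buffer, 0);     // every triangle re-uses the fan's first vertex
    scr_add_fan_vertex(buffer, i);
    scr_add_fan_vertex(buffer, i + 1);
}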
There's a bunch of other incompatibilities as well, like execute_string being gone... you can get around that by using script_execute() and putting every string into a script (quick sketch at the bottom), unless you built the strings to execute dynamically. The order in which Create, Room Start and Instance Creation Code run is different (and changes again in Studio 2). I don't remember all the stuff off the top of my head - it's probably better if you ask once you actually run into any issues than me just making this wall of text any longer with preemptive warnings.
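For what it's worth, the script_execute() workaround looks roughly like this (scr_do_explosion is a made-up example script):
Code:
///(sketch) GM8:  execute_string("do_explosion(" + string(x) + "," + string(y) + ")")
// GMS1: move that code into a script and call it directly:
script_execute(scr_do_explosion, x, y);
// or, if the script's name itself was built as a string at runtime:
script_execute(asset_get_index("scr_do_explosion"), x, y);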