angelwire
Member
I'm writing an outline shader in GMS2 using HLSL 11 and I'm having issues drawing and using a depth map.
I have two shaders: a main shader and an outline shader. The main shader uses Multiple Render Targets to draw the textured model to the application surface while encoding the pixel depth into the Green and Blue channels of a separate depth surface (the Red channel is reserved for something else). Both surfaces are then drawn in the Draw GUI event. The depth surface is drawn with the outline shader, which decodes the depth from the Green and Blue channels and compares it against neighboring pixels to decide where to draw the outline.
Here is the portion of the main fragment shader where the depth is set:
Code:
float depthResolution = 256.0;
float4 outColor;
outColor.r = cameraDot; //I'm planning on using this later
outColor.g = INPUT.vDepth; //high part: raw depth
outColor.b = (INPUT.vDepth * depthResolution) % 1.0; //low part: fractional remainder (same as frac())
outColor.a = 1.0;
OUTPUT.depthColor = outColor;
Here is the summary of the outline fragment shader:
Code:
int getDepth(float4 color)
{
    //Rebuild the 16-bit depth value: Green holds the high byte, Blue the low byte
    int gv = floor(color.g * 255.99);
    int bv = floor(color.b * 255.99);
    return (gv * 256) + bv;
}
float4 main(PixelShaderInput INPUT) : SV_TARGET
{
    //The first part samples the color at the current position and the
    //colors on each side; it's pretty long, so it's cut from this summary.
    int threshold = 100;
    if (getDepth(currentColor) - getDepth(adjacentColor) > threshold)
    {
        return float4(1.0, 0.0, 0.0, 1.0); //outline
    }
    else
    {
        return float4(0.0, 0.0, 0.0, 0.0); //transparent
    }
}

Here's the result:
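Since the encode/decode pair is the suspect, it can help to model the round trip outside the shader. Below is a minimal Python sketch of the packing described above; the 8-bit storage step (round to nearest level) is an assumption about how the surface quantizes channel values, and all names are illustrative rather than taken from the actual project:

```python
def encode(depth, depth_resolution=256.0):
    # Mirrors the main fragment shader: raw depth in g,
    # fractional part of depth * 256 in b.
    g = depth
    b = (depth * depth_resolution) % 1.0
    return g, b

def store_8bit(channel):
    # Assumed model of writing to an 8-bit surface channel
    # (round to nearest level; real GPU behavior may differ).
    return round(channel * 255.0) / 255.0

def get_depth(g, b):
    # Mirrors getDepth() from the outline shader.
    gv = int(g * 255.99)  # floor, since channels are non-negative
    bv = int(b * 255.99)
    return gv * 256 + bv

def round_trip(depth):
    g, b = encode(depth)
    return get_depth(store_8bit(g), store_8bit(b))

print(round_trip(100 / 256.0))  # 25600 -- matches depth * 65536 exactly
print(round_trip(200 / 256.0))  # 50944 -- true value is 51200, off by 256
```

Under this model some depths survive the round trip exactly while others come back a full high-byte step (256 counts) off, which is larger than either threshold tried, so it is exactly the kind of discrepancy worth checking for banding.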
Setting the threshold higher removes some of the banding, but there are still issues, so I'm assuming the problem is in the way I'm encoding and decoding the depth. This is what it looks like with a threshold of 300:
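For scale, the thresholds above can be expressed as fractions of the encoded depth range (assuming vDepth is normalized to [0, 1], so the decoded value spans 0..65535):

```python
# The threshold is compared against decoded depth counts.
FULL_RANGE = 256 * 256  # 16-bit depth counts

for threshold in (100, 300):
    print(threshold, threshold / FULL_RANGE)
```

So even a threshold of 300 is under half a percent of the full range, which is why raising it only partially hides the artifacts.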
I can post the whole shader code if I need to but it's pretty long and I thought it would be better to summarize the important parts.
I've turned off the texture filtering and disabled mipmapping to make sure those weren't causing any problems.
I'll be more than happy to answer questions or provide more code if you need it. Thanks in advance