Screen Space Refraction in WebGL

Back in 2018, I worked on a quick demo for a potential client. The topic was rendering a beer bottle in a browser. I only had a few days, but decided to go ninja and do it all from scratch in WebGL. I knew I had to get refraction right for it to look good, so I went with implementing this article here, which is connected to this one here, both written by the same author.

The author of the above articles also has an implementation available here, which I used as a reference for my demo. The general idea is quite simple, which is why I originally liked it. It does, however, require several passes and a precomputation step. The most important diagram to focus on, which encapsulates what this method tries to do (and also what it doesn’t do), is this one:

We need to find P2 ≈ P1 + dT1. We know P1 and T1; we need to compute d. The paper suggests we can approximate d as dV, which is the distance from the front-facing fragment to the back-facing fragment in view space. However, it also suggests we can approximate it more accurately by interpolating between dV and dN:

d = (θt / θi) * dV + (1 - θt / θi) * dN
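In shader terms, that blend could look something like the sketch below. This is a minimal GLSL sketch, not the article’s code: the function name and the eta convention are my assumptions, dV comes from the depth passes described later, and dN is the precomputed per-vertex distance covered next.

// Sketch: blend dV and dN using the incident and refracted angles.
// V is the normalized view-space incident direction, N the front-face
// normal, eta = n_air / n_glass (entering the denser medium).
float approximateD( vec3 V, vec3 N, float eta, float dV, float dN )
{
    float cosI   = clamp( dot( -V, N ), 0.0, 1.0 );
    float thetaI = acos( cosI );                      // incident angle
    float sinT   = eta * sqrt( 1.0 - cosI * cosI );   // Snell's law
    float thetaT = asin( clamp( sinT, 0.0, 1.0 ) );   // refracted angle
    float w      = thetaT / max( thetaI, 1e-5 );      // guard against 0/0
    return w * dV + ( 1.0 - w ) * dN;
}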

We already determined dV above; we still need dN, and this is where our precomputation step comes in. We can precompute dN and store it on a per-vertex basis: for each vertex we find the nearest vertex along its normal. I remember my implementation of this was quite naive, since I really did not have time to come up with anything fancy, and it took quite some time to compute for all vertices, but hey, it worked.
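For what it’s worth, a brute-force reconstruction of that precomputation might look roughly like the following. This is plain JavaScript on the CPU side and entirely my sketch, not the demo’s actual code; it assumes flat Float32Array position/normal buffers with unit normals, and the miss tolerance is arbitrary.

// Naive O(n^2) precomputation of dN: for each vertex, cast a ray along
// the negated (inward-pointing) normal and keep the closest vertex that
// lies near that ray.
function computeDN(positions, normals) {
  const count = positions.length / 3;
  const dN = new Float32Array(count);
  for (let i = 0; i < count; i++) {
    const px = positions[3 * i], py = positions[3 * i + 1], pz = positions[3 * i + 2];
    const nx = -normals[3 * i], ny = -normals[3 * i + 1], nz = -normals[3 * i + 2];
    let best = Infinity;
    for (let j = 0; j < count; j++) {
      if (j === i) continue;
      const dx = positions[3 * j] - px;
      const dy = positions[3 * j + 1] - py;
      const dz = positions[3 * j + 2] - pz;
      const t = dx * nx + dy * ny + dz * nz;  // distance along the ray
      if (t <= 0.0) continue;                 // behind the start vertex
      const missSq = dx * dx + dy * dy + dz * dz - t * t; // squared ray miss
      if (missSq < 1e-4 && t < best) best = t;            // arbitrary tolerance
    }
    dN[i] = best === Infinity ? 0.0 : best;
  }
  return dN;
}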

Right, so at this point we know what we have to do and we have our precomputed data available, so we can start rendering the passes:

  • We render the opaque/non-refractive objects in the scene. In this pass we write each fragment’s color into the color attachment and its depth into the depth attachment. We will be using both later in the implementation.
  • We render the view-space back-face normals as color, plus depth. We need the back-face normals in order to determine N2, and the depth in order to determine dV (see the sketch after this list).
  • We render the refractive objects using the textures generated above.
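The second pass is just the refractive mesh drawn with front-face culling (gl.cullFace(gl.FRONT)) so that only back faces survive, writing the interpolated view-space normal out as color. A minimal fragment shader sketch, with the packing scheme being my assumption:

// Back-face normals pass: only back faces reach this shader because
// front faces are culled on the host side.
precision highp float;

varying vec3 vViewNormal;   // view-space normal from the vertex shader

void main()
{
    // Remap from [-1, 1] to [0, 1] so the normal survives storage
    // in an ordinary RGBA8 color attachment.
    vec3 n = normalize( vViewNormal ) * 0.5 + 0.5;
    gl_FragColor = vec4( n, 1.0 );
}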

Right, so now we have computed P2 and T2: we know the point where the ray exits our object and also its direction. We need to march along the opaque scene’s depth buffer in order to find the closest point where the ray ends up. Here the paper suggests that, before starting the main marching loop, a smaller depth-buffer search can be done to find a better depth at which the main loop should start. Personally, I skipped this step and just did the main marching part, since that was fine for the scene I had in the demo.
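For completeness, getting T2 amounts to a second refract() at the back face, using the normal fetched from the back-face pass at the current pixel. A sketch under my own naming (backfaceNormals, resolution and etaGlassToAir are assumptions, not the article’s code):

// Fetch the back-face normal for this pixel and refract a second time.
vec2 uv = gl_FragCoord.xy / resolution;
vec3 N2 = texture2D( backfaceNormals, uv ).xyz * 2.0 - 1.0; // undo [0,1] packing
// Exiting the medium: flip the outward-pointing normal and use
// eta = n_glass / n_air (roughly 1.5 for glass).
vec3 T2 = refract( normalize( T1 ), -normalize( N2 ), etaGlassToAir );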

#define STEP 0.05
#define SIZE 0.5

vec2 ProjectToTexCoord( vec4 eyeSpacePos )
{
    vec4 projLoc = projectionMatrix * eyeSpacePos;
    return ( (projLoc.xy / projLoc.w) * 0.5 + 0.5 );
}

// March along T2 and keep the step whose marched distance best matches
// the opaque scene's depth at that screen position.
float deltaDist = 1000.0;   // smallest mismatch found so far
float minDist   = 0.0;      // marched distance at the best match
for (float index = 0.0; index < SIZE; index += STEP)
{
    vec2 coords = ProjectToTexCoord( P2 + T2 * index );
    // Linearize the stored depth and express it relative to P2.
    float texel = texture2D( geometryDepth, coords ).x;
    float distA = -(far * near / (texel * (far - near) - far)) + P2.z;
    if ( abs(distA - index) < deltaDist )
    {
        deltaDist = abs(distA - index);
        minDist = index;
    }
}
vec4 refractedVSPos = P2 + T2 * minDist;
vec2 qCoords = ProjectToTexCoord( refractedVSPos );
vec3 refractedColor = texture2D( geometryColor, qCoords ).rgb;

This implementation has several issues and downsides. First off, being a screen-space technique, it suffers from the hidden-surface problem: anything not visible in the depth and color buffers simply cannot show up in the refraction. Second, it has no way of simulating total internal reflection, so rendering, say, a diamond correctly is not possible. Third, the depth-buffer marching will not return the actual physical location the refracted ray hits; it’s only an approximation. Regardless of the issues above, the results are quite convincing, and for typical cases they closely match ray-traced results, as the article points out.
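One small aside on the total internal reflection point: GLSL’s refract() returns the zero vector when TIR occurs, so even though this method cannot simulate the internal bounce, the condition is at least cheap to detect. The reflective fallback below is purely my own suggestion:

// refract() yields vec3(0.0) when no refracted ray exists (TIR).
vec3 T2 = refract( T1, -N2, etaGlassToAir );
if ( dot( T2, T2 ) == 0.0 )
{
    // Crude fallback: reuse the reflected direction rather than
    // attempting to trace the internal bounce properly.
    T2 = reflect( T1, -N2 );
}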

Right, so now that I had nice refractions, I also had to implement a whole pile of other stuff like PBR, HDR and post-processing. In the end I managed to get all of these working together nicely. Here’s a video I made for the demo. It has a wicked soundtrack, a song from Arca called Thievery, to which I do not claim any kind of ownership. Also, the beer bottle is from Heineken, but that was the asset I got, so I couldn’t do much about it. I’m putting the demo video here and hoping not to get sued or something.
