Tags: unity-game-engine, shader, hlsl, fragment-shader

How to make Unity glass shader only refract objects behind it?


I am looking for a glass shader for Unity that only refracts the objects behind it, or ideas for how to modify an existing glass shader to do that.

This screenshot shows what happens when I use FX/Glass/Stained BumpDistort on a curved plane mesh.

[Screenshot: a curved plane with FX/Glass/Stained BumpDistort, distorting both the sphere in front of it and the ground behind it]

As you can see, the glass shader refracts both the sphere in front of the mesh and the ground behind it. I am looking for a shader that will only refract the objects behind it.

Here is the code for that shader, for reference:

// Per pixel bumped refraction.
// Uses a normal map to distort the image behind, and
// an additional texture to tint the color.

Shader "FX/Glass/Stained BumpDistort" {
Properties {
    _BumpAmt  ("Distortion", range (0,128)) = 10
    _MainTex ("Tint Color (RGB)", 2D) = "white" {}
    _BumpMap ("Normalmap", 2D) = "bump" {}
}

Category {

    // We must be transparent, so other objects are drawn before this one.
    Tags { "Queue"="Transparent" "RenderType"="Opaque" }


    SubShader {

        // This pass grabs the screen behind the object into a texture.
        // We can access the result in the next pass as _GrabTexture
        GrabPass {
            Name "BASE"
            Tags { "LightMode" = "Always" }
        }

        // Main pass: Take the texture grabbed above and use the bumpmap to perturb it
        // on to the screen
        Pass {
            Name "BASE"
            Tags { "LightMode" = "Always" }

CGPROGRAM
#pragma vertex vert
#pragma fragment frag
#pragma multi_compile_fog
#include "UnityCG.cginc"

struct appdata_t {
    float4 vertex : POSITION;
    float2 texcoord: TEXCOORD0;
};

struct v2f {
    float4 vertex : SV_POSITION;
    float4 uvgrab : TEXCOORD0;
    float2 uvbump : TEXCOORD1;
    float2 uvmain : TEXCOORD2;
    UNITY_FOG_COORDS(3)
};

float _BumpAmt;
float4 _BumpMap_ST;
float4 _MainTex_ST;

v2f vert (appdata_t v)
{
    v2f o;
    o.vertex = mul(UNITY_MATRIX_MVP, v.vertex);
    #if UNITY_UV_STARTS_AT_TOP
    float scale = -1.0;
    #else
    float scale = 1.0;
    #endif
    o.uvgrab.xy = (float2(o.vertex.x, o.vertex.y*scale) + o.vertex.w) * 0.5;
    o.uvgrab.zw = o.vertex.zw;
    o.uvbump = TRANSFORM_TEX( v.texcoord, _BumpMap );
    o.uvmain = TRANSFORM_TEX( v.texcoord, _MainTex );
    UNITY_TRANSFER_FOG(o,o.vertex);
    return o;
}

sampler2D _GrabTexture;
float4 _GrabTexture_TexelSize;
sampler2D _BumpMap;
sampler2D _MainTex;

half4 frag (v2f i) : SV_Target
{
    // calculate perturbed coordinates
    half2 bump = UnpackNormal(tex2D( _BumpMap, i.uvbump )).rg; // we could optimize this by just reading the x & y without reconstructing the Z
    float2 offset = bump * _BumpAmt * _GrabTexture_TexelSize.xy;
    i.uvgrab.xy = offset * i.uvgrab.z + i.uvgrab.xy;

    half4 col = tex2Dproj( _GrabTexture, UNITY_PROJ_COORD(i.uvgrab));
    half4 tint = tex2D(_MainTex, i.uvmain);
    col *= tint;
    UNITY_APPLY_FOG(i.fogCoord, col);
    return col;
}
ENDCG
        }
    }

    // ------------------------------------------------------------------
    // Fallback for older cards and Unity non-Pro

    SubShader {
        Blend DstColor Zero
        Pass {
            Name "BASE"
            SetTexture [_MainTex] { combine texture }
        }
    }
}

}

My intuition is that it has to do with the way that _GrabTexture is captured, but I'm not entirely sure. I'd appreciate any advice. Thanks!


Solution

  • There is no simple answer to this. You cannot think about refraction without thinking about the rendering context, so let's take a look:

    Basically, it's not easy to define when one object is "behind" another. There are different ways even to measure a point's distance to the camera, let alone account for whole meshes; geometry can intersect in strange ways, and object centers and bounds can land anywhere.

    Refraction is easy to reason about in ray-tracing algorithms (you march a ray and compute how it bounces and refracts to get the color). But in raster graphics, which covers the vast majority of real-time rendering, objects are rendered whole, one after another.

    What is happening in that image is that the background and the ball are rendered first, and the glass afterwards. The glass doesn't "refract" anything; it just draws itself as a distortion of whatever was already written to the render buffer.

    "Before" is key here. You don't get "behinds" in raster graphics, everything is done by being conscious of rendering order. Let's see how some refractions are created:

    • Manually set render queue tags in the shaders, so you know at what point in the pipeline they are drawn (a minimal sketch of this option follows the list)
    • Manually set each material's render queue
    • Create a script that inspects the scene every frame, decides what should be drawn before or after the glass (by position or any criterion you choose), and sets the render queues on the materials accordingly
    • Create a script that renders the scene while filtering out (by whatever method) the objects that shouldn't be refracted, and uses that render as the texture to refract (depending on the complexity of the scene, this is sometimes necessary)

    These are just some options off the top of my head; everything depends on your scene.
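
    The first two options boil down to the render queue, which is also what the GrabPass obeys: it copies only what has already been drawn by the time its queue position is reached. A minimal shader-side sketch of that idea (the shader name "Custom/GlassGrabEarly" and the "+100" offset are illustrative choices of mine, not required values):

        Shader "Custom/GlassGrabEarly" {
            SubShader {
                // Named queues: Background = 1000, Geometry = 2000, Transparent = 3000.
                // "Geometry+100" = 2100: drawn after ordinary opaque geometry,
                // but before anything assigned to a later queue.
                Tags { "Queue" = "Geometry+100" }

                // The screen is grabbed at this point in the frame, so _GrabTexture
                // only contains objects rendered before queue 2100.
                GrabPass { }

                // ... distortion pass sampling _GrabTexture, as in the shader above ...
            }
        }

    Anything you want excluded from the refraction then just needs a queue value higher than the glass's.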

    My advice (with the shader-tag equivalent sketched after these steps):

    • Select the ball's material
    • Right-click on the Inspector window and switch it to "Debug" mode
    • Set the Custom Render Queue to 2200 (after the regular geometry is drawn)
    • Select the glass' material
    • Set the Custom Render Queue to 2100 (after most geometry, but before the ball)
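
    If you would rather bake that ordering into the shaders than use the Inspector's Custom Render Queue field, the same numbers map onto named queue offsets (Geometry = 2000, so 2100 = "Geometry+100" and 2200 = "Geometry+200"). A sketch of the two queue tags, assuming you make your own copies of both shaders:

        // In the copied glass shader: grab the screen and draw right after the regular geometry (2100).
        Tags { "Queue" = "Geometry+100" }

        // In the ball's shader: draw at 2200, after the glass has already grabbed the screen,
        // so the ball never ends up in _GrabTexture.
        Tags { "Queue" = "Geometry+200" }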