Cyanilux

Game Dev Blog & Tutorials

FAQ

Over time I’ve answered quite a few questions in the Shaders (and URP) channels on the Official Unity Discord (as well as my own Discord). But one problem with using Discord instead of regular forums is that those answers are no longer searchable through Google - so I’ve made this page!

Sections :

General Unity

You’ll commonly see your materials turn magenta if you’ve tried switching to a different render pipeline. This occurs because each pipeline uses different shaders.

You can automatically convert any materials using Unity’s shaders (e.g. Standard) :

Any custom shaders (that you’ve written or from downloaded assets) cannot be converted automatically. You would need to rewrite them or find a replacement asset.

If you aren’t upgrading, materials can also turn magenta when there is an error in the shader syntax (Should be able to see these in the Console, or in the Inspector when the shader asset is selected).

If a magenta material is using a Shader Graph, be sure the target is assigned correctly, and save the graph using the Save Asset button in the top left of the shader graph window.

In a build, magenta materials could also mean the shader uses a compile target which is higher than what is supported by the platform.

By default the Scene View does not constantly re-render its camera, to help with editor performance.

If required though, there’s a dropdown in the top right where you can enable “Always Refresh” (or “Animated Materials” in older versions).

Correctly sorting transparent geometry can be difficult. It’s basically an unsolved problem in realtime computer graphics. There usually isn’t a single answer that can solve every case.

Opaque objects don’t have this problem as they write to the Depth Buffer (ZWrite), which allows Depth Testing (ZTest) to occur to sort on a per-fragment/pixel basis. Unity also will render opaque objects closer to the camera first, so we don’t need to waste time rendering pixels behind others.

But with the Transparent queue, shaders typically don’t write to the depth buffer and we have to render objects further away first, in order to achieve the correct Alpha Blending. Even if depth write was enabled, it still wouldn’t sort correctly when blending is involved.
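As a point of reference, a typical transparent shader sets up render state along these lines in ShaderLab (a sketch, using standard alpha blending) :

```hlsl
SubShader {
    Tags { "Queue"="Transparent" "RenderType"="Transparent" }
    Pass {
        Blend SrcAlpha OneMinusSrcAlpha // standard alpha blending
        ZWrite Off  // depth testing still occurs, but depth is not written
        // HLSLPROGRAM ... ENDHLSL
    }
}
```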

Quick tips :

Solutions to fixing common transparent sorting issues are listed below.

Render Queue

Transparent objects will sort based on their mesh origin. If you know that a particular transparent material should always appear behind / on top of others, you can use the Render Queue or Sorting Priority on the material to force the render order.
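For hand-written shaders this corresponds to the Queue tag in ShaderLab (materials then expose the resulting value as Render Queue in the Inspector). A sketch :

```hlsl
SubShader {
    // Transparent = 3000. "+1" renders after (i.e. on top of)
    // the default transparent queue when blending
    Tags { "Queue"="Transparent+1" "RenderType"="Transparent" }
    // ... passes
}
```

The same value can also be set from C# via material.renderQueue if you prefer to control it per-material at runtime.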

Pre-sorting Mesh Triangle Order

When rendering a mesh, the faces are rendered in order of their indices/triangles array. In cases where you don’t have intersecting geometry, it may be possible to pre-sort the triangles to force the draw order. This can be more performant than using the above method, but requires some setup in the modelling program.
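The same idea could also be sketched in Unity C#, by concatenating the triangle lists of each layer in the desired draw order (a hypothetical example - innerTris / outerTris are index arrays you’d extract per layer yourself) :

```csharp
using System.Collections.Generic;
using UnityEngine;

// Within a submesh, faces rasterise in index order,
// so append the layers back-to-front :
var tris = new List<int>();
tris.AddRange(innerTris); // drawn first (behind)
tris.AddRange(outerTris); // drawn last, blending on top
mesh.triangles = tris.ToArray();
```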

(Image)

A mesh consisting of layered spheres.
Left : The inner spheres are incorrectly rendering on top.
Right : Sorting corrected by combining layers in specific order in Blender.

It may vary in each 3D modelling program, but as you combine multiple meshes, the assumption is that the triangles are appended.

Example in Blender :

  1. Separate each “layer” of triangles into separate objects.
  2. Select two layers - outermost layer first. I find it easiest to do this in the Outliner while holding Ctrl
  3. Combine/Join with Ctrl+J (while hovering over the 3D Viewport)
  4. Repeat steps 2 & 3 until each layer is collapsed down into a single mesh
(Image)

Blender Outliner, showing each layer being combined after repeating these steps.

Transparent Depth Prepass

When you don’t want to see overlapping faces of the model through itself, we can use a prepass to write to the depth buffer only. When we then render our object normally it can ZTest against this.

(Image)

Left : Normal Render
Right : Render with Prepass
(Character from Kenney Animated Characters 2)

For Built-in RP, can add a pass before your main one :

Pass {
    ZWrite On       // Write to the Depth Buffer
    ColorMask 0     // Don't write to the Color Buffer
}

(Can find another example here : https://forum.unity.com/threads/transparent-depth-shader-good-for-ghosts.149511/)

In other pipelines we can’t use multi-pass shaders, but you can instead use a separate shader/material with -1 on the Sorting Priority (or Render Queue). Can use the following code in a .shader file (Create → Unlit Shader)

Shader "Custom/DepthPrepass" {
    Properties { }
    SubShader {
        Tags { "Queue"="Transparent-1" }
        Pass {
            ZWrite On
            ColorMask 0
        }
    }
}

OIT

There are also methods to achieve Order Independent Transparency - but I’m not at all familiar with these. I think they’re quite expensive (either in terms of performance or memory), and Unity does not include any support for them, so you’d need to implement this yourself (or find an asset/package that handles it for you - though that would require custom shaders too, so it’s not easy to implement).

2021.2+ versions of the URP have a Depth Priming option on the Universal Renderer Asset while using the Forward path. In the URP template this is set to Auto (enabled if URP is already doing a Depth Prepass rather than a CopyDepth pass. Can check Frame Debugger to see what is used)

Depth Priming allows rendering opaque objects with a Depth Prepass, to populate the depth buffer without the fragment shading cost - potentially reducing the cost of overdraw in the opaque stage.

It requires shaders to have a pass with the DepthOnly LightMode tag in the shader (and DepthNormals if using a renderer feature that requires the Camera Normals Texture - such as SSAO or Decals). If a shader doesn’t have these passes, the forward pass is still rendered (but with ZWrite Off) and the skybox later draws over it, hence it appears invisible.
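For reference, such a pass looks roughly like this in ShaderLab (a skeleton only - the actual vertex/fragment program is omitted here) :

```hlsl
// Skeleton of a DepthOnly pass for URP
Pass {
    Name "DepthOnly"
    Tags { "LightMode"="DepthOnly" }
    ZWrite On
    ColorMask 0 // note : some URP versions use ColorMask R here instead
    // HLSLPROGRAM ... ENDHLSL
}
```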

For example of how to set these passes up, see the templates here : https://github.com/Cyanilux/URP_ShaderCodeTemplates/

You could instead set Depth Priming to Disabled, which will make the object visible. However without the DepthOnly/DepthNormals passes, objects still may not appear in the Camera Depth Texture (Scene Depth node) or Camera Normals Texture.

An alternative is using Shader Graph, which will generate the passes required by URP.

OnRenderImage does not work in URP.

If you are using it to apply image effects with Graphics.Blit, in 2022.2+ you should use the Fullscreen Pass Renderer Feature and Fullscreen Graph - or a custom shader including Blit.hlsl. Use the Vert function as the vertex shader and call FragBlit in your fragment shader.

For prior versions, can use my Blit Renderer Feature.

For effects that require more than just a Blit, may need to write your own Custom Renderer Feature - my post on them may help (aimed at 2022+).

If you want to edit materials from imported models in Unity, you need to extract those materials before they can be edited. Click on the model file in the Project window under your assets, then the Materials tab in the Inspector. Should be a button to extract there.

Alternatively / For models with no materials / or for Unity default objects (Cube, Sphere, Capsule, etc) :

As of Unity 2022, Shader Graph automatically generates a Material under each graph asset. These materials cannot be edited from Unity’s Inspector, but use the default values assigned to properties within Shader Graph. To adjust them, select the property in the Blackboard and edit the Default field in the Graph Inspector window inside Shader Graph.

If you want a Material with values different than the defaults, you need to create a new Material asset using the shader. In the Project window under your assets, right-click the Shader Graph Asset then select Create → Material.

Shader Graph (Issues / Workarounds)

The Main Preview showing as Magenta/Pink usually means that the graph Target is not supported in the current Render Pipeline. This can be found under the Graph Settings tab of the Graph Inspector window - which you can toggle with a button in the top right of the graph.

If you have created a “Blank Shader Graph” you would also need to add a target to that list before you can use the graph on a material.

In 2021.2+ it is possible to add the Built-in RP as a target. Older versions do not have support for Built-in and must use URP or HDRP. Once the URP or HDRP package is installed, there is more that is required to properly configure Unity to use those pipelines, such as assigning a pipeline asset under Project Settings.

See the documentation for steps (can select version in top left of these pages) :

If you have already done this, check the Target (under the Graph Settings tab of the Graph Inspector window). Make sure this matches the pipeline you have installed & configured. Then save the graph using the Save Asset button in the top left.

May be able to find more information in my Intro to Shader Graph post.

Note that you may also see other previews showing as Magenta/Pink when obtaining undefined values (NaN), such as when dividing by zero, or using negative values in the Power node (as the hlsl pow function only supports positive ranges).

In v10+ (Unity 2020.2+), you can obtain a redirect node by double left-clicking a connection wire. Or right-click and select “Add Redirect”.

Intro to Shader Graph post - Redirect

This is intentional. Previews inside nodes do not show alpha/transparency, only the RGB data. It’s common for colours to “stretch out” in fully transparent areas to avoid artifacts. If it was instead black in these areas, the colour might darken along the transparent edge. (That may look even worse when dealing with mipmaps, though you typically wouldn’t have those with sprites in particular)

Only the Main Preview will show transparency.

(Image)

(Robot Character from Kenney Toon Characters 1)

Even with these stretched-out previews, the final result should look correct provided the alpha channel (A output from Sample Texture 2D) is connected to the Alpha port in the Master Stack.

If for some reason you do need the texture masked correctly to the alpha, can Multiply the RGBA and A outputs. Or if you need to control the background color, put the A output into the T input on a Lerp with A as the background and B as the RGBA output.

Some nodes can only be connected to the Fragment stage.

Shader Graph does not make this problem very clear, but it is most commonly encountered when dealing with the Sample Texture 2D node, which uses these derivatives behind the scenes to calculate the mipmap level when sampling. You would instead need to use the Sample Texture 2D LOD version.

There are other nodes that rely on the derivatives, such as Normal From Height and the various procedural shapes (e.g. Ellipse, Rectangle, Polygon, Rounded Rectangle, Rounded Polygon).

There are also nodes that use SAMPLE_TEXTURE2D(tex, sampler, uv) in their code (e.g. Triplanar), but these could likely be rewritten manually via a Custom Function node to use the SAMPLE_TEXTURE2D_LOD(tex, sampler, uv, lod) macro instead, in order to support the vertex shader. (Can view generated code for a specific node by right-clicking it and selecting Show Generated Code or Open Documentation)

For more information / nodes affected, see Intro to Shader Graph post - Fragment-Only Nodes

Editing Vertex Positions does not automatically change the Normal Vectors, which are required to affect shading.

I have a few methods on how to calculate new normals listed here : Vertex Displacement post - Recalculating Normals

This occurs because the Normal vectors stored in the mesh are intended for the front faces only. We can flip them for back faces by making use of the Is Front Face and Branch nodes.

If using a Normal Map, we would use the result of our Sample Texture 2D in the True input, and Multiply it by (1, 1, -1) for the False input. This would then go into the Normal (Tangent Space) port on the Fragment stage of the Master Stack.

If not using a Normal Map, we could use the Normal Vector node in Tangent space - or just a Vector3 set to (0, 0, 1) as those will be equal…

But it should actually be cheaper to change the Fragment Normal Space to “World” in the Graph Settings (tab of Graph Inspector, toggled with button in the top right of graph), as this avoids the need for the Tangent → World transformation. We can then use the Normal Vector node in World space for the True input and put through a Negate node for the False input.
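In code, the equivalent of that node setup would be something along these lines (a sketch - in a hand-written shader the Is Front Face input comes from the SV_IsFrontFace semantic) :

```hlsl
// Flip the world-space normal for back faces
// (isFrontFace : bool from the SV_IsFrontFace fragment input)
float3 normalWS = normalize(IN.normalWS);
normalWS = isFrontFace ? normalWS : -normalWS;
```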

In some cases you might want to use the Fraction node (aka frac() in HLSL) on the UVs (e.g. for repeating sections of a larger texture). But when using this with the Sample Texture 2D you may notice some strange pixellation artifacts along the seam produced by the jump in the UV coordinates.

This isn’t limited to the Fraction node, but I find that this is the most common place where it is noticeable. It occurs for anything that causes a jump in the UVs. Another fairly common example is the seam in the Y component when using the Polar Coordinates node.

This pixellation occurs because the Sample Texture 2D calculates the mipmap level for sampling by using the partial screenspace derivatives (DDX, DDY), which compare values between neighbouring pixels. The fragment shader can do this as it runs in 2x2 pixel blocks.

Usually this mipmapping is a good thing, as it reduces artifacts when viewing the texture at a distance. But when comparing values for pixels along this seam, the jump is interpreted as needing to sample a higher LOD/mipmap level for those pixels. At mipmap resolutions this small, it basically results in the colour being an average of the entire texture.

This pixellation can be fixed in a few ways :

In Unity 2022.2+ can also now handle this all in-graph by using the “Gradient” Mip Sampling Mode under the Node Settings on the Sample Texture 2D node itself, to expose derivative inputs. Use the DDX and DDY nodes on some continuous UV coordinates when connecting to these ports.
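In a hand-written shader (or Custom Function), the same fix looks something like this - passing explicit derivatives taken from the continuous UVs, before the frac() is applied (a sketch using URP’s sampling macros) :

```hlsl
// 'uv' is the continuous coordinate, before the jump
float2 tiledUV = frac(uv);
// Use derivatives of the continuous uv, so the mip level
// calculation never sees the seam
float4 col = SAMPLE_TEXTURE2D_GRAD(_MainTex, sampler_MainTex,
    tiledUV, ddx(uv), ddy(uv));
```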

In order to get shadows working in an Unlit Graph you need to add some important keywords which allow the ShaderLibrary to calculate shadow info.

I’ve shared a Custom Lighting for Shader Graph package on github, which can handle this for you. For supporting shadows, use the Main Light Shadows and Additional Lights subgraphs. Both of these will work in Unlit Graphs.

Main Light Shadows

If you’d instead prefer to handle it yourself, Create these Boolean Keywords in the Shader Graph Blackboard :

Make sure you set the Reference field, and not just the name. Also set them to Multi Compile and Global, and untick Exposed.

Can then sample shadows by calculating the shadowCoord :

Then use one of the following :
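As a sketch using the URP ShaderLibrary functions (assuming a world-space position, positionWS, is available) :

```hlsl
// Calculate shadowmap coordinates from the world position
float4 shadowCoord = TransformWorldToShadowCoord(positionWS);

// Option A : get the main light, including its shadow attenuation
Light mainLight = GetMainLight(shadowCoord);
half shadow = mainLight.shadowAttenuation;

// Option B : sample just the shadow attenuation
half shadow2 = MainLightRealtimeShadow(shadowCoord);
```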

Additional Light Shadows

For additional lights, you need the following keywords:

In your light loop, you’d then use :
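A sketch of such a loop (exact GetAdditionalLight signatures vary slightly between URP versions) :

```hlsl
// Loop over additional lights affecting this fragment
uint count = GetAdditionalLightsCount();
for (uint i = 0u; i < count; i++) {
    Light light = GetAdditionalLight(i, positionWS);
    // (v12+ also has GetAdditionalLight(i, positionWS, shadowMask))
    half shadow = light.shadowAttenuation;
    // ... accumulate light.color * light.distanceAttenuation * shadow, etc
}
```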

For better examples see CustomLighting.hlsl in my Custom Lighting package. Also see the Lighting.hlsl, RealtimeLights.hlsl (v12+) and Shadows.hlsl files in the URP ShaderLibrary. Can find these under the Packages in the Project window, or via Unity/Graphics github.

You can disable fog entirely via Unity’s Lighting window (under the Environment tab). If you still want fog enabled, but a certain object to not include fog, you can disable fog in a Lit Graph by using a Custom Function node.

I’d recommend using String mode, with a Vector4 Input named “In” and a Vector4 Output named “Out”. The function name can be anything (e.g. DisableFog), while the body should use :

For URP Target :

Out = In;
#define MixFog(x,y) x

The function itself doesn’t do anything, except pass the input straight through. But by using the #define we override the later MixFog function call (in URP/PBRForwardPass.hlsl), so rather than applying fog it just outputs the unmodified colour.

For Built-in Target :

Out = In;
// Untested, but I think this should work
#undef UNITY_APPLY_FOG
#define UNITY_APPLY_FOG(x,y) y

For the Built-in version, they are already using the UNITY_APPLY_FOG macro (in SG/Built-in/PBRForwardPass.hlsl), so we undefine and then redefine it to output the colour (second parameter).

This is kinda hacky (abusing what macros are supposed to be used for), but hey it works!~

As of 2023.2, Shader Graph has a “Canvas” material type to create shaders compatible with UI components. See Roadmap

For previous versions :

Shader Graph does not have proper support for UI, however it is possible to still use it with some workarounds & drawbacks. This is written with URP in mind, but it may work similarly in other pipelines.

More detailed info/setup below

Canvas

The Canvas can’t use Screenspace-Overlay mode with Shader Graphs, as that renders all shader passes generated by the graph (such as Shadowcaster / DepthOnly / DepthNormals / etc). Unless you edit the generated code (explained more in the section below), we must instead use Screenspace-Camera mode (and set the camera field). Worldspace UI also works.

(Image)

UI Image rendering incorrectly on a Screenspace-Overlay Canvas. As you can see from the Frame Debugger, it’s rendering 5 times with each pass (such as the selected DepthNormalsOnly pass)

(Image)

UI Image rendering correctly on a Screenspace-Camera Canvas

If you don’t want Post Processing to affect the UI, can create an Overlay Camera with the Culling Mask set to UI and Post Processing off :

Then set the Screenspace-Camera Canvas to use that new camera and add it to the Stack on the Main Camera.

Masking

Graphs will cause the UI to not work with masking, such as the Scroll Rect/View and Mask components. This requires Stencil operations, which shader graph does not support/expose.

The RenderObjects feature can override Stencil operations, however we cannot use it here for a few reasons :

If you don’t need masking, but want to suppress warnings like “Material UI doesn’t have _Stencil property” you can create Float properties in the Blackboard with those names (_Stencil, _StencilComp, _StencilOp, _StencilWriteMask, _StencilReadMask, _ColorMask), however it is not possible to make them actually do anything from the graph.

If you require masking to work, you will need to copy the shader code generated by the graph and edit it. This would involve clicking the “View Generated Shader” button in the inspector when viewing the Shader Graph asset. Save the shader that appears in your assets, and use that on Materials instead of the graph.

To actually support the masking, add the following Properties and edit the SubShader :

Shader "Shader Graphs/UITest" {
    Properties {
        // ... (properties generated by graph)
        _StencilComp ("Stencil Comparison", Float) = 8
        _Stencil ("Stencil ID", Float) = 0
        _StencilOp ("Stencil Operation", Float) = 0
        _StencilWriteMask ("Stencil Write Mask", Float) = 255
        _StencilReadMask ("Stencil Read Mask", Float) = 255
        _ColorMask ("Color Mask", Float) = 15
    }
    SubShader {
        Stencil {
            Ref [_Stencil]
            Comp [_StencilComp]
            Pass [_StencilOp]
            ReadMask [_StencilReadMask]
            WriteMask [_StencilWriteMask]
        }
        ColorMask [_ColorMask]
        // ... rest of shader (Tags, Pass, etc)

Note the generated shader code may have multiple SubShaders, you would want to add this Stencil block (and ColorMask) to both.

With the material using this shader, the Image is now masked correctly :

You could also delete all passes except the UniversalForward, and use ZTest [unity_GUIZTestMode] (replacing the ZTest line in the Pass) to support rendering into a ScreenSpace-Overlay UI. All these changes are based on the Built-in RP / UI-Default.shader.

Note that the shader is now completely separate from the graph. If you want to make changes to the graph, you’ll need to regenerate the code and make these changes again. Provided you don’t need to make changes often, it isn’t that bad.

Shader Graph does not have proper support for Terrain, but it is still possible with some drawbacks. (Note this is for regular Draw mode, not Instanced Draw)

A Material can be assigned in the last tab on the Terrain component (cog symbol). When using a material with a graph shader, it will always display a warning that it “might be unsuitable”, but this can be ignored. It’s not possible to remove this warning afaik (without generating shader code from the graph), as we can’t add the TerrainCompatible tag from within the graph editor.

If you want to support painting of Terrain Layers, it is only possible to support 4 layers from a graph. You just need to use specific names/references. At its simplest, you’d want :

(Image)

(Click image to view fullscreen)

Layers also provide additional data which the shader could support, such as :

You’d handle these using the same sort of setup and connect to the appropriate ports on the Master Stack (if using a Lit Graph this would involve Smoothness, Metallic, Normal Vector (Tangent) & Ambient Occlusion).

If using Unlit Graph instead (with a Custom Lighting model, e.g. Toon Shaded), you’d Transform the calculated normal from Tangent to World space so it can be used in lighting calculations.

GPU Instancing is typically not required when using MeshRenderer and SkinnedMeshRenderers in URP, as these already use the SRP Batcher to optimise setup between draw calls. When using the SRP Batcher you should avoid using Material Property Blocks though and stick to multiple Materials (or instantiate materials, e.g. using renderer.material)

But if you wish to render lots (many thousands) of the same mesh, you could consider using GPU Instancing via Graphics.DrawMeshInstanced, to remove some of the overhead of GameObjects. Or even better, DrawMeshInstancedIndirect (see answer below instead).

To support DrawMeshInstanced, Materials should already have an “Enable GPU Instancing” tickbox which can be ticked.

However using any properties in the Shader Graph Blackboard will still break instancing. Instead, I’ve found that you can use a Custom Function to define the instancing buffer & properties. This is using macros similar to how you would set up instancing in a code-written shader.

#ifndef CUSTOM_INSTANCING
#define CUSTOM_INSTANCING
// This allows multiple Custom Function nodes to use the same file
// without trying to include the code multiple times (causing redefinition errors)

// Instancing Buffer
UNITY_INSTANCING_BUFFER_START(Props)
  UNITY_DEFINE_INSTANCED_PROP(float4, _Color)
UNITY_INSTANCING_BUFFER_END(Props)

// Custom Function "GetInstancedColor", Outputs : Vector4
void GetInstancedColor_float(out float4 Out){
    Out = UNITY_ACCESS_INSTANCED_PROP(Props, _Color);
}

#endif
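Usage from C# might then look something like this sketch (matrices/colors are example data you’d fill in yourself; the property block name matches the instanced _Color above) :

```csharp
using UnityEngine;

// Example per-instance data
// (note : DrawMeshInstanced draws at most 1023 instances per call)
Matrix4x4[] matrices = new Matrix4x4[instanceCount];
Vector4[] colors = new Vector4[instanceCount];
// ... fill matrices & colors ...

var props = new MaterialPropertyBlock();
props.SetVectorArray("_Color", colors); // matches the instanced property

// material must have "Enable GPU Instancing" ticked
Graphics.DrawMeshInstanced(mesh, 0, material, matrices, instanceCount, props);
```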

For supporting DrawMeshInstancedIndirect, (possibly also DrawProcedural), add a Boolean Keyword to the Blackboard, with reference set to PROCEDURAL_INSTANCING_ON. Should also be Global and using Multi-Compile.

Use two Custom Function nodes. Both with a Vector3 input named “In” and a Vector3 output named “Out”.

Out = In;
#pragma instancing_options procedural:vertInstancingSetup

(vertInstancingSetup being a function in the include file)

Both functions here don’t alter the input passed in, but it is required to be able to connect the node to the Master Stack. We should connect it somewhere in the Vertex stage - likely easiest using the Position port with the Position node set to Object space (or swap this out for a displaced vertex position if you require that)
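On the C# side, a sketch of the indirect draw setup might look like this (buffer/property names are examples - the per-instance buffer is whatever your vertInstancingSetup function reads from) :

```csharp
using UnityEngine;

// args : index count, instance count, start index, base vertex, start instance
var args = new uint[] {
    mesh.GetIndexCount(0), (uint)instanceCount,
    mesh.GetIndexStart(0), mesh.GetBaseVertex(0), 0
};
var argsBuffer = new ComputeBuffer(1, args.Length * sizeof(uint),
    ComputeBufferType.IndirectArguments);
argsBuffer.SetData(args);

// Per-instance data read by vertInstancingSetup in the include file
material.SetBuffer("_InstanceData", instanceDataBuffer);

Graphics.DrawMeshInstancedIndirect(mesh, 0, material, bounds, argsBuffer);
```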

See instanced grass example here :

The Scene Color node samples a texture containing opaque geometry in the scene, allowing you to obtain the colour of opaque objects behind a transparent shader.

This is useful to produce distortion effects. The default UVs input is the Screen Position node, so you can Add or Subtract to that to distort the view, for example with a noise texture. Can find examples in the following posts :

Note that the node will only function correctly when using a graph with Surface Mode : Transparent (or manually set RenderQueue to 2501+). See below for pipeline-specific differences.

URP :

Built-in RP (Unity 2021.2+) :

HDRP :

Shader Graph (Examples / Snippets)

If you want to layer a texture containing an alpha channel onto another texture/colour, you can do so using a Lerp, where the A output from the Sample Texture 2D is connected to the T input. The A input would be the colour of the Base/Background while B input would be the colour of what you want to Layer on top.
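In HLSL terms, that Lerp setup is equivalent to the following sketch (texture/variable names are just examples) :

```hlsl
float4 overlay = SAMPLE_TEXTURE2D(_OverlayTex, sampler_OverlayTex, uv);
// lerp(A, B, T) : base colour where alpha is 0, overlay colour where alpha is 1
float3 col = lerp(baseColor.rgb, overlay.rgb, overlay.a);
```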

If you don’t have a texture with an alpha channel, you could also use a procedural shape (or a channel from another texture) as the T input. This should be a float value, commonly referred to as a mask.

In this case the noise texture is used to interpolate between two colours (blue, green). Then the result is masked using an Ellipse to layer it on the background (same “Base Color” group as used in previous example)

It is not possible to change the frequency/wavelength or adjust the phase of the Sine Time output from the Time node. You can only remap values, such as using a Multiply to adjust the amplitude of the wave.

If you need to control the frequency, use the Time output instead, Multiply by your frequency. To adjust phase use Add or Subtract. You would then put into a Sine node. Can also Multiply again after this to adjust amplitude, same as before.
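That node chain (Time → Multiply → Add → Sine → Multiply) is equivalent to this HLSL, where _Frequency, _Phase & _Amplitude are example property names :

```hlsl
// _Time.y is the time in seconds (same as the Time output of the Time node)
float wave = sin(_Time.y * _Frequency + _Phase) * _Amplitude;
```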

While it’s possible to rotate meshes in C# to produce billboarding (e.g. with transform.LookAt), it’s usually cheaper to handle effects like this in the shader - especially if many objects require billboarding.

Billboarding usually requires the vertex shader to ignore using the view matrix for rotations, but Shader Graph makes it a little trickier to handle as it does space conversions behind the scenes. The Position port in the Master Stack is intended to be in Object space, rather than the clip space output a written vertex shader would usually have. So instead, we need to apply an inverse view matrix to cancel out the rotation Shader Graph applies later.

First up, we need a matrix to handle the effect. Can use one of the following.

These may also cause faces to flip too - you can Negate the first column to prevent this (would need Matrix Split and Matrix Construction like shown below), or change the Render Face option in the Graph Settings, e.g. to Both.

To make the billboard only rotate around the Y axis, use one of the above (depending if shadows are required) into a Matrix Split (Column mode), and put M0 and M2 into a Matrix Construction node (Column mode). M1 should be set to (0, 1, 0, 0). M3 isn’t too important but can be set to (0, 0, 0, 1).

Examples :

To apply the scale/rotations from the matrix (without translation parts), can use a 3x3 matrix, or 4x4 matrix and Multiply with Vector4 with W/A component set to 0. (Be sure the matrix is in the A port - the order is important with matrix multiplication!)

Textures are typically applied to the mesh using UV coordinates stored in the vertex data. But we can also use other coordinates.

In a technique usually referred to as “Worldspace Planar Mapping” (sometimes also called “Worldspace Projected UV”), we use the Position node set to World space. This is a Vector3 output, but nodes with a UV port take a Vector2, so we first need to Swizzle (or Split and recombine into a Vector2 node), which also lets us reorder the components. For example, we can use the RB (aka XZ) axes.

This can then be put into a Sample Texture 2D, or any other nodes that have a UV port - such as procedural noise (Simple Noise and Gradient Noise).

Because we are in World space this acts like projecting the texture from above/below (as the G/Y axis is the one that we didn’t use). The texture will be stretched for any faces that are vertical, but this method is useful for flat surfaces. The texture also does not move, rotate or scale with the object, and so seamlessly continues over neighbouring objects.
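The whole technique is only a couple of lines in HLSL (a sketch - _Tiling is an example property for scaling the projection) :

```hlsl
// Project the texture along the Y axis,
// using the XZ world position as UV coordinates
float2 uv = positionWS.xz * _Tiling;
float4 col = SAMPLE_TEXTURE2D(_MainTex, sampler_MainTex, uv);
```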

We could also project from other axes by using the RG/XY or GB/YZ ports (and swizzle these further for 90 deg rotations).

Sampling from all three axes, then blending based on the Normal Vector is known as Triplanar Mapping. There is a Triplanar node which handles this for you - though with only one texture input for all three axes. For supporting different textures per axis, you’d need to handle it yourself by recreating it in nodes or using a Custom Function. I have a page on my old site explaining this further.

Shader Graph already provides some texture sampling nodes (e.g. Sample Texture 2D) for use in the graph. But it can be useful to sample inside a Custom Function, e.g. inside a loop, or to access sampling macros that don’t have built-in nodes.

Macros

In shader code online you might typically see sampler2D and tex2D() functions being used. This is the older DX9 syntax, but with Shader Graph we mostly use the newer DX11+ syntax which has separate objects for the texture and sampler. This isn’t supported on all platforms though (GLES2), so Unity provides a bunch of macros which can generate different code. These macros are defined in files found under render-pipelines.core/ShaderLibrary/API.

Some important macros for sampling textures are listed below :

Shader Graph Texture Types

In Shader Graph v10.3+ (Unity 2020.2.3f1+), structs were added which wrap textures and their associated samplers together, allowing them to both be passed into a Custom Function node in a single parameter. These types are defined in render-pipelines.core/ShaderLibrary/Texture.hlsl

The sampler state associated with the texture can be accessed using .samplerstate on the UnityTextureX object. UnityTexture2D also has access to the Texel Size (via .texelSize).

Custom Function Example (v10.3+)

If you aren’t familiar with the Custom Function node syntax see Intro to Shader Graph - Custom Functions and Custom Function docs page.

Here is an example sampling a 3D Texture, using LOD version so it could be used in the vertex stage. This code would be put in a .hlsl file for use with Custom Function File mode. It would also work with String mode but then you would only want the body of the function (inside {}) and you need to match the names for inputs/outputs.

void Example_float(UnityTexture3D Tex, float3 UV, out float4 Out){
    Out = SAMPLE_TEXTURE3D_LOD(Tex, Tex.samplerstate, UV, 0);
}

Inputs :

Outputs :

Custom Function Example (Pre-v10.3)

Earlier versions don’t have these UnityTextureX/UnitySamplerState types so instead need to use the HLSL texture objects, such as Texture3D and SamplerState. For example :

void Example_float(Texture3D Tex, SamplerState SS, float3 UV, out float4 Out){
    Out = SAMPLE_TEXTURE3D_LOD(Tex, SS, UV, 0);
}

Inputs :

Outputs :

This could also be used in newer versions, by using the “Bare Texture 3D” and “Bare Sampler State” options on the input/output types. (Though this will likely error in GLES2 platforms, hence why the newer structs were introduced)

Unity supports fixed-size Float and Vector(4) arrays in shaders. We can define them in Shader Graph by using a Custom Function node. In this case, we must use the File mode as the array has to be defined outside the function scope - and this is not possible using String mode.

Here is some example code, defining an array named _ExampleArray, containing 10 floats. Note that the [10] is put after the name, not after float. We can then index the array inside the function, in this case using a loop to add the contents together.

float _ExampleArray[10];
 
void ExampleFunction_float(out float Sum){
   Sum = 0; // out parameters start undefined, so initialise before accumulating
   for (int i = 0; i < 10; i++){
      Sum += _ExampleArray[i];
   }
}

The array would then be set from a C# script, using :

Shader.SetGlobalFloatArray("_ExampleArray", floatArray);

For a Vector4 array, you would use float4 instead of float, and :

Shader.SetGlobalVectorArray("_ExampleArray", vectorArray);

When using these C# functions, make sure that floatArray (or vectorArray) has the same length that is specified in the shader! Can pad with zeros if required.
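For context, here's a minimal C# sketch of setting the global array (the script/class name is my own; the property name and size match the shader example above) :

```csharp
using UnityEngine;

public class SetExampleArray : MonoBehaviour
{
    void Start()
    {
        // Length must match the [10] declared in the shader
        float[] floatArray = new float[10];
        for (int i = 0; i < 10; i++)
            floatArray[i] = i * 0.1f;

        Shader.SetGlobalFloatArray("_ExampleArray", floatArray);
    }
}
```

Also worth knowing : the array size is fixed the first time it is set, so always set it at the full size declared in the shader (padding with zeros), even if you only use part of it.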

I also have a forcefield shader breakdown which uses an array, if you want a real example.

The following graph can be used to support Fade Mode : “Cross Fade” on the LOD Group component.

This requires using a built-in variable, unity_LODFade. Shader Graph doesn’t provide access to it but we can use a Custom Function node (String mode). Add a single Float output named “Out” and set Body to Out = unity_LODFade.x;. The function’s name isn’t that important but needs to be set to something.

For use with the Dither node, we first need an Absolute node to bring any negative values into the positive range. However, to make sure it fades in the correct direction, we also need to take the Sign of unity_LODFade.x and Multiply it with the dither result.

As shown, can then optionally use a Boolean Keyword (set this up in the Blackboard window. While it’s highlighted, change its settings under the Node Settings tab of the Graph Inspector window).

Drag the keyword into the graph, then put the result of our Multiply into the On port, and set the Off port to 1.

Using this keyword is optional, but it means we can avoid the dither calculations when not in the crossfading range. However this would affect batching, so you may want to try it with and without and profile both.

The above graph results in this when moving the camera :

With LOD Group setup looking something like this :

(Image)

(Usually you would have fewer tris in your higher/further away LODs, but the meshes here are just for example)

For fading based on time, enable Animate Cross-fading. Can control the duration (globally) by setting LODGroup.crossFadeAnimationDuration from a C# script.

For fading based on distance, disable Animate Cross-fading and edit the “Fade Transition Width” values.

See Unity Docs : LOD Group for more info.

For some effects it can be useful to project a texture/noise/etc from screen space (using the Screen Position node into UV ports). When the object or camera moves the texture stays in place relative to the screen, so appears to slide across the model. That isn’t always the desired effect - it could be more useful if the texture stayed relative to the object itself.

One easy way to achieve something like this is to use View Space instead, and offset it by the object’s origin in the same space. I do this in my Dissolve Shader Breakdown :

This can create some warping when objects are close to the sides of the camera view, but still tends to work pretty well.
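In shader code terms, the View space approach could look something like this rough sketch (using URP ShaderLibrary functions; `_Scale` is a made-up property for tiling) :

```hlsl
// Sketch : project UVs from View space, kept relative to the object
float3 positionVS = TransformWorldToView(positionWS);
// Object origin in world space is the translation column of the model matrix
float3 originWS = UNITY_MATRIX_M._m03_m13_m23;
float3 originVS = TransformWorldToView(originWS);
// Offsetting by the origin keeps the projection "stuck" to the object,
// rather than sliding across it when the object or camera moves
float2 uv = (positionVS.xy - originVS.xy) * _Scale;
```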

The same thing can be done in screen space (which doesn’t have this warping issue), but the graph is more complicated. It also includes optional groups to deal with screen aspect ratio (so texture/noise isn’t stretched) and make UVs scale down when viewing at a distance :

(Image)

(Click image to view fullscreen)

In newer versions of Shader Graph (v14 / Unity 2022.2+) the Transform node now has a Screen option, therefore we can simplify the “Stabilised Screen Position” group in the above graph :

There may be cases where you want to displace/distort a texture in multiple directions. But just using Direction * Time to offset the UV coordinates will mean the difference in values grows larger between where the directions change - tiling and stretching the texture more and more in that gap.

There may be multiple methods to solve this - sampling a texture multiple times with displacements in each cardinal direction and blending between them comes to mind, but that isn’t ideal. The below technique provides a better solution that only uses 2 samples so should also be fairly performant.

The idea is to have the time repeat, by using the Fraction node to only take the decimal part of the value. The displacement will jump back to 0 when time reaches 1, but to avoid seeing this we fade the texture out while fading another version in that uses a different phase (time offset by 0.5).

The fading is controlled by two “triangle waves” using the same repeating time values.
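As a rough HLSL sketch of the same idea (written as a Custom Function, v10.3+ types; the function and port names here are my own) :

```hlsl
void FlowSample_float(UnityTexture2D Tex, float2 UV, float2 Dir, float Time, out float4 Out){
    float tA = frac(Time);        // phase A
    float tB = frac(Time + 0.5);  // phase B, offset by half
    // Triangle waves : 0 when a phase wraps (hiding the jump back to 0),
    // 1 at its midpoint. wA + wB always sums to 1
    float wA = 1 - abs(1 - 2 * tA);
    float wB = 1 - abs(1 - 2 * tB);
    float4 a = SAMPLE_TEXTURE2D(Tex, Tex.samplerstate, UV - Dir * tA);
    float4 b = SAMPLE_TEXTURE2D(Tex, Tex.samplerstate, UV - Dir * tB);
    Out = a * wA + b * wB;
}
```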

Here’s an example graph showing the implementation :

(Image)

(Click image to view fullscreen)

The section highlighted in orange is the same for both the top and bottom (but with A input swapped for B) so putting this into a SubGraph may help keep the graph more organised.

In the top left, two options are shown for obtaining the displacement directions. One is a “Flowmap” - essentially the same as a Normal Map but typically only using the RG channels. It should likely still be imported and treated the same as a Normal Map, or at least have sRGB disabled so colour space differences don’t mess with the values. You’ll also need to remap from the 0 to 1 range of the texture to the -1 to 1 range of direction vectors. I’ve used the “Normal” mode on the Sample Texture 2D node to handle this for me.

The other option is storing directions in the mesh data, such as Vertex Color. You may also want a Normalize node after the remapping here. You can then paint the mesh in modelling software like Blender, or in Unity using Polybrush (found in Package Manager), or find a way to set the directions from C#.

In both cases, any values will work but (0.5, 0.5) would be no displacement. (0, 0.5) would be towards the left, (1, 0.5) for right, (0.5, 0) for down, (0.5, 1) for up, etc.

For more details, this “Texture Distortion” CatlikeCoding tutorial should help explain the technique.

Shaders

Shader code can contain multiple passes that define LightMode tags. See here for related docs page which also links to lists of LightMode values used by Built-in & URP.

In the Built-in Render Pipeline, you can also have multiple passes without a LightMode tag (or "LightMode"="Always") which are rendered without lighting/shadows. Unity will render each one as a separate draw call, though the exact render order also depends on the queue :

In the Universal Render Pipeline, it is possible to have one Pass with "LightMode"="UniversalForward", as well as one pass without (or "LightMode"="SRPDefaultUnlit") and both will render… However this breaks SRP Batcher compatibility for objects using the shader, so is not recommended if the shader will be applied to many objects.

Better options for URP are :

Note that in 2022.2+ we also have an Override Shader option on the RenderObjects feature (and in the DrawingSettings used with ScriptableRenderContext.DrawRenderers in a custom feature). This will keep using the same property values from the original shader, closer matching Replacement Shaders from BiRP. However, I believe objects rendered with overridden shaders currently do not SRP Batch, so if there are many objects involved it could be expensive. Should try to only use this where required.

Rather than hardcoding ShaderLab operations, it is possible to specify a Property so they can be changed on the material or at runtime (e.g. through material.SetFloat).

// (in Properties)
[Enum(Off, 0, On, 1)] _ZWrite("Z Write", Float) = 1
[Enum(UnityEngine.Rendering.CompareFunction)] _ZTest("ZTest", Float) = 4 // "LessEqual"
[Enum(UnityEngine.Rendering.CullMode)] _Cull ("Cull", Float) = 2 // "Back"
[Enum(UnityEngine.Rendering.ColorWriteMask)] _ColorMask ("ColorMask", Float) = 15 // "RGBA"

[Enum(UnityEngine.Rendering.BlendMode)] _BlendSrc ("Blend Src Factor", Float) = 1 // "One"
[Enum(UnityEngine.Rendering.BlendMode)] _BlendDst ("Blend Dst Factor", Float) = 0 // "Zero"
[Enum(UnityEngine.Rendering.BlendMode)] _BlendSrcA ("Blend Src Factor (Alpha)", Float) = 1 // "One"
[Enum(UnityEngine.Rendering.BlendMode)] _BlendDstA ("Blend Dst Factor (Alpha)", Float) = 0 // "Zero"
[Enum(UnityEngine.Rendering.BlendOp)] _BlendOp ("Blend Op", Float) = 0 // "Add"

[Enum(UnityEngine.Rendering.CompareFunction)] _StencilComp ("Stencil Comparison", Float) = 0 // "Disabled"
[IntRange] _Stencil ("Stencil ID", Range (0, 255)) = 0
[Enum(UnityEngine.Rendering.StencilOp)] _StencilOp ("Stencil Op (Pass)", Float) = 2 // "Replace"
[Enum(UnityEngine.Rendering.StencilOp)] _StencilOpFail ("Stencil Op (Fail)", Float) = 0 // "Keep"
[Enum(UnityEngine.Rendering.StencilOp)] _StencilOpZFail ("Stencil Op (ZFail)", Float) = 0 // "Keep"
_StencilWriteMask ("Stencil Write Mask", Float) = 255
_StencilReadMask ("Stencil Read Mask", Float) = 255

...

// (in SubShader/Pass)
ZWrite [_ZWrite]
ZTest [_ZTest]
Cull [_Cull]
ColorMask [_ColorMask]

//Blend [_BlendSrc] [_BlendDst] // Uses Blend mode for both RGB and Alpha channels
Blend [_BlendSrc] [_BlendDst], [_BlendSrcA] [_BlendDstA] // Use different Blend mode for Alpha
BlendOp [_BlendOp]

Stencil {
    Ref [_Stencil]
    Comp [_StencilComp]
    Pass [_StencilOp]
    Fail [_StencilOpFail]
    ZFail [_StencilOpZFail]
    ReadMask [_StencilReadMask]
    WriteMask [_StencilWriteMask]
}
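These properties can then be changed from C# at runtime. For example, a hedged sketch (class/method names are my own) switching a material using the shader above to standard alpha blending :

```csharp
using UnityEngine;
using UnityEngine.Rendering;

public static class BlendHelper
{
    // Switch a material (using the shader above) to standard alpha blending
    public static void SetAlphaBlended(Material material)
    {
        material.SetFloat("_ZWrite", 0); // Off
        material.SetFloat("_BlendSrc", (float)BlendMode.SrcAlpha);
        material.SetFloat("_BlendDst", (float)BlendMode.OneMinusSrcAlpha);
    }
}
```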

Tangent space uses vectors from the mesh data to stay relative to the surface of the mesh. It can be difficult to visualise for an entire model as, unlike other spaces, it can differ per-pixel.

(Image)

A way to visualise the tangent space for a given point (centered at the gizmo)

The Normal vector you should already be familiar with - it points out from each vertex, and is the Z axis of tangent space (shown in blue, since XYZ=RGB). The X and Y axes use the Tangent vector and a Bitangent vector (also called Binormal) - the latter is typically calculated using a Cross Product of the other two vectors.

These tangent and bitangent vectors are also aligned to the direction of the UV coordinates stored in the mesh. (The tangent follows the X axis of the UV coordinates, and the bitangent follows the Y axis)

The space is needed so tangent space normal maps (that use UV coordinates for sampling) can be converted back to world space to produce correct lighting/shading. The tangent space View Direction is also used to produce Parallax effects.
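In shader code this conversion is usually handled with a tangent-to-world matrix, something like this URP-flavoured sketch (the input struct/field names are assumptions) :

```hlsl
// Build the tangent space basis in world space (per-vertex)
float3 normalWS  = TransformObjectToWorldNormal(input.normalOS);
float3 tangentWS = TransformObjectToWorldDir(input.tangentOS.xyz);
// Bitangent via cross product; tangent.w stores the handedness sign
float3 bitangentWS = cross(normalWS, tangentWS) * input.tangentOS.w;
float3x3 tangentToWorld = float3x3(tangentWS, bitangentWS, normalWS);

// Later (per-fragment) : convert a tangent space normal map sample to world space
float3 bumpedNormalWS = TransformTangentToWorld(normalTS, tangentToWorld);
```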

Of note, if you are using techniques like Triplanar Mapping, then the “Tangent space” you’d need would be different from the Tangent space calculated from mesh data. This article by Ben Golus explains this in detail. In Shader Graph, the Triplanar node already takes this into account when using its Normal mode (I believe using the “Whiteout Blend” example)

Particle System, Line/Trail Renderer :

Colors for these are usually passed into vertex colors - data stored in each vertex of the mesh. To obtain this :

Sprite Renderer

For Shader Graph, the Vertex Color node should be enough. For versions prior to Unity 6000.0.0b16, you had to use a Custom Function to access unity_SpriteColor (possibly _RendererColor in older versions?) - see an example here. Note that Sprite Graphs will already automatically tint the output, but there’s an option to disable this under Graph Settings (in 2023.3/6000). For older versions you can’t disable the tinting afaik.

For shader code, if instanced in Built-in RP, I believe sprites use a _RendererColor property (in the instancing buffer?). In URP the same property name was used, though as of 2023.1+ that has been replaced with unity_SpriteColor. To support all options should be able to use something like IN.color * _RendererColor * unity_SpriteColor, similar to what the sprite graph does.

See Local UVs for Sprites in Sprite Sheet/Atlas post.

The origin of a mesh is (0,0,0) in object space. To calculate this in world space we could use a matrix multiplication, like float3 originWS = mul(UNITY_MATRIX_M, float4(0,0,0,1)).xyz, however a cheaper method is to extract the translation data from the matrix :
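That extraction just reads the translation (fourth) column of the 4x4 model matrix directly :

```hlsl
// Translation is stored in the last column of the model matrix,
// so this avoids the full matrix multiplication
float3 originWS = UNITY_MATRIX_M._m03_m13_m23;
```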

This is equivalent to the Position output of the Object node in Shader Graph.

Note : Unity 2023.1+ now uses SRP Batching for sprites (in play mode at least) which somewhat fixes this.

For prior versions (and if SRP batching is disabled) :

To save on drawcalls (and performance), Sprites on the screen that use the same material are batched so they can be drawn together. This shows as “Draw Dynamic” in the Frame Debugger window. When this batching occurs, meshes for each sprite are transformed into World space and combined into a single object. The model matrix (UNITY_MATRIX_M) is cleared to an identity matrix (scale of 1, no rotation/translation).

The model matrix is usually responsible for transforming vertex data stored in the mesh into World space, but an identity matrix is used so the values aren’t altered. “Object space” now doesn’t really exist on the shader side, as the vertex positions are already stored in World space.

Anything else that relies on the model matrix also won’t work correctly, such as calculating the origin and scale of the object (outputs on the Object node)

There isn’t really a good way around this afaik, but I don’t work in 2D that often. You could break batching by using different material instances, but that may not be good for performance. Typically you would try to rely on UV coordinates rather than vertex positions.

You could consider using MeshRenderers instead as they can support the SRP Batcher - which doesn’t combine meshes, but instead batches setup between the draw calls.

This error means the shader is using more than 16 samplers. While shaders can support more textures (DX11 supports 128), there is a much lower limit on the number of samplers. To get around this, we can re-use samplers from other textures or use inline sampler states.

In Shader Graph, the Sampler State node can be used to achieve this, connected to the Sampler port on the Sample Texture 2D node.
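In shader code, a sketch of both options might look like this (using the SRP core macros; _MainTex/_OtherTex are just example names) :

```hlsl
TEXTURE2D(_MainTex);
TEXTURE2D(_OtherTex);
SAMPLER(sampler_MainTex);
// Inline sampler state - Unity parses the filter & wrap modes from the name
SAMPLER(sampler_linear_repeat);

half4 SampleBoth(float2 uv)
{
    // Re-use _MainTex's sampler for another texture...
    half4 a = SAMPLE_TEXTURE2D(_OtherTex, sampler_MainTex, uv);
    // ...or use the inline sampler, avoiding an extra per-texture sampler
    half4 b = SAMPLE_TEXTURE2D(_MainTex, sampler_linear_repeat, uv);
    return a * b;
}
```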

Be aware that the GLSL mod function is not an exact equivalent of the fmod function in HLSL - the results differ when dealing with negative values.

If you are converting a shader, you may want to implement your own mod function that matches the GLSL behaviour.
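For reference, the GLSL behaviour can be reproduced in HLSL like so :

```hlsl
// GLSL-style mod : the result takes the sign of y,
// whereas HLSL's fmod (x - y * trunc(x / y)) takes the sign of x
float mod(float x, float y)
{
    return x - y * floor(x / y);
}
```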

While many shaders in the Built-in RP use CGPROGRAM and ENDCG in their shader code, they are still actually written (and compiled) using HLSL. Unity used to use Nvidia’s CG language, but it was deprecated and is no longer used.

When using the CGPROGRAM tag, Unity automatically includes some files from the Built-in RP shader includes (such as HLSLSupport.cginc and UnityShaderVariables.cginc). This can cause conflicts with other shader libraries - so shaders written for the Post Processing package, URP, and HDRP, should all use HLSLPROGRAM and ENDHLSL instead.

Even if it is all HLSL, it’s important to note that the code written for each pipeline can still vary quite a lot as they use separate include files, which can contain different functions and macros. Can view the ShaderLibrary code via :


License / Usage Cookies & Privacy RSS Feed