Unity Rendering Principles (6) Unity HLSL
In Unity, we write shader programs in HLSL syntax. However, Unity originally used the Cg language, so some keyword names (CGPROGRAM) and the file extension for shader include files (.cginc) come from Cg. Although Unity no longer uses Cg, these names are still in use.
In ShaderLab, HLSL code is placed inside a code block. A shader program usually looks like this:
```
Pass {
    // ... the usual pass state setup ...

    CGPROGRAM
    // compilation directives for this snippet, e.g.:
    #pragma vertex vert
    #pragma fragment frag

    // the Cg/HLSL code itself

    ENDCG
    // ... the rest of pass setup ...
}
```
The HLSL language has two syntaxes: the legacy DirectX 9-style syntax and the more modern DirectX 10+ style syntax. The main difference lies in how the texture sampling functions work:
- The legacy syntax uses sampler2D, tex2D(), and similar functions. This syntax works on all platforms.
- The DX10+ syntax uses Texture2D, SamplerState, and the .Sample() functions. Because textures and samplers are not separate objects in OpenGL, some forms of this syntax are not valid on OpenGL platforms.
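As a minimal sketch of the difference (the texture names here are illustrative), the same sampling can be written in either style:

```
// Legacy DX9-style syntax: works on all platforms.
sampler2D _MainTex;

fixed4 fragLegacy (float2 uv : TEXCOORD0) : SV_Target
{
    return tex2D(_MainTex, uv);
}

// DX10+ style syntax: texture and sampler are separate objects.
// Some forms of this are not valid on OpenGL platforms.
Texture2D _OtherTex;
SamplerState sampler_OtherTex;

fixed4 fragDX10 (float2 uv : TEXCOORD0) : SV_Target
{
    return _OtherTex.Sample(sampler_OtherTex, uv);
}
```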
Preprocessing instructions in HLSL
Internally, shader compilation has several stages. The first stage is preprocessing, in which a program called the preprocessor prepares the code for compilation. Preprocessor directives are instructions to the preprocessor.
For details, please refer to the official documentation.
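As a hedged illustration, these are directives that commonly appear in Unity shader snippets (the macro name is made up for the example):

```
// Common preprocessor directives in a Unity shader snippet
#include "UnityCG.cginc"      // pull in Unity's helper functions and macros
#define MY_CUSTOM_SCALE 2.0   // simple macro definition (illustrative name)
#pragma vertex vert           // which function compiles as the vertex shader
#pragma fragment frag         // which function compiles as the fragment shader
#pragma target 3.0            // minimum shader model to compile against
#if defined(SHADER_API_MOBILE)
    // code compiled only for mobile platforms
#endif
```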
Shader semantics
When writing HLSL shader programs, input and output variables need to express their intent through semantics.
Note that a variable's semantic is distinct from its type. For example, in `float4 vertex : POSITION`, the vertex variable has type float4, but its semantic is POSITION, indicating that the variable carries the vertex position.
Vertex shader input semantics
The main vertex shader function (indicated by the #pragma vertex directive) needs semantics on all of its input parameters. These correspond to individual mesh data elements, such as vertex position, normal, and texture coordinates. See Vertex Program Inputs for more details.
Below is an example of a simple vertex shader that takes the vertex position and a texture coordinate as input. The fragment shader visualizes the texture coordinate as a color.
```
Shader "Unlit/Show UVs"
{
    SubShader
    {
        Pass
        {
            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag
            #include "UnityCG.cginc"

            struct v2f {
                float2 uv : TEXCOORD0;
                float4 pos : SV_POSITION;
            };

            v2f vert (
                float4 vertex : POSITION, // vertex position input
                float2 uv : TEXCOORD0     // first texture coordinate input
                )
            {
                v2f o;
                o.pos = UnityObjectToClipPos(vertex);
                o.uv = uv;
                return o;
            }

            fixed4 frag (v2f i) : SV_Target
            {
                return fixed4(i.uv, 0, 0);
            }
            ENDCG
        }
    }
}
```
Fragment shader output semantics
**Typically, a fragment (pixel) shader outputs a color, using the SV_Target semantic**. The fragment shader in the example above looks exactly like this:
```
fixed4 frag (v2f i) : SV_Target
```
The frag function has a return type of fixed4 (a low-precision RGBA color). Because it returns only a single value, the semantic, SV_Target, is indicated on the function itself.
**It is also possible to return a structure containing the outputs. The fragment shader above can be rewritten as follows, with exactly the same behavior**:
```
struct fragOutput {
    fixed4 color : SV_Target;
};

fragOutput frag (v2f i)
{
    fragOutput o;
    o.color = fixed4(i.uv, 0, 0);
    return o;
}
```
Returning a structure from a fragment shader is mostly useful for shaders that return more than a single color. Other semantics supported by fragment shader outputs are as follows.
SV_TargetN: Multiple Rendering Targets
SV_Target1, SV_Target2, etc.: these are additional colors written by the shader, used when rendering to multiple render targets at once (a technique called multiple render targets, or MRT for short). SV_Target0 is equivalent to SV_Target.
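As a hedged sketch (the struct and member names are illustrative), an MRT fragment shader declares one output per render target:

```
// Sketch: writing to two render targets at once.
struct mrtOutput {
    fixed4 albedo : SV_Target0; // first render target (same as SV_Target)
    fixed4 normal : SV_Target1; // second render target
};

mrtOutput frag (v2f i)
{
    mrtOutput o;
    o.albedo = fixed4(i.uv, 0, 0);
    o.normal = fixed4(0, 0, 1, 0);
    return o;
}
```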
SV_Depth: Pixel Shader Depth Output
Normally, a fragment shader does not override the Z buffer value; the default value from regular triangle rasterization is used instead. However, for some effects it is useful to output a custom Z buffer depth value per pixel.
Note that on many GPUs this turns off some depth buffer optimizations, **so do not override the Z buffer value without a good reason**. The cost of SV_Depth depends on the GPU architecture, but overall it is fairly similar to the cost of alpha testing (using the built-in clip() function in HLSL). Render shaders that modify depth after all regular opaque shaders (for example, using the AlphaTest render queue).
The depth output value must be a single float.
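A minimal sketch, assuming a v2f struct like the one above (the depth value written here is purely illustrative):

```
// Sketch: a fragment shader that outputs a custom depth value via SV_Depth.
fixed4 frag (v2f i, out float outDepth : SV_Depth) : SV_Target
{
    outDepth = 0.5; // custom per-pixel depth (illustrative constant)
    return fixed4(i.uv, 0, 0);
}
```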
Vertex shader outputs and fragment shader inputs
A vertex shader needs to output the final clip space position of the vertex, so that the GPU knows where on the screen to rasterize it, and at what depth. This output needs to have the SV_POSITION semantic and be of type float4.
**Any other outputs ("interpolators" or "varyings") produced by the vertex shader are whatever your particular shader needs**. The values output from the vertex shader are interpolated across the face of the rendered triangle, and the value at each pixel is passed as input to the fragment shader.
Many modern GPUs don’t really care about the semantics of these variables; however, some older systems (most notably Shader Model 2 GPUs on Direct3D 9) have special rules about semantics:
- Semantics such as TEXCOORD0 and TEXCOORD1 are used to indicate arbitrary high-precision data, such as texture coordinates and positions.
- The COLOR0 and COLOR1 semantics on vertex outputs and fragment inputs are for low-precision data in the 0 to 1 range (such as simple color values).

For best cross-platform support, label vertex outputs and fragment inputs with TEXCOORDn semantics.
Limit on the number of interpolators
There is a limit on how many interpolator variables can be used in total to pass information from the vertex shader to the fragment shader. The limit depends on the platform and GPU:
- Up to 8 interpolators: OpenGL ES 2.0 (Android), Direct3D 11 feature level 9.x (Windows Phone), and Direct3D 9 Shader Model 2.0 (old PCs). The count of interpolators is limited, but each can be a 4-component vector, so some shaders pack things together to stay under the limit. For example, two texture coordinates can be passed in one float4 variable (.xy for one coordinate, .zw for the second); see the sketch after this list.
- Up to 10 interpolators: Direct3D 9 Shader Model 3.0 (#pragma target 3.0).
- Up to 16 interpolators: OpenGL ES 3.0 (Android) and Metal (iOS).
- Up to 32 interpolators: Direct3D 10 Shader Model 4.0 (#pragma target 4.0).
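A minimal sketch of the packing idea mentioned above (struct and member names are illustrative):

```
// Sketch: packing two texture coordinates into one interpolator.
struct v2f {
    float4 pos : SV_POSITION;
    float4 uvPacked : TEXCOORD0; // .xy = first UV set, .zw = second UV set
};

fixed4 frag (v2f i) : SV_Target
{
    float2 uv0 = i.uvPacked.xy; // unpack the first coordinate
    float2 uv1 = i.uvPacked.zw; // unpack the second coordinate
    return fixed4(uv0, uv1);
}
```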
Regardless of the specific target hardware, it is generally best to use as few interpolators as possible for performance reasons.
Other special semantics
- Screen space pixel position: VPOS
A fragment shader can receive the position of the pixel being rendered via the special VPOS semantic. This feature only exists from Shader Model 3.0 on, so the shader needs the #pragma target 3.0 compilation directive.
The underlying type of the screen space position input varies across platforms, so for maximum portability use the UNITY_VPOS_TYPE type for it (float4 on most platforms, float2 on Direct3D 9).
In addition, using the pixel position semantic makes it harder to have both SV_POSITION and VPOS in the same vertex-to-fragment structure. So the vertex shader should output the clip space position as a separate "out" variable.
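A sketch along the lines of the official checkerboard example; note how the clip space position is a separate out variable so the fragment shader can take VPOS:

```
#pragma target 3.0
#include "UnityCG.cginc"

struct v2f {
    float2 uv : TEXCOORD0;
};

// The clip space position is a separate "out" variable,
// not a member of the v2f struct.
v2f vert (float4 vertex : POSITION, float2 uv : TEXCOORD0,
          out float4 outpos : SV_POSITION)
{
    v2f o;
    o.uv = uv;
    outpos = UnityObjectToClipPos(vertex);
    return o;
}

fixed4 frag (v2f i, UNITY_VPOS_TYPE screenPos : VPOS) : SV_Target
{
    // screenPos.xy is the pixel position; draw a checkerboard from it
    float2 checkerPos = floor(screenPos.xy * 0.25) * 0.5;
    clip(frac(checkerPos.x + checkerPos.y) * 2 - 0.5); // discard half the cells
    return fixed4(i.uv, 0, 1);
}
```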
- Facing direction: VFACE
A fragment shader can receive a variable indicating whether the rendered surface faces toward or away from the camera. This is useful when rendering geometry that should be visible from both sides, typically leaves and similar thin objects. A variable with the VFACE semantic contains a positive value for front-facing triangles and a negative value for back-facing triangles.
This feature only exists since Shader Model 3.0, so shaders need to have the #pragma target 3.0 compile directive.
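A minimal sketch (the two color properties are illustrative names that would be declared in the Properties block):

```
#pragma target 3.0

fixed4 _ColorFront; // assumed material property: front-face color
fixed4 _ColorBack;  // assumed material property: back-face color

fixed4 frag (fixed facing : VFACE) : SV_Target
{
    // VFACE is positive for front-facing triangles, negative for back-facing
    return facing > 0 ? _ColorFront : _ColorBack;
}
```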
- Vertex ID: SV_VertexID
A vertex shader can receive a variable containing the vertex number as an unsigned integer. This is mostly useful when you want to fetch additional per-vertex data from textures or ComputeBuffers.
This feature only exists from DX10 (Shader Model 4.0) and GLCore/OpenGL ES 3 on, so the shader needs the #pragma target 3.5 compilation directive.
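A minimal sketch, assuming a v2f struct with a color member (the color formula is purely illustrative):

```
#pragma target 3.5
#include "UnityCG.cginc"

struct v2f {
    float4 pos : SV_POSITION;
    fixed4 color : COLOR0;
};

v2f vert (appdata_base v, uint vid : SV_VertexID)
{
    v2f o;
    o.pos = UnityObjectToClipPos(v.vertex);
    float f = (float)vid;  // derive a debug color from the vertex number
    o.color = half4(sin(f / 10), sin(f / 100), sin(f / 1000), 0);
    return o;
}
```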
How to use shader properties
A shader declares material properties in its Properties block. If you want to access some of those properties in the shader program, you need to declare a Cg/HLSL variable with the same name and a matching type.
For example, the following shader properties:
```
_MyColor ("Some Color", Color) = (1,1,1,1)
_MyVector ("Some Vector", Vector) = (0,0,0,0)
_MyFloat ("My float", Float) = 0.5
_MyTexture ("Texture", 2D) = "white" {}
_MyCubemap ("Cubemap", CUBE) = "" {}
```
They can be declared for access with the following Cg/HLSL code:
```
fixed4 _MyColor;     // low precision type is usually enough for colors
float4 _MyVector;
float _MyFloat;
sampler2D _MyTexture;
samplerCUBE _MyCubemap;
```
Cg/HLSL can also accept the uniform keyword, but this keyword is not required.
```
uniform float4 _MyColor;
```
Property types in ShaderLab map to Cg/HLSL variable types as follows:
- Color and Vector properties map to float4, half4, or fixed4 variables.
- The Range and Float properties map to float, half, or fixed variables.
- For regular (2D) textures, the Texture property maps to a sampler2D variable; Cubemaps map to samplerCUBE; and 3D textures map to sampler3D.
How to provide property values to shaders
The value of a shader property is found in the following places and provided to the shader:
- Per-renderer values set in a MaterialPropertyBlock. This is usually "per-instance" data (for example, a custom tint color for many objects that all share the same material).
- Values set in the Material used on the rendered object.
- Global shader properties, set either by Unity's rendering code itself (see Built-in Shader Variables) or by your own scripts (for example, Shader.SetGlobalTexture).
The order of precedence is as above: per-instance data overrides everything; then material data is used; and finally, if the shader property does not exist in either of those places, the global property value is used. If the property value is not defined anywhere, a "default" is provided (zero for floats, black for colors, white for textures).
Serialization and runtime material properties
Materials can contain both serialized property values and property values set at runtime.
Serialized data consists of all the properties defined in the shader's Properties block. Typically, these are values that need to be stored in the material and that the user can tweak in the material inspector.
A material can also carry properties that the shader uses but that are not declared in the shader's Properties block. Usually this applies to properties set at runtime from script code (for example, via Material.SetColor). Note that matrices and arrays can only exist as non-serialized runtime properties, since they cannot be defined in the Properties block.
Special texture properties
For each texture set as a shader/material property, Unity also sets some additional information in extra vector properties.
Texture tiling and offset
A material usually has Tiling and Offset fields for each of its texture properties. This information is passed to the shader in a float4 {TextureName}_ST property:
- x contains the X tiling value
- y contains the Y tiling value
- z contains the X offset value
- w contains the Y offset value
For example, if a shader contains a texture named _MainTex, the tiling information will be in the _MainTex_ST vector.
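A minimal sketch of applying tiling and offset by hand; the TRANSFORM_TEX macro in UnityCG.cginc does the same thing:

```
sampler2D _MainTex;
float4 _MainTex_ST; // xy = tiling, zw = offset

fixed4 frag (v2f i) : SV_Target
{
    // scale by the tiling values, then shift by the offsets
    float2 uv = i.uv * _MainTex_ST.xy + _MainTex_ST.zw;
    return tex2D(_MainTex, uv);
}
```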
Texture size
The {TextureName}_TexelSize float4 property contains texture size information:
- x contains 1.0/width
- y contains 1.0/height
- z contains width
- w contains height
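A minimal sketch of a typical use: stepping the UV by exactly one texel, as in neighbor sampling for image filters:

```
sampler2D _MainTex;
float4 _MainTex_TexelSize; // xy = 1/size, zw = size

fixed4 frag (v2f i) : SV_Target
{
    // sample the texel immediately to the right of the current one
    float2 oneTexelRight = float2(_MainTex_TexelSize.x, 0);
    return tex2D(_MainTex, i.uv + oneTexelRight);
}
```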
Texture HDR parameters
The {TextureName}_HDR float4 property contains information on how to decode a potentially HDR (for example, RGBM-encoded) texture, depending on the color space used. See the DecodeHDR function in the UnityCG.cginc shader include file.
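A minimal sketch of decoding with DecodeHDR (the cubemap name is illustrative, and the v2f struct is assumed to carry a sampling direction):

```
#include "UnityCG.cginc"

samplerCUBE _MyCubemap;
float4 _MyCubemap_HDR; // decode instructions set by Unity

fixed4 frag (v2f i) : SV_Target
{
    half4 encoded = texCUBE(_MyCubemap, i.viewDir); // assumed direction member
    half3 hdrColor = DecodeHDR(encoded, _MyCubemap_HDR);
    return fixed4(hdrColor, 1);
}
```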
Color space and color/vector shader data
When using the linear color space, all material color properties are specified as sRGB colors, but they are converted into linear values when passed to shaders.
For example, if the shader's Properties block contains a Color property named "MyColor", the corresponding "MyColor" HLSL variable will receive a linear color value.
**For properties marked as Float or Vector, no color space conversion is done by default; they are assumed to contain non-color data**. A [Gamma] attribute can be added to a float/vector property to indicate that it is specified in sRGB space, just like colors.
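A minimal sketch of the attribute in a Properties block (the property name is illustrative):

```
Properties {
    // declared in sRGB space: converted to linear like a color property
    [Gamma] _MyIntensity ("Intensity (sRGB)", Float) = 1.0
}
```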
Providing vertex data to vertex programs
For Cg/HLSL vertex programs, mesh vertex data is passed as inputs to the vertex shader function. Each input needs to have a semantic specified: for example, the POSITION input is the vertex position and NORMAL is the vertex normal.
Usually, vertex data inputs are declared in a structure rather than listed one by one. Several commonly used vertex structures are defined in the UnityCG.cginc include file, and in most cases they are enough. The structures are:
- appdata_base: position, normal and one texture coordinate.
- appdata_tan: position, tangent, normal and a texture coordinate.
- appdata_full: position, tangent, normal, four texture coordinates and color.
To access other vertex data, you have to declare a vertex structure yourself, or add input parameters to the vertex shader. Vertex data is identified by Cg/HLSL semantics, and must come from the following list:
- POSITION is the vertex position, usually float3 or float4.
- NORMAL is the vertex normal, usually float3.
- TEXCOORD0 is the first UV coordinate, usually float2, float3, or float4.
- TEXCOORD1, TEXCOORD2 and TEXCOORD3 are the 2nd, 3rd and 4th UV coordinates, respectively.
- TANGENT is the tangent vector (used for normal mapping), usually float4.
- COLOR is the color per vertex, usually float4.
When the mesh data contains fewer components than the vertex shader input requires, the rest is padded with zeros, except for the .w component, which defaults to 1. For example, mesh texture coordinates are usually 2D vectors containing only x and y components. If the vertex shader declares a float4 input using TEXCOORD0 semantics, the value received by the vertex shader will contain (x, y, 0, 1).
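A minimal sketch of a custom vertex input structure built from the semantics above (the struct name is illustrative):

```
// Custom vertex input: declare only the data this shader actually needs.
struct appdata_custom {
    float4 vertex : POSITION;  // vertex position
    float3 normal : NORMAL;    // vertex normal
    float2 uv0 : TEXCOORD0;    // first UV set
    float2 uv1 : TEXCOORD1;    // second UV set, e.g. for lightmaps
    fixed4 color : COLOR;      // per-vertex color
};
```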
Built-in macros, variables, helper functions
Unity has many built-in macros (for example, for detecting the current platform), variables (such as the model space transformation matrices), and helper functions (such as for converting model space coordinates to world space).
For specific information, see the official documentation.