URP Depth Normals Texture
Preface
In the Built-in pipeline we obtained the depth texture and the depth normals texture simply by configuring the camera and reading global shader textures. In URP the depth texture can still be obtained that way, but the depth normals texture is only supported from URP 10 onward, and using that URP version requires upgrading to Unity 2020. So on Unity 2019 there is no direct way to get a depth normals texture, but we can build our own with a few extra steps.
Global Fog (Depth Texture)
Why are global fog and height fog covered together? Because in the book 《Shader入门精要》, global fog is explained by using the depth texture to reconstruct each pixel's world position, so implementing global fog requires the depth texture.
Result
Approach
- As in the Built-in pipeline, we need the camera to render a depth texture. By default the camera follows the render pipeline asset's setting, so we enable depth texture rendering on the pipeline asset (a script-side sketch follows this list).
Once it is enabled, the Frame Debugger shows an added Copy Depth step. Because the depth texture only occupies the R channel, its preview appears in red and black (the red in the figure is rather faint, so look closely; the shade of red is determined by the distance between object and camera — the closer the object, the deeper the color).
- Create the shader FogWithDepthTexture. There is no new syntax in it; it is essentially the CG version of the shader translated into HLSL.
Note the use of float4 _MainTex_ST; and _MainTex_TexelSize;: the former holds the main texture's tiling and offset, while the latter holds the texture's texel size and is used to check whether the UV coordinates need to be flipped.
Also, in URP, LinearEyeDepth needs the linearization parameters passed in explicitly. That parameter is _ZBufferParams, derived from the camera's projection: x is (1 - far/near), y is (far/near), z is (x/far), w is (y/far). (A sketch of the math also follows this list.)
- Being a post-processing effect, it naturally has to be integrated into the Volume framework.
- Create FogWithDepthTexture.cs, inheriting VolumeComponent and implementing the IPostProcessComponent interface, to hold the property parameters.
- Modify AdditionalPostProcessData.cs and AdditionalMaterialLibrary.cs to make the shader and material easy to retrieve.
- Modify AdditionPostProcessPass.cs: change the void Render(CommandBuffer cmd, ref RenderingData renderingData) method and add a new method, void SetFogWithDepthTexture(CommandBuffer cmd, Material uberMaterial, CameraData cameraData). From the book we know that reconstructing world positions in the shader requires passing the rays to the four corners of the near clip plane into the shader, so in Render we fetch the current rendering camera from the renderingData parameter and hand it to the new method for the calculation.
- Finally, don't forget to assign the shader to the pipeline's AdditionalPostProcessData asset; after that the fog effect can be added in the Volume component.
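The script-side sketch mentioned above: enabling the depth texture without touching the inspector. This is a minimal sketch, assuming a URP version where UniversalRenderPipelineAsset.supportsCameraDepthTexture and the GetUniversalAdditionalCameraData() extension are available; ticking the Depth Texture checkbox on the pipeline asset (or camera) achieves the same thing.
using UnityEngine;
using UnityEngine.Rendering;
using UnityEngine.Rendering.Universal;
public static class DepthTextureSetup
{
// Enable the Depth Texture option on the active URP asset, so the pipeline
// performs the Copy Depth step and _CameraDepthTexture becomes valid.
public static void EnablePipelineDepthTexture()
{
if (GraphicsSettings.currentRenderPipeline is UniversalRenderPipelineAsset urpAsset)
urpAsset.supportsCameraDepthTexture = true;
}
// Alternatively, request it per camera through the camera's additional data.
public static void EnableCameraDepthTexture(Camera camera)
{
camera.GetUniversalAdditionalCameraData().requiresDepthTexture = true;
}
}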
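And the math sketch: what LinearEyeDepth computes from those parameters. This is a hypothetical C# mirror of the HLSL helper, written out only so the formula is easy to inspect; the real function lives in URP's shader library.
using UnityEngine;
public static class DepthMath
{
// Build _ZBufferParams exactly as described above (conventional, non-reversed Z;
// Unity swaps parts of this on reversed-Z platforms).
public static Vector4 ZBufferParams(float near, float far)
{
float x = 1f - far / near;
float y = far / near;
return new Vector4(x, y, x / far, y / far);
}
// C# mirror of HLSL's LinearEyeDepth(depth, _ZBufferParams):
// converts a raw depth-buffer sample into view-space (eye) depth.
public static float LinearEyeDepth(float rawDepth, Vector4 zBufferParams)
{
return 1f / (zBufferParams.z * rawDepth + zBufferParams.w);
}
}
Plugging in the parameters gives 1 / ((x/far) * d + y/far) = far / (x * d + y), which evaluates to near at d = 0 and far at d = 1. The fog shader then scales the interpolated corner ray by this linear depth: worldPos = _WorldSpaceCameraPos + linearDepth * interpolatedRay, which works because each corner ray was rescaled by |corner| / near in SetFogWithDepthTexture.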
Edge Detection (Depth Normals Texture)
To draw outlines we compare each pixel's normal and depth against those of its neighbors to decide whether the pixel lies on an object edge, which is exactly what the depth normals texture is for. Since some URP versions do not provide a depth normals texture (as mentioned at the start of this chapter), we have to build the wheel ourselves. The workflow is a bit scattered, so take it step by step.
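For reference, the Roberts cross operator that the edge shader below is built around works on a 2x2 neighborhood and differences the two diagonals (standard image-processing notation, not from the book):
Gx = M(i, j) - M(i+1, j+1)
Gy = M(i+1, j) - M(i, j+1)
G = sqrt(Gx^2 + Gy^2), where a large G means an edge.
The implementation below swaps the gradient magnitude for the CheckSame test on decoded depth and normal differences, but the diagonal sampling pattern (uv[1] against uv[2], uv[3] against uv[4]) is exactly Roberts cross.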
Result
Approach
Extending the pipeline
Following the approach commonly seen today: in the URP pipeline, we insert a custom pass after shadow rendering and before URP's own depth rendering to draw the depth normals texture. The pass renders with the Built-in pipeline shader "Hidden/Internal-DepthNormalsTexture". (Yes, we use a Built-in shader inside URP, simply because it is convenient; take a look at that shader's source and you will see how complex it is.) Finally, the rendered texture is set as a pipeline-wide global texture so that shaders in later passes can use it.
- Create DepthNormalsFeature.cs, inheriting from ScriptableRendererFeature, then add the feature in the URP pipeline asset (see the previous post if this step is unfamiliar).
- The logic of the class is annotated in the code, though part of it is honestly guesswork (I'm still learning too); some of the patterns are also closely tied to custom pipeline setup, so my earlier SRP experiments post may help.
- Pay special attention to the shader tag in the class: its name must be matched by a ShaderPass tag on the scene objects' shaders, as discussed below.
ShaderTagId m_ShaderTagId = new ShaderTagId("DepthOnly");
"DepthOnly"Pass
In the shaders that render scene objects, we need to add a matching "DepthOnly" pass, which the pipeline feature created in the previous step uses to render depth.
- The pass can be adapted from the "ShadowCaster" pass in our earlier multi-light shadow work; the only change is removing the shadow-coordinate transform. For convenience, the shader used by the scene objects is extended from our earlier standard-lighting-model shader and named StandardLightDepthPass.
- As mentioned in the previous step, the pipeline feature filters by a shader tag. When writing the "DepthOnly" pass we must set LightMode to that tag, otherwise the object will not be drawn into the depth normals texture.
Tags { "LightMode" = "DepthOnly" }
- If nothing went wrong, the frame debugger now shows the depth normals texture we drew. Note the timing: it is rendered after the shadow passes and before the depth prepass.
Extending the Volume framework
With the prerequisites for the depth normals texture done, the next step is the outline effect itself. As a post-processing effect it again has to be integrated into the Volume framework, and the steps are the familiar ones: the shader, the matching property parameter component, and changes to the AdditionPostProcessPass, AdditionalPostProcessData and AdditionalMaterialLibrary classes.
- Create the shader EdgeDetectionNormalsAndDepth. It too is translated from the edge-detection shader in the book; the only piece we have to implement ourselves is DecodeFloatRG, which decodes the RG channels back into a depth value (a sketch of the encoding follows this list).
Note that the depth read from the depth normals texture is far dimmer than depth read straight from the depth texture; when visualizing it, multiply the decoded value by 1000 to lift the brightness.
- Create the property parameter class EdgeDetectionNormalsAndDepth.cs, integrate it into the Volume framework, and expose the properties the shader needs; the procedure is the same as before, so I won't walk through it again.
- Modify AdditionalPostProcessData.cs and AdditionalMaterialLibrary.cs to make the shader and material easy to retrieve.
- Modify AdditionPostProcessPass.cs: change the void Render(CommandBuffer cmd, ref RenderingData renderingData) method and add a new method, void SetEdgeDetectionNormalsAndDepth(CommandBuffer cmd, Material uberMaterial). Following the book's logic, all we need to do is pass the parameters from the property component into the shader.
- Finally, assign the shader to the pipeline's AdditionalPostProcessData asset, and the outline effect can be added in the Volume component.
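The encoding sketch referenced above: how a depth value survives being split across the R and G channels. This is a hedged C# re-creation of the EncodeFloatRG/DecodeFloatRG pair from Unity's built-in shader includes, shown only to make the round trip obvious; the shader below only needs the decode half.
using UnityEngine;
public static class DepthEncoding
{
// Re-creation of EncodeFloatRG: pack v in [0, 1) into a coarse and a fine channel.
public static Vector2 EncodeFloatRG(float v)
{
float kEncodeBit = 1f / 255f;
float x = v - Mathf.Floor(v); // frac(v): coarse part
float y = v * 255f;
y -= Mathf.Floor(y); // frac(255 * v): fine part
x -= y * kEncodeBit; // remove the fine bits from the coarse channel
return new Vector2(x, y);
}
// Matches the shader's DecodeFloatRG: depth = enc.x + enc.y / 255.
public static float DecodeFloatRG(Vector2 enc)
{
return Vector2.Dot(enc, new Vector2(1f, 1f / 255f));
}
}
Decode(Encode(v)) returns v for any v in [0, 1), since enc.x + enc.y / 255 = frac(v) - frac(255v)/255 + frac(255v)/255 = v.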
Complete code
Global fog
Shader
Shader "URP/Fog With Depth Texture"
{
Properties
{
_MainTex ("Base (RGB)", 2D) = "white" { }
// Fog density
_FogDensity ("Fog Density", Float) = 1.0
// Fog color
_FogColor ("Fog Color", Color) = (1, 1, 1, 1)
// Fog start height
_FogStart ("Fog Start", Float) = 0.0
// Fog end height
_FogEnd ("Fog End", Float) = 1.0
}
SubShader
{
Tags { "RenderPipeline" = "UniversalPipeline" }
HLSLINCLUDE
#include "Packages/com.unity.render-pipelines.universal/ShaderLibrary/Core.hlsl"
CBUFFER_START(UnityPerMaterial)
// Rays to the four corners of the near clip plane
float4x4 _FrustumCornersRay;
float4 _MainTex_ST;
float4 _MainTex_TexelSize;
half _FogDensity;
half4 _FogColor;
float _FogStart;
float _FogEnd;
CBUFFER_END
// Declare the depth texture; note this name is fixed by the pipeline
TEXTURE2D(_CameraDepthTexture);
SAMPLER(sampler_CameraDepthTexture);
// Declare the main texture
TEXTURE2D(_MainTex);
SAMPLER(sampler_MainTex);
struct a2v
{
float4 vertex: POSITION;
float4 texcoord: TEXCOORD0;
};
struct v2f
{
float4 pos: SV_POSITION;
half2 uv: TEXCOORD0;
half2 uv_depth: TEXCOORD1;
// Stores the interpolated per-pixel ray
float4 interpolatedRay: TEXCOORD2;
};
v2f vert(a2v v)
{
v2f o;
o.pos = TransformObjectToHClip(v.vertex.xyz);
o.uv = TRANSFORM_TEX(v.texcoord, _MainTex);
o.uv_depth = o.uv;
// On DirectX-like platforms
#if UNITY_UV_STARTS_AT_TOP
// Check whether Unity has already flipped the coordinates
if (_MainTex_TexelSize.y < 0)
o.uv_depth.y = 1 - o.uv_depth.y;
#endif
int index = 0;
if(v.texcoord.x < 0.5 && v.texcoord.y < 0.5)
{
index = 0;
}
else if(v.texcoord.x > 0.5 && v.texcoord.y < 0.5)
{
index = 1;
}
else if(v.texcoord.x > 0.5 && v.texcoord.y > 0.5)
{
index = 2;
}
else
{
index = 3;
}
// On DirectX-like platforms
#if UNITY_UV_STARTS_AT_TOP
// Check whether Unity has already flipped the coordinates
if (_MainTex_TexelSize.y < 0)
index = 3 - index;
#endif
// Use the index to pick the matching corner ray as the interpolated per-pixel ray
o.interpolatedRay = _FrustumCornersRay[index];
return o;
}
half4 frag(v2f i): SV_Target
{
// Linear depth in view space. LinearEyeDepth: linearize; SAMPLE_DEPTH_TEXTURE: sample the depth texture
//float linearDepth = LinearEyeDepth(SAMPLE_DEPTH_TEXTURE(_CameraDepthTexture, i.uv_depth));
float depth = SAMPLE_DEPTH_TEXTURE(_CameraDepthTexture, sampler_CameraDepthTexture, i.uv_depth);
// Linearize the raw depth-buffer value. _ZBufferParams: used to linearize Z-buffer values.
float linearDepth = LinearEyeDepth(depth, _ZBufferParams);
// World-space position. _WorldSpaceCameraPos: camera position in world space
float3 worldPos = _WorldSpaceCameraPos + linearDepth * i.interpolatedRay.xyz;
// Height fog factor = (end height - pixel height) / (end height - start height)
float fogDensity = (_FogEnd - worldPos.y) / (_FogEnd - _FogStart);
// Apply density; saturate clamps to 0-1
fogDensity = saturate(fogDensity * _FogDensity);
// Original color
half4 finalColor = SAMPLE_TEXTURE2D(_MainTex, sampler_MainTex, i.uv);
// Lerp between the original color and the fog color, using the fog factor
finalColor.rgb = lerp(finalColor.rgb, _FogColor.rgb, fogDensity);
return finalColor;
}
ENDHLSL
Pass
{
Tags { "RenderPipeline" = "UniversalPipeline" }
ZTest Always Cull Off ZWrite Off
HLSLPROGRAM
#pragma vertex vert
#pragma fragment frag
ENDHLSL
}
}
FallBack Off
}
Property parameter component
using System;
// Universal Render Pipeline assembly
namespace UnityEngine.Rendering.Universal
{
// Instantiable class, added to the Volume component menu
[Serializable, VolumeComponentMenu("Addition-Post-processing/FogWithDepthTexture")]
// Inherit VolumeComponent and implement IPostProcessComponent to hook into the Volume framework
public class FogWithDepthTexture : VolumeComponent, IPostProcessComponent
{
// Properties in this framework differ from regular Unity ones; e.g. Int is replaced by ClampedIntParameter.
[Tooltip("Enable")]
public BoolParameter _Switch = new BoolParameter(false);
[Tooltip("Density")]
public ClampedFloatParameter fogDensity = new ClampedFloatParameter(1f, 0, 100);
[Tooltip("Color")]
public ColorParameter fogColor = new ColorParameter(Color.gray);
[Tooltip("Start height")]
public ClampedFloatParameter fogStart = new ClampedFloatParameter(0f, -50f, 50);
[Tooltip("End height")]
public ClampedFloatParameter fogEnd = new ClampedFloatParameter(0f, -50f, 50);
// Interface implementation
public bool IsActive() => _Switch.value;
public bool IsTileCompatible()
{
return false;
}
}
}
Modifying AdditionalPostProcessData and AdditionalMaterialLibrary
using System;
namespace UnityEngine.Rendering.Universal
{
/// <summary>
/// Additional post-processing data
/// </summary>
[Serializable]
public class AdditionalPostProcessData : ScriptableObject
{
[Serializable]
public sealed class Shaders
{
public Shader brightnessSaturationContrast;
public Shader fogWithDepthTexture;
}
public Shaders shaders;
}
}
namespace UnityEngine.Rendering.Universal
{
/// <summary>
/// Material library
/// </summary>
public class AdditionalMaterialLibrary
{
public readonly Material brightnessSaturationContrast;
public readonly Material fogWithDepthTexture;
/// <summary>
/// Load the materials from the configuration asset on initialization
/// </summary>
/// <param name="data"></param>
public AdditionalMaterialLibrary(AdditionalPostProcessData data)
{
brightnessSaturationContrast = Load(data.shaders.brightnessSaturationContrast);
fogWithDepthTexture = Load(data.shaders.fogWithDepthTexture);
}
Material Load(Shader shader)
{
if (shader == null)
{
Debug.LogError($"Missing shader. {GetType().Name} render pass will not execute. Check for missing references in the renderer assets.");
return null;
}
else if (!shader.isSupported)
{
return null;
}
return CoreUtils.CreateEngineMaterial(shader);
}
internal void Cleanup()
{
CoreUtils.Destroy(brightnessSaturationContrast);
CoreUtils.Destroy(fogWithDepthTexture);
}
}
}
Modifying AdditionPostProcessPass
using UnityEngine.Experimental.Rendering;
namespace UnityEngine.Rendering.Universal
{
/// <summary>
/// Additional post-processing pass
/// </summary>
public class AdditionPostProcessPass : ScriptableRenderPass
{
// Tag shown as the command buffer name in the Frame Debugger
const string CommandBufferTag = "AdditionalPostProcessing Pass";
// Material used for post-processing
Material m_BlitMaterial;
AdditionalMaterialLibrary m_Materials;
AdditionalPostProcessData m_Data;
// Source (color) texture
RenderTargetIdentifier m_Source;
// Depth texture
RenderTargetIdentifier m_Depth;
// Render texture descriptor for the current frame
RenderTextureDescriptor m_Descriptor;
// Destination render target
RenderTargetHandle m_Destination;
// Temporary render target
RenderTargetHandle m_TemporaryColorTexture01;
// Property parameter components
BrightnessSaturationContrast m_BrightnessSaturationContrast;
FogWithDepthTexture m_FogWithDepthTexture;
EdgeDetectionNormalsAndDepth m_EdgeDetectionNormalsAndDepth;
public AdditionPostProcessPass(RenderPassEvent evt, AdditionalPostProcessData data, Material blitMaterial = null)
{
renderPassEvent = evt;
m_Data = data;
m_Materials = new AdditionalMaterialLibrary(data);
m_BlitMaterial = blitMaterial;
}
public void Setup(in RenderTextureDescriptor baseDescriptor, in RenderTargetIdentifier source, in RenderTargetIdentifier depth, in RenderTargetHandle destination)
{
m_Descriptor = baseDescriptor;
m_Source = source;
m_Depth = depth;
m_Destination = destination;
}
/// <summary>
/// Called automatically by URP
/// </summary>
/// <param name="context"></param>
/// <param name="renderingData"></param>
public override void Execute(ScriptableRenderContext context, ref RenderingData renderingData)
{
// Get the stack from the Volume framework
var stack = VolumeManager.instance.stack;
// Look up the property parameter components on the stack
m_BrightnessSaturationContrast = stack.GetComponent<BrightnessSaturationContrast>();
m_FogWithDepthTexture = stack.GetComponent<FogWithDepthTexture>();
m_EdgeDetectionNormalsAndDepth = stack.GetComponent<EdgeDetectionNormalsAndDepth>();
// Get a tagged command buffer from the pool; the tag shows up in the Frame Debugger
var cmd = CommandBufferPool.Get(CommandBufferTag);
// Run the render function
Render(cmd, ref renderingData);
// Execute the command buffer
context.ExecuteCommandBuffer(cmd);
// Release the command buffer
CommandBufferPool.Release(cmd);
}
// Rendering
void Render(CommandBuffer cmd, ref RenderingData renderingData)
{
ref var cameraData = ref renderingData.cameraData;
bool m_IsStereo = renderingData.cameraData.isStereoEnabled;
bool isSceneViewCamera = cameraData.isSceneViewCamera;
// Is the VolumeComponent enabled, and is this not the Scene-view camera?
if (m_BrightnessSaturationContrast.IsActive() && !isSceneViewCamera)
{
SetBrightnessSaturationContrast(cmd, m_Materials.brightnessSaturationContrast);
}
// Is the VolumeComponent enabled, and is this not the Scene-view camera?
if (m_FogWithDepthTexture.IsActive() && !isSceneViewCamera)
{
SetFogWithDepthTexture(cmd, m_Materials.fogWithDepthTexture, cameraData);
}
}
RenderTextureDescriptor GetStereoCompatibleDescriptor(int width, int height, int depthBufferBits = 0)
{
var desc = m_Descriptor;
desc.depthBufferBits = depthBufferBits;
desc.msaaSamples = 1;
desc.width = width;
desc.height = height;
return desc;
}
#region Material rendering
// Brightness / saturation / contrast
void SetBrightnessSaturationContrast(CommandBuffer cmd, Material uberMaterial)
{
// Write parameters
uberMaterial.SetFloat("_Brightness", m_BrightnessSaturationContrast.brightness.value);
uberMaterial.SetFloat("_Saturation", m_BrightnessSaturationContrast.saturation.value);
uberMaterial.SetFloat("_Contrast", m_BrightnessSaturationContrast.contrast.value);
// Create a temporary buffer from the destination camera's render descriptor
//RenderTextureDescriptor opaqueDesc = m_Descriptor;
//opaqueDesc.depthBufferBits = 0;
//cmd.GetTemporaryRT(m_TemporaryColorTexture01.id, opaqueDesc);
//or
int tw = m_Descriptor.width;
int th = m_Descriptor.height;
var desc = GetStereoCompatibleDescriptor(tw, th);
cmd.GetTemporaryRT(m_TemporaryColorTexture01.id, desc, FilterMode.Bilinear);
// Blit through the material, writing the result into the temporary buffer
cmd.Blit(m_Source, m_TemporaryColorTexture01.Identifier(), uberMaterial);
// Then copy from the temporary buffer back into the source texture
cmd.Blit(m_TemporaryColorTexture01.Identifier(), m_Source);
// Release the temporary RT
cmd.ReleaseTemporaryRT(m_TemporaryColorTexture01.id);
}
// Global height fog
void SetFogWithDepthTexture(CommandBuffer cmd, Material uberMaterial, CameraData cameraData)
{
Matrix4x4 frustumCorners = Matrix4x4.identity;
Camera camera = cameraData.camera;
// fieldOfView is the camera's vertical viewing angle
float fov = camera.fieldOfView;
// nearClipPlane is the distance to the camera's near clip plane
float near = camera.nearClipPlane;
// aspect is the camera's aspect ratio (width divided by height)
float aspect = camera.aspect;
// Half the height of the near clip plane
float halfHeight = near * Mathf.Tan(fov * 0.5f * Mathf.Deg2Rad);
// Offset from the near-plane center to its right edge
Vector3 toRight = camera.transform.right * halfHeight * aspect;
// Offset from the near-plane center to its top edge
Vector3 toTop = camera.transform.up * halfHeight;
// Direction from the camera to the near plane's top-left corner
Vector3 topLeft = camera.transform.forward * near + toTop - toRight;
float scale = topLeft.magnitude / near;
topLeft.Normalize();
// Rescale the unit ray so that (linear depth * ray) reaches the pixel: scale = dist / near
topLeft *= scale;
Vector3 topRight = camera.transform.forward * near + toRight + toTop;
topRight.Normalize();
topRight *= scale;
Vector3 bottomLeft = camera.transform.forward * near - toTop - toRight;
bottomLeft.Normalize();
bottomLeft *= scale;
Vector3 bottomRight = camera.transform.forward * near + toRight - toTop;
bottomRight.Normalize();
bottomRight *= scale;
// Store the rays to the four near-clip-plane corners into a matrix (the row order matters!)
frustumCorners.SetRow(0, bottomLeft);
frustumCorners.SetRow(1, bottomRight);
frustumCorners.SetRow(2, topRight);
frustumCorners.SetRow(3, topLeft);
// Write the matrix into the material
uberMaterial.SetMatrix("_FrustumCornersRay", frustumCorners);
// Write parameters
uberMaterial.SetFloat("_FogDensity", m_FogWithDepthTexture.fogDensity.value);
uberMaterial.SetColor("_FogColor", m_FogWithDepthTexture.fogColor.value);
uberMaterial.SetFloat("_FogStart", m_FogWithDepthTexture.fogStart.value);
uberMaterial.SetFloat("_FogEnd", m_FogWithDepthTexture.fogEnd.value);
int tw = m_Descriptor.width;
int th = m_Descriptor.height;
var desc = GetStereoCompatibleDescriptor(tw, th);
cmd.GetTemporaryRT(m_TemporaryColorTexture01.id, desc);
// Blit through the material, writing the result into the temporary buffer
cmd.Blit(m_Source, m_TemporaryColorTexture01.Identifier(), uberMaterial);
// Then copy from the temporary buffer back into the source texture
cmd.Blit(m_TemporaryColorTexture01.Identifier(), m_Source);
// Release the temporary RT
cmd.ReleaseTemporaryRT(m_TemporaryColorTexture01.id);
}
#endregion
}
}
Edge detection
DepthNormalsFeature
using UnityEngine;
using UnityEngine.Rendering;
using UnityEngine.Rendering.Universal;
public class DepthNormalsFeature : ScriptableRendererFeature
{
class DepthNormalsPass : ScriptableRenderPass
{
// Depth buffer bit depth
int kDepthBufferBits = 32;
/// <summary>Depth normals texture</summary>
private RenderTargetHandle depthAttachmentHandle { get; set; }
/// <summary>Render texture descriptor of the target camera</summary>
internal RenderTextureDescriptor descriptor { get; private set; }
/// <summary>Material</summary>
private Material depthNormalsMaterial = null;
/// <summary>Filtering settings</summary>
private FilteringSettings m_FilteringSettings;
// Label of this pass in the frame profiler
string m_ProfilerTag = "Depth Normals Pre Pass";
// Shader tag; only shader passes declaring the same tag will be drawn by this pass
ShaderTagId m_ShaderTagId = new ShaderTagId("DepthOnly");
/// <summary>
/// Constructor
/// </summary>
/// <param name="renderQueueRange">render queue range</param>
/// <param name="layerMask">layers of the objects to render</param>
/// <param name="material">material</param>
public DepthNormalsPass(RenderQueueRange renderQueueRange, LayerMask layerMask, Material material)
{
m_FilteringSettings = new FilteringSettings(renderQueueRange, layerMask);
depthNormalsMaterial = material;
}
/// <summary>
/// Parameter setup
/// </summary>
/// <param name="baseDescriptor">render texture descriptor of the target camera</param>
/// <param name="depthAttachmentHandle">depth normals texture</param>
public void Setup(RenderTextureDescriptor baseDescriptor, RenderTargetHandle depthAttachmentHandle)
{
// Set the texture
this.depthAttachmentHandle = depthAttachmentHandle;
// Set the render target info
baseDescriptor.colorFormat = RenderTextureFormat.ARGB32;
baseDescriptor.depthBufferBits = kDepthBufferBits;
descriptor = baseDescriptor;
}
// Called before executing the render pass.
// Use it to configure render targets and their clear state, and to create temporary render textures.
// When empty, this render pass renders to the active camera's render target.
// Never call CommandBuffer.SetRenderTarget here; call ConfigureTarget and ConfigureClear instead.
// The render pipeline ensures target setup and clearing happen in a performant manner.
public override void Configure(CommandBuffer cmd, RenderTextureDescriptor cameraTextureDescriptor)
{
// Get a temporary RT (depth normals texture, target descriptor, filter mode)
cmd.GetTemporaryRT(depthAttachmentHandle.id, descriptor, FilterMode.Point);
// Configure the target
ConfigureTarget(depthAttachmentHandle.Identifier());
// Clear unrendered areas to black
ConfigureClear(ClearFlag.All, Color.black);
}
// Here you implement the rendering logic.
// Use ScriptableRenderContext to issue drawing commands or execute command buffers
// https://docs.unity3d.com/ScriptReference/Rendering.ScriptableRenderContext.html
// You don't have to call ScriptableRenderContext.Submit; the render pipeline calls it at specific points in the pipeline.
public override void Execute(ScriptableRenderContext context, ref RenderingData renderingData)
{
// Get a command buffer
CommandBuffer cmd = CommandBufferPool.Get(m_ProfilerTag);
using (new ProfilingScope(cmd, new ProfilingSampler(m_ProfilerTag)))
{
// Execute the command buffer
context.ExecuteCommandBuffer(cmd);
// Clear its queued data
cmd.Clear();
// Camera sorting flags
var sortFlags = renderingData.cameraData.defaultOpaqueSortFlags;
// Create the draw settings
var drawSettings = CreateDrawingSettings(m_ShaderTagId, ref renderingData, sortFlags);
// Per-object data
drawSettings.perObjectData = PerObjectData.None;
// Check whether this is a VR (stereo) device
ref CameraData cameraData = ref renderingData.cameraData;
Camera camera = cameraData.camera;
if (cameraData.isStereoEnabled)
context.StartMultiEye(camera);
// Set the override material
drawSettings.overrideMaterial = depthNormalsMaterial;
// Draw the renderers
context.DrawRenderers(renderingData.cullResults, ref drawSettings, ref m_FilteringSettings);
// Set the global texture
cmd.SetGlobalTexture("_CameraDepthNormalsTexture", depthAttachmentHandle.id);
}
// Execute the command buffer
context.ExecuteCommandBuffer(cmd);
// Release the command buffer
CommandBufferPool.Release(cmd);
}
// Clean up any allocated resources created during the execution of this render pass.
public override void FrameCleanup(CommandBuffer cmd)
{
if (depthAttachmentHandle != RenderTargetHandle.CameraTarget)
{
cmd.ReleaseTemporaryRT(depthAttachmentHandle.id);
depthAttachmentHandle = RenderTargetHandle.CameraTarget;
}
}
}
// The depth normals pass
DepthNormalsPass depthNormalsPass;
// The depth normals texture
RenderTargetHandle depthNormalsTexture;
// Override material
Material depthNormalsMaterial;
public override void Create()
{
// Create the material from the Built-in pipeline shader
depthNormalsMaterial = CoreUtils.CreateEngineMaterial("Hidden/Internal-DepthNormalsTexture");
// Create the pass (render queue range, layers, material)
depthNormalsPass = new DepthNormalsPass(RenderQueueRange.opaque, -1, depthNormalsMaterial);
// Schedule it after the prepasses
depthNormalsPass.renderPassEvent = RenderPassEvent.AfterRenderingPrePasses;
// Set the texture name
depthNormalsTexture.Init("_CameraDepthNormalsTexture");
}
// Here you can inject one or more render passes into the renderer.
// This method is called when the renderer is being set up.
public override void AddRenderPasses(ScriptableRenderer renderer, ref RenderingData renderingData)
{
// Configure the pass (current camera descriptor, depth normals texture)
depthNormalsPass.Setup(renderingData.cameraData.cameraTargetDescriptor, depthNormalsTexture);
// Enqueue it into the render pipeline
renderer.EnqueuePass(depthNormalsPass);
}
}
Object shader: StandardLightDepthPass
Shader "URP/StandardLightDepthPass"
{
Properties
{
// Base texture
[MainColor] _BaseColor ("Color", Color) = (0.5, 0.5, 0.5, 1)
[MainTexture] _BaseMap ("Albedo", 2D) = "white" { }
// Alpha clipping
[Toggle(_ALPHATEST_ON)] _EnableAlphaTest ("Alpha Cutoff", Float) = 0.0
_Cutoff ("Alpha Cutoff", Range(0.0, 1.0)) = 0.5
// Normal map
[Toggle(_NORMALMAP)] _EnableBumpMap ("Bump Map", Float) = 0.0
_BumpScale ("Normal Scale", Float) = 1.0
_BumpMap ("Normal Map", 2D) = "bump" { }
// Diffuse tint
_Diffuse ("Diffuse", Color) = (1, 1, 1, 1)
// Specular tint
_Specular ("Specular", Color) = (1, 1, 1, 1)
// Gloss (specular exponent)
_Gloss ("Gloss", Range(8.0, 256)) = 20
// Whether to evaluate additional lights
[Toggle(_AdditionalLights)] _AddLights ("AddLights", Float) = 1
}
SubShader
{
Tags { "RenderType" = "Opaque" "RenderPipeline" = "UniversalPipeline" }
LOD 300
HLSLINCLUDE
#include "Packages/com.unity.render-pipelines.universal/ShaderLibrary/Core.hlsl"
CBUFFER_START(UnityPerMaterial)
float4 _BaseColor;
float4 _BaseMap_ST;
half _Cutoff;
float4 _BumpMap_ST;
float _BumpScale;
float4 _Diffuse;
float4 _Specular;
float _Gloss;
CBUFFER_END
ENDHLSL
Pass
{
Tags { "LightMode" = "UniversalForward" }
Cull Off
HLSLPROGRAM
// Keywords
#pragma shader_feature _NORMALMAP
#pragma shader_feature _ALPHATEST_ON
#pragma shader_feature _AdditionalLights
// Keywords required to receive shadows
#pragma multi_compile _ _MAIN_LIGHT_SHADOWS
#pragma multi_compile _ _MAIN_LIGHT_SHADOWS_CASCADE
#pragma multi_compile _ _ADDITIONAL_LIGHTS_VERTEX _ADDITIONAL_LIGHTS
#pragma multi_compile _ _SHADOWS_SOFT
#pragma vertex vert
#pragma fragment frag
#include "Packages/com.unity.render-pipelines.universal/ShaderLibrary/Core.hlsl"
#include "Packages/com.unity.render-pipelines.core/ShaderLibrary/CommonMaterial.hlsl"
#include "Packages/com.unity.render-pipelines.universal/ShaderLibrary/Lighting.hlsl"
TEXTURE2D(_BaseMap);
SAMPLER(sampler_BaseMap);
TEXTURE2D(_BumpMap);
SAMPLER(sampler_BumpMap);
struct Attributes
{
float4 positionOS: POSITION; // position
float3 normalOS: NORMAL; // normal
float4 tangentOS: TANGENT; // tangent
float2 texcoord: TEXCOORD0; // texture coordinates
};
struct Varyings
{
float4 positionCS: SV_POSITION;
float2 uv: TEXCOORD0;
float3 positionWS: TEXCOORD1;
#ifdef _NORMALMAP // if a normal map is used
float4 normalWS: TEXCOORD3; // xyz: normal, w: viewDir.x
float4 tangentWS: TEXCOORD4; // xyz: tangent, w: viewDir.y
float4 bitangentWS: TEXCOORD5; // xyz: bitangent, w: viewDir.z
#else
float3 normalWS: TEXCOORD3;
float3 viewDirWS: TEXCOORD4;
#endif
};
Varyings vert(Attributes v)
{
Varyings o;
// Positions in the different spaces
VertexPositionInputs positionInputs = GetVertexPositionInputs(v.positionOS.xyz);
o.positionCS = positionInputs.positionCS;
o.positionWS = positionInputs.positionWS;
o.uv = TRANSFORM_TEX(v.texcoord, _BaseMap);
// World-space normal-related vectors
VertexNormalInputs normalInput = GetVertexNormalInputs(v.normalOS, v.tangentOS);
// View direction
half3 viewDirWS = GetCameraPositionWS() - positionInputs.positionWS;
#ifdef _NORMALMAP // if a normal map is used
o.normalWS = half4(normalInput.normalWS, viewDirWS.x);
o.tangentWS = half4(normalInput.tangentWS, viewDirWS.y);
o.bitangentWS = half4(normalInput.bitangentWS, viewDirWS.z);
#else
o.normalWS = NormalizeNormalPerVertex(normalInput.normalWS);
o.viewDirWS = viewDirWS;
#endif
return o;
}
/// albedo: surface reflectance
/// lightColor: light color
/// lightDirectionWS: light direction in world space
/// lightAttenuation: light attenuation
/// normalWS: normal in world space
/// viewDirectionWS: view direction in world space
half3 LightingBased(half3 albedo, half3 lightColor, half3 lightDirectionWS, half lightAttenuation, half3 normalWS, half3 viewDirectionWS)
{
// Lambert diffuse
half NdotL = saturate(dot(normalWS, lightDirectionWS));
half3 radiance = lightColor * (lightAttenuation * NdotL) * _Diffuse.rgb;
// Blinn-Phong specular
half3 halfDir = normalize(lightDirectionWS + viewDirectionWS);
half3 specular = lightColor * pow(saturate(dot(normalWS, halfDir)), _Gloss) * _Specular.rgb;
return(radiance + specular) * albedo;
}
// Compute diffuse and specular lighting
half3 LightingBased(half3 albedo, Light light, half3 normalWS, half3 viewDirectionWS)
{
// Note light.distanceAttenuation * light.shadowAttenuation: distance and shadow attenuation are already combined here
return LightingBased(albedo, light.color, light.direction, light.distanceAttenuation * light.shadowAttenuation, normalWS, viewDirectionWS);
}
half4 frag(Varyings i): SV_Target
{
half3 viewDirWS;
half3 normalWS;
#ifdef _NORMALMAP // is a normal map used?
viewDirWS = half3(i.normalWS.w, i.tangentWS.w, i.bitangentWS.w);
// This helper can replace the normal-map sampling below, but it requires including ShaderLibrary/SurfaceInput.hlsl
//half3 normalTS = SampleNormal(i.uv, TEXTURE2D_ARGS(_BumpMap, sampler_BumpMap), _BumpScale);
//or
half3 normalTS = UnpackNormalScale(SAMPLE_TEXTURE2D(_BumpMap, sampler_BumpMap, i.uv), _BumpScale);
normalWS = TransformTangentToWorld(normalTS, half3x3(i.tangentWS.xyz, i.bitangentWS.xyz, i.normalWS.xyz));
#else
viewDirWS = i.viewDirWS;
normalWS = i.normalWS;
#endif
normalWS = NormalizeNormalPerPixel(normalWS);
viewDirWS = SafeNormalize(viewDirWS);
// Texture sampling
// This helper can replace the sampling below, but it requires including ShaderLibrary/SurfaceInput.hlsl
//half4 albedoAlpha = SampleAlbedoAlpha(i.uv, TEXTURE2D_ARGS(_BaseMap, sampler_BaseMap));
// or
half4 albedoAlpha = SAMPLE_TEXTURE2D(_BaseMap, sampler_BaseMap, i.uv);
// Alpha clipping
// This helper can replace the clip below, but it requires including ShaderLibrary/SurfaceInput.hlsl
//half alpha = Alpha(albedoAlpha.a, _BaseColor, _Cutoff);
//or
#if defined(_ALPHATEST_ON)
clip(albedoAlpha.a - _Cutoff);
#endif
// Diffuse reflectance
//half3 albedo = albedoAlpha.rgb * _BaseColor.rgb;
// or
half3 albedo = SAMPLE_TEXTURE2D(_BaseMap, sampler_BaseMap, i.uv).rgb * _BaseColor.rgb;
// Shadow coordinates
float4 shadowCoord = TransformWorldToShadowCoord(i.positionWS.xyz);
// Main light and its shadow
Light mainLight = GetMainLight(shadowCoord);
half3 diffuse = LightingBased(albedo, mainLight, normalWS, viewDirWS);
// Additional lights and their shadows
// The light's ShadowType must be enabled and CastShadows checked in the pipeline asset for additional lights to cast shadows
#ifdef _AdditionalLights
uint pixelLightCount = GetAdditionalLightsCount();
for (uint lightIndex = 0u; lightIndex < pixelLightCount; ++ lightIndex)
{
Light light = GetAdditionalLight(lightIndex, i.positionWS);
diffuse += LightingBased(albedo, light, normalWS, viewDirWS);
}
#endif
// Ambient lighting
half3 ambient = SampleSH(normalWS) * albedo;
return half4(ambient + diffuse, 1.0);
}
ENDHLSL
}
// Shadow-casting pass
Pass
{
Name "ShadowCaster"
Tags { "LightMode" = "ShadowCaster" }
Cull Off
ZWrite On
ZTest LEqual
HLSLPROGRAM
// Keywords
#pragma shader_feature _ALPHATEST_ON
#pragma vertex vert
#pragma fragment frag
#include "Packages/com.unity.render-pipelines.core/ShaderLibrary/CommonMaterial.hlsl"
#include "Packages/com.unity.render-pipelines.universal/ShaderLibrary/Core.hlsl"
#include "Packages/com.unity.render-pipelines.universal/ShaderLibrary/Shadows.hlsl"
float3 _LightDirection;
struct Attributes
{
float4 positionOS: POSITION;
float3 normalOS: NORMAL;
float2 texcoord: TEXCOORD0;
};
struct Varyings
{
float2 uv: TEXCOORD0;
float4 positionCS: SV_POSITION;
};
TEXTURE2D(_BaseMap);
SAMPLER(sampler_BaseMap);
// Get the shadow position in clip space
float4 GetShadowPositionHClips(Attributes input)
{
float3 positionWS = TransformObjectToWorld(input.positionOS.xyz);
float3 normalWS = TransformObjectToWorldNormal(input.normalOS);
// Position in the shadow's clip space, with the shadow bias applied
float4 positionCS = TransformWorldToHClip(ApplyShadowBias(positionWS, normalWS, _LightDirection));
// Clamp Z depending on whether the platform uses reversed Z (e.g. DirectX)
#if UNITY_REVERSED_Z
positionCS.z = min(positionCS.z, positionCS.w * UNITY_NEAR_CLIP_VALUE);
#else
positionCS.z = max(positionCS.z, positionCS.w * UNITY_NEAR_CLIP_VALUE);
#endif
return positionCS;
}
Varyings vert(Attributes input)
{
Varyings output;
output.uv = TRANSFORM_TEX(input.texcoord, _BaseMap);
output.positionCS = GetShadowPositionHClips(input);
return output;
}
half4 frag(Varyings input): SV_TARGET
{
// This helper can replace the clip below, but it requires including ShaderLibrary/SurfaceInput.hlsl
//Alpha(SampleAlbedoAlpha(input.uv, TEXTURE2D_ARGS(_BaseMap, sampler_BaseMap)).a, _BaseColor, _Cutoff);
//or
half4 albedoAlpha = SAMPLE_TEXTURE2D(_BaseMap, sampler_BaseMap, input.uv);
#if defined(_ALPHATEST_ON)
clip(albedoAlpha.a - _Cutoff);
#endif
return 0;
}
ENDHLSL
}
// Pass used to render the depth texture
Pass
{
Name "DepthOnly"
Tags { "LightMode" = "DepthOnly" }
// Enable depth writes
ZWrite On
// Disable writes to all color channels
ColorMask 0
Cull Off
HLSLPROGRAM
#include "Packages/com.unity.render-pipelines.universal/ShaderLibrary/Core.hlsl"
#pragma vertex DepthOnlyVertex
#pragma fragment DepthOnlyFragment
struct Attributes
{
float4 position: POSITION;
float2 texcoord: TEXCOORD0;
};
struct Varyings
{
float2 uv: TEXCOORD0;
float4 positionCS: SV_POSITION;
};
TEXTURE2D(_BaseMap);
SAMPLER(sampler_BaseMap);
Varyings DepthOnlyVertex(Attributes input)
{
Varyings output = (Varyings)0;
output.uv = TRANSFORM_TEX(input.texcoord, _BaseMap);
output.positionCS = TransformObjectToHClip(input.position.xyz);
return output;
}
half4 DepthOnlyFragment(Varyings input): SV_TARGET
{
// This helper can replace the clip below, but it requires including ShaderLibrary/SurfaceInput.hlsl
//Alpha(SampleAlbedoAlpha(input.uv, TEXTURE2D_ARGS(_BaseMap, sampler_BaseMap)).a, _BaseColor, _Cutoff);
//or
half4 albedoAlpha = SAMPLE_TEXTURE2D(_BaseMap, sampler_BaseMap, input.uv);
#if defined(_ALPHATEST_ON)
clip(albedoAlpha.a - _Cutoff);
#endif
return 0;
}
ENDHLSL
}
}
FallBack "Packages/com.unity.render-pipelines.universal/FallbackError"
}
Post-processing shader: EdgeDetectionNormalsAndDepth
Shader "URP/Edge Detection Normals And Depth"
{
Properties
{
_MainTex ("Base (RGB)", 2D) = "white" { }
// Edge-only strength
_EdgeOnly ("Edge Only", Float) = 1.0
// Edge color
_EdgeColor ("Edge Color", Color) = (0, 0, 0, 1)
// Background color
_BackgroundColor ("Background Color", Color) = (1, 1, 1, 1)
// Sampling distance used when sampling the depth+normals texture
_SampleDistance ("Sample Distance", Float) = 1.0
// x and y are the sensitivities of the normal and depth tests
_Sensitivity ("Sensitivity", Vector) = (1, 1, 1, 1)
}
SubShader
{
Tags { "RenderPipeline" = "UniversalPipeline" }
HLSLINCLUDE
#include "Packages/com.unity.render-pipelines.universal/ShaderLibrary/Core.hlsl"
CBUFFER_START(UnityPerMaterial)
half4 _MainTex_ST;
half4 _MainTex_TexelSize;
half _EdgeOnly;
half4 _EdgeColor;
half4 _BackgroundColor;
float _SampleDistance;
half4 _Sensitivity;
CBUFFER_END
// Declare the depth normals texture; note this name is fixed
TEXTURE2D(_CameraDepthNormalsTexture);
SAMPLER(sampler_CameraDepthNormalsTexture);
// Declare the main texture
TEXTURE2D(_MainTex);
SAMPLER(sampler_MainTex);
struct a2v
{
float4 vertex: POSITION;
float4 texcoord: TEXCOORD0;
};
struct v2f
{
float4 pos: SV_POSITION;
// The first coordinate holds the UV for sampling the screen color image
// The other four hold the UVs sampled by the Roberts operator.
half2 uv[5]: TEXCOORD0;
};
v2f vert(a2v v)
{
v2f o;
o.pos = TransformObjectToHClip(v.vertex.xyz);
// Compute the base UV and store it in the first coordinate
half2 uv = TRANSFORM_TEX(v.texcoord, _MainTex);
o.uv[0] = uv;
// On DirectX-like platforms
#if UNITY_UV_STARTS_AT_TOP
// Check whether Unity has already flipped the coordinates
if (_MainTex_TexelSize.y < 0)
uv.y = 1 - uv.y;
#endif
// Sample the neighborhood, with _SampleDistance controlling the sampling distance
o.uv[1] = uv + _MainTex_TexelSize.xy * half2(1, 1) * _SampleDistance;
o.uv[2] = uv + _MainTex_TexelSize.xy * half2(-1, -1) * _SampleDistance;
o.uv[3] = uv + _MainTex_TexelSize.xy * half2(-1, 1) * _SampleDistance;
o.uv[4] = uv + _MainTex_TexelSize.xy * half2(1, -1) * _SampleDistance;
return o;
}
// Decode depth
inline float DecodeFloatRG(float2 enc)
{
float2 kDecodeDot = float2(1.0, 1 / 255.0);
return dot(enc, kDecodeDot);
}
// Decode the normal
float3 DecodeNormal(float4 enc)
{
float kScale = 1.7777;
float3 nn = enc.xyz * float3(2 * kScale, 2 * kScale, 0) + float3(-kScale, -kScale, 1);
float g = 2.0 / dot(nn.xyz, nn.xyz);
float3 n;
n.xy = g * nn.xy;
n.z = g - 1;
return n;
}
inline void DecodeDepthNormal(float4 enc, out float depth, out float3 normal)
{
depth = DecodeFloatRG(enc.zw);
normal = DecodeNormal(enc);
}
half CheckSame(half4 center, half4 sample)
{
// Approximate normal values
half2 centerNormal = center.xy;
// DecodeFloatRG: decode the RG channels into a float depth
float centerDepth = DecodeFloatRG(center.zw);
half2 sampleNormal = sample.xy;
float sampleDepth = DecodeFloatRG(sample.zw);
// Normal difference = absolute difference of the two samples, scaled by the sensitivity
half2 diffNormal = abs(centerNormal - sampleNormal) * _Sensitivity.x;
// Sum the components and compare against a threshold to decide whether there is an edge
int isSameNormal = (diffNormal.x + diffNormal.y) < 0.1;
// Depth difference = absolute difference of the two samples, scaled by the sensitivity
float diffDepth = abs(centerDepth - sampleDepth) * _Sensitivity.y;
// Compare the depth difference against a threshold to decide whether there is an edge
int isSameDepth = diffDepth < 0.1 * centerDepth;
// Multiply the normal and depth results to get the final verdict
// return:
// 1 - the normals and depths are similar enough, i.e. no edge
// 0 - otherwise
return isSameNormal * isSameDepth ? 1.0: 0.0;
}
half4 fragRobertsCrossDepthAndNormal(v2f i): SV_Target
{
// Sample the depth+normals texture at each of the coordinates
half4 sample1 = SAMPLE_TEXTURE2D(_CameraDepthNormalsTexture, sampler_CameraDepthNormalsTexture, i.uv[1]);
half4 sample2 = SAMPLE_TEXTURE2D(_CameraDepthNormalsTexture, sampler_CameraDepthNormalsTexture, i.uv[2]);
half4 sample3 = SAMPLE_TEXTURE2D(_CameraDepthNormalsTexture, sampler_CameraDepthNormalsTexture, i.uv[3]);
half4 sample4 = SAMPLE_TEXTURE2D(_CameraDepthNormalsTexture, sampler_CameraDepthNormalsTexture, i.uv[4]);
half4 sample = SAMPLE_TEXTURE2D(_CameraDepthNormalsTexture, sampler_CameraDepthNormalsTexture, i.uv[0]);
/*
float depth;
float3 normal;
DecodeDepthNormal(sample, depth, normal);
// Visualize depth. Depth read from the depth normals texture is much dimmer than depth read straight from the depth texture; multiply by 1000 to brighten it
return depth * 1000;
// Visualize normals
return float4(normal, 1);
*/
half edge = 1.0;
// Compare the two diagonals; CheckSame returns 0 or 1
edge *= CheckSame(sample1, sample2);
edge *= CheckSame(sample3, sample4);
// Lerp between the edge color and the scene image
half4 withEdgeColor = lerp(_EdgeColor, SAMPLE_TEXTURE2D(_MainTex, sampler_MainTex, i.uv[0]), edge);
// Lerp between the edge color and the background color
half4 onlyEdgeColor = lerp(_EdgeColor, _BackgroundColor, edge);
// Lerp between the two, controlled by the edge-only strength
return lerp(withEdgeColor, onlyEdgeColor, _EdgeOnly);
}
ENDHLSL
Pass
{
Tags { "RenderPipeline" = "UniversalPipeline" }
ZTest Always Cull Off ZWrite Off
HLSLPROGRAM
#pragma vertex vert
#pragma fragment fragRobertsCrossDepthAndNormal
ENDHLSL
}
}
FallBack Off
}
Property parameter component
using System;
// Universal Render Pipeline assembly
namespace UnityEngine.Rendering.Universal
{
// Instantiable class, added to the Volume component menu
[Serializable, VolumeComponentMenu("Addition-Post-processing/EdgeDetectionNormalsAndDepth")]
// Inherit VolumeComponent and implement IPostProcessComponent to hook into the Volume framework
public class EdgeDetectionNormalsAndDepth : VolumeComponent, IPostProcessComponent
{
// Properties in this framework differ from regular Unity ones; e.g. Int is replaced by ClampedIntParameter.
[Tooltip("Enable")]
public BoolParameter _Switch = new BoolParameter(false);
[Tooltip("Edge-only strength")]
public ClampedFloatParameter edgesOnly = new ClampedFloatParameter(0f, 0, 1);
[Tooltip("Edge color")]
public ColorParameter edgeColor = new ColorParameter(Color.black);
[Tooltip("Background color")]
public ColorParameter backgroundColor = new ColorParameter(Color.white);
[Tooltip("Sample distance")]
public ClampedFloatParameter sampleDistance = new ClampedFloatParameter(1f, -10f, 10);
[Tooltip("Depth sensitivity")]
public ClampedFloatParameter sensitivityDepth = new ClampedFloatParameter(1f, -10f, 10);
[Tooltip("Normal sensitivity")]
public ClampedFloatParameter sensitivityNormals = new ClampedFloatParameter(1f, -10f, 10);
// Interface implementation
public bool IsActive() => _Switch.value;
public bool IsTileCompatible()
{
return false;
}
}
}
Modifying AdditionalPostProcessData and AdditionalMaterialLibrary
using System;
namespace UnityEngine.Rendering.Universal
{
/// <summary>
/// Additional post-processing data
/// </summary>
[Serializable]
public class AdditionalPostProcessData : ScriptableObject
{
[Serializable]
public sealed class Shaders
{
public Shader brightnessSaturationContrast;
public Shader fogWithDepthTexture;
public Shader edgeDetectionNormalsAndDepth;
}
public Shaders shaders;
}
}
namespace UnityEngine.Rendering.Universal
{
/// <summary>
/// Material library
/// </summary>
public class AdditionalMaterialLibrary
{
public readonly Material brightnessSaturationContrast;
public readonly Material fogWithDepthTexture;
public readonly Material edgeDetectionNormalsAndDepth;
/// <summary>
/// Load the materials from the configuration asset on initialization
/// </summary>
/// <param name="data"></param>
public AdditionalMaterialLibrary(AdditionalPostProcessData data)
{
brightnessSaturationContrast = Load(data.shaders.brightnessSaturationContrast);
fogWithDepthTexture = Load(data.shaders.fogWithDepthTexture);
edgeDetectionNormalsAndDepth = Load(data.shaders.edgeDetectionNormalsAndDepth);
}
Material Load(Shader shader)
{
if (shader == null)
{
Debug.LogError($"Missing shader. {GetType().Name} render pass will not execute. Check for missing references in the renderer assets.");
return null;
}
else if (!shader.isSupported)
{
return null;
}
return CoreUtils.CreateEngineMaterial(shader);
}
internal void Cleanup()
{
CoreUtils.Destroy(brightnessSaturationContrast);
CoreUtils.Destroy(fogWithDepthTexture);
CoreUtils.Destroy(edgeDetectionNormalsAndDepth);
}
}
}
Modifying AdditionPostProcessPass
using UnityEngine.Experimental.Rendering;
namespace UnityEngine.Rendering.Universal
{
/// <summary>
/// Additional post-processing pass
/// </summary>
public class AdditionPostProcessPass : ScriptableRenderPass
{
// Tag shown as the command buffer name in the Frame Debugger
const string CommandBufferTag = "AdditionalPostProcessing Pass";
// Material used for post-processing
Material m_BlitMaterial;
AdditionalMaterialLibrary m_Materials;
AdditionalPostProcessData m_Data;
// Source (color) texture
RenderTargetIdentifier m_Source;
// Depth texture
RenderTargetIdentifier m_Depth;
// Render texture descriptor for the current frame
RenderTextureDescriptor m_Descriptor;
// Destination render target
RenderTargetHandle m_Destination;
// Temporary render target
RenderTargetHandle m_TemporaryColorTexture01;
// Property parameter components
BrightnessSaturationContrast m_BrightnessSaturationContrast;
FogWithDepthTexture m_FogWithDepthTexture;
EdgeDetectionNormalsAndDepth m_EdgeDetectionNormalsAndDepth;
public AdditionPostProcessPass(RenderPassEvent evt, AdditionalPostProcessData data, Material blitMaterial = null)
{
renderPassEvent = evt;
m_Data = data;
m_Materials = new AdditionalMaterialLibrary(data);
m_BlitMaterial = blitMaterial;
}
public void Setup(in RenderTextureDescriptor baseDescriptor, in RenderTargetIdentifier source, in RenderTargetIdentifier depth, in RenderTargetHandle destination)
{
m_Descriptor = baseDescriptor;
m_Source = source;
m_Depth = depth;
m_Destination = destination;
}
/// <summary>
/// Called automatically by URP
/// </summary>
/// <param name="context"></param>
/// <param name="renderingData"></param>
public override void Execute(ScriptableRenderContext context, ref RenderingData renderingData)
{
// Get the stack from the Volume framework
var stack = VolumeManager.instance.stack;
// Look up the property parameter components on the stack
m_BrightnessSaturationContrast = stack.GetComponent<BrightnessSaturationContrast>();
m_FogWithDepthTexture = stack.GetComponent<FogWithDepthTexture>();
m_EdgeDetectionNormalsAndDepth = stack.GetComponent<EdgeDetectionNormalsAndDepth>();
// Get a tagged command buffer from the pool; the tag shows up in the Frame Debugger
var cmd = CommandBufferPool.Get(CommandBufferTag);
// Run the render function
Render(cmd, ref renderingData);
// Execute the command buffer
context.ExecuteCommandBuffer(cmd);
// Release the command buffer
CommandBufferPool.Release(cmd);
}
// Rendering
void Render(CommandBuffer cmd, ref RenderingData renderingData)
{
ref var cameraData = ref renderingData.cameraData;
bool m_IsStereo = renderingData.cameraData.isStereoEnabled;
bool isSceneViewCamera = cameraData.isSceneViewCamera;
// Is the VolumeComponent enabled, and is this not the Scene-view camera?
// Brightness / saturation / contrast
if (m_BrightnessSaturationContrast.IsActive() && !isSceneViewCamera)
{
SetBrightnessSaturationContrast(cmd, m_Materials.brightnessSaturationContrast);
}
// Global fog
if (m_FogWithDepthTexture.IsActive() && !isSceneViewCamera)
{
SetFogWithDepthTexture(cmd, m_Materials.fogWithDepthTexture, cameraData);
}
// Edge detection
if (m_EdgeDetectionNormalsAndDepth.IsActive() && !isSceneViewCamera)
{
SetEdgeDetectionNormalsAndDepth(cmd, m_Materials.edgeDetectionNormalsAndDepth);
}
}
RenderTextureDescriptor GetStereoCompatibleDescriptor(int width, int height, int depthBufferBits = 0)
{
var desc = m_Descriptor;
desc.depthBufferBits = depthBufferBits;
desc.msaaSamples = 1;
desc.width = width;
desc.height = height;
return desc;
}
#region Material rendering
// Brightness / saturation / contrast
void SetBrightnessSaturationContrast(CommandBuffer cmd, Material uberMaterial)
{
// Write parameters
uberMaterial.SetFloat("_Brightness", m_BrightnessSaturationContrast.brightness.value);
uberMaterial.SetFloat("_Saturation", m_BrightnessSaturationContrast.saturation.value);
uberMaterial.SetFloat("_Contrast", m_BrightnessSaturationContrast.contrast.value);
// Create a temporary buffer from the destination camera's render descriptor
//RenderTextureDescriptor opaqueDesc = m_Descriptor;
//opaqueDesc.depthBufferBits = 0;
//cmd.GetTemporaryRT(m_TemporaryColorTexture01.id, opaqueDesc);
//or
int tw = m_Descriptor.width;
int th = m_Descriptor.height;
var desc = GetStereoCompatibleDescriptor(tw, th);
cmd.GetTemporaryRT(m_TemporaryColorTexture01.id, desc, FilterMode.Bilinear);
// Blit through the material, writing the result into the temporary buffer
cmd.Blit(m_Source, m_TemporaryColorTexture01.Identifier(), uberMaterial);
// Then copy from the temporary buffer back into the source texture
cmd.Blit(m_TemporaryColorTexture01.Identifier(), m_Source);
// Release the temporary RT
cmd.ReleaseTemporaryRT(m_TemporaryColorTexture01.id);
}
// Global height fog
void SetFogWithDepthTexture(CommandBuffer cmd, Material uberMaterial, CameraData cameraData)
{
Matrix4x4 frustumCorners = Matrix4x4.identity;
Camera camera = cameraData.camera;
// fieldOfView is the camera's vertical viewing angle
float fov = camera.fieldOfView;
// nearClipPlane is the distance to the camera's near clip plane
float near = camera.nearClipPlane;
// aspect is the camera's aspect ratio (width divided by height)
float aspect = camera.aspect;
// Half the height of the near clip plane
float halfHeight = near * Mathf.Tan(fov * 0.5f * Mathf.Deg2Rad);
// Offset from the near-plane center to its right edge
Vector3 toRight = camera.transform.right * halfHeight * aspect;
// Offset from the near-plane center to its top edge
Vector3 toTop = camera.transform.up * halfHeight;
// Direction from the camera to the near plane's top-left corner
Vector3 topLeft = camera.transform.forward * near + toTop - toRight;
float scale = topLeft.magnitude / near;
topLeft.Normalize();
// Rescale the unit ray so that (linear depth * ray) reaches the pixel: scale = dist / near
topLeft *= scale;
Vector3 topRight = camera.transform.forward * near + toRight + toTop;
topRight.Normalize();
topRight *= scale;
Vector3 bottomLeft = camera.transform.forward * near - toTop - toRight;
bottomLeft.Normalize();
bottomLeft *= scale;
Vector3 bottomRight = camera.transform.forward * near + toRight - toTop;
bottomRight.Normalize();
bottomRight *= scale;
// Store the rays to the four near-clip-plane corners into a matrix (the row order matters!)
frustumCorners.SetRow(0, bottomLeft);
frustumCorners.SetRow(1, bottomRight);
frustumCorners.SetRow(2, topRight);
frustumCorners.SetRow(3, topLeft);
// Write the matrix into the material
uberMaterial.SetMatrix("_FrustumCornersRay", frustumCorners);
// Write parameters
uberMaterial.SetFloat("_FogDensity", m_FogWithDepthTexture.fogDensity.value);
uberMaterial.SetColor("_FogColor", m_FogWithDepthTexture.fogColor.value);
uberMaterial.SetFloat("_FogStart", m_FogWithDepthTexture.fogStart.value);
uberMaterial.SetFloat("_FogEnd", m_FogWithDepthTexture.fogEnd.value);
int tw = m_Descriptor.width;
int th = m_Descriptor.height;
var desc = GetStereoCompatibleDescriptor(tw, th);
cmd.GetTemporaryRT(m_TemporaryColorTexture01.id, desc);
// Blit through the material, writing the result into the temporary buffer
cmd.Blit(m_Source, m_TemporaryColorTexture01.Identifier(), uberMaterial);
// Then copy from the temporary buffer back into the source texture
cmd.Blit(m_TemporaryColorTexture01.Identifier(), m_Source);
// Release the temporary RT
cmd.ReleaseTemporaryRT(m_TemporaryColorTexture01.id);
}
// Edge detection
void SetEdgeDetectionNormalsAndDepth(CommandBuffer cmd, Material uberMaterial)
{
// Write parameters
uberMaterial.SetFloat("_EdgeOnly", m_EdgeDetectionNormalsAndDepth.edgesOnly.value);
uberMaterial.SetColor("_EdgeColor", m_EdgeDetectionNormalsAndDepth.edgeColor.value);
uberMaterial.SetColor("_BackgroundColor", m_EdgeDetectionNormalsAndDepth.backgroundColor.value);
uberMaterial.SetFloat("_SampleDistance", m_EdgeDetectionNormalsAndDepth.sampleDistance.value);
uberMaterial.SetVector("_Sensitivity", new Vector4(m_EdgeDetectionNormalsAndDepth.sensitivityNormals.value, m_EdgeDetectionNormalsAndDepth.sensitivityDepth.value, 0.0f, 0.0f));
int tw = m_Descriptor.width;
int th = m_Descriptor.height;
var desc = GetStereoCompatibleDescriptor(tw, th);
cmd.GetTemporaryRT(m_TemporaryColorTexture01.id, desc, FilterMode.Bilinear);
// Blit through the material, writing the result into the temporary buffer
cmd.Blit(m_Source, m_TemporaryColorTexture01.Identifier(), uberMaterial);
// Then copy from the temporary buffer back into the source texture
cmd.Blit(m_TemporaryColorTexture01.Identifier(), m_Source);
// Release the temporary RT
cmd.ReleaseTemporaryRT(m_TemporaryColorTexture01.id);
}
#endregion
}
}