In the Built-in render pipeline it was easy to change a camera's RenderTarget: an extra camera renders its view into a texture, which can be post-processed and then applied to a chosen object such as a mirror or a lake surface. After our project switched to URP, however, the render pipeline changed, and only after asking a colleague did I learn that the extra camera's rendering has to be taken over manually. The cleaned-up code is as follows:
using System.Collections;
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.Rendering;
using UnityEngine.Rendering.Universal;

public class JCTest : MonoBehaviour
{
    public Camera mCamera;      // the extra camera to take over (SCamera in the example below)
    public RenderTexture mRT;   // temporary target the extra camera renders into

    void OnEnable()
    {
        RenderPipelineManager.beginCameraRendering += OnBeginCameraRendering;
        mRT = RenderTexture.GetTemporary(512, 512, 16, RenderTextureFormat.ARGB32);
        mCamera.targetTexture = mRT;
    }

    void OnDisable()
    {
        RenderPipelineManager.beginCameraRendering -= OnBeginCameraRendering;
        RenderTexture.ReleaseTemporary(mRT);
    }

    private void OnBeginCameraRendering(ScriptableRenderContext context, Camera camera)
    {
        // Skip editor-only cameras, then render the taken-over camera manually.
        if (camera.cameraType == CameraType.SceneView || camera.cameraType == CameraType.Preview)
            return;
        UniversalRenderPipeline.RenderSingleCamera(context, mCamera);
    }
}
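For completeness, here is a minimal sketch of how the resulting mRT could be fed to an object such as the mirror or lake surface mentioned above. The JCApplyRT class, its targetRenderer field, and the use of material.mainTexture are my assumptions for illustration, not part of the original project; it only assumes the object's material samples its main texture (e.g. _BaseMap on the URP Lit shader).

using UnityEngine;

// Hypothetical helper: feeds the RenderTexture produced by JCTest into
// the material of the object that should display it (e.g. a mirror quad).
public class JCApplyRT : MonoBehaviour
{
    public JCTest source;            // the component that owns mRT
    public Renderer targetRenderer;  // the mirror / lake surface renderer

    void LateUpdate()
    {
        if (source != null && source.mRT != null && targetRenderer != null)
        {
            // Assumes the material's main texture is what gets sampled on screen.
            targetRenderer.material.mainTexture = source.mRT;
        }
    }
}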
Notes:
1. The taken-over camera does not need to be added to the Base camera's stack.
2. The taken-over camera must be rendered by calling RenderSingleCamera inside the RenderPipelineManager.beginCameraRendering callback, and it must be a Base camera, because UniversalRenderPipeline.RenderSingleCamera filters out Overlay cameras. The relevant source is as follows:
public static void RenderSingleCamera(ScriptableRenderContext context, Camera camera)
{
    UniversalAdditionalCameraData additionalCameraData = null;
    if (IsGameCamera(camera))
        camera.gameObject.TryGetComponent(out additionalCameraData);

    if (additionalCameraData != null && additionalCameraData.renderType != CameraRenderType.Base)
    {
        Debug.LogWarning("Only Base cameras can be rendered with standalone RenderSingleCamera. Camera will be skipped.");
        return;
    }

    InitializeCameraData(camera, additionalCameraData, out var cameraData);
    RenderSingleCamera(context, cameraData, true, cameraData.postProcessEnabled);
}
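Because of this filter, it can be worth asserting in your own code that the camera you hand to RenderSingleCamera really is a Base camera. A small sketch, assuming URP's standard GetUniversalAdditionalCameraData extension method; the helper class and log message are mine, not part of the original post:

using UnityEngine;
using UnityEngine.Rendering.Universal;

public static class CameraTypeCheck
{
    // Returns true only if the camera would not be skipped by RenderSingleCamera.
    public static bool IsRenderableAsStandalone(Camera cam)
    {
        var data = cam.GetUniversalAdditionalCameraData();
        if (data != null && data.renderType != CameraRenderType.Base)
        {
            Debug.LogWarning(cam.name + " is an Overlay camera and would be skipped by RenderSingleCamera.");
            return false;
        }
        return true;
    }
}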
The result of running it is as follows:
1. Camera is the Base camera. The prefab UI_Login_Select is placed on a layer that Camera cannot see but SCamera can (mCamera in the code above is SCamera); the same layer split can also be set from code, see the sketch after this list.
2. Attach JCTest.cs to SCamera.
3. Set Camera's background color to red.
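The layer visibility in step 1 is set up in the Inspector, but it could also be done from a script. A minimal sketch, assuming the prefab sits on a layer named "SCameraOnly" (the layer name and the LayerSetupExample class are my assumptions):

using UnityEngine;

public class LayerSetupExample : MonoBehaviour
{
    public Camera baseCamera; // "Camera" in the steps above
    public Camera sCamera;    // "SCamera", the taken-over camera

    void Start()
    {
        int layer = LayerMask.NameToLayer("SCameraOnly");
        if (layer < 0) return; // layer not defined in the project

        // Base camera must not see the layer; SCamera must see it.
        baseCamera.cullingMask &= ~(1 << layer);
        sCamera.cullingMask |= 1 << layer;
    }
}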
With this setup, UI_Login_Select is not visible in the Game window. Run the game and open the Frame Debugger to check:
The Frame Debugger shows that what SCamera sees has been successfully written into the temporary RT. Done.
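Besides the Frame Debugger, a quick way to confirm the RT contents at runtime is simply to show mRT on screen. A minimal sketch using a UI RawImage; the RTPreview class and its fields are assumptions for illustration:

using UnityEngine;
using UnityEngine.UI;

public class RTPreview : MonoBehaviour
{
    public JCTest source;      // the component that owns mRT
    public RawImage rtPreview; // a RawImage placed on a screen-space canvas

    void LateUpdate()
    {
        if (source != null && rtPreview != null)
            rtPreview.texture = source.mRT; // displays whatever SCamera rendered
    }
}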