UE4.27: A surfing wave rendering attempt

An idea came to my mind recently: I wanted to try rendering a surfing wave volumetrically.
My concept here is simple:

An irregular shape.

An irregular shape can be made with several erosion volumes. As shown in the simple diagram above, one rectangle is used for filling and four circles for erosion. I extend this concept to surfing waves:

Wave filling & erosion.

Yay! Designers only need to transform the erosion volumes to make various surfing waves.

Detail

It's all done in compute shaders. I added three passes for it.

Culling Pass

Similar to the tile-based culling technique in Forward+ lighting, a voxel-based culling is implemented. I simply use a box-intersection test between each voxel and volume, then store the volume count and indices in a list. There are already lots of samples on the internet, so I won't go into detail; a rough sketch follows the references below. My calculation is done in world space.

Tile-based culling references:
https://wickedengine.net/2018/01/10/optimizing-tile-based-light-culling/
http://brabl.com/tile-frustum-calculation-for-the-point-light-culling/
(Just convert the concept to 3D grids instead of 2D tiles)
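
For reference, here is a minimal sketch of what such a culling kernel could look like. This is not the project's actual code: the buffer layout, the grid constants, and the MAX_CULLED_VOLUMES limit are all assumptions.

	// Sketch of a voxel-based culling pass (assumed names and layout, not the project's code).
	#define MAX_CULLED_VOLUMES 8

	struct FWaveVolumeBounds
	{
		float3 WorldMin;	// world-space AABB of a fill/erosion volume
		float3 WorldMax;
	};

	uint NumWaveVolumes;
	uint3 VolumeGridSize;		// number of voxels per axis
	float3 VolumeGridOrigin;	// world-space origin of the voxel grid
	float3 VoxelWorldSize;		// world-space size of one voxel

	StructuredBuffer<FWaveVolumeBounds> WaveVolumeBounds;
	RWStructuredBuffer<uint> CulledVolumeList;	// per voxel: [count, idx0, idx1, ...]

	bool AABBIntersects(float3 MinA, float3 MaxA, float3 MinB, float3 MaxB)
	{
		return all(MinA <= MaxB) && all(MinB <= MaxA);
	}

	[numthreads(4, 4, 4)]
	void WaveCullingCS(uint3 VoxelId : SV_DispatchThreadID)
	{
		if (any(VoxelId >= VolumeGridSize))
			return;

		// world-space bounds of this voxel
		float3 VoxelMin = VolumeGridOrigin + VoxelId * VoxelWorldSize;
		float3 VoxelMax = VoxelMin + VoxelWorldSize;

		uint LinearIndex = VoxelId.x + VoxelId.y * VolumeGridSize.x + VoxelId.z * VolumeGridSize.x * VolumeGridSize.y;
		uint ListStart = LinearIndex * (MAX_CULLED_VOLUMES + 1);
		uint Count = 0;

		for (uint i = 0; i < NumWaveVolumes && Count < MAX_CULLED_VOLUMES; i++)
		{
			if (AABBIntersects(VoxelMin, VoxelMax, WaveVolumeBounds[i].WorldMin, WaveVolumeBounds[i].WorldMax))
			{
				CulledVolumeList[ListStart + 1 + Count] = i;
				Count++;
			}
		}

		// store the count first, followed by the intersecting volume indices
		CulledVolumeList[ListStart] = Count;
	}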

Filling Pass

This is the most important pass. It uses the culling result and only processes voxels inside a volume. The erosion volume attenuation is then computed with an ellipse formula:

(x - h)^2 / rx^2 + (z - k)^2 / rz^2 <= 1.0f

This simplifies to x^2/rx^2 + z^2/rz^2 when XZ is in the volume's local space, since the center offsets (h, k) drop out. I use the XZ axes here, so the Y axis becomes the width scale of a surfing wave. The attenuation is only calculated once per voxel, not inside the sample-step loop; computing it per step wouldn't improve the result.
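
As a rough sketch, the per-voxel attenuation could look like the function below. Only the ellipse test itself comes from the formula above; the parameter names and the smooth falloff are my own assumptions.

	// Sketch of the per-voxel erosion attenuation. CellLocal is assumed to be the
	// voxel position in the erosion volume's local space (so h and k drop out);
	// ErosionRadius holds rx and rz.
	float ComputeErosionAtten(float3 CellLocal, float2 ErosionRadius)
	{
		// ellipse test on the local XZ plane; Y is the wave's width axis
		float2 Scaled = CellLocal.xz / ErosionRadius;
		float Ellipse = Scaled.x * Scaled.x + Scaled.y * Scaled.y;	// x^2/rx^2 + z^2/rz^2

		// 0 at the ellipse center (fully eroded), 1 on and outside the boundary
		return saturate(Ellipse);
	}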

Next, step through each voxel with a sample count (1, 2, 4, 8, or 16 in my system). Each step computes a height fade, a cell fade, and a foam fade. The output color blends between the water and foam textures based on the foam fade. When sampling the textures, I also take camera distance into account when choosing the mip level, so far voxels won't look noisy.

		// sample the volume textures only once per voxel, choosing the LOD from camera distance
		float DistToCam = length(CellWorldStart - View.WorldCameraOrigin);
		DistToCam *= 0.0005;
		DistToCam = Pow2(DistToCam);
		
		// far voxels pick higher mips, so they won't look noisy
		float WaterMip = lerp(0, WaterVolumeMipCount - 1, DistToCam);
		float FoamMip = lerp(0, FoamVolumeMipCount - 1, DistToCam);
		
		float3 WaterColor = WaterVolumeTexture.SampleLevel(Bilinear3DWrappedSamplerState, TotalVolumeUV, WaterMip).rgb
			* WaveVolumeData.TintColor.rgb;
		float3 FoamColor = FoamVolumeTexture.SampleLevel(Bilinear3DWrappedSamplerState, TotalFoamUV * Noise, FoamMip).rgb;
		
		// blend water and foam by the averaged foam fade, then accumulate
		float3 Color = lerp(WaterColor * WaveVolumeData.Intensity, FoamColor * WaveVolumeData.FoamIntensity, TotalFoamFade).rgb;
		OutputColor.rgb += Color;
		// alpha combines the erosion attenuation with the cell and height fades
		OutputColor.a += Atten * TotalCellFade * TotalHeightFade * WaveVolumeData.TintColor.a;

Atten: the erosion volume result.
Height Fade: computed with a cos() function, using the local X coordinate as input.
Cell Fade: saturate((1.0f - abs(CellLocal.x)) / FadeThick); the Y and Z axes use the same formula, and the attenuation of all three axes is multiplied together. The output alpha will be used in the final pass.
Foam Fade: computed based on a "Foam Point" set by the user. It's a 2D vector in local XZ.
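
Under those definitions, the height and cell fades could be sketched as below. The cell fade follows the formula above and assumes CellLocal is normalized to [-1, 1] per axis; the exact cosine remapping for the height fade is my guess, and the foam fade is omitted since it depends on the user-set foam point.

	static const float PI = 3.14159265f;

	// Height fade: a cosine falloff over the local X coordinate
	// (the exact remapping into the cosine is an assumption).
	float ComputeHeightFade(float CellLocalX)
	{
		return saturate(cos(CellLocalX * PI * 0.5f));
	}

	// Cell fade: fade out near the volume border on each axis and multiply the
	// three axes together; the result feeds the output alpha in the final pass.
	float ComputeCellFade(float3 CellLocal, float FadeThick)
	{
		float FadeX = saturate((1.0f - abs(CellLocal.x)) / FadeThick);
		float FadeY = saturate((1.0f - abs(CellLocal.y)) / FadeThick);
		float FadeZ = saturate((1.0f - abs(CellLocal.z)) / FadeThick);
		return FadeX * FadeY * FadeZ;
	}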

The trick here is that I don't sample the water and foam textures at every step. I only average the attenuation parameters and UV coordinates over the voxel steps, then sample once. If the volume texture resolution is very high, sampling it per step could hurt performance, and multi-sampling the texture can also make the result blurrier.
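
A rough sketch of that averaging, reusing the fade helpers sketched above (MAX_STEP_COUNT, the step positions, and the local-space-to-UV mapping are placeholders, not the project's code):

	#define MAX_STEP_COUNT 16

	struct FAveragedParams
	{
		float HeightFade;
		float CellFade;
		float3 VolumeUV;
	};

	// Average the fade parameters and UVs over the sample steps;
	// the water/foam textures are then sampled only once with the averaged UV.
	FAveragedParams AverageOverSteps(float3 StepLocalPos[MAX_STEP_COUNT], uint SampleCount, float FadeThick)
	{
		FAveragedParams Out = (FAveragedParams)0;

		for (uint Step = 0; Step < SampleCount; Step++)
		{
			float3 CellLocal = StepLocalPos[Step];	// sample position inside the voxel, in wave-local space
			Out.HeightFade += ComputeHeightFade(CellLocal.x);
			Out.CellFade += ComputeCellFade(CellLocal, FadeThick);
			Out.VolumeUV += CellLocal * 0.5f + 0.5f;	// local [-1, 1] to UV [0, 1], an assumption
		}

		float InvCount = 1.0f / SampleCount;
		Out.HeightFade *= InvCount;
		Out.CellFade *= InvCount;
		Out.VolumeUV *= InvCount;
		return Out;
	}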

Last but not least, I also sample the result of the previous frame. Like TAA, this can reduce aliasing on the edges. (But a high history weight also blurs the result.)
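
As a tiny sketch of that history blend (HistoryVolumeTexture and HistoryWeight are assumed names; the sampler and UV names come from the snippets above):

	// blend the current result toward last frame's volume, TAA-style;
	// a higher HistoryWeight reduces edge aliasing but also blurs the wave
	float4 History = HistoryVolumeTexture.SampleLevel(Bilinear3DClampedSamplerState, TotalVolumeUV, 0);
	OutputColor = lerp(OutputColor, History, HistoryWeight);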

Final Pass

Simply loop through the volume slices per pixel and blend the result with the scene color.

	// march the culled slice range from far to near (back-to-front)
	for (int idx = Slice.y; idx >= Slice.x; idx--)
	{
		VolumeUV.z = float(idx * WaveVolume.InvVolumeGridSize.z);
		float4 Color = InputWaveVolume.SampleLevel(Bilinear3DClampedSamplerState, VolumeUV, 0);
		
		// alpha blend the slice over the accumulated result
		OutputColor = lerp(OutputColor, Color, Color.a);
	}

I compute this like alpha blending, so it must be in back-to-front order.
There is actually a minor pass before this one: finding the min/max slices at each pixel, so I don't have to step through every slice here. That pass simply checks whether a slice has alpha > 0 at a lower resolution. Minor, but worth it. I also convert the scene depth to a Z slice and compare it with the min slice, so occluded voxels won't show up.
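
A rough sketch of that slice-range search (function and parameter names are mine; in practice it runs at a downsampled resolution, and exactly how the scene-depth slice clips the range is my assumption):

	// Find the first and last slice with any alpha at this pixel,
	// then clamp by the slice derived from scene depth so occluded voxels are skipped.
	uint2 FindSliceRange(float2 VolumeUVxy, uint GridSizeZ, float InvGridSizeZ, uint SceneDepthSlice)
	{
		uint MinSlice = GridSizeZ - 1;
		uint MaxSlice = 0;

		for (uint idx = 0; idx < GridSizeZ; idx++)
		{
			float3 VolumeUV = float3(VolumeUVxy, idx * InvGridSizeZ);
			float Alpha = InputWaveVolume.SampleLevel(Bilinear3DClampedSamplerState, VolumeUV, 0).a;
			if (Alpha > 0.0f)
			{
				MinSlice = min(MinSlice, idx);
				MaxSlice = max(MaxSlice, idx);
			}
		}

		MaxSlice = min(MaxSlice, SceneDepthSlice);
		return uint2(MinSlice, MaxSlice);	// feeds Slice.x / Slice.y in the final pass loop
	}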

Performance & Async Compute

Apparently, this won't be as performant as a rasterization method. It costs 3.65 ms for just one volume with the following settings on an RTX 3060 laptop at 1080p:
PixelSize=2
Downsize=4
SampleCount=8
HistorySampleCount=4
WaveFarPlane=6000.0f
GridSizeZ=128

Setting the per-voxel pixel size to 4 lowers the cost to 1.21 ms, and adjusting other parameters can also save a few milliseconds.
But that also makes the result blurry... and water rendering should be clear, not blurry!
A blurry result is more suitable for fog, clouds, etc.

Lowering the far plane distance can also hurt performance. The cost went up to 5.29 ms after I set the far plane to 300.0f. This is because a lower far distance increases the number of slices being written to. The far plane is usually used to decide the depth value of a slice.
In my case, if my volume sits in the depth range [0, 1500], the output slices are [2, 32] with a linear formula. However, restricting the far plane to 300 means all slices get written, which indeed brings extra cost. So you'll want to use a high far-plane value for volumetric rendering.
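
For illustration, that linear mapping could be sketched as below (the function name is mine). With WaveFarPlane = 6000 and GridSizeZ = 128, a volume covering the depth range [0, 1500] only touches roughly the first 32 slices; with WaveFarPlane = 300 the same volume spans every slice.

	// Sketch of a linear depth-to-slice mapping
	uint DepthToSlice(float SceneDepth, float WaveFarPlane, uint GridSizeZ)
	{
		float Normalized = saturate(SceneDepth / WaveFarPlane);
		return (uint)(Normalized * (GridSizeZ - 1));
	}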

The last thing is async compute. The idea is to put compute and graphics tasks on separate command queues and execute them asynchronously where possible. It works on supported hardware, and UE4 provides this feature: use -ForceAsyncCompute in the UE4 editor, according to the UE 4.26 release notes. For now it doesn't make much difference for me, since my scene doesn't have that many tasks, but it could be useful when there is more GPU work.

Future Work

I might try this idea with a rasterization method. The volumetric approach just can't reach the quality of rasterized water rendering without ruining performance. To make it work with rasterization, I must have a mesh like this:

Yes, it must be solid instead of surface-only. Otherwise, I would only see the back side of the mesh after the front is faded out by the erosion volume. Another headache is lighting: the vertex normals must be recalculated, or other ways to receive lighting must be considered.

Anyway, this concludes another one of my volumetric rendering practices!
