In this article, I will show how to break a 3D model up into polygon pieces, an effect triggered by the distance between the camera and the model.

## Change the object according to the distance to the camera

You can add motion to the drawn object by changing the parameters of the shader.

This time I use **the distance between the camera and the object** as a variable.

Let’s create an effect in which the object changes its colour and other properties according to the camera’s movement.

In the Project window, select ‘Create > Shader > Unlit Shader’ to create a new Shader file.

Modify the created Shader file as follows.

```
Shader "Custom/Color Gradient"
{
    Properties
    {
        _Distance ("Distance", float) = 3.0
        _FarColor ("Far Color", Color) = (0, 0, 0, 1)
        _NearColor ("Near Color", Color) = (1, 1, 1, 1)
    }
    SubShader
    {
        Tags { "RenderType"="Opaque" }

        Pass
        {
            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag

            #include "UnityCG.cginc"

            float _Distance;
            fixed4 _FarColor;
            fixed4 _NearColor;

            struct appdata
            {
                float4 vertex : POSITION;
            };

            struct v2f
            {
                float4 vertex : SV_POSITION;
                float3 worldPos : TEXCOORD0;
            };

            v2f vert (appdata v)
            {
                v2f o;
                o.vertex = UnityObjectToClipPos(v.vertex);
                // Convert local coordinates into world coordinates
                o.worldPos = mul(unity_ObjectToWorld, v.vertex).xyz;
                return o;
            }

            fixed4 frag (v2f i) : SV_Target
            {
                // Get the distance between the camera and the object
                float dist = length(_WorldSpaceCameraPos - i.worldPos);
                // Change the colour by Lerp
                fixed4 col = fixed4(lerp(_NearColor.rgb, _FarColor.rgb, dist / _Distance), 1);
                return col;
            }
            ENDCG
        }
    }
}
```

This Shader converts the vertex positions from the local coordinate system into the world coordinate system in the Vertex Shader and passes them to the Fragment Shader.

Then, it calculates the distance between the camera and the object and changes the colour of the object according to the calculated distance by using the Lerp function.

```
// Get the distance between the camera and the object
float dist = length(_WorldSpaceCameraPos - i.worldPos);
// Change the colour by Lerp
fixed4 col = fixed4(lerp(_NearColor.rgb, _FarColor.rgb, dist / _Distance), 1);
```

Now, the Shader changes the object according to the distance to the camera.
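To see the arithmetic outside HLSL, here is a minimal CPU-side sketch in Python (the function names and values are illustrative, not part of the Unity project) reproducing the fragment shader’s colour calculation. Note that HLSL’s `lerp` does not clamp its interpolation factor, so beyond the `_Distance` threshold the result extrapolates:

```python
import math

def lerp(a, b, t):
    # HLSL's lerp: a + (b - a) * t, with no clamping of t
    return tuple(ai + (bi - ai) * t for ai, bi in zip(a, b))

def fragment_colour(camera_pos, world_pos,
                    near=(1, 1, 1), far=(0, 0, 0), distance=3.0):
    # length(_WorldSpaceCameraPos - i.worldPos)
    dist = math.dist(camera_pos, world_pos)
    # lerp(_NearColor.rgb, _FarColor.rgb, dist / _Distance)
    return lerp(near, far, dist / distance)

print(fragment_colour((0, 0, 0), (0, 0, 0)))    # at the camera: near colour (1.0, 1.0, 1.0)
print(fragment_colour((0, 0, 0), (0, 0, 3.0)))  # at _Distance: far colour (0.0, 0.0, 0.0)
```

With the default values, a point at the camera gets the near colour, and a point exactly `_Distance` away gets the far colour.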

## Add motion to individual polygons

By using the Geometry Shader, you can add motion to individual polygons.

Within a Shader, processing happens in the following order.

- Vertex Shader
- Geometry Shader
- Fragment Shader

Note that the Geometry Shader doesn’t work in the web player due to the limitations of WebGL.

When using it in STYLY, please be aware that **it doesn’t work in ‘STYLY Studio’ but it does work in VR**.

Create a new Shader file as before, and modify it as follows.

```
Shader "Custom/Simple Geometry"
{
    Properties
    {
        _Color ("Color", Color) = (1, 1, 1, 1)
    }
    SubShader
    {
        Tags { "RenderType"="Opaque" }

        Pass
        {
            CGPROGRAM
            #pragma vertex vert
            #pragma geometry geom
            #pragma fragment frag

            #include "UnityCG.cginc"

            fixed4 _Color;

            struct appdata
            {
                float4 vertex : POSITION;
                float2 uv : TEXCOORD0;
            };

            struct g2f
            {
                float4 vertex : SV_POSITION;
                float2 uv : TEXCOORD0;
            };

            appdata vert (appdata v)
            {
                return v;
            }

            // Geometry Shader
            [maxvertexcount(3)]
            void geom (triangle appdata input[3], inout TriangleStream<g2f> stream)
            {
                [unroll]
                for(int i = 0; i < 3; i++)
                {
                    appdata v = input[i];
                    g2f o;
                    o.vertex = UnityObjectToClipPos(v.vertex);
                    o.uv = v.uv;
                    stream.Append(o);
                }
                stream.RestartStrip();
            }

            fixed4 frag (g2f i) : SV_Target
            {
                fixed4 col = _Color;
                return col;
            }
            ENDCG
        }
    }
    FallBack "Unlit/Color"
}
```

This Shader itself doesn’t add any motion. It’s just a sample to show how to write code for the Geometry Shader.

Let’s add motion by using the built-in ‘_SinTime’ shader variable together with the Geometry Shader.

```
Shader "Custom/Polygon Moving"
{
    Properties
    {
        _Color ("Color", Color) = (1, 1, 1, 1)
        _ScaleFactor ("Scale Factor", float) = 0.5
    }
    SubShader
    {
        Tags { "RenderType"="Opaque" }

        Pass
        {
            CGPROGRAM
            #pragma vertex vert
            #pragma geometry geom
            #pragma fragment frag

            #include "UnityCG.cginc"

            fixed4 _Color;
            float _ScaleFactor;

            struct appdata
            {
                float4 vertex : POSITION;
                float2 uv : TEXCOORD0;
            };

            struct g2f
            {
                float4 vertex : SV_POSITION;
                float2 uv : TEXCOORD0;
            };

            appdata vert (appdata v)
            {
                return v;
            }

            // Geometry Shader
            [maxvertexcount(3)]
            void geom (triangle appdata input[3], inout TriangleStream<g2f> stream)
            {
                // Calculate the normal vector
                float3 vec1 = input[1].vertex - input[0].vertex;
                float3 vec2 = input[2].vertex - input[0].vertex;
                float3 normal = normalize(cross(vec1, vec2));

                [unroll]
                for(int i = 0; i < 3; i++)
                {
                    appdata v = input[i];
                    g2f o;
                    // Move the vertex along the normal vector
                    v.vertex.xyz += normal * (_SinTime.w * 0.5 + 0.5) * _ScaleFactor;
                    o.vertex = UnityObjectToClipPos(v.vertex);
                    o.uv = v.uv;
                    stream.Append(o);
                }
                stream.RestartStrip();
            }

            fixed4 frag (g2f i) : SV_Target
            {
                fixed4 col = _Color;
                return col;
            }
            ENDCG
        }
    }
    FallBack "Unlit/Color"
}
```

This adds **‘the calculation of the normal vector’** and **‘the process of moving vertices by using _SinTime’** to the simple Geometry Shader created earlier.

Now, let me explain it step by step.

### Calculate the normal vector

We calculate the cross product to get the normal vector. The cross product gives a vector perpendicular to two given non-parallel vectors.

In other words, for a triangular polygon, it gives **the normal vector of the polygon: a vector perpendicular to the two edge vectors that connect one vertex to the other two.**

```
// Calculate the normal vector
float3 vec1 = input[1].vertex - input[0].vertex;
float3 vec2 = input[2].vertex - input[0].vertex;
float3 normal = normalize(cross(vec1, vec2));
```
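The same calculation can be checked on the CPU. This small Python sketch (illustrative only; the helper functions are not from the Unity project) shows that the cross product of two triangle edges is perpendicular to both of them, i.e. it is the face normal:

```python
import math

def sub(a, b):
    return tuple(x - y for x, y in zip(a, b))

def cross(a, b):
    # Cross product of two 3D vectors
    return (a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0])

def dot(a, b):
    return sum(x*y for x, y in zip(a, b))

def normalize(v):
    n = math.sqrt(dot(v, v))
    return tuple(x / n for x in v)

# A triangle lying in the XY plane
v0, v1, v2 = (0, 0, 0), (1, 0, 0), (0, 1, 0)
vec1 = sub(v1, v0)
vec2 = sub(v2, v0)
normal = normalize(cross(vec1, vec2))

print(normal)             # (0.0, 0.0, 1.0) -- points along +Z, out of the triangle's plane
print(dot(normal, vec1))  # 0.0 -- perpendicular to both edges
print(dot(normal, vec2))  # 0.0
```

Note that the vertex order (winding) determines which side of the polygon the normal points out of.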

### Animation by using ‘_SinTime’

Next, we set up the animation. We use ‘_SinTime’, which is the sine of the elapsed time. It smoothly oscillates between -1.0 and 1.0 as time goes by.

These preset variables in Unity (‘built-in shader variables’) let you save calculation cost.

Unity – Manual: ShaderLab Built-in shader variables

This time, in order to keep the value positive, we remap it so that it smoothly oscillates between 0.0 and 1.0.

Multiplying the minimum value -1.0 by 0.5 gives -0.5; adding 0.5 then yields 0.0. Applying the same calculation to the maximum value 1.0 yields 1.0 (i.e. the maximum doesn’t change).

```
_SinTime.w * 0.5 + 0.5
```

Now the variable, which oscillated between -1.0 and 1.0, oscillates between 0.0 and 1.0.
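The remap is plain arithmetic, so it can be sketched in Python (the function name `remap01` is just for illustration):

```python
import math

def remap01(x):
    # Map the range [-1, 1] linearly onto [0, 1]
    return x * 0.5 + 0.5

print(remap01(-1.0))  # 0.0 -- the minimum maps to 0
print(remap01(0.0))   # 0.5 -- the midpoint maps to 0.5
print(remap01(1.0))   # 1.0 -- the maximum is unchanged

# _SinTime.w behaves like sin(t), so remap01(math.sin(t))
# smoothly oscillates in [0, 1] as t increases
```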

Finally, in order to add a motion that breaks the object up into individual polygons along their normal vectors, multiply the normal vector by the ‘_SinTime-controlled’ variable and add the result to the coordinates of each vertex.

```
// Move the vertex along the normal vector
v.vertex.xyz += normal * (_SinTime.w * 0.5 + 0.5) * _ScaleFactor;
```

## The program code for the ‘Polygon Destruction’ Shader

By combining **‘the distance between the camera and the object’** and **‘the Geometry Shader’**, let’s create the ‘Polygon Destruction’ Shader, which explodes the 3D object as the camera approaches.

Create a Shader file and rewrite it as follows.

```
Shader "Custom/Polygon Destruction"
{
    Properties
    {
        _FarColor ("Far Color", Color) = (1, 1, 1, 1)
        _NearColor ("Near Color", Color) = (0, 0, 0, 1)
        _ScaleFactor ("Scale Factor", float) = 0.5
        _StartDistance ("Start Distance", float) = 3.0
    }
    SubShader
    {
        Tags { "RenderType"="Opaque" }

        Pass
        {
            CGPROGRAM
            #pragma vertex vert
            #pragma geometry geom
            #pragma fragment frag

            #include "UnityCG.cginc"

            fixed4 _FarColor;
            fixed4 _NearColor;
            fixed _ScaleFactor;
            fixed _StartDistance;

            struct appdata
            {
                float4 vertex : POSITION;
                float2 uv : TEXCOORD0;
            };

            struct g2f
            {
                float4 vertex : SV_POSITION;
                float2 uv : TEXCOORD0;
                fixed4 color : COLOR;
            };

            // Return a pseudo-random value in [0, 1)
            float rand(float2 seed)
            {
                return frac(sin(dot(seed.xy, float2(12.9898, 78.233))) * 43758.5453);
            }

            appdata vert (appdata v)
            {
                return v;
            }

            // Geometry Shader
            [maxvertexcount(3)]
            void geom (triangle appdata input[3], inout TriangleStream<g2f> stream)
            {
                // The distance between the camera and the centre of gravity of the polygon
                float3 center = (input[0].vertex + input[1].vertex + input[2].vertex).xyz / 3;
                float4 worldPos = mul(unity_ObjectToWorld, float4(center, 1.0));
                float dist = length(_WorldSpaceCameraPos - worldPos.xyz);

                // Calculate the normal vector
                float3 vec1 = input[1].vertex - input[0].vertex;
                float3 vec2 = input[2].vertex - input[0].vertex;
                float3 normal = normalize(cross(vec1, vec2));

                // Change how far the polygons fly apart according to the distance to the camera
                fixed destruction = clamp(_StartDistance - dist, 0.0, 1.0);
                // Change the colour according to the distance to the camera
                fixed gradient = clamp(dist - _StartDistance, 0.0, 1.0);

                fixed random = rand(center.xy);
                fixed3 random3 = random.xxx;

                [unroll]
                for(int i = 0; i < 3; i++)
                {
                    appdata v = input[i];
                    g2f o;
                    // Move the vertex along the normal vector
                    v.vertex.xyz += normal * destruction * _ScaleFactor * random3;
                    o.vertex = UnityObjectToClipPos(v.vertex);
                    o.uv = v.uv;
                    // Change the colour by Lerp
                    o.color = fixed4(lerp(_NearColor.rgb, _FarColor.rgb, gradient), 1);
                    stream.Append(o);
                }
                stream.RestartStrip();
            }

            fixed4 frag (g2f i) : SV_Target
            {
                fixed4 col = i.color;
                return col;
            }
            ENDCG
        }
    }
    FallBack "Unlit/Color"
}
```

This Shader code combines **‘the distance to the camera’** and **‘processing individual polygons’**, both explained earlier.
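Two ingredients are new compared with the earlier shaders: the `rand()` hash (a widely used GLSL/HLSL one-liner) and the `clamp`-based `destruction`/`gradient` ramps. The following Python sketch illustrates both on the CPU (helper names are illustrative; GPU floating-point will differ slightly in precision, so don’t expect bit-identical values):

```python
import math

def frac(x):
    # HLSL frac(): fractional part, always in [0, 1)
    return x - math.floor(x)

def rand(seed_xy):
    # frac(sin(dot(seed, float2(12.9898, 78.233))) * 43758.5453)
    return frac(math.sin(seed_xy[0] * 12.9898 + seed_xy[1] * 78.233) * 43758.5453)

# Deterministic per seed, so each polygon centre gets a stable random offset
for centre in [(0.1, 0.2), (0.5, 0.5), (-1.3, 2.7)]:
    r = rand(centre)
    assert 0.0 <= r < 1.0
    print(centre, "->", r)

def clamp01(x):
    # clamp(x, 0.0, 1.0) from the shader
    return max(0.0, min(1.0, x))

START_DISTANCE = 3.0  # assumed _StartDistance
# Closer than _StartDistance: destruction ramps up, gradient stays 0
print(clamp01(START_DISTANCE - 2.5))  # destruction = 0.5
print(clamp01(2.5 - START_DISTANCE))  # gradient = 0.0
```

Because the seed is the polygon’s centre of gravity, all three vertices of a triangle share the same random value, so the triangle moves as one rigid piece rather than tearing apart.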

## The effect by using the ‘Polygon Destruction’ Shader

This time I created a motion that breaks the object up into individual polygons, driven by the distance to the camera in the shader code. However, you can also create motion by changing the parameters with ‘_SinTime’ or an ‘Animator’.

When controlling the parameters with an Animator, you can give the object a wide variety of motions.

## Upload to STYLY

Attach the Material created by your custom Shader to an object and upload it to use in a STYLY Scene.

The following article explains more about how to upload an asset to STYLY:

Even if a program looks complicated at first, in many cases it is just a combination of several basic elements. Once you can identify the ‘key’ elements, you will be able to make good use of shaders as you like.