Unity Official Introduction Tutorial Watch Notes

Today I went through the latest introductory tutorial on the Unity official website. While watching, I briefly summarized the concepts it covers and the points worth paying attention to.

GameObject: Game Object

Game objects are the most important concept in the Unity editor. Every object in a game is a game object, from characters and collectible items to lights, cameras, and special effects. However, a game object cannot do anything by itself; you need to give it properties before it becomes a character, an environment, or a special effect.

Game objects are the basic objects that represent characters, props, and scenes in Unity. They don’t do much work by themselves, but they act as containers for components that implement functionality.

So there is not much to say about a GameObject by itself; what it can do basically depends on which Components it has.

3D

Many basic shapes can be created in the scene, such as Cube, Sphere, etc.

New objects can be created in various ways. After creating one, double-click it or press the F key to focus on it.

After selecting an object, some basic properties show up on the right-hand side; these are called Components.

UI

TextMesh

This kind of UI object effectively lives in 2D, so you need to switch to the 2D view to observe it.

Its Transform is special: it is called a Rect Transform.

The Anchor determines where the reference point is, and the position is the offset relative to that anchor.

For example, here the anchor is in the upper-left corner and the offset is 0.

A quick way to change this is to click the Anchor Presets box in the upper-left corner of the Rect Transform; some preset anchors can be selected directly. Hold the Option key while choosing to also set the offset at the same time.

Component

Transform

It is divided into Position, Rotation, and Scale.
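
A minimal sketch (my own, not from the tutorial) of reading and writing these three parts from a script:

using UnityEngine;

// Moves, spins, and scales the GameObject this script is attached to.
public class TransformDemo : MonoBehaviour
{
    void Update()
    {
        transform.position += Vector3.forward * Time.deltaTime;  // change Position along Z
        transform.Rotate(0f, 90f * Time.deltaTime, 0f);          // change Rotation around Y
        transform.localScale = Vector3.one * 2f;                  // change Scale (double the size)
    }
}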

Mesh Filter

The Mesh Filter takes a mesh from your assets and passes it to the Mesh Renderer for rendering on the screen.

When importing mesh assets, Unity automatically creates a Skinned Mesh Renderer if the mesh is skinned, or a Mesh Filter along with a Mesh Renderer, if it is not.

In order for non-skinned Meshes to be rendered, both the Mesh Filter and Mesh Renderer components must be present for legacy reasons.

My preliminary understanding: the Mesh Filter is the pure mesh data used to render a GameObject, and it is automatically handed to the Mesh Renderer on the same object, which renders the whole object.

Mesh Renderer

It uses the Mesh Filter, the Material, and related settings to render the GameObject into the scene (a small script sketch follows the list below).

  • Material: the material used for rendering

  • Lighting

  • Cast Shadows: whether this renderer casts shadows; if turned off, no shadow is produced under a light source

  • Probes: the term translates as “probe”; its specific role is still to be added

  • Light Probes:

  • Additional Settings:
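
A small sketch (assumed setup: the components already exist on the GameObject) of tweaking these settings from code:

using UnityEngine;
using UnityEngine.Rendering;

public class RendererSettingsDemo : MonoBehaviour
{
    public Material alternativeMaterial;  // assigned in the Inspector

    void Start()
    {
        MeshRenderer meshRenderer = GetComponent<MeshRenderer>();
        meshRenderer.material = alternativeMaterial;             // the Material used for rendering
        meshRenderer.shadowCastingMode = ShadowCastingMode.Off;  // Cast Shadows off: no shadow is produced
        meshRenderer.receiveShadows = false;                     // stop receiving shadows as well
    }
}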

Mesh Collider

The Mesh Collider takes a Mesh Asset and builds its Collider based on that Mesh. It is more accurate for collision detection than using primitives for complicated Meshes. Mesh Colliders that are marked as Convex can collide with other Mesh Colliders.

Mesh Colliders, which build on the mesh from the Mesh Filter, can compute collisions more accurately. Mesh Colliders marked as Convex can collide with other Mesh Colliders (that is, two Mesh Colliders only collide with each other when Convex is enabled; Mesh Colliders without Convex enabled do not collide with each other).

MeshCollider.convex (convex mesh)

This means that if you set this to true, your Mesh Collider will have no holes or entrances. Convex meshes can collide with other convex colliders and non-convex meshes. So convex Mesh Colliders are suitable on Rigidbodies, if you really need a more detailed collider than the primitive colliders provide.

Note: The physics engine requires a convex mesh to have non-zero volume. A flat mesh (such as a quadrilateral or plane labeled convex) will be modified by the physics engine to have a thickness (and thus volume) that meets this requirement. The resulting mesh has a thickness proportional to its dimensions, up to 0.05 of the longest dimension in the mesh plane.
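
A minimal sketch of enabling Convex from a script (the Mesh Collider is assumed to already be on the GameObject):

using UnityEngine;

public class ConvexColliderDemo : MonoBehaviour
{
    void Start()
    {
        // A convex Mesh Collider can sit on a Rigidbody and collide with other Mesh Colliders.
        MeshCollider meshCollider = GetComponent<MeshCollider>();
        meshCollider.convex = true;
    }
}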

Collider.isTrigger

The trigger does not register a collision with the incoming rigid body. Instead, when the rigid body enters or exits the trigger volume, it sends OnTriggerEnter, OnTriggerExit, and OnTriggerStay messages.

When a Collider is marked as a Trigger, it only performs overlap detection; it does not prevent other colliders from overlapping with it.

Material

In Unity, you use materials and shaders together to define the appearance of your scene.

To draw something in Unity, you must provide information that describes its shape, and information that describes the appearance of its surface. You use meshes to describe shapes, and materials to describe the appearance of surfaces.

Materials and shaders are closely linked; you always use materials with shaders.

A material contains a reference to a Shader object . If that Shader object defines material properties, then the material can also contain data such as colors or references to textures.

Add a component that inherits from Renderer. MeshRenderer is the most common and is suitable for most use cases, but SkinnedMeshRenderer, LineRenderer, or TrailRenderer might be more suitable if your GameObject has special requirements.

Shader

A shader asset is an asset in your Unity project that defines a Shader object . It is a text file with a .shader extension. It contains shader code.

Texture

Normally, the mesh geometry of an object only gives a rough approximation of the shape while most of the fine detail is supplied by Textures . A texture is just a standard bitmap image that is applied over the mesh surface.

Textures are applied to objects using Materials . Materials use specialised graphics programs called Shaders to render a texture on the mesh surface.

Here we can draw a rough conclusion: the Mesh Filter defines the mesh data of a GameObject; the Mesh Renderer uses the Mesh Filter and a Material to render it; and a Material is closely tied to a Shader, through which it renders Textures onto the surface of the Mesh.
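
A rough sketch of that chain in code (“Standard” is an assumption; the shader name depends on the render pipeline):

using UnityEngine;

public class MaterialDemo : MonoBehaviour
{
    public Texture2D surfaceTexture;  // assigned in the Inspector

    void Start()
    {
        // Build a material around a shader, give it a texture, and hand it to the Mesh Renderer.
        Material material = new Material(Shader.Find("Standard"));
        material.mainTexture = surfaceTexture;  // the shader draws this texture over the mesh surface
        GetComponent<MeshRenderer>().material = material;
    }
}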

Textures also come up in particle systems; to be looked at more closely later.

Rigidbody: rigid body

Only after adding a Rigidbody will a GameObject be affected by the physics engine.

Rigidbodies enable your GameObjects to act under the control of physics. The Rigidbody can receive forces and torque to make your objects move in a realistic way. Any GameObject must contain a Rigidbody to be influenced by gravity, act under added forces via scripting, or interact with other objects through the NVIDIA PhysX physics engine

What is the relationship between rigid bodies and colliders? Will two rigid bodies generate forces when they collide? And if one has Collider.isTrigger set to true and the other does not, will they?

GameObjects that have both a Rigidbody and a Collider are considered dynamic, and the physics engine resolves their collisions.

However, if the Rigidbody's Is Kinematic is set to true, it will not respond to forces and can only be moved from a script, for example through its Transform or Rigidbody.MovePosition.
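
A small sketch contrasting the two modes (field names are illustrative):

using UnityEngine;

public class RigidbodyDemo : MonoBehaviour
{
    public Rigidbody dynamicBody;    // responds to gravity and forces
    public Rigidbody kinematicBody;  // Is Kinematic enabled: moved only by script

    void FixedUpdate()
    {
        // The physics engine turns this force into motion.
        dynamicBody.AddForce(Vector3.forward * 10f);

        // The kinematic body ignores forces; it is moved explicitly instead.
        kinematicBody.MovePosition(kinematicBody.position + Vector3.forward * Time.fixedDeltaTime);
    }
}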

Animator

  • Animation Clips: the list of actions (clips) available for the current model.

  • Animator Controller: a finite-state machine used to control how the current model transitions between different clips under different conditions (a short usage sketch follows this list).

  • Avatar: attributes specific to the character model that enable richer operations on it.
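
A minimal usage sketch (the parameter names "Speed" and "Jump" are assumptions about what the controller's state machine defines):

using UnityEngine;

public class AnimatorDemo : MonoBehaviour
{
    private Animator animator;

    void Start()
    {
        animator = GetComponent<Animator>();
    }

    void Update()
    {
        // Feed values into the Animator Controller so it can transition between clips.
        animator.SetFloat("Speed", 1.0f);

        if (Input.GetKeyDown(KeyCode.Space))
        {
            animator.SetTrigger("Jump");
        }
    }
}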

Script

  • In a script, you can use GetComponent<ComponentType>() to fetch any Component that was added beforehand, for example GetComponent<Rigidbody>() to get the Rigidbody component we added.

  • Variables declared private are not shown in the Inspector, but public ones are; we can then assign values to public variables in the Inspector or from other scripts (a small sketch follows).
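
A tiny sketch of both points (names are illustrative):

using UnityEngine;

public class VisibilityDemo : MonoBehaviour
{
    public float speed = 5f;  // public: shown in the Inspector, can be set there or from other scripts
    private Rigidbody rb;     // private: hidden from the Inspector

    void Start()
    {
        // Fetch a component that was added to this GameObject beforehand.
        rb = GetComponent<Rigidbody>();
    }
}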

LifeCycle

Update

Executed once per frame.

FixedUpdate

What is the difference between Update & FixedUpdate in Unity?

How many times it runs per frame depends on how many physics steps the engine performs: it is called once per fixed physics timestep, so it may run zero, one, or several times per rendered frame.

LateUpdate

Executed once per frame, but LateUpdate for all GameObjects only runs after Update has run for all GameObjects.
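
A typical use is a camera that follows the player: doing the follow in LateUpdate guarantees the player has already moved this frame. A minimal sketch (my own names, not necessarily the tutorial's exact script):

using UnityEngine;

public class CameraFollow : MonoBehaviour
{
    public Transform target;  // e.g. the player, assigned in the Inspector
    private Vector3 offset;

    void Start()
    {
        offset = transform.position - target.position;
    }

    void LateUpdate()
    {
        // Runs after every Update this frame, so the target is already in its final position.
        transform.position = target.position + offset;
    }
}

The PlayerController script from these notes, which uses Start for setup and FixedUpdate for physics-driven movement, follows: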

using System.Collections;
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.InputSystem;

public class PlayerController : MonoBehaviour
{
    public float speed = 0f;       // set in the Inspector
    private Rigidbody rb;
    private float movementX;
    private float movementY;

    // Start is called before the first frame update
    void Start()
    {
        rb = GetComponent<Rigidbody>();
    }

    // The movement messages sent by the Input System are received by this function
    void OnMove(InputValue movementValue)
    {
        Vector2 movementVector = movementValue.Get<Vector2>();

        movementX = movementVector.x;
        movementY = movementVector.y;
    }

    void FixedUpdate()
    {
        Vector3 movement = new Vector3(movementX, 0f, movementY);

        rb.AddForce(movement * speed);
    }
}

Event

Collider.OnTriggerEnter

When a GameObject collides with another GameObject, Unity calls OnTriggerEnter.

OnTriggerEnter happens on the FixedUpdate function when two GameObjects collide. The Colliders involved are not always at the point of initial contact.

Both GameObjects must contain a Collider component. One must have Collider.isTrigger enabled (if neither Collider were a Trigger, the physics engine would simply prevent the objects from overlapping) and must contain a Rigidbody.

If both GameObjects have Collider.isTrigger enabled, no collision happens. The same applies when both GameObjects do not have a Rigidbody component.
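
A sketch in the spirit of the tutorial's pickup logic (the "PickUp" tag is an assumption):

using UnityEngine;

public class PickupDetector : MonoBehaviour
{
    void OnTriggerEnter(Collider other)
    {
        // The trigger does not block movement; we just react to the overlap.
        if (other.CompareTag("PickUp"))
        {
            other.gameObject.SetActive(false);
        }
    }
}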

Collider.OnCollisionEnter

https://docs.unity3d.com/ScriptReference/Collider.OnCollisionEnter.html

This fires whenever the two colliders actually come into contact (neither marked as a Trigger), as long as at least one of them has a non-kinematic Rigidbody.
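
A minimal sketch; unlike a trigger, the callback receives contact information:

using UnityEngine;

public class CollisionLogger : MonoBehaviour
{
    void OnCollisionEnter(Collision collision)
    {
        // Log what we hit and where the first contact point is.
        Debug.Log("Hit " + collision.gameObject.name + " at " + collision.GetContact(0).point);
    }
}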

Function

GameObject.SetActive(Boolean)

A GameObject may be inactive because a parent is not active. In that case, calling SetActive will not activate it, but only set the local state of the GameObject, which you can check using GameObject.activeSelf. Unity can then use this state when all parents become active.

This is the active state of the GameObject: if it is inactive, none of its components work, it is not rendered, and lifecycle functions such as Update are not executed.
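
A small sketch of the two "active" flags (the child reference is an assumed setup):

using UnityEngine;

public class ActivationDemo : MonoBehaviour
{
    public GameObject child;  // assumed to be a child of this GameObject

    void Start()
    {
        gameObject.SetActive(false);         // this object and everything under it stops rendering and updating

        child.SetActive(true);               // only changes the child's local state
        Debug.Log(child.activeSelf);         // true: its own flag
        Debug.Log(child.activeInHierarchy);  // false: a parent is still inactive
    }
}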

Prefab

Prefabs are a special type of Asset that represent a GameObject or collection of GameObjects with components that are already set up. They’re like a blueprint which you can use to easily make instances of the same thing. Each instance of a Prefab is linked to the Prefab Asset, so changing the Asset will change all versions of the Prefab in all Scenes.

The first use for this system in your Project will be to make the character a Prefab. This means that if you go on to make multiple levels for the game, you won’t need to remake JohnLemon for every level — you can just instantiate a new Prefab.

Something similar applies here: in Unity, models work like read-only Prefabs. They are blueprints for creating instances of that model, but the blueprint itself cannot be changed.

That is to say, a model imported from outside is essentially a read-only Prefab. If we want to make it editable, we can drag the GameObject back into the Project folder. At that point Unity asks whether to create an Original Prefab or a Prefab Variant: the former is a brand-new Prefab, while the latter keeps a reference to the old one. We need a new, editable copy, so choose Original Prefab.
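
A small sketch of instantiating a Prefab at runtime (the field is filled by dragging the Prefab Asset onto it in the Inspector):

using UnityEngine;

public class Spawner : MonoBehaviour
{
    public GameObject prefab;

    void Start()
    {
        // Each instance stays linked to the Prefab Asset.
        for (int i = 0; i < 3; i++)
        {
            Instantiate(prefab, new Vector3(i * 2f, 0f, 0f), Quaternion.identity);
        }
    }
}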

Tag

You can add tags to all objects.

Packages

Unity also ships a number of packages that add extra capabilities.

InputSystem

For example, the Input System: after adding its component to a GameObject, the GameObject can receive input events and have them delivered as messages to functions such as OnMove above.

Animation

Unity - Manual: Animation System Overview

Unity has a rich and sophisticated animation system (sometimes referred to as ‘Mecanim’). It provides:

  • Easy workflow and setup of animations for all elements of Unity including objects, characters, and properties.

  • Support for imported animation clips and animation created within Unity

  • Humanoid animation retargeting: the ability to apply animations from one character model onto another.

  • Simplified workflow for aligning animation clips.

  • Convenient preview of animation clips, transitions and interactions between them. This allows animators to work more independently of programmers, prototype and preview their animations before gameplay code is hooked in.

  • Management of complex interactions between animations with a visual programming tool.

  • Animating different body parts with different logic.

  • Layering and masking features

Unity’s animation system is based on the concept of animation clips, which contain information about how specific objects change their position, rotation, or other properties over time. Each clip can be seen as a single linear recording. Animation clips from external sources are produced by artists or animators using third-party tools such as Autodesk 3ds Max or Autodesk Maya, or come from motion capture studios or other sources. The animation clips are then organized into a structured flowchart-style system called the animation controller. The animation controller acts as a finite-state machine that keeps track of which clip is currently playing and when to transition to or blend with another clip.

A very simple animation controller may contain just one or two clips, such as controlling prop rotation and bouncing, or animating the opening and closing of a door at the right time. More advanced animation controllers may contain dozens of humanoid animations for all major character movements, and may mix multiple clips simultaneously to provide fluid motion as the player moves through the scene. Unity’s animation system also has many special features for handling humanoid characters, which give you the ability to retarget humanoid animations from any source (for example motion capture, the Asset Store, or other third-party animation libraries) onto your own character model, as well as adjust muscle definitions. These features are implemented in a unified “Avatar” system.