Origin of Ray

Lift the fog of the Internet together

The original purpose of texturing is to use an image to control the appearance of a model. With texture mapping, we can “glue” an image onto the surface of a model and control its color texel by texel (texels, or texture elements, are distinct from screen pixels).

When modeling, artists usually use UV unwrapping tools in their modeling software to store texture mapping coordinates on each vertex. These coordinates define the 2D position in the texture that corresponds to the vertex. They are usually represented as a 2D pair (u, v), where u is the horizontal coordinate and v is the vertical coordinate, so texture coordinates are also called uv coordinates.
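As a minimal sketch of how these per-vertex uv coordinates are used in a Unity fragment shader (the texture name `_MainTex` and the `v2f` struct are illustrative, not from the original post):

```hlsl
// Minimal fragment-stage sketch: sample a texture by uv.
// Assumes a texture property _MainTex has been declared and bound.
sampler2D _MainTex;

struct v2f
{
    float4 pos : SV_POSITION;
    float2 uv  : TEXCOORD0; // uv stored on each vertex, interpolated per fragment
};

fixed4 frag (v2f i) : SV_Target
{
    // tex2D looks up the texel at (u, v): u is the horizontal axis, v the vertical
    return tex2D(_MainTex, i.uv);
}
```

The rasterizer interpolates the uv pair across the triangle, so each fragment samples the texel that the artist's unwrap mapped to that point on the surface.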

Read more »

A previous post discussed what a process is and why the concept of a process was introduced in the first place.

This raises several follow-up questions, which are the classic problems of multiprogrammed concurrency:

  • When multiple processes run concurrently, how are they scheduled, and what principles govern scheduling?
  • What if there are dependencies between processes, such as synchronization and mutual exclusion?
  • What if those dependencies form a cycle, which may lead to deadlock?

In this post we will tackle the first question: how processes are scheduled.

Read more »

Recently, I have been developing a Unity character avatar SDK. The goal is for the SDK to dynamically download the latest character and clothing models without having to build them into the SDK in advance.

At first, the solution I planned to use was the traditional, mature approach of packaging asset bundles with Addressables, but how to divide the bundles is a problem, and this method requires importing the model into Unity, building the bundles, and updating the catalog for every release. The whole process is fairly tedious.

So I wondered whether I could download the FBX files and textures directly from a remote server and load them at runtime, which would make the whole release process much simpler.

After some searching, I found the TriLib library, which makes loading models remotely very convenient. However, it also had some problems, and in order to track them down I read through its source code.

Read more »

We are now officially starting on shaders with practical applications, beginning with lighting models. The code and concepts in this article are drawn from “Getting Started with Unity Shader Essentials”.

This article mainly explains the principles behind several lighting models from the book, along with an analysis of their code.

Read more »

Previous posts covered the basic principles of game rendering and how to write a shader in Unity. Before going further, this post introduces two concepts:

  • How different objects in Unity are ordered and layered when drawn, i.e., the rendering order.
  • What Unity’s rendering path is, and how it relates to the rendering order.

Read more »

In the Built-in Render Pipeline, surface shaders are a simplified way to write shaders that interact with lighting.

Writing shaders that interact with lighting is complex: there are different light source types, different shadow options, and different rendering paths (forward and deferred), and a shader must somehow cope with all of this complexity.

Surface shaders are a code-generation approach that makes it easier to write lit shaders than writing low-level vertex/pixel shader programs by hand.
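As a minimal sketch of what this looks like in practice (the shader name and property are illustrative; `Lambert` is one of Unity's built-in lighting models):

```hlsl
Shader "Custom/SimpleSurface"
{
    Properties
    {
        _MainTex ("Albedo", 2D) = "white" {}
    }
    SubShader
    {
        Tags { "RenderType" = "Opaque" }
        CGPROGRAM
        // This one directive replaces hand-written per-light vertex/fragment code:
        // "surf" is the surface function, "Lambert" the built-in lighting model.
        #pragma surface surf Lambert

        sampler2D _MainTex;

        struct Input
        {
            float2 uv_MainTex; // uv for _MainTex, filled in by the generated code
        };

        void surf (Input IN, inout SurfaceOutput o)
        {
            o.Albedo = tex2D(_MainTex, IN.uv_MainTex).rgb;
        }
        ENDCG
    }
    FallBack "Diffuse"
}
```

You only describe the surface properties (albedo here); Unity generates the vertex/fragment programs that handle light types, shadows, and both rendering paths for you.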

Read more »

In Unity, we now write shader programs in HLSL syntax, but Unity originally used the Cg language, which is why some keywords (CGPROGRAM) and file extensions (.cginc) carry that name. Although Unity no longer uses Cg, these names are still used.

Read more »