So, at the moment, the mesh generation and the texture generation are both handled on the main thread. This isn’t acceptable, as these can be fairly heavy operations. However, multithreading comes with a slight issue.
The Unity API is not thread-safe, and Unity has built-in checks to make sure you only access it from the main thread. This causes issues, because you cannot create Unity objects such as Texture2Ds or Meshes on other threads.
This doesn’t, however, mean that you can’t offload most of your computation onto separate threads. You can, as long as it doesn’t rely on the Unity API.
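A common way to structure this is to run the heavy work on a background thread and queue a callback that the main thread picks up later, so all Unity API calls happen in `Update`. Here’s a minimal sketch of that pattern; the class and method names (`ThreadedWorkRunner`, `RunAsync`) are my own for illustration, not anything built into Unity.

```csharp
using System;
using System.Collections.Concurrent;
using System.Threading;
using UnityEngine;

// Runs heavy work on a background thread, then marshals the result
// back to the main thread, where the Unity API is safe to call.
public class ThreadedWorkRunner : MonoBehaviour
{
    readonly ConcurrentQueue<Action> mainThreadCallbacks = new ConcurrentQueue<Action>();

    public void RunAsync<T>(Func<T> work, Action<T> onDone)
    {
        new Thread(() =>
        {
            T result = work();                                 // no Unity API in here
            mainThreadCallbacks.Enqueue(() => onDone(result)); // deferred to Update
        }).Start();
    }

    void Update()
    {
        // Drain queued callbacks on the main thread, where Unity objects can be created.
        while (mainThreadCallbacks.TryDequeue(out Action callback))
            callback();
    }
}
```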
For the mesh generation, I calculate the vertex positions, UV co-ords, and triangles. These are just an array of Vector3s, an array of Vector2s, and an array of ints, all of which I can calculate and generate on a separate thread, then retrieve from the main thread once it’s done and wrap into a Mesh object there.
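In code, that split might look something like the following; `MeshData` and `MeshBuilder` are illustrative names, but the key point is that `Vector3`/`Vector2` are plain structs and safe to build off the main thread, while `new Mesh()` is not.

```csharp
using UnityEngine;

// Plain data that a worker thread can produce -- no Unity objects involved.
public struct MeshData
{
    public Vector3[] vertices;  // Vector3/Vector2 are plain structs, fine off-thread
    public Vector2[] uvs;
    public int[] triangles;
}

public static class MeshBuilder
{
    // Called on the main thread once the worker has produced the data.
    public static Mesh Build(MeshData data)
    {
        var mesh = new Mesh();           // Unity API: main thread only
        mesh.vertices = data.vertices;
        mesh.uv = data.uvs;
        mesh.triangles = data.triangles; // assign after vertices
        mesh.RecalculateNormals();
        return mesh;
    }
}
```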
For the texture generation, I generate and fill a 2D array of Color32s. Once that’s done, I create a new Texture2D on the main thread and use SetPixels32 to fill it (then call Apply). Using SetPixels32 and Color32 instead of SetPixels and Color is faster according to the documentation, although I haven’t done any performance comparisons yet.
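The main-thread half of the texture path is small; a sketch, assuming the worker hands back a flattened row-major `Color32[]` (the `TextureBuilder` name is mine):

```csharp
using UnityEngine;

public static class TextureBuilder
{
    // colours: flattened row-major pixel data, filled on a worker thread.
    public static Texture2D Build(Color32[] colours, int width, int height)
    {
        var texture = new Texture2D(width, height); // Unity API: main thread only
        texture.SetPixels32(colours);               // Color32 skips float-to-byte conversion
        texture.Apply();                            // upload the pixel data to the GPU
        return texture;
    }
}
```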
After these changes, the generation locks up the main thread a lot less, although there is still some slowdown I’m trying to identify!
Having to change my code to allow for multithreading also gave me a much-needed excuse to spend some time refactoring. The code is now split across several classes, and I’m further refactoring it around some generic interfaces so the MeshGenerator and TextureGenerator can be swapped out for other implementations.
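The interfaces could be as simple as something like this (hypothetical signatures; the parameters and the `MeshData` type are placeholders for whatever the generators actually need):

```csharp
using UnityEngine;

// Implementations produce plain data on any thread; only the main
// thread turns that data into Unity Mesh/Texture2D objects.
public interface IMeshGenerator
{
    MeshData Generate(int size); // MeshData: plain arrays of vertices/uvs/triangles
}

public interface ITextureGenerator
{
    Color32[] Generate(int width, int height);
}
```

Keeping the interfaces in terms of plain data rather than Unity objects is what makes the implementations trivially swappable and thread-friendly.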