Can I display vertex directions in the Spine editor? I need to add vertex direction displacement in my shader, but I'm getting incorrect results. I want to confirm the vertex directions of the Spine mesh, but I don't know how to view them in Spine, similar to how you can in 3ds Max.
About Vertex Normals
In the Spine editor, there is no direct feature to display vertex normals or directions like you might find in 3D modeling software such as 3ds Max. Spine is primarily a 2D animation tool, and it doesn't handle vertex normals in the same way 3D software does. The focus in Spine is on vertex positions and how they are influenced by bones and weights for deformation. If you are working with shaders and need to manipulate vertex directions, you might need to calculate or visualize these directions externally or within your shader code, as Spine does not provide this visualization natively.
Spinebot Does the Unity runtime provide this feature?
suanLoBeach Since the skeleton's albedo texture does not contain normal information, you can specify the vertex normals by creating a separate normal map texture and assigning it to the material. This allows you to achieve results like the image in the Shaders section of the spine-unity runtime documentation:
https://esotericsoftware.com/spine-unity-rendering#Shaders
Misaki Can the Spine editor create normal maps?
suanLoBeach No, it is not possible to create normal maps directly in the Spine Editor. A user has explained the workflow for creating normal maps in the past, so you may find the following thread helpful:
https://esotericsoftware.com/forum/d/14916-normal-maps-creation/3
Misaki Actually, I just need to move the vertices in the shader in an outward direction. Is there any way to quickly set the vertex normals of the spine character?
I am using the shader's vertex displacement and found that the vertex directions are not what I expected. Now I know that the vertex directions and normals cannot be edited in the Spine editor. I use the Unity runtime. Is there any way to let the Spine character determine the vertex directions in Unity? Can it be done by exporting JSON? I also have to consider the development cost; if it is too high, I might as well create a new mesh in Spine to meet my needs.
Displaying vertex normals in Spine would be pointless, as all vertex normals would point straight out of the monitor in the Z direction (0, 0, 1), given a left-handed coordinate system. This is because all triangles in the skeleton mesh are flat 2D triangles, each lying flat on the XY plane.
The skeleton components in spine-unity provide an Advanced - Add Normals option which adds (0, 0, 1) as (object-space) normals for each vertex, in case you can't modify the shader to use a fixed normal as the spine-unity shaders do. In general it's best to just use a fixed object-space normal of (0, 0, 1), as this avoids passing unnecessary vertex attributes around.
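As a hedged sketch, the inspector option above can presumably also be toggled from code; this assumes the spine-unity SkeletonRenderer exposes the setting as an `addNormals` field, which you should verify against your runtime version:

```csharp
// Hypothetical sketch: enabling "Advanced - Add Normals" from a script.
// Assumption: SkeletonRenderer exposes an `addNormals` field matching the
// inspector option; check your spine-unity version before relying on it.
using Spine.Unity;
using UnityEngine;

public class EnableAddNormals : MonoBehaviour
{
    void Awake()
    {
        var skeletonRenderer = GetComponent<SkeletonRenderer>();
        skeletonRenderer.addNormals = true; // writes (0, 0, 1) per vertex
        skeletonRenderer.Initialize(true);  // rebuild so the setting takes effect
    }
}
```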
suanLoBeach Actually, I just need to move the vertices in the shader in an outward direction.
What do you mean by "outward direction"? Away from the center of an attachment? From the monitor's surface outward in Z direction?
In general as Misaki mentioned, the standard way to add normals to 2D skeletons for lighting is authoring normal maps. If you tell us what you have in mind, we can help.
suanLoBeach I am using the shader's vertex displacement and found that the vertex direction is not what I expected.
Please tell us what you expected via a screenshot or at least a more detailed explanation.
Is there any way to let the spine character determine the vertex direction in unity?
A vertex is a dimensionless point; there is no "vertex direction". If you mean the normal direction, see above.
suanLoBeach If the development cost is too high, I might as well create a new mesh in spine to meet my needs.
Unless you tell us what you intend to do and what problem you are trying to solve, we can't answer your questions. Also see the XY problem: don't ask us how to fix your attempted solution, but tell us what you are trying to achieve in general.
Harald I need to make the graphic expand (the white part) in a new pass, so initially I thought of the vertex displacement method, but the vertex normals of the Spine mesh are (0, 0, 1) by default, so I failed.
Then I created a script to calculate the vertex normals of the Spine mesh, making them point in the correct direction in the XY plane. I succeeded and wrote the data to the MeshFilter.
However, I don't know how to write the new vertex normals to the SkeletonData or .skel file, so when I enable SkeletonMecanim, it refreshes the vertex normals of the Spine mesh.
SkeletonMecanim updates the original normals and constantly overwrites the normal data, so the shader is also affected.
So how should I write the mesh vertex normal data calculated in the scene to the Spine file instead of the MeshFilter? Does the Spine runtime provide an API to rewrite .skel files? Or do I have to export JSON files? Can't the binary be modified? Game products must use binary exports to save costs.
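The script described above might look something like the following minimal sketch: it points each normal away from the mesh center in the XY plane and writes the result to the MeshFilter. The choice of `mesh.bounds.center` as the pivot is an assumption for illustration; any attachment or bone position could serve instead.

```csharp
// Hypothetical sketch: overwrite the MeshFilter's normals with "outward"
// XY-plane displacement vectors. Assumption: the pivot is the mesh bounds
// center; substitute a bone/slot position as needed.
using UnityEngine;

public class OutwardNormals : MonoBehaviour
{
    void LateUpdate()
    {
        Mesh mesh = GetComponent<MeshFilter>().mesh;
        Vector3[] vertices = mesh.vertices;
        Vector3[] normals = new Vector3[vertices.Length];
        Vector3 center = mesh.bounds.center;

        for (int i = 0; i < vertices.Length; i++)
        {
            Vector3 dir = vertices[i] - center;
            dir.z = 0f; // stay in the XY plane
            normals[i] = dir.sqrMagnitude > 0f ? dir.normalized : Vector3.zero;
        }
        // Overwritten again on the next skeleton mesh update, as noted below.
        mesh.normals = normals;
    }
}
```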
suanLoBeach Than I created a script to calculate the vertex normals of the spine mesh, making them point in the correct direction in the xy plane. I succeeded and successfully wrote the data to the mesh filter.
Ok, thanks for sharing your screenshots. While you don't describe what your "normals" are pointing away from, I assume it's either the MeshAttachment's center or the location of the parent bone the MeshAttachment is attached to (the slot's bone).
Just be sure to never use these "normals" as a basis for normal maps in shaders, as they are no longer surface normals but custom displacement vectors. I mention this mainly because you called them "correct normals", while the originally generated normals are the correct standard normals.
That's of course fine if you know what you're doing. From your postings I just have the feeling that you're ambitiously trying to merge together different solutions you find on various forums (or get suggested by LLMs like ChatGPT), sometimes trying to transplant the heart of a cow into a human because you don't see the difference.
suanLoBeach However, I don't know how to write the new vertex normals to the skeletonData or .skel file
SkeletonData contains no normal information, as the normals are all (0, 0, 1). You would need to change larger parts of the spine-csharp and spine-unity code so that SkeletonData holds normal information. Also, during mesh deformation when multiple bones affect a vertex, the normals would need to change accordingly. If you always have just a single bone affecting the vertices, that's easier.
suanLoBeach so when I enable the skeleton mecanim, it refreshes the vertex normals of the spine mesh.
SkeletonMecanim will update the original normal and constantly overwrite the normal data, so the shader will also be affected
Mesh data is of course updated by the SkeletonRenderer; normals are not excluded here. If you change visible attachments, the mesh vertices will be quite different.
You can either use the SkeletonRenderer.OnPostProcessVertices callback, as shown in the Spine Examples/Other Examples/Vertex Effect example scene, or you can use the SkeletonRenderer.OnMeshAndMaterialsUpdated callback.
You can check out the documentation:
https://esotericsoftware.com/spine-unity-main-components#Life-cycle
https://esotericsoftware.com/spine-unity-rendering#Rendering
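Re-applying your custom "normals" after every skeleton update could be sketched roughly as follows. This assumes `OnMeshAndMaterialsUpdated` is an event on SkeletonRenderer whose delegate receives the SkeletonRenderer, which matches the spine-unity documentation linked above; verify the exact signature for your runtime version:

```csharp
// Hedged sketch: restore outward displacement "normals" after each mesh
// update so SkeletonMecanim's refresh no longer discards them.
// Assumption: SkeletonRenderer.OnMeshAndMaterialsUpdated passes the
// SkeletonRenderer instance; check your spine-unity version.
using Spine.Unity;
using UnityEngine;

public class ReapplyOutwardNormals : MonoBehaviour
{
    SkeletonRenderer skeletonRenderer;

    void OnEnable()
    {
        skeletonRenderer = GetComponent<SkeletonRenderer>();
        skeletonRenderer.OnMeshAndMaterialsUpdated += Apply;
    }

    void OnDisable()
    {
        skeletonRenderer.OnMeshAndMaterialsUpdated -= Apply;
    }

    void Apply(SkeletonRenderer renderer)
    {
        Mesh mesh = renderer.GetComponent<MeshFilter>().sharedMesh;
        Vector3[] vertices = mesh.vertices;
        Vector3[] normals = new Vector3[vertices.Length];
        Vector3 center = mesh.bounds.center; // illustrative pivot choice

        for (int i = 0; i < vertices.Length; i++)
        {
            Vector3 dir = vertices[i] - center;
            dir.z = 0f;
            normals[i] = dir.sqrMagnitude > 0f ? dir.normalized : Vector3.zero;
        }
        mesh.normals = normals;
    }
}
```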
Harald In that case, I shouldn't modify the vertex normals of the Spine skeleton mesh to avoid potential future errors.
Actually, there's another approach to implement my idea: using a shader to make each mesh scale from its own pivot (slot position), but I currently can't get each vertex's corresponding pivot in the shader. Is it possible to get each mesh's corresponding slot position in the built-in RP without using C# scripts?
I can't use SkeletonRenderer.OnPostProcessVertices because I only need to expand the mesh in one specific pass of the shader: I need one pass with the original-sized mesh showing the texture, and another pass with the expanded mesh showing the larger base color.
My current workflow is limited to shader-based solutions; otherwise it would add a tremendous amount of work, like manually creating normal maps or creating separate meshes for the base color, which would consume too much of my time. So I'm exploring ways to achieve this visual effect with just one shader and the existing texture-display meshes.
Among all methods I've experimented with, only one doesn't require changing vertex normals - in one pass, manually setting a "center point" and then having all the vertices I want to expand move away from that point. This is the best solution I've found so far, but the drawback is that I need to manually set a center point for each different skeleton. However, compared to modifying the runtime source code as you mentioned, I might prefer this approach.
@suanLoBeach If you're using Spine for displaying and animating your trees, why don't you just use bone scaling to scale each mesh attachment (or region attachment as well)? This would be the straightforward way, and something you should be capable of setting up.
suanLoBeach My current workflow is limited to shader-based solutions, otherwise it would add a tremendous amount of work, like manually creating normal maps or creating separate meshes for the base color, which would consume too much of my time.
Generating a normal map is not "a tremendous amount of work"; it's mostly the click of a button and perhaps tweaking a few slider settings in e.g. the formerly Allegorithmic, now Adobe, Bitmap2Material or similar tools.
suanLoBeach Among all methods I've experimented with, only one doesn't require changing vertex normals - in one pass, manually setting a "center point" and then having all the vertices I want to expand move away from that point. This is the best solution I've found so far, but the drawback is that I need to manually set a center point for each different skeleton. However, compared to modifying the runtime source code as you mentioned, I might prefer this approach.
If scaling from the attachment center, the object (skeleton) center, or a custom position is a valid approach for you, you could pass that position along with the mesh vertex data in some existing channel (via the previously mentioned callback methods or by modifying the runtime code), or have only a single object-space scale origin to scale from, which makes it much easier. The easiest solution would be a bone scale animation, as mentioned at the top.
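The single scale-origin variant could be fed to the shader without touching vertex data at all, for example via a MaterialPropertyBlock set each frame. The property name `_ScaleOrigin` below is an assumption for illustration; it must match whatever your shader's expansion pass declares:

```csharp
// Hedged sketch: pass one world-space scale origin (e.g. a bone follower's
// position) to the material each frame, so a custom shader pass can
// displace vertices away from it. `_ScaleOrigin` is a hypothetical
// property name; match it to your shader.
using UnityEngine;

public class PassScaleOrigin : MonoBehaviour
{
    public Transform scaleOrigin; // e.g. a BoneFollower on the skeleton
    MaterialPropertyBlock block;
    MeshRenderer meshRenderer;

    void Awake()
    {
        block = new MaterialPropertyBlock();
        meshRenderer = GetComponent<MeshRenderer>();
    }

    void LateUpdate()
    {
        meshRenderer.GetPropertyBlock(block);
        block.SetVector("_ScaleOrigin", scaleOrigin.position);
        meshRenderer.SetPropertyBlock(block);
    }
}
```

Using a property block rather than `material.SetVector` avoids instantiating a new material per renderer.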
Harald Thanks for your help, I have solved this problem with a shader.