I made a boo boo in calculating object normals. Funnily enough, up until now, it hadn't revealed itself because of where I was placing objects and the camera in the scene. But a recent attempt at rendering a longer dragon scene highlighted some shadow issues. Here's a render with the issue:
Now, at first glance the image seems OK. But if you look at the blue surface and the shadows on the three dragons, they're different. The dragon on the right is a darker blue than, say, the one in the middle. I thought it was an issue with the lighting normals (to do with how we light the model), but upon inspecting the rendered normals image, there was clearly another problem. Here's the issue visualised:
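For context, a common way to visualise normals is to remap each component from [-1, 1] into colour range, so surfaces facing similar directions get similar colours. This is a sketch of that idea (hypothetical helper, not the engine's actual shader), which is why the odd-looking dragon stands out in the normals image:

```python
# Sketch: map a unit normal to an RGB colour for debugging renders.
# Each component is remapped from [-1, 1] to [0, 255], so a surface
# facing the camera along +Z shows up as a bluish pixel.

def normal_to_rgb(n):
    """Map a unit normal (x, y, z) to an 8-bit RGB colour."""
    return tuple(int(round((c * 0.5 + 0.5) * 255)) for c in n)

print(normal_to_rgb((0.0, 0.0, 1.0)))   # facing +Z -> (128, 128, 255)
```

With a mapping like this, two surfaces pointing the same way should render the same colour, so a dragon that comes out mostly blue and green while its neighbours don't is an immediate red flag.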
See how the dragons' colours are not 'kind of the same'? The dragon on the right, mostly blue and green, should be coloured like the others, because the surfaces we see face in similar directions towards the camera. What was happening here: when I pulled the normals from the object, I was setting them in object space. I then globalised the data before rendering with it, but the type flag I keep on each normal, which tells the engine how to handle it, was still set to object space. So, despite being globalised, the normal was treated as an object normal and was multiplied by the object's matrix to 'globalise' it again... Which means the normal was rotated, and then rotated a second time.
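The bug above can be sketched in a few lines (hypothetical names; the engine's real data structures differ). A normal tagged as object-space gets rotated by the object's matrix at render time, so if you globalise the data yourself but forget to retag it, the rotation is applied twice:

```python
# Minimal sketch of the double-transform bug. A normal flagged as
# OBJECT_SPACE is rotated by the object's matrix at render time; if we
# globalise it ourselves but leave the flag alone, it gets rotated twice.

OBJECT_SPACE, WORLD_SPACE = 0, 1

def mat_mul_vec(m, v):
    """Multiply a 3x3 rotation matrix by a 3-vector."""
    return [sum(m[r][c] * v[c] for c in range(3)) for r in range(3)]

# A 90-degree rotation about Z, standing in for the object's matrix.
rot_z90 = [[0, -1, 0],
           [1,  0, 0],
           [0,  0, 1]]

def resolve_normal(normal, space_flag, object_matrix):
    """What the renderer does: globalise only object-space normals."""
    if space_flag == OBJECT_SPACE:
        return mat_mul_vec(object_matrix, normal)
    return normal

n = [1, 0, 0]                          # normal stored in object space

# Buggy path: globalise the data but leave the flag as OBJECT_SPACE,
# so the engine rotates the already-global normal a second time.
n_global = mat_mul_vec(rot_z90, n)
buggy = resolve_normal(n_global, OBJECT_SPACE, rot_z90)

# Fixed path: update the flag once the data is globalised.
fixed = resolve_normal(n_global, WORLD_SPACE, rot_z90)

print(buggy)   # rotated twice: [-1, 0, 0]
print(fixed)   # rotated once, as intended: [0, 1, 0]
```

With a 90-degree rotation the error is dramatic (the normal ends up 180 degrees off), but with gentler object rotations the shading is only subtly wrong, which is exactly why it can hide in a scene.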
It's funny how you don't see these problems in testing, because you somehow manage to test on a scene where the problem is hidden - in this case, by the object and camera positions. In the other scenes I test with, I use a normal map (a texture map), which works without issue.
After applying the fixes, here's how it should look:
See the difference? Here's a comparison (drag the slider around to see):
And here's an updated render with the normals fixed:
Now that normals are fixed, I can go to bed!