Sunday, April 7, 2013

GR Graphics GL - Getting closer to the present with Normal maps

Features added, features enhanced, interfaces polished: it was time to leave 1995 and simple texturing behind, and implement a few genuinely interesting features.

I downloaded a couple of free normal maps and implemented the NormalMap plugins and shaders, half expecting it to be as disappointingly easy as the texture ones.

I was in for a surprise.



I was a civil engineer before I became a software engineer. Math, especially 3D geometry, I consider bread and butter. Still, wrapping my head around the fact that my normal map had the wrong handedness, and that this was giving me the physically impossible lighting, took some time...

Wrong: the brick being shadowed TOWARDS the light
Eventually though, after quite some time of frustration, checking and rechecking matrices, going back to paper and verifying my logic, it dawned on me what should have been apparent from the start. The map was wrong, not the code. Or rather, the two could not be rotated to fall into one another. Handedness. I was using right-handed coordinate spaces (here, the U-V-N space is of interest), while the map was using a left-handed system.

After checking that inverting the y-channel in the pixel shader did the trick, the fix was simple: load the map into Paint.NET, get the channels plugin, invert green.
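The same fixup can be done offline in a few lines. A minimal sketch, assuming 8-bit RGB pixels where each channel encodes a normal component remapped from [-1, 1] to [0, 255] (the function name and pixel layout are mine, for illustration):

```python
# Flip a normal map's handedness by inverting the green (Y) channel.
# Assumes 8-bit RGB pixels: each channel stores a normal component
# remapped from [-1, 1] to [0, 255].

def invert_green(pixels):
    """pixels: list of (r, g, b) byte tuples; returns a new list with
    the Y component mirrored, i.e. n.y -> -n.y in normal-vector terms."""
    return [(r, 255 - g, b) for (r, g, b) in pixels]

# Only the Y component flips across the byte-range midpoint;
# X and Z pass through untouched.
print(invert_green([(128, 200, 255)]))  # -> [(128, 55, 255)]
```

Doing this once, at asset-creation time, is exactly the "design-time parameter" approach argued for below: the shader never pays for it.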

I toyed with the idea of adding a load-time invert-y-channel option, and in fact I did implement it, but honestly: have it if you must, but don't use it. It's just wrong. Normal map handedness, texture color space, triangle winding direction, these are all design-time parameters, and your pipeline should accept their correct versions, and only them. If your map is incompatible, reverse y offline. If your texture is in linear colorspace and you expect sRGB, save it in the format you need. You really should not move creation-time decisions into load time, ESPECIALLY not runtime, and specially-especially not the pixel shader. It has better things to do than reversing y channels.


Simple texturing
Correct normal mapping


The particular shaders I wrote actually worked in world space instead of tangent space, just because I could prototype them faster. That put a completely unnecessary matrix calculation in the fragment shader, but the correct implementation of classic tangent-space normal mapping, or object-space normal mapping, would have to wait, as I now wanted to implement the next infinitely interesting part...
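That unnecessary per-fragment cost is easy to see: a sampled normal lives in tangent space, so a world-space shader must rotate it by the TBN (tangent, bitangent, normal) basis for every pixel. A minimal sketch of that rotation, assuming an orthonormal TBN (the function and names are mine, for illustration):

```python
# Rotating a tangent-space normal into world space with a TBN basis.
# This is the per-pixel work that tangent-space lighting avoids by
# instead transforming the (far fewer) light vectors per vertex.

def tbn_to_world(n_tan, t, b, n):
    """n_tan: tangent-space normal; t, b, n: world-space tangent,
    bitangent, and normal basis vectors (assumed orthonormal)."""
    return tuple(
        n_tan[0] * t[i] + n_tan[1] * b[i] + n_tan[2] * n[i]
        for i in range(3)
    )

# With the identity basis, the normal passes through unchanged.
print(tbn_to_world((0.0, 0.0, 1.0),
                   (1.0, 0.0, 0.0),
                   (0.0, 1.0, 0.0),
                   (0.0, 0.0, 1.0)))  # -> (0.0, 0.0, 1.0)
```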







Normal mapping has a lot of caveats. The biggest one I personally hit was actually creating the tangent space. Many good algorithms can be found on the web, so I will not copy any here, but the one that gave me the best results is the one by Eric Lengyel at http://www.terathon.com/code/tangent.html.

It had to be modified because my vertices were indexed, but after the modification it worked perfectly without other caveats. Thanks!
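For a rough idea of what that algorithm does (this is my own Python sketch of the approach, written for clarity rather than speed, and adapted for indexed vertices; it is not the original code): each triangle yields a tangent and bitangent from its position and UV edge deltas, shared indexed vertices accumulate contributions from all their faces, and a final pass Gram-Schmidt-orthogonalizes the tangent against the vertex normal and stores a handedness sign in w.

```python
# Per-vertex tangent computation for an indexed triangle mesh, in the
# spirit of Lengyel's algorithm. positions/uvs/normals are parallel
# per-vertex lists; indices holds triangles as consecutive triples.

def compute_tangents(positions, uvs, normals, indices):
    """Returns one (tx, ty, tz, w) per vertex, w = handedness sign."""
    tan = [[0.0, 0.0, 0.0] for _ in positions]
    bitan = [[0.0, 0.0, 0.0] for _ in positions]

    for a, b, c in zip(indices[0::3], indices[1::3], indices[2::3]):
        # Edge vectors in position space and in UV space.
        e1 = [positions[b][k] - positions[a][k] for k in range(3)]
        e2 = [positions[c][k] - positions[a][k] for k in range(3)]
        du1, dv1 = uvs[b][0] - uvs[a][0], uvs[b][1] - uvs[a][1]
        du2, dv2 = uvs[c][0] - uvs[a][0], uvs[c][1] - uvs[a][1]

        det = du1 * dv2 - du2 * dv1
        r = 1.0 / det if det != 0.0 else 0.0  # degenerate UVs: skip
        t = [(dv2 * e1[k] - dv1 * e2[k]) * r for k in range(3)]
        s = [(du1 * e2[k] - du2 * e1[k]) * r for k in range(3)]

        for i in (a, b, c):  # indexed vertices accumulate from all faces
            for k in range(3):
                tan[i][k] += t[k]
                bitan[i][k] += s[k]

    result = []
    for i, n in enumerate(normals):
        t = tan[i]
        # Gram-Schmidt: remove the normal component, then normalize.
        d = sum(n[k] * t[k] for k in range(3))
        o = [t[k] - n[k] * d for k in range(3)]
        length = sum(x * x for x in o) ** 0.5 or 1.0
        o = [x / length for x in o]
        # Handedness: does (n x t) point the same way as the bitangent?
        cross = [n[1] * t[2] - n[2] * t[1],
                 n[2] * t[0] - n[0] * t[2],
                 n[0] * t[1] - n[1] * t[0]]
        w = -1.0 if sum(cross[k] * bitan[i][k] for k in range(3)) < 0.0 else 1.0
        result.append((o[0], o[1], o[2], w))
    return result
```

For a single triangle in the XY plane with UVs matching XY, this produces the expected tangent (1, 0, 0) with positive handedness.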




For the customary code fragment (added considerably later than the actual post), here is the pixel shader of the HLSL version of this code. Yes, I am determined to mix and match GLSL and HLSL, as I think it is both easy and useful to be equally proficient in both.

//All light calculations in this shader are done in the primitive's tangent space.
//This means that we need to precalculate the vector from the vertex to the
//light (better than just transforming the position vector and the light
//vector and subtracting here!) and pass it ready in tangent space.
//Of course, the coordinate system does not matter, except if it is a non-linear,
//non-orthogonal one, but our tangent space is orthonormal so no problem there.
struct PSin {
    float3 view_direction: VSOUT_VIEW_DIR;
    float3 position_to_light: VSOUT_LIGHT_VECTOR;
    float3 light_direction: VSOUT_LIGHT_DIR;
    float2 texUv: VSOUT_TEXUV;
};

#include "CommonCbuffers.hlsl"
#include "func_calc_spotlight.hlsl"
//This include is interface-interchangeable with the rest of the specular models
//such as blinn-phong
#include "func_calc_specular_gaussian.hlsl"

Texture2D diffuseTexture: register(t0);
SamplerState textureSampler: register(s0);

Texture2D normalTexture: register(t1);
SamplerState normalSampler: register(s1);

float4 main(PSin inp): SV_Target {
//All calculations in this shader are in tangent space
    float2 sampling_coords = inp.texUv;

    float3 viewDirection = normalize(inp.view_direction);

    float3 normal_sample = normalTexture.Sample(normalSampler, sampling_coords).xyz;
    float3 tan_normal = normalize(normal_sample * 2 - float3(1, 1, 1));

//A small optimization I picked up while following the original tutorials that
//led me to this code: we will need both the reciprocal square root and the
//actual distance to the light, so it's PROBABLY better to do the
//calculations by hand and use rsqrt and rcp instead of length.

    float3 tmp_eyetolightdiff = inp.position_to_light;
    float lightDistSqr = dot(tmp_eyetolightdiff, tmp_eyetolightdiff);
    float rLightDist = rsqrt(lightDistSqr);
    float3 vertexToLightDir = tmp_eyetolightdiff * rLightDist;

    float attenuation = 1 / (1.0 + light.attenuation_linear * rcp(rLightDist) + light.attenuation_quadratic * lightDistSqr);

    float cos_incidence = clamp(dot(tan_normal, vertexToLightDir), 0, 1);

    attenuation *= calc_spotlight(inp.light_direction, vertexToLightDir, light.spotangle_inner_cos, light.spotangle_outer_cos);

    float specularTerm = calc_specular(material, cos_incidence, vertexToLightDir, viewDirection, tan_normal);

    //TA - DAH!!!
    float4 ambient = material.ambient_diffuse * light.ambient;
    float4 direct = attenuation * light.direct * (material.ambient_diffuse * cos_incidence + material.specular * specularTerm);

    return diffuseTexture.Sample(textureSampler, sampling_coords) * (ambient + direct);
}
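The rsqrt trick commented in the shader above, spelled out on the CPU: a single dot product gives the squared distance; its reciprocal square root normalizes the light vector; and the reciprocal of that gives the distance itself, so length() is never called. A small sketch (the function and the attenuation coefficients are made up for illustration):

```python
import math

def light_terms(to_light, k_linear, k_quadratic):
    """to_light: unnormalized vertex-to-light vector.
    Returns the normalized direction and the distance attenuation."""
    dist_sqr = sum(c * c for c in to_light)      # one dot product
    r_dist = 1.0 / math.sqrt(dist_sqr)           # rsqrt(dist_sqr)
    direction = tuple(c * r_dist for c in to_light)
    dist = 1.0 / r_dist                          # rcp(r_dist), i.e. length
    attenuation = 1.0 / (1.0 + k_linear * dist + k_quadratic * dist_sqr)
    return direction, attenuation

direction, att = light_terms((0.0, 3.0, 4.0), 0.1, 0.01)
print([round(c, 6) for c in direction])  # -> [0.0, 0.6, 0.8]
print(round(att, 4))  # 1 / (1 + 0.5 + 0.25) -> 0.5714
```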
