HLSL unexpected dot product results

Hi, I wrote a deferred renderer a few years ago and recently picked the project up again to fix some outstanding bugs and extend a few features. The project is written in C++ with DirectX 11 and HLSL.

While fixing the bugs I stumbled across strange behavior in one of my shader files, which took me some time to track down. At first I thought it had to do with my depth reconstruction algorithm in the point light shader, but after implementing alternative algorithms based on MJP's code snippets I ruled that out. It appears as if the dot function inside the shader sometimes (but reproducibly) yields wrong results. Also, when switching from D3D_DRIVER_TYPE_HARDWARE to D3D_DRIVER_TYPE_WARP the problem completely disappears, so to me it looks like either some kind of HLSL/DX11 issue or a driver issue. I am using a GTX 980 for rendering with the latest NVIDIA driver installed, and I also tried an older laptop with an NVIDIA card, which gave the same strange results.

Here are some images that show the problem:

When debugging the wrong pixels with the Visual Studio Graphics Analyzer, I found that the HLSL dot function returns unexpected and wrong values during my point light computations. The dot product of (-0.51, 0.78, 0.36) and (0, 1, 0) obviously should not be 0 (it should be 0.78). I am no expert at reading the HLSL asm output, but the compiled shader code looks like this (the last line is the dot product of lightVec and normal):

Does anyone have an idea how to fix this issue, or how to avoid the strange dot product behavior?
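
For context, the relevant part of the point light shader boils down to something like the sketch below. Only lightVec and normal correspond to the names in my actual shader; the function name and the other parameters (surfacePos, surfaceNormal, lightPos, lightColor, albedo) are placeholders for this simplified example, not my real code.

// Simplified sketch of the point-light diffuse term, not the full shader
// (position/normal reconstruction from the G-buffer is omitted).
float3 ComputeDiffuse(float3 surfacePos, float3 surfaceNormal,
                      float3 lightPos, float3 lightColor, float3 albedo)
{
    float3 lightVec = normalize(lightPos - surfacePos); // e.g. (-0.51, 0.78, 0.36)
    float3 normal   = normalize(surfaceNormal);         // e.g. (0, 1, 0)

    // Intrinsic form -- this is the call that yields 0 in the Graphics
    // Analyzer capture, even though it should give 0.78 for the vectors above:
    float NdotL = dot(lightVec, normal);

    // Manually expanded form, mathematically identical. I have not verified
    // whether this changes the generated asm; swapping it in for NdotL is
    // just one way to test whether the intrinsic itself is the culprit.
    float NdotLManual = lightVec.x * normal.x
                      + lightVec.y * normal.y
                      + lightVec.z * normal.z;

    return saturate(NdotL) * lightColor * albedo;
}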
