Hi,
I'm having a lot of trouble determining the normalized ray direction that goes into the scene through the mouse pointer.
Here's an example.
I have:
A projection matrix that I generate with an implementation of gluPerspective and glFrustum:
0.001953000 0 0 0
0 0.002604000 0 0
0 0 1 0
-1 -1 0 1
A dead simple view matrix obtained from a camera that sits at [0,0,5] and doesn't have any orientation:
1 0 0 0
0 1 0 0
0 0 1 0
0 0 -5 1
The mouse pointer is dead center in the screen (so [0,0] in normalized device coordinates).
I then do the following calculation (which I found in a tutorial here: http://antongerdelan.net/opengl/raycasting.html):
float x = (2.0f * mouse_x) / width - 1.0f; //0 in this case because the mouse is in the center of the screen
float y = 1.0f - (2.0f * mouse_y) / height; //0 in this case because the mouse is in the center of the screen
float z = 1.0f;
vec3 ray_nds = vec3(x, y, z);
vec4 ray_clip = vec4(ray_nds.xy, -1.0, 1.0);
vec4 ray_eye = inverse(projection_matrix) * ray_clip;
ray_eye = vec4(ray_eye.xy, -1.0, 0.0);
vec3 ray_wor = (inverse(view_matrix) * ray_eye).xyz;
ray_wor = normalise(ray_wor);
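For anyone who wants to poke at this, here's a small numpy reproduction of the same steps with the matrices above. It assumes the printed rows are OpenGL column-major storage (i.e. columns of the math matrix), which is why they get transposed:

```python
import numpy as np

# The matrices exactly as printed above. OpenGL stores matrices
# column-major, so each printed row is one *column* of the math matrix;
# the .T undoes that.
projection = np.array([
    [0.001953, 0.0, 0.0, 0.0],
    [0.0, 0.002604, 0.0, 0.0],
    [0.0, 0.0, 1.0, 0.0],
    [-1.0, -1.0, 0.0, 1.0],
]).T

view = np.array([
    [1.0, 0.0, 0.0, 0.0],
    [0.0, 1.0, 0.0, 0.0],
    [0.0, 0.0, 1.0, 0.0],
    [0.0, 0.0, -5.0, 1.0],
]).T

# Mouse dead center -> NDC (0, 0); same steps as the snippet above.
ray_clip = np.array([0.0, 0.0, -1.0, 1.0])
ray_eye = np.linalg.inv(projection) @ ray_clip
ray_eye = np.array([ray_eye[0], ray_eye[1], -1.0, 0.0])  # direction: w = 0
ray_wor = (np.linalg.inv(view) @ ray_eye)[:3]
ray_wor /= np.linalg.norm(ray_wor)

print(ray_wor)  # roughly [0.8, 0.6, -0.0016] — the same unexpected result
```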
I then expect a result of [0,0,-1], because the camera is looking straight down the negative z-axis and the mouse is in the center of the screen.
Instead, the result I get is [0.799999, 0.59999925, -0.001562398].
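As a sanity check on the expectation itself: if I feed the same steps a matrix built the way gluPerspective builds one (the fovy/aspect/near/far values below are arbitrary test values, not taken from my setup), I do get [0, 0, -1] for a centered pointer:

```python
import math
import numpy as np

def perspective(fovy_deg, aspect, near, far):
    """gluPerspective-style projection matrix in math (row) convention.
    The parameter values used below are arbitrary test values."""
    f = 1.0 / math.tan(math.radians(fovy_deg) / 2.0)
    return np.array([
        [f / aspect, 0.0, 0.0, 0.0],
        [0.0, f, 0.0, 0.0],
        [0.0, 0.0, (far + near) / (near - far), 2.0 * far * near / (near - far)],
        [0.0, 0.0, -1.0, 0.0],
    ])

projection = perspective(60.0, 4.0 / 3.0, 0.1, 100.0)
view = np.array([  # camera at [0,0,5], no rotation (math convention)
    [1.0, 0.0, 0.0, 0.0],
    [0.0, 1.0, 0.0, 0.0],
    [0.0, 0.0, 1.0, -5.0],
    [0.0, 0.0, 0.0, 1.0],
])

ray_clip = np.array([0.0, 0.0, -1.0, 1.0])  # pointer dead center
ray_eye = np.linalg.inv(projection) @ ray_clip
ray_eye = np.array([ray_eye[0], ray_eye[1], -1.0, 0.0])  # direction: w = 0
ray_wor = (np.linalg.inv(view) @ ray_eye)[:3]
ray_wor /= np.linalg.norm(ray_wor)

print(ray_wor)  # -> [0, 0, -1]
```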
Please take a look and point out what I'm missing here. Maybe the calculation is wrong, maybe I'm missing a step, maybe the projection or view matrices are wrong, or maybe my expectation is off and I'm just misunderstanding something.
Thanks in advance!