gl_NormalMatrix, gl_Normal are deprecated?

I’m trying to run a program using the `#version 330 core` profile of GLSL, and these variables appear to be deprecated, so the shader will not compile. How would I go about implementing them myself?

The original code provided was: vec3 normal = gl_NormalMatrix * gl_Normal;

How would I go about doing this under the `#version 330 core` profile?

I think the solution was to do: vec3 normal = (V*M*vec4(vertexNormal_modelspace,0)).xyz;

Passing my view matrix, model matrix, and the normals to the shader and computing it that way. But the diffuse lighting looks the opposite of how I want it.

I would verify that your light source position/direction is also in eye-space before you put it into the lighting equation with “normal”.
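For example, using the V uniform that appears later in this thread, a sketch of moving a world-space light position into eye space (assuming `lightPos` is given in world coordinates) could look like:

```glsl
// Vertex shader sketch: put the light in the same space as the normal.
uniform mat4 V;         // view matrix
uniform vec3 lightPos;  // light position in world space (assumption)

// ...
vec3 lightPosEye = (V * vec4(lightPos, 1.0)).xyz;  // w = 1: positions translate
```

With both the normal and the light in eye space, the dot product in the Lambert term compares like with like.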

If still problems, post some GLSL source code in a code block.

[QUOTE=Dark Photon;1258824]I would verify that your light source position/direction is also in eye-space before you put it into the lighting equation with “normal”.

If still problems, post some GLSL source code in a code block.[/QUOTE]


#version 330 core

// Input vertex data, different for all executions of this shader.
layout(location = 0) in vec3 vertexPosition_modelspace;
layout(location = 1) in vec3 vertexNormal_modelspace;

// Values that stay constant for the whole mesh.
uniform mat4 MVP;
uniform mat4 V;
uniform mat4 M;
uniform vec3 lightPos;
out vec4 forFragColor;
const vec3 diffuseColor = vec3(0.55, 0.09, 0.09);

void main(){

  // Output position of the vertex, in clip space: MVP * position
  gl_Position = MVP * vec4(vertexPosition_modelspace, 1);

  // All following geometric computations are performed in the
  // camera coordinate system (aka eye coordinates).
  vec3 normal = (V * M * vec4(vertexNormal_modelspace, 0)).xyz;
  vec4 vertPos4 = MVP * vec4(vertexPosition_modelspace, 1);
  vec3 vertPos = vec3(vertPos4.x, vertPos4.y, vertPos4.z);
  vec3 lightDir = normalize(lightPos - vertPos);

  float lambertian = max(dot(lightDir, normal), 0.0);
  forFragColor = vec4(lambertian * diffuseColor, 1.0);
  //forFragColor = vec4(vec3(1, 0, 0), 0);
}

Not sure how to check if it’s in eye space, but further testing shows the problem isn’t quite that it’s reversed; instead it’s offset by (-1,-1) in the x,y directions. So if the light is at the origin (0,0), it acts as if it’s at (-1,-1), while (1,1) puts it at the origin.

e: Another problem is that instead of the non-diffuse parts simply being a darker shade of red, they’re just black, as if in complete shadow. Hrm.

The best I can figure out is that for some reason my light source is acting like a flashlight and not a lamp.

In your shader, the comment says that you’re doing the lighting calculations in eye coordinates, which makes sense. But then a couple of lines later, you calculate vertPos4 by multiplying the input vertex position with MVP, which would include the projection.
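Concretely, the eye-space vertex position should be built from V and M only; a sketch of that line without the projection:

```glsl
// Eye-space vertex position: view * model, no projection.
vec4 vertPos4 = V * M * vec4(vertexPosition_modelspace, 1.0);
vec3 vertPos = vertPos4.xyz;
```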

Your transformation of the normal looks correct as long as your MV matrix is orthonormal. But if you for example have scaling in the MV matrix, your transformed normal would have to be renormalized. If the MV is not orthogonal (containing for example non-uniform scaling, or something like a shear transform), you can’t just use the same matrix you use for vertices. In the general case, the matrix used for transforming normals is the inverse of the transposed matrix.

Would computing the normals be done with:

//mat3 normalMatrix = transpose(inverse(mat3(M)));
//vec3 normal = normalize(normalMatrix * vertexNormal_modelspace);

(ignore the comment markers)?

Doing it this way obliterates the surface detail the normals are, I think, supposed to provide, so something’s not right there for sure.

I also fixed the issue with vertPos4, but it didn’t really help matters.

e: For some weird reason I’m getting gimbal lock with my rotation code; this didn’t happen in my older OpenGL code. I’m baffled.


	float angleRadians;
	//handles mouse motion events
	if (arcball) {
		// if left button is pressed
		cur_mx = mx;
		cur_my = my;
		if (cur_mx != old_mx || cur_my != old_my)
		{
			LastRot = ThisRot;
			lArcAngle = arcAngle;
			last_axis_in_world_coord = axis_in_world_coord;

			glm::vec3 va = get_arcball_vector(old_mx, old_my);
			glm::vec3 vb = get_arcball_vector(cur_mx, cur_my);
			angleRadians = acos(glm::min(1.0f, glm::dot(va, vb)));

			arcAngle = ((angleRadians * 180) / 3.14159265);

			axis_in_world_coord = glm::cross(va, vb);

			old_mx = cur_mx;
			old_my = cur_my;
			ThisRot = glm::rotate(glm::mat4(1.0), arcAngle, -axis_in_world_coord);
			ThisRot = ThisRot * LastRot;
			//cout << "Angle: " << arcAngle << endl;
		}

Which I then apply to:


	// Rotation
	glm::vec4 newCamLoc = ThisRot * cameraLoc;
	glm::vec4 newCamDir = ThisRot * cameraDir;

	ViewMatrix = glm::lookAt(
		glm::vec3(newCamLoc.x, newCamLoc.y, newCamLoc.z), // rotated camera position
		glm::vec3(newCamDir.x, newCamDir.y, newCamDir.z), // rotated look-at target
		glm::vec3(0, 1, 0)  // head is up (set to (0,-1,0) to look upside-down)
		);

Is the “up” portion of lookat screwing me here?

This topic was automatically closed 183 days after the last reply. New replies are no longer allowed.