In my 3D program, I compute model, view, and projection matrices that I pass to OpenGL in the vertex shader:
// Position of the vertex as seen from the current camera
gl_Position = projection * modelview * vec4(VertexPosition, 1.0);
Since I have a list of points, I'd like to draw each point's name next to its geometry. So I wrote this snippet:
for (auto const& _3Dpoint : model_->getPoints()) {
    Vector3D projected = projection_ * cameraview_.inversedMultiplication(_3Dpoint.second->getPosition());
    projected.normalize();
    renderText(projected[0] / projected[3],
               projected[1] / projected[3],
               projected[2] / projected[3],
               _3Dpoint.second->getName());
}
I expected the projected coordinates to be expressed in window coordinates, but they actually fall roughly within [-7, 7], and mostly on the positive side; that is why I added the normalization. Surprisingly, this works well until I apply a rotation or translation to my view matrix: then the point names no longer stick to the point geometry. I don't really understand this behavior, because when I apply a transformation to the camera, that transformation should affect the final coordinates as well ...
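For reference, here is my understanding of how a clip-space position is supposed to map to window coordinates (perspective divide, then viewport transform). This is only a minimal sketch using a hypothetical `Vec4` type and explicit viewport parameters, not my actual `Vector3D`/matrix classes:

```cpp
#include <array>

// Hypothetical 4-component vector; my real Vector3D type only has 3 components.
struct Vec4 { double x, y, z, w; };

// Map a clip-space position (i.e. projection * view * model * p)
// to window coordinates, mirroring what glViewport sets up.
std::array<double, 3> clipToWindow(const Vec4& clip,
                                   double vpX, double vpY,
                                   double vpW, double vpH) {
    // 1. Perspective divide: clip space -> normalized device coordinates in [-1, 1].
    const double ndcX = clip.x / clip.w;
    const double ndcY = clip.y / clip.w;
    const double ndcZ = clip.z / clip.w;

    // 2. Viewport transform: NDC -> window pixels (default depth range [0, 1]).
    const double winX = vpX + (ndcX + 1.0) * 0.5 * vpW;
    const double winY = vpY + (ndcY + 1.0) * 0.5 * vpH;
    const double winZ = (ndcZ + 1.0) * 0.5;
    return {winX, winY, winZ};
}
```

Is this the transform I should apply myself before calling renderText, or does renderText expect something else?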
Any ideas?