
Using screen (2D) coordinates instead of 3D space?

Started by August 13, 2005 03:36 PM
1 comment, last by Luctus 19 years, 3 months ago
Hey there. I'm trying to figure out how to use screen coordinates (e.g. 0, 0, 100, 100) when creating a quad, instead of (0.0f, 0.0f, 1.0f, 1.0f). I hope you get what I'm trying to say: the first uses pixels, while the second uses OpenGL units. I don't need to do this through my entire app, just when I draw a certain quad. How do I do this? Thanks in advance, Matt U.
:==-_ Why don't the voices just leave me alone?! _-==:
I don't know OGL, but if width pixels correspond to 1 unit across and height pixels correspond to 1 unit down...

To go from pixels to units, just do x/width and y/height (as floats, so the division isn't truncated).
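A minimal sketch of that conversion, assuming the visible area spans exactly 1.0 unit in each direction (the function names here are illustrative, not from the thread):

// Convert a pixel coordinate to a normalized [0, 1] unit coordinate,
// given the window size in pixels.
float pixelToUnitX(int x, int width)  { return (float)x / (float)width;  }
float pixelToUnitY(int y, int height) { return (float)y / (float)height; }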
You want to set an orthographic projection matrix that corresponds to your screen size. You can do this with gluOrtho2D, e.g. gluOrtho2D( 0, screen_width, 0, screen_height ) (the arguments are left, right, bottom, top), then use glVertex2i for specifying coordinates.
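A minimal sketch of that approach, assuming legacy fixed-function OpenGL; screen_width, screen_height, and drawPixelQuad are illustrative names, not from the thread:

#include <GL/gl.h>
#include <GL/glu.h>

// Switch to a pixel-space orthographic projection, draw one quad with
// integer pixel coordinates, then restore the previous matrices.
void drawPixelQuad(int screen_width, int screen_height)
{
    glMatrixMode(GL_PROJECTION);
    glPushMatrix();
    glLoadIdentity();
    // Arguments are left, right, bottom, top. This puts (0,0) at the
    // bottom-left; swap the last two arguments for a top-left origin.
    gluOrtho2D(0, screen_width, 0, screen_height);

    glMatrixMode(GL_MODELVIEW);
    glPushMatrix();
    glLoadIdentity();

    // A 100x100-pixel quad at the origin, specified in pixels.
    glBegin(GL_QUADS);
        glVertex2i(0,   0);
        glVertex2i(100, 0);
        glVertex2i(100, 100);
        glVertex2i(0,   100);
    glEnd();

    // Restore the old projection so the rest of the scene still
    // renders with the 3D matrices.
    glMatrixMode(GL_PROJECTION);
    glPopMatrix();
    glMatrixMode(GL_MODELVIEW);
    glPopMatrix();
}

Pushing and popping the matrices keeps the pixel-space projection local to this one quad, which matches the "just when I draw a certain quad" requirement.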
-Luctus
In the beginning the Universe was created. This has made a lot of people very angry and been widely regarded as a bad move - Douglas Adams

