fisheye view with OpenGL?
Hello members,
Is it possible to create a REAL fisheye view with OpenGL? I mean the effect where objects very near to the camera get distorted spherically. An example:
Imagine a square plane with, let's say, 20 x 20 faces. Now move the camera towards the plane and increase the field of view. If you keep going, with a real camera the edges of the plane start to bend in towards the center, and your square plane starts to look like a sphere. Is this possible to achieve with OpenGL, or do I have to write my own transformation for that?
OpenGL uses only the z coordinate as the distance to the camera; is there a way to tell OpenGL to use the full distance from the vertex to the viewpoint instead?
Lyve
_____________________________________
Visit http://www.nilsschneider.de for finest trance music, studio, bio, guestbook and more!
_____________________________________
http://www.winmaze.de, a 3D shoot 'em up in OpenGL: nice graphics, multiplayer, chat rooms, a nice community, worth visiting! ;)
http://www.spheretris.tk, an upcoming Tetrisphere clone for Windows, a 3D Tetris game on a sphere with powerful graphics for GeForce FX and similar graphics cards.
Why not scale your w from 1 to 0, with 1 at the camera/player center and 0 far, far away from it?
I'm not sure if this will work, but it seems right...
~~~~~
Screaming Statue Software. | OpenGL FontLib
Why does Data talk to the computer? Surely he's Wi-Fi enabled... - phaseburn
I did it like this (not real fisheye, though, but close):
1) Calculate static texture coordinates for an NxM grid (64x64 seems suitable) into which you divide your screen, adding a sin/cos component to the calculation - with the right parameters you can create all kinds of distortions. Precalculating these values saves you a lot of computational power later on.
2) Capture the frame.
3) Draw the captured frame as an NxM textured grid using the coordinates from step 1.
If you want I can post some more specific stuff later on (a rough sketch of the three steps follows below).
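Here is a rough, untested C/OpenGL 1.x sketch of those three steps. GRID, TEX_SIZE, the warp strength and the function names are made-up values for illustration, and it assumes the window is at least TEX_SIZE pixels in each direction:

#include <math.h>
#include <GL/gl.h>

#define GRID      64          /* NxM grid resolution          */
#define TEX_SIZE  512         /* power-of-two capture texture */

static float texcoord[GRID + 1][GRID + 1][2];   /* step 1: precalculated */
static GLuint capture_tex;

void init_distortion(float strength)
{
    int i, j;
    for (j = 0; j <= GRID; ++j) {
        for (i = 0; i <= GRID; ++i) {
            float u = (float)i / GRID;          /* 0..1 across the screen */
            float v = (float)j / GRID;
            /* add a small sin/cos component; tweak strength and
               frequency for other kinds of distortion */
            texcoord[j][i][0] = u + strength * sinf(v * 3.14159f);
            texcoord[j][i][1] = v + strength * cosf(u * 3.14159f);
        }
    }
    glGenTextures(1, &capture_tex);
    glBindTexture(GL_TEXTURE_2D, capture_tex);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, TEX_SIZE, TEX_SIZE, 0,
                 GL_RGB, GL_UNSIGNED_BYTE, NULL);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
}

/* call this after rendering the scene normally */
void draw_distorted_frame(void)
{
    int i, j;
    /* step 2: capture the frame that was just rendered */
    glBindTexture(GL_TEXTURE_2D, capture_tex);
    glCopyTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, 0, 0, TEX_SIZE, TEX_SIZE);

    /* step 3: redraw it as a screen-aligned NxM grid */
    glEnable(GL_TEXTURE_2D);
    glDisable(GL_DEPTH_TEST);
    glMatrixMode(GL_PROJECTION); glPushMatrix(); glLoadIdentity();
    glOrtho(0, 1, 0, 1, -1, 1);
    glMatrixMode(GL_MODELVIEW);  glPushMatrix(); glLoadIdentity();
    for (j = 0; j < GRID; ++j) {
        glBegin(GL_QUAD_STRIP);
        for (i = 0; i <= GRID; ++i) {
            glTexCoord2fv(texcoord[j][i]);
            glVertex2f((float)i / GRID, (float)j / GRID);
            glTexCoord2fv(texcoord[j + 1][i]);
            glVertex2f((float)i / GRID, (float)(j + 1) / GRID);
        }
        glEnd();
    }
    glPopMatrix();
    glMatrixMode(GL_PROJECTION); glPopMatrix();
    glMatrixMode(GL_MODELVIEW);
}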
Never do anything that is a waste of time and be prepared to wage long tedious wars over this principle - Michael O'Connor
"Literally, it means that Bob is everything you can think of, but not dead; i.e., Bob is a purple-spotted, yellow-striped bumblebee/dragon/pterodactyl hybrid with a voracious addiction to Twix candy bars, but not dead."- kSquared
quote:
Original post by cippyboy
Increase the FOV to over 100 degrees?
Doesn't work, because OpenGL only uses the z coordinate as the distance to the viewer, not the full distance to the vertex.
You can simply use a vertex shader that transforms vertices according to their distance from the camera and their angle with the view direction.
That's the easiest solution, and probably the cleanest.
[edited by - SKSlayer on June 7, 2003 5:24:39 PM]
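For reference, here is an untested sketch of the kind of per-vertex math such a shader could do, written as a plain C function on an eye-space point (camera at the origin, looking down -Z). The equidistant mapping, the name fisheye_project and the fov parameter are my own assumptions; in a real implementation the same lines would live in the vertex shader after the modelview transform:

#include <math.h>

/* in:  eye-space vertex (x, y, z); fov = full fisheye field of view in radians
   out: normalized screen coords nx, ny in roughly -1..1, plus the radial
        distance for use as depth */
void fisheye_project(float x, float y, float z, float fov,
                     float *nx, float *ny, float *depth)
{
    float dist  = sqrtf(x*x + y*y + z*z);       /* true distance to the eye    */
    float r_xy  = sqrtf(x*x + y*y);             /* offset from the view axis   */
    float theta = atan2f(r_xy, -z);             /* angle to the view direction */

    /* equidistant fisheye: screen radius grows linearly with the angle */
    float r = theta / (0.5f * fov);

    if (r_xy > 0.0f) {
        *nx = r * (x / r_xy);
        *ny = r * (y / r_xy);
    } else {
        *nx = *ny = 0.0f;                       /* vertex exactly on the axis  */
    }
    *depth = dist;                              /* radial distance, not just z */
}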
(you can find me on IRC : #opengl on undernet)
An ugly solution would be to use OpenGL in glOrtho projection mode and write your own world-to-screen coordinate converter... you can then still use OpenGL to depth test, apply textures and everything else... it might not look too good, though. *shrug*
(Back in my freshman year, I wrote a QBasic program that took the relationship between the screen width (in pixels) and the view angle, and applied that to the angle of the point in 3D space (obtained with an arctan call)... it worked quite well.)
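For what it's worth, here is a rough C reconstruction of that idea. The original was QBasic, so every name and constant below is a guess; the camera is assumed at the origin, looking down -Z:

#include <math.h>

#define SCREEN_W   640
#define SCREEN_H   480
#define H_FOV      2.0943951f   /* 120 degrees in radians (arbitrary) */
#define V_FOV      1.5707963f   /*  90 degrees                        */

/* map a point in eye space to a pixel by its viewing angles */
void angular_project(float x, float y, float z, int *px, int *py)
{
    float yaw   = atan2f(x, -z);                  /* left/right angle of the point */
    float pitch = atan2f(y, sqrtf(x*x + z*z));    /* up/down angle                 */

    /* linear angle-to-pixel mapping, screen centre = view direction;
       this is an angular (fisheye-like) projection, not a planar one */
    *px = (int)(SCREEN_W * 0.5f + (yaw   / H_FOV) * SCREEN_W);
    *py = (int)(SCREEN_H * 0.5f - (pitch / V_FOV) * SCREEN_H);
}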
Disclaimer: "I am in no way qualified to present advice on any topic concerning anything and can not be held responsible for any damages that my advice may incur (due to neither my negligence nor yours)"
Wouldn't using something like a 180-degree FOV give the same result as a fisheye? That would be really simple and easy to do. Is there something else going on with a fisheye?
quote:
Original post by skow
Wouldn't using something like a 180-degree FOV give the same result as a fisheye? That would be really simple and easy to do. Is there something else going on with a fisheye?
Fisheye projects onto a curved surface. The difference is apparent when there are straight lines in the scene: with a wide FOV projected onto a plane the lines stay straight, but with a fisheye they become curved.
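A tiny numerical illustration of that (my own example, not from the thread): three points on a straight horizontal edge, projected once onto a plane and once with an angular (equidistant fisheye) mapping. The planar y stays constant, so the edge remains straight; the fisheye y changes with x, so the edge bows into a curve:

#include <math.h>
#include <stdio.h>

int main(void)
{
    /* a straight horizontal edge at height y = 0.5, depth z = -1 */
    float xs[3] = { 0.0f, 1.0f, 2.0f };
    int i;
    for (i = 0; i < 3; ++i) {
        float x = xs[i], y = 0.5f;

        /* planar (perspective) projection: divide by -z */
        float px = x / 1.0f, py = y / 1.0f;

        /* equidistant fisheye: screen radius = angle to the view axis */
        float r_xy  = sqrtf(x*x + y*y);
        float theta = atanf(r_xy / 1.0f);
        float fx = theta * x / r_xy, fy = theta * y / r_xy;

        printf("planar (%.3f, %.3f)   fisheye (%.3f, %.3f)\n", px, py, fx, fy);
    }
    /* planar y stays at 0.500 for every x (the edge is still straight);
       fisheye y drops as x grows (0.464, 0.376, 0.271), so the edge curves */
    return 0;
}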
-solo
www.tommyraz.com
info@tommyraz.com