
Passing Window Handles

Started by GreatAjax, December 01, 2000 05:30 PM
2 comments, last by GreatAjax
I am a beginning OpenGL programmer working on my first 3D engine. I want to separate my game logic from the 3D engine as much as possible. Is it OK to have the WinMain and WndProc functions in my game code and then pass the window handles to my graphics engine, or will I run into problems? Any suggestions on how to keep my 3D engine separated from my game logic, so that I can easily move the engine from one project to another with little difficulty?

---------------------------- -=GreatAjax=- ----------------------------
As long as whatever you pass the handle to is in the same address space, you should have no problems. I.e. it's OK to pass them back and forth between functions in the same executable, or between an executable and a DLL, but if you pass them between two separate executables (two processes) you might have problems.
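For example, something like this (engine.h, EngineInit and EngineShutdown are just made-up names for whatever your engine exposes, not a real API):

// game.cpp - the game owns WinMain/WndProc and hands the HWND to the engine
#include <windows.h>
#include "engine.h" // hypothetical header declaring EngineInit()/EngineShutdown()

LRESULT CALLBACK WndProc(HWND hWnd, UINT msg, WPARAM wParam, LPARAM lParam)
{
    if (msg == WM_DESTROY) { PostQuitMessage(0); return 0; }
    return DefWindowProc(hWnd, msg, wParam, lParam);
}

int WINAPI WinMain(HINSTANCE hInstance, HINSTANCE, LPSTR, int nCmdShow)
{
    WNDCLASS wc = {0};
    wc.lpfnWndProc   = WndProc;
    wc.hInstance     = hInstance;
    wc.lpszClassName = "GameWindow";
    RegisterClass(&wc);

    HWND hWnd = CreateWindow("GameWindow", "My Game", WS_OVERLAPPEDWINDOW,
                             CW_USEDEFAULT, CW_USEDEFAULT, 640, 480,
                             NULL, NULL, hInstance, NULL);
    ShowWindow(hWnd, nCmdShow);

    // Same address space, so handing the HWND to the engine (exe or dll) is fine.
    EngineInit(hWnd);

    MSG msg;
    while (GetMessage(&msg, NULL, 0, 0))
    {
        TranslateMessage(&msg);
        DispatchMessage(&msg);
    }

    EngineShutdown();
    return (int)msg.wParam;
}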
Rule of thumb: in the thread where you create the window, also get the DC and create the render context. If you want to do something with the DC or render context (e.g. render a polygon or change the Windows mouse pointer) in ANOTHER THREAD of your application, you have to retrieve a new DC and create a new render context in that thread.

The window handle is the system-wide handle for your window; that won't change, so you can even pass it to other running processes. The device context that you retrieve from that window is NOT the same system-wide, but differs PER THREAD (as stated in the MSDN docs / Win32 Platform SDK).
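A rough sketch of that rule of thumb, assuming an EngineInit(HWND) entry point like the hypothetical one above (link against opengl32.lib):

// engine.cpp - called in the SAME thread that created the window
#include <windows.h>
#include <GL/gl.h>

static HDC   g_hDC = NULL;
static HGLRC g_hRC = NULL;

bool EngineInit(HWND hWnd)
{
    // Get the DC and create the render context here, in the window's thread.
    g_hDC = GetDC(hWnd);

    PIXELFORMATDESCRIPTOR pfd = {0};
    pfd.nSize      = sizeof(pfd);
    pfd.nVersion   = 1;
    pfd.dwFlags    = PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL | PFD_DOUBLEBUFFER;
    pfd.iPixelType = PFD_TYPE_RGBA;
    pfd.cColorBits = 32;
    pfd.cDepthBits = 24;

    int format = ChoosePixelFormat(g_hDC, &pfd);
    if (format == 0 || !SetPixelFormat(g_hDC, format, &pfd))
        return false;

    g_hRC = wglCreateContext(g_hDC);
    // The render context is only current in the thread that calls wglMakeCurrent;
    // another thread that wants to render needs its own DC and render context.
    return g_hRC != NULL && wglMakeCurrent(g_hDC, g_hRC) == TRUE;
}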



--
Get productive, Get DemoGL: http://www.demogl.com

Edited by - Otis on December 2, 2000 4:40:29 AM
Also, if you want to keep your 3D render code separated from the game logic, make the 3D render code an API that is called from the game code. In other words, write the 3D render code as a library and export a couple of functions and data structures/classes. Your game code can then load/link to that library, use the data structures to store data for the 3D render code, and call the functions you exported to get stuff drawn.
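For instance, the exported interface could look something like this (all names are illustrative, not an existing library):

// engine.h - hypothetical interface the game code links against
// (for a DLL build you'd add the usual __declspec(dllexport)/__declspec(dllimport))
#ifndef ENGINE_H
#define ENGINE_H

#include <windows.h>

// Data structure the game fills in and the renderer consumes.
struct RenderObject
{
    float position[3];
    float rotation[3];
    int   meshId;
};

bool EngineInit(HWND hWnd);                                // set up DC / render context
void EngineSubmit(const RenderObject* objects, int count); // hand scene data to the renderer
void EngineRenderFrame();                                  // draw everything submitted this frame
void EngineShutdown();

#endif // ENGINE_H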

So, you should have some kind of 'kernel' that controls the several pieces of code in your game: the AI, the game logic itself, graphics control, audio control, the audio producer and the graphics producer (the 3D renderer).

You can then let the graphics control part call functions of the graphics producer directly, or you could use some kind of 'state machine' system that lets the kernel decide what to do: all subsystems, except the audio and graphics producers, produce data (one produces data for the other), and when they're finished the kernel calls e.g. one function in the graphics producer, 'RenderFrame', and one function in the audio producer, 'PlaySoundFrame'.

So (off the top of my head, mumbling ideas) you roughly have two kinds of systems: one where every part calls functions directly in the other parts, with each part exporting a kind of API, and one with separate blocks that consume and produce data, each consumer fed by a producer, where a kind of 'state machine' kernel controls when which subsystem runs. The latter is good for multithreading and parallel execution of stuff.
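Roughly, the kernel loop in the second setup could look like this (every function here is just a placeholder for one of the subsystems above, not a real API):

// kernel.cpp - sketch of a kernel driving the subsystems once per frame
void UpdateAI();            // AI: produces decisions for the game logic
void UpdateGameLogic();     // game logic: produces the new world state
void BuildGraphicsData();   // graphics control: turns world state into render data
void BuildAudioData();      // audio control: turns game events into sound data
void EngineRenderFrame();   // graphics producer: draws the frame
void PlaySoundFrame();      // audio producer: plays the queued sounds

void KernelRunFrame()
{
    // The data-producing subsystems run first, each consuming the previous one's output...
    UpdateAI();
    UpdateGameLogic();
    BuildGraphicsData();
    BuildAudioData();

    // ...then the kernel calls the two producers once to turn that data into output.
    EngineRenderFrame();
    PlaySoundFrame();
}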

--
Get productive, Get DemoGL: http://www.demogl.com

This topic is closed to new replies.
