(SDL) Crash during quit from fullscreen
This is a new issue I'm having. Basically, if I pass SDL the SDL_FULLSCREEN flag when setting the video mode, it crashes on SDL_Quit. The only information I've gleaned from debugging is that the actual segfault/access violation happens inside the OGL video driver. I'm using OGL, obviously. What I don't understand is why I'm getting an access violation at all. This hasn't happened before, and I can't figure out what exactly I changed to cause it.
[EDIT] It appears to be somehow related to VBO creation. Not sure what's going on yet.
[EDIT #2] This is weird as hell. If I force it to use a software buffer instead of a VBO, it works fine. If I don't, it causes a driver error during SDL_Quit. This only happens when using fullscreen. Furthermore, this exact code worked fine in the demo I released and never caused a problem. What. The. Hell.
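For reference, the init/shutdown path looks roughly like this; it's a trimmed sketch in the SDL 1.2 style, and the resolution, bit depth, and function names are just placeholders:

// Trimmed sketch of the init/shutdown path (SDL 1.2 API); resolution and
// bit depth are placeholders.
#include <SDL.h>

bool InitVideo( bool fullscreen )
{
    if( SDL_Init( SDL_INIT_VIDEO ) < 0 )
        return false;

    Uint32 flags = SDL_OPENGL;
    if( fullscreen )
        flags |= SDL_FULLSCREEN;    // the flag that triggers the crash later

    // The crash only happens when SDL_FULLSCREEN was used here...
    if( !SDL_SetVideoMode( 1024, 768, 32, flags ) )
        return false;

    return true;
}

void ShutdownVideo()
{
    // ...and the access violation fires inside the GL driver during SDL_Quit.
    SDL_Quit();
}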
[Edited by - Promit on July 14, 2004 6:46:49 PM]
It runs fine throughout the program, though?
Yep. All the data in the VBO is good, rendering goes fine. And no errors in windowed mode. Just fullscreen.
I had something like this happen when I had a buffer overflow in a VBO index buffer (uploaded more indices than I'd allocated space for). It didn't crash until at the end of the program, when destroying the OpenGL context.
Is SDL_Quit registered with atexit, or are you calling it explicitly?
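For reference, the kind of overflow I mean looked roughly like this (the variable names here are made up for illustration):

// Hypothetical illustration of the index-buffer overflow described above.
GLuint ibo;
glGenBuffersARB( 1, &ibo );
glBindBufferARB( GL_ELEMENT_ARRAY_BUFFER_ARB, ibo );

// Allocate storage for numIndices indices...
glBufferDataARB( GL_ELEMENT_ARRAY_BUFFER_ARB,
                 numIndices * sizeof( GLushort ), NULL, GL_STATIC_DRAW_ARB );

// ...but upload more than that. Rendering can still look fine, and the
// driver may not fall over until the context is destroyed at shutdown.
glBufferSubDataARB( GL_ELEMENT_ARRAY_BUFFER_ARB, 0,
                    ( numIndices + extra ) * sizeof( GLushort ), indices );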
I'm calling it explicitly now, but it was originally being called implicitly via atexit (I added the explicit call while tracking down the problem).
In any case, I'll look at the indices...but I don't remember changing anything.
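To be clear about the two patterns (the atexit registration is the one the SDL docs show in their examples):

// Implicit cleanup: register SDL_Quit to run automatically at process exit.
SDL_Init( SDL_INIT_VIDEO );
atexit( SDL_Quit );

// Explicit cleanup: call it yourself during shutdown.
SDL_Quit();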
Well that's quite odd.
I'm taking this to the OGL forum.
[EDIT] Oh, I suppose you're wondering what the problem was.
This line:
glBufferDataARB( GL_ARRAY_BUFFER, m_SizeBytes, NULL, TranslateUsage( m_Usage ) );
Used to discard the buffer, allocate space, etc., it was what was breaking everything. I originally saw the technique in OGRE. I remembered some mention of it behaving improperly but didn't pay much attention.
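For anyone who runs into this later, the call in context looked roughly like this; m_BufferID and m_Data are members of my own wrapper class, so treat them as placeholders:

// Rough sketch of how the buffer was being (re)allocated.
glBindBufferARB( GL_ARRAY_BUFFER, m_BufferID );

// Passing NULL discards the old contents and (re)allocates m_SizeBytes of
// storage; this is the call that was breaking under fullscreen.
glBufferDataARB( GL_ARRAY_BUFFER, m_SizeBytes, NULL, TranslateUsage( m_Usage ) );

// The actual vertex data is uploaded separately afterwards.
glBufferSubDataARB( GL_ARRAY_BUFFER, 0, m_SizeBytes, m_Data );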