
glBufferData not updating the GPU

Started by April 17, 2020 08:12 PM
6 comments, last by taby 4 years, 9 months ago

In my program, I generate a mesh. Then I generate another mesh and try to update the GPU. It works fine if the second mesh is bigger than the first, but if the second mesh is smaller than the first, it doesn't seem to work. It's like glBufferData doesn't quite take effect. Any ideas?

My GPU upload function can be found at: https://github.com/sjhalayka/julia4d3/blob/672d1b6042082a25cdb94477198e1cc3b84106e6/ssao.h#L843

Included are some screenshots. Thanks for any help that you can provide. I'm even willing to pay a small fee for the help. The project is open source, and in the public domain.
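For reference, here is a minimal sketch of the re-upload pattern (the names `mesh_buffers` and `upload_mesh` are hypothetical, not taken from the linked file). With glBufferData, the classic cause of a shrinking mesh "not updating" is that the draw call still uses the old, larger element count after the re-upload:

```c
#include <GL/glew.h>   // or your loader of choice
#include <stddef.h>

typedef struct {
    GLuint vbo, ibo;
    GLsizei num_indices;   // must be refreshed on every upload
} mesh_buffers;

// vertices: num_vertices * 6 floats (position + normal), indices: num_indices
void upload_mesh(mesh_buffers *m,
                 const GLfloat *vertices, size_t num_vertices,
                 const GLuint *indices, size_t num_indices)
{
    glBindBuffer(GL_ARRAY_BUFFER, m->vbo);
    glBufferData(GL_ARRAY_BUFFER,
                 (GLsizeiptr)(num_vertices * 6 * sizeof(GLfloat)),
                 vertices, GL_DYNAMIC_DRAW);

    glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, m->ibo);
    glBufferData(GL_ELEMENT_ARRAY_BUFFER,
                 (GLsizeiptr)(num_indices * sizeof(GLuint)),
                 indices, GL_DYNAMIC_DRAW);

    // Crucial: remember the NEW count. Drawing with the old, larger count
    // reads past the end of the smaller buffer, which can look exactly
    // like "glBufferData not updating the GPU".
    m->num_indices = (GLsizei)num_indices;
}
```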

Well, if this is truly dynamic data then glBufferData is not the correct API (it's for one-time writes). Here is one of many simple possibilities for writing to a whole buffer every frame. The buffer flushes automatically on unmap because it was created with GL_MAP_WRITE_BIT:

// setup
glCreateBuffers( 1, &(i->dynamic_vertex_buffer) );
glNamedBufferStorage( i->dynamic_vertex_buffer, sufficient_size, NULL, GL_MAP_WRITE_BIT );

// every frame, cast to my own datatype
vec4f* buf = (vec4f*)glMapNamedBuffer( i->dynamic_vertex_buffer, GL_WRITE_ONLY );

// use the pointer to write to the buffer

glUnmapNamedBuffer( i->dynamic_vertex_buffer ); // Don't forget error checking!

You can also map ranges, flush manually, synchronise, ... I think you know all that :-)

Edit: oh, I see you create a new buffer every time. I can only assume that your indices get confused somewhere, and that's why only a part is drawn. Can you use the debug output to see if OpenGL has any complaints?
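For anyone following along, enabling debug output looks roughly like this (a sketch assuming a GL 4.3+ or KHR_debug-capable context; the function name `enable_gl_debug_output` is made up):

```c
#include <GL/glew.h>
#include <stdio.h>

// Print every message the driver emits (errors, performance warnings, ...).
static void GLAPIENTRY debug_callback(GLenum source, GLenum type, GLuint id,
                                      GLenum severity, GLsizei length,
                                      const GLchar *message, const void *user)
{
    (void)source; (void)id; (void)length; (void)user;
    fprintf(stderr, "GL debug (type 0x%x, severity 0x%x): %s\n",
            type, severity, message);
}

void enable_gl_debug_output(void)
{
    glEnable(GL_DEBUG_OUTPUT);
    glEnable(GL_DEBUG_OUTPUT_SYNCHRONOUS);   // report at the offending call
    glDebugMessageCallback(debug_callback, NULL);
}
```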


It works now.

I'm not sure what was wrong, but I snagged some old OpenGL ES code from a card game that I was working on. Instead of feeding vertices and indices, I just use vertices. And instead of glDrawElements, I'm now using glDrawArrays. I understand that the indexed version of the code would be faster to render, but I dunno.
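As a rough sketch of the trade-off (helper names and numbers are made up for illustration): glDrawElements sends each shared vertex once plus a 32-bit index per corner, while glDrawArrays duplicates every triangle corner. For meshes with heavy vertex sharing, the indexed form uploads less data and also lets the GPU reuse cached post-transform vertices:

```c
#include <stddef.h>

// Bytes uploaded for T triangles, assuming 6 floats per vertex
// (position + normal) and 32-bit indices.

size_t bytes_draw_arrays(size_t T)             // every corner duplicated
{
    return T * 3 * 6 * sizeof(float);
}

size_t bytes_draw_elements(size_t V, size_t T) // V unique vertices + indices
{
    return V * 6 * sizeof(float) + T * 3 * sizeof(unsigned int);
}
```

For a quad (2 triangles, 4 shared vertices) the indexed version already uploads less; the gap widens with larger, well-shared meshes.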

Thanks again for the ideas.

Working good now!!

 

I understand now that you have a Julia routine, probably in some OpenGL 3-ish style, an OpenGL 1 or 2 style text overlay, and buffer handling that you don't know why it suddenly works.

Don't get me wrong, I am not scoffing at you; I just fear that you will always run into these problems if you just copy code together. I would really recommend you grab the red and blue books. Maybe you don't need the introductory chapters (I certainly need them), but they'll help you understand and get a better picture of how OpenGL works.

Thanks for the insight.

Yeah, I think my code is what they call a dog's breakfast. It uses OpenGL 4.x for SSAO and OpenGL 1.x for the GUI. It also uses OpenGL 2.x for glDrawPixels.

So far it works just fine on AMD and Intel. I have not heard whether it runs on NVIDIA.

I also implemented rainbow colouring:


I find that both methods work if the data are sent to the GPU each frame with GL_DYNAMIC_DRAW. If there's a framerate hit, it's not very obvious. So why is that? I have the red, blue and orange books on iBooks, but I no longer have a Mac computer. :(
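One common per-frame update pattern with GL_DYNAMIC_DRAW is "orphaning" (a sketch, not the poster's actual code; `update_dynamic_vbo` is a hypothetical name): re-specifying the store with a NULL pointer lets the driver hand back fresh memory instead of stalling on a buffer the GPU may still be reading, which is one reason re-uploading every frame often shows no obvious framerate hit.

```c
#include <GL/glew.h>

void update_dynamic_vbo(GLuint vbo, const float *data, GLsizeiptr num_bytes)
{
    glBindBuffer(GL_ARRAY_BUFFER, vbo);

    // Orphan: allocate a fresh store of the same usage, discarding the old one.
    glBufferData(GL_ARRAY_BUFFER, num_bytes, NULL, GL_DYNAMIC_DRAW);

    // Fill the fresh store with this frame's data.
    glBufferSubData(GL_ARRAY_BUFFER, 0, num_bytes, data);
}
```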

This topic is closed to new replies.
