
Questions about OpenGL shaders


Hey guys,

 

These are probably silly questions, but I'm starting to poke my head into OpenGL and have a few things I'd like to clear up.

 

glShaderSource documentation

It says:

"For implementations that support a shader compiler"

 

and there's a reference to glShaderBinary.

 

From what I gather, shaders can be compiled externally into a binary format and then loaded directly, as opposed to compiling them at load/run time.  Is this a valid concern nowadays?  Meaning, is it something I need to be aware of in code, or do all reasonably recent GPUs have the ability to compile OpenGL shaders?

 

Also, I'm guessing the shader dialect (GLSL) is tied to the OpenGL version?
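
For example (just a guess from skimming sample code), it looks like the #version directive at the top of a shader is what selects the dialect, and from OpenGL 3.3 onward the GLSL number tracks the GL version:

// The first (non-comment) line of a GLSL shader picks the dialect.
// From OpenGL 3.3 on, the number matches the GL version (330 = 3.3, 450 = 4.5).
#version 330 core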

 

glGetShaderiv documentation

 

This references errors, but the function returns void.  Is it expected that you call glGetError to determine whether it errored?  That seems to be the case, but I thought I'd ask just to be explicit.

 

Also, does anyone have a link to code with a robust solution for compiling/preparing OpenGL shaders?  It would really help me to see something that isn't a toy implementation making a lot of simplifying assumptions.

 

Thanks a bunch guys!

GPUs have been able to compile GLSL (OpenGL Shading Language) at runtime for a very long time. That wording in the documentation is mostly there for OpenGL ES, where some older embedded GPUs shipped without an online compiler and only accepted precompiled binaries. The function you referenced (glShaderBinary) does what you expect: it loads a precompiled shader binary in an implementation-defined format instead of compiling source at load time.
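
If you want to skip runtime compilation on desktop GL, the more common route is caching a linked program's binary with glGetProgramBinary / glProgramBinary (OpenGL 4.1+, or the ARB_get_program_binary extension). Here's a minimal sketch; "program" stands for an already-created program object with its shaders attached:

#include <vector>

// Before linking, ask the driver to keep a retrievable binary around.
glProgramParameteri(program, GL_PROGRAM_BINARY_RETRIEVABLE_HINT, GL_TRUE);
glLinkProgram(program);

// Save: query the binary's size and format, then copy it out.
GLint length = 0;
glGetProgramiv(program, GL_PROGRAM_BINARY_LENGTH, &length);
std::vector<GLubyte> binary(length);
GLenum format = 0;
glGetProgramBinary(program, length, nullptr, &format, binary.data());
// ...write format and binary to disk...

// Load: on a later run, feed the blob back instead of compiling GLSL.
GLuint cached = glCreateProgram();
glProgramBinary(cached, format, binary.data(), static_cast<GLsizei>(binary.size()));
GLint ok = GL_FALSE;
glGetProgramiv(cached, GL_LINK_STATUS, &ok);
if (!ok) {
  // The blob is driver/GPU specific; a driver update can invalidate it,
  // so always fall back to recompiling from source.
}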

 

To check for errors while compiling shaders, you want glGetShaderiv() with GL_COMPILE_STATUS, plus glGetShaderInfoLog() to read the compiler's messages; glGetError() won't report compile failures.

 

Here's some relatively simple code that will load a shader of a given type. It returns true on success and, if successful, writes the new shader handle through the given pointer. I haven't played with this stuff since I originally got it working years ago, but it's relatively straightforward.


// Assumes an OpenGL context is current and the GL function pointers have
// been loaded (e.g. via GLEW or glad).
#include <fstream>
#include <iostream>
#include <string>

bool loadShader(const std::string& filename, GLenum shaderType, GLuint* shaderPtr,
                bool printDebugMessages = false) {
  std::ifstream file(filename.c_str(), std::ios::in | std::ios::binary);
  if (!file.is_open()) {
    return false;
  }

  // Read the whole file, keeping newlines so compiler error messages
  // report useful line numbers.
  std::string shaderSource;
  std::string line;
  while (std::getline(file, line)) {
    shaderSource += line + '\n';
  }
  if (file.bad()) {
    return false;
  }

  GLuint shader = glCreateShader(shaderType);
  if (!shader) {
    return false;
  }

  // Hand the source to the driver and compile it.
  const GLchar* source = shaderSource.c_str();
  GLint length = static_cast<GLint>(shaderSource.length());
  glShaderSource(shader, 1, &source, &length);
  glCompileShader(shader);

  // Compile errors are reported through the shader object, not glGetError():
  // query GL_COMPILE_STATUS, then fetch the info log if it failed.
  GLint status = GL_FALSE;
  glGetShaderiv(shader, GL_COMPILE_STATUS, &status);
  if (status) {
    *shaderPtr = shader;
    return true;
  }

  // Dump error messages to the console if requested.
  if (printDebugMessages) {
    GLchar buffer[1024] = {0};
    GLsizei outSize = 0;
    glGetShaderInfoLog(shader, sizeof(buffer), &outSize, buffer);
    std::cout << "GLSL SHADER FAILED TO LOAD! DUMPING LOG...\n" << buffer << "\n";
  }
  glDeleteShader(shader);
  return false;
}
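
Once the shaders compile, you still have to link them into a program object. A quick sketch of that step (assuming vertShader and fragShader came out of loadShader above):

GLuint program = glCreateProgram();
glAttachShader(program, vertShader);
glAttachShader(program, fragShader);
glLinkProgram(program);

// Link errors are reported the same way as compile errors: through the
// program object, not glGetError().
GLint linked = GL_FALSE;
glGetProgramiv(program, GL_LINK_STATUS, &linked);
if (!linked) {
  GLchar log[1024] = {0};
  glGetProgramInfoLog(program, sizeof(log), nullptr, log);
  std::cout << "GLSL PROGRAM FAILED TO LINK! DUMPING LOG...\n" << log << "\n";
  glDeleteProgram(program);
} else {
  // The shader objects are no longer needed once the program is linked.
  glDetachShader(program, vertShader);
  glDetachShader(program, fragShader);
  glDeleteShader(vertShader);
  glDeleteShader(fragShader);
}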

 

