
SIGBUS in SDL_FreeSurface (when debugging)

Started April 26, 2014 10:21 PM
0 comments, last by yogo1212

Hi,

I'm currently experiencing a really weird problem with SDL2, and maybe one of you is brighter than I am...

Basically, I have this function:

function GetString(fontid: byte; thing: string): TEngineString;
var
  surf: PSDL_Surface;
begin
  Result := EngineString(thing, dispStringXSalt, dispStringDSalt);

  if not GameResourceHas(Result, grtTexture) then
  begin
    // So, why is this method still a catastrophe?

    surf := TTF_RenderUTF8_Blended(fonts[fontid], PChar(thing), foregroundColour);

{$IFDEF ENGINEDEBUG}
    if surf^.format^.BitsPerPixel = 32 then
{$ENDIF}
      GameResourceAdd(GameTextureLoader(Result, surf^.w, surf^.h,
        GL_RGBA, GL_UNSIGNED_BYTE, GL_RGBA, surf^.pixels), Result)
{$IFDEF ENGINEDEBUG}
    else
      raise Exception.Create('invalid pixelformat for font-rendering. ouch.');
{$ENDIF}

    SDL_FreeSurface(@surf);
  end;
end;

And at runtime it raises a SIGBUS (inside SDL_SetPixelFormatPalette, called from SDL_FreeSurface)?!

This happens only when debugging. It doesn't matter whether I debug through the IDE or with gdb.

If I don't debug, no error occurs and my little simulation works as normal...

I recently changed the whole method and this is the original:


function GetString(fontid: byte; thing: string): TEngineString;
var width, height: Longint; surf: PSDL_Surface;
begin
  Result := EngineString(thing, dispStringXSalt, dispStringDSalt);

  if not GameResourceHas(Result, grtTexture) then
  begin
    // So this method is a catastrophe

    TTF_SizeUTF8(fonts[fontid], PChar(thing), @width, @height);

    surf := TTF_RenderUTF8_Blended(fonts[fontid], PChar(thing), foregroundColour);

    if surf^.format^.BitsPerPixel = 32 then
      GameResourceAdd(GameTextureLoader(Result, width, height, GL_RGBA,
        GL_UNSIGNED_BYTE, GL_RGBA, surf^.pixels), Result)
    else
      raise Exception.Create('invalid pixelformat for font-rendering. ouch.');

    SDL_FreeSurface(@surf);
  end;
end;  

Surprisingly, this method does work without raising an error when debugging :-/

But GetString will be called a lot and I want to optimise it.

I am totally puzzled, but I found this: "EDIT: Strange, I recompiled and it worked! EDIT: Recompiled again; it failed. EDIT: I just didn't check the debugger the first time :P"

(here http://www.cplusplus.com/forum/general/59965/)

and this is just how I feel.

kind regards, yogo1212

Update:

This version works - even with debugging on (notice the extra declared variables):


function GetString(fontid: byte; thing: string): TEngineString;
var
  width, height: longint; surf: PSDL_Surface;
begin
  Result := EngineString(thing, dispStringXSalt, dispStringDSalt);

  if not GameResourceHas(Result, grtTexture) then
  begin
    // So, is this method still a catastrophe?

    //TTF_SizeUTF8(fonts[fontid], PChar(thing), @surf, @surf);

    surf := TTF_RenderUTF8_Blended(fonts[fontid], PChar(thing), foregroundColour);

{$IFDEF ENGINEDEBUG}
    if surf^.format^.BitsPerPixel = 32 then
{$ENDIF}
      GameResourceAdd(GameTextureLoader(Result, surf^.w, surf^.h,
        GL_RGBA, GL_UNSIGNED_BYTE, GL_RGBA, surf^.pixels), Result)
{$IFDEF ENGINEDEBUG}
    else
      raise Exception.Create('invalid pixelformat for font-rendering. ouch.');
{$ENDIF}

    SDL_FreeSurface(@surf);
  end;
end;

This makes things really interesting...

Because they are not used, they don't end up in the symbol table. But they are still allocated on the stack!

So, could the SIGBUS be a SIGSEGV in disguise?

Addresses on x86_64 are byte-aligned, and a memory mapping from one of the virtual addresses contained in the PixelFormat to a physical address on the stack is very unlikely...

I'm installing the SDL sources and I'll try to find the responsible line.

Has none of you ever heard of something like this?

