
B&W bitmaps instead of color ones.

Started by August 24, 2000 04:20 AM
1 comment, last by Aqutiv 24 years, 4 months ago
hey, I'm using GDI to display images from a binary file that contains a collection of 256x256 bitmaps (where each byte represents a pixel). I'm using CreateCompatibleBitmap() and SetDIBits(), but the image appears as if it were B&W (i.e. only 2 colors instead of the full 8-bit range). Here's the palette I created (assume LPBITMAPINFO ImageInf):
    
// Create a grayscale palette
for (i = 0; i < 256; i++) {
    ImageInf->bmiColors[i].rgbBlue     = i;
    ImageInf->bmiColors[i].rgbGreen    = i;
    ImageInf->bmiColors[i].rgbRed      = i;
    ImageInf->bmiColors[i].rgbReserved = 0;
}
    
But it doesn't display all 256 shades of gray. I even tried more colorful palettes, and nope... the results were sometimes different (less black, more black, etc.), but still B&W. Any idea? Thanks in advance, -Aqutiv.
AquDev - It is what it is.
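
For context, here is a minimal sketch of how the surrounding DIB setup might look around that palette loop. The names rawPixels and hdcScreen are hypothetical stand-ins for the file buffer and a window/screen DC, and the sketch assumes biBitCount is set to 8 so that SetDIBits treats each byte as an index into the 256-entry color table:

    #include <windows.h>
    #include <stdlib.h>

    // Sketch only: 'rawPixels' is assumed to hold 256*256 bytes read from the
    // file, and 'hdcScreen' is assumed to be a window or screen DC (GetDC),
    // not a fresh memory DC.
    BITMAPINFO *ImageInf = (BITMAPINFO *)malloc(sizeof(BITMAPINFOHEADER)
                                                + 256 * sizeof(RGBQUAD));
    ZeroMemory(&ImageInf->bmiHeader, sizeof(BITMAPINFOHEADER));
    ImageInf->bmiHeader.biSize        = sizeof(BITMAPINFOHEADER);
    ImageInf->bmiHeader.biWidth       = 256;
    ImageInf->bmiHeader.biHeight      = -256;   // negative = rows stored top-down
    ImageInf->bmiHeader.biPlanes      = 1;
    ImageInf->bmiHeader.biBitCount    = 8;      // one byte per pixel -> palette index
    ImageInf->bmiHeader.biCompression = BI_RGB;
    ImageInf->bmiHeader.biClrUsed     = 256;    // use all 256 grayscale entries

    for (int i = 0; i < 256; i++) {             // the grayscale palette from above
        ImageInf->bmiColors[i].rgbBlue     = (BYTE)i;
        ImageInf->bmiColors[i].rgbGreen    = (BYTE)i;
        ImageInf->bmiColors[i].rgbRed      = (BYTE)i;
        ImageInf->bmiColors[i].rgbReserved = 0;
    }

    HBITMAP hbm = CreateCompatibleBitmap(hdcScreen, 256, 256);
    SetDIBits(hdcScreen, hbm, 0, 256, rawPixels, ImageInf, DIB_RGB_COLORS);

Two things worth double-checking against this sketch: BITMAPINFO on its own only has room for one RGBQUAD, so it needs to be allocated with space for all 256 entries as above; and CreateCompatibleBitmap must be given a screen or window DC. If it is given a brand-new memory DC from CreateCompatibleDC(), the bitmap it returns is monochrome (the default bitmap selected into a fresh memory DC is 1x1, 1-bit), which would produce exactly the two-color result described here.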
Your display needs to be in 32/24-bit true color mode to see all shades of gray/color. 16-bit mode loses the low-order three bits of each R, G, B value, effectively rounding them to a multiple of 8. 256-color mode uses a palette whose first and last 10 or so entries are reserved by Windows, which may conflict with your palette.
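
As a quick way to confirm which display mode the reply above is talking about, the screen's bit depth can be queried with GetDeviceCaps. A small sketch, assuming nothing beyond a screen DC:

    // Sketch: query the screen's color depth.
    HDC screen = GetDC(NULL);                         // DC for the whole screen
    int bpp = GetDeviceCaps(screen, BITSPIXEL) * GetDeviceCaps(screen, PLANES);
    ReleaseDC(NULL, screen);
    // 32/24 = true color, 16 = high color (low-order bits dropped),
    // 8 = palettized mode with some entries reserved by Windows.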
Hmm... maybe not all shades of gray, but still more than just two constant, boring colors. In any case, I am running in 32-bit. Besides, like I said, I tried more "colorful" palettes, and it is still B&W, and looks even worse.

And even if that were the problem, how do you explain the fact that when I open the file in raw format in PSP 6, it displays in grayscale (as opposed to just B&W, which is how my app displays it)?

I might have to take a screenshot, I guess?

Edited by - Aqutiv on August 24, 2000 4:31:14 PM
AquDev - It is what it is.

