> **a light breeze said:** This looks like a gamma-correction issue. Perform blending in linear color space, then convert to sRGB on display, and the difference between the two should go away.
The glyph cache texture has an `R8G8B8A8 sRGB` format. In this example, the texels have either { 0,0,0 } or { 1,1,1 } RGB components; the A component is the only one that really matters, as it contains the coverage. After sampling from the glyph texture, the RGB components are pre-multiplied by the A component. (Note that the sRGB format is not strictly necessary here: 0 and 1 map to themselves under the sRGB transfer function, so hardware filtering yields the same linear color without it, and the A component is stored linearly either way.)
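To make that step concrete, here is a minimal C++ sketch of the per-pixel math; the `Float4` type and the `premultiply` name are mine for illustration, not from any particular API:

```cpp
// Hypothetical 4-component color (r/g/b plus a = coverage).
struct Float4 { float r, g, b, a; };

// The per-pixel math right after the hardware sample: the sampler has
// already decoded sRGB -> linear for RGB (a no-op for texels that are
// exactly 0 or 1) and filtered; A is linear coverage. Pre-multiplying
// RGB by the coverage sets up the blend shown further below.
Float4 premultiply(Float4 sampled)
{
    return { sampled.r * sampled.a,
             sampled.g * sampled.a,
             sampled.b * sampled.a,
             sampled.a };
}
```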
The pre-multiplied color is then combined with a vertex color (in this example, either { 0,0,0,1 } or { 1,1,1,1 }).
The final color is written to a render target with an `R8G8B8A8 sRGB` format. Blending against an sRGB target reads the destination back, decodes it to linear, blends, and encodes the result back to sRGB on write. So all color arithmetic (including the blending) happens in linear space, while all colors are stored as sRGB.
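Spelled out end to end, the arithmetic looks roughly like the following C++ sketch. The component-wise modulation and the `One` / `InvSrcAlpha` blend state are my assumptions about the usual pre-multiplied-alpha setup; the sRGB transfer functions are the standard piecewise ones the hardware applies implicitly at the render-target boundary:

```cpp
#include <cmath>

struct Float4 { float r, g, b, a; };

// Standard piecewise sRGB transfer functions (what the hardware does
// implicitly when reading from / writing to an sRGB render target).
float srgbToLinear(float s)
{
    return s <= 0.04045f ? s / 12.92f
                         : std::pow((s + 0.055f) / 1.055f, 2.4f);
}
float linearToSrgb(float c)
{
    return c <= 0.0031308f ? c * 12.92f
                           : 1.055f * std::pow(c, 1.0f / 2.4f) - 0.055f;
}

// One text pixel under the assumed pre-multiplied-alpha blend state
// (SrcBlend = One, DstBlend = InvSrcAlpha). `glyph` is the
// pre-multiplied sample, `vertex` the per-vertex text color, and
// `dst` the current render-target value, already decoded to linear.
Float4 blendTextPixel(Float4 glyph, Float4 vertex, Float4 dst)
{
    // Modulate: component-wise multiply of glyph and vertex color.
    Float4 src = { glyph.r * vertex.r, glyph.g * vertex.g,
                   glyph.b * vertex.b, glyph.a * vertex.a };
    // Linear-space blend: src + dst * (1 - src.a).
    return { src.r + dst.r * (1.0f - src.a),
             src.g + dst.g * (1.0f - src.a),
             src.b + dst.b * (1.0f - src.a),
             src.a + dst.a * (1.0f - src.a) };
}
```

The result of `blendTextPixel` is what gets pushed through `linearToSrgb` on store; on the GPU both conversions happen for free as part of the sRGB format.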
http://rastertragedy.com/RTRCh5.htm#Sec3 mentions that the gamma exponent can be tweaked to adjust the perceived boldness for the black-and-white case. I currently just rely on the implicit hardware conversions from and to sRGB.
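I don't do that tweak, but a hypothetical sketch of such a knob (the `adjustCoverage` name and parameterization are mine, not what the linked page prescribes verbatim) would reshape the linear coverage before pre-multiplying:

```cpp
#include <cmath>

// Hypothetical boldness knob: raise the linear coverage to an
// adjustable exponent before pre-multiplying. Exponents below 1 boost
// partial coverage (bolder-looking glyphs); above 1 thin them out.
float adjustCoverage(float coverage, float exponent)
{
    return std::pow(coverage, exponent);
}
```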