frob said:
Two things, or maybe just one and a consequence.
OpenGL follows a “diamond exit” rule for line rasterization. Each pixel is treated as containing an internal diamond, and the line must exit that diamond for the pixel to be drawn. This gives smooth transitions between line segments.
The second point, really a consequence of the first, is that the last pixel of a line is not drawn: the line ends inside that pixel's diamond and never exits it.
Lots of short lines can each stay entirely inside a pixel's diamond and never be drawn at all. A line strip, by contrast, is treated as one longer, continuous line, so its segments do exit the diamonds.
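To make the rule concrete, here is a toy sketch (mine, not how a real rasterizer implements it): a pixel's diamond is the set of points whose Manhattan distance to the pixel center is under 0.5, and a fragment is produced only if the segment enters the diamond and later leaves it. The sampling below is crude; real hardware solves this analytically.

```cpp
#include <cmath>
#include <cstdio>

struct Vec2 { float x, y; };

// Manhattan (L1) distance from p to the pixel center (cx, cy).
static float diamondDist(Vec2 p, float cx, float cy)
{
    return std::fabs(p.x - cx) + std::fabs(p.y - cy);
}

// Crude sampled test: does the segment a -> b enter the diamond of
// pixel (px, py) and then leave it again?
static bool exitsDiamond(Vec2 a, Vec2 b, int px, int py)
{
    const float cx = px + 0.5f, cy = py + 0.5f;
    bool entered = false;
    const int steps = 256;
    for (int i = 0; i <= steps; ++i)
    {
        float t = (float)i / steps;
        Vec2 p = { a.x + (b.x - a.x) * t, a.y + (b.y - a.y) * t };
        if (diamondDist(p, cx, cy) < 0.5f) entered = true;
        else if (entered) return true;   // was inside, now outside: it exited
    }
    return false;                        // never entered, or ended inside
}

int main()
{
    // The segment ends at (3.5, 0.5), dead center of pixel (3, 0): that
    // diamond is entered but never exited, so the last pixel is skipped.
    Vec2 a = { 0.5f, 0.5f }, b = { 3.5f, 0.5f };
    for (int px = 0; px < 5; ++px)
        std::printf("pixel (%d, 0): %s\n", px,
                    exitsDiamond(a, b, px, 0) ? "drawn" : "skipped");
}
```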
Another “read the specification carefully” note is that line widths other than one are not guaranteed to be implemented, and implementations differ on similar subtleties such as the handling of end caps and miters at joints. For thick lines, rendering your own quads may be better, plus extra shapes for the end caps and miters.
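As an illustration of the quad approach (struct and function names are mine), each segment is expanded by offsetting its endpoints along the segment's unit normal by half the desired width:

```cpp
#include <cmath>

struct Vec2 { float x, y; };

// Writes the corners of a quad of width w centered on segment a -> b
// into out[4], in triangle-strip order.
static void segmentToQuad(Vec2 a, Vec2 b, float w, Vec2 out[4])
{
    float dx = b.x - a.x, dy = b.y - a.y;
    float len = std::sqrt(dx * dx + dy * dy);
    if (len == 0.0f) len = 1.0f;          // degenerate segment guard
    float nx = -dy / len, ny = dx / len;  // unit normal to the segment
    float hw = 0.5f * w;
    out[0] = { a.x + nx * hw, a.y + ny * hw };
    out[1] = { a.x - nx * hw, a.y - ny * hw };
    out[2] = { b.x + nx * hw, b.y + ny * hw };
    out[3] = { b.x - nx * hw, b.y - ny * hw };
}
```

End caps and miters would then be extra triangles fitted at the segment ends and joints on top of this.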
Enjoy the learning.
I did read the specs and did notice this, but it left me with more questions than answers…
Also found this: https://github.com/badlogic/line-rasterization. Nice, but it didn't help me figure it out.
JoeJ said:
Aybe One said:
But then GL_LINE_STRIP thickness isn't even; unless one goes 4K, it's noticeable:

You could render at higher resolution and sample it down for higher quality.
You could keep resolution but use TAA.
Double resolution + MSAA + downsampling should give high quality (a supersampling sketch follows below).
Or you could use a compute shader to do whatever you want, including analytical anti-aliasing.
Or you could try a library like OpenVG, which would take care of all of this, but idk how widespread GPU support is.
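For reference, here is a rough sketch of the render-at-2x-then-downsample route, assuming core GL 3.3 with a working context and loader; windowW, windowH, and drawLineStrip() are made-up placeholders, and error checking is omitted. One caveat: resolving MSAA and scaling can't be combined in a single glBlitFramebuffer call, which is why this sticks to plain supersampling.

```cpp
GLuint fbo = 0, colorTex = 0;
const int W = windowW * 2, H = windowH * 2;   // 2x supersampled target

glGenTextures(1, &colorTex);
glBindTexture(GL_TEXTURE_2D, colorTex);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, W, H, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, nullptr);

glGenFramebuffers(1, &fbo);
glBindFramebuffer(GL_FRAMEBUFFER, fbo);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                       GL_TEXTURE_2D, colorTex, 0);

// Each frame: draw the lines into the 2x target...
glViewport(0, 0, W, H);
glClear(GL_COLOR_BUFFER_BIT);
// drawLineStrip();                            // placeholder draw call

// ...then downsample to the window; GL_LINEAR averages the extra samples.
glBindFramebuffer(GL_READ_FRAMEBUFFER, fbo);
glBindFramebuffer(GL_DRAW_FRAMEBUFFER, 0);
glBlitFramebuffer(0, 0, W, H, 0, 0, windowW, windowH,
                  GL_COLOR_BUFFER_BIT, GL_LINEAR);
```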
Aybe One said:
Bresenham

Afaik Bresenham does not support subpixel accuracy. Use DDA instead.
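For reference, a minimal DDA sketch (names mine): the accumulator stays in floating point, so fractional endpoint coordinates are handled naturally, which is exactly where classic integer Bresenham falls short.

```cpp
#include <cmath>

// Calls plot(x, y) for each pixel on the segment (x0, y0) -> (x1, y1).
template <typename Plot>
void ddaLine(float x0, float y0, float x1, float y1, Plot plot)
{
    float dx = x1 - x0, dy = y1 - y0;
    int steps = (int)std::ceil(std::fmax(std::fabs(dx), std::fabs(dy)));
    if (steps == 0)
    {
        plot((int)std::lround(x0), (int)std::lround(y0));
        return;
    }
    float sx = dx / steps, sy = dy / steps;   // per-step advance <= 1 pixel
    float x = x0, y = y0;
    for (int i = 0; i <= steps; ++i)
    {
        plot((int)std::lround(x), (int)std::lround(y));
        x += sx;
        y += sy;
    }
}
```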
Yes, #1 would likely tackle the problem; #2/#3 are out of reach for me.
I tried DDA too… until I realized I needed neither of these, just vertical lines.
JoeJ said:
Assuming you don't need a depth test, you could do the AA cheaply with alpha blending. But then you need to generate a ‘thick’ mesh, turning each line segment into a quad (which is tricky at high curvature due to self-intersections):
I've drawn the alpha texture in one quad. Grey means opaque and white means transparent. I guess a width of 2 or 3 texels would give a good compromise between smoothness and sharpness.
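As a sketch of that setup (names mine; assumes a GL 3.x context, with includes and error checking omitted): build a small one-dimensional alpha ramp, opaque in the middle and fading over a few texels at each edge, sample it across the quad's width, and draw with ordinary alpha blending and no depth test.

```cpp
const int N = 16, fade = 3;
unsigned char ramp[N];
for (int i = 0; i < N; ++i)
{
    float d = std::fmin(i + 0.5f, N - (i + 0.5f));  // distance to nearer edge
    float a = std::fmin(d / (float)fade, 1.0f);     // 0 at edge -> 1 inside
    ramp[i] = (unsigned char)(a * 255.0f + 0.5f);
}

GLuint tex = 0;
glGenTextures(1, &tex);
glBindTexture(GL_TEXTURE_2D, tex);
glTexImage2D(GL_TEXTURE_2D, 0, GL_R8, N, 1, 0, GL_RED, GL_UNSIGNED_BYTE, ramp);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

// No depth test, standard "over" blending for the soft edges.
glDisable(GL_DEPTH_TEST);
glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
```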
Forgot to mention, basically I want aliased lines 😁.
dpadam450 said:
I ran a small test on my PC. Understand that if glLineWidth = 1.0, you have a line that can be intersecting pixels on both sides of its center. Imagine a vertical line between pixel 0 and pixel 1 whose x value is 0.5: the rasterized footprint is 2 pixels wide, while the line itself is 1 pixel thick.
Anyway, I looked into this a bit and found that glLineWidth = 0.75 or so may get closer to what you want. I made a program where you press a key (with a one-second pause so you can see exactly one iteration); each iteration subtracts 0.05 from the line width so you can see how it looks. Not certain you'll be happy with it. Anything under 0.7 on my machine starts to drop pixels, because the line isn't thick enough at certain locations to get close enough to a pixel center.
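A rough outline of that experiment, with the caveat that line widths other than 1.0 are implementation-dependent and may be clamped; onKeyPress, renderFrame, and drawTestLines are made-up placeholders for whatever framework drives the loop:

```cpp
float width = 1.0f;

void onKeyPress()
{
    width -= 0.05f;                // shrink the line each key press
    if (width < 0.05f) width = 0.05f;
}

void renderFrame()
{
    glClear(GL_COLOR_BUFFER_BIT);
    glLineWidth(width);
    // drawTestLines();            // placeholder: same test lines each frame
    // Below roughly 0.7 the line stops reaching some pixel centers and
    // pixels start dropping out.
}
```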
If I were using vanilla OpenGL, I would have tried this, but Unity's GL is the most basic you can think of.
dpadam450 said:
Also, this is how Audacity looks at the maximum zoom you can get. It looks pretty weird, with many dimples.
Looked at it before; yes, it's bad. Audacity is quite inaccurate/simplistic, e.g. there isn't even a zoom factor label…