This is a brief investigation into the rendering quality of antialiased primitives on several Macintosh OpenGL renderers. The OpenGL specification allows for some implementation-dependent variance, such as the maximum width of an antialiased line. It also does not guarantee any exact rendering of a given primitive, as long as the results are repeatable. That is, OpenGL is "not pixel exact"; the results can and do vary depending on which graphics card you use.
In the middle of developing some 2D applications for Mac OS X in the Fall of 2003, I upgraded from an Apple TiBook (Radeon 7500) to an AlBook (Radeon 9600), and suddenly all my pretty antialiased artwork was jaggy and terrible. An inquiry to the mac-opengl list quickly showed that this was a hardware issue, and I was not alone in my frustration. After a bit of griping, and submitting a bunch of bug reports to Apple, I realized the first thing to do was to see how different the available implementations really are, to determine the base level of reliable functionality. My test app shows the results on the next two pages. I also collected a lot of renderer info into a GLInfo table.
For my applications, I am interested in antialiased 2D primitives (points, lines, triangles, etc.). It turns out that OpenGL points and lines are not reliable for anything other than aliased objects 1 pixel big, which are not very useful. Of course, there is already a nice 2D rendering API on Mac OS X: Quartz. But Quartz is not (currently) hardware accelerated, so while it is high quality and reliable, it is somewhat lacking in the speed department. However, OpenGL texture-mapped triangles are pretty reliable, so as suggested by previous research I decided to implement my own antialiased primitives using texture mapping, which would work reliably across all renderers. The discussion and some results are on the fourth page.