I'm trying to figure out a good way to draw a realistic night sky consisting of stars with wildly varying magnitudes. There is the classic and easy solution of baking everything into a skybox, but not only does this require an exorbitantly high resolution to look nice and sharp, it also limits how dynamic the night sky can be. Rendering the stars as points therefore seems desirable, but exactly how those points should be rendered still leaves me uncertain. Looking at existing star-rendering implementations, there are a few different techniques, each with its own ups and downs.

Crucially, aliasing shouldn't plague the image quality: naïve single-pixel points look strange indeed when the camera shifts its orientation ever so slightly.

Another aspect to consider is the glare from bright stars. Ideally this would be handled by a post-process bloom pass, but that may not be sufficient for tiny points that occupy just one pixel of the screen. Glare could instead be drawn as part of the star-rendering pass, but that requires some extra care if it is to work in tandem with tone mapping and exposure control.

Finally, there's the question of how large the stars may appear. In the real world, stars are far too small for the eye to resolve their actual shape, and at conventional resolutions they cover far less than a single pixel, so a realistic star should never occupy more than a small quad on screen, excluding the glare. However, if the desire for a fictional night sky arises, there may be a need to add an extra apparent-size parameter into the equation. Some implementations inflate the apparent sizes of all stars, even the very faint ones, which I think looks unappealing and unrealistic.
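
To make the question more concrete, here is the rough direction I'm imagining for the point-based approach, written as plain C++ rather than shader code, with all names being my own placeholders: each star's magnitude is converted to a linear intensity via the standard Pogson relation (a difference of 5 magnitudes is a factor of 100 in flux), and that energy is spread over a small Gaussian footprint of a pixel or two so that sub-pixel camera motion doesn't make the star flicker.

```cpp
#include <cmath>

// Rough sketch, not production shader code; all names are my own placeholders.

// Pogson's relation: a difference of 5 magnitudes is a factor of 100 in flux,
// so linear intensity scales as 10^(-0.4 * m) relative to a chosen reference.
float magnitudeToLinearIntensity(float magnitude, float referenceMagnitude = 0.0f)
{
    return std::pow(10.0f, -0.4f * (magnitude - referenceMagnitude));
}

// Normalized 2D Gaussian footprint of a star centred at (cx, cy), evaluated
// at a pixel centre (px, py). Spreading the star's energy over a footprint of
// sigma ~ 0.5-1.0 pixels keeps the total brightness roughly constant as the
// star drifts across pixel boundaries, which avoids the flickering of naïve
// single-pixel points.
float starFootprintWeight(float px, float py, float cx, float cy, float sigma)
{
    const float dx = px - cx;
    const float dy = py - cy;
    const float norm = 1.0f / (2.0f * 3.14159265358979f * sigma * sigma);
    return norm * std::exp(-(dx * dx + dy * dy) / (2.0f * sigma * sigma));
}
```

In an actual renderer I'd expect this evaluation to live in the fragment shader of a small per-star quad, with the resulting HDR intensity fed into the usual exposure and tone-mapping pipeline, but I'm not sure whether this is the best of the available techniques.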
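
For the glare question, the extra consideration I have in mind is roughly this: if glare sprites are drawn in the star pass itself (in HDR, before tone mapping), their radius probably has to be derived from the exposed intensity rather than the raw one, so that adjusting exposure changes the spread of the glare the same way a bloom pass would. A hypothetical sizing function (the constants are arbitrary tuning knobs, not values taken from any existing implementation):

```cpp
#include <cmath>

// Hypothetical glare sizing: only stars whose exposed intensity exceeds a
// threshold get a glare sprite, and the radius grows with the excess
// brightness above that threshold.
float glareRadiusPixels(float linearIntensity, float exposure,
                        float threshold = 1.0f, float spread = 4.0f)
{
    const float exposed = linearIntensity * exposure;
    if (exposed <= threshold)
        return 0.0f;                                // too faint for visible glare
    return spread * std::log2(exposed / threshold); // grows with excess brightness
}
```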
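
And for the apparent-size question, what I'd like to avoid is the uniform inflation I mentioned. Something like the following sketch (again, placeholder names of my own) is what I have in mind: the quad stays just large enough to hold the anti-aliasing footprint for realistic stars, and only an explicit, fictional apparent-size parameter is allowed to grow it, scaled by brightness so faint stars are left alone.

```cpp
#include <cmath>

// Sketch of a size mapping: realistic stars keep a fixed small quad (just
// enough pixels to contain the anti-aliasing footprint), and only a fictional
// apparentSize parameter inflates the quad, weighted by brightness so that
// faint stars are not inflated uniformly along with the bright ones.
float starQuadSizePixels(float linearIntensity, float apparentSize = 0.0f)
{
    const float realisticSize = 3.0f; // enough to contain the Gaussian footprint
    const float inflation = apparentSize * std::sqrt(linearIntensity);
    return realisticSize + inflation;
}
```

Does this overall direction make sense, or are there better-established techniques for any of these three concerns?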