So there are two separate (but related) topics here: HDR rendering, and HDR output for displays. Depending on your exact Google queries, you might find information about one or both of them.
HDR rendering has been popular in games ever since the last generation of consoles (PS3/XB360) came out. The basic idea there is to perform lighting and shading calculations internally using values that can be outside the [0, 1] range, which is most easily done using floating-point values. Performing lighting without floats seems silly now, but historically GPUs did a lot of lighting calculations with limited-precision fixed-point numbers. Support for storing floating-point values (including writing, reading, filtering, and blending) was also very patchy 10-12 years ago, but is now ubiquitous. Storing floating-point values isn't strictly necessary for HDR rendering (Valve famously used a setup that didn't require it), but it certainly makes things much simpler (particularly for performing post-processing like bloom and depth of field in HDR). You can find a lot of information about this out there now that it's very common.
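For reference, here's a minimal sketch of what that looks like in practice: allocating a half-float render target with D3D11 so lighting and post-processing can work with values well outside [0, 1]. The function name and parameters are just for illustration, and the exact format and flags will depend on your engine.

```cpp
#include <d3d11.h>

// Sketch: allocate a 16-bit floating-point render target so lighting and
// post-processing can use values well outside the [0, 1] range.
// The device, width, and height are assumed to come from your own setup code.
ID3D11Texture2D* CreateHdrRenderTarget(ID3D11Device* device, UINT width, UINT height)
{
    D3D11_TEXTURE2D_DESC desc = {};
    desc.Width            = width;
    desc.Height           = height;
    desc.MipLevels        = 1;
    desc.ArraySize        = 1;
    desc.Format           = DXGI_FORMAT_R16G16B16A16_FLOAT; // half-float per channel
    desc.SampleDesc.Count = 1;
    desc.Usage            = D3D11_USAGE_DEFAULT;
    desc.BindFlags        = D3D11_BIND_RENDER_TARGET | D3D11_BIND_SHADER_RESOURCE;

    ID3D11Texture2D* texture = nullptr;
    if (FAILED(device->CreateTexture2D(&desc, nullptr, &texture)))
        return nullptr;
    return texture;
}
```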
HDR output for displays is a relatively new topic. This is all about how the application sends its data to the display, and the format of that data. With older displays you would typically have a game render with a rather wide HDR range (potentially going from the dark of night to full daytime brightness if using a physical intensity scale) and then use a set of special mapping functions (usually consisting of exposure + tone mapping) to squish that down into the limited range of the display. The basic idea of HDR displays is that you remove the need for "squishing things down", and instead have the display take a wide range of intensity values in a specially-coded format (like HDR10). In practice that's not really the case, since these displays have a wider intensity range than previous displays but still nowhere near wide enough to represent the full range of possible intensity values (imagine watching a TV as bright as the sun!). So either the application or the display itself still needs to compress the dynamic range somehow, with each approach having various trade-offs. I would recommend reading or watching this presentation by Paul Malin for a good overview of how all of this works.

As for actually sending HDR data to a display on a PC, it depends on whether the OS and display driver support it. I know that Nvidia and Windows definitely support it, with DirectX having native API support. For OpenGL I believe that you have to use Nvidia's extension API (NVAPI). Nvidia has some information here and here.
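As for that native DirectX support: it's exposed through DXGI, where you create the swap chain with a suitable format (e.g. DXGI_FORMAT_R10G10B10A2_UNORM) and then request the HDR10 (ST.2084 / BT.2020) color space. A rough sketch, assuming you already have an IDXGISwapChain3 from your own swap chain creation code (the helper name is just for illustration):

```cpp
#include <dxgi1_4.h>

// Sketch: ask DXGI for HDR10 output on an existing swap chain.
// Assumes swapChain3 was created with a format such as
// DXGI_FORMAT_R10G10B10A2_UNORM.
bool EnableHdr10(IDXGISwapChain3* swapChain3)
{
    const DXGI_COLOR_SPACE_TYPE hdr10 = DXGI_COLOR_SPACE_RGB_FULL_G2084_NONE_P2020;

    UINT support = 0;
    if (FAILED(swapChain3->CheckColorSpaceSupport(hdr10, &support)))
        return false;

    // Only switch if the display/driver can actually present this color space.
    if ((support & DXGI_SWAP_CHAIN_COLOR_SPACE_SUPPORT_FLAG_PRESENT) == 0)
        return false;

    return SUCCEEDED(swapChain3->SetColorSpace1(hdr10));
}
```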
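And to make the earlier "squishing down" step concrete, here's a minimal per-channel sketch of exposure + tone mapping for an SDR target. The Reinhard operator is used purely as an example (real games use a wide variety of curves, and the exposure value would normally be computed from the scene rather than hard-coded):

```cpp
#include <algorithm>

// Sketch of exposure + tone mapping: compress an unbounded linear HDR value
// into [0, 1] for an SDR display. Reinhard is just one example of many curves.
float ToneMapChannel(float hdr, float exposure)
{
    float exposed = std::max(hdr, 0.0f) * exposure; // apply exposure first
    return exposed / (1.0f + exposed);              // Reinhard: approaches 1 asymptotically
}
```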
Be aware that using HDR output isn't necessarily going to fix your banding issues. If fixing banding is your main priority, I would suggest making sure that your entire rendering pipeline is set up in a way that avoids common sources of banding. The most common source is storing color data without the sRGB transfer curve applied to it; that curve acts like a sort of compression function that ensures darker color values have sufficient precision in an 8-bit encoding. It's also possible to mask banding through careful use of dithering.
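To illustrate both points, here's a sketch of a final encode step: apply the sRGB transfer curve before quantizing to 8 bits, and add a small amount of dither noise to break up what banding remains (how you generate the noise, e.g. from a blue-noise texture or a per-pixel hash, is up to you):

```cpp
#include <algorithm>
#include <cmath>
#include <cstdint>

// Standard sRGB encode for a linear value in [0, 1].
float LinearToSrgb(float linear)
{
    return (linear <= 0.0031308f)
        ? linear * 12.92f
        : 1.055f * std::pow(linear, 1.0f / 2.4f) - 0.055f;
}

// Quantize to 8 bits, with dither noise in roughly [-0.5, 0.5] added before
// rounding so that quantization error is decorrelated and banding is masked.
uint8_t EncodeChannel(float linear, float ditherNoise)
{
    float srgb = LinearToSrgb(std::clamp(linear, 0.0f, 1.0f));
    float q    = srgb * 255.0f + ditherNoise;
    return static_cast<uint8_t>(std::clamp(q, 0.0f, 255.0f) + 0.5f);
}
```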