HDR is, by definition, increased dynamic range, and an 8-bit panel is, by definition, not increased dynamic range. To be reductive: SDR uses 256 steps/gradations to represent everything from the darkest black to the brightest white; there can be no more than 256. It likewise has 256 steps each of red, green, and blue to represent every color, shade, and hue within that range; no more than 256x3. HDR increases all of that to 1024 steps, so there is much more granularity between the darkest black, the whitest white, and all the colors, shades, and hues therein. An 8-bit panel physically cannot display more than 256 steps; it is merely "processing" at 1024, then converting/clipping/truncating down to 256 for display. 10-bit panels can display the full 1024. This is why people call 8-bit "HDR" TVs faux HDR. How much difference there is in practice and actual use, I can't say. My guess is 8-bit HDR looks a bit like the oft-maligned "Dynamic Contrast" or "Vivid" modes that tend to crush blacks/shadow detail and blow out whites.
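To illustrate the truncation idea, here's a minimal sketch of collapsing 1024 steps down to 256 by dropping the two least significant bits. This is just one simple way a 10-bit value could be reduced to 8 bits for the math to make sense; it's not a claim about any specific TV's actual processing pipeline (real sets may round, dither, or use FRC instead).

```python
def truncate_10bit_to_8bit(value_10bit: int) -> int:
    """Drop the two least significant bits: 1024 steps -> 256 steps."""
    return value_10bit >> 2

# Four adjacent 10-bit gradations all land on the same 8-bit step,
# which is where the lost granularity goes:
samples = [512, 513, 514, 515]
print([truncate_10bit_to_8bit(v) for v in samples])  # -> [128, 128, 128, 128]
```

Four distinct shades in the 10-bit signal become one shade on the panel, which is the kind of banding/crushing people complain about.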