This could be down to either the Xbox or the monitor itself.
I'm running a Series X on a Gigabyte G32QC, which has a 1440p 165 Hz panel. It's also FreeSync Premium Pro certified, so in theory I shouldn't have any problem running 120 Hz and HDR simultaneously. The monitor also has a console mode that lets it accept a 4K signal and downscale it.
Here is what I'm experiencing. If I set my console to 1440p @ 120 Hz, I get 120 Hz in supported games and VRR works, but HDR never kicks in ("Allow 4K" unchecked).
If I set my console to 4K @ 60 Hz, the monitor downscales the 4K signal to 1440p and I get HDR. VRR also works ("Allow 4K" checked).
And lastly, the funny part: if I set my console to 1440p @ 120 Hz and check the "Allow 4K" option, I get HDR but lose 120 Hz. When I launch a game, the screen goes black for a second and then displays 4K @ 60 with HDR. This is probably down to my monitor's EDID: since it's an HDMI 2.0 monitor, the Xbox thinks it can't handle 4K 120 Hz with HDR, which is true. Gigabyte could in theory change the EDID to advertise 4K 120 Hz with HDR, so the monitor would accept the signal and downscale it to 1440p with HDR and 120 Hz untouched, but that's probably impossible given HDMI 2.0's bandwidth limits.
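As a rough sanity check on that bandwidth argument, here's a back-of-the-envelope calculation. It counts active pixels only and ignores blanking intervals, so the real requirements are somewhat higher; the 14.4 Gbit/s figure is HDMI 2.0's 18 Gbit/s link rate minus 8b/10b encoding overhead. The mode names and numbers are my own illustration, not anything from the Xbox or monitor specs.

```python
# HDMI 2.0: 18 Gbit/s TMDS link rate, 8b/10b encoding -> ~14.4 Gbit/s of usable data.
HDMI20_DATA_GBPS = 18 * 8 / 10

def min_gbps(width, height, hz, bits_per_channel, channels=3):
    """Lower bound on required data rate in Gbit/s (active pixels only, no blanking)."""
    return width * height * hz * bits_per_channel * channels / 1e9

modes = {
    "1440p @ 120 Hz, 10-bit RGB (HDR)": min_gbps(2560, 1440, 120, 10),
    "4K @ 60 Hz, 10-bit RGB (HDR)":     min_gbps(3840, 2160, 60, 10),
    "4K @ 120 Hz, 10-bit RGB (HDR)":    min_gbps(3840, 2160, 120, 10),
}
for name, gbps in modes.items():
    verdict = "fits" if gbps <= HDMI20_DATA_GBPS else "exceeds HDMI 2.0"
    print(f"{name}: >= {gbps:.1f} Gbit/s ({verdict})")
```

4K @ 120 Hz with 10-bit color needs roughly 30 Gbit/s before blanking is even counted, about twice what HDMI 2.0 can carry, so no EDID tweak could make it work. (Even 4K @ 60 HDR doesn't fit as full RGB, which is why it's normally sent with chroma subsampling over HDMI 2.0.)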
So the only hope for 1440p @ 120 Hz with HDR is on Microsoft's side. I read somewhere on Reddit that they're working on it, but nothing official so far. My monitor is DisplayHDR 400 certified with around 430 nits of peak luminance, so I don't care about HDR as much as I do about 120 Hz and VRR, but I also don't agree with those saying HDR400 won't make a difference anyway. It does make a difference, and it would be nice to have.
So far I'm playing single-player games at 4K @ 60 with HDR. Most of them don't support 120 Hz anyway, so it's fine. However, I feel additional input lag in this mode, probably from the downscaler, so it's not ideal. For multiplayer games that support 120 Hz, I switch back to 1440p @ 120 with "Allow 4K" unchecked; it feels much better.