The paradigm is the same as with camera sensors.
Some old camera sensors with, say, 10 megapixels are really good in hindsight because they have really big pixels, which makes them better at catching light. This is also why a lot of good phone camera sensors have been stuck at 12 megapixels for years now, up against sensors that easily reach 50 megapixels. They reach that number, but each individual pixel is crap, so a lot of manufacturers downsample (bin) the images instead, and the consumer ends up with a quarter of the resolution, which is 12.5 megapixels. IMO pixel quality ------> everything else, but this can certainly be disguised if you have a lot of pixels and good processing.
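If you want to see why binning helps, here's a rough numpy sketch of the idea (function names are mine, and it's just naive 2x2 averaging; real phone pipelines bin on the Bayer mosaic and pile on extra processing, so treat this as an illustration only):

```python
import numpy as np

# Toy sketch of 2x2 pixel binning: average each 2x2 block into one pixel,
# so a 50 MP readout becomes ~12.5 MP, and averaging 4 noisy samples
# halves the noise (sqrt(4) = 2) -- the "bigger effective pixel" benefit.

def bin_2x2(raw: np.ndarray) -> np.ndarray:
    """Average every non-overlapping 2x2 block of pixels."""
    h, w = raw.shape
    raw = raw[:h - h % 2, :w - w % 2]              # drop any odd edge row/column
    return raw.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

rng = np.random.default_rng(0)
# Small stand-in for an 8192x6144 (~50 MP) sensor full of small, noisy pixels.
scene = rng.normal(loc=128.0, scale=30.0, size=(2048, 1536))

binned = bin_2x2(scene)
print(scene.shape, "->", binned.shape)                       # (2048, 1536) -> (1024, 768)
print(round(scene.std(), 1), "->", round(binned.std(), 1))   # noise roughly halved
```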
With 1080p you're basically stuck with FRC to coax your monitor into making more colors than it natively can. FRC works by rapidly flashing two colors so your eye averages them into the color in the middle. So you now have 6-bit IPS panels (functionally worse than the 8-bit IPS panels we had 10 years ago) doing 8 bits artificially, and 8-bit panels being coaxed into pretending they're doing 10-bit (this is really common, or at least acknowledged, on PC monitors, but quite obviously used on TVs too - it's easy to spot on sub-120 Hz displays if you feed them a flat midtone and see "grain" on it). Suffice it to say, 4K panels are in on the fun as well.
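In case the FRC idea isn't clear, here's a minimal sketch of the trick, assuming the simplest possible scheme where a 6-bit panel just alternates between the two nearest native levels it has (real panels use fancier spatio-temporal patterns, and the function name is just for illustration):

```python
import numpy as np

# Minimal sketch of FRC / temporal dithering: a 6-bit panel (64 levels)
# flickers between the two nearest native levels over successive frames
# so the time-averaged brightness lands on an 8-bit target it can't
# actually display on any single frame.

def frc_frames(target_8bit: int, n_frames: int = 4) -> np.ndarray:
    """Return the 6-bit level shown on each frame for an 8-bit target."""
    target_6bit = target_8bit / 4.0                # 8-bit -> 6-bit scale
    low = int(np.floor(target_6bit))
    high = min(low + 1, 63)
    frac = target_6bit - low                       # how close we are to the upper level
    n_high = int(round(frac * n_frames))           # show 'high' on that share of frames
    return np.array([high] * n_high + [low] * (n_frames - n_high))

target = 130                                       # 8-bit value that isn't a native 6-bit level
frames = frc_frames(target, n_frames=4)
print(frames)                                      # [33 33 32 32]
print(frames.mean() * 4)                           # time average back on the 8-bit scale: 130.0
```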
With 4K, because you have so many pixels (more than you need for most TV uses, tbh) at the distance you're viewing from, you have some other options, namely look-up tables. Think of it as a pointillism painting: you build a table in advance for the colors outside the scope of your panel, and then reach that color using 4 pixels instead of one. "But I have a 4K screen showing 4K content, how does that work?" Well, at the distance you're watching from, chances are you can't resolve at least part of that detail anyway... Also consider that most content you play on your TV isn't 4:4:4 at all, it's 4:2:0, which means its color resolution is 1/4 that of the panel (the luma channel is unchanged, so the texture is there; it's the colors that are subsampled), so it's easy to just throw these spatial dithering tables onto the final color channels.
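It's the same trick as FRC, just spread over space instead of frames. A rough sketch, again assuming the most naive possible scheme (real sets use tuned dither patterns and proper LUTs; the names here are just illustrative), plus the 4:2:0 arithmetic that makes it mostly invisible:

```python
import numpy as np

# Minimal sketch of 2x2 spatial dithering: an 8-bit panel approximates a
# 10-bit value by splitting it across a 2x2 block of pixels whose average
# lands on the target -- the "pointillism" idea.

def dither_2x2(target_10bit: int) -> np.ndarray:
    """Return a 2x2 block of 8-bit pixel values averaging to the 10-bit target."""
    target_8bit = target_10bit / 4.0              # 10-bit -> 8-bit scale
    low = int(np.floor(target_8bit))
    high = min(low + 1, 255)
    n_high = int(round((target_8bit - low) * 4))  # how many of the 4 pixels go high
    return np.array([high] * n_high + [low] * (4 - n_high)).reshape(2, 2)

block = dither_2x2(515)                           # a 10-bit value between two 8-bit steps
print(block)                                      # [[129 129] [129 128]]
print(block.mean() * 4)                           # block average on the 10-bit scale: 515.0

# Why this costs little on most content: 4K video is usually 4:2:0, so the
# color (chroma) planes are only half the width and height of the picture.
luma_w, luma_h = 3840, 2160
print((luma_w, luma_h), (luma_w // 2, luma_h // 2))   # (3840, 2160) vs (1920, 1080)
```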
Anyway, this works better at 4K than at 1080p for obvious reasons (more, smaller pixels to spread the error over), so you can coax a 4K panel with shit pixels into looking "better" than a version with the same shit pixels that just happen to be bigger and 1080p.
Then you have the fact that technology looks forward, never backwards, so there are no really good 1080p sets being manufactured these days (case in point, no 240 Hz panels are being put into 1080p TVs), whereas if you go back to 2013 models you might find a really good set that can rival a lot of current TVs in actual perceivable quality. Good panels simply moved on, the same way you can't even find a single 32" FHD TV these days (nor 4K); instead you'll be stuck with shitty 2006-era 1366x768 panels (I don't know how they still source those panels, tbh). There's simply no interest in making them anymore despite some consumer demand. 1080p TVs these days are usually just cheap sets for people who value smart TV features more than image quality, so that's what you get instead.
Most panels in any era of LCD televisions were shit, though; that's true today and it was true 10 years ago, and you could even argue the old ones were worse due to having fewer pixels and less processing going on. Take the top of the range, though, and at some points you'll be impressed. A Sony 55W905A (the last Sony 1080p flagship) is still really good, but it's inferior to some 4K sets they made afterwards simply because those came later and perfected the tech. There are plenty of inferior 4K sets I wouldn't trade one for, though.