In every single performance video I watch, I never see a steady frame time, no matter how overpowered the hardware is. See these three examples:
Meanwhile on console:
You can see that frame time is never locked or consistent on PC the way it is on consoles. There are micro-stutters all over the place, all the time. I'm not saying consoles are absolutely perfect, there are some hiccups here and there, but on PC the graph is always an absolute mess.
I always see that in every single PC benchmark I watch. Is there something I'm missing here?
I don't know why people are dogging you just for not knowing something.
Most people who play on PC play at uncapped framerates. This does, indeed, mean that their framerates are not "stable" per se, but it's widely agreed that variable framerates above 90fps (and for some people even 60fps) look great and smooth in motion, so long as VRR is enabled.
VRR is when your monitor's refresh rate syncs directly to your game's framerate output. So if you weren't using VRR on a 120hz monitor, and your game was running at 90fps, then some frames (roughly every third one) would be held on screen for two refreshes instead of one, resulting in choppy, uneven motion. But with VRR, your monitor drops to 90hz and each frame is shown exactly once, delivering smooth video. And it adjusts in realtime, so anything above 90fps will always look buttery-smooth.
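To make the "uneven motion without VRR" part concrete, here's a minimal sketch with hypothetical numbers (90fps game, 120hz fixed-refresh display): each frame can only appear on the next fixed refresh tick, so some frames stay on screen longer than others.

```python
import math

REFRESH_HZ = 120
FPS = 90

refresh_interval = 1000 / REFRESH_HZ   # ~8.33 ms between refresh ticks
frame_interval = 1000 / FPS            # ~11.11 ms between rendered frames

# The refresh tick each rendered frame actually appears on (no VRR)
display_times = []
for n in range(8):
    ready = n * frame_interval
    tick = math.ceil(ready / refresh_interval) * refresh_interval
    display_times.append(tick)

# How long each frame stays on screen = gap between successive ticks
durations = [round(b - a, 2) for a, b in zip(display_times, display_times[1:])]
print(durations)
```

Instead of a steady 11.11ms cadence, you get a repeating 16.67 / 8.33 / 8.33 ms pattern — that alternation between short and long holds is the judder. With VRR, the display would simply tick every 11.11ms and every frame would be held for exactly the same duration.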
However, the other big issue with modern PC frame times is frame time spikes, aka "stuttering". This is usually caused by shaders compiling in realtime, spiking a single frame's delivery time from something like 8ms all the way up to 100-200+ms, but usually just for one or two frames.
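This is also why stutter hides in average-fps numbers. A quick sketch with made-up values (not from a real capture): a steady 8ms frame time with a single 150ms hitch still averages out to a respectable-looking framerate.

```python
# Hypothetical capture: 59 smooth 8 ms frames plus one 150 ms compilation hitch
frame_times_ms = [8] * 59 + [150]

avg_fps = 1000 / (sum(frame_times_ms) / len(frame_times_ms))
worst_fps = 1000 / max(frame_times_ms)

print(f"average: {avg_fps:.0f} fps")        # the headline number still looks fine
print(f"worst frame: {worst_fps:.1f} fps")  # the hitch is what you actually feel
```

The average works out to roughly 96fps, while the hitch frame is effectively under 7fps — which is why frame time graphs (and 1%-low numbers) tell you much more than an fps counter.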
Another common cause of this is traversal in large, usually open world games.
This problem primarily plagues PCs because every single PC setup is a little bit different. So even in games where you precompile your shaders before you start, it's pretty common for missed one-off shaders to be compiled in realtime as you play the game. Sometimes this will get better the more you play a game, and sometimes a game is just poorly coded and continues to stutter perpetually.
A lot of modern engines, especially Unreal, have had a particularly difficult time getting shader compilation and traversal stutter under control. Usually, in the games where it's really bad, there will be patches and updates that help mitigate it. And, of course, the better your hardware is, the less bad the stuttering usually is. (However, even insane $5000+ rigs can still be susceptible to stuttering depending on the game.)
The reason why this doesn't usually happen on consoles is that every console uses the exact same hardware. So developers can ensure that each and every possible shader is precompiled for, say, a PS5. And since every PS5 is identical, shader comp stutter is essentially entirely mitigated.
Traversal stutter on consoles is another story, though, and usually happens just about as often as it does on PC. But there are weird exceptions sometimes.
Additionally, because consoles offer a fixed, known configuration, developers can really min-max visual elements to give you the most consistent experience possible. For example, a 30fps game on a console might actually feel better than the same game running at 30fps on a PC, because the developer can tune things like motion blur and input tick rate specifically for that framerate. Adjusting stuff like that on PC is still possible, but it's generally up to the player to do so, which means you usually won't get the same polished experience.
That's one of the reasons why 30fps on console, depending on the game, doesn't always feel or look that bad. (Of course, it's still never preferable to 40 or 60fps.)
Finally, consoles are at a significant disadvantage when it comes to unstable or uncapped framerates. For example, the PS5 only enables VRR at 48hz or higher, meaning an unstable framerate below that will look absolutely horrible. On most PC monitors, the VRR range is not only wider, but you can also do things like put a lower framerate in a high refresh rate container (so your 40fps game displays on an 80hz monitor, and if it dips to 35fps, your monitor drops to 70hz, etc.).
Anyway, I hope this was helpful. Frame rates and frame times are super confusing and weird, and it gets even more complicated than what I went into.