Basically, after George W. Bush won a second term, he proclaimed that his win gave him a "mandate" to push through whatever he wanted.
Unsurprisingly, Republicans will always deny that a Democratic POTUS has a mandate, even if they win by a wider margin than Bush did in 2004.
If somehow Democrats take back the House, expect them to loudly claim that the electorate gave them a "mandate" to push their agenda.
So is it a formal thing that's officially implemented and agreed upon? Or was this just W saying he had the right to push through policy because he won? Is it just something politicians say?