This might just be because I'm paying attention to it for the first time in my adult life, but it seems to me like WWE is becoming somewhat destigmatised in the mainstream. I think it has something to do with the PG era meaning there's less overtly heinous bullshit being said and done, and with the slow takeover by younger people who either don't carry negative associations from the past or simply seem fresh. It's probably not that pronounced; like I said, my perspective exaggerates it, but it might be a real thing. It's probably also partly an effect of parents watching with their kids, parents who may not have done so back when you had 'Puppies' to contend with every PPV, and who now think wrestling is actually more wholesome than it used to be. Which is true, it is, even though a closer look reveals a lot of the dark and ignorant bullshit that always existed. We see that, but others don't, with the exceptions of the treatment of the Women's division and the occasional homophobic or sexist promo. A lot of people probably don't even pick up on or care about those two problems, either.