I'm beginning to truly despise America.
The thought of living in a place where police can murder you "just in case", and the state backs them every step of the way.
It's literally sickening. It's disgusting.
The general lack of care towards human life is just nasty.
Guns tear people to shreds daily, because fuck you, I want my guns.
American police murder more innocent people in a month than police in most other first-world countries fire bullets in the same span.
The INSANE number of times the GOP has tried to repeal Obamacare, like their lives depend on it.
Even the damn food portions are made to fucking kill you.
There is an entire political party DEDICATED to bigotry, homophobia, sexism, racism, whatever -ism you can think of, and it's supported by almost half of the entire fucking nation.
There is an entire news network dedicated to pushing all those -isms, and it's the most popular news network in the country...
A country where a man can murder a kid, get away with it, then go do a fucking photo shoot at the manufacturing plant of the brand of gun he used, smiling and posing with the fucking murder weapon, then auction it off later as some trophy for some other sick fuck to win?
Wow.
And then there's Trump, doing a pretty good job of making sure the rest of the world loses its illusions about what America really is and what almost half of its people truly stand for.
"Greatest nation in the world?" Okay.
If that's greatness, I don't want to be anywhere near it. I don't want anything to do with it, let alone have my kids grow up surrounded by it.