Ok. I'm gonna do a little soul-baring here on GAF, though it's not about women or anything like that, so rest assured I'm still CONFIDENT AND FUNNY.
What's everyone's take on this pop psychology stuff going around, where everyone thinks they've got a brain chemical imbalance and that some drug will help them out? My mom is currently on my ass about trying this shit, and every time I even think about doing it I just feel so dirty, like I'd be betraying myself.
See, I think there's a difference between unhappy and depressed. Most of America is unhappy... it's our birthright, our heritage, our je ne sais quoi that makes us who we are. You can't be money-grubbing, spoiled little bitches if you're ever content with anything, in my humble opinion. But that is irrelevant, to an extent. I think a LOT of people are confusing their sadness with depression, and in doing so they take these psychotropic drugs that are everywhere these days, trying to make themselves feel better. And I'm sure it does make a lot of people feel better.
But first off, I don't think there's anything wrong with me other than not being happy. I don't think this is a psychological defect, and I think it will get better once I really get my shit together again. Second off, I don't feel comfortable with the idea of taking these drugs anyway. Everyone popping pills to feel better seems too fake, too akin to Brave New World for my tastes. If you take pills that are ultimately unnecessary, just to feel better about yourself, then you're not really feeling better about yourself. You're just being given drugs that make you think you feel better about yourself, right?
I don't know. Maybe I'm wrong about all this, but I think this new dependency on psychology (which is more or less all conjecture anyway, and to a large extent that goes for psychiatry too, at least until you get into actual mental disorders) is just absurd, and it seems like an easy way for people to bow out of dealing with their own problems and fixing them on their own.
What's everyone's opinion?