Why is there such a stigma around plastic surgery? Like, I can't be the only one who doesn't think it's a big deal. But when I hear of someone getting any work done, everyone seems to be up in arms about it. Like it's their business! Maybe I missed the memo on that one. But I just don't see it as such a big deal.
First off- It's. None. Of. Your. Business! Second- Who cares? Third- See the first point.
If someone has the extra money and that's what they want to do with it, who am I to judge? Who are you to judge? Hell, I get Botox every 3 months because I have wrinkles on my forehead that make me self-conscious. Would someone telling me their unsolicited opinion about how I should love myself the way God made me change my mind? No. I have friggen' speed bumps across my face... it looks like the damn Publix parking lot!!
If you want to change something about yourself, and you have the money & it's something you want to do-- then by all means, please do! It's your life to do with what you want.
It makes me laugh to see some of the people who criticize those who get work done when they themselves have tattoos! *I have tattoos- nothing against them* But it's a permanent change to your body, just like new tits would be! So what's the difference? Just mind your own biscuits and life will be gravy.
I'm bringing this up for one because it's just what came to mind this morning, but also because I was thinking back on an old friend (who I am not friends with anymore) telling me about a girl we went to school with getting work done, and she was so mean about it. I didn't get it. Who did it affect?! And whether it was true or not, the girl looks good! So what's the big, fat, hairy deal?! I just don't understand why people think they have a right to push their opinions on others. Maybe it's my age, or something, but I don't get it.
My point is- if you want to do anything in life- no matter what- if it makes you happy & doesn't hurt others... do it. It's your life... no one else's.