Ah, the simple joy of the sun. The way it feels on your skin as you're basking in it on a hot day at the beach. Even better, the way you feel the next day when it has settled into your skin, giving you that sun-kissed glow and baby freckles we all long for. How can something that feels (and looks) so good be so bad for us? Deadly, even.
I remember being a little kid and getting such a bad sunburn on my back that I had to wear my dad's XXL t-shirts to bed -- my tiny body bright red and covered in aloe from head to toe. Back then it wasn't about getting a tan; it was about having so much fun at the pool that I didn't want to stop for a second to put on sunscreen, let alone go home before it was dark out.
So when did the sun become so terrible for us? I'll be the first to admit that I've visited the tanning booth one too many times. I haven't been in almost a year now, ever since the reports came out claiming that tanning beds were as deadly as arsenic. Arsenic?! That scared me away from 'Ultimate Sun' faster than you can imagine.
But if I've already been going tanning for years (indoors and out), is the damage already done? I have one friend who claims that if we've already been doing it all this time, "we're already screwed," so we should just keep on tanning. Part of me almost wants to believe her so that I can (ignorantly and blissfully) keep my summer glow all through those long winter months. But deep down, I know that it's completely ridiculous and untrue. It's that same philosophy of "I already had one bite of cake, I might as well eat the whole thing." And we all know that gets us nowhere, except 20 pounds heavier and unable to fit into our jeans.
I know there are lotions and creams out there now -- spray tans, even -- that claim to give us that same look. But it's not the same! Don't worry, I'm not trying to justify my way back to the tanning bed. I just wish that the end result wasn't wrinkles and skin cancer (so not worth it).