Our body is the vessel through which we experience the physical world, and it houses our deepest instincts. It's where we feel pleasure and love, judge and destroy, bleed and smile.
Since the David and the Venus, we've placed the human body on the highest pedestal, shaping and perpetuating an unattainable standard of beauty. Through this distorted lens, our natural form has been sexualized, demonized, and at times shamed into taboo.
Why is the naked body offensive? Does it hold a political charge? Are we not more than meets the eye?