I've noticed something on a couple of popular shows recently that I hope is the start of a trend. A few months ago on "CSI" they were examining a female body and briefly showed her bare breasts. On last night's "ER" the doctors were examining an elderly lady suffering some sort of trauma. In the process they very realistically removed her shirt and bra to attach leads. For a few seconds I had to wonder whether what I was seeing was real, but it was.
Everyone is familiar with the "National Geographic" specials or the Discovery Channel programs about tribes which show totally nude women and men. But these were mainstream shows employing nudity, albeit just female breasts, and I find it encouraging. Does anyone have other examples of this? Care to share your opinion of these advances?