I’m disturbed by what I see more and more on the news. The fights among young people and the growing acceptance of violence are scary. I want to float an idea here.
My observation of late is that the only acceptable emotions are happiness, greed, and spitefulness. I never really considered the last two until recently, and then only because of what I see around me. Our society is deeply influenced by digital media. It has probably been this way since the radio was invented, but I think television made it even more so. Right now the talking box and the folks in it are gods!
When you look through the TV guide you see so many ‘reality’ shows. The trouble is that they are not based on any reality I know. The other options are shows like Desperate Housewives, which, from what I have seen of the advertisements, would remind you of Dallas. These shows seem to glorify competition, deceit, and outer beauty. There are shows that portray decency, but they are few. Even fewer are shows that portray lives filled with emotion and real ways of dealing with it. So my question is this: are the societal norms shown on TV denying us our natural emotions? Are we accepting that certain emotions represent weakness and therefore should be denied? That the only way to get what we want or to settle a dispute is to beat the cr*p out of someone?
At what point do we acknowledge that emotions are a normal part of life and forgo the mentality of “beating people into submission”? Aren’t we supposed to be in an age of enlightenment?