Coming back from a rather tumultuous vacation of sorts (more on that later), I'm shocked to open up the news and find that, in Colorado at least, the peace sign is considered the sign of Satan.
Whatever happened to "Peace on Earth"? Have we become so war-hungry that we cannot tolerate that? Has our sense of right and wrong become so perverted?
What the hell is happening, anyway?