Intelligence is the capacity to receive, decode and transmit information efficiently. Stupidity is blockage of this process at any point. Bigotry, ideologies etc. block the ability to receive; robotic reality-tunnels block the ability to decode or integrate new signals; censorship blocks transmission.
Also a great quote from Alan Watts:
Inability to accept the mystic experience is more than an intellectual handicap. Lack of awareness of the basic unity of organism and environment is a serious and dangerous hallucination. For in a civilization equipped with immense technological power, the sense of alienation between man and nature leads to the use of technology in a hostile spirit—to the “conquest” of nature instead of intelligent co-operation with nature.
Note that there’s also a Bill Hicks video page.
Peter Kim, who describes himself as a traditional marketing professional, gave an interesting talk at this morning’s Social Media Breakfast. He says at his site that he’s working on an enterprise social technology company, along with Kate Niederhoffer, who was also at the SMB, and my pal Doug Rushkoff, who’s “not from around here.” I’m mulling this over: he says he’s a traditional marketer, but he’s helping build a social tech company, so there might be a contradiction here, especially given his talk, in which he questioned whether social media really works for marketing. Actually, he led by questioning whether negative social media experiences (like fake blogs) had any impact on companies like Wal-Mart and Comcast… it’s not like their stock went south based on blogosphere or videosphere bad buzz. I pointed out, though, that the companies had done far worse without taking a huge hit. It’s a complicated world, and social media makes it even more so.
Another question Kim asked was whether companies could scale their use of social media so that it made a positive difference as part of their marketing efforts. Why are companies still spending three million dollars on Super Bowl ads if social media can be effective? As always happens with new forms of media, at least early on, the new doesn’t replace the old; it’s just another way of communicating. I think most of us who’ve been at this for quite a while suspect we’re seeing a revolution: the new converged media will be truly transformative, more and more so over time. I suspect Peter Kim sees that more clearly than he let on.
The talk got me thinking. Social media is complex; it’s niche, it’s political, and it involves all sorts of personalities and personal quirks. User-generated content requires monitoring or moderation or some kind of oversight, so there’s very real and possibly expensive social overhead. Some companies are jumping in and others are interested, but a social web strategy requires a lot of thought, and perception from new angles — flexing new brain muscles you didn’t know you had as you think your way into it. And you can’t own it in the same way you could own a top-down marketing campaign. In a sense, it owns you, and requires that you be authentic…
My friend Mike Chapman said at one point that “there are no rules. When you try to put rules around it, you break it.”
The Jobs incident was the second time in a week that mainstream media organizations have been embarrassed by their online citizen journalism arms – sparking debate about the accuracy of reports from these Web sites and showing how it takes only a few minutes for a scurrilous rumor, placed on a site without sufficient editorial checks, to inflict damage.
So what’s the cure? A dozen years ago Bob Anderson and I were talking about the emerging new media ecology and the question of information authority in that context. We figured media literacy should be taught alongside reading, writing, and ‘rithmetic. Support critical thinking, not censorship or authoritarian structures for distributing information.
Education isn’t always enough; sometimes you really do need moderators, hopefully with a light touch. The SFGate story linked above describes how sexually explicit photos were posted at CBS’s mobile phone application site, after which CBS promised “to redouble its efforts to police content.” A moderator had quickly removed the photos. Some might argue that photos should be screened before they’re posted, and some sites would do it that way, but that’s a daunting task, especially where you may have thousands of posts, and it’s not in the spirit of the many-to-many mediasphere. CNN does have moderators for iReport, but they’re not checking facts… “mostly, it is the job of iReport users themselves to weed out erroneous or inappropriate material.” That’s the social media way – the “vetting” is crowdsourced, and the reader must read critically, never assuming that the “news source” is correct. I would argue that’s always been the case, even with the best journalists. I’ve never been close to a news story that wasn’t wrong in some of the particulars, at least from my perspective. And that’s part of the problem – perspectives and interpretations differ. That’s why I left journalism behind – when I was in journalism school, it seemed pretty clear that it would be hard to tell the truth. Only a few gonzo journalists, à la Hunter Thompson, realized they, and their biases, had to be transparent within the reporting…