Censorship

I spent some time composing an email to a friend about the proposed internet censorship, and thought it deserved a wider audience. Here it is (slightly edited for context):

-------------------------------------------------

I've long held the idea of the internet less as a communications tool and more as a commons. That is, it's not so much about the getting of information as the sending, if that makes sense. Let me put it this way - radio, television and other broadcast media are about sending out information that people want to hear. If people hear enough stuff that they don't like, they'll switch off, complain or whatever, and the content will (reactively) change. The internet is almost a polar opposite (and is fast becoming more so with the web 2.0 phenomenon) - every day people put content out there, and if people don't read it, it doesn't matter. The content on the internet changes according to the whim of the people creating that content, not according to the whim of the people reading it. Of course, some things become popular (think the cheezburger franchise for a good example), but for the most part the content available on the net will still be there, even if no one reads it. Does anyone really care what old mate's Facebook profile says? If I stopped blogging tomorrow, would there be a public outcry? If all the lolcats disappeared, would anyone protest? OK, they're all pretty flippant examples, but hopefully you get the point I'm trying to make. To use a few buzzwords - it's content-driven, not consumer-driven. Most people (with a few notable exceptions) blog because they have something to say, not just because there's someone out there to hear it.

The point I'm trying to make is that the internet really cannot be classified as a "communications tool", a "broadcast medium", a "mass media outlet" or any one of a hundred other terms that can be applied to television or radio. Strange as it may seem (and stick with me here), I actually think it can be more readily likened to a town square. It's not about the content, but about how it is distributed. In the traditional town square, there would be a number of people moving in and out of it throughout the day. Imagine the old guy sitting on the bench - he's there every day, and knows all of what's going on. He's like the news sites - you go straight to him if you want to know what crimes have been perpetrated, whether it's going to rain tomorrow or when to put the bull in with the cows. Then there's the other locals, who stop when they see someone they know and have a chat together - that's your Facebook, MySpace, personal blogs and the other 'social networking' stuff. You use these people to catch up on gossip - find out if Mary's pregnant, if Bob is having an affair, if their tulips have bloomed this year. Then you have the hawkers from the shops on the Main Street - they're telling you what their specials are, trying to convince you to buy from them, just like the commercial sites and other ads. It's a fairly rough analogy, but to me it gets to the heart of what the internet is all about, especially with the advent of the Web 2.0 stuff: it's a place for people to get together - to share knowledge, to share information, to share mindless gossip.

What happens when a flasher makes it into the town square? Well, some might get a bit upset, but mostly, people would turn their backs or leave, to return later on when they felt safer, or not at all. Imagine a whole group of flashers turns up, outnumbering the benign passersby. In that case, the townsfolk would abandon the town square.
The interesting part is that they would reconvene somewhere else (if not as a large group in another part of town, in knitting groups, coffee shops and council meetings) - there will always be a commons.

Now, in a roundabout way, to the point at hand - the idea of introducing any kind of limiting or measuring tool on the internet (including classification) means that you're thinking about the internet in the wrong way. You're thinking in technical terms - about what's possible and what isn't. You're thinking about how to keep the flashers out of the commons, so that it's safe for the children to ride their bikes there. What you're not thinking about is the social and human aspect of things. Using my town square example, imagine the local policeman. He doesn't know where the flasher lives, so he can't go after him that way. What he decides to do instead is stand in the middle of the square and interview everyone who wants to come into it. The old guy sitting on the park bench is kicked out, because he matches the description of the flasher. Mildred isn't let in, because all she ever wants to talk about is her gastro problems, and that's potentially offensive. Next, he decides he's not going to let the butcher in, because his apron is all covered in blood, and that could scare the kids. What happens? Will the people come in anyway, and submit to the policeman's idea of what's 'offensive' and what's benign? Of course not - they'll go somewhere else. They might decide to meet at the park, or at the library. What happens then? The policeman is left upholding 'order' in a commons that no one uses.

When you impose censorship - in the form of blacklisting certain content, classifying content into degrees of 'offensive', or showing warnings before potentially offensive content - on direct mediums like radio and television, the result is fairly predictable. A certain, small number of viewers will turn off, never to return. Many more will accept the regulation and continue consuming. Another fairly predictable number will applaud the move as 'protecting the children'. When you impose censorship on a commons, the effect is a lot more unpredictable. To start with, many people will move around the blockages (accessing the information they want in another way, whether through technical solutions like rerouting, or non-technical solutions like buying a newspaper and phoning their friends). Of course, many will just shrug and accept it, and others will applaud the move (although the percentages may vary, these people will always exist, but they're not the interesting ones). What makes it interesting is what happens to the people who thwart the blockages - after a while, thwarting them won't be enough, and the most likely result is that they will find a new commons. The ARPANET was designed to withstand nuclear attack, and it does that. The new commons - whatever it ends up being - will be designed to withstand censorship.

To conclude my diatribe - a classification system or a blacklist or a kiddie-porn filter is not the answer. All it does is put a policeman in the town square. My blog cannot in any way be considered offensive, but if every post I wrote was subject to classification by a government body, I would stop blogging immediately. I wouldn't be the only one. Pretty soon, the internet would be devoid of everything that makes it great.


For the last several hundred thousand years, almost all the mutation going on around here, in this species, has been cultural. Every new invention is a cultural mutation, and so is every new idea ... The idea is, we're mutating already. Around here, we're mutating like crazy. Nice.


While I can fully understand the reasons behind a wish to classify the internet, the differences between 'classify' and 'censor' are really only semantic. Humans have always had commons. Usenet, message boards, Web 2.0 and 'social networking' have come about because television and radio isolated us as individuals, and so we found a new commons. If that new commons is violated, another one will be found.
