Technology Review - Taming the Web
Technology & Sociology
'Nonetheless, the claim that the Internet is ungovernable by its nature is more of a hope than a fact. It rests on three widely accepted beliefs, each of which has become dogma to webheads. First, the Net is said to be too international to oversee: there will always be some place where people can set up a server and distribute whatever they want. Second, the Net is too interconnected to fence in: if a single person has something, he or she can instantly make it available to millions of others. Third, the Net is too full of hackers: any effort at control will invariably be circumvented by the world's army of amateur tinkerers, who will then spread the workaround everywhere.'

This article is not as good as it could be. It takes on each of these 'myths' in turn. The first one (too international) is handled adequately, but the third one (too many hackers) is a bit lop-sided, and the second one (too decentralized) is handled downright myopically... which is kind of sad. I more or less agree with the article's position (though my own view is somewhat more nuanced), but even with that bias in its favor, I didn't find it convincing.

While the third part does pay lip service to the fact that consumers will not buy things that grotesquely curtail functionality in order to protect other people's interests, I think it underestimates the importance of that fact. Consumers have already demonstrated it with the rejection of DiVX. The simple truth is that if this were an election on the feasibility of strict hardware control, the vast majority of the population would be undecided through ignorance. I believe that the more consumer-electronics buyers become aware of the limitations, the more they will reject them. This is a wild-card factor that is difficult to predict, and it will probably be the primary determining factor.

The second part of the article is downright crappy. It focuses solely on the problems Gnutella has experienced, but Gnutella is not the be-all and end-all of decentralization. In fact, Gnutella is a 1.0 design and it shows; later iterations can be expected to improve on it. The article also crows about how the Gnutella network has been "re-centralized" by the particular types of 'servers' that have attached themselves to it, but I think it greatly overestimates the importance of that.

First, the network does not depend on these servers to function; the centralization merely increases efficiency. The network doesn't need any particular server, so even if one is lost through litigation, the network moves on with slightly less efficiency. And as soon as somebody sets up another server, that loss in efficiency is recovered.
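To make that concrete, here's a toy sketch (my own illustration, not Gnutella's actual protocol or any real client's code) of why those 'servers' are an efficiency hack rather than a dependency: with the hub present, a lookup gets answered in a couple of messages; with the hub litigated away, the same query still succeeds by plain flooding, just at a higher message cost.

```python
# Toy illustration (my own, not Gnutella's real protocol): ordinary peers
# answer queries by flooding them to neighbors; an optional index "hub"
# answers in one hop.  Losing the hub costs messages, not functionality.

class Peer:
    def __init__(self, name, files=()):
        self.name = name
        self.files = set(files)
        self.neighbors = []

    def connect(self, other):
        self.neighbors.append(other)
        other.neighbors.append(self)

    def search(self, filename, ttl=4, seen=None):
        """Flood the query; return (peer holding the file, messages used)."""
        seen = seen if seen is not None else set()
        if self.name in seen or ttl < 0:
            return None, 0
        seen.add(self.name)
        if filename in self.files:
            return self, 1
        messages = 1
        for n in self.neighbors:
            hit, used = n.search(filename, ttl - 1, seen)
            messages += used
            if hit:
                return hit, messages
        return None, messages


class IndexPeer(Peer):
    """A 'server' that knows what its neighbors share, so lookups take one hop."""
    def search(self, filename, ttl=4, seen=None):
        for n in self.neighbors:
            if filename in n.files:
                return n, 2          # query plus a direct answer
        return super().search(filename, ttl, seen)


# A small overlay: a-b-c-d in a chain, plus an index peer wired to everyone.
a, b, c, d = Peer("a"), Peer("b"), Peer("c"), Peer("d", files=["song.mp3"])
for x, y in [(a, b), (b, c), (c, d)]:
    x.connect(y)
hub = IndexPeer("hub")
for p in (a, b, c, d):
    hub.connect(p)

hit, cost = hub.search("song.mp3")
print(hit.name, cost)                # cheap: the index answers directly

# "Litigate" the hub out of existence; the peers just keep flooding.
for p in (a, b, c, d):
    p.neighbors = [n for n in p.neighbors if n is not hub]
hit, cost = a.search("song.mp3")
print(hit.name, cost)                # still found, just with more messages
```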

Second, while I don't track the development of Gnutella as closely as I would like (can't have it all)... I'd bet the function of those servers could largely be decentralized. There's still room for improvement in the self-organizational aspects of Gnutella; the BearShare improvements mentioned in the article are only first approximations, and they weren't even particularly difficult to come up with. More improvements can be made, and many of them only get easier to implement as even the most pathetic machine you can buy right now has server-class power. (Even my 133MHz Pentium laptop has an awful lot of power left, with the right software!)
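As a hand-wavy illustration of what I mean by "self-organizational" (this is a made-up election rule of my own, not anything BearShare actually implements): if the index role is just a job any sufficiently beefy peer can volunteer for, then losing today's 'servers' merely triggers a re-election among whoever is left.

```python
# Hypothetical sketch of self-organizing "super-peer" election (not BearShare's
# actual algorithm): any ordinary peer with enough spare capacity can take over
# the index role, so no fixed server is ever required.

import random

class Node:
    SUPER_THRESHOLD = 50          # arbitrary capacity score needed to serve

    def __init__(self, name, capacity):
        self.name = name
        self.capacity = capacity  # e.g. a bandwidth/CPU/uptime score
        self.is_super = False

def elect_supers(nodes, want=2):
    """Promote the highest-capacity eligible nodes; demote everyone else."""
    ranked = sorted(nodes, key=lambda n: n.capacity, reverse=True)
    for i, n in enumerate(ranked):
        n.is_super = i < want and n.capacity >= Node.SUPER_THRESHOLD
    return [n for n in nodes if n.is_super]

random.seed(1)
swarm = [Node(f"peer{i}", random.randint(10, 100)) for i in range(8)]
print([n.name for n in elect_supers(swarm)])

# If the current super-peers vanish (lawsuit, power cut, boredom),
# the remaining peers simply re-run the election among themselves.
survivors = [n for n in swarm if not n.is_super]
print([n.name for n in elect_supers(survivors)])
```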

Does any of that disprove the article's hypothesis that "too interconnected to control" is a myth? No, it's still going to be an interesting fight. And that's exactly why I think the second part of the article is weak: whether its "evidence" is right or wrong doesn't strongly affect the hypothesis either way.

(Funny, I always thought my teachers & professors were overly harsh with me when I wrote essays. They always had this exact complaint: "Theses insufficiently supported." Now I'm harder on essays than the whole lot of them combined. Guess I learned my lesson. ;-) )