A New Web, But What?
The father of the World Wide Web, Sir Tim Berners-Lee, is not done with us all yet. Last Sunday (September 14, 2008) he and Steve Bratt (CEO of the World Wide Web Consortium – W3C) welcomed a multitude to the Newseum in Washington, DC, to announce a new initiative for the web.
It was, maybe, a little ironic that Berners-Lee should make the announcement at the end of the week in which the Large Hadron Collider (LHC) was fully powered up for the first time, for it was while he was at CERN that he started the work that led to the current web, HTML, and the rest. We might all grumble about aspects of the resulting technology (mine is: why on earth disallow the use of the ampersand in domain names?), but we all benefit from the work that was done by him – and many others – day after day.
The reasons he believes his work is not yet done are many, but one of the most important to him is that there is no standard of believability for web sites around the world. One of the stories he pointed out was the continuing fear-mongering about the LHC creating huge black holes that would consume our world. He does not yet know the answers to the believability problem, but his World Wide Web Foundation will search for them.
He would somehow like to be able to label web sites with an accuracy rating of some kind: to separate those that give facts from those that rumor-monger; those that give real scientific information from those that spout bad science; those that tell the truth from those that lie. He would also like to open the web – really open the web – to those who will increasingly rely on portable devices to keep them in contact with the outside world. This would empower almost entire continents, like Africa, where computers are hardly de rigueur outside the major cities, even in South Africa. Only 20% of the world's current population has access to the web.
When we hear the constant advertising of “3G speeds” in association with wireless networks and particular phones, we have to realize that there is no firm definition of what that means – it might be 114 kbit/s or 384 kbit/s – with no comparison to the 54 Mbit/s that 802.11 Wi-Fi systems are throwing around. It may well be that Berners-Lee's new web will be based on a protocol other than existing commercial voice/data standards, although he says he hopes that is not the case.
The web we have today was designed for academics, and it now serves a general population with western values. A teenager with a prepaid phone card is well served by the existing technology for the thousands of text messages he or she sends monthly; the business user is well served by the e-mail and phone capabilities of, say, a Blackberry. The rural user in Kenya – looking for a way to treat a sick child, learn English, or find out what disease is affecting his goats – could hardly use what we have available now. There are plenty of handsets in rural areas, but they may not be the right devices for the future universal web, which must allow those who are uneducated, illiterate, or poorly educated to obtain reliable information in a format they understand.
Berners-Lee is also concerned that the level playing field of the web – something that has allowed EN-Genius to breathe, survive, and grow – has become a tool for big business and vested interests that are not entirely interested in truthful information. It has also become a political tool and a weapons system, with hackers being trained and deployed as part of many countries' defensive and offensive capabilities.
The new Foundation will act as a facilitator, bringing together anybody and everybody who has an interest in the next stages of the web and its use for mankind. Seed financing for the Foundation is being provided by the John S. and James L. Knight Foundation in the form of $1 million a year for five years.
The universal spread of available information, through the web, is altruistic and honorable. Sir Tim has admitted in interviews that he had no idea where the original web was going to go, nor how pervasive it would be in so many people's lives. Getting web technology worldwide is obviously very possible. There are permutations, certainly, but the basic structures are available and cost effective. Whether there is a viable commercial model is a completely different matter, and it may well be a case where infrastructure donations are sought in the form of equipment rather than cash.
What the Foundation is going to have the most trouble with is how to label web sites for honesty or usefulness. This is not a matter of machine testing; it will require human intervention of some sort – a task so massive that I cannot imagine how it might be undertaken.
Wikipedia, for example, is a wonderful notion, but EN-Genius never links to it because of its unreliability. When Stephen Colbert coined the term "wikiality" in his segment The Word (on The Colbert Report) one night in 2006 – telling viewers, not directly asking them to modify anything but suggesting they find the entry on elephants, that the number of elephants in Africa had increased threefold in the last six months – about twenty articles were vandalized before they could be locked down.
That’s the web we have come to know, love and, also, mistrust.