What has a “really stupid” name and gets a B+ or A- Grade?

by shiva

On February 7, 1958, in response to the launch of Sputnik, the US Department of Defense established the Advanced Research Projects Agency (ARPA). Several years later ARPA began to focus on computer networking and communications technology. UCLA proposed to ARPA to organize and run a Network Measurement Center for the ARPANET project.

Around Labor Day in 1969, BBN delivered an Interface Message Processor (IMP) to UCLA that was based on a Honeywell DDP 516, and when they turned it on, it just started running. It was hooked by 50 Kbps circuits to two other sites (SRI and UCSB) in the four-node network: UCLA, Stanford Research Institute (SRI), UC Santa Barbara (UCSB), and the University of Utah in Salt Lake City.

That was the birth of the ARPANET. The work it set in motion on internetworking protocols eventually brought the Internet into being.

The history of every great invention is based on a lot of pre-history. In the case of the World-Wide Web, there are two lines to be traced: the development of hypertext, or the computer-aided reading of electronic documents, and the development of the Internet protocols which made the global network possible.

The Web was a side effect of CERN's scientific agenda. After World War II, the nuclear research centers of almost every developed country became the places with the highest concentration of talented scientists. For about four decades, many of them were invited to CERN's international laboratories, and a distinctive intellectual culture grew there from one generation of scientists and engineers to the next. When the concentration of talent per square foot of CERN's labs reached critical mass, it set off an intellectual explosion.

Until the early 1970s, the so-called area of "Data Communications" at CERN was in a state of chaos. The variety of different techniques, media and protocols used was staggering; open warfare existed between many manufacturers' proprietary systems, various home-made systems (including CERN's own "FOCUS" and "CERNET"), and the then rudimentary efforts at defining open or international standards.

By 1989 CERN’s Internet facility was ready to become the medium within which Tim Berners-Lee would create the World Wide Web with a truly visionary idea. In fact an entire culture had developed at CERN around “distributed computing”, and Tim had himself contributed in the area of Remote Procedure Call (RPC), thereby mastering several of the tools that he needed to synthesize the Web such as software portability techniques and network and socket programming.

The Web, a crucial point in human history, was born.

Tim Berners-Lee graduated from the Queen's College at Oxford University, England, in 1976. Whilst there he built his first computer with a soldering iron, TTL gates, an M6800 processor and an old television. Berners-Lee comes by his vocation naturally. His parents helped design the world's first commercially available computer, the Ferranti Mark I. "The family level of excitement about mathematics was high," he says, recalling the breakfast-table teasing of his younger brother, then in primary school, who was having trouble fathoming the square root of negative four.

In adolescence Berners-Lee read science fiction, including Arthur C. Clarke's short story Dial F for Frankenstein. It is, he recalls, about "crossing the critical threshold of number of neurons," about "the point where enough computers get connected together" that the whole system "started to breathe, think, react autonomously." Could the World Wide Web actually realize Clarke's prophecy? No, and yes. Berners-Lee warns against thinking of the Web as truly alive, as a literal global brain, but he does expect it to evince "emergent properties" that will transform society.

He spent two years with Plessey Telecommunications Ltd (Poole, Dorset, UK), a major UK telecom equipment manufacturer, working on distributed transaction systems, message relays, and bar code technology. In 1978 Tim left Plessey to join D. G. Nash Ltd (Ferndown, Dorset, UK), where he wrote, among other things, typesetting software for intelligent printers and a multitasking operating system.

A year and a half spent as an independent consultant included a six-month stint (Jun-Dec 1980) as consultant software engineer at CERN, the European Particle Physics Laboratory in Geneva, Switzerland. Whilst there, he wrote for his own private use his first program for storing information using random associations. Named "Enquire", and never published, this program formed the conceptual basis for the future development of the World Wide Web.

From 1981 until 1984, Tim worked at John Poole’s Image Computer Systems Ltd, with technical design responsibility. Work here included real time control firmware, graphics and communications software, and a generic macro language. In 1984, he took up a fellowship at CERN, to work on distributed real-time systems for scientific data acquisition and system control. Among other things, he worked on FASTBUS system software and designed a heterogeneous remote procedure call system.

It is for “random reasons” that Berners-Lee is known as the inventor of the World Wide Web, he says. “I happened to be in the right place at the right time, and I happened to have the right combination of background.” The place was CERN, the European physics laboratory that straddles the Swiss-French border, and he was there twice. The first time, in 1980, he had to master its labyrinthine information system in the course of a six-month consultancy. That was when he created his personal memory substitute, a program called Enquire. It allowed him to fill a document with words that, when clicked, would lead to other documents for elaboration.
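Enquire itself was never published, so no code from it survives in public. The core idea it pioneered can nonetheless be sketched in a few lines; everything below (the document names, the bracket syntax for links) is invented purely for illustration, not Berners-Lee's actual design:

```python
import re

# A toy Enquire-style store: documents whose text contains words that
# link to other documents. Names and bracket syntax are invented here.
documents = {
    "RPC": "Remote procedure calls are used by [FASTBUS] control software.",
    "FASTBUS": "A data-acquisition bus whose software calls [RPC] routines.",
}

def links(doc_name):
    """Return the names of the documents that doc_name links to."""
    return re.findall(r"\[([^\]]+)\]", documents[doc_name])

# Following a link is just a lookup: click a word, land on its document.
print(links("RPC"))       # documents reachable from "RPC"
print(links("FASTBUS"))   # and the link back
```

The essential point, which carried over into the Web, is that a link is nothing more than a name embedded in a document that the system knows how to resolve into another document.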

In between the birth of Enquire and the birth of the Web a decade later, the world had changed. The Internet, though still unknown to the public, was now firmly rooted. It was essentially a bare-bones infrastructure, a trellis of empty pipes. There were ways to retrieve data, but no really easy ways, and certainly nothing with the intuitive, neural structure of hypertext.

To Berners-Lee, one attraction of the Internet was that it encompassed not just CERN but CERN’s far-flung collaborators at labs around the world. “In 1989, I thought, look, it would be so much easier if everybody asking me questions all the time could just read my database, and it would be so much nicer if I could find out what these guys are doing by just jumping into a similar database of information for them.” In other words: give everyone the power to Enquire.

Berners-Lee wrote a proposal to link CERN's resources by hypertext. He noted that in principle, these resources could be text, graphics, video, anything–a "hypermedia" system–and that eventually the system could go global. "This initial document didn't go down well," says Berners-Lee. But he persisted and won the indulgence of his boss, who okayed the purchase of a NeXT computer. Sitting on Berners-Lee's desk, it would become the first Web content "server," the first node in this global brain. In collaboration with colleagues, Berners-Lee developed the three technical keystones of the Web: the language for encoding documents (HTML, hypertext markup language); the protocol for transferring documents (HTTP, hypertext transfer protocol); and the scheme for addressing documents (URL, universal resource locator).

The idea of a global hypertext system had been championed since the 1960s by a visionary named Ted Nelson, who had pursued it as the “Xanadu” project. But Nelson wanted Xanadu to make a profit, and this vastly complicated the system, which never got off the ground. Berners-Lee, in contrast, persuaded CERN to let go of intellectual property to get the Web airborne. A no-frills browser was put in the public domain–downloadable to all comers, who could use it, love it, send it to friends and even improve on it.

But what should he name his creation? Infomesh? No, that sounded like Infomess. The Information Mine? No, the acronym–TIM–would seem “egocentric.” How about World Wide Web, or “www” for short? Hmm. He discussed it with his wife and colleagues and was informed that it was “really stupid,” since “www” takes longer to say than “the World Wide Web.”

The breathtaking growth of the Web has been “an incredibly good feeling,” he says, and is “a lesson for all dreamers … that you can have a dream and it can come true.” But Berners-Lee’s story has more facets than simple triumph. It is in part a story about the road not taken–in this case the road to riches, which in 1992 he pondered taking, and which he still speaks of with seemingly mixed emotions. His is also a story about the difficulty of controlling our progeny, about the risky business of creating momentous things, unleashing epic social forces. For Berners-Lee isn’t altogether happy with how the World Wide Web has turned out.

He says he’d give the Web a B-plus, even an A-minus, that on balance it is a force for good. Yet an “accident of fate” has compromised its goodness. And that accident is intertwined with–perhaps, perversely, even caused by–his decision back in 1992 to take the road less traveled. The question that fascinates people who have heard of Berners-Lee–Why isn’t he rich?–may turn out to have the same answer as the question that fascinates him: Why isn’t the World Wide Web better than it is?


Berners-Lee is the unsung–or at least undersung–hero of the information age. Even by some of the less breathless accounts, the World Wide Web could prove as important as the printing press. That would make Berners-Lee comparable to, well, Gutenberg, more or less.

Berners-Lee spends time at M.I.T., where his nonprofit group, the World Wide Web Consortium, helps set technical standards for the Web, guarding its coherence against the potentially deranging forces of the market.

As director of the Web consortium, he brings together its members–Microsoft, Netscape, Sun, Apple, IBM and 155 others–and tries to broker agreement on technical standards even as the software underlying the Web rapidly evolves. His nightmare is a Web that "becomes more than one Web, so that you need 16 different browsers, depending on what you're looking at." He especially loathes those BEST VIEWED WITH ACME BROWSER signs on websites. Besides directing the World Wide Web Consortium, he is a senior research scientist at MIT and a professor at the University of Southampton. On 14 September 2008 he announced the creation of the World Wide Web Foundation.

Story compiled from History of Internet and WWW by Gregory R. Gromov; "The Man Who Invented the Web" by Robert Wright, Time, May 19, 1997, Vol. 149, No. 20; "Fascinating Facts About Tim Berners-Lee"; and the World Wide Web Foundation.
