Sir Tim Berners-Lee, the inventor of the World Wide Web, has won this year's A.M. Turing Award, frequently described as the "Nobel Prize of Computing," from the Association for Computing Machinery (ACM).
The Turing Award is named after Alan Mathison Turing, the British mathematician and computer scientist who was a key contributor to the Allied cryptanalysis of the German Enigma cipher and the German "Tunny" encoding machine in World War II.
The ACM announced the 2016 Turing Award on Tuesday. The award, which carries a $1 million prize, goes to Sir Berners-Lee, long known for inventing the World Wide Web, which began as a way for scientists to share information on the Internet.
Sir Berners-Lee's vision was to create a place where individuals could share information across the world through a "universal linked information system," in which a network of documents (web pages) linked to one another would help people navigate to exactly what they need.
And so was born the concept of the World Wide Web.
Sir Berners-Lee initially proposed the idea for a worldwide network of computers sharing information in 1989, while he was working at the European Organization for Nuclear Research (CERN) in Geneva, Switzerland.
The first World Wide Web (W3) software was written on a NeXT computer, made by the company Steve Jobs founded after leaving Apple in 1985.
The 61-year-old professor has not stopped working. Sir Berners-Lee is now a professor at MIT and Oxford and also remains the director of the World Wide Web Consortium (W3C) – the organization that sets technical standards for the development of the Web.
The Web has since become the world's most powerful medium for communications, knowledge, and trade — but that does not mean its creator is happy with all of the consequences.
Sir Berners-Lee has voiced several regrets about his invention, chief among them that the Internet has been transformed into the "world's largest surveillance network."
"I'm humbled to receive the namesake award of a computing pioneer who showed that what a programmer could do with a computer is limited only by the programmer themselves," Sir Berners-Lee said on receiving the award.
Sir Berners-Lee wrote the HyperText Transfer Protocol (HTTP), which outlined how data would travel between computers, as well as the HyperText Markup Language (HTML), which was used to create the world's first website – a site that can still be visited today, more than two decades after its creation.
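The two building blocks mentioned above are still remarkably simple. A minimal sketch in modern Python (not Berners-Lee's original NeXT code; the request path and sample page below are illustrative) shows that an HTTP request is just structured text, and that HTML expresses the "web of documents" through anchor links:

```python
from html.parser import HTMLParser

# An HTTP GET request is plain text sent over a network connection.
# (info.cern.ch hosts a copy of the first website; the path is illustrative.)
request = (
    "GET /hypertext/WWW/TheProject.html HTTP/1.0\r\n"
    "Host: info.cern.ch\r\n"
    "\r\n"
)

# HTML links documents together with <a href="..."> tags; collecting the
# href attributes recovers the "network of documents" the Web is built on.
class LinkExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href":
                    self.links.append(value)

# A hypothetical page with two outgoing links.
page = ('<html><body><a href="/History.html">History</a> and '
        '<a href="/People.html">People</a></body></html>')

parser = LinkExtractor()
parser.feed(page)
print(parser.links)  # ['/History.html', '/People.html']
```

Following the extracted links from page to page is exactly the navigation model described in the "universal linked information system" proposal.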
"It's an honor to receive an award like the Turing that has been bestowed on some of the most brilliant minds in the world."
Today, the Web "controls what people see, creates mechanisms for how people interact," Sir Berners-Lee said in a statement quoted in The New York Times. "It's been great, but spying, blocking sites, repurposing people's content, taking you to the wrong websites – that completely undermines the spirit of helping people create."
The Web model relies on central servers and IP addresses, which can be easily tracked or blocked. Therefore, Sir Berners-Lee is looking to decentralize the whole Web.
"The web is already decentralized," he said. "The problem is the dominance of one search engine, one big social network, one Twitter for microblogging. We do not have a technology problem; we have a social problem."
The idea is to eliminate middlemen entirely from all aspects of the Web. Still, not all of the major players agree with this decentralized approach, and whether the Internet needs decentralizing remains an open question.