Three Pioneers of Artificial Intelligence Win the Turing Award

SAN FRANCISCO – In 2004, Geoffrey Hinton doubled down on a technological idea called a neural network. It was a way for machines to see the world around them, recognize sounds and even understand natural language. But scientists had spent more than 50 years working on the concept of neural networks, and machines could not really do any of that.

Backed by the Canadian government, Dr. Hinton, a professor of computer science at the University of Toronto, organized a new research community with several scientists who were also working on the concept. They included Yann LeCun, a professor at New York University, and Yoshua Bengio at the University of Montreal.

On Wednesday, the Association for Computing Machinery, the world's largest society of computing professionals, announced that Drs. Hinton, LeCun and Bengio had won this year's Turing Award for their work on neural networks. The Turing Award, introduced in 1966, is often called the "Nobel Prize of computing" and includes a $1 million prize, which the three scientists will share.

Over the past decade, the big idea nurtured by these researchers has reinvented the way technology is built, accelerating the development of facial recognition services, digital assistants, warehouse robots and self-driving cars. Dr. Hinton is now at Google, and Dr. LeCun works for Facebook. Dr. Bengio has struck deals with IBM and Microsoft.


“What we’ve seen is nothing short of a paradigm shift in the science,” said Oren Etzioni, chief executive of the Allen Institute for Artificial Intelligence in Seattle and a prominent voice in the A.I. community. “History turned in their direction, and I am in awe.”


A neural network, loosely modeled on the web of neurons in the human brain, is a complex mathematical system that can learn discrete tasks by analyzing vast amounts of data. By analyzing thousands of old phone calls, for example, it can learn to recognize spoken words. This has allowed many artificial intelligence technologies to progress at a speed that was not possible before. Rather than coding behavior into systems by hand, one logical rule at a time, computer scientists can build technology that learns behavior largely on its own.

Dr. Hinton, 71, who was born in London, embraced the idea early, as a doctoral candidate in the 1970s, when most artificial intelligence researchers had turned against it. Even his own doctoral adviser questioned the choice.
