Self-learning AI: a step beyond limits
At January's World Economic Forum, better known as the Davos Forum, AI was cited as the biggest technological innovation since the internet.
While many ordinary people had thought of AI as a science fiction concept, like humanoid robots, the historic match in March between Google's AI system AlphaGo and world go champion Lee Se-dol, which ended in a landslide victory for the AI, captured the public's attention and showed what AI is capable of.
AI and machine learning
Since AI was first proposed by the late mathematician John McCarthy at the Dartmouth Conference in 1956, the concept has continued to crystallize. On the back of the introduction of powerful processors and new technologies such as system virtualization, faster networks and big data analysis, the advancement of AI technology has accelerated since 2015.
One of the biggest differences between AI and conventionally programmed systems is the adoption of machine learning, which gives computer systems an active self-learning capability through algorithms that analyze data.
According to the definition by U.S. computer scientist Arthur Samuel in 1959, machine learning is a technology that “gives computers the ability to learn without being explicitly programmed.”
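Samuel's definition can be illustrated with a minimal sketch (the data and the learning rate here are hypothetical, chosen only for illustration): rather than hard-coding a rule, the program adjusts a single parameter from example data until it discovers the rule on its own.

```python
# Minimal sketch of "learning without being explicitly programmed":
# the rule y = 2x is never written into the code; a single weight
# is adjusted from example data until the program discovers it.
examples = [(1, 2), (2, 4), (3, 6), (4, 8)]  # (input, correct output)

w = 0.0  # the learnable parameter, initially knowing nothing
for _ in range(200):            # repeated passes over the data
    for x, y in examples:
        prediction = w * x
        error = prediction - y
        w -= 0.01 * error * x   # nudge the weight to reduce the error

print(round(w, 2))  # the learned rule: multiply by about 2.0
```

The same principle scales up: modern systems adjust millions of such parameters from data instead of being told the rules.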
In its early days, machine learning research focused on algorithms such as decision tree learning, inductive logic programming, Bayesian networks, clustering and reinforcement learning. But none of them could produce a truly self-learning AI system, because each required programming work before the system could start learning.
But one early algorithm, the "artificial neural network," which was inspired by the biology of the human brain, has since evolved into "deep learning."
The deep learning algorithm, however, also faced obstacles: because it is designed to mimic the way the human brain processes inputs and outputs, it requires massive processing power. Only recently has it become practical, thanks to the development of more powerful graphics processors.
In 2012, researchers from Google and Stanford University, led by professor Andrew Ng, built a deep neural network with more than 1 billion connections using 16,000 computers. They tested it by feeding in 10 million images taken from YouTube and having the computer distinguish pictures of humans from those of cats. In the process, the computer taught itself the characteristic features of cats and used that information to make its judgments.
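The key point of that experiment is that the system was never told which images showed cats. A far simpler technique, k-means clustering, shows the same idea of finding structure in unlabeled data (the numbers below are hypothetical stand-ins for image features, not anything from the actual experiment):

```python
# Toy sketch of unsupervised learning: the program is never told which
# points belong to which group; it discovers two clusters on its own.
# Each number stands in for an image feature (hypothetical data).
points = [1.0, 1.2, 0.9, 1.1, 5.0, 5.3, 4.8, 5.1]

# k-means with k=2: start from two guessed centers, then alternate
# between assigning points to the nearest center and recomputing centers.
c1, c2 = 0.0, 6.0
for _ in range(10):
    g1 = [p for p in points if abs(p - c1) <= abs(p - c2)]
    g2 = [p for p in points if abs(p - c1) > abs(p - c2)]
    c1 = sum(g1) / len(g1)
    c2 = sum(g2) / len(g2)

print(round(c1, 2), round(c2, 2))  # two cluster centers found without labels
```

Deep neural networks discover far richer structure than two cluster centers, but the absence of labels is the common thread.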
“Machine learning is only now making the news because learning by machines used to happen at a slow pace,” Google senior AI researcher Greg Corrado, who took part in developing the company’s deep neural network, said during a technology forum in Seoul on Oct. 10. “But now the pace has become faster and the technology is being applied to diverse products.”
AI systems using deep learning algorithms already show image recognition capabilities that exceed those of humans. Such capabilities can be used in the medical sector to detect cancer cells and tumors in magnetic resonance imaging scans. The same mechanism underlies AlphaGo's ability to learn from records of go games.
DeepMind, a Google subsidiary that has made the AlphaGo system, is continuing to pioneer AI and machine learning technology development.
On Oct. 12, researchers led by DeepMind CEO Demis Hassabis announced a new machine learning neural network that boosts an AI system's reasoning capability by adding an external memory.
The new neural network technology, called the "differentiable neural computer" (DNC), allows an AI system to store the intermediate results it computes from its input. In one test, the researchers gave a DNC-based system a map of the London Underground, which has 11 lines and 270 stations, and had it calculate the fastest route between stations. Whereas other neural network systems achieved about 37 percent accuracy on the same test, the new system recorded 98.8 percent accuracy.
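For comparison, the route-finding task itself can be solved with a classical hand-coded algorithm such as breadth-first search; what made the DNC result notable is that the system learned the behavior from data instead of being given the algorithm. The sketch below shows the conventional approach on a hypothetical toy map, not the real Underground and not DeepMind's method:

```python
from collections import deque

# Classical route finding: breadth-first search on a toy subway map.
# The map and station names are hypothetical, for illustration only.
subway = {
    "A": ["B", "C"],
    "B": ["A", "D"],
    "C": ["A", "D"],
    "D": ["B", "C", "E"],
    "E": ["D"],
}

def fewest_stops(start, goal):
    # BFS explores stations in order of distance from the start, so the
    # first route that reaches the goal uses the fewest hops.
    queue = deque([[start]])
    seen = {start}
    while queue:
        route = queue.popleft()
        if route[-1] == goal:
            return route
        for nxt in subway[route[-1]]:
            if nxt not in seen:
                seen.add(nxt)
                queue.append(route + [nxt])

print(fewest_stops("A", "E"))  # → ['A', 'B', 'D', 'E']
```

A DNC, by contrast, is trained end to end and uses its read/write memory to reach similar answers without anyone coding a search procedure.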
“This is like humans taking notes and reading them afterwards when they need them,” said Lee Chang-ki, an associate professor of computer science at Kangwon National University.
Nvidia, the world’s top graphics processor maker, is pushing to combine its advanced processors with the deep learning algorithm for self-driving automobiles.
In August, the company unveiled a new mobile processor named “Parker” for next-generation self-driving cars.
Based on the Tegra mobile processor and the Pascal graphics architecture, Parker powers Nvidia's self-driving car module, the Drive PX 2, which runs deep learning applications. The company says the module can perform 24 trillion deep learning operations per second.
The Drive PX 2 system is currently used by more than 80 global carmakers, automotive parts developers and research institutions, including Volvo, to develop self-driving car technology.
In Korea, Naver is working to apply AI technologies such as deep learning, natural language recognition and voice synthesis to self-driving cars, robots and translation machines.
During its developer conference DEVIEW 2016 on Oct. 24, the nation’s top portal service provider introduced its AI-based conversation system AMICA, similar to Google Assistant and Amazon Alexa, as well as its translation application Papago and its new web browser Whale. It also pledged to continue researching self-driving and robotics technologies from a long-term perspective.
“We cannot avoid competition with global enterprises such as Google and Facebook because the internet does not have boundaries,” Naver’s chief Lee Hae-jin said during the conference, adding that the company plans to push for more aggressive investments in promising startups.
Naver CTO Song Chang-hyun said, “We will concentrate on AI-based technology development to compete with global enterprises. To this end, we will strengthen collaboration with partners and employ talented human resources both at home and abroad.”