[2009.02185] Naive Artificial Intelligence

In the cognitive sciences, it is common to distinguish between crystal intelligence, the ability to utilize knowledge acquired through past learning or experience, and fluid intelligence, the ability to solve novel problems without relying on prior knowledge. Under this distinction, extensively trained deep networks that can play chess or Go exhibit crystal but not fluid intelligence. In humans, fluid intelligence is typically studied and quantified using intelligence tests. Previous studies have shown that deep networks can solve some forms of intelligence tests, but only after extensive training. Here we present a computational model that solves intelligence tests without any prior training. This ability is based on continual inductive reasoning, and is implemented by deep unsupervised latent-prediction networks. Our work demonstrates the potential fluid intelligence of deep networks. Finally, we propose that the computational principles
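To make the idea of "continual inductive reasoning" via latent prediction concrete, here is a minimal, hypothetical sketch (not the paper's actual deep architecture): the model is fit from scratch on a single test item's panel sequence, learns a transition map that predicts the next panel, and then selects the answer candidate closest to its prediction. The function name, the linear transition map, and the toy "doubling" rule are all illustrative assumptions.

```python
import numpy as np

def solve_item(sequence, candidates, lr=0.1, epochs=500, seed=0):
    # Hypothetical per-item solver: no pretraining, the transition map W
    # is learned only from this item's sequence (a stand-in for the
    # paper's deep unsupervised latent-prediction network).
    rng = np.random.default_rng(seed)
    d = sequence.shape[1]
    W = rng.normal(scale=0.1, size=(d, d))  # transition map, fit per item
    X, Y = sequence[:-1], sequence[1:]      # predict panel t+1 from panel t
    for _ in range(epochs):
        # Gradient step on mean squared next-panel prediction error.
        grad = (X @ W.T - Y).T @ X / len(X)
        W -= lr * grad
    pred = W @ sequence[-1]                 # predicted continuation
    errs = [np.linalg.norm(c - pred) for c in candidates]
    return int(np.argmin(errs))             # answer closest to the prediction

# Toy item: the hidden rule doubles the panel's feature vector each step.
seq = np.array([[1., 2.],
                [2., 4.],
                [4., 8.]])
answers = np.array([[5., 9.],
                    [8., 16.],   # continues the doubling rule
                    [4., 8.]])
print(solve_item(seq, answers))
```

The point of the sketch is that the rule is induced from the item itself at test time, which is what distinguishes this setup from the extensively pretrained networks mentioned above.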

Date: 2020/09/12 14:21
