Artificial intelligence may seem purely man-made, but nature offers its own examples of sophisticated information processing, like a bat using echolocation to sense its environment. In the tech world, artificial intelligence is popping up everywhere, and cognitive neuroscientists are now using it to better understand the human brain.
"The fundamental questions cognitive neuroscientists and computer scientists seek to answer are similar," says Aude Oliva of MIT. "They have a complex system made of components — for one, it's called neurons and for the other, it's called units — and we are doing experiments to try to determine what those components calculate."
In Oliva’s work, researchers use artificial intelligence to study how contextual cues shape human image recognition. The team probes “artificial neurons” in neural network models to work out what a human brain does in order to recognize a place or object.
"The brain is a deep and complex neural network," says Nikolaus Kriegeskorte of Columbia University, who is chairing the symposium. "Neural network models are brain-inspired models that are now state-of-the-art in many artificial intelligence applications, such as computer vision."
In one study, which drew on more than 10 million images, Oliva’s team trained an artificial network to recognize 350 kinds of places, such as kitchens, bedrooms, and parks. Before the study began, they expected the network to learn the objects found within those places. Instead, it learned to recognize the people and animals that appear in the places it was trying to identify.
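The kind of probing described above can be illustrated with a toy sketch. This is not the study's actual model: the feature detectors, place categories, and training examples below are all hypothetical, and a one-layer softmax classifier stands in for a deep network. The point is only to show how, after training, inspecting a place unit's weights reveals which cues it actually relies on, such as an "animal" detector driving the "park" unit.

```python
import math

# Hypothetical detector features and place categories (illustration only).
FEATURES = ["person", "animal", "bed", "stove"]
PLACES = ["kitchen", "bedroom", "park"]

# Hypothetical training data: (feature vector, place label).
DATA = [
    ([0, 0, 0, 1], "kitchen"),
    ([1, 0, 0, 1], "kitchen"),
    ([0, 0, 1, 0], "bedroom"),
    ([1, 0, 1, 0], "bedroom"),
    ([1, 1, 0, 0], "park"),
    ([0, 1, 0, 0], "park"),
]

def softmax(zs):
    m = max(zs)
    exps = [math.exp(z - m) for z in zs]
    s = sum(exps)
    return [e / s for e in exps]

def train(data, epochs=200, lr=0.5):
    # One weight vector per place category, plus a bias input.
    w = {p: [0.0] * (len(FEATURES) + 1) for p in PLACES}
    for _ in range(epochs):
        for x, label in data:
            xb = x + [1]
            probs = softmax([sum(wi * xi for wi, xi in zip(w[p], xb))
                             for p in PLACES])
            for p, prob in zip(PLACES, probs):
                # Cross-entropy gradient for a softmax output unit.
                grad = prob - (1.0 if p == label else 0.0)
                w[p] = [wi - lr * grad * xi for wi, xi in zip(w[p], xb)]
    return w

def predict(w, x):
    xb = x + [1]
    scores = {p: sum(wi * xi for wi, xi in zip(w[p], xb)) for p in PLACES}
    return max(scores, key=scores.get)

w = train(DATA)
# An "animal" feature alone is enough to trigger the park unit --
# the weights show which cues each place category depends on.
print(predict(w, [0, 1, 0, 0]))
```

After training, printing `w["park"]` shows a large positive weight on the "animal" feature, which is the toy analogue of discovering that a scene network leans on people and animals rather than on the expected objects.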
Machine learning programs like this one are quick learners: given copious amounts of data, they can even outperform humans on such recognition tasks, and that scale lets researchers analyze contextual learning at a fine grain. Studying these networks also permits a kind of experiment that would be impossible to run on real neurons. Artificial neural networks are “mini-brains that can be studied, changed, evaluated, compared against responses given by human neural networks, so the cognitive neuroscientists have some sort of sketch of how a real brain may function,” according to the researchers.
"This involves millions of signals emanating from the retina, that sweep through a sequence of layers of neurons, extracting semantic information, for example, that we're looking at a street scene with several people and a dog," said Kriegeskorte. "Current neural network models can perform this kind of task using only computations that biological neurons can perform. Moreover, these neural network models can predict to some extent how a neuron deep in the brain will respond to any image."
Scientists have only recently begun applying computer science to research on the human brain, but the approach is quickly gaining ground.
"Human cognitive and computational neuroscience is a fast-growing area of research, and knowledge about how the human brain is able to see, hear, feel, think, remember, and predict is mandatory to develop better diagnostic tools, to repair the brain, and to make sure it develops well,” said Oliva.
Oliva and Kriegeskorte are presenting their research at the CNS meeting in Boston from March 24th-27th, 2018.