From Apple's Siri to Honda's robot ASIMO, communication between machines and humans seems to be getting ever more familiar. However, some neuroscientists warn that current computers do not really understand what we mean, because they do not take into account the context in which human conversation takes place.
Arjen Stolk, a postdoctoral researcher at the University of California, Berkeley, says machines have not learned to understand people, places, and situations, which often involve a long social history and are essential to human communication. Without such a common foundation, computers will inevitably be confused.
“People tend to think of communication as an exchange of linguistic signs or gestures, forgetting that much of communication is about the social context and about the person you are communicating with,” Stolk said.
Take the word "bank": if you are holding a credit card, it is naturally read as a financial institution, but if you are holding a fishing rod, it means the bank of a river. Another example: absent any background context, making a V with two fingers may mean victory, the number two, or "these are the two fingers I hurt."
“All these subtle differences are critical to mutual understanding,” Stolk said. In the process of communication, they matter even more than the words and symbols that computers, and many neuroscientists, focus on. “In fact, we can still understand each other without any words or symbols at all.”
Effective communication between infants and parents rests primarily on gestures and on the common ground they build up over a short period of time. Stolk believes that engineers and scientists should pay more attention to this background of shared understanding. His claim stems from experimental evidence: brain scans of people as they reach non-verbal mutual understanding.
Some of Stolk's studies suggest that a breakdown of mutual understanding may underlie certain social disorders, such as autism. Stolk and colleagues discussed the importance of conceptual alignment for mutual understanding in an article published in the January 11 issue of Trends in Cognitive Sciences.
“Understanding how people communicate without language provides a new theoretical and empirical basis for understanding normal social communication, and opens a new window onto understanding and treating neurological social disorders and neurodevelopmental disorders,” said Robert Knight, a professor of psychology at UC Berkeley's Helen Wills Neuroscience Institute and a professor of neurology and neurosurgery at the University of California, San Francisco.
To explore how the brain achieves mutual understanding, Stolk created a game that required two players to communicate the rules to each other solely through in-game movements, without seeing each other, which eliminated the influence of language and gestures.
While the two players communicated non-verbally through a computer, Stolk used functional magnetic resonance imaging (fMRI) to scan their brains. It turned out that as the players tried to convey the rules of the game, the same brain region, the right temporal lobe, whose function is still poorly understood, became active. More importantly, the right temporal lobe maintained a stable baseline level of activity throughout the game, but when a player suddenly understood what the other player was trying to communicate, its activity spiked. The right hemisphere of the brain is more involved in abstract thinking and social interaction than the left hemisphere.
"When you establish a shared understanding of something with someone, activity in these right temporal regions increases, but not when you are merely sending a signal," Stolk said. “The more deeply the players understand each other, the more active this area becomes.”
This suggests that the two players build a similar conceptual framework in the same brain region, continuously testing each other to ensure their concepts are aligned, and updating only when new information changes that shared understanding. The findings were published in the Proceedings of the National Academy of Sciences in 2014.
Robots and computers, by contrast, conduct conversations based on statistical analysis of a word's meaning. If you usually use the word "bank" to mean the place where you withdraw money, that sense will be assumed in conversation, even when the conversation at hand is about fishing.
"Apple's Siri focuses on statistical regularities, but communication is not only about statistical regularities. There is some truth to them, but that is not how the brain works. To communicate effectively with humans, computers need a cognitive architecture that continuously captures and updates the conceptual space shared with their conversation partner." Such a dynamic conceptual framework, Stolk argues, could help computers deal with the inherently ambiguous signals that humans generate.
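The contrast the article draws between purely statistical sense-picking and context-sensitive understanding can be illustrated with a toy sketch. Everything here is invented for illustration: the sense frequencies, the cue words, and the function names are assumptions, not a real disambiguation system.

```python
# Toy illustration: a purely statistical model always picks the most
# frequent sense of "bank", while a context-aware model revises its
# choice based on the conversation. All counts and cue words are made up.

# Hypothetical usage counts: this speaker usually means the financial sense.
SENSE_COUNTS = {"financial institution": 95, "river bank": 5}

def statistical_sense(word_counts):
    """Pick the most frequent sense, ignoring context entirely."""
    return max(word_counts, key=word_counts.get)

def contextual_sense(word_counts, context_words):
    """Prefer a sense whose cue words appear in the conversation."""
    cues = {
        "financial institution": {"credit", "card", "money", "withdraw"},
        "river bank": {"fishing", "rod", "river", "water"},
    }
    for sense, cue_words in cues.items():
        if cue_words & set(context_words):
            return sense
    return statistical_sense(word_counts)  # fall back to raw frequency

conversation = ["I", "took", "my", "fishing", "rod", "down", "to", "the", "bank"]

print(statistical_sense(SENSE_COUNTS))               # financial institution
print(contextual_sense(SENSE_COUNTS, conversation))  # river bank
```

The statistical model gets this sentence wrong because it only knows what the speaker usually means; the contextual variant, crude as it is, gestures at the kind of shared, situation-dependent conceptual space Stolk describes.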
Stolk's research has identified other brain areas critical to achieving mutual understanding. In a 2014 study, he used brain stimulation to disrupt the posterior temporal lobe and found that it plays an important role in integrating incoming signals with knowledge from previous interactions. A subsequent study found that patients with frontal-lobe damage (to the medial prefrontal cortex) no longer matched incoming signals to stored knowledge. Both findings help explain why such patients appear awkward in social interactions.
"Most cognitive neuroscientists focus on the signals themselves, on words, gestures, and their statistical relationships, while ignoring the underlying conceptual abilities we use in communication and the flexibility and variability of everyday life," Stolk said. "Language is useful, but it is just a tool for communication. If you focus only on language, you may miss the underlying mechanisms by which our brains help us communicate."