Science has sought objective ways of recording thought processes since the early 20th century, when the German psychiatrist Hans Berger invented the electroencephalogram (EEG), a method for recording the brain's electrical activity directly from the scalp. Initially, however, the EEG was used for diagnosing brain diseases. In the 1990s, scientists began using EEG to decode human movement intentions and the focus of attention on objects within the visual field. More recently, researchers have learned to reconstruct the frames of a film shown to a subject on a screen by analyzing the distribution of blood flow in the brain. These methods do not allow us to read a person's thoughts, but the results obtained can be considered the first encouraging steps toward instrumental reading of at least hints of specific thoughts.
The question "How does one read people's thoughts?" may seem deceptively simple. Nothing could be easier, one might say, if one knows how to read. Thoughts are a person's messages, formulated in a particular language and expressed through various means. The captivating love story of Prince Siegfried and Odette, for example, is told through music and dance. "I think, therefore I am" is a thought of René Descartes. It can be interpreted in different ways: some argue that Descartes proposed it in search of a primary truth that cannot be doubted; others believe it indicates that our subjective world exists only insofar as it is expressed in thoughts.
Reading a thought is not as simple as it seems, because thought is twofold in its form of representation: on the one hand, there is the thought as it is still being conceived in the author's mind; on the other, the thought already spoken, written, or conveyed through gestures, drawings, or dance, that is, transformed into something perceivable by another person's senses and interpreted by them to the best of their understanding.
A thought, born somewhere deep in the brain at the intersection of memory traces and creative associations, undergoes at least two transformations on its way to communicative form. First, the author selects the means of expressing the thought, such as words; then the listener filters those words through their own understanding. Choosing the right words to convey a thought accurately is a challenging task, as is grasping the meaning behind them. This differs from coding, in which one sign system is unambiguously transformed into another. Thus, Descartes' famous phrase only roughly conveys his thought. How, then, can one truly understand the interlocutor's original idea? Any clarification is merely a new retelling of it in different words.
The question arises of how to read a person's thoughts directly from the brain, from those folds where thought originates. In the 1920s, methods for recording the brain's electrical activity (EEG) directly from the scalp were invented. For a long time, these methods were used only to diagnose brain diseases. In recent years, however, with the advent of powerful computing technology and new mathematical algorithms, it has become possible to go beyond diagnostics and attempt to decode movement intentions from the EEG, such as the intention to move the right hand or a leg. And movement intentions can be considered the inception of a thought.
In 2012, American researchers Andrew Schwartz and John Donoghue, and in 2015 their colleague Richard Andersen, working in different laboratories, managed to gain access to a broader set of movement intentions in paralyzed patients using arrays of 100 to 200 electrodes implanted directly into the cerebral cortex. This neurointerface transmits the person's intentions directly to the motors of a robotic arm, which can be controlled skillfully enough to pick up a drink or a chocolate bar from a table and bring it to the mouth.
In essence, this is direct access to the formation of movement-related thoughts. Neurocomputer interfaces, though with great difficulty and after months of training, can adapt to capture movement intentions directly from the brain. However, movement intentions are not thoughts in the usual sense; they are commands to perform motor actions, in which brain signals are directed not at an interlocutor but at muscle groups. Such commands require no intellectual understanding; the muscles respond simply by contracting. Moreover, only the most straightforward movement intentions can be decoded: "I want to move my left hand, my right hand, or my legs." Right and left are often barely distinguishable in this context, and decoding the intention to move individual fingers is more difficult still.
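The kind of decoding described above can be illustrated with a toy sketch. The following is purely hypothetical and not any laboratory's actual pipeline: it fakes two "band-power" features (motor imagery is known to suppress the mu rhythm over the hemisphere opposite the imagined hand) and separates "left" from "right" intentions with a nearest-centroid classifier. All data, channel names, and parameters are invented for illustration.

```python
import random

random.seed(0)

def synth_trial(side):
    # Imagining a movement suppresses the mu rhythm (~8-12 Hz) over the
    # contralateral motor cortex; we fake band-power at two channels,
    # C3 (left hemisphere) and C4 (right hemisphere).
    c3 = random.gauss(0.5 if side == "right" else 1.0, 0.1)
    c4 = random.gauss(0.5 if side == "left" else 1.0, 0.1)
    return (c3, c4)

def centroid(trials):
    n = len(trials)
    return tuple(sum(t[i] for t in trials) / n for i in range(2))

def classify(trial, centroids):
    # Assign the trial to the class whose mean feature vector is closest.
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda side: dist(trial, centroids[side]))

# "Training": estimate one centroid per imagined side from 50 trials each.
train = {side: [synth_trial(side) for _ in range(50)] for side in ("left", "right")}
centroids = {side: centroid(trials) for side, trials in train.items()}

# "Testing": classify 40 fresh synthetic trials.
test_trials = [(side, synth_trial(side)) for side in ("left", "right") for _ in range(20)]
correct = sum(classify(t, centroids) == side for side, t in test_trials)
print(f"accuracy: {correct}/{len(test_trials)}")
```

On such cleanly separated synthetic data the classifier is nearly perfect, which is exactly why the sketch is misleading about real EEG: genuine recordings are noisy, overlapping, and limited to a handful of coarse classes, as the text notes.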
Of course, dozens of psychophysiological laboratories have attempted to decode thoughts that go beyond hands and feet: for example, those that arise when a person mentally imagines events or objects in the external world, such as a night, a street, a lamp, or a pharmacy. So far, these attempts have been unsuccessful.
Can thoughts be deciphered by recording the brain's electrical and metabolic processes? In 2011, neuroscientists Jack Gallant and Shinji Nishimoto, with colleagues at the University of California, Berkeley, managed to reconstruct the approximate contours of images from a film shown to a subject by analyzing the distribution of blood flow in the brain. However, the visual pictures synthesized by their technology are generated by averaging many frames from previously viewed films, selected for the similarity of their blood flow distribution maps. If, for example, a new film contains a close-up of a person's face, the synthesized image is assembled from the existing catalogue by matching each frame's blood flow map against the current one. The authors themselves state that this is not mind-reading but a technical procedure for approximately predicting how illumination is distributed across the subject's visual field at a given moment.
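The matching-and-averaging step described above can be sketched in a few lines. This is a deliberately tiny illustration, not the Berkeley group's actual method: the "blood-flow maps" are 4-number toy vectors, the "frames" are 2x2 grayscale grids, and similarity is plain cosine similarity, all invented for the example.

```python
import math

def cosine(a, b):
    # Cosine similarity between two voxel activity vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

# Catalogue of (blood-flow map, frame image) pairs from previously viewed
# films; values are entirely made up.
catalogue = [
    ((1.0, 0.1, 0.1, 0.9), [[255, 0], [0, 255]]),   # e.g. a face close-up
    ((0.9, 0.2, 0.1, 1.0), [[250, 10], [5, 240]]),  # a similar face
    ((0.1, 1.0, 0.9, 0.1), [[0, 255], [255, 0]]),   # e.g. a landscape
]

def reconstruct(current_map, k=2):
    # Rank catalogue entries by similarity to the current map, then
    # average the images of the k best matches, pixel by pixel.
    ranked = sorted(catalogue, key=lambda e: cosine(current_map, e[0]), reverse=True)
    top = [img for _, img in ranked[:k]]
    return [[sum(img[r][c] for img in top) / k for c in range(2)] for r in range(2)]

frame = reconstruct((0.95, 0.15, 0.1, 0.95))
print(frame)  # the two "face" entries match best, so their pixels are averaged
```

The point the sketch makes concrete is the article's caveat: the output is a blend of already-seen pictures chosen by physiological similarity, not an image decoded from the thought itself.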
Instrumental mind-reading by direct connection to the brain faces at least two significant obstacles. First, there is the extreme poverty of the connection channel itself: no matter how many electrodes are present, 100, 200, or 2,000, it will always be unimaginably few compared to the astronomical complexity of the human brain's neural network, with its million billion connections, which is the substrate of the neurodynamic codes of mental acts. Second, mental representations themselves are not as simple as they may appear at first glance. A simple word like "Moscow" is merely the name of a city. But recall the poet's words: "…how much in this sound / Has merged for the Russian heart! / How much echoes in it!"
How can this "much," uniquely expressed by each person, be read in the fluctuations of the brain's electrical activity or in maps of blood flow distribution? One might object that a task difficult today will be routine in 10 to 20 years: after all, 40 years ago no one knew about laptops, and 20 years ago a mobile phone "for everyone" was unimaginable. But these analogies hardly apply. In the case of the human brain, we are dealing with a natural object whose complexity is comparable only to that of the universe. Reading thoughts instrumentally may one day be possible, but it is a task that requires a complete understanding of the brain, and we still know little about how the human brain creates psychic reality from the myriad synthetic images, emotional experiences, aspirations, and desires intricately woven into each thought.