David, a researcher at the University of California, and Lihe Liu, from the University of Florida, have made a breakthrough in cognitive neuroscience research on anticipatory attention. In a study published in the Journal of Neuroscience, they used EEG technology and artificial intelligence to examine how the visual cortex and the parietal system work together. Their findings open new avenues in the study of visual perception and could prove valuable for people with ADHD or autism. Read on to find out more.
The mind’s anticipatory capacity
Imagine you are scanning the sky for a drone or a bird. You don’t see anything yet, but your mind is already prepared. This ability to ready ourselves before a visual stimulus appears is called anticipatory attention. It is the brain’s way of getting ahead of events, deciding in advance what information is worth looking at and what should be ignored. Although it seems automatic, behind this process is a complex system that coordinates different areas of the brain as if they were instruments in an orchestra.
For years, scientists have tried to determine whether the brain first tunes its attention to a general type of feature—for example, “movement” or “color”—or directly to a more specific detail, such as “blue” or “upward.” Answering this is no academic whim: it means understanding how the brain filters the chaos of the visual world to give us an orderly and coherent perception. In fractions of a second, our visual system decides what deserves to pass through the filter of consciousness.