In the brain, visual information about an ever-changing outside world is parcelled off to dedicated mechanisms in specialized regions, each responsible for processing specific visual features such as colour, form, and motion. To further complicate matters, different visual features are processed at different speeds. The brain therefore not only has to figure out what belongs to what, it also has to keep track of what belongs when. How does the brain prevent visual information from different time-points from blurring together?
My research focuses on these temporal (time-related) aspects of vision. This encompasses both the timing of the perceptual processes (time in the brain: e.g. how quickly can the brain process certain features?) and the perceived timing of visual events (time in the mind: e.g. did the blue dot seem to appear before or after the appearance of the red dot?). My earlier work showed that, counterintuitively, time in the brain is by no means directly coupled to time in the mind.
Figure: The visual brain chops up the continuous stream of visual input into discrete frames, just like the shutter mechanism of a traditional video camera.
What if the visual system were to process the continuous stream of incoming visual information in successive snapshots, like the sequential frames of a video camera? Such a processing architecture would facilitate the binding of visual features belonging to the same time-point, since they would be part of the same ‘frame’. Although this notion of discrete perceptual episodes seems at odds with our day-to-day experience of continuous visual awareness, recent psychophysical and electrophysiological evidence suggests that some basic visual processes are in fact periodic. In particular, it has been suggested that attention might be the mechanism that grabs single frames from the incoming stream of information for further processing, in a similar way to the shutter mechanism in a video camera. This attentional sampling takes place rhythmically, with periods of high attentional availability alternating with periods in which attention is unavailable. In addition, oscillatory brain activity has been linked to periodic fluctuations of awareness, indicating that periodic neural mechanisms do indeed result in discrete episodes in perceptual processing.
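The rhythmic-sampling idea can be made concrete with a toy simulation: if detection depends on the phase of an attentional rhythm at the moment a target appears, then hit rate binned by that phase should trace out the rhythm. This is a minimal illustrative sketch, not a model from the literature; the ~8 Hz rate, the 0.3–0.9 detection range, and the sinusoidal form are all assumptions made purely for the example.

```python
# Toy simulation of rhythmic attentional sampling (illustrative only).
# Assumption: detection probability oscillates with the phase of an
# ~8 Hz attentional rhythm at the moment the target appears.
import numpy as np

rng = np.random.default_rng(1)
freq = 8.0                            # assumed attentional sampling rate (Hz)
onsets = rng.uniform(0, 1, 10_000)    # random target onset times (s)

# Detection probability swings between 0.3 and 0.9 with rhythm phase.
phase = 2 * np.pi * freq * onsets
p_detect = 0.6 + 0.3 * np.cos(phase)
detected = rng.uniform(size=onsets.size) < p_detect

# Bin hit rate by phase: performance rises and falls with the rhythm,
# even though onsets themselves are uniformly distributed in time.
edges = np.linspace(0, 2 * np.pi, 9)
bins = np.digitize(phase % (2 * np.pi), edges) - 1
hit_rate = np.array([detected[bins == b].mean() for b in range(8)])
```

In an actual experiment the underlying rhythm is not observable, so studies of this kind instead bin detection performance by target onset time relative to a reset event (such as a cue) and test for periodicity in the resulting time course.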
Figure: Different ways of visualizing how neural representations evolve over time, using EEG pattern classification.
I use psychophysical/behavioural approaches as well as EEG (electroencephalography) to investigate this counterintuitive but parsimonious possibility. Most recently, I have borrowed a powerful pattern-classification technique from fMRI research and applied it to neuroimaging methods with better temporal resolution (MEG and EEG), showing that it gives unique insight into the evolution of neural representations over time (see these three papers for examples: Carlson et al 2011 – Journal of Vision, Hogendoorn 2015 – Perception, and Hogendoorn et al 2015 – Vision Research). In my current research, I am applying this technique to other aspects of how the brain processes visual information over time.
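The core of this time-resolved pattern-classification approach can be sketched on simulated data: train a classifier on the multichannel signal at each time point separately, and watch decoding accuracy rise above chance while a stimulus-specific neural pattern is present. This is a minimal sketch, not the published pipeline; the simulated data, the `decode_over_time` helper, and all parameters are invented for the example, and scikit-learn is assumed to be available.

```python
# Hedged sketch of time-resolved "decoding" on simulated EEG-like data.
# All numbers (trial counts, channels, signal window) are illustrative.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

n_trials, n_channels, n_times = 200, 32, 50
X = rng.normal(size=(n_trials, n_channels, n_times))  # trials x channels x time
y = rng.integers(0, 2, n_trials)                      # two stimulus classes

# Inject a class-specific spatial pattern between samples 20 and 35,
# mimicking a stimulus-evoked neural representation.
pattern = rng.normal(size=n_channels)
X[y == 1, :, 20:35] += pattern[None, :, None] * 0.5

def decode_over_time(X, y, cv=5):
    """Cross-validated classification accuracy at each time point."""
    scores = np.empty(X.shape[2])
    for t in range(X.shape[2]):
        clf = LogisticRegression(max_iter=1000)
        scores[t] = cross_val_score(clf, X[:, :, t], y, cv=cv).mean()
    return scores

scores = decode_over_time(X, y)
# Accuracy hovers near chance (0.5) before the pattern onset and rises
# above chance while the class-specific pattern is present.
```

The resulting accuracy time course shows when stimulus information becomes available in the signal, which is what makes the approach attractive for MEG/EEG: the classifier inherits the millisecond-scale temporal resolution of the recording.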