
Objects and faces fast-tracked in the brain to pick out the familiar or threatening

How does the brain convert sights and sounds into information that makes sense to its neurons? A new study led by York psychology Professor Kari Hoffman found that objects and faces are fast-tracked in the brain, making it faster and easier to pick out a familiar face or a perceived threat.

The study, published in the November issue of the journal Proceedings of the National Academy of Sciences of the United States of America, tested how the brain codes objects in the environment – one of the key jobs in visual recognition. The brain takes all the visual information, and codes and categorizes it using precisely synchronized electrical pulses. “Something in this electrical activity lets us know what we’re looking at,” says Hoffman of York’s Faculty of Health. “But we don’t know how this coding is done.”

Insects and other animals use phase coding, in which neural pulses align to a rhythm inside the brain, and the way the pulses align depends on the object being viewed. It’s a bit like the way telecommunication signals are multiplexed: the brain uses precisely timed pulses, with objects of interest coded through differences in the timing of the pulses, not just their presence or number. The precise alignment of pulses to different parts of the rhythm allows the brain to make sense of the stimuli.
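To make the multiplexing analogy concrete, here is a minimal, hypothetical sketch in Python (illustrative only, not the study’s analysis code). It assumes an 8 Hz rhythm and two made-up stimulus categories, “face” and “object”, that drive a simulated neuron to fire the same number of spikes but at different preferred phases of the rhythm; a decoder that reads the phase of the spikes, rather than their count, can then tell the categories apart. Every name, value and function below is an assumption chosen for illustration.

import numpy as np

# Toy phase-coding demo (hypothetical values throughout).
rng = np.random.default_rng(0)
RHYTHM_HZ = 8.0                                  # assumed brain-rhythm frequency
PREFERRED_PHASE = {"face": 0.25 * 2 * np.pi,     # spikes near the rhythm's "upbeat"
                   "object": 0.75 * 2 * np.pi}   # spikes near the rhythm's "downbeat"

def simulate_trial(stimulus, duration_s=1.0, n_spikes=10, jitter_rad=0.4):
    # Both stimuli evoke the same number of spikes, so spike count alone is
    # uninformative; only the phase at which the spikes land differs.
    cycle = 1.0 / RHYTHM_HZ
    cycle_starts = rng.integers(0, int(duration_s / cycle), size=n_spikes) * cycle
    phases = PREFERRED_PHASE[stimulus] + rng.normal(0.0, jitter_rad, n_spikes)
    return cycle_starts + (phases % (2 * np.pi)) / (2 * np.pi) * cycle

def circular_distance(a, b):
    # Smallest angular distance between two phases on the circle.
    return np.abs(np.angle(np.exp(1j * (a - b))))

def decode(spike_times):
    # Estimate the mean spike phase relative to the rhythm, then pick the
    # stimulus whose preferred phase is closest.
    phases = (spike_times * RHYTHM_HZ % 1.0) * 2 * np.pi
    mean_phase = np.angle(np.mean(np.exp(1j * phases))) % (2 * np.pi)
    return min(PREFERRED_PHASE, key=lambda s: circular_distance(mean_phase, PREFERRED_PHASE[s]))

trials = [(label, simulate_trial(label)) for label in ("face", "object") for _ in range(50)]
accuracy = float(np.mean([decode(times) == label for label, times in trials]))
print(f"Decoding from spike phase alone: {accuracy:.0%} correct")

In this toy setup, counting spikes gives chance performance because both categories fire equally often; only the timing of the spikes relative to the rhythm separates them, which is the essence of the phase-coding idea described above.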

What Hoffman and her team – Hjalmar K. Turesson (Hoffman’s student) from the Center for Molecular and Behavioral Neuroscience at Rutgers University and Nikos K. Logothetis from the Max Planck Institute for Biological Cybernetics in Tuebingen, Germany – found was that this same phase coding is also used in recognizing faces and objects.

“This is the first study to show that faces/objects are also coded in this way, using phase coding,” says Hoffman.

Think of it like a symphony, she says. “Cymbals are heard on the upbeat, whereas drums are heard on the downbeat in a rhythm, and these two instruments can be separated out; we hear them as separate sounds.” When faces and objects are seen, they create pulses in distinct phases of the brain’s rhythms, and everything works in synchrony with that rhythm, like a symphony.

However, “if the cymbal doesn’t go off (fire) at the right time, the instruments (signals) get mixed up and instead of a symphony, you get cacophony,” says Hoffman. “Some of the implications can be profound when the brain doesn’t do what it’s supposed to.”

If the timing of these signals is off, it throws off the brain’s ability to code them. This object perception and categorization helps people respond appropriately to faces, something some people with Autism Spectrum Disorder (ASD) have difficulty doing. Something may be going awry with the super-fast, long-range connections in the brain, disrupting the timing of the ‘symphony’ and, consequently, interfering with the coding.

Hoffman points to the Internet as a loose analogy. When it’s having problems or there is a bottleneck of information, there are often delays in processing. This understanding could also be beneficial for people with epilepsy, where the synchrony goes in the wrong direction.

Future research could look into whether the brains of people with ASD could compensate for having weak long-range connections in certain cell groups, which are responsible for decision-making, language and social interactions, by synchronizing them to a rhythm in another part of the brain.

The research was supported by the Max Planck Society, the Natural Sciences and Engineering Research Council of Canada and the Alfred P. Sloan Foundation.
