You know those people who always run into people they know? Like when they’re buying groceries, taking the kids to school, trying on jackets? They must be really popular, right?
Well maybe not.
Maybe they just have a talent for spotting and recognizing faces.
Because there are also those who couldn’t recognize Angelina Jolie if she got a perm.
Well, research published in this month’s Current Biology finds that the ability to recognize faces is most likely heritable, and the research also supports the so-called modular view of the brain.
This is the view that the brain acts more like a toolbox with different specialties, as opposed to a generalized piece of equipment. The generalist view, by contrast, is what the idea of IQ is based on: an aptitude for math probably means an aptitude for reading, too.
The researchers had 102 pairs of identical twins and 71 pairs of fraternal twins look at 20 different images of faces for about a second each. Then 10 of the images were mixed with 20 new face images. Subjects had to say which ones they’d already seen.
And scores matched significantly more closely between identical twins than between fraternal twins. The researchers controlled for differences in vision, general object recognition and memory.
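The logic behind that comparison can be made concrete. In classic twin studies, heritability is often estimated with Falconer's formula: twice the difference between the identical-twin and fraternal-twin score correlations. Here's a minimal sketch using that standard method; the correlation values are hypothetical, not figures from this study.

```python
def falconer_heritability(r_mz: float, r_dz: float) -> float:
    """Falconer's estimate of heritability from twin correlations.

    r_mz: correlation of test scores between identical (monozygotic) twins
    r_dz: correlation of test scores between fraternal (dizygotic) twins
    Identical twins share ~100% of their genes, fraternal twins ~50%,
    so doubling the correlation gap estimates the genetic contribution.
    """
    return 2.0 * (r_mz - r_dz)

# Hypothetical numbers for illustration: identical twins' face-recognition
# scores correlate more strongly than fraternal twins' scores.
h2 = falconer_heritability(r_mz=0.70, r_dz=0.29)
print(round(h2, 2))  # 0.82
```

The bigger the gap between the two correlations, the larger the estimated genetic share of the variation in face-recognition ability.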
In a separate study with 320 participants, the researchers also found that facial recognition ability was independent of IQ. According to the researchers, these findings support the view that there are specialist genes that specifically affect a talent for face recognition.
They write that this heritability of specific cognitive talents suggests that some genes have specific cognitive effects, and that this line of research might offer insight into disorders such as dyslexia or Williams syndrome.