
From the Research Digest, September 2016

Including how the brain deals with blinks, schmoozing and more.


A selection from our Research Digest blog.

Psychologists still don't know how the brain deals with blinks
In Journal of Experimental Psychology: Human Perception and Performance

If you were sitting in a dark room and the lights flickered off every few seconds, you'd definitely notice. Yet when your blinks make the world go momentarily dark – and bear in mind most of us perform around 12 to 15 of these every minute – you are mostly oblivious. It certainly doesn't feel like someone is flicking the lights on and off. How can this be?

A new study in Journal of Experimental Psychology: Human Perception and Performance has tested two possibilities – one is that after each blink your brain 'backdates' the visual world by the duration of the blink (just as it does for saccadic eye movements, giving rise to the stopped clock illusion); the other is that it 'fills in' the blanks created by blinks using a kind of perceptual memory of the visual scene. Neither explanation was supported by the findings, which means that the illusion of visual continuity that we experience through our blinks remains a mystery.

One experiment involved students making several judgements about how long a letter 'A' was presented on a computer screen (the actual durations were between 200ms and 1600ms; 1000ms equals 1 second). Sometimes the 'A' appeared at the beginning or end of a voluntary eye blink, other times it appeared during a period when the participant did not blink. If we backdate visual events that occur during blinks, then the 'A's that appeared at the beginning or end of a blink should have been backdated to the onset of the blink, giving the illusion that they'd been presented longer than they actually had, as compared with 'A's that appeared when there was no blink. In fact, the researchers found no evidence that the students overestimated the duration of 'A's that appeared during blinks.

Another experiment involved students making a voluntary blink while a letter 'A' was already onscreen and making a judgement of how long the 'A' was visible, and also making judgements about the duration of other 'A's that were onscreen during non-blink periods. If perceptual 'filling in' occurs during blinks, then the students should have judged the time onscreen of an 'A' of a given duration as the same whether they blinked during its appearance or not. But this isn't what the researchers found – rather, the students consistently underestimated the duration of 'A's if they blinked during their appearance.

We do know from past research that the brain to some extent shuts down visual processing during blinks – a study from the 1980s shone a light up through people's mouths and found their ability to detect changes in its brightness was reduced during blinks, even though the blinks obviously didn't impede the light source. But what the new research shows is that it is still unclear how the brain weaves the loss of visual input during blinks into a seamless perceptual experience.

Summing up, the University of Illinois researchers David Irwin and Maria Robinson said the brain seems to ignore the perceptual consequences of blinks, but they're not sure how this is done. 'Having ruled out the temporal antedating and perceptual maintenance hypotheses,' they said, 'the question still remains: Why does the visual world appear continuous across eye blinks?'

- Christian Jarrett

Even a four-year-old can tell when you're contradicting yourself (and now they won't trust you)
In Child Development

'Yes, Victoria, eating chocolate is unhealthy, but not when I eat it' – you might wonder just how long you can get away with this kind of contradictory logic with your kids. If you'd asked Jean Piaget, one of the founding fathers of child psychology, he would probably have told you that you'll be fine until they're at least eight. After all, he had observed that children younger than this age often describe things in contradictory ways, such as saying that a candle sinks because it's round, but that a ball floats because it's round.

Recent research has largely backed up Piaget's view, but in a new study in Child Development, psychologists have shown that children's recognition of logical inconsistency starts much earlier – around four years of age – when they are exposed to it in a conversational context. This makes sense, say Sabine Doebel and her colleagues, because reasoning probably evolved as a way to evaluate what we're told by others – an especially important skill for children.

A first experiment with 74 children aged three to five involved them watching video clips of one woman asking two others a series of basic questions, like 'Can you tell me about the ball you saw today?'. One woman answered all the questions in a contradictory way ('Today I saw a ball that was the biggest ball ever and it was the smallest ball ever') whereas the other woman answered the questions in a logically consistent way ('Today I saw a ball that was the biggest ball ever and it was the softest ball ever'). After each clip the children were asked to say which woman did not make sense.

Four-year-olds and five-year-olds, but not three-year-olds, correctly identified the women who did not make sense because they were making contradictory statements. This also affected the way the five-year-old children perceived the trustworthiness of these women. For instance, in a later part of the experiment, these children said they'd rather ask the logically consistent woman about the meaning of a new word, rather than ask the woman who'd contradicted herself.

Another experiment with more four- and five-year-olds replicated these findings in the same conversational context, but found that only the five-year-olds were able to detect logical inconsistencies when they were attributed to books rather than to people in conversation (to test this, the researcher presented the children with two books and, to take one example, told them that one book said someone saw a ball that was the biggest and the smallest ever, whereas the other book described someone seeing a ball that was the biggest and the softest).

Because the four-year-olds could detect logically inconsistent utterances in a conversation, but not when attributed to a book, this suggests there's something more engaging or motivating about listening to an actual conversational exchange that improves their performance. 'Put another way,' the researchers said, 'the testimonial context may serve to prompt an epistemically vigilant stance, and as a result children may evaluate arguments and claims more carefully than they would otherwise.' Alternatively, perhaps they are just extra trusting of books – this would certainly chime with earlier research.

Another aspect to this second experiment was that the children also completed tests of their memory performance and executive control (they had to remember strings of numbers or recite them backwards), and those who scored higher on these tests tended to do better at detecting logical inconsistency.

A final note – although based on their average performance four-year-olds were able to identify the women who were being contradictory, not all the children at this age could do so, and even among five-year-olds there was plenty of room for improvement. So if you're lucky, you might just get away with convincing your five-year-old a little longer that chocolate is bad for them but good for you, especially if you tell them that's what a book says.

- Christian Jarrett

How expert schmoozers trick themselves into liking their target 
In Academy of Management Journal

Big-wigs have much to gain from ingratiating themselves with even bigger ones, because having an in with important people sways decisions made in the executive washroom, on the golf course, or over plates of wagyu carpaccio. But ingratiators face a problem: no-one likes a suck-up, and people at the top of the food chain have plenty of practice in detecting and dismissing them.

A new article in the Academy of Management Journal finds that company directors get around this problem by employing a clever psychological tactic – before meeting up with those they plan on winning over, they think about them in such a way that they come to like them more, making any flattery or ingratiation seem all the more convincing.

Participants in the study were directors at a range of large US companies, each of whom had at least one scheduled meeting with another director who had something they wanted: a say in the board membership at another company. The meetings occurred during the six months running up to the board nominations meeting, so if the participants played their cards right, maybe they would get appointed.

So what's the best way to play? Researchers James Westphal and Guy Shani suspected that the key to successful ingratiation is to believe it. Detecting unnatural behaviour comes fairly easily, especially if you know what to look for, meaning pretenders are one feigned smile or wavering compliment away from being dismissed as a brown-noser. Acting is hard! When we really like someone, on the other hand, we don't need to act – we can just let our feelings come through. Increasing one's authentic liking for a person would therefore be very helpful. Westphal and Shani predicted that one way to do this would be for the participants to mentally emphasise to themselves what they had in common with the director they wanted to influence. After all, there is copious evidence that we are more inclined to like those who resemble us, and that we are more likely to credit the achievements of (and therefore respect) people like ourselves, rather than putting their success down to external factors.

In the study, the 278 participants were surveyed at multiple time points prior to their crucial meeting(s) with the other director, on how much they thought about their similarities, or about their differences. For example, a black woman prior to meeting a much older white male might choose to reflect on how they both spent some years in the same industry. The researchers also surveyed the ingratiation behaviours in the meeting itself: compliments and expressions of admiration, together with the amount of non-verbal affirmation like smiling or laughter.

The data showed that the more a participant had turned their thoughts towards what they had in common with the other director, the more their ingratiation behaviours paid off – they were more likely to get an invitation to join the board in the months that followed – presumably because their flattery was more convincing.

Furthermore, participants were more likely to adjust their thinking in this way when their counterpart was more dissimilar to them – where intentionally searching for common ground is going to be particularly important – and in these cases, use of the tactic was even more likely to be rewarded with a nomination. The effects were striking: those who followed this strategy to its fullest were nearly three times more likely to get a recommendation than those who regulated their thoughts around the meeting only to an average degree.

The psychological strategy uncovered in this research was certainly effective, but what we don't know is how aware the participants were of what they were doing. Did they deliberately trick themselves into liking the other director, or was it a more automatic and instinctive process?

Either way, these results aren't only relevant for top dogs trying to bound their way further up the hierarchy. The study provides another demonstration that changing how we think about other people has an important role in smoothing social interactions. Similar processes might help explain why social contact between opposing groups is sometimes found to be helpful, and sometimes not: are the different factions looking for what they have in common, or for what sets them apart? This approach is about more than a cushy seat in the boardroom; it's about how divided people can find a way to sit down together.

- Alex Fradera 

Is OCD fuelled by a fear of the self?
In Clinical Psychology and Psychotherapy

Most of us have unwanted thoughts and images that pop into our heads, and it's not a big deal. But for people with a diagnosis of obsessive-compulsive disorder (OCD) these mental intrusions are frequently distressing and difficult to ignore. A new article in Clinical Psychology and Psychotherapy explores the possibility that the reason these thoughts become so troubling to some people is that they play on their fears about the kind of person they might be.

The reasoning goes something like this: If, for instance, you or I had a sudden mental image of stabbing someone, we might find it strange and unpleasant, but – assuming we are mentally well – the moment would quickly pass and be forgotten. In contrast, to someone with an ongoing, nagging fear that they are dangerous and that they might one day harm somebody, the unwanted image could fuel their anxieties and end up becoming part of a long-running obsession, no matter that their fears have no basis in reality.

Gabriele Melli and his colleagues recruited 76 participants diagnosed with OCD who were about to embark on psychotherapy at a private clinic in Italy. The researchers interviewed the participants about their OCD-related symptoms, their anxiety and depression, and their self-related fears. This last measure featured items like 'I fear perhaps being a violent, crazy person'; 'I am afraid of the kind of person I could be'; and 'I often doubt that I am a good person' to which the participants rated their agreement.

Even after factoring out the part played by anxiety, depression and a general tendency towards obsessive beliefs (e.g. thinking that having a bad urge is as bad as carrying out that urge), the researchers found that a greater fear of the self was independently associated with having more unacceptable and repugnant thoughts, and also with the importance the participants attributed to these thoughts and the need they had to control them.

While cautioning that their results are only preliminary – the sample was relatively small, the measures depended on self-report, the findings were correlational, and there was no control group – Melli and his colleagues believe there could be important clinical insights here. For instance, some patients with OCD might benefit from help realising that their obsessive thoughts have no basis in reality and are not a reflection of their 'true self'. The findings also build on past research showing, for example, that people with OCD find intrusive thoughts more troubling when they seem to contradict a valued aspect of their sense of self, and that people with OCD are more uncertain than healthy controls about their self-concept.

- Christian Jarrett

If you do everything you can to avoid plot spoilers, you're probably a thinker 
In Psychology of Popular Media Culture

It's a vexing First World Problem – how to avoid people giving away, on Twitter or at the water cooler, the events of the latest Game of Thrones episode before you've caught it. Psychologists are beginning to study this modern scourge, albeit in the context of written stories rather than TV shows, but so far their findings have been contradictory – one study suggested that spoiled stories were actually more enjoyable (possibly because they're easier to process), while a later investigation found the precise opposite. Now a research team led by Judith Rosenbaum has entered the fray with a study in Psychology of Popular Media Culture that suggests one reason for the contradictory results is that the effects of spoilers depend on how much a person likes to engage their brain, and how much they enjoy emotional stimulation.

In psychological jargon these traits are known as 'need for cognition' and 'need for affect', respectively. The former is measured through disagreement with statements like 'I only think as hard as I have to' and the latter via agreement with statements such as 'Emotions help people get along in life'.

The researchers first presented over 350 students, mostly African Americans at a university in the Southeastern USA, with several previews of classic short stories, some of which contained plot spoilers and some of which didn't, and then asked them to say which of the stories they'd like to read. The students also completed measures of their need for cognition and affect, and the critical finding was that those who scored low on 'need for cognition' tended to say they would prefer to read the full versions of stories that were previewed with plot spoilers. 'When choosing between stories, low need for cognition individuals appear to have found spoiled stories as potentially more comprehensible and more in keeping with their preferred level of cognitive processing', the researchers said.

Next, the students read some classic short stories (such as Two Were Left and Death of a Clerk) in full, some of which had been 'spoiled' by a preview, and some not, and then rated their enjoyment of the stories. This time, 'need for cognition' was unrelated to enjoyment, but 'need for affect' was, in that people with a greater desire for emotional stimulation got more pleasure from unspoiled stories, as did the students who read fiction more frequently.

One positive way to look at these findings is that encountering a spoiler may not ruin your enjoyment as much as you think it will (if you're a deep thinker), but it probably will be a downer if you're the kind of person who likes emotional surprises. Alternatively, perhaps this study is just too far removed from reality to offer much insight – after all, as the researchers acknowledge, they didn't look at TV shows or movies (where plot spoilers are arguably more common), nor did they consider important variables such as genre (spoilers are presumably much more of an issue for horror and suspense) or story/show length.

- Christian Jarrett

The material in this section is taken from the Society's Research Digest blog at www.bps.org.uk/digest, and is written by its editor Dr Christian Jarrett and contributor Dr Alex Fradera.
Subscribe to the fortnightly e-mail, friend, follow and more via www.bps.org.uk/digest

New: download our free app via your iOS or Android store to keep up with the latest psychology research every day, on the go!