“I have to translate the colours.”
A new case study describes an atypical way of understanding speech in a confirmed synaesthete, illustrating the diversity of human cognitive processes.
22 November 2023
By Emma Young
Imagine if every time you heard a hard 'a' sound, you saw a luminous cyan colour, or every time you heard 'k', you saw red. Now imagine translating meaning from those colours, rather than the sounds themselves. This is the everyday experience of a woman identified in the literature as 'VA'.
Lucie Bouvet at the University of Toulouse and colleagues first described VA's unusual sensory perceptions in 2017. Now, after further investigations and interviews with her, they have confirmed that hers is a genuine case of 'phoneme-colour synaesthesia'. This research, published in Consciousness and Cognition, has implications not just for understanding this form of synaesthesia, but also illustrates the diversity of ways in which humans can think.
Synaesthesia occurs when the perception of a particular stimulus — such as a sight, sound or smell — automatically triggers a perception in the same or another sense. For a sound-colour synaesthete, for example, listening to a piece of music might create visual perceptions of a changing 3D pattern of colours, whereas for a lexical-gustatory synaesthete, hearing or seeing words triggers perceptions of particular flavours. Around 60 different types of synaesthesia have been documented, with some types clustering together.
Phonemes are the smallest units of sound in speech that distinguish one word from another. The prevalence of phoneme-colour synaesthesia, in which a phoneme triggers a colour perception, is unknown, but it is certainly much rarer than some other types of synaesthesia (an estimated 1% of people have grapheme-colour synaesthesia, the most common type). In fact, the team's main objective in this follow-up work with VA was to verify her synaesthesia, as no other 'pure' cases of it, without additional synaesthesias, have been described in the academic literature to date.
VA's case history revealed some unusual features. She didn't speak a single word until she was 16 months old, at which point she started speaking in complete sentences. At the age of three, she taught herself to read. At school, though, she experienced difficulties. She was socially excluded and, the team reports, struggled to understand ambiguous instructions.
The 2017 neuropsychological evaluation found that VA had above-average cognitive abilities and also that, for her, speech and non-verbal sounds triggered the perception of different colours. In this new work, the team probed this further, systematically exposing her to the sounds of 13 vowels and 16 consonants, presented in different arrangements. VA listened to these letter sounds alone, as part of consonant-vowel syllables, and also in words. So, for example, she heard 'a' by itself, as part of 'at' and 'as', and in the French words 'sac', 'chat', and 'pas'. On each occasion, VA had to indicate which colour or colours she perceived.
The team found a striking consistency in her reports. For example, an 'a' reliably triggered cyan blue perceptions, an 'i' reliably triggered light yellow, and a 'v', light green. This consistency across multiple presentations is important for a diagnosis of synaesthesia.
The researchers' analysis also revealed that when VA heard full words, their meaning had no influence on the colours that she saw; those colours seemed to be determined purely by whatever distinct phonemes were present. She reported one colour for each phoneme that made up the word, and likewise for the syllables. These phoneme-colour associations were highly consistent with those that she'd reported during the initial evaluation, years earlier. "This demonstrates the robustness of her synaesthesia," the researchers write.
As part of this new study, VA was also asked to describe how her way of processing speech plays out day to day. She reported that to understand speech, she has to make an effort to translate the colours that she perceives into word meanings. This effort could be overwhelming, she said, causing her to lose track of conversations.
VA went on to explain that this translation process in fact results in a multi-sensory representation of the meaning. "If I think of a cake, I won't think of the word cake: I will think of all the sensations I have when I see, touch, and eat the cake," she explains. Then, when she wants to speak — to say 'cake', for example — she has to translate these sensory representations into words. This double translation effort makes it especially hard for her to hold a conversation, and could explain her previous difficulties at school, the researchers suggest.
It's generally thought that neurotypical people can use different-sized units of speech — phonemes, syllables, and words — to understand meaning and follow conversations. VA, on the other hand, illustrates that this process isn't universal. She processes speech segmentally, building meaning through translation of the individual phoneme-colours, and relying heavily on multi-sensory representations rather than words.
Though VA herself does not have an autism diagnosis, the team notes that autistic people are often described as sensory/visual thinkers. Perhaps, they suggest, the communication difficulties observed in some neurodiverse people may partly stem from a similar difficulty in translating between verbal and non-verbal representations.
Case studies are limited in what they can tell us about how human minds work beyond a single individual. However, they do add depth to our understanding and, at times, challenge the assumptions much of our research is built on. Acknowledging and investigating diverse experiences, such as VA's, can help us to appreciate the variety in cognitive processes present in the population, and prompt us to perhaps better represent that diversity in our research.
Read the paper in full: https://doi.org/10.1016/j.concog.2023.103509