We’ve discussed the role of vision in reading many times before, from “traditional” optometric angles such as the perspective of the AOA to the literacy point of view in academia. Last week, though, I came upon the role of vision in literacy from an unexpected source while reading in my usual place.
The source is Rob DeSalle’s new book, Our Senses: An Immersive Experience. It seems an unexpected source because Rob is a curator at the American Museum of Natural History in New York, but the connection isn’t as far-fetched as you might think. Rob’s mind is delightful, as you’ll see in this clip, and his sense of curiosity and down-to-earth approach to learning and teaching shine through.
From his earlier public exhibition about the brain, Rob went on to curate the American Museum of Natural History’s latest exhibition, Our Senses: An Immersive Experience, in essence the live companion to this book.
The exhibition, which opened to the public in November, seems to be a delightful experience, and I’m planning a field trip there with a few of our grandkids soon. In the interim, the book serves as a teaser – and the chapter that really caught my eye (Chapter 18) is titled “Bob Dylan’s Nobel: Language, Literacy, and How the Senses Interact to Produce Literature”. Here are some key quotes:
“The portal for the neural information that is needed for literacy is usually through the retina and hence the eyes … As with all of the senses, when the initial information enters the brain from the sensory collection organ (in the case of literacy, the organ is the retina) there is an initial rapid processing of the information (fig. 18.3) …
… Remember that the information from the retina in early visual processing goes through several areas of the visual cortex, specifically the pathways known as V1, V2, V3, and V4. Western writing uses the V1 and V2 pathways to sort out and recognize the characters used in literacy. By contrast, recognition of characters in Chinese writing uses the V3 and V4 pathways … One of the more interesting developments with the acquisition of literacy in the visual word form areas is that this pathway in the brain learns to suppress the tendency to lump mirror images of objects … Examples from the Western alphabet include b/d/p/q. And hence the adaptive reason for this so-called mirror invariance of our nonliterate ancestors needs to be overcome to acquire literacy.”
DeSalle concludes his book with a highly significant paragraph in Chapter 20 on the limits to what we can sense, and the future of our senses. It deserves a drum roll, please.
“The average adult in the Western world faces a computer screen or a smartphone screen for about ten hours a day, according to a 2016 Nielsen survey. Given that we sleep about seven to eight hours a day, this means that more than half the waking day in many cultures is spent staring at a computer or smartphone screen viewing virtual images the whole time. We are only beginning to understand the impact of this changed sensory realm on the human condition. In a direct comparison of reading comprehension among tenth graders, researchers in Norway assessed the difference between reading on a computer screen versus old-fashioned hard copy. The surprising result was that these students comprehended the written word on paper much better than on screen. Why this might be so isn’t well understood, but it does point to a possible dichotomy in the way we learn and comprehend reading as humans. Reading comprehension is a downstream effect of vision [my emphasis added – what a great line!!!], and some researchers are concerned about the long-term impact that computer and smartphone screens might have on the human visual system in a more upstream manner. Humans did not evolve to peer endlessly at a small, light-emitting rectangle … How this restriction in the visual field is affecting our eyes and their potential evolution is a subject that needs attention.”