Esref Armagan was born without sight to a Turkish family in 1953. Today, he is arguably the world's most famous blind painter. His oil paintings of richly colored landscapes have been shown in galleries around the globe. Cognitive and neural scientists have studied Armagan's visual cortex to try to understand his remarkable ability to visually depict the world around him, despite his lack of sight.

Jennifer Stevens, associate professor in psychological sciences at William & Mary, was especially fascinated by Armagan's depictions of depth using relative size. For example, one of his paintings features three windmills on a hillside, each smaller, higher in the visual field and less detailed than its neighbor.

“For those of us with vision, when we look out onto a landscape, objects that are furthest away appear smaller,” Stevens said. “But what has been perplexing about Esref is he depicts objects in depth in the same way. How would someone without sight have any understanding that a distant object should be depicted smaller if there is only haptic information available? After all, an object’s actual size does not shift down with distance the way its visual appearance does.”
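The geometric relationship Stevens describes, that an object's visual angle shrinks with distance even though its physical size stays fixed, can be sketched numerically. The object size and viewing distances below are illustrative assumptions, not values from the study:

```python
import math

def angular_size_deg(object_size_m, distance_m):
    """Visual angle subtended by an object of a given size at a given
    distance, in degrees: 2 * atan(size / (2 * distance))."""
    return math.degrees(2 * math.atan(object_size_m / (2 * distance_m)))

# A hypothetical 14 cm wide pair of glasses held at three distances
# (just in front of the eyes, mid-range, and roughly arm's length):
for d in (0.05, 0.25, 0.60):
    print(f"{d:.2f} m -> {angular_size_deg(0.14, d):.1f} degrees")
```

The visual angle falls from over 100 degrees near the eyes to about 13 degrees at arm's length, while the glasses themselves never change size; this is the mismatch between physical and apparent size that makes Armagan's use of relative size so puzzling.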

This line of questioning led Stevens to author a study, recently published in the journal Perception, that provides the first evidence that haptic (tactile) object interaction can, in fact, provide information about relative size akin to that gathered visually.

“It turns out that even without sight, we have this sense of relative size as it relates to depth,” Stevens said. “This study is groundbreaking in the fact that it reveals a novel route to understanding the mind’s representation of three-dimensional space. We can comprehend relative size even in the absence of vision, which adds a significant new cog in the wheel of our understanding.”

Stevens conducted the first proof-of-concept study on herself. Ironically, she used her glasses as the object with which to test her own hypothesis about a blind painter’s ability to visually render depth with relative size.

“Here I was, sitting alone in my office, wondering to myself how Esref understands relative size,” said Stevens. “And I began to simulate blindness by sitting with my eyes closed for periods of time. I held my glasses at different distances, first near then further away, and it was compelling to realize that while I knew that their size hadn’t changed, the further away they were held, the smaller they felt.”

In the paper, Stevens explains that the imagined size of her glasses shifted from filling her entire visual field to taking up only a small percentage of it.

“It was illuminating when I focused on how the glasses felt,” she writes. “They felt larger when they were held in front of my closed eyes compared to when they were held at arm’s length. … The reader can recreate this experience now using any handheld object found nearby.”

After having the revelation in her office, Stevens decided to test her mini-experiment at scale. Over the course of several months, she ran a study in her lab asking blindfolded, sighted participants to hold different objects either directly in front of their eyes or at arm's length. Participants were then asked to remove their blindfolds and draw the object on graph paper at the size they felt it to be. Both familiar and unfamiliar objects revealed the same effect: objects held farther away felt smaller to the participants.

“What’s especially thrilling to me about these results is that we are showing there is a largely untapped access point into understanding the way that we represent the world,” Stevens said. “It is likely that humans use haptic information to construct their representation of space all of the time. We just haven’t realized the extent of its contribution because our focus is typically on visual processing. Until now.”

, Assistant Director for Research, News & Analytics