‘Seeing’ bodies with sound (no sight required) — ScienceDaily

People born unable to see can readily learn to perceive the shape of the human body through soundscapes that translate images into sound, according to researchers. With a little training, soundscapes representing the outlines and silhouettes of bodies activate the brain’s visual cortex, and specifically an area that in normally sighted people is dedicated to processing body shapes.
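
The report does not describe the encoding itself, but image-to-sound substitution devices of this kind typically sweep across a picture from left to right, mapping vertical position to pitch and pixel brightness to loudness. Below is a minimal sketch of such a mapping; the frequency range, sweep duration and sample rate are chosen purely for illustration and are not taken from the study.

```python
import numpy as np

def image_to_soundscape(image, duration=1.0, sample_rate=22050,
                        f_min=200.0, f_max=5000.0):
    """Left-to-right sweep: each column becomes a short time slice,
    each row a sine tone whose loudness tracks pixel brightness.
    (Illustrative parameters; real devices differ.)"""
    n_rows, n_cols = image.shape
    samples_per_col = int(duration * sample_rate / n_cols)
    # Top rows map to high frequencies, bottom rows to low ones.
    freqs = np.logspace(np.log10(f_max), np.log10(f_min), n_rows)
    audio = []
    for col in range(n_cols):
        t = np.arange(samples_per_col) / sample_rate
        slice_ = np.zeros_like(t)
        for row in range(n_rows):
            amplitude = image[row, col]          # brightness -> loudness
            slice_ += amplitude * np.sin(2 * np.pi * freqs[row] * t)
        audio.append(slice_)
    audio = np.concatenate(audio)
    return audio / (np.max(np.abs(audio)) + 1e-9)  # normalise to [-1, 1]

# Example: a crude body silhouette as a 16x16 brightness map.
silhouette = np.zeros((16, 16))
silhouette[2:14, 6:10] = 1.0      # torso
soundscape = image_to_soundscape(silhouette)
```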

Emotions, facial expression, and sensory perception

Nature Neuroscience 11, 843–850 (2008)

Published online: 15 June 2008 | doi:10.1038/nn.2138

Expressing fear enhances sensory acquisition

Joshua M Susskind et al.

Abstract

It has been proposed that facial expression production originates in sensory regulation. Here we demonstrate that facial expressions of fear are configured to enhance sensory acquisition. A statistical model of expression appearance revealed that fear and disgust expressions have opposite shape and surface reflectance features. We hypothesized that this reflects a fundamental antagonism serving to augment versus diminish sensory exposure. In keeping with this hypothesis, when subjects posed expressions of fear, they had a subjectively larger visual field, faster eye movements during target localization and an increase in nasal volume and air velocity during inspiration. The opposite pattern was found for disgust. Fear may therefore work to enhance perception, whereas disgust dampens it. These convergent results provide support for the Darwinian hypothesis that facial expressions are not arbitrary configurations for social communication, but rather, expressions may have originated in altering the sensory interface with the physical world.

ciguatera (雪卡毒) and temperature reversal

Ciguatera often appears in Hong Kong news, with people getting poisoned by eating coral fish such as grouper. What is not reported is the surprising fact that in some cases the poisoning makes hot things feel cold and cold things feel hot. A nice example of qualia inversion!

doi:10.1136/jnnp.2007.129049

From WIRED http://blog.wired.com/wiredscience/2007/10/the-bizarre-eff.html

Fish Poison makes Hot Things Feel Icy and Cold Things Feel Burning Hot
By Aaron Rowe
October 11, 2007

Eating some bad fish might not seem like the most spectacular way to ruin a tropical vacation, but for a 45-year-old man from England, a bit of tainted seafood was the beginning of a wild ride.

Cold water felt burning hot. Hot things felt icy cold. His tongue felt strange. Drinking alcohol or coffee only increased his suffering.

The patient had ciguatera poisoning — an ailment caused by ciguatoxin, a neurotoxin that is produced by microorganisms and found in a wide variety of tropical fish.

Virtual robots duped by illusions help to explain human vision

From http://www.ucl.ac.uk/media/library/robotillusions

28 September 2007

A study by researchers at UCL (University College London) explains why humans see illusions by showing that virtual robots trained to ‘see’ correctly also – as a consequence – make the same visual mistakes that we do. The study, published in the latest edition of PLoS Computational Biology, shows that illusions are an inevitable consequence of evolving useful behaviour in a complex world.

Dr. Beau Lotto, UCL Institute of Ophthalmology, said: “Sometimes the best way to understand how the visual brain works is to understand why sometimes it does not. Thus lightness illusions have been the focus of scientists, philosophers and artists interested in how the mind works for centuries. And yet why we see them is still unclear.”

To address the question of why humans see illusions, researchers at the UCL Institute of Ophthalmology used artificial neural networks, effectively virtual toy robots with miniature virtual brains, to model, not human vision as such, but human visual ecology. Dr David Corney in Dr. Lotto’s lab trained the virtual robots to predict the reflectance (shades of grey) of surfaces in different 3D scenes not unlike those found in nature. Although the robots could interpret most of the scenes effectively, and differentiate between surfaces correctly, they also – as a consequence – exhibited the same lightness illusions that humans see.
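
The paper’s actual training regime is not reproduced here, but the underlying logic can be illustrated with a toy model: a small network learns to recover surface reflectance from luminance in scenes where the surrounding surfaces carry information about the shared illuminant, and the same network then misjudges a simultaneous-contrast display in the direction people do. The scene statistics, network size and scikit-learn regressor below are illustrative assumptions, not the authors’ setup.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Synthetic "visual ecology": a centre surface and its surround share one
# illuminant, so surround luminance carries information about the
# illumination falling on the centre.
n = 5000
reflectance = rng.uniform(0.05, 0.95, n)          # centre surface reflectance
surround_refl = rng.uniform(0.05, 0.95, (n, 8))   # 8 surround surfaces
illum = rng.uniform(0.5, 1.5, n)                  # shared illuminant

centre_lum = reflectance * illum
surround_lum = (surround_refl.T * illum).T.mean(axis=1)
X = np.column_stack([centre_lum, surround_lum])

# Train a small network to recover reflectance from luminance.
net = MLPRegressor(hidden_layer_sizes=(8,), max_iter=3000, random_state=0)
net.fit(X, reflectance)

# Simultaneous-contrast test: two patches with identical luminance,
# one on a dark surround, one on a bright surround.
test = np.array([[0.5, 0.3],    # grey patch, dark surround
                 [0.5, 0.7]])   # grey patch, bright surround
print(net.predict(test))  # expected: the dark-surround patch is judged lighter
```

Because the surround is a genuinely useful cue to illumination in the training scenes, discounting it is the behaviourally correct strategy; the “illusion” on the test display is simply that strategy applied to an unrepresentative stimulus, which is the sense in which illusions fall out of learning useful behaviour.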

Early visual deprivation impairs multisensory interactions in humans

Nature Neuroscience 10, 1243–1245 (2007)
Published online: 16 September 2007 | doi:10.1038/nn1978

Lisa Putzar, Ines Goerendt, Kathrin Lange, Frank Rösler & Brigitte Röder

Animal studies have shown that visual deprivation during the first months of life permanently impairs the interactions between sensory systems. Here we report an analogous effect for humans who had been deprived of pattern vision for at least the first five months of their life as a result of congenital binocular cataracts. These patients showed reduced audio-visual interactions in later life, although their visual performance in control tasks was unimpaired. Thus, adequate (multisensory) input during the first months of life seems to be a prerequisite in humans, as well as in animals, for the full development of cross-modal interactions.

nonface clusters in the fusiform face area

Nature Neuroscience 9, 1177–1185 (2006)
Published online: 6 August 2006 | doi:10.1038/nn1745
Kalanit Grill-Spector et al.

Higher-resolution fMRI shows that the FFA has a heterogeneous structure: localized subregions within the FFA that are highly selective for faces are spatially interdigitated with localized subregions highly selective for other object categories. We found a preponderance of face-selective responses in the FFA, but the selectivity of face clusters for faces was no greater than the selectivity of nonface clusters for their preferred categories. Thus, standard fMRI of the FFA reflects averaging over heterogeneous, highly selective neural populations of differing sizes, rather than higher selectivity for faces. These results suggest that visual processing in this region is not exclusive to faces.
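
The averaging point is easy to see with numbers: a conventional-resolution voxel that pools a majority of face-selective clusters with a minority of equally selective nonface clusters still responds more to faces overall, even though neither cluster type is more selective than the other. The cluster counts and response values below are invented purely for illustration, not taken from the paper.

```python
import numpy as np

# Hypothetical subpopulations within one low-resolution FFA voxel.
# Each row: [response to faces, response to preferred nonface category]
face_clusters = np.array([[1.0, 0.2]] * 6)     # 6 face-selective clusters
object_clusters = np.array([[0.2, 1.0]] * 4)   # 4 nonface-selective clusters

# Each cluster type is equally selective for its own preferred category:
print(face_clusters[0, 0] - face_clusters[0, 1])      # 0.8 for faces
print(object_clusters[0, 1] - object_clusters[0, 0])  # 0.8 for objects

# Standard-resolution fMRI averages over all clusters in the voxel...
voxel = np.vstack([face_clusters, object_clusters]).mean(axis=0)
print(voxel)  # [0.68, 0.52]: the voxel as a whole "prefers" faces, simply
              # because face-selective clusters are more numerous.
```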

Retinal laser projection system

“US firm Microvision has developed a system that projects lasers onto the retina, allowing users to view images on top of their normal field of vision. It could allow surgeons to get a bird’s eye view of the innards of a patient, offer military units in the field a view of the entire battlefield and provide mechanics with a simulation of the inside of a car’s engine.

The system uses tiny lasers, which scan their light onto the retina to produce the entire range of human vision, reported IEEE Spectrum, the journal of the Institute of Electrical and Electronics Engineers.”