We use psychophysics, eye-tracking, brain stimulation, neuroimaging, and computational approaches to investigate where and how affective visual cues are processed in the human brain, and how these cues are used in our social cognition.
We use behavioural, physiological, and neuroscientific measurements to investigate emotional expression and perception in dogs, and how humans and dogs ‘understand’ each other’s emotions and social communication cues.
We investigate several domains of face processing, which can be summarised as: (1) face recognition and face learning; (2) social information conveyed in faces; and (3) the use of automatic face recognition technology.
This project investigates how human visual motion perception is optimised for perceiving the actions of other people: How are social attributes derived from sensory data? And do these attributes, in turn, influence sensory processing?
This project investigates the workings of the visual system, and how they might contribute to the appreciation of artwork, using both subjective judgements and physiological measurements of observers viewing Op-art-based stimuli.