Directed by Prof. Dr. Nikolaus Troje, the lab is located at York University in Toronto, Ontario.
Our research focuses on questions concerning the nature of perceptual representations. How can a stream of noisy nerve-cell excitations possibly be turned into the coherent and predictable perception of a “reality”? We work on questions involving the processing of sensory information, perception, cognition, and communication.
People perception: The biology and psychology of social recognition
- detection of animate agents
- conspecific recognition
- gender recognition
- individual recognition
- action recognition
- recognition of emotions, personality traits and intentionality
- recognition of bodies, faces, and biological motion
Vision in virtual reality
- pictorial vs physical spaces
- simulator sickness
- perception of self-motion (vection)
- visual-vestibular integration
Visual ambiguities and perceptual biases
- depth ambiguities
- the “facing-the-viewer” bias
We are equipped with several motion capture systems — a 21-camera Qualisys system, a Vicon camera system, and an ATR system with four cameras — which we use in a number of different experimental setups. Motion capture serves to record the whole-body kinematics of moving people, but also to track the observer’s head in projection-based virtual reality systems.
In addition to high-end computer displays, we use several forms of virtual reality in our experiments. We currently work with multiple sets of Oculus Rift head-mounted displays, but we also use projection-based systems, one of them a Christie Holostation with an integrated treadmill.
Several graphics workstations provide the hardware for generating sophisticated visual stimuli. We rely extensively on Matlab as a development tool, together with the Psychophysics Toolbox extension, for most of our experiments. For online demonstrations and formal experiments we used Flash for a long time, and we are still searching for an adequate replacement; it may turn out to be Unity3D, a versatile game engine we recently started working with.
We program graphics in OpenGL, and we use professional animation packages (Autodesk’s Maya and 3ds Max, WorldViz’s Vizard) for computer animation and virtual reality applications.
As a member of Queen’s Biological Communication Centre, we have access to eye-tracking technology, EEG recording systems, and transcranial magnetic stimulation (TMS). Through the Centre for Neuroscience Studies, we also have access to the centre’s research-dedicated fMRI scanner.
The laboratory is integrated into the Queen’s Biological Communications Centre and the Centre for Neuroscience Studies at Queen’s University. Hosted by the Department of Psychology, the lab maintains a close exchange with the School of Computing and the Department of Biology at Queen’s University. We are also a member of the Canadian Action and Perception Network (CAPnet), and we are involved with the international graduate school The Brain in Action.