BioMotionLab


Directed by Prof. Dr. Nikolaus Troje, the lab is located at York University in Toronto, Ontario.

News Feed


The BioMotionLab, Centre for Vision Research at York University

BioMotionLab
BioMotionLab @BioMotionLab · 24 Feb

Sia Eftekharifar just defended his PhD thesis on the role of motion parallax and binocular disparity in presence, cybersickness, and restoration in VR. New papers coming soon. Congratulations @SiaEftekharifar.

BioMotionLab @BioMotionLab · 25 Oct

Biological motion processing is impressively resilient to aberrant early visual experience, according to a very successful collaboration between the BioMotionLab and Brigitte Röder's research group at University of Hamburg. @vistayorku @YorkUScience @unihh
https://www.eneuro.org/content/eneuro/early/2020/10/15/ENEURO.0534-19.2020.full.pdf

BioMotionLab @BioMotionLab · 2 Oct

As the new term has started, there are new students, too: Andres and Romina are new 4th-year honours thesis students, Ashley joins as an MSc candidate, and Viswajit, who has been with us as an RA, is now also an MSc student. Welcome all! @vistayorku @YorkUScience

BioMotionLab @BioMotionLab · 1 Oct

BML graduate student Saeed Ghorbani leads publication of new method for realistic human motion modelling with natural variation to be used for crowd animation and other applications. Paper will be presented at ACM SIGGRAPH/EG SCA @vistayorku @YorkUScience http://diglib.eg.org/handle/10.1111/cgf14116

BioMotionLab @BioMotionLab · 4 Aug

In a very successful collaboration with G. Ross and R. Graham's group @uOttawa we are using linear acceleration and angular velocity of simple movements to discriminate between expert and novice athletes at accuracies >80%. @vistayorku @YorkUScience

Classifying Elite From Novice Athletes Using Simulated Wearable Sensor Data (doi.org)
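For readers curious how a feature-based movement classification of this kind can work in outline, here is a minimal, purely illustrative sketch (not the authors' actual pipeline or data): it computes simple summary features from simulated linear-acceleration and angular-velocity traces and separates the two groups with a nearest-centroid rule. All names, feature choices, and the "experts move with less variability" assumption are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def summary_features(accel, gyro):
    """Summarize one movement trial: mean and std of the linear-acceleration
    magnitude and of the angular-velocity magnitude (hypothetical features)."""
    a = np.linalg.norm(accel, axis=1)
    g = np.linalg.norm(gyro, axis=1)
    return np.array([a.mean(), a.std(), g.mean(), g.std()])

def simulate_trial(expert, n=200):
    """Simulated wearable-sensor trial of n samples; for illustration we
    assume experts produce less variable signals than novices."""
    scale = 0.5 if expert else 1.0
    accel = rng.normal(0.0, scale, size=(n, 3))  # 3-axis accelerometer
    gyro = rng.normal(0.0, scale, size=(n, 3))   # 3-axis gyroscope
    return summary_features(accel, gyro)

# Small labelled data set of simulated trials: 50 experts, 50 novices.
X = np.array([simulate_trial(expert=i < 50) for i in range(100)])
y = np.array([1] * 50 + [0] * 50)  # 1 = expert, 0 = novice

# Nearest-centroid classifier: assign each trial to the closer class mean.
centroids = np.array([X[y == c].mean(axis=0) for c in (0, 1)])

def predict(x):
    dists = np.linalg.norm(centroids - x, axis=1)
    return int(np.argmin(dists))  # index 0 = novice, 1 = expert

accuracy = np.mean([predict(x) == label for x, label in zip(X, y)])
```

The published study uses real simulated wearable-sensor data and a proper train/test protocol; this toy version only shows the shape of the idea, so its (training-set) accuracy should not be compared with the reported >80%.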
BioMotionLab @BioMotionLab · 21 Jul

Another new paper from BML: Those who look down a virtual cliff in #VR know that they are perfectly safe. Yet, their body responds with real fear. Sia Eftekharifar investigates what drives that response. @vistayorku, @YorkUScience @CentreforVisio1 @Sia_Eft
https://www.ingentaconnect.com/content/ist/jpi/pre-prints/content-jpi_0129

BioMotionLab @BioMotionLab · 17 Jul

New paper! Our toolbox for designing and executing factorial, trial-based experiments in Unity3D just appeared as a journal article in i-Perception. Congratulations to Adam Bebko!
@vistayorku, @YorkUScience, @PerceptionSAGE, @unity3d https://www.biomotionlab.ca/tux/
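The core idea behind factorial, trial-based designs of the kind bmlTUX automates inside Unity can be illustrated in a few lines: cross the levels of each independent variable into a full trial table and randomize the presentation order. The variable names and levels below are made up for illustration; they are not from bmlTUX itself.

```python
import itertools
import random

# Hypothetical independent variables for a factorial design:
# every combination of levels becomes one trial.
variables = {
    "stimulus": ["walker_left", "walker_right"],
    "duration_s": [0.5, 1.0, 2.0],
}

# Full factorial crossing: 2 stimuli x 3 durations = 6 trials.
trials = [dict(zip(variables, combo))
          for combo in itertools.product(*variables.values())]

random.seed(1)
random.shuffle(trials)  # randomize presentation order within the block
```

A toolkit like bmlTUX layers block structure, repetitions, and Unity scene control on top of this kind of crossing; the sketch only shows the combinatorial core.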

BioMotionLab @BioMotionLab · 24 Jun

Kudos to Adam Bebko. His workshop on bmlTUX, the BML Toolkit for Unity Experiments, given at @VSSMtg drew an impressive crowd. For those who missed it, see the tutorials at https://biomotionlab.github.io/TUX/docs/firstoverview. @vistayorku, @YorkUScience, @InnovationYork

BioMotionLab @BioMotionLab · 19 Jun

We'll keep it up!

VISTA @ YorkU @vistayorku

Are you attending this year's Virtual VSS 2020? Mark your calendars to join our Canadian Vision Social event, open to any VSS member who is, knows, or would like to meet a Canadian Vision Scientist! Tune in this June 23 @ 8:00 pm (EDT): https://bit.ly/2AqbPvw @VSSMtg #visionscience

BioMotionLab @BioMotionLab · 17 Jun

Congratulations, Michael et al. I remember this paper very well and often give it to students. Now I will also recommend MJB's notes (below) about its history. The most important message: "Do the work that you think is important and hopefully the community will agree in time".

Michael Black @Michael_J_Black

I’m honored to share the 2020 Longuet-Higgins prize with @DeqingSun and Stefan Roth. It is given at #CVPR2020 for work from #CVPR 2010 that has withstood the test of time. I’ve written a blog post about the secrets behind “The Secrets of Optical Flow”: https://tinyurl.com/y89odrwf


Goals

Our main research interest concerns the nature of perceptual representations. How can a stream of noisy nerve-cell excitations be turned into the coherent and predictable perception of a “reality”? We work on questions involving the processing of sensory information, perception, cognition, and communication.

Topics

People perception: The biology and psychology of social recognition

  • detection of animate agents
  • conspecific recognition
  • gender recognition
  • individual recognition
  • action recognition
  • recognition of emotions, personality traits and intentionality
  • recognition of bodies, faces, and biological motion

Vision in virtual reality

  • pictorial vs physical spaces
  • space perception
  • simulator sickness
  • perception of self-motion (vection)
  • multisensory integration
  • perception of one's own body
  • the nature of presence

Visual ambiguities and perceptual biases

  • depth ambiguities
  • the “facing-the-viewer” bias

Network

Since moving to York University in 2018, the Biomotion Lab has become an integral part of the multi-departmental Centre for Vision Research. Its main affiliation is with the Department of Biology in the Faculty of Science.

Students in the Biomotion Lab come from different graduate programs:

  • Biology @ YorkU
  • Psychology @ YorkU
  • Electrical Engineering and Computer Science @ YorkU
  • International Graduate School “The Brain in Action”
  • Centre for Neuroscience Studies @ Queen’s University

Dr. Troje is a core member of the CFREF-funded program “Vision: Science to Application” (VISTA). Other funding comes from:

  • NSERC
  • CFI
  • CIFAR

Other important affiliations include:

  • Canadian Action and Perception Network (CAPnet)
  • Vision Science Society