Biomotion Lab


Directed by Prof. Dr. Nikolaus Troje, the lab is located at York University in Toronto, Ontario.

News Feed

The BioMotionLab, Centre for Vision Research at York University

BioMotionLab @BioMotionLab · 2 Apr

Adam Bebko and Anne Thaler gave a very well-crafted tutorial on bmlTUX today at the @IEEEVR conference. If you missed it, check out the recording on YouTube: https://youtu.be/-1dh_U5RcKw. Starts at 4:56. @vistayorku @YorkUScience

BioMotionLab @BioMotionLab · 29 Mar

BioMotionLab is hiring. We are looking for a full-time developer with experience in both Unity and iOS development, particularly ARKit. If you are at @IEEEVR this week, feel free to talk to Niko Troje, Adam Bebko, or Anne Thaler. Or just check out https://www.biomotionlab.ca/join/

BioMotionLab @BioMotionLab · 28 Mar

BML members Adam Bebko and Anne Thaler are presenting new tools at @IEEEVR. Here is the teaser for tomorrow's presentation on a tool that animates AMASS data in Unity. The poster session starts at 10 am EST. https://www.youtube.com/watch?v=5uyAHu2ow08 @vistayorku, @YorkUScience

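The tool itself is not documented here, but as a rough, generic sketch of the kind of data it consumes: AMASS motions ship as NumPy .npz archives. The file path and key names below ('poses', 'betas', 'trans', 'mocap_framerate') are assumptions based on the public AMASS releases and may differ between dataset versions; this is not the BML tool's API.

```python
import numpy as np

# Hedged sketch: inspect one AMASS motion archive before bringing it into an
# engine such as Unity. The path is hypothetical; key names follow the public
# AMASS releases and may vary between dataset versions.
data = np.load("ACCAD/Female1Walking_c3d/walk1_poses.npz")

print(data.files)        # e.g. ['trans', 'gender', 'mocap_framerate', 'betas', 'dmpls', 'poses']

poses = data["poses"]    # per-frame SMPL(-H) pose parameters, shape (n_frames, n_pose_params)
trans = data["trans"]    # per-frame root translation, shape (n_frames, 3)
fps = float(data["mocap_framerate"])
print(poses.shape, trans.shape, fps)
```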
BioMotionLab @BioMotionLab · 24 Feb

Sia Eftekharifar just defended his PhD thesis on the role of motion parallax and binocular disparity in presence, cybersickness, and restoration in VR. New papers coming soon. Congratulations @SiaEftekharifar.

BioMotionLab @BioMotionLab · 25 Oct

Biological motion processing is impressively resilient to aberrant early visual experience, as shown by a very successful collaboration between the BioMotionLab and Brigitte Röder's research group at the University of Hamburg. @vistayorku @YorkUScience @unihh
https://www.eneuro.org/content/eneuro/early/2020/10/15/ENEURO.0534-19.2020.full.pdf

BioMotionLab @BioMotionLab · 2 Oct

As the new term has started, there are new students, too: Andres and Romina are new 4th-year honours thesis students, Ashley joins as an MSc candidate, and Viswajit, who has been with us as an RA, is now also an MSc student. Welcome all! @vistayorku @YorkUScience

BioMotionLab @BioMotionLab · 1 Oct

BML graduate student Saeed Ghorbani leads the publication of a new method for realistic human motion modelling with natural variation, to be used for crowd animation and other applications. The paper will be presented at ACM SIGGRAPH/EG SCA. @vistayorku @YorkUScience http://diglib.eg.org/handle/10.1111/cgf14116

BioMotionLab @BioMotionLab · 4 Aug

In a very successful collaboration with G. Ross and R. Graham's group at @uOttawa, we are using linear acceleration and angular velocity of simple movements to discriminate between expert and novice athletes with accuracies above 80%. @vistayorku @YorkUScience

Paper: Classifying Elite From Novice Athletes Using Simulated Wearable Sensor Data (doi.org)

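The published analysis is linked above; as a loose, self-contained illustration of the general idea (not the authors' pipeline), the sketch below trains a linear classifier on invented peak linear-acceleration and angular-velocity features for two groups and reports cross-validated accuracy. All feature values and sample sizes are made up.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Hypothetical features per movement trial: peak linear acceleration (m/s^2)
# and peak angular velocity (rad/s). Values are invented for illustration only.
experts = np.column_stack([rng.normal(18.0, 2.0, 60), rng.normal(9.0, 1.5, 60)])
novices = np.column_stack([rng.normal(15.0, 2.5, 60), rng.normal(7.0, 1.8, 60)])

X = np.vstack([experts, novices])
y = np.concatenate([np.ones(60), np.zeros(60)])  # 1 = expert, 0 = novice

# Cross-validated accuracy of a linear classifier on the two features.
clf = LogisticRegression()
scores = cross_val_score(clf, X, y, cv=5)
print(f"mean accuracy: {scores.mean():.2f}")
```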
BioMotionLab @BioMotionLab · 21 Jul

Another new paper from BML: those who look down a virtual cliff in #VR know that they are perfectly safe, yet their bodies respond with real fear. Sia Eftekharifar investigates what drives that response. @vistayorku, @YorkUScience @CentreforVisio1 @Sia_Eft
https://www.ingentaconnect.com/content/ist/jpi/pre-prints/content-jpi_0129

BioMotionLab @BioMotionLab · 17 Jul

New paper! Our toolbox bmlTUX, which turns Unity3D into a tool for designing and executing factorial, trial-based experiments, just appeared as a journal article in i-Perception. Congratulations to Adam Bebko!
@vistayorku, @YorkUScience, @PerceptionSAGE, @unity3d https://www.biomotionlab.ca/tux/

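For readers unfamiliar with the term, "factorial, trial-based" simply means that every level of each independent variable is crossed with every level of the others to form a trial list. The sketch below is a generic illustration of that idea, not the bmlTUX API; variable names and levels are invented.

```python
import itertools
import random

# Generic sketch of a factorial, trial-based design (not the bmlTUX API):
# every level of each independent variable is crossed with every level of
# the others, and the resulting trial list is repeated and shuffled.
stimulus_type = ["point-light walker", "stick figure"]   # hypothetical IV 1
viewpoint_deg = [0, 45, 90]                              # hypothetical IV 2
repetitions = 2

trials = [
    {"stimulus": s, "viewpoint": v}
    for s, v in itertools.product(stimulus_type, viewpoint_deg)
] * repetitions

random.shuffle(trials)
for i, trial in enumerate(trials, start=1):
    print(i, trial)
```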

Goals

Our main research interest lies in the nature of perceptual representations. How can a stream of noisy nerve-cell excitations possibly be turned into the coherent and predictable perception of a “reality”? We work on questions involving the processing of sensory information, perception, cognition, and communication.

Topics

People perception: The biology and psychology of social recognition

  • detection of animate agents
  • conspecific recognition
  • gender recognition
  • individual recognition
  • action recognition
  • recognition of emotions, personality traits and intentionality
  • recognition of bodies, faces, and biological motion

Vision in virtual reality

  • pictorial vs physical spaces
  • space perception
  • simulator sickness
  • perception of self-motion (vection)
  • multisensory integration
  • perception of one's own body
  • the nature of presence

Visual ambiguities and perceptual biases

  • depth ambiguities
  • the “facing-the-viewer” bias

Network

Since moving to York University in 2018, the Biomotion Lab has become an integral part of the multi-departmental Centre for Vision Research. Its main affiliation is with the Department of Biology in the Faculty of Science.

Students in the Biomotion Lab come from different graduate programs:

  • Biology @ YorkU
  • Psychology @ YorkU
  • Electrical Engineering and Computer Science @ YorkU
  • International Graduate School “The Brain in Action”
  • Centre for Neuroscience Studies @ Queen’s University

Dr. Troje is a core member of the CFREF-funded program “Vision: Science to Application” (VISTA). Other funding comes from:

  • NSERC
  • CFI
  • CIFAR

Other important affiliations include:

  • Canadian Action and Perception Network (CAPnet)
  • Vision Science Society