Directed by Prof. Dr. Nikolaus Troje, the lab is located at York University in Toronto, Ontario.
News Feed
How to turn speech into full-body animation: Saeed’s paper has now appeared in Computer Graphics Forum. @SaGhorbani
@UbisoftLaForge
@YUResearch
@vistayorku
https://doi.org/10.1111/cgf.14734
New review on an ongoing topic: Life Detection from Biological Motion. We haven’t contributed a lot lately, but many other labs have. Here, @NTroje and Dorita Chang provide an update on what's new and exciting. @vistayork @YUResearch @CentreforVisio1
https://doi.org/10.1177/09637214221128252
This is what @biomotionlab grad student @SaGhorbani worked on during his internship with @Ubisoft. Nice feature @twominutepapers @YorkUScience @vistayorku
New Video - Ubisoft’s New AI: Breathing Life Into Games!
https://www.youtube.com/watch?v=Dt0cA2phKfU
#ai #gaming
MIT News features a study that resulted from a collaboration between @BioMotionLab and @projectprakash

After a lifetime of blindness, newly sighted can immediately identify human locomotion
Congratulations, Saeed. I am glad your work is gaining so much visibility. @vistayorku

Check out 'ZeroEGGS' which takes recorded speech and creates gestural movement to match, powered by the brains and research of our teams at @UbisoftLaForge!
Want to play around? It's open source and we'd love to see your results!
📺 https://www.youtube.com/watch?v=EJPdTtVrxHo
A number of international collaborations have recently resulted in new publications @biomotionlab, @vistayork, @YUResearch:
This post features the work of @BioMotionLab graduate student Saeed Ghorbani @SaGhorbani, who did an internship with @Ubisoft and then became an R&D Scientist with @UbisoftToronto. Congratulations, Saeed!
How to translate Speech to Gesture... with style?
Check out our open source model & data, our demo and paper about Zero-shot Example-based Gesture Generation from Speech. 👇
Repo: https://github.com/ubisoft/ubisoft-laforge-ZeroEGGS
Article: https://arxiv.org/abs/2209.07556
Video: https://www.youtube.com/watch?v=EJPdTtVrxHo&ab_channel=UbisoftLaForge
Ashley was also a graduate student in the @BioMotionLab. It is sad that she is leaving physically, but she will remain a much appreciated member of the lab until she defends her thesis later this year. We'll miss you, Ashley.


Saying goodbye to our outgoing CVR trainee rep Ashley Funkhouser! Good luck back in Mississippi! @CentreforVisio1
VISTA core member @NTroje presenting his innovative work on the “Alberti Frame” in his talk, Mug shots: Systematic biases in the perception of facial orientation. Come check out his demo on @VSSMtg Demo Night and say hi to me on the other end of the frame! 👋 🖼 @BioMotionLab
Thanks to @r2rnow for giving me the opportunity to think about my motivations and inspirations.

ICYMI "My education will never be complete. Most professions come with some aspect of lifelong learning, but being a researcher means learning itself is the goal and not just a vehicle to reach some other goal." https://research2reality.com/york-university/nikolaus-troje-meet-researcher-biology/ #meettheresearcher @YUResearch @vistayorku
ICYMI Renaissance architect and mathematician Leon Battista Alberti believed paintings should recreate exactly the view seen through an empty frame. What does this concept have to do with virtual reality? https://research2reality.com/health-medicine/alberti-frame-video-virtual-reality-vision-research/ @YorkUnews @BioMotionLab @YorkUScience @vistayorku
Is cybersickness caused by sensory mismatch? In a new BioMotionLab paper, @SiaEftekharifar shows that motion parallax and stereopsis both contribute, and that mismatches between the two depth cues do not. @vistayorku @YUResearch @YorkUScience

The role of binocular disparity and active motion parallax in cybersickness
R2R produced a well-crafted feature on one of our current lines of research -- a new visual telecommunication system that we developed to study the dynamics of eye and head gaze during video communication. @vistayorku @YUResearch @YorkUScience

{New video} For all the good that tools like Zoom and Skype have done, we miss something fundamental when using them. Can it be replicated? https://research2reality.com/health-medicine/video-conferencing-zoom-skype-facial-recognition-pandemic-communication/ @YorkUnews @BioMotionLab @YUResearch @jeff_schall @NTroje @J_D_Crawford @yorkuniversity @vistayorku
The Virtual Vision Futures conference kicked off this morning with opening remarks from @Henrique5Denise and the first presentation from VISTA trainee @rqgastrock on 'Using a mirror reversal task to investigate de novo learning and distinguish it from motor adaptation'!
#YorkU
Join us for the upcoming Virtual Vision Futures Conference (June 14 - 17, 2021) for three days of great talks and workshops. More details: https://www.yorku.ca/cvr/virtual-vision-futures-conference/
{New video} What would happen if you could actually upload expertise from a person directly into a machine? https://research2reality.com/health-medicine/vision-science-neuroscience-vista-york-university-research/ @YorkUnews @NTroje @YorkMRIfacility @yorkuniversity @vistayorku @Sergio_lab_York @BioMotionLab @YUResearch
Compelling research by VISTA trainee @AnneThaler_ and Core member @NTroje looked at how sex-specific differences in walking style relate to the perceived attractiveness and confidence of male and female virtual characters. http://bit.ly/2Q0cD2c https://www.youtube.com/watch?v=rns_E8WVuIU
Adam Bebko and Anne Thaler gave a very well-crafted tutorial on bmlTUX today at the @IEEEVR conference. If you missed it, check out the recording on YouTube: https://youtu.be/-1dh_U5RcKw. The tutorial starts at 4:56. @vistayorku @YorkUScience
BioMotionLab is hiring. We are looking for a full-time developer with experience in both Unity and iOS development, particularly ARKit. If you are at @IEEEVR this week, feel free to talk to Niko Troje, Adam Bebko, or Anne Thaler. Or just check out https://www.biomotionlab.ca/join/
BML members Adam Bebko and Anne Thaler are presenting new tools @IEEEVR. Here is the teaser for tomorrow's presentation on a tool that animates AMASS data in Unity. Poster session starts at 10am EST. https://www.youtube.com/watch?v=5uyAHu2ow08
@vistayorku, @YorkUScience
Sia Eftekharifar just defended his PhD thesis on the role of motion parallax and binocular disparity in presence, cybersickness, and restoration in VR. New papers coming soon. Congratulations @SiaEftekharifar.
Biological motion processing is impressively resilient to aberrant early visual experience, according to a very successful collaboration between the BioMotionLab and Brigitte Röder's research group at University of Hamburg. @vistayorku @YorkUScience @unihh
https://www.eneuro.org/content/eneuro/early/2020/10/15/ENEURO.0534-19.2020.full.pdf
As the new term has started, there are new students, too: Andres and Romina are new 4th-year honours thesis students, Ashley joins as an MSc candidate, and Viswajit, who has been with us as an RA, is now also an MSc student. Welcome all! @vistayorku @YorkUScience
BML graduate student Saeed Ghorbani leads the publication of a new method for realistic human motion modelling with natural variation, to be used for crowd animation and other applications. The paper will be presented at ACM SIGGRAPH/EG SCA. @vistayorku @YorkUScience http://diglib.eg.org/handle/10.1111/cgf14116
In a very successful collaboration with G. Ross and R. Graham's group @uOttawa, we are using linear acceleration and angular velocity of simple movements to discriminate between expert and novice athletes at accuracies >80% (see the sketch below). @vistayorku @YorkUScience

Classifying Elite From Novice Athletes Using Simulated Wearable Sensor Data
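For readers curious about the shape of such a pipeline, here is a minimal sketch of the general idea, not the pipeline used in the paper: summary features of linear acceleration and angular velocity fed to an off-the-shelf classifier. The feature set, the numbers, and the choice of a random forest are illustrative assumptions, with synthetic data standing in for real sensor recordings.

```python
# Minimal sketch of the general approach (not the paper's pipeline):
# summary features of linear acceleration and angular velocity,
# fed to an off-the-shelf classifier. All numbers are synthetic.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 200  # hypothetical movement trials per group

# Toy features per trial: peak acceleration (m/s^2), acceleration
# variance, peak angular velocity (rad/s), angular-velocity variance.
novice = rng.normal([9.0, 2.0, 3.0, 1.0], 1.0, size=(n, 4))
elite = rng.normal([10.0, 1.4, 3.6, 0.6], 1.0, size=(n, 4))

X = np.vstack([novice, elite])
y = np.array([0] * n + [1] * n)  # 0 = novice, 1 = elite

clf = RandomForestClassifier(n_estimators=100, random_state=0)
print("mean CV accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```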
Another new paper from BML: Those who look down a virtual cliff in #VR know that they are perfectly safe. Yet, their body responds with real fear. Sia Eftekharifar investigates what drives that response. @vistayorku, @YorkUScience @CentreforVisio1 @Sia_Eft
https://www.ingentaconnect.com/content/ist/jpi/pre-prints/content-jpi_0129
New paper! Our toolbox for designing and executing factorial, trial-based experiments in Unity3D just appeared as a journal article in i-Perception. Congratulations to Adam Bebko!
@vistayorku, @YorkUScience, @PerceptionSAGE, @unity3d https://www.biomotionlab.ca/tux/
Kudos to Adam Bebko. His workshop on bmlTUX, the BML Toolkit for Unity Experiments, given at @VSSMtg, drew an impressive crowd. For those who missed it, the tutorials are available online. @vistayorku, @YorkUScience, @InnovationYork
Congratulations, Michael et al. I remember this paper very well and often give it to students. Now I will also recommend MJB's notes (below) about its history. The most important message: "Do the work that you think is important and hopefully the community will agree in time".
I’m honored to share the 2020 Longuet-Higgins prize with @DeqingSun and Stefan Roth. It is given at #CVPR2020 for work from #CVPR 2010 that has withstood the test of time. I’ve written a blog post about the secrets behind “The Secrets of Optical Flow”: https://tinyurl.com/y89odrwf
. @Sia_Eft, @AnneThaler_, @SaGhorbani, @yorkuniversity, @CentreforVisio1, @vistayorku
Today's lab meeting at @BioMotionLab
with Xiaoye Michael Wang, Niko Troje, Ashley Funkhouser, Viswajet Kumar, Max Esser, Adam Bebko, Sia Eftekharifar, Anne Thaler, Saeed Ghorbani.
Using #VirtualReality in trial-based, factorial experiments in #VisionResearch can be cumbersome. In the BioMotionLab @yorkuniversity, we developed a toolbox for @unity3d that makes experimental design and execution easy and convenient. For details visit https://biomotionlab.ca/tux/
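bmlTUX itself is a C# toolkit for Unity. As a language-neutral illustration of what a factorial, trial-based design boils down to, here is a short Python sketch of the underlying idea; the factor names below are hypothetical and are not part of the bmlTUX API.

```python
# Illustration of the core idea behind factorial, trial-based designs:
# cross all factor levels, repeat, and shuffle into a trial table.
# The factors below are hypothetical; bmlTUX itself is C#/Unity.
import itertools
import random

factors = {
    "stimulus": ["point-light walker", "stick figure"],
    "viewing_distance_m": [1.0, 2.0, 4.0],
    "stereo": [True, False],
}
repetitions = 2

trials = [
    dict(zip(factors.keys(), levels))
    for levels in itertools.product(*factors.values())
] * repetitions
random.shuffle(trials)

for i, trial in enumerate(trials, start=1):
    print(i, trial)  # 2 x 3 x 2 levels x 2 reps = 24 trials
```

The point of such a toolkit is that this trial table, its randomization, and the execution loop are generated for you from a declarative description of the factors.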
https://jov.arvojournals.org/article.aspx?articleid=2764419
Cover story of the current @ARVOJOV issue: Using a new tool developed @BioMotionLab, we work with a magic frame that behaves like a picture or a window to study perception of pictures vs perception in the world. @jlugiessen, @YUResearch, @vistayorku
Here's my column on parks as the fulcrum points in the "constant, experimental push-pull between social norms and legal restrictions, trust and compulsion, freedom and constraint" -- and the pandemic-fighting importance of keeping them open.
BMLmovi, a new database from @BioMotionLab with 9h mocap + 17h synchronized and calibrated video + 7h of IMU + MoSh reconstructed body shape is now up on Dataverse. Help yourself and go nuts. @dataverseorg @YorkUScience @yorkuniversity @vistayorku
https://dataverse.scholarsportal.info/dataverse/MoVi
And here is Michael looking at the big screen in the main conference hall of @IEEEVR https://twitter.com/AnneThaler_/status/1243275642906574848
Last day of the virtual @IEEEVR meeting. Michael Xiaoye Wang's avatar is looking at Anne Thaler's poster. Amazing how the conference organizers switched from real to virtual with just two weeks' notice. Anne's and Michael's papers are here: https://www.biomotionlab.ca/publications/
The hand tracking is cool. We are still working on it here in the @BioMotionLab @vistayorku @YUResearch. But we are getting close.
Here is a wonderful visual illusion that achieves what visual illusions are supposed to achieve: it challenges your trust in your very own senses.
I have finally completed an illusion that appears to move in three dimensions.
The cube appears to rotate, doesn't it?
It is standing still.
Happy Women and Girls in Science Day #WomenInScience
https://www.un.org/en/observances/women-and-girls-in-science-day/
Perception goes Twitter! I still consider it one of the finest journals in our field, as it stays faithful to its almost 50-year-old tradition of publishing unconventional, creative, adventurous research.
Hello world!
And hello 2020. This year Perception is 48 years old, and its open access sister journal i-Perception turns 10. For the new decade we have a shiny new Twitter account for papers, discussion & news from both journals on Special Issues, prizes, new formats, and more.
This is a gem! Not only may it change how we teach high school students to solve quadratic equations, but we should also point out to them that new, exciting discoveries may not require deep digging, but rather open eyes and a curious mind to spot them.
A new way to make quadratic equations easy - Cool trick. https://www.technologyreview.com/s/614775/a-new-way-to-make-quadratic-equations-easy/
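For those who don't click through: the trick described in the article (attributed to Po-Shen Loh) replaces the memorized quadratic formula with a short symmetry argument. A worked example, using x^2 - 6x + 8 = 0 as an illustrative case:

```latex
% The roots of x^2 - Bx + C = 0 sum to B, so write them as B/2 - u and B/2 + u.
% Their product must equal C, which pins down u.
% Example: x^2 - 6x + 8 = 0, i.e., B = 6, C = 8.
\[
  x = 3 \pm u, \qquad (3 - u)(3 + u) = 9 - u^2 = 8
  \;\Rightarrow\; u = 1 \;\Rightarrow\; x = 2 \text{ or } x = 4.
\]
```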
Congratulations to Johannes Kurz from @jlugiessen, who collaborates with @BioMotionLab. His new paper shows how body structure helps to interpret body motion in the context of soccer penalty shooting. Ask us for a reprint if you don't have access.
http://link.springer.com/article/10.3758/s13414-019-01883-5
Happy to have Larry Maloney here, who arrived for the first of a series of visits supported by a @vistayorku travel grant. He excited the students in the @BioMotionLab with his crystal-clear talks and then gave another seminar to the larger CVR community.
Our sense of touch is at the center of nearly everything we do. Katherine J. Kuchenbecker, Director of the #HapticIntelligence Department, aims to sharpen our understanding of haptic human-machine interaction. She was interviewed @IROS2019MACAU #IROS2019:
#iccv19 reached a productive and happy ending. Many thanks to our amazing colleagues @PerceivingSys and our wonderful coauthors for #amass.
Naureen and Nima working hard to serve the crowds at @ICCV2019 today 10:30-13:00, Hall B, Poster 102. #AMASS: Archive of Motion Capture As Surface Shapes. More here:
It was very nice to reconnect with the @PerceivingSys group at @ICCV19 in #Seoul.

Our first ever alumni event in #Seoul, #Korea @ICCV19! Thank you @Michael_J_Black and all the great helpers and people who make @PerceivingSys what it is - a great place with great people
Congratulations to Anja Cui, who successfully defended her PhD thesis today. Anja, I am very happy you contributed so many years to life in the BioMotion Lab. Congrats on your new postdoc position in Janet Werker's lab @UBC.
https://www.biomotionlab.ca/anja-xiaoxing-cui/
Goals
Our main research interest concerns the nature of perceptual representations. How can a stream of noisy nerve cell excitations possibly be turned into the coherent and predictable perception of a “reality”? We work on questions involving the processing of sensory information, perception, cognition, and communication.
Topics
People perception: The biology and psychology of social recognition
- detection of animate agents
- conspecific recognition
- gender recognition
- individual recognition
- action recognition
- recognition of emotions, personality traits and intentionality
- recognition of bodies, faces, and biological motion
Vision in virtual reality
- pictorial vs physical spaces
- space perception
- simulator sickness
- perception of self-motion (vection)
- multisensory integration
- perception of one's own body
- the nature of presence
Visual ambiguities and perceptual biases
- depth ambiguities
- the “facing-the-viewer” bias
Network
Since moving to York University in 2018, the Biomotion Lab has become an integral part of the multi-departmental Centre for Vision Research. Its main affiliation is with the Department of Biology in the Faculty of Science.
Students in the Biomotion Lab come from different graduate programs:
- Biology @ YorkU
- Psychology @ YorkU
- Electrical Engineering and Computer Science @ YorkU
- International Graduate School “The Brain in Action”
- Centre for Neuroscience Studies @ Queen’s University
Dr. Troje is a core member of the CFREF-funded program “Vision: Science to Application” (VISTA).