Fiona Ryan


Hi, I'm Fiona.

I am a second year Computer Science PhD student at the Georgia Institute of Technology, where I am advised by Professor James Rehg within the School of Interactive Computing. My research interests are in computer vision & perception, with applications to understanding social behavior and behavioral health indicators. I am excited about developing video recognition models that understand people and the way they interact with each other and the 3D world around them.

I graduated from Indiana University in 2020 with a B.S. in Computer Science and a B.M. in Clarinet Performance. At IU, I did research in the Music Informatics Lab and the Computer Vision Lab. I also spent two summers interning at Google, where I worked on deep learning for diagnosing skin conditions and on scene understanding for Nest Cams.

News

November 2021 - I accepted an internship at Facebook Reality Labs Research for Summer 2022!

October 2021 - Introducing Ego4D! Glad to have been a part of the social benchmark team and to have led the data collection effort at Georgia Tech.

July 2021 - Our paper Big Self-Supervised Models Advance Medical Image Classification was accepted to ICCV 2021!

August 2020 - I started my PhD in Computer Science at Georgia Tech.

May 2020 - I am interning at Google this summer on the Dermatology team.

May 2020 - I graduated from Indiana University with my B.S. in Computer Science and B.M. in Clarinet Performance.

Research

Modeling Social Behavior

How can we make AI socially intelligent? I am investigating deep learning methods for modeling aspects of social behavior including gaze, attention, and gesture. I'm especially excited about applications of this work to behavioral health and understanding autism.

Musical Score Following

Score alignment algorithms work well for a straightforward performance of a piece, but realistic music practice involves repetition, mistakes, and skips! Our team created an efficient offline score-following algorithm that models realistic, nonconsecutive musical practice. See our paper from the 2019 Sound and Music Computing Conference.
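To give a flavor of the problem, here is a toy dynamic-programming sketch of score alignment that tolerates nonconsecutive moves. This is an illustration only, not the algorithm from the paper: the pitch-list representation, the unit mismatch cost, and the `jump_penalty` parameter are all assumptions made for this example.

```python
def align(score, performance, jump_penalty=0.5):
    """Align each performance note to a score position.

    Staying at the same score position or advancing by one is free;
    any other move (a repeat or a skip) pays `jump_penalty`. A
    mismatched pitch costs 1. Returns one lowest-cost alignment path.
    Toy O(len(performance) * len(score)**2) dynamic program.
    """
    n, m = len(performance), len(score)
    INF = float("inf")
    # dp[t][s]: best cost of aligning performance[:t+1] with note t at score position s
    dp = [[INF] * m for _ in range(n)]
    back = [[0] * m for _ in range(n)]
    for s in range(m):
        dp[0][s] = 0.0 if performance[0] == score[s] else 1.0
    for t in range(1, n):
        for s in range(m):
            match = 0.0 if performance[t] == score[s] else 1.0
            best, arg = INF, 0
            for sp in range(m):
                # free to stay (sp == s) or advance by one (sp == s - 1);
                # everything else is a "jump" (repeat or skip)
                step = 0.0 if sp in (s, s - 1) else jump_penalty
                cost = dp[t - 1][sp] + step
                if cost < best:
                    best, arg = cost, sp
            dp[t][s] = best + match
            back[t][s] = arg
    # backtrack from the cheapest final position
    s = min(range(m), key=lambda j: dp[n - 1][j])
    path = [s]
    for t in range(n - 1, 0, -1):
        s = back[t][s]
        path.append(s)
    return path[::-1]
```

For example, a performance that repeats the opening two notes of a four-note score, `align([60, 62, 64, 65], [60, 62, 60, 62, 64, 65])`, comes back as `[0, 1, 0, 1, 2, 3]`: the backward jump is paid once rather than forcing a mismatched straight-through alignment.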