Danilo Gasques, M.S.

Hello! I am a Computer Science Ph.D. Candidate at UC San Diego (Weibel Lab / Design Lab).

My research focuses on improving remote and co-located surgical guidance through eXtended Reality technology.

In my research, I use mixed methods to better understand user goals, co-design with stakeholders, and evaluate and explain interventions. I also develop prototypes, tools, and systems, such as tools that facilitate the rapid prototyping of Augmented Reality applications.

Since 2020, I am excited to share that my research has been supported by an Intuitive Surgical Ph.D. Research Fellowship.

Please reach out to me at gasques@ucsd.edu

(LinkedIn / Google Scholar / Ello / Resume)

Academic Portfolio

Ultrasound-guided procedure training in Augmented Reality

(more info to come)

Wound Care Training (2018 - 2019)

Misc Collaborations

Collaborations are one of the things I enjoy the most. They are an exciting way of learning from peers while positively impacting society through multidisciplinary work.

CSE 198 - Mixed Reality Dojo (SPRING 2018)

VIS 147A - Computer Technologies for Art (FALL 2017)


Intuitive Surgical - Applied Research Intern (Summer 2019)

Intuitive Surgical - Applied Research Intern (Summer 2018)

Simulation Training Center - Research Fellow (Summer 2017)

Enhanced ultrasound systems and methods (2018)

Preetham Suresh, Danilo Gasques Rodrigues, Nadir Weibel, Elizabeth A. Anderson

Systems, devices, and methods are disclosed for an enhanced ultrasound system. A system may include an ultrasound probe. The system may include processing circuitry communicatively coupled to the ultrasound probe. The system may also include an AR device receiving image information from the processing circuitry and displaying one or more ultrasound images from the ultrasound probe in the field of view of an operator.

Method and system for facilitating remote presentation or interaction (2020)

Sundar Murugappan, Danilo Gasques Rodrigues, Govind Payyuvula, Simon DiMaio

A facilitation system for facilitating remote presentation of a physical world includes a first object and an operating environment of the first object. The facilitation system includes a processing system configured to obtain an image frame depicting the physical world, identify a depiction of the first object in the image frame, and obtain a first spatial registration registering an object model with the first object in the physical world. The object model is of the first object. The processing system is further configured to obtain an updated object model corresponding to the object model updated with a current state of the first object, and generate a hybrid frame using the image frame, the first spatial registration, and the updated object model. The hybrid frame includes the image frame with the depiction of the first object replaced by a depiction of the updated object model.


Program Committee

Paper reviews (50+ works reviewed):

2022: CHI (LBW and Papers), IEEE VR (Conference and Journal Papers), TEI (Papers)

2021: UIST (papers), CSCW (papers), CHI (LBW), IUI (LBW), ISMAR (papers and posters)

2020: CHI (LBW and Papers), CSCW, TEI (Committee member / WIP), UIST (Papers)

2019: CHI (LBW), Creativity & Cognition, IMWUT, ISMAR (Posters), UIST (Papers)

2018: CHI (LBW)

2017: CHI (LBW)