Hello! I'm Danilo, a Founder Fellow at South Park Commons and previously a Research Engineer at Medivis.
I'm interested in AI that sees what people see, reasons about real-world context, and guides action in real time. I believe AI's superpower is accelerating how quickly people become capable, confident, and effective at a skill: compressing years of training into on-the-job, just-in-time guidance.
My background combines healthcare, augmented reality, and human-centered design. I received my Ph.D. in Computer Science from the University of California, San Diego, where I studied at the Design Lab and HXI Lab.
My Ph.D. dissertation focused on Augmented Reality guidance in the presence of misalignments through the use of contextual aids.
In my research, I use mixed methods to better understand user goals, co-design with stakeholders, and evaluate and explain interventions. I also develop prototypes, tools, and systems, such as tools that facilitate the rapid prototyping of Augmented Reality applications.
From 2021 to 2022, I was honored to hold a Ph.D. fellowship from Intuitive Surgical.
Please reach out to me at d@appliedmind.ai
[Website to be updated in the near future]
Rapid Prototyping in Augmented Reality (2017-2019)
Wound Care Training (2018-2019)
Collaborations are one of the things I enjoy most: they are an exciting way to learn from peers while positively impacting society through multidisciplinary work.
2019
CrowdPortraits / VideoMob @ Port of San Diego - Emily Grenader, Danilo Gasques (Creative Coding / Kinect & Projection Installation).
CSE 118/218 - Ubiquitous Computing / Advanced Software Engineering (FALL 2018/FALL 2019)
CSE 198 - Mixed Reality Dojo (SPRING 2018)
VIS 147A - Computer Technologies for Art (FALL 2017)
Intuitive Surgical - Applied Research Intern (Summer 2019)
Intuitive Surgical - Applied Research Intern (Summer 2018)
Simulation Training Center - Research Fellow (Summer 2017)
Ultrasound landmark registration in an augmented reality environment (2024) (US 20250391125A1)
Danilo Gasques Rodrigues, Osamah Choudhry, Christopher Morley, Long Qian, Joseph Benjamin Horowitz
Various embodiments are directed to generating a contour object for a region of interest represented by medical data corresponding to internal anatomy of a patient. Respective display coordinates are determined by performing a transformation on the medical data of the contour object. The display coordinates correspond to a unified three-dimensional (3D) space of an Augmented Reality (AR) environment. Respective edges of the contour object are displayed according to trace points. The trace points are portrayed in the AR environment as a visual outline in alignment with an ultrasound imagery visualization of the region of interest in the patient's physical internal anatomy. The trace points are registered as internal landmarks for the region of interest in the patient's physical internal anatomy.
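To give a concrete flavor of the transformation step the abstract describes, here is a minimal sketch (not the patented implementation): contour trace points are mapped from medical-image coordinates into the unified AR world frame with a 4x4 homogeneous transform. The function name and the `image_to_ar` matrix are hypothetical.

```python
import numpy as np

def transform_contour_to_ar_space(contour_points_mm, image_to_ar):
    """Map contour trace points from medical-image coordinates into the
    unified 3D space of the AR environment via a 4x4 homogeneous transform.

    contour_points_mm: (N, 3) points outlining the region of interest.
    image_to_ar:       4x4 transform (hypothetical name) from the
                       medical-image frame to the AR world frame.
    """
    homogeneous = np.hstack([contour_points_mm,
                             np.ones((len(contour_points_mm), 1))])  # (N, 4)
    return (image_to_ar @ homogeneous.T).T[:, :3]  # display coordinates, (N, 3)

# Example: a 10 mm square contour, placed 100 mm along x in the AR world.
contour = np.array([[0, 0, 0], [10, 0, 0], [10, 10, 0], [0, 10, 0]], float)
image_to_ar = np.eye(4)
image_to_ar[:3, 3] = [100, 0, 0]  # translation-only registration, for illustration
trace_points = transform_contour_to_ar_space(contour, image_to_ar)
print(trace_points)  # points that would be drawn as the visual outline / landmarks
```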
Calibrationless reference marker system (2024) (US 12396703B1)
Christopher Morley, Danilo Gasques Rodrigues, Long Qian
A reference marker system configured for use with a separate medical device can include a main body and a plurality of reference marker sites. The main body can have outer and inner surfaces and an overall geometry that interacts with a distinctive feature on the separate medical device. The main body can removably couple to the separate medical device along its inner surface at a specific orientation based on the distinctive feature. The reference marker sites can be coupled to the outer surface and configured to host reference markers suitable for use with a separate augmented reality system. The reference marker sites can be distributed across the outer surface at fixed positions relative to each other to form an asymmetrical fixed positional arrangement, which can be known for a medical procedure using the separate medical device without requiring any reference marker calibration based on the specific orientation.
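The core idea (a fixed, asymmetric marker layout known ahead of time, so the device pose follows without per-procedure calibration) can be illustrated with a standard rigid point-set alignment. Below is a minimal sketch, assuming the AR system's tracker reports 3D marker positions and using the textbook Kabsch algorithm; the layout values and names are hypothetical, not the patented design.

```python
import numpy as np

# Known, asymmetric marker layout in the device frame (hypothetical values, mm).
# Because the layout is fixed and asymmetric, it admits a unique rigid alignment.
MARKERS_DEVICE = np.array([
    [0.0, 0.0, 0.0],
    [40.0, 0.0, 0.0],
    [0.0, 25.0, 0.0],
    [15.0, 10.0, 12.0],
])

def estimate_device_pose(markers_observed):
    """Recover the rigid transform (R, t) mapping the known device-frame
    layout onto the marker positions observed by the AR tracker
    (Kabsch / orthogonal Procrustes)."""
    src, dst = MARKERS_DEVICE, np.asarray(markers_observed, float)
    src_c, dst_c = src - src.mean(0), dst - dst.mean(0)  # center both clouds
    U, _, Vt = np.linalg.svd(src_c.T @ dst_c)            # 3x3 covariance SVD
    d = np.sign(np.linalg.det(Vt.T @ U.T))               # guard against reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst.mean(0) - R @ src.mean(0)
    return R, t

# Example: simulate observing the markers after a known rotation + translation.
theta = np.deg2rad(30)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0],
                   [np.sin(theta),  np.cos(theta), 0],
                   [0, 0, 1]])
observed = (R_true @ MARKERS_DEVICE.T).T + np.array([5.0, -2.0, 10.0])
R, t = estimate_device_pose(observed)
print(np.allclose(R, R_true), np.allclose(t, [5.0, -2.0, 10.0]))  # True True
```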
Collaborative mixed-reality system for immersive surgical telementoring (2023) (US 12482192B2)
Nadir Weibel, Michael Yip, Danilo Gasques Rodrigues, Thomas Sharkey, Janet Johnson, Konrad Davis
Systems and methods utilize mixed reality technologies to enable remotely located expert surgeons to instruct novice surgeons as if they were together in the same operating room. An example method may involve: (1) displaying, on a virtual reality (VR) display worn by a first person located in a second geographic location, a 3D virtual reconstruction of a patient located in a first geographic location; (2) determining a spatial relationship between a hand gesture performed by the first person and the 3D virtual reconstruction of the patient; and (3) displaying, on an augmented reality (AR) display worn by a second person located in the first geographic location, a 3D avatar of the hand gesture made to appear in a spatial relationship to the surgical subject that mirrors the determined spatial relationship between the hand gesture and the 3D virtual reconstruction of the patient.
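The "mirrored spatial relationship" in step (3) is plain rigid-body algebra: express the gesture pose relative to the virtual patient, then re-apply that relative pose to the real patient's frame in AR. A minimal sketch with hypothetical 4x4 pose matrices (not the system's actual code):

```python
import numpy as np

def mirror_gesture(T_world_vpatient, T_world_hand, T_ar_patient):
    """Re-express a remote expert's hand gesture in the local AR scene.

    T_world_vpatient: 4x4 pose of the virtual patient reconstruction in the
                      remote VR world frame.
    T_world_hand:     4x4 pose of the expert's hand gesture, same frame.
    T_ar_patient:     4x4 pose of the real patient in the local AR world frame.

    Returns the pose at which to render the hand avatar in AR so that
    avatar-to-patient mirrors hand-to-virtual-patient.
    """
    # Gesture expressed relative to the (virtual) patient ...
    T_patient_hand = np.linalg.inv(T_world_vpatient) @ T_world_hand
    # ... then re-applied relative to the real patient in AR.
    return T_ar_patient @ T_patient_hand

def translation(x, y, z):
    """Helper: a translation-only 4x4 pose."""
    T = np.eye(4)
    T[:3, 3] = [x, y, z]
    return T

# Example: the hand is 20 cm to the virtual patient's right in VR, so the
# avatar lands 20 cm to the real patient's right, wherever that patient is.
avatar = mirror_gesture(translation(0, 0, 0),      # virtual patient at origin
                        translation(0.2, 0, 0),    # hand, 20 cm offset
                        translation(1.0, 0.5, 0))  # real patient in AR
print(avatar[:3, 3])  # -> [1.2 0.5 0. ]
```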
Method and system for facilitating remote presentation or interaction (2020)
Sundar Murugappan, Danilo Gasques Rodrigues, Govind Payyuvula, Simon DiMaio
A facilitation system for facilitating remote presentation of a physical world includes a first object and an operating environment of the first object. The facilitation system includes a processing system configured to obtain an image frame depicting the physical world, identify a depiction of the first object in the image frame, and obtain a first spatial registration registering an object model with the first object in the physical world. The object model is of the first object. The processing system is further configured to obtain an updated object model corresponding to the object model updated with a current state of the first object, and generate a hybrid frame using the image frame, the first spatial registration, and the updated object model. The hybrid frame includes the image frame with the depiction of the first object replaced by a depiction of the updated object model.
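At its simplest, the hybrid-frame generation reduces to masked compositing: keep the camera image everywhere except the pixels covered by the re-rendered, updated object model. A toy sketch, assuming the spatial registration and model rendering happen upstream; all names are hypothetical.

```python
import numpy as np

def compose_hybrid_frame(image_frame, model_render, model_mask):
    """Build a hybrid frame: the camera image with the object's pixels
    replaced by a rendering of the updated object model.

    image_frame:  (H, W, 3) camera image of the physical world.
    model_render: (H, W, 3) updated model rendered from the camera viewpoint
                  (the spatial registration is assumed handled upstream).
    model_mask:   (H, W) boolean mask of pixels the rendered model covers.
    """
    hybrid = image_frame.copy()
    hybrid[model_mask] = model_render[model_mask]  # masked compositing
    return hybrid

# Toy example: the model occupies the top-left 2x2 block of a 4x4 frame.
image = np.zeros((4, 4, 3), np.uint8)
render = np.full((4, 4, 3), 255, np.uint8)
mask = np.zeros((4, 4), bool)
mask[:2, :2] = True
print(compose_hybrid_frame(image, render, mask)[:, :, 0])
```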
Program Committee
Paper reviews (50+ works reviewed):
2026: CHI (Papers)
2025: ISMAR (Papers), CHI* (Papers)
2024: VRST** (Papers), IEEE VR (Papers)
2023: DIS (Papers), CHI* (Papers)
2022: CHI (LBW and Papers), IEEE VR (Conference and Journal Papers), TEI (Papers)
2021: UIST (Papers), CSCW (Papers), CHI (LBW), IUI (LBW), ISMAR (Papers and Posters)
2020: CHI (LBW and Papers), CSCW*, TEI (Committee Member / WIP), UIST (Papers)
2019: CHI (LBW), Creativity & Cognition, IMWUT, ISMAR (Posters), UIST (Papers)
* Special recognition for outstanding reviews
** Highly useful reviews