#SurgicalAI

DEEP STITCH

In this project, we focus on automating the assessment of suturing skills during robot-assisted radical prostatectomy (RARP) and virtual reality (VR) simulation through machine learning. Our goal is to determine how suturing skills relate to patient outcomes and whether ideal VR performance translates to optimal RARP outcomes.

Funded by the National Cancer Institute under Award Number R01 CA241758


[Logos: Deep Stitch, MD Anderson, Houston Methodist, Rochester, City of Hope]

DEEP DISSECT

The overarching goal of our proposal is to assess nerve-sparing (NS) surgical performance, the principal modifiable determinant of erectile function (EF) outcomes after robot-assisted radical prostatectomy (RARP). The correlation of NS skills with EF outcomes serves as a test case: that we can quantify NS surgical performance to predict a surgical outcome. We will combine expert consensus-driven assessment (Aim 1) and automated assessment (Aim 2) to observe and define the optimal surgical practice for the preservation of EF. Finally, we will design virtual reality exercises for NS skills with tailored formative feedback, validated on real RARP EF outcomes (Aim 3).

Co-PI: Jim Hu, MD. Funded by the National Cancer Institute under Award Number R01 CA259173

[Logos: Deep Dissect, City of Hope, Cornell, Stanford, Memorial Sloan Kettering]

DEEP FEEDBACK

The interaction between a surgical trainer and a trainee in the operating room (live surgical feedback) is critical for acquiring technical skills and producing skillful surgeons. However, such feedback is inconsistent during formal training and essentially non-existent beyond training, when surgeons have yet to become fully proficient. To address this need, we aim to observe and assess the process of live surgical feedback, conduct randomized controlled trials to optimize feedback for specifically improving surgeon skills, and develop an AI system to deliver automated feedback when experienced human trainers are not available.

Funded by the National Cancer Institute under Award Number R01 CA298988

[Logo: Stanford]

SURGICAL GESTURES

Surgical gestures, defined as the smallest meaningful interaction of a surgical instrument with human tissue, are a novel approach to deconstructing surgery. Having developed and validated a classification system for dissection gestures, we are now linking gestures to surgeon technical skills and patient outcomes. We believe surgical gestures can ultimately benefit surgical education by identifying the combinations of gestures that best accomplish specific surgical tasks.


[Logo: Surgical Gestures]