Amplitude-Aware Deep Learning-Based Tool Tip Localization in Raw Photoacoustic Channel Data
Affiliation Type:
Academia
Keywords:
Deep learning, Photoacoustic imaging, Surgical tool tracking system, Surgical guidance
Abstract:
Photoacoustic imaging is a promising modality for real-time surgical guidance to visualize critical structures during minimally invasive procedures. Deep learning-based visual servoing systems utilizing raw photoacoustic channel data have successfully tracked catheter tips in cardiac interventions. However, existing deep learning methods prioritize the proximal waveform and risk decreased generalizability due to low-amplitude artifacts. Therefore, we present an approach that leverages amplitude-aware training to improve surgical tool tip localization using raw photoacoustic channel data. A Faster R-CNN network was trained on 20,000 k-Wave-simulated frames and tested with experimental data. The system achieved mean absolute tracking errors of 0.42 mm and 0.91 mm in the axial and lateral image dimensions, respectively, with a frame rate of 10.9 Hz, which is compatible with our 10 Hz laser pulse repetition frequency. The proposed system promises to provide real-time guidance for tracking surgical tool tips while minimizing erroneous detections caused by low-amplitude signals and artifacts at multiple locations.