The Cognitive Vision in Robotic Surgery Lab is developing computer vision and AI techniques for intraoperative navigation and real-time tissue characterisation.

Head of Group

Dr Stamatia (Matina) Giannarou

411 Bessemer Building
South Kensington Campus

+44 (0) 20 7594 8904

What we do

Surgery is undergoing rapid change, driven by recent technological advances and the ongoing pursuit of early intervention and personalised treatment. We are developing computer vision and artificial intelligence techniques for intraoperative navigation and real-time tissue characterisation during minimally invasive and robot-assisted operations, to improve both the efficacy and safety of surgical procedures. Our work aims to revolutionise the treatment of cancers and pave the way for autonomous robot-assisted interventions.

Why is it important?

With recent advances in medical imaging, sensing, and robotics, surgical oncology is entering a new era of early intervention, personalised treatment, and faster patient recovery. The main goal is to completely remove cancerous tissue while minimising damage to surrounding areas. However, achieving this can be challenging, often leading to imprecise surgeries, high re-excision rates, and reduced quality of life due to unintended injuries. Therefore, technologies that enhance cancer detection and enable more precise surgeries may improve patient outcomes.

How can it benefit patients?

Our methods aim to ensure patients receive accurate and timely surgical treatment while reducing surgeons' mental workload, overcoming the limitations of human perception, and minimising errors. By improving tumour excision, our hybrid diagnostic and therapeutic tools will lower recurrence rates and enhance survival outcomes. More complete tumour removal will also reduce the need for repeat procedures, improving patients' quality of life and life expectancy and benefiting society and the economy.

Meet the team

Citation

BibTeX format

@inproceedings{Ye:2016:10.1007/978-3-319-46720-7_45,
author = {Ye, M and Zhang, L and Giannarou, S and Yang, G-Z},
doi = {10.1007/978-3-319-46720-7_45},
pages = {386--394},
publisher = {Springer},
title = {Real-Time 3D Tracking of Articulated Tools for Robotic Surgery},
url = {http://dx.doi.org/10.1007/978-3-319-46720-7_45},
year = {2016}
}

RIS format (EndNote, RefMan)

TY  - CPAPER
AB - In robotic surgery, tool tracking is important for providing safe tool-tissue interaction and facilitating surgical skills assessment. Despite recent advances in tool tracking, existing approaches are faced with major difficulties in real-time tracking of articulated tools. Most algorithms are tailored for offline processing with pre-recorded videos. In this paper, we propose a real-time 3D tracking method for articulated tools in robotic surgery. The proposed method is based on the CAD model of the tools as well as robot kinematics to generate online part-based templates for efficient 2D matching and 3D pose estimation. A robust verification approach is incorporated to reject outliers in 2D detections, which is then followed by fusing inliers with robot kinematic readings for 3D pose estimation of the tool. The proposed method has been validated with phantom data, as well as ex vivo and in vivo experiments. The results derived clearly demonstrate the performance advantage of the proposed method when compared to the state-of-the-art.
AU - Ye,M
AU - Zhang,L
AU - Giannarou,S
AU - Yang,G-Z
DO - 10.1007/978-3-319-46720-7_45
EP - 394
PB - Springer
PY - 2016///
SN - 0302-9743
SP - 386
TI - Real-Time 3D Tracking of Articulated Tools for Robotic Surgery
UR - http://dx.doi.org/10.1007/978-3-319-46720-7_45
UR - http://hdl.handle.net/10044/1/42513
ER -
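
The abstract above outlines a pipeline that uses a CAD model and robot kinematics to predict where tool parts should appear in the image, rejects 2D detections that disagree with that prediction, and fuses the remaining inliers with kinematic readings for 3D pose estimation. The Python sketch below is a minimal illustration of that general idea, not the authors' implementation: the function names, the pixel threshold, and the simple weighted fusion of translations are assumptions made for illustration only.

import numpy as np

def project(points_3d, K, R, t):
    """Project 3D CAD keypoints into the image with intrinsics K and pose (R, t)."""
    cam = (R @ points_3d.T + t.reshape(3, 1)).T      # camera-frame coordinates
    uv = (K @ cam.T).T
    return uv[:, :2] / uv[:, 2:3]                    # perspective division

def reject_outliers(detected_uv, predicted_uv, thresh_px=8.0):
    """Keep 2D detections that agree with the kinematics-predicted projection."""
    err = np.linalg.norm(detected_uv - predicted_uv, axis=1)
    return err < thresh_px

def fuse_translation(t_kin, t_vis, inlier_ratio):
    """Blend vision-based and kinematics-based translation, trusting vision more when detections are consistent."""
    return inlier_ratio * t_vis + (1.0 - inlier_ratio) * t_kin

# Toy usage with synthetic data (all values illustrative).
K = np.array([[800.0, 0.0, 320.0], [0.0, 800.0, 240.0], [0.0, 0.0, 1.0]])
cad_pts = np.array([[0.0, 0.0, 0.0], [0.01, 0.0, 0.0], [0.0, 0.01, 0.0], [0.0, 0.0, 0.01]])  # metres
R_kin, t_kin = np.eye(3), np.array([0.0, 0.0, 0.10])          # pose prior from robot kinematics
pred_uv = project(cad_pts, K, R_kin, t_kin)
det_uv = pred_uv + np.random.normal(0.0, 1.0, pred_uv.shape)  # simulated 2D detections
inliers = reject_outliers(det_uv, pred_uv)
t_vis = t_kin + np.array([0.001, 0.0, -0.002])                # stand-in for a vision-based estimate
t_fused = fuse_translation(t_kin, t_vis, inliers.mean())
print("inliers:", int(inliers.sum()), "fused translation:", t_fused)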

Contact Us

General enquiries

Facility enquiries


The Hamlyn Centre
Bessemer Building
South Kensington Campus
Imperial College
London, SW7 2AZ