We use perceptual methods, AI, and frugal innovation in robotics to deliver transformative diagnostic and treatment solutions.
Head of Group
B415B Bessemer Building
South Kensington Campus
+44 (0)20 3312 5145
What we do
The HARMS lab leverages perceptually enabled methodologies, artificial intelligence, and frugal innovation in robotics (such as soft surgical robots) to deliver transformative solutions for diagnosis and treatment. Our research is driven by both problem-solving and curiosity, aiming to build a comprehensive understanding of the actions, interactions, and reactions occurring in the operating room. We focus on using robotic technologies to facilitate procedures that are not yet widely adopted, particularly in endoluminal surgery, such as advanced treatments for gastrointestinal cancer.
Meet the team
Search results
- Patent: Itkowitz B, Mylonas GP, Zhao W, et al., 2012, Method and system for stereo gaze tracking, EP 2774380 A1 (Sep 16, 2015; code: RIN1; event: Inventor (correction))
- Conference paper: Mylonas GP, Totz J, Vitiello V, et al., 2012, A Novel Low-Friction Manipulator for Bimanual Joint-Level Robot Control and Active Constraints, IROS 2012
- Conference paper: Clancy NT, Mylonas GP, Yang GZ, et al., 2011, Gaze-contingent autofocus system for robotic-assisted minimally invasive surgery, EMBC 2011. Abstract: A gaze-contingent autofocus system using an eye-tracker and liquid lens has been constructed for use with a surgical robot, making it possible to change focus rapidly (within tens of milliseconds) using only eye control. This paper reports the results of a user test comparing the eye-tracker to a surgical robot's in-built mechanical focusing system. In the clinical environment, this intuitive interface removes the need for an external mechanical control and improves the speed at which surgeons can make decisions based on the visible features. Possible applications include microsurgery and gastrointestinal procedures where the object distance changes due to breathing and/or peristalsis. (An illustrative sketch of a gaze-contingent focusing loop follows this list.)
- Conference paper: James DRC, Leff D, Orihuela-Espina F, et al., 2011, Influence of heart rate and stress on cortical haemodynamics associated with learning: a longitudinal functional Near Infrared Spectroscopy (fNIRS) study, Organization for Human Brain Mapping 2011
- Conference paper: James DRC, Mylonas GP, et al., 2011, The role of the prefrontal cortex (PFC) in naïve complex motor skills learning: an fNIRS study, Organization for Human Brain Mapping 2011
- Conference paper: James DRC, Mylonas GP, et al., 2011, Neuroergonomic assessment of collaborative gaze control for robotic surgery: an fNIRS study, Organization for Human Brain Mapping 2011
- Conference paper: Mylonas G, Sun LW, Kwok K-W, et al., 2011, Collaborative Gaze Channelling for Cooperation within a Shared Tele-Surgery Environment, Hamlyn Symposium
- Conference paper: Fujii K, Mylonas GP, Yang G-Z, 2011, Stealth Calibration Eye Tracking Algorithm for Minimally Invasive Surgery, Hamlyn Symposium 2011
- Conference paper: Noonan DP, Mylonas GP, Shang J, et al., 2010, Gaze contingent control for an articulated mechatronic laparoscope, pp. 759-764
- Conference paper: Paggetti G, Menegaz G, Leff D, et al., 2010, An Assessment of Parietal Function during Depth Perception and Coordination in Surgical Robotics, 16th Annual Meeting of the Organization for Human Brain Mapping (HBM)
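The gaze-contingent autofocus entry above does not reproduce the authors' implementation, but the core idea, driving camera focus from the surgeon's fixation point, can be illustrated with a simple contrast-maximisation loop. The sketch below is a minimal, assumption-laden illustration rather than the paper's method: the `grab_frame` callback, the candidate lens settings, and the variance-of-Laplacian sharpness measure are all hypothetical stand-ins for the real eye-tracker and liquid-lens interfaces.

```python
import numpy as np

def sharpness(patch: np.ndarray) -> float:
    """Contrast measure: variance of a discrete Laplacian over a grayscale patch."""
    lap = (-4.0 * patch[1:-1, 1:-1]
           + patch[:-2, 1:-1] + patch[2:, 1:-1]
           + patch[1:-1, :-2] + patch[1:-1, 2:])
    return float(lap.var())

def gaze_contingent_autofocus(grab_frame, gaze_xy, lens_settings, half_window=32):
    """Sweep candidate lens settings and keep the one that maximises sharpness
    in a small window centred on the current gaze fixation.

    grab_frame(setting) -> 2-D numpy array: hypothetical stand-in for capturing
    a frame with the liquid lens driven to `setting`.
    """
    x, y = gaze_xy
    best_setting, best_score = None, float("-inf")
    for setting in lens_settings:
        frame = grab_frame(setting)
        patch = frame[max(y - half_window, 0): y + half_window,
                      max(x - half_window, 0): x + half_window]
        if patch.shape[0] < 3 or patch.shape[1] < 3:
            continue  # fixation too close to the image border for this window
        score = sharpness(patch)
        if score > best_score:
            best_setting, best_score = setting, score
    return best_setting

# Toy usage: a synthetic "camera" that is sharpest when the lens setting is 0.5.
if __name__ == "__main__":
    texture = np.random.default_rng(0).random((240, 320))

    def grab_frame(setting, true_focus=0.5):
        k = int(round(abs(setting - true_focus) * 20)) + 1  # crude box-blur radius
        kernel = np.ones(k) / k
        img = np.apply_along_axis(lambda r: np.convolve(r, kernel, mode="same"), 1, texture)
        return np.apply_along_axis(lambda c: np.convolve(c, kernel, mode="same"), 0, img)

    best = gaze_contingent_autofocus(grab_frame, gaze_xy=(160, 120),
                                     lens_settings=np.linspace(0.0, 1.0, 11))
    print("best lens setting:", best)
```

In a real system the exhaustive sweep would be replaced by a small search (or the lens's own depth response) around the current setting, so that refocusing completes within the tens-of-milliseconds budget reported in the paper.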
This data is extracted from the Web of Science and reproduced under a licence from Thomson Reuters. You may not copy or re-distribute this data in whole or in part without the written consent of the Science business of Thomson Reuters.
Contact Us
The Hamlyn Centre
Bessemer Building
South Kensington Campus
Imperial College
London, SW7 2AZ
Map location