Industrial - Honours
The Overwatch Framework proposes an alternative future for surgery by studying trends in emerging technologies and extrapolating their near-term effects within the medical domain. While several barriers limit the integration of AI and autonomous robotics into surgical procedures, this project proposes a safe framework to circumvent these barriers, with the aim of saving lives under strained medical conditions.
2020’s COVID-19 outbreak challenged the world’s established social, economic and medical infrastructures. During this time, NVIDIA showcased the CLARA GUARDIAN initiative, which promised AI integration for hospital patient-monitoring applications. The unification of thermal sensing, image recognition and smart sensors was handled through a central processing system, with further integration into surgical workflows proposed. As these technologies slowly assimilate into Operating Rooms (ORs) and combine with existing interfaces, they present the potential to cause cognitive overload. This project forms a building block of a larger body of work intended to make orthopaedic workflows smoother, lower medical error rates, and enable better patient outcomes. The project aimed to outline how emerging technologies can be used to optimise human-robot interactions within a surgical theatre context, thereby addressing cognitive overload and fatigue during a Total Knee Arthroplasty (TKA) procedure.
Each year, as many as 18,000 unnecessary deaths occur due to medical errors, and a further 50,000 people face the prospect of living with a surgery-induced disability. While autonomous robots have begun to be implemented in surgical theatres, collaborative robotics and higher levels of autonomy are avoided because different robots cannot communicate with each other. Medical staff circumvent this issue by performing manual calibration during pre-procedure setup and monitoring robot functions in addition to standard medical tasks. This adds cognitive load, which may result in manual errors and prevents the adoption of additional robots within a theatre. To understand the workflows associated with TKA, a case study was performed – a timeline is presented below.
Surgery 4.0 – the seamless integration of automated tasks, medical imaging and assistive functions
HAIDEGGER, 2019
To accurately design for a future autonomous surgical context, existing autonomy within different industries needed to be researched. Examples of a logistics approach to autonomy, metaverse collaboration initiatives, applied AI, and the requirements for machine learning were reviewed and coded. The absence of an industry standard was noted early on, and thus the Overwatch Framework was proposed as a supporting standard to Dr Jaiprakash’s Surgical Automation and Communications System (SACS) proposal – a roadmap to Surgery 4.0.
Before approaching the design phase, field research was conducted to collect missing data on TKA workflows and the pain points of medical staff. Because of the large team, a broad set of criteria was derived to address the data-collection requirements of today’s medical practices while respecting the intangible aspects of autonomous surgery.
To showcase the proposal, three products were developed to function within the 10-year development plan. The Conquest Tag, Conquest Dock and Overwatch Chronos management tool are novel products designed to collect data from practising surgeons while training the next generation of robotic manipulators through digital mirroring.
From a physical standpoint, the Conquest Tag and Dock are defined primarily as utility devices – they allow surgical technologists to easily access endoscopy tools. The Dock needed to fulfil this function while meeting its passive requirements, such as sterilisation and data tracking, without hindering the medical procedure. The Conquest endoscopy tools were used to demonstrate the SACS proposal and how the Overwatch industry standard could be created.
The Conquest Dock sterilises endoscopy instruments from Stryker’s Conquest tool line while simultaneously collecting contextual and chronological usage patterns for synthetic training. The device is designed to function within the surgical theatres of today and to adapt to future autonomous environments. This data, much like the supporting sensor information collected throughout the data-collection phase, is vital to establishing a clear industry standard. While products such as cameras capture RGB information, the Conquest Dock will monitor how individual handheld tools are used over the course of several thousand surgeries.
The Conquest Tag forms the underlying basis for tool tracking within the Overwatch system. The concept demonstrates how an RFID tag can be attached to a set of manual tools and used to track the usage of each instrument throughout a surgery. This data can be coded to graph the importance of each tool during specific phases and contexts while providing overhead sensors with vital information on each tool for digital twin replication. Designed specifically for Stryker’s Conquest tool line, the Tag clips over the stainless-steel tube and remains in place through friction.
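The tag-based tracking described above could be sketched in code as follows – a minimal, hypothetical model (the class and field names are illustrative, not taken from the project) of logging RFID reads and aggregating each tool’s handling time per surgical phase, the kind of per-phase importance data the Tag is intended to produce:

```python
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class RfidRead:
    tool_id: str       # e.g. a Conquest instrument's tag identifier
    phase: str         # surgical phase active at the time of the read
    duration_s: float  # seconds the tool was in use before returning

def usage_by_phase(reads):
    """Aggregate total handling time per (phase, tool) pair."""
    totals = defaultdict(float)
    for r in reads:
        totals[(r.phase, r.tool_id)] += r.duration_s
    return dict(totals)

# Hypothetical reads from one procedure
reads = [
    RfidRead("conquest-07", "exposure", 42.0),
    RfidRead("conquest-07", "exposure", 18.5),
    RfidRead("conquest-03", "closure", 30.0),
]
print(usage_by_phase(reads))
# Totals such as these could then be graphed to show tool importance per phase.
```

Over several thousand surgeries, aggregates of this shape would give the Overwatch standard the chronological usage patterns the Dock is designed to collect.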
The Conquest Dock is designed to remain largely stationary, yet includes sufficient ergonomic cues for easy transport over short distances. The product is mainly situated within a surgical theatre during an active surgery, or within a medical bay between procedures. Built from a unibody of diecast aluminium and glass, the product blends seamlessly into a medical context and is designed to withstand constant sterilisation procedures. The product uses two displays: one to communicate tool details, and another to indicate active statuses – both of which can be toggled through the Overwatch Chronos application.
Chronos is a GUI created for nurses and doctors to communicate with a larger family of autonomous devices from Phase 3 onwards. Built upon the Overwatch standard, the context assumes that most surgical robots and sensors will feature software connectors to the Chronos UI. The application allows nurses to manage the surgery’s progress and view object detection and decision processes in real time. The software also manages several streams of data, such as LiDAR, RGB, tool usage and motion tracking, while comparing this real-time information to previously collected synthetic data such as CT scans, 3D models and patient history, as seen in figure 40. The implications of the tool extend further with cloud computing, which would allow anaesthetists to manage patient vitals through the same software – providing yet another form of data for synthetic context training.
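The connector assumption above can be sketched as a small, hypothetical interface – none of these class or stream names come from the project; they only illustrate how devices exposing a common contract would let Chronos multiplex several data streams into one timeline:

```python
from typing import Iterator, Protocol

class SensorConnector(Protocol):
    """Hypothetical contract a device exposes to the Chronos UI."""
    stream_name: str
    def readings(self) -> Iterator[dict]: ...

class LidarConnector:
    """Stub standing in for a real LiDAR feed."""
    stream_name = "lidar"
    def readings(self):
        # A real connector would yield live scan frames.
        yield {"frame": 0, "points": 4096}

class ToolUsageConnector:
    """Stub standing in for the Conquest Dock's usage feed."""
    stream_name = "tool_usage"
    def readings(self):
        yield {"tool_id": "conquest-07", "in_use": True}

def merge_streams(connectors):
    """Tag each reading with its source so Chronos can multiplex streams."""
    for c in connectors:
        for r in c.readings():
            yield {"stream": c.stream_name, **r}

for event in merge_streams([LidarConnector(), ToolUsageConnector()]):
    print(event)
```

Under this sketch, adding a new robot or sensor to the theatre means shipping one more connector class rather than modifying Chronos itself, which is the premise behind building on a shared Overwatch standard.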
Both a vision for the future of autonomous surgery and a representation of the fluidity of digital twin applications, the synthetic training environment is the first step towards establishing a training ground for the standardisation of data between different surgical appliances. Data collected from real surgeries will be coded in a similar environment to run case scenarios and train different robots to work together across a variety of procedures, contexts and surgery-specific workflows. In addition, the space can serve as a novel VR or AR training tool for surgeons to review before performing a complex procedure. The digital twin represented in figure 41 showcases a range of novel robotic manipulators as well as a rotary platform and a redesigned circular layout for improved tool delivery in a Level of Autonomy 4+ environment.
The Overwatch Framework proposes an alternative future for surgery by studying trends in emerging technologies and extrapolating their near-term effects within the medical domain. While several barriers limit the integration of AI and autonomous robotics into surgical procedures, this project aimed to propose a safe framework to circumvent these barriers with the hope of saving lives.
Project Overwatch establishes a potential roadmap to creating a system that truly watches over patients as they undergo life-changing surgery – a system that reliably supports medical staff on both a digital and a physical front when our infrastructure is pushed to its limits, as we have seen and know all too well.
Epifanio is a visualisation-focused designer interested in exploring digital innovation within the areas of user experience, product design, and applied AI. He enjoys creating multimodal digital experiences and leverages 3D simulation, photorealism techniques, and motion graphics workflows in his practice.