Robotics For COVID-19
Update June 5, 2020
Call for Papers: We are organizing a special issue in the journal Frontiers in Robotics and AI on Robotics, Autonomous Systems and AI for Nonurgent/Nonemergent Healthcare Delivery During and After the COVID-19 Pandemic. Frontiers has waived article processing charges until 31st July 2020 for manuscripts submitted in response to the COVID-19 pandemic and established a priority peer-review process. (Frontiers in Robotics and AI)
Update May 6, 2020
Mahdi Tavakoli, Jay Carriere, Ali Torabi, Robotics, Smart Wearable Technologies, and Autonomous Intelligent Systems for Healthcare During the COVID‐19 Pandemic: An Analysis of the State of the Art and Future Vision, Advanced Intelligent Systems, 2020. (Wiley Online Library) (Postprint)
We are very keen to leverage our robotics research to help the global healthcare community in the fight against COVID-19. Please do contact us if we can be of any help.
How Can We Help Right Now?
Telehealth to assist frontline healthcare workers
In times of extreme strain on the healthcare system, such as during the coronavirus pandemic, robotic systems can significantly reduce the risk of infectious disease transmission to frontline healthcare workers by making it possible to triage, evaluate, monitor, and treat patients from a safe distance. Our research lab specializes in digital health solutions, including telehealth/telepresence technologies, and wishes to apply them to the current health crisis to enable safer and more effective healthcare service delivery. Our lab also has expertise in wheeled telepresence robots that can include a manipulator arm, allowing for virtual face-to-face patient assessment and enabling healthcare staff to perform diagnostic testing remotely (e.g., remotely measuring a patient’s temperature). We can formulate telehealth technology solutions for the current COVID-19 pandemic that allow curbside screening of patients while healthcare staff remain at a safe distance. Enabling remote screening of patients will reduce the contact time between patients and frontline healthcare workers and, critically during the COVID-19 pandemic, can reduce the use of facemasks and other personal protective equipment during patient intake.
Automation of lab and testing services
Robots can also automate manual operations that are labour-intensive, time-consuming and repetitive in order to reduce the burden on frontline healthcare workers. Because robots can provide highly precise, reproducible, fast and controlled maneuvers, they can facilitate a much higher throughput in lab testing and sample analysis, hospital equipment and environment sterilization/sanitization, and pharmacy services. While task-specific programming of robots is challenging and requires extensive computer programming know-how, our lab specializes in rapid on-demand robotic automation based on Learning from Demonstration (LfD), where any healthcare worker (e.g., a nurse or lab worker) can “demonstrate” a given task to the robot by physically guiding it through the desired motions (without the need for any programming of the robot) and the robot “learns” the ability to reproduce (i.e., imitate) the task on its own. This is highly advantageous during the COVID-19 pandemic when robotic systems need to be highly flexible and repurposable to best meet the highly dynamic day-to-day challenges of the healthcare system.
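As a minimal illustration of the Learning from Demonstration idea described above (not the lab's actual implementation), the sketch below time-normalizes several kinesthetic demonstrations of a task and averages them into a single reference trajectory the robot can reproduce. The resampling length and the simple mean-trajectory model are assumptions chosen for brevity.

```python
import numpy as np

def learn_from_demonstrations(demos, n_points=100):
    """Learn a reference trajectory from several kinesthetic demonstrations.

    Each demonstration is an (N_i, D) array of end-effector positions recorded
    while a person physically guides the robot. Demonstrations are time-
    normalized by resampling to a common length, then averaged.
    """
    resampled = []
    for demo in demos:
        demo = np.asarray(demo, dtype=float)
        # Normalize time to [0, 1] and resample each coordinate to n_points.
        t_old = np.linspace(0.0, 1.0, len(demo))
        t_new = np.linspace(0.0, 1.0, n_points)
        resampled.append(np.column_stack(
            [np.interp(t_new, t_old, demo[:, d]) for d in range(demo.shape[1])]
        ))
    # The "learned" task is the mean trajectory across demonstrations.
    return np.mean(resampled, axis=0)

# Two noisy demonstrations of the same straight-line reach in 2D.
rng = np.random.default_rng(0)
demo_a = np.column_stack([np.linspace(0, 1, 50), np.zeros(50)]) + rng.normal(0, 0.01, (50, 2))
demo_b = np.column_stack([np.linspace(0, 1, 80), np.zeros(80)]) + rng.normal(0, 0.01, (80, 2))
trajectory = learn_from_demonstrations([demo_a, demo_b])
print(trajectory.shape)  # (100, 2)
```

In practice LfD systems use richer models (e.g., dynamic movement primitives or Gaussian mixture regression) so the learned skill generalizes beyond the demonstrated start and goal points, but the demonstrate-then-reproduce workflow is the same.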
COVID-19 Healthcare Applications Supported by Our Expertise
Telehealth activities involving evaluation, monitoring, or intervention
- Using a wheeled mobile manipulator (WMM) allowing front-line healthcare workers to evaluate and triage patients before they enter the hospital (e.g., allowing nursing staff to do a curbside interview with a patient and take their temperature while the patient remains in the car).
- Using a WMM supporting live audio-video interaction with hospitalized patients, with the ability to remotely control a camera or measure patient vitals, especially for patients in isolation.
- Using a WMM supporting the monitoring of a large number of patients in non-hospital emergency health environments such as gymnasiums, tents, etc.
- Using a WMM for taking swab samples and administering medications from a safe distance.
- Low-cost audio-video-enabled systems to interact with patients in residential settings.
Automation for healthcare
- Using a mobile robot to autonomously disinfect and clean healthcare facilities as needed.
- Using a mobile robot to autonomously and rapidly deliver healthcare materials such as lab supplies, test samples, or personal protective equipment when time is of the essence.
- Using stationary manipulators or WMMs to collaborate with healthcare workers to lift heavy objects or handle lab samples and biological/infectious materials.
- Rapidly deploying robots to automate any repetitive task in a healthcare environment through LfD-based hands-on kinesthetic teaching. This includes automation of repetitive lab-based analyses for higher throughput.
Key Areas of Our Research
Our lab is devoted to the design, control, and adoption of
- Vision-guided autonomous mobile and fixed-base robotic manipulators (stand-alone robots)
- Semi-autonomous assistive mobile and fixed-base robotic manipulators (collaborative robots)
- Remote-operated mobile and fixed-base robotic manipulators (telerobots)
- Kinesthetically programmed mobile and fixed-base robotic manipulators
- Simulators/trainers encompassing robots and virtual-reality and augmented-reality displays
Examples of Our Research
|A remotely operated (fixed-base) manipulator|
A pick and place task
A pick-and-place task using a multiple-degrees-of-freedom manipulator controlled from an Xbox hand controller is shown. The remote control is intuitive and effective and eliminates the need for the user’s presence in a hazardous environment, making it well suited to COVID-19-related tasks.
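A typical way to turn hand-controller input into robot motion (a sketch, not the controller used in our system) is to map normalized stick deflections to Cartesian end-effector velocity commands, with a deadzone to suppress stick drift and a speed cap for safety. The `DEADZONE` and `MAX_SPEED` values below are assumed for illustration.

```python
import numpy as np

DEADZONE = 0.1      # ignore small stick drift near center
MAX_SPEED = 0.05    # m/s cap on commanded end-effector speed (assumed)

def stick_to_velocity(axes):
    """Map normalized gamepad axes in [-1, 1] to a Cartesian velocity command.

    axes: (x, y, z) stick deflections. Deflections inside the deadzone are
    zeroed, and full deflection commands MAX_SPEED along that axis.
    """
    v = np.asarray(axes, dtype=float)
    v = np.where(np.abs(v) < DEADZONE, 0.0, v)
    return np.clip(v, -1.0, 1.0) * MAX_SPEED

print(stick_to_velocity((0.05, -1.0, 0.5)))  # x is inside the deadzone
```

Rate (velocity) control like this is common for teleoperation with game controllers because the workspace of the robot is much larger than the throw of the sticks.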
|A remotely operated wheeled mobile manipulator (WMM)|
A pick, carry and place task
Much of the work in a hospital can be done remotely by a WMM we have developed that enables dexterous operations. Using visual feedback from a camera mounted on the manipulator, the operator is immersed in the environment.
|A collaborative manipulator|
An ultrasound scanning task
We have developed a robotic system that facilitates cooperative task performance with a clinician. Here, it is used for collaborative ultrasound scanning but the idea holds for any diagnostic procedure that can take advantage of the benefits offered by robots.
|An assistive manipulator|
An assisted object lifting task
We have developed a force-scaling robot that allows the user to lift heavier-than-normal objects while retaining the user’s full range of upper-body motion and sense of touch. Example applications include handling general healthcare materials or biological/infectious samples.
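Force scaling of this kind is often realized with an admittance controller: the human's measured force is amplified and fed through a virtual mass-damper whose velocity the robot tracks, so the user feels only a fraction of the load. The sketch below is a one-axis illustration with assumed gains, not our system's actual controller.

```python
def admittance_step(f_human, v_prev, dt, scale=4.0, mass=2.0, damping=10.0):
    """One time step of a force-scaling admittance controller (illustrative).

    The human's applied force f_human (N) is amplified by `scale`, then fed
    through a virtual mass-damper: mass * dv/dt = scale * f_human - damping * v.
    The robot tracks the resulting velocity, so the user supplies only
    1/scale of the effort needed to move the load.
    """
    accel = (scale * f_human - damping * v_prev) / mass
    return v_prev + accel * dt

# Simulate a steady 5 N push; velocity settles at scale * f / damping = 2 m/s.
v = 0.0
for _ in range(2000):          # 2 s at a 1 kHz control rate
    v = admittance_step(5.0, v, dt=0.001)
print(round(v, 3))  # 2.0
```

The virtual mass and damping set how "heavy" and how sluggish the interaction feels; tuning them trades responsiveness against stability when the user makes abrupt motions.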
|A collaborative manipulator with augmented reality|
A bone cutting task
For orthopedic surgeries, we have developed a surgeon-assist system with 3D augmented reality (AR) visualization and force feedback. The robot guides the surgeon’s hand toward desired cutting/drilling locations. The AR display provides visual guidance via image overlay to the user.
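Hand guidance toward a planned cutting or drilling location can be modeled as a virtual fixture: a virtual spring that pulls the tool toward the target, saturated to a safe maximum force. The spring-model form, stiffness, and force cap below are assumptions for illustration, not the parameters of our surgeon-assist system.

```python
import numpy as np

def guidance_force(tool_pos, target_pos, stiffness=200.0, max_force=15.0):
    """Virtual-fixture force pulling the surgeon's tool toward a planned
    target location (simple spring model, illustrative values).

    stiffness: N/m of the virtual spring; max_force (N) caps the output
    so the guidance can never overpower the surgeon.
    """
    error = np.asarray(target_pos, float) - np.asarray(tool_pos, float)
    f = stiffness * error
    norm = np.linalg.norm(f)
    if norm > max_force:
        f *= max_force / norm   # saturate to the safe maximum
    return f

# Tool is 2 cm off the target along y: expect a 4 N pull along +y.
f = guidance_force(tool_pos=(0.10, 0.00, 0.05), target_pos=(0.10, 0.02, 0.05))
print(f)
```

Rendering this force through the robot while the AR overlay shows the same target gives the surgeon matched haptic and visual guidance.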
|A teleoperated manipulator|
A remote ultrasound scanning task
We have developed a teleoperated system for remote ultrasound scanning of the body. Here, teleoperated ultrasound scanning is demonstrated but the idea can be extended to any procedure that can take advantage of the benefits offered by robots.
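A common architecture for teleoperated scanning with haptic feedback is position-position bilateral coupling: a spring-damper drives the patient-side robot toward the operator's motion, and the same force is reflected back so the operator feels contact with the body. The single-axis sketch below uses assumed gains and is only illustrative of the principle.

```python
def pd_coupling(x_master, x_slave, v_master, v_slave, kp=100.0, kd=5.0):
    """Position-position bilateral teleoperation coupling (one axis).

    A virtual spring-damper (gains kp, kd; illustrative values) generates the
    force that drives the patient-side (slave) robot toward the operator-side
    (master) motion; reflecting the same force back to the master lets the
    operator feel contact, e.g., the probe pressing on the patient's body.
    """
    f_slave = kp * (x_master - x_slave) + kd * (v_master - v_slave)
    f_master = -f_slave  # force reflection to the operator
    return f_slave, f_master

# Operator has moved 2 cm ahead of the probe: the probe is pushed forward
# and the operator feels an equal resisting force.
f_s, f_m = pd_coupling(x_master=0.05, x_slave=0.03, v_master=0.0, v_slave=0.0)
print(f_s, f_m)
```

Real bilateral teleoperation must also handle communication delay, which can destabilize this coupling and is typically addressed with passivity-based or wave-variable methods.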
|A robot and a telerobot kinesthetically taught|
Learning from Demonstration (LfD) for robotic/telerobotic systems
Hands-on teaching of a robot (left video) or a telerobot (right video) via Learning from Demonstration (LfD) can help to automate repetitive manual tasks. LfD consists of two main phases: the demonstration phase where a doctor or a nurse interacts with the robot to teach it the task to be done, and a reproduction phase where the task is autonomously performed by the robot in future repetitions.
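The two phases can be summarized in a minimal record-and-replay sketch (an assumed simplification of LfD; real systems generalize the demonstration rather than replaying it verbatim): poses are logged while the nurse guides the robot, then streamed back for autonomous reproduction.

```python
class KinestheticTeacher:
    """Minimal two-phase LfD sketch: record poses during the demonstration
    phase, replay them during the reproduction phase (illustrative only)."""

    def __init__(self):
        self.waypoints = []

    def demonstrate(self, pose):
        # Demonstration phase: log poses as the user physically guides the arm.
        self.waypoints.append(tuple(pose))

    def reproduce(self):
        # Reproduction phase: stream the recorded motion back for autonomous
        # execution on each future repetition of the task.
        for pose in self.waypoints:
            yield pose

teacher = KinestheticTeacher()
for pose in [(0.0, 0.0), (0.1, 0.0), (0.1, 0.2)]:
    teacher.demonstrate(pose)
print(list(teacher.reproduce()))
```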
|A robot-integrated task simulator|
Robot-based simulation of manipulation tasks with augmented reality
Physical human-robot interaction can be combined with visual feedback for purposes such as simulation and training of complex tasks. In our task simulator, live images of the task environment are displayed to the user via 3D augmented reality while the user experiences the physical properties of the task environment via the robot. Applications include training medical professionals before putting them in hazardous situations or environments.