From precision to empathy, robots bring unique skills to medicine
Robots and artificial intelligence are becoming ubiquitous, automating work across industries, and medicine is no exception.
Professors Frauke Zeller (Professional Communication) and Farrokh Janabi-Sharifi (Mechanical and Industrial Engineering) are both integrating artificial intelligence into healthcare, through different methods but with the same end goal of better patient care.
Professor Janabi-Sharifi is using the steadiness of robots to increase the accuracy and precision of heart surgery. The robots he programs learn by "watching" surgeons perform the work.
A mechanical engineer with a passion for robots and a drive to save lives, professor Janabi-Sharifi programs robots to do the work of surgeons with the help of doctors from St. Michael's Hospital.
"We are working on cardiac interventions to fix blocked arteries and some other heart diseases such as arrhythmia," said professor Janabi-Sharifi. Treatment of arrhythmia, or irregular heartbeat, is usually achieved through ablation, which involves inserting a catheter into a patient's cardiac chambers to destroy the problematic tissue causing the arrhythmia. When done by human hands, the process can be imprecise, increasing the risk of damage to healthy tissue. Physicians performing catheterization also rely on radiographic imaging, such as X-rays, which can expose patients and staff to high doses of radiation.

By contrast, robots remove the need for staff to be in the room while radiation is in use, and can minimize, if not eliminate, the hazardous imaging techniques used to find the target location in a patient's heart and guide the catheter. Because the robots use precise microsensors and safer imaging methods to locate unhealthy tissue faster, the procedure becomes both more efficient and safer, improving patients' recovery times. The dexterous robots developed by professor Janabi-Sharifi's team have also opened the door to many new minimally invasive procedures, such as robot-assisted, catheter-based mitral valve treatments.
A world leader in opto-mechatronics, professor Janabi-Sharifi also programs the robots using vision-based control systems, allowing the robots to learn by seeing. "They are not just mimicking, but optimizing," he said, explaining that the robots will omit movements that are not conducive to the smooth completion of the procedure, like the way human hands might inadvertently jerk or shake. "We do not necessarily do tasks optimally, but robots can."
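The article does not describe the team's actual algorithms, but the basic idea of optimizing a demonstrated motion, keeping the surgeon's intended path while filtering out involuntary jerks, can be sketched with a simple moving-average filter over recorded tool positions. Everything below (the function name, the 2D waypoint format, the choice of filter) is an illustrative assumption, not professor Janabi-Sharifi's method:

```python
# Illustrative sketch: smoothing a demonstrated catheter-tip path.
# A centered moving average suppresses high-frequency tremor while
# preserving the slow, intentional motion of the demonstration.
# (Hypothetical example, not the team's actual optimization.)

def smooth_path(waypoints, window=5):
    """Return a tremor-reduced copy of a list of (x, y) waypoints."""
    half = window // 2
    smoothed = []
    for i in range(len(waypoints)):
        lo = max(0, i - half)
        hi = min(len(waypoints), i + half + 1)
        xs = [p[0] for p in waypoints[lo:hi]]
        ys = [p[1] for p in waypoints[lo:hi]]
        smoothed.append((sum(xs) / len(xs), sum(ys) / len(ys)))
    return smoothed

# A jittery demonstration: a straight line plus alternating "hand tremor".
demo = [(i, 10 + (0.5 if i % 2 else -0.5)) for i in range(11)]
clean = smooth_path(demo)
```

In this toy example, each interior waypoint ends up much closer to the intended straight line (y = 10) than the shaky demonstration was; a real system would of course work in 3D with far more sophisticated trajectory optimization.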
However, robots have more to contribute to healthcare than just their precision in the operating room.
Professor Zeller, known for her research in human-robot interaction and for the highly publicized hitchBOT project, is working to create robots that can enhance communication in healthcare. Applying lessons learned through hitchBOT, such as how robots can inspire trust and engage people in unique ways, professor Zeller is partnering with researchers at McMaster University, including professor David Harris Smith (Communication Studies and Multimedia), Dr. Hermênio Lima (Michael G. DeGroote School of Medicine) and professor Sue Becker (Psychology, Neuroscience and Behaviour). Together, the team aims to develop human-robot interaction that will both provide information to patients and encourage behavioural change, specifically around dermatology patients' use of sunscreen.

With help from McMaster computer engineering postdoctoral fellow Naby Nikookaran and Lauren Dwyer of Ryerson and York University's joint doctoral program in communication and culture, they are programming Pepper, a socially assistive robot from SoftBank Robotics, to give patients an approachable information interface. An app is also being created so users can take Pepper home once they leave the doctor's office, continuing that dialogue and access to information while tracking and monitoring a patient's skin and healthcare regimens. "We don't want to replicate the doctor-patient relationship," said professor Zeller. "We want the robot to be more of a trusted friend or an acquaintance."
In the first phase of their research, professor Zeller's team conducted user testing. A partnership with IBM has given the team access to Watson, the computing company's advanced artificial intelligence and natural language processing platform, to allow Pepper to interact with study participants. According to professor Zeller, Watson was chosen for its ability to enable Pepper to recognize a patient's emotions and express empathy in its dialogue with humans.
Preliminary responses from the study are encouraging; users enjoy engaging with Pepper and feel comfortable communicating with it. Currently, two Peppers are being trained to interact with patients in a dermatology clinic in an effort to enhance communication and reduce the information overload patients experience. "The question now becomes: how can we leverage this really positive impression of Pepper toward changing patient behaviour sustainably?" said professor Zeller.
While they continue to fine-tune Pepper's responses, the team anticipates a clinical trial sometime next year with project partner McMaster Dermatology Centre.
Professor Farrokh Janabi-Sharifi's research has received support from the Natural Sciences and Engineering Research Council of Canada, Ryerson University, the Ontario Centres of Excellence and Mitacs.
Professor Frauke Zeller's research is funded by SOSCIP, Ryerson's Health Research Fund, and the Faculty of Communication and Design. The use of the Watson interface is provided by IBM.