Advances in artificial intelligence are bringing the possibility of autonomous robotic surgery closer every day, according to a panel of Johns Hopkins University robotics experts.

In a recent virtual briefing for journalists, “What Every Reporter Should Know About AI: The Latest in Robotic Surgery,” researchers from the Malone Center for Engineering in Healthcare examined the fast-changing field of robotic surgery, focusing in particular on how AI is being used to enhance robotic capabilities. The session launched a new monthly series on AI organized by the university’s Office of Communications.

“When we’re dealing with surgical robots, we really are dealing with complementary capabilities,” said medical robotics pioneer Russell Taylor, a John C. Malone Professor of Computer Science. “We have some common capabilities, but machines are good at things that we are maybe less good at, and the other way around. What we want our partnership to achieve is the best of both worlds.”

Taylor said the current paradigm in research is for physicians to use their expertise to inform the robot about the surgical plan, and then have the robot perform it precisely while following safety protocols.

“It’s like power steering in a car,” Taylor said. “The [human] surgeon and the robot both have the tool, but the robot is following the hand of the human.”

Axel Krieger, a medical roboticist and associate professor of mechanical engineering, said autonomous robots could help address a critical problem: a shortage of surgeons.

“We have fewer surgeons available,” he said. “We also have an aging, growing society, and the caseload is projected to rise more than twofold in the next 10 years. So we really need more assistance to keep up with that rising caseload.”

Krieger said robotic surgery is becoming more autonomous as the technology advances.

“Physicians would take a more advisory role [in the future] and could intervene or take control if needed,” he said. “The goal is to reduce complications, democratize access to everyone for expert surgery, and alleviate that shortage of trained surgeons.”

Mathias Unberath, a John C. Malone Associate Professor of Computer Science, said, “The introduction of autonomous technology will result in changes that will affect pretty much everybody in the health care spectrum. It’s not simply about how we can build and enable the technology that is autonomous and can achieve and perform at the level that we need in order to make patients healthier. We also need to think about how the introduction of this type of technology changes the overall ecosystem that is health care.”

Jeremy Brown, a John C. Malone Assistant Professor of Mechanical Engineering who focuses on improving surgical expertise through robotics, explained the critical role of haptic feedback, the sense of touch and force conveyed to users of robotic systems.

“Moving the surgeon away from the patient has actually decreased the level of touch that they feel,” he said. “There’s sort of been this unified argument that we need to start including haptic feedback in robotic surgery, and it is starting to be the case that the robotic manufacturers are listening.”

Brown referred to Taylor’s car analogy and made one of his own, saying, “We didn’t jump from manual transmissions in our cars to robotaxis. Just like a surgical resident learns to do surgery, AI does as well.”
