Of all the remarkable things engineers do for humanity, none may be more important than the ways in which they improve our resiliency, keeping us safe from the many potential harms the world has in store. Among the Whiting School faculty, there is no shortage of engineers striving to make the world a safer place. In fields ranging from surgery to structural engineering, their approaches are as creative as they are promising.
For the many thousands of people with epilepsy who do not respond to medication, life is a roller coaster in which any moment could bring another seizure. Recently, medical science has begun to look deeper into areas in the brain—known as epileptic foci—where seizures are believed to originate.
Sometimes these foci arise from structural malformations. Other times, a tumor is the cause. And for still other patients, there is no known physical explanation, yet seizures continue to emerge from these locations nonetheless.
“The foci are the epicenters of epilepsy—the heart of the earthquake—but they are not always easy to delineate,” says Archana Venkataraman, the John C. Malone Assistant Professor of Electrical and Computer Engineering and a member of the Malone Center for Engineering in Healthcare, who studies these elusive phenomena. “We think engineering can help in that definition.”
The good news is that epileptic foci can be surgically removed to reduce or eliminate seizures. But, as in any brain surgery, accuracy is paramount. Removing too much tissue can have severe consequences; removing too little subjects the patient to a serious operation with little or no benefit.
According to Venkataraman, current methods to identify epileptic foci are based on the eye and the instinct of neurologists and radiologists, a process that is time consuming, requires years of training, and is prone to human error. So she’s employing machine learning, a branch of artificial intelligence, to automatically pinpoint these epileptic foci by using brain imaging and electrical monitoring technology—magnetic resonance imaging and electroencephalography, in particular.
In essence, these techniques offer moment-by-moment snapshots of the brain’s physical and electrical activity, like a stop-motion film. Using the method Venkataraman developed, computers examine and compare those snapshots and train themselves to spot patterns that are not always apparent, even to experts. With this information, she triangulates the precise location and size of epileptic foci.
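Venkataraman's models are far more sophisticated than this, but the core idea of letting a computer flag anomalous brain activity can be sketched in a few lines. Everything below — the channel names, the energy statistic, the outlier threshold, the toy data — is a hypothetical illustration, not her published method:

```python
# Toy sketch: flag candidate focus channels in an EEG recording by
# asking which channel's signal energy is an outlier relative to the
# rest. Illustrative only; not an actual seizure-localization algorithm.
from statistics import mean, stdev

def channel_energy(samples):
    """Mean squared amplitude of one channel's recording."""
    return mean(s * s for s in samples)

def flag_focal_channels(recording, z_threshold=3.0):
    """Return channels whose energy is a leave-one-out outlier."""
    energies = {ch: channel_energy(sig) for ch, sig in recording.items()}
    flagged = []
    for ch, e in energies.items():
        others = [v for c, v in energies.items() if c != ch]
        mu, sigma = mean(others), stdev(others)
        if sigma > 0 and (e - mu) / sigma > z_threshold:
            flagged.append(ch)
    return flagged

# Hypothetical 4-channel recording: channel "T3" carries abnormal spiking.
recording = {
    "Fp1": [0.1, -0.2, 0.1, 0.0, -0.1],
    "Fp2": [0.0, 0.1, -0.1, 0.2, 0.0],
    "T3":  [2.5, -2.8, 3.1, -2.9, 2.7],
    "O1":  [0.1, 0.0, -0.2, 0.1, -0.1],
}
print(flag_focal_channels(recording))
```

Real localization must combine many such cues across MRI and EEG, and across patients — which is exactly where trained machine learning models replace hand-tuned thresholds like the one above.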
Best of all, her approach is noninvasive, reducing risk for the patient, while promising greatly improved surgical precision. Venkataraman and graduate student Jeff Craley have developed a prototype seizure localization algorithm for EEG data acquired in the clinic; they are now working to make it more reliable for all epilepsy patients. Venkataraman is also seeking additional data and funding to take her work to the next level, where it will help real patients in real need.
“The most rewarding part of this project is that I can develop cutting-edge engineering tools to directly impact people’s lives,” she says.
QUANTIFYING SURGICAL EXPERTISE
The last two decades have witnessed the ascent of medical procedures in which the surgeon’s hand is guided, at least in part, by robots and computer algorithms. Often, the surgeon does not operate directly on the patient but instead sits across the operating room, peering into sophisticated video monitors and manipulating remote-controlled instruments that tell the robot what to do. Such advances have made certain surgeries far less invasive and have reduced the chance of human error.
While these benefits are noteworthy in their own right, this technical evolution has yielded an unexpected upside: a profusion of data about the surgeons themselves.
“Each incision, every suture is recorded to an incredible degree of accuracy,” says Greg Hager, the Mandell Bellmore Professor of Computer Science and director of the Malone Center for Engineering in Healthcare.
About 15 years ago, Hager decided to put that data to good use to quantify the techniques of master surgeons in order to help students gain expert skills. Hager counts among his collaborators many fellow engineers, as well as surgeons and biostatisticians.
Hager has discovered that surgery is not a collection of grand and complex motions, but rather a series of discrete, definable, and—most importantly—teachable smaller submovements. What’s more, he has now cataloged all these smaller movements into a sort of dictionary of surgery. He calls these submovements dexemes—a term modeled on the linguists’ word “phoneme,” which describes the discrete sounds used in spoken language. Instead of a dictionary of sound, Hager created a dictionary of motion.
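Hager's dexeme models are built from rich robotic-instrument data, but the basic move — carving a continuous motion trace into discrete submovements — can be illustrated simply. The pause-based segmentation rule and the speed trace below are hypothetical stand-ins, not his actual method:

```python
# Toy sketch: split a surgical tool's speed trace into candidate
# submovements wherever the tool nearly stops. Illustrative only.
def segment_submovements(speeds, pause_threshold=0.2):
    """Return (start, end) index spans of motion bursts, where a burst
    is a run of samples with speed at or above pause_threshold."""
    segments, start = [], None
    for i, v in enumerate(speeds):
        if v >= pause_threshold and start is None:
            start = i                      # burst begins
        elif v < pause_threshold and start is not None:
            segments.append((start, i))    # burst ends at a pause
            start = None
    if start is not None:                  # trace ended mid-burst
        segments.append((start, len(speeds)))
    return segments

# Hypothetical trace: two bursts of motion separated by a near-stop.
trace = [0.0, 0.5, 1.2, 0.8, 0.1, 0.05, 0.9, 1.1, 0.4, 0.0]
print(segment_submovements(trace))
```

Each extracted span could then be matched against entries in a "dictionary of motion" — the dexeme catalog the article describes — to label what the surgeon was doing at that moment.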
Those mathematical representations can be used in training devices for future surgeons to practice on without risk to patients: They can hone their skills before taking on the challenge of real surgery.
In recent years, Hager has moved beyond data-intensive robotic surgery into surgeries that are not so quantifiable, namely the scads of video images captured during endoscopic surgeries.
A decade and a half into his pursuit, Hager is eager for the challenges that lie ahead. He’s working on several ancillary projects, including surgical simulators that use machine learning to teach basic skills.
“These computers are true mentors,” he says. “I’m excited to bootstrap this and move it into the real world.”
SOFTWARE FOR HARD HATS
A computer scientist and inveterate entrepreneur, Anton “Tony” Dahbura ’81, PhD ’84, took note one day when his son shared a growing concern about workplace safety.
“My son, who owns a demolition company, said, ‘Dad, sooner or later someone’s going to get smushed.’ And I got to thinking how we might solve the problem,” recalls Dahbura, executive director of the Johns Hopkins Information Security Institute.
The elder Dahbura borrowed a few off-the-shelf Bluetooth signaling devices—known as iBeacons—and mounted them atop workers’ hard hats. The beacons “ping” periodically with a radio signal unique to each hard hat. Dahbura then placed a network of Bluetooth receivers around the exterior of an excavator.
Next, the computer scientist in him took over. Dahbura recruited students in the Department of Electrical and Computer Engineering and in the Center for Leadership Education to help him write software that triangulates the positions of each hard hat around heavy equipment—such as excavators, cranes, and trucks—and plots them on a monitoring screen mounted in the equipment’s cab. If a worker gets too close, an alert sounds, and visual cues flash on screen. The system, dubbed Blindside, is like air traffic control for the work site.
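The article doesn't disclose Blindside's actual algorithm, but the triangulation step it describes can be sketched under simple assumptions: three receivers at known positions on the machine, and each hat's distance to each receiver already estimated from signal strength. The function names, layout, and danger radius below are all hypothetical:

```python
# Rough sketch of 2-D trilateration plus a proximity alert, in the
# spirit of the system described. Not Blindside's actual code.
from math import hypot

def trilaterate(p1, p2, p3, d1, d2, d3):
    """Estimate a hat's position from its distances to three receivers
    at known positions, by linearizing the three circle equations."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = d1**2 - d2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x1), 2 * (y3 - y1)
    c2 = d1**2 - d3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a1 * b2 - a2 * b1
    return (c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det

def too_close(hat, machine, danger_radius):
    """True if the estimated hat position lies inside the danger zone."""
    return hypot(hat[0] - machine[0], hat[1] - machine[1]) < danger_radius

# Hypothetical layout: three receivers on the excavator, one hat pinging.
receivers = ((0.0, 0.0), (10.0, 0.0), (0.0, 10.0))
distances = (5.0, 65 ** 0.5, 45 ** 0.5)  # as inferred from signal strength
hat = trilaterate(*receivers, *distances)
print(hat, too_close(hat, (0.0, 0.0), 6.0))
```

In practice, Bluetooth signal strength is a noisy distance proxy, so a deployed system would smooth repeated pings over time rather than trust any single fix — one reason the real product is harder than this sketch suggests.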
In July 2018, Dahbura and two colleagues were awarded U.S. Patent 10,026,290 for Blindside. He has since developed kits to help heavy-equipment manufacturers retrofit existing equipment and is working with them to integrate Blindside into new vehicles. Recently, government regulators and insurance companies have taken note. Dahbura’s hope is that Blindside will become mandatory at all job sites.
“The financial consequences of a workplace accident can put a small company out of business,” Dahbura says. “But that’s nothing compared to the consequences of losing a life.”