Robotics, Augmented Reality & Devices
Malone researchers develop and deploy technology and devices to provide the critical interface between the “virtual reality” of information about patients and the “physical reality” of patients and caregivers. One major area of interest is medical robotics, particularly surgical and rehabilitation robots.
PI: Jing Xu
Recent studies show that hand dexterity and strength recover mostly within the first three months after stroke, and that these two critical components of hand function are supported by separate biological systems. However, in most stroke patients, dexterous hand function does not fully recover with standard rehabilitation therapy. These findings strongly suggest the need for intensive rehabilitation targeting hand dexterity in the early post-stroke period, yet the stroke rehabilitation field currently lacks effective tools to meet this need. Researchers from the Malone Center and the Brain, Learning, Animation, and Movement Laboratory (BLAM) are conducting pilot studies of the Hand Articulation Neurotraining Device (HAND), a portable hand-rehabilitation device that can be used across clinical settings: in the hospital immediately after brain injury, and in the patient’s home after discharge.
PIs: Greg Hager, Anand Malpani
Surgery is a prototype for contexts in which human learning can be significantly transformed by technology. Beyond transforming patient care, technological advances are beginning to play an increasingly central role in graduate surgical education. Several factors drive this transformation, including the large number of procedures in which trainees must gain competency and constraints on learning in the operating room (OR) imposed by patient safety concerns, variable teaching opportunities, and resource limitations. Virtual Reality (VR) simulation is an exemplar technology that enables skill acquisition outside real scenarios such as the OR, but only to a limited extent. We hypothesize that augmenting technology with human intelligence in a coaching paradigm will transform its role in how surgeons acquire technical skill.
Surgical coaching by an experienced surgeon is effective for imparting technical skill. However, the resources required to replicate expert coaching at scale are lacking, and this is a critical barrier to designing effective technical skill training curricula that minimize time to OR readiness.
Automating coaching in VR provides a scalable solution to replicate expert feedback in the training laboratory. We hypothesize that augmenting automated coaching with human insights can more effectively replicate expert coaching at scale. To this end, we will draw upon methodologies from peer learning in adults, machine learning, and data science, and ground them in the context of a basic surgical skills training curriculum for novices.
We will assemble a cohort of novices to serve as peers and capture a corpus of structured feedback from them on others’ performances of the study task, structuring the feedback so it can later be mapped to task performances. We will use machine learning approaches to learn a mapping between data captured during a performance and the corresponding feedback. We will develop a trainee-centered, clear, concise, efficient, and effective scorecard to deliver the coaching interventions according to the principles of universal design for learning. Finally, we will conduct a randomized controlled trial to determine the effectiveness of the virtual coach (VC) augmented with automated feedback.
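As a rough sketch of the performance-to-feedback mapping described above (the feature values, feedback strings, and retrieval approach are hypothetical illustrations, not the study’s actual data or method), one simple baseline is to retrieve the structured peer feedback attached to the most similar recorded performance:

```python
import math

# Hypothetical corpus: kinematic summary features of recorded task
# performances (e.g., path length, collisions, completion time) paired
# with structured peer feedback. Purely illustrative values.
corpus = [
    ((12.4, 0.8, 3.1), "smooth needle driving; reduce instrument collisions"),
    ((25.0, 2.6, 7.9), "excess path length; slow down and plan grasps"),
    ((15.1, 1.1, 4.0), "good economy of motion; work on bimanual timing"),
]

def nearest_feedback(features, corpus):
    """Return the feedback attached to the most similar recorded performance.

    A minimal retrieval baseline using Euclidean distance in feature space;
    a deployed system would learn the mapping rather than look it up.
    """
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(corpus, key=lambda item: dist(features, item[0]))[1]

print(nearest_feedback((13.0, 0.9, 3.3), corpus))
# → smooth needle driving; reduce instrument collisions
```

A learned model, such as a regression or sequence model trained on the full feedback corpus, would replace this lookup once enough performance-feedback pairs are collected.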
Using surgical technical skills as the test-bed, our project’s main innovation will be to bring in machine intelligence to enable technology to play an “active” role in the human learning of complex skilled activities.
PI: Jeremy Brown
Substantial progress has been made in the control of myoelectrically driven upper-limb prostheses, which has improved the quality of life for many amputees. However, one of the biggest complaints amputees still have about commercially available prostheses is the lack of tactile feedback. Tactile feedback is crucial not only for highly dexterous tasks, but also for basic manipulation of objects in daily life.
In our current prosthetics study, we are investigating the effect of haptic feedback on the operation efficacy of an upper-limb myoelectric prosthesis in able-bodied subjects. We have developed a force feedback device that can relay the grip force or aperture position of the terminal device on the prosthetic arm in a similar manner to body-powered prosthetics. We will compare this type of feedback to several other conditions such as vibrotactile feedback, no feedback, and the natural hand in a variety of object manipulation and discrimination tasks. The results of this study will help elucidate better tactile feedback mechanisms for myoelectric prostheses and provide enhanced understanding of how users integrate haptic feedback to control such prostheses.
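As an illustrative sketch of this kind of force relay (the gain and saturation limit below are made-up values, not the device’s actual parameters), the grip force measured at the terminal device could be mapped proportionally, with saturation, to a command for the feedback actuator on the user’s arm:

```python
def feedback_force(grip_force_n, gain=0.4, max_force_n=8.0):
    """Map terminal-device grip force (N) to a feedback actuator command (N).

    A proportional mapping with saturation, loosely mimicking how a
    body-powered prosthesis transmits grip load back through its harness.
    Gain and limit are illustrative, not the study device's parameters.
    """
    force = gain * max(0.0, grip_force_n)   # ignore spurious negative readings
    return min(force, max_force_n)          # cap for user comfort and safety

print(feedback_force(10.0))   # → 4.0
print(feedback_force(30.0))   # → 8.0 (saturated)
```

Saturation keeps the feedback force comfortable at high grip loads while preserving proportionality over the normal grip range, which is one design choice among several one might test against vibrotactile and no-feedback conditions.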
News | Robotics, Augmented Reality, and Devices
Mechanical engineering PhD student Sergio Machaca is investigating how haptic feedback can improve robotic surgery training.
Ajaykumar, a PhD student in computer science, is investigating the potential of the first-person viewpoint for human-robot interaction.
Department of Computer Science
Chien-Ming Huang’s research focuses on building intuitive, interactive technologies to provide social, physical, and behavioral support for people. He is particularly passionate about using novel technologies to help special-needs populations, such as children with autism spectrum disorder. Dr. Huang completed his postdoctoral research at …