Getting a grip on model-based design
Few of us are aware of the complex interactions between neural, mechanical, and sensory systems required to do something as simple as picking up a ball. To create a prosthetic arm capable of natural movement, it is necessary to mimic these sophisticated systems — as well as the intricate interactions between them — by using cutting-edge actuators, sensors, microprocessors, and embedded control software. This was the challenge set forth by the Defense Advanced Research Projects Agency (DARPA) Revolutionizing Prosthetics program.
To answer the call, the Johns Hopkins University Applied Physics Laboratory (APL) is leading an international team of government agencies, universities, and private companies working to develop a prosthetic arm that exceeds the capabilities of today's devices. The final version of the arm will have control algorithms driven by neural inputs, enabling the user to move with the speed, dexterity, and force of a real arm. Advanced sensory feedback technologies will allow perception of physical inputs such as pressure, force, and temperature.
Central to the project is a Virtual Integration Environment (VIE), a complete limb simulation environment built with model-based design tools from The MathWorks Inc., Natick, Mass. The environment's standardized architecture and interfaces are enabling collaboration among experts at more than two dozen partner organizations.
Virtual Integration Environment architecture
The VIE architecture consists of five main modules: Input, Signal Analysis, Controls, Plant, and Presentation. The Input module comprises all the input devices that patients can use to signal their intent, including surface electromyograms (EMGs), cortical and peripheral nerve implants, implantable myoelectric sensors (IMESs), and more conventional digital and analog inputs for switches, joysticks, and other controls used by clinicians. The Signal Analysis module performs signal processing and filtering. More important, this module applies pattern recognition algorithms that interpret raw input signals to extract the user's intent and communicate that intent to the Controls module.
In the Controls module, commands are mapped to motor signals, which drive the individual motors that actuate the limb, hand, and fingers. The Plant module consists of a physical model of the limb's mechanics, and the Presentation module produces a three-dimensional rendering of the arm's movement.
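To illustrate how data might flow through these modules, the following is a minimal MATLAB sketch of one pass through the signal path. The synthetic EMG channel, threshold, gain, and toy joint model are illustrative placeholders, not the APL implementation.

% Minimal, hypothetical sketch of one pass through the VIE signal path.
% The signals, threshold, gain, and joint model are illustrative only.
t      = (0:0.001:1)';                                          % 1 s of data at an assumed 1 kHz rate
raw    = (t > 0.5) .* sin(2*pi*60*t) + 0.05*randn(size(t));     % Input: one synthetic EMG channel
feat   = movmean(abs(raw), 100);                                % Signal Analysis: smoothed, rectified EMG
intent = feat > 0.2;                                            % Signal Analysis: thresholded "close hand" intent
cmd    = 0.5 * double(intent);                                  % Controls: intent mapped to a motor command
theta  = 0.001 * cumsum(cmd);                                   % Plant: toy integration into a joint angle (rad)
plot(t, theta), xlabel('Time (s)'), ylabel('Joint angle (rad)') % Presentation: render the movement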
Nervous system interface
The VIE-based prosthetic limb interfaces with the human nervous system to allow natural and intuitive control. In ongoing testing, researchers record data from neural device implants while the subjects perform tasks such as reaching for a ball in the virtual environment.
The VIE modular input systems receive this data, and MATLAB algorithms decode the subject's intent by using pattern recognition to correlate neural activity with the subject's movement. Results are integrated back into the VIE, where experiments can be run in real time.
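As a rough illustration of this kind of pattern recognition (the specific features and classifier used at APL are not described here), the following MATLAB sketch trains a linear discriminant classifier on synthetic feature windows and decodes the intent of a new window.

% Hypothetical intent decoder: a linear discriminant classifier trained on
% synthetic stand-ins for EMG/neural feature windows (not the APL algorithm).
rng(1);
featTrain   = [randn(50, 8) + 1; randn(50, 8) - 1];        % 100 training windows, 8 features each
intentTrain = [repmat({'handOpen'}, 50, 1); repmat({'handClose'}, 50, 1)];
mdl     = fitcdiscr(featTrain, intentTrain);               % train the classifier (Statistics and Machine Learning Toolbox)
featNew = randn(1, 8) + 1;                                 % feature vector from a new data window
decoded = predict(mdl, featNew)                            % decoded intent, e.g. {'handOpen'}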
In fact, the same workflow has been used to develop input devices of all kinds, some of which are currently being tested by prosthetic limb users at the Rehabilitation Institute of Chicago.
Real-time controllers, clinical applications
The Signal Analysis and Controls modules of the VIE form the heart of the control system that will be deployed in the prosthetic arm's final iteration. Software for these modules was developed at APL, with individual algorithms built using MATLAB. Researchers built and tested a virtual prototype system before committing to a specific hardware platform. A bonus: Because code was generated from a system model that already had been safety-tested and verified through simulation, there was no need for hand coding, which could have introduced errors. The development team is now confident that the Modular Prosthetic Limb will perform as intended.
Thanks to the virtual system framework, researchers were also able to develop a clinical environment for system configuration and training. Clinicians will be able to configure parameters in the VIE and manage test sessions with volunteer subjects using a GUI created in MATLAB. What's more, clinicians will be able to interact with the application in real time, using a host PC that communicates with the system running the control software, while a third PC handles 3D rendering and display of the virtual limb. By using model-based design, the team was able to deliver Proto 1, Proto 2, and the first version of the VIE ahead of schedule. They are now developing a detailed design of the Modular Prosthetic Limb, the final version that will be presented to DARPA.
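As a rough sketch of what one clinician-facing control could look like in MATLAB (the layout and the "wrist rotation gain" parameter below are purely illustrative, not the APL application):

% Hypothetical parameter control in the spirit of the clinician GUI described
% above; the figure, layout, and parameter name are illustrative only.
f   = uifigure('Name', 'VIE session configuration');
g   = uigridlayout(f, [2, 1]);
lbl = uilabel(g, 'Text', 'Wrist rotation gain');
sld = uislider(g, 'Limits', [0 2], 'Value', 1);
sld.ValueChangedFcn = @(src, evt) fprintf('New wrist rotation gain: %.2f\n', evt.Value);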
Mimicking nature on a deadline
Developing a mechatronic system that replicates natural motion and preparing it for clinical trials in four years (a DARPA mandate) requires breakthroughs in neural control, sensory input, advanced mechanics and actuators, and prosthesis design. That's because today's most advanced prosthetic arms typically have just three active degrees of freedom (DOF): elbow flex/extend, wrist rotate, and grip open/close. Proto 1, the development team's first prototype, added five more DOF, including two active DOF at the shoulder (flexion/extension and internal/external rotation), wrist flexion/extension, and additional hand grips. However, to emulate natural movement, further advances were necessary.
The second prototype, Proto 2, featured more than 22 DOF, including additional side-to-side movements at the shoulder (abduction/adduction) and wrist (radial/ulnar deviation), as well as independent articulation of the fingers. The hand can also be commanded into multiple, highly functional coordinated grasps. The Modular Prosthetic Limb, the final version now in the works, will include 27 DOF, as well as the ability to sense temperature, contact, pressure, and vibration.
Resources
Johns Hopkins University Applied Physics Laboratory www.jhuapl.edu
Model-based design www.mathworks.com/nn9/mbd