Veronica Santos has been working to improve mechanical medical devices since she injured her elbow during a high school basketball game. Santos said the accident helped her realize the importance of research in upper limb prosthetics.

Santos and her team of student researchers at the UCLA Biomechatronics Lab designed the BairClaw in 2014 to help people with upper limb disabilities experience touch. The team is currently working to collect more data about the human sense of touch.

The team uses a method called artificial haptic intelligence, in which mechanical touch sensors record data that can be translated and understood by a computer, Santos said. She added that she hopes this will allow patients to use prosthetic arms as naturally as real ones.

“Imagine a machine that’s slightly removed from the human body,” she said. “These things have the same fundamental challenge – we speak totally different languages.”

The lab uses a method called machine learning that allows Santos and her team to record the general areas of the finger making contact with an object. They can also record the amount of pressure exerted during contact with the object, she added. The team then feeds the collected data into an algorithm that learns, by looking at patterns, to distinguish between what the sensors can and cannot feel.
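The article does not detail the lab’s software, but the idea Santos describes – recording which regions of a finger touched an object and how hard, then letting a pattern-recognition model sort the signals – can be sketched in a few lines. Everything below (the feature choices, the labeling rule, and scikit-learn as the toolkit) is an assumption made for illustration, not the lab’s actual pipeline.

```python
# A hedged sketch: classifying "contact" vs. "no reliable contact" from
# simple touch-sensor features. The features, thresholds, and data are
# invented for illustration; the lab's real pipeline is not described
# in the article.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Hypothetical features per sample: [finger region index, pressure].
n = 200
regions = rng.integers(0, 5, size=n)        # which patch of the finger
pressure = rng.uniform(0.0, 10.0, size=n)   # pressure exerted during contact
X = np.column_stack([regions, pressure])

# Invented labeling rule standing in for ground truth: light touches
# count as "no reliable contact" (0), firmer ones as contact (1).
y = (pressure > 2.0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = SVC(kernel="rbf").fit(X_train, y_train)
print(f"Held-out accuracy: {clf.score(X_test, y_test):.2f}")
```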

“The goal of the machine learning systems is to keep the cognitive burden low on the human, so they can keep their mind and creativity on very high-level things,” she said.

The BairClaw, one of the team’s first projects, is modeled after the muscles of a human hand, she said. The small intrinsic muscles of the human hand are suited to precise movements, but the muscles of the forearm are more durable and served as a model for the outer layer of the robot’s fingers.

“Just like the muscles in your forearms transmit force to your fingers through your tendons, our model uses motor rotation to pull a high-strength fishing line, which mimics a tendon, to extend a finger,” said Randy Hellman, a graduate student from Arizona State University who helped develop the first BairClaw prototype.
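Hellman’s analogy can be made concrete with a little geometry: if the fishing line winds onto a motor spool and wraps a pulley at the finger joint, turning the motor reels in line, and that excursion divided by the pulley radius gives the joint’s rotation. The radii and the single-pulley routing in the sketch below are placeholder assumptions, not BairClaw specifications.

```python
import math

def joint_angle_change(motor_turns: float,
                       spool_radius_mm: float = 5.0,
                       joint_pulley_radius_mm: float = 8.0) -> float:
    """Approximate finger-joint rotation produced by winding a tendon.

    Assumes an inextensible line wound on a motor spool and routed over
    a single joint pulley; the radii are placeholders, not BairClaw specs.
    """
    line_reeled_mm = motor_turns * 2 * math.pi * spool_radius_mm  # tendon excursion
    return math.degrees(line_reeled_mm / joint_pulley_radius_mm)  # joint rotation

# One half-turn of the motor under these assumptions:
print(f"{joint_angle_change(0.5):.1f} degrees of joint rotation")
```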

Santos said the team also uses 3-D printed fingers designed to accommodate the joint angle sensors used in the original BairClaw model.

Santos spends hours in the lab mentoring students and going over individual projects to ensure every detail is in place, said Jimmy Wu, a second-year mechanical engineering student.

“She speaks with us about our career goals and makes sure that we are doing well outside the lab,” Wu said.

Hellman said being in the lab with Santos allows him to further his experience in programming while contributing to an important cause.

The lab is also working on a project aimed at improving hand-eye coordination for individuals with disabilities, said Alireza Fathaliyan, a graduate student in mechanical engineering.

The team uses a motion capture system called Vicon to track hand motions and translate them into data the computer can interpret, Fathaliyan said. Six motion cameras track markers attached to the hand to record simple hand motions, while an eye tracker follows the motion of the pupil to determine exactly where an individual is looking.

Data gathered from the motion capture system and the eye tracker are combined to study the hand-eye coordination of arm amputees, said Xiaoyun Wang, a graduate student in mechanical engineering.
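One plausible way to combine the two streams, assuming each is timestamped: resample the eye tracker’s gaze positions onto the motion capture clock, then measure the gap between where the eyes look and where the hand is on each frame. The sampling rates and synthetic trajectories below are invented for illustration and are not drawn from the lab’s setup.

```python
import numpy as np

# Hypothetical synchronized streams: timestamps (s) and positions (mm).
# In practice these would come from the Vicon cameras and the eye tracker;
# the rates and trajectories here are assumptions for illustration.
t_hand = np.linspace(0, 5, 500)                   # 100 Hz motion capture
hand_xyz = np.column_stack([t_hand * 40, np.zeros(500), np.zeros(500)])

t_gaze = np.linspace(0, 5, 300)                   # 60 Hz eye tracker
gaze_xyz = np.column_stack([(t_gaze + 0.3) * 40, np.zeros(300), np.zeros(300)])

# Resample gaze onto the motion-capture clock so the streams align.
gaze_aligned = np.column_stack(
    [np.interp(t_hand, t_gaze, gaze_xyz[:, i]) for i in range(3)]
)

# Per-frame distance between gaze point and hand: a simple proxy for
# how far the eyes lead the hand during a reach.
lead_mm = np.linalg.norm(gaze_aligned - hand_xyz, axis=1)
print(f"Mean gaze-hand separation: {lead_mm.mean():.1f} mm")
```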

Santos said working in the lab reminds her research can change people’s lives.

“Just getting a drink of water is a challenge for an amputee, and it’s important to understand how fortunate we are,” Santos said.
