ML & FSM help amputees walk more naturally

Researchers from the University of Utah designed a robotic leg that learns from the user’s motion how to help amputees walk more naturally. It uses machine learning to generate a human-like stride and helps wearers step over obstacles in a natural way. Rather than trying to recognize obstacles in the user’s path, the prosthesis relies on cues from the user’s body to tell it when something is in the way. Sensors at the user’s hip feed data a thousand times per second into a processing unit in the prosthesis’s calf. For instance, the way a user rotates their hip might tell the leg to tuck its knee to avoid tripping over an obstacle.

  • A finite state machine (FSM) determines when and how to flex the knee based on angles of the ankle and thigh and the weight on the prosthetic foot.
  • A second model, a minimum-jerk planner, takes over when the artificial limb’s angle and speed cross a threshold. It minimizes sharp, sudden motions.
  • The prosthesis applies reinforcement learning to adjust its motion as the user walks, using smoothness as the cost function.
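The article doesn’t publish the controller’s actual thresholds or state set, but the FSM in the first bullet can be sketched roughly like this. All state names and threshold values below are hypothetical, chosen only to illustrate how ankle angle, thigh angle, and foot load could drive knee-flexion transitions:

```python
from enum import Enum, auto

class GaitPhase(Enum):
    STANCE = auto()        # foot loaded, knee held stable
    SWING_FLEX = auto()    # foot in the air, knee flexing
    SWING_EXTEND = auto()  # knee extending ahead of heel strike

# Hypothetical thresholds for illustration only.
LOAD_THRESHOLD_N = 50.0   # weight on the prosthetic foot (newtons)
THIGH_SWING_DEG = 20.0    # thigh angle that signals swing onset
ANKLE_EXTEND_DEG = -5.0   # ankle angle that signals knee extension

def next_phase(phase, ankle_deg, thigh_deg, foot_load_n):
    """Transition between gait phases from ankle/thigh angles and foot load."""
    if phase is GaitPhase.STANCE:
        # Foot unloaded and thigh swinging forward -> start flexing the knee.
        if foot_load_n < LOAD_THRESHOLD_N and thigh_deg > THIGH_SWING_DEG:
            return GaitPhase.SWING_FLEX
    elif phase is GaitPhase.SWING_FLEX:
        # Ankle passes its extension angle -> extend the knee for heel strike.
        if ankle_deg < ANKLE_EXTEND_DEG:
            return GaitPhase.SWING_EXTEND
    elif phase is GaitPhase.SWING_EXTEND:
        # Weight back on the foot -> return to stance.
        if foot_load_n >= LOAD_THRESHOLD_N:
            return GaitPhase.STANCE
    return phase  # no transition condition met; stay in the current phase
```

In a controller like the one described, a loop of this kind would run on each sensor update (here, a thousand times per second), with the current phase selecting how the knee motor is driven.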
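The summary doesn’t give the planner’s equations, but “minimum jerk” has a standard closed form in the motor-control literature: the trajectory that minimizes the integral of squared jerk (the third derivative of position) between two endpoints. A sketch of that profile, assuming a simple point-to-point move:

```python
def minimum_jerk(x0, xf, t, duration):
    """Minimum-jerk position at time t for a move from x0 to xf over `duration`.

    This is the classic fifth-order polynomial that minimizes the integral of
    squared jerk, giving a smooth start and stop with no sharp accelerations.
    """
    tau = min(max(t / duration, 0.0), 1.0)  # normalized time, clamped to [0, 1]
    return x0 + (xf - x0) * (10 * tau**3 - 15 * tau**4 + 6 * tau**5)
```

For a knee moving from 0 to 60 degrees, the profile starts and ends at rest and passes through the midpoint (30 degrees) at half the duration, which is the kind of smooth, symmetric motion the planner is described as producing.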

(courtesy of Andrew Ng)

This entry was posted in Artificial Intelligence, Machine Learning, State Machines.
