
Swing Optimization

This is the first big test of the optimization framework. Using a virtual model that represents the prosthesis attached to a frame, I am running a stochastic search over these impedance parameters:

– a1: Knee equilibrium angle during swing flexion (normalized)

– a2: Knee equilibrium angle during swing extension (normalized)

The goal is to replicate the knee motion of an intact swing, so that the device produces a movement close to the one found in typical human gait. Hence, I defined a cost function that penalizes trajectories that are not similar to a goal trajectory.
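To make this concrete, here is a minimal sketch of what such a cost function can look like. The sum-of-squared-errors form and the assumption that both trajectories are sampled on the same normalized time grid are illustrative choices, not the exact implementation:

```python
import numpy as np

def swing_cost(sim_knee_angle, goal_knee_angle):
    """Penalize deviation of the simulated knee trajectory from the goal.

    Assumes both trajectories are sampled on the same normalized time
    grid (e.g., 0..100% of the swing phase), in degrees.
    """
    sim = np.asarray(sim_knee_angle, dtype=float)
    goal = np.asarray(goal_knee_angle, dtype=float)
    # Sum of squared errors over the whole swing phase: the larger the
    # value, the less similar the simulated swing is to the goal.
    return float(np.sum((sim - goal) ** 2))
```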

Simulations are running in Gazebo, and there is an extra cool fact: the results are sent to this website, so you can watch in real time how the optimization is doing. By the time you read this, the optimization will probably have finished, so you will only see the final results.
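The upload mechanism is not detailed here; a plausible sketch is each simulation POSTing its sample to a web endpoint (the URL, payload format, and function below are hypothetical placeholders):

```python
import requests  # assumption: samples are uploaded over plain HTTP

def post_result(a1, a2, cost, url="https://example.org/swing-optimization/results"):
    """Hypothetical upload of one optimization sample to the website."""
    payload = {"a1": a1, "a2": a2, "cost": cost}
    requests.post(url, json=payload, timeout=5)
```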


These are the current optimization results:

This plot shows how the cost function relates to different parameter values:

We can see that the process found the best set of values to be a1 ≈ 0.85 and a2 ≈ 0.21, that is, angles of 68.9° and 20.9°, respectively.

Without any constraints except the device's possible range of motion, the simulation learned how to perform a human-like swing flexion and swing extension.

What’s next?

This was a test to verify that the framework works; in particular, that I was able to launch multiple simulations with different parameters, retrieve the results, and compare them against a reference trajectory. These are the basic steps of the optimization process, and the loop is sketched below.
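As a sketch of that loop, here is a plain random search over the two normalized parameters. The actual stochastic search and the Gazebo launching code live in the framework, so run_simulation below is only a stand-in for "launch a simulation and return its knee trajectory":

```python
import numpy as np

def run_simulation(a1, a2):
    """Stand-in for launching one Gazebo simulation with the given
    impedance parameters and returning the simulated knee trajectory."""
    raise NotImplementedError("handled by the actual framework")

def random_search(goal_traj, n_trials=200, seed=0):
    rng = np.random.default_rng(seed)
    best_params, best_cost = None, np.inf
    for _ in range(n_trials):
        # Sample normalized equilibrium angles for swing flexion (a1)
        # and swing extension (a2).
        a1, a2 = rng.uniform(0.0, 1.0, size=2)
        traj = np.asarray(run_simulation(a1, a2), dtype=float)
        cost = float(np.sum((traj - goal_traj) ** 2))
        if cost < best_cost:
            best_params, best_cost = (a1, a2), cost
    return best_params, best_cost
```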

In this case, the flexion result was consistent with our current controllers on the real device (we use 65° for this parameter). For swing extension the result was not as close: in our controllers we extend the knee to 0°. The difference may come from the fact that I am fixing the final time of the simulation instead of finding the final time of extension for each simulation. That affects the time normalization of the results, penalizing solutions with smaller knee extension angles because those take less time to reach 0°.

The data extraction for this simulation could be improved to account for this: we could find the time during extension at which the angle stops changing and define that as the final time for each simulation, as sketched below.
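One simple way to implement this (the function name and tolerances are illustrative): scan for the first instant after which the knee angle stays within a small tolerance for a short window, and use that instant as the final time:

```python
import numpy as np

def extension_end_time(t, knee_angle, tol_deg=0.5, hold_s=0.05):
    """Return the first time after which the knee angle stops changing,
    i.e., stays within tol_deg over a hold_s-second window.

    t: 1-D array of timestamps in seconds; knee_angle: degrees.
    """
    t = np.asarray(t, dtype=float)
    ang = np.asarray(knee_angle, dtype=float)
    for i in range(len(t)):
        window = ang[(t >= t[i]) & (t <= t[i] + hold_s)]
        if window.size > 1 and np.ptp(window) < tol_deg:
            return t[i]
    return t[-1]  # fall back to the fixed final time
```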

The next step is to define a walking goal function and learn how to walk!


A robot assistant can help with your bad habits

Last semester I took Dr. Chernova's “Human-Robot Interaction” course and got the chance to beta test this cute little robot:

Cozmo has a very simple concept: an OLED display for his face that can express emotions through animations of his eyes. The folks at Anki gave us a basic Python API to communicate with the robot, trigger actions, modify his behavior, and access the camera and other sensors like the accelerometer and gyro.
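For flavor, here is a minimal example in the style of the Cozmo SDK that Anki later released publicly (the beta API we used may have differed in details):

```python
import time
import cozmo

def hello_cozmo(robot: cozmo.robot.Robot):
    # Enable the camera stream so frames show up in robot.world.latest_image.
    robot.camera.image_stream_enabled = True
    robot.say_text("Hello!").wait_for_completed()
    time.sleep(0.5)  # give the camera a moment to deliver a frame
    image = robot.world.latest_image
    if image is not None:
        image.raw_image.save("frame.png")  # raw_image is a PIL image

cozmo.run_program(hello_cozmo)
```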

I wanted to explore an emotion-based approach for driving the actions of the robot, and how a robotic assistant could help treat body-focused repetitive behaviors. I defined a two-axis temperament scale that sets the emotion of the robot according to the robot's state in this x-y coordinate system.

Figure 5: The two-axis temperament scale.
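A minimal sketch of such a mapping, assuming the quadrants of the scale correspond to discrete emotion labels (the axis semantics and labels here are placeholders, not the actual scale from Figure 5):

```python
def emotion_from_state(x, y):
    """Map a point on the two-axis temperament scale to an emotion label."""
    if x >= 0 and y >= 0:
        return "excited"
    if x < 0 and y >= 0:
        return "annoyed"
    if x < 0 and y < 0:
        return "sad"
    return "calm"
```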

I trained the robot to detect when a person is biting her nails and used that signal as the input to a dynamical system that drives the robot's emotions. Then, using the emotions to trigger actions from the robot, I studied how our cute friend Cozmo could help treat the nail-biting behavior. You can check out how this setup works in this video:
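The detection-to-emotion link can be sketched as a first-order system: each detection pushes the temperament state toward a target region, and the state relaxes back to neutral otherwise. The target point, rate, and time step below are illustrative assumptions, not the tuned values from the project:

```python
import numpy as np

def update_emotion_state(state, biting_detected, dt=0.1, rate=0.5):
    """One step of a simple first-order dynamical system driving the
    robot's (x, y) temperament state from the nail-biting detector."""
    state = np.asarray(state, dtype=float)
    # Detections pull the state toward an 'annoyed' quadrant target;
    # otherwise it relaxes back to the neutral origin.
    target = np.array([-1.0, 1.0]) if biting_detected else np.zeros(2)
    return state + dt * rate * (target - state)
```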

You can find more about this in the paper:

“Combined Strategy of Machine Vision with a Robotic Assistant for Nail Biting Prevention”