Dolores

I developed the path planning algorithm at BU for a research project in the HyNeSs Robotics Lab. A technical writing example detailing the project, titled Visibility-based Pursuit-evasion Game with Sound Localization Heuristic, is provided here. The planner was written in MATLAB®, and the firmware on the physical robot was written in C++ using the online Mbed™ compiler.

Here is a video of the robot finding a path.

The environment is projected onto the floor. The red line indicates a flattened path generated by the method presented in the paper, given only the outline of the free space in the environment. The green region represents the “visibility” from the mobile robot. The robot’s goal is to find a path that is guaranteed to locate an evader regardless of the evader’s path; in this case, the evader is a static location in the “room” in the bottom-right corner. Speakers on the right represent sound from the evader. Of course, there are no walls in a projected environment, so the speakers had to be placed where the loudest reverberations might come from. At forks in the path, the robot evaluates which branch to pursue based on where it determines the sound to be coming from. In the simple demonstration above, the path is updated twice.
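For the curious, here is a minimal sketch of what that branch decision could look like; it is an illustration, not the lab's actual firmware. It estimates a bearing from the time difference of arrival between two microphones (via cross-correlation) and then pursues the fork branch whose heading is closest to that bearing. The microphone spacing, sample rate, and all function names are assumptions.

```cpp
#include <algorithm>
#include <cmath>
#include <cstddef>
#include <vector>

namespace {
const double kPi = 3.14159265358979323846;
const double kSpeedOfSound = 343.0;  // m/s at room temperature
}

// Find the lag (in samples) that maximizes the cross-correlation of two
// synchronized microphone buffers. The sign convention (which channel
// leads for a positive lag) depends on how the mics are wired.
int estimateDelaySamples(const std::vector<float>& left,
                         const std::vector<float>& right, int maxLag) {
    int bestLag = 0;
    float bestScore = -1e30f;
    for (int lag = -maxLag; lag <= maxLag; ++lag) {
        float score = 0.0f;
        for (std::size_t i = 0; i < left.size(); ++i) {
            std::ptrdiff_t j = static_cast<std::ptrdiff_t>(i) + lag;
            if (j >= 0 && j < static_cast<std::ptrdiff_t>(right.size()))
                score += left[i] * right[static_cast<std::size_t>(j)];
        }
        if (score > bestScore) { bestScore = score; bestLag = lag; }
    }
    return bestLag;
}

// Far-field approximation: sin(theta) = c * tau / d, where tau is the
// time difference of arrival and d is the microphone spacing.
double delayToBearing(int lagSamples, double sampleRateHz, double micSpacingM) {
    double tau = lagSamples / sampleRateHz;
    double s = kSpeedOfSound * tau / micSpacingM;
    s = std::max(-1.0, std::min(1.0, s));  // clamp numerical overshoot
    return std::asin(s);                   // radians off the array broadside
}

// At a fork, pursue the branch whose heading (in the robot frame) is
// closest to the estimated sound bearing.
std::size_t chooseBranch(double soundBearing,
                         const std::vector<double>& branchHeadings) {
    std::size_t best = 0;
    double bestErr = 1e30;
    for (std::size_t k = 0; k < branchHeadings.size(); ++k) {
        double err = std::fabs(
            std::remainder(soundBearing - branchHeadings[k], 2.0 * kPi));
        if (err < bestErr) { bestErr = err; best = k; }
    }
    return best;
}
```

Note that two microphones alone cannot distinguish front from back (the classic cone of confusion), which may be one motivation for the different microphone array configurations mentioned below.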

Here is another video of the robot (with an earlier microphone array configuration) being controlled remotely based on the localization from the OptiTrack™ system in the lab. The projected black dot is all that is seen on the controlling computer. This video was taken only to document the process by which the demonstration above was constructed; it is not part of the path planning method. Here we can clearly see the robot’s visibility changing as it moves through the projected environment.
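To give a sense of what driving a robot from an external tracker can involve, below is a minimal sketch of a go-to-point loop for a differential-drive base fed by a motion-capture system's planar pose. It is an assumption about how such a setup might be wired up, not the lab's actual code: setWheelSpeeds, the gains, and the wheelbase are all hypothetical.

```cpp
#include <algorithm>
#include <cmath>
#include <cstdio>

struct Pose { double x, y, theta; };  // planar pose from the tracker (m, rad)

// Stub actuator interface (hypothetical); the real robot would forward
// these commands to its motor drivers through the firmware.
void setWheelSpeeds(double leftMps, double rightMps) {
    std::printf("L = %.2f m/s, R = %.2f m/s\n", leftMps, rightMps);
}

// One iteration of a proportional go-to-point controller for a
// differential-drive base, driven by an external pose estimate.
void driveToward(const Pose& pose, double goalX, double goalY) {
    const double kLinear = 0.8;     // gain on distance error (assumed)
    const double kAngular = 2.0;    // gain on heading error (assumed)
    const double wheelBase = 0.15;  // m, wheel separation (assumed)
    const double maxSpeed = 0.3;    // m/s per wheel

    double dx = goalX - pose.x;
    double dy = goalY - pose.y;
    double dist = std::hypot(dx, dy);
    // Heading error wrapped to [-pi, pi].
    double headingErr = std::remainder(
        std::atan2(dy, dx) - pose.theta, 2.0 * 3.14159265358979323846);

    double v = kLinear * dist;         // forward speed command
    double w = kAngular * headingErr;  // turn-rate command

    // Unicycle-to-differential-drive conversion, then saturate.
    double left  = std::clamp(v - 0.5 * wheelBase * w, -maxSpeed, maxSpeed);
    double right = std::clamp(v + 0.5 * wheelBase * w, -maxSpeed, maxSpeed);
    setWheelSpeeds(left, right);
}
```

A loop like this would run at the tracker's frame rate, with the pose arriving over the network from the motion-capture host; the operator then only needs to supply goal points.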

This is my co-worker, Baxter, and mission control.

Some students in the lab are teaching Baxter to use a toaster through machine learning techniques, without any predefined kinematics.