
Robot dog leads the pack

Speed read
  • Training a guide dog for the visually impaired is expensive and time-consuming
  • Robotic dogs may be faster, cheaper, and without physical limitations
  • Student team’s prototype won first place in Intel Cup electronic design contest

According to Guide Dogs of America, a 16- to 18-month-old puppy will go through four to six months of training before it can become a guide dog. And that doesn't account for the financial cost of that training.

Robot Rover. Designed for wheelchair users, this robotic dog prototype is programmed to recognize dangers and guide the visually impaired. Courtesy 3TV/CBS5.

A team of students and faculty from the Ira A. Fulton Schools of Engineering at Arizona State University (ASU) has developed a high-tech alternative. Their technologically savvy guide dog recently won a first-prize award at the 2018 Intel Cup Undergraduate Electronic Design Contest. Two ASU students traveled to Shanghai, China, in July for the competition.

“The main motivations for this problem are how long it currently takes to train guide dogs for use and the cost of doing so,” computer science senior Stephen Lockhart said. “Along with this are the dog’s physical limitations such as color blindness.”

<strong>Out of a job?</strong> Living guide dogs like Eevi require 4-6 months of expensive training. Could a voice-controlled robot dog perform the same tasks? Courtesy smerikal. <a href='https://creativecommons.org/licenses/by-sa/2.0/'>(CC BY-SA 2.0)</a>

The team has developed a robotic guide dog that leads a motorized wheelchair, intended for individuals who are visually impaired. Both prototypes are currently too small to carry a person, but they allow the concept to be demonstrated and tested for future development.

Several different systems had to be integrated for the guide dog to function properly. It uses many of the same communication protocols found in a smartphone, such as Wi-Fi and Bluetooth.

“In addition to these protocols, we had to develop an algorithm in order to get the systems to behave in the way we wanted them to,” said Richard Simpson, a computer systems engineering and engineering (robotics) major. “For example, when the dog sees a cone we needed to make certain that it would decide to move around the cone rather than plowing into it.”
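The cone example Simpson describes amounts to a simple decision rule: if an obstacle is close enough to matter, choose a maneuver instead of driving into it. The sketch below is purely illustrative — the labels, fields, and thresholds are assumptions, not the team's actual VIPLE logic.

```python
# Hypothetical sketch of an obstacle-avoidance decision rule like the one
# described above. Labels, fields, and thresholds are invented for
# illustration; the team's real logic runs in VIPLE, not Python.
from dataclasses import dataclass

@dataclass
class Detection:
    label: str         # e.g. "cone", "stoplight"
    distance_m: float  # estimated distance to the object
    offset: float      # lateral offset: negative = left of path, positive = right

def next_move(detection, clearance_m=1.0):
    """Decide how the robot should react to a single detected object."""
    if detection is None or detection.distance_m > clearance_m:
        return "forward"                        # nothing blocking the path
    if detection.label == "cone":
        # Steer around the cone on whichever side has more room.
        return "veer_right" if detection.offset <= 0 else "veer_left"
    return "stop"                               # unknown or fixed obstacle: be safe

print(next_move(Detection("cone", 0.5, -0.2)))  # cone just left of center → veer_right
```

The key design point from the quote is that detection alone is not enough — the system also needs an explicit policy mapping each detected object to a maneuver.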

The guide dog is also equipped with Amazon's Alexa technology to understand verbal commands.

<strong>Blue ribbon.</strong> ASU students Richard Simpson and Stephen Lockhart and mentor Yinong Chen took first place at the 2018 Intel Cup in Shanghai for their prototype of a robotic guide dog. Courtesy Yinong Chen.

“I believe using Alexa mostly came out of the necessity of needing an easy method of control that would work for a person who is blind,” Lockhart said. “We also demonstrated without the voice control since purely relying on voice commands in a noisy urban environment might prove more difficult than a set of buttons to press for commands.”

For the guide dog to “see” its surroundings, a GoPro camera was strapped to its head.

“The images it sees are then passed onto the wheelchair’s computer for a combination of AI visual recognition and specific filters for things like a cone or stoplight,” Lockhart said. “This information is then sent to a laptop running our logic system in a program called VIPLE.”

While that information is sent to the computer, a user can speak a command to Alexa, which is received by the laptop and put into VIPLE (Visual IoT/Robotics Programming Language Environment), along with the visual information from the GoPro. Depending on what is “seen” and the command given, instructions are sent back to the wheelchair and on to the guide dog to process what movement to make.
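The round trip just described — a recognized scene plus a spoken command yielding one movement instruction — can be sketched as a small fusion function. This is a hedged illustration only: the rule table and names are assumptions, and the real system implements this logic in VIPLE rather than code like this.

```python
# Rough sketch of the command loop described above: visual recognition
# output and an Alexa voice command are combined into one instruction
# for the dog and wheelchair. The rules are invented for illustration.
def decide(scene_labels, voice_command):
    """Fuse scene labels (from the camera) with a spoken command."""
    if "stoplight_red" in scene_labels:
        return "stop"                 # safety observation overrides any command
    if voice_command == "stop":
        return "stop"
    if voice_command == "go":
        if "cone" in scene_labels:
            return "go_around"        # obey the command, but dodge the cone
        return "forward"
    return "wait"                     # unrecognized command: hold position

print(decide({"cone"}, "go"))         # → go_around
```

One design choice worth noting: safety-relevant perception (a red light) takes priority over the user's command, which matches the article's framing of the dog as a guide that recognizes dangers, not just a remote-controlled vehicle.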

Despite ultimately winning first place, the project was not without its difficulties.

“We faced challenges in having the vision be consistent and not misidentifying a red T-shirt as a red stoplight,” Lockhart said.
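The T-shirt problem illustrates why a color-only filter is fragile: anything red enough matches. A common mitigation is to gate the color match on extra context such as shape and position in the frame. The sketch below is an assumption about how such a check could work — the thresholds and fields are invented, not taken from the team's filters.

```python
# Illustrative sketch of why a color-only filter misfires on a red
# T-shirt, and one common fix: require shape and position context
# before accepting a "stoplight" match. All thresholds are invented.
def is_stoplight(region):
    """region: dict with mean RGB color and normalized bounding-box geometry."""
    r, g, _ = region["mean_rgb"]
    red_enough = r > 180 and g < 80
    if not red_enough:
        return False                     # color-only stage: shirt would pass here
    w, h = region["width"], region["height"]
    roundish = 0.7 < w / h < 1.4         # a lit lamp is roughly circular
    elevated = region["top_y"] < 0.4     # stoplights sit high in the frame
    return roundish and elevated

shirt = {"mean_rgb": (200, 40, 40), "width": 60, "height": 120, "top_y": 0.6}
light = {"mean_rgb": (210, 50, 40), "width": 20, "height": 22,  "top_y": 0.1}
print(is_stoplight(shirt), is_stoplight(light))  # shirt rejected, light accepted
```

Both regions pass the color test; only the geometry checks separate them, which is the kind of "specific filter" consistency the team had to tune.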

“Another major challenge area was the motions of the dog. It had great difficulty turning left for a long time, and its movements could change drastically depending on the surface it was on.”

<strong>At the sound of my voice.</strong> The robotic guide dog and motorized wheelchair are based on the integration of VIPLE, Alexa-based voice control, and machine learning. Courtesy Yinong Chen.

Leading up to the event, team mentors Yinong Chen, a computer science and engineering senior lecturer, and Jinhui Zhu, a visiting scholar at ASU’s School of Computing, Informatics, and Decision Systems Engineering, were very optimistic about the project’s performance in the competition.

“I honestly didn’t quite know what to expect for the competition and was mostly more concerned with getting things working properly before worrying about our place in the competition,” Lockhart said.

“I don’t think I really gave it much thought until Shanghai, but Professor Chen’s confidence rubbed off on most of us, so we were aiming to at the very least match previous ASU teams in terms of standing.”

Lockhart and Simpson may have been the only two team members to travel to China, but the team’s five other student members and multiple team mentors were rooting for them from home.

Computer systems engineering majors Matthew Koltes and Tyler Pavkov, computer science majors Aubree Dagilis and Yichenglong Zhong, and Denis Liu, a high school junior from Corona Del Sol High School in Tempe, Arizona, rounded out the team.

The guide dog project is a part of Chen’s ongoing and future-oriented computer science education projects.

“These projects are based on the integration of VIPLE, Alexa-based voice control, and machine learning,” Chen said. “The guide dog and wheelchair in this Intel Cup project extend the types of the devices that VIPLE can control, and thus offer more platforms for teaching and research.”

Read the original article on ASU's site here.


Copyright © 2021 Science Node ™

