Just when we think we have a handle on all the incredible ways that dogs enhance our lives and our understanding of the world, new work with dogs expands that sphere even further. Graduate student Kiana Ehsani at the University of Washington has a great collaborator named Kelp, an Alaskan Malamute, who is a key partner in her quest to create an artificial intelligence system that thinks like a dog.

The long-term goal is to produce a robot that is enough like a dog to perform many of the tasks that dogs are trained to do for humans. Though that may seem like a faraway dream, Ehsani's research project is edging ever closer to that possibility.

Broadly, the goal of the current research was to study and emulate the dog's response to visual information. Specifically, the scientists wanted to teach a machine to act and plan like a dog, which meant modeling the dog's future actions from the images she had just seen.
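To make that framing concrete, here is a minimal sketch (not the researchers' actual code) of how "predict the next action from recently seen images" can be posed as a supervised learning problem: a sliding window of past frames becomes the input, and the action the dog took next becomes the label. The frame descriptions, action names, and the trivial lookup-table "model" below are all made-up placeholders.

```python
from collections import Counter

def make_training_pairs(frames, actions, window=2):
    """Slide a window over the frame sequence; each window of past
    frames is paired with the action the dog took next."""
    pairs = []
    for t in range(window, len(frames)):
        pairs.append((tuple(frames[t - window:t]), actions[t]))
    return pairs

class HistoryTablePredictor:
    """Trivial baseline: remember which actions followed each window of
    frames, and predict the most common one for an identical window."""
    def fit(self, pairs):
        self.table = {}
        for window, action in pairs:
            self.table.setdefault(window, Counter())[action] += 1
        return self

    def predict(self, window):
        counts = self.table.get(tuple(window))
        return counts.most_common(1)[0][0] if counts else None

# Hypothetical data: what the head camera saw at each moment, and the
# action the dog performed next (run toward a ball, walk on grass).
frames  = ["grass", "grass", "ball", "ball", "grass", "grass", "ball", "ball"]
actions = ["walk",  "walk",  "walk", "run",  "run",   "walk",  "walk", "run"]

model = HistoryTablePredictor().fit(make_training_pairs(frames, actions))
print(model.predict(["ball", "ball"]))    # -> run
print(model.predict(["grass", "grass"]))  # -> walk
```

In the actual research a learned model (rather than a lookup table) generalizes to windows it has never seen, but the input/output framing is the same: past visual observations in, predicted next movement out.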

Dogs make decisions all the time based on what they see, whether it is a ball that is being tossed their way or a tree in the path that they must walk around. Vision is used for many tasks that are of interest in artificial intelligence, such as facial recognition, object detection, object tracking, determining which surfaces can be walked on, and route planning.

To develop the foundation database for the models, Ehsani and the team of scientists she leads attached a number of sensors to Kelp—on the head, torso and tail—for a few hours a day to capture her movements as she went about her daily activities. A camera attached to her head recorded what was in her view during this time, part of which was spent indoors and part outside. Read more from thebark.com…

thumbnail courtesy of thebark.com