Machine learning gives us the big picture

Dogs’ thoughts have been read! Kind of.

The researchers used fMRI (functional magnetic resonance imaging) scans of the dogs’ brains and a machine learning tool to reconstruct what the dogs saw. The results suggest that dogs are more interested in what is happening than in who or what is involved.

The results of the experiment, conducted at Emory University in Georgia in the United States, are published in the Journal of Visualized Experiments.

Two unrestrained dogs watched three 30-minute videos. fMRI data were recorded while they watched, and a machine learning algorithm was used to analyze patterns in the scans.

“We’ve shown that we can monitor activity in a dog’s brain while it’s watching video and, at least to some extent, reconstruct what it’s watching,” says Gregory Berns, professor of psychology at Emory. “The fact that we are able to do this is remarkable.”

The use of fMRI to study perception has recently been developed in humans and only a few other species, including some primates.

“Although our work is based on just two dogs, it offers proof of concept that these methods work on dogs,” says lead author Erin Phillips, from Scotland’s University of St Andrews, who conducted the research as a specialist in the Berns Canine Cognitive Neuroscience Laboratory. “I hope this article will help pave the way for other researchers to apply these methods to dogs, as well as other species, so that we can get more data and a better understanding of how the minds of different animals work.”

Interestingly, machine learning is a technology that aims to mimic the neural networks of our own brains by recognizing patterns and analyzing huge amounts of data.

The technology “reads minds” by detecting patterns in brain data that can be associated with what’s playing in the video.

Using a video recorder mounted on a selfie stick held at dog eye level, the researchers filmed scenes relatable to a canine audience.

Activities recorded included dogs being petted and receiving treats from people.

Scenes with dogs showed them sniffing, playing, eating, or walking. Other objects and animals in the scenes included cars, bicycles, scooters, cats and deer, as well as people sitting, hugging, kissing, offering a toy to the camera, and eating.

Time stamps on the videos helped classify them into objects (like dog, car, human, cat) and actions (like sniffing, eating, walking).
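This labeling step can be pictured as mapping each timestamped video segment to its object and action classes. The sketch below is purely illustrative — the segment boundaries, class names, and data structures are invented for this example, not taken from the study:

```python
# Hypothetical sketch of the labeling step: timestamped video segments
# are mapped to object and action classes. All times and labels here
# are made up for illustration.
from dataclasses import dataclass

@dataclass
class Segment:
    start_s: float   # segment start time in the video, in seconds
    end_s: float     # segment end time, in seconds
    objects: list    # object classes visible, e.g. "dog", "car", "human", "cat"
    actions: list    # action classes occurring, e.g. "sniffing", "eating", "walking"

segments = [
    Segment(0.0, 12.5, ["dog", "human"], ["eating"]),
    Segment(12.5, 30.0, ["dog"], ["sniffing", "walking"]),
]

def labels_at(segments, t_s):
    """Return the (objects, actions) labels active at time t_s."""
    for seg in segments:
        if seg.start_s <= t_s < seg.end_s:
            return seg.objects, seg.actions
    return [], []

print(labels_at(segments, 20.0))  # → (['dog'], ['sniffing', 'walking'])
```

Aligning labels to timestamps like this is what lets each fMRI scan be paired with what was on screen at that moment.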

Only two dogs had the patience to sit through the feature. For comparison, two humans also underwent the same experiment. Both species, presumably, were coaxed with treats and belly pats.

The Ivis machine learning algorithm was applied to the data. Ivis was first trained on the human subjects, where the model was 99% accurate in mapping brain data to both the object and action classifiers.

For the dogs, however, the model did not work for the object-based classifiers, but it was between 75% and 88% accurate in decoding the action classifiers from the dogs’ fMRI scans.
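The decoding idea — learn which brain-activity patterns go with which video labels, then measure how often held-out scans are classified correctly — can be sketched in a few lines. This is not the study’s actual pipeline (Ivis is a neural-network-based dimensionality-reduction method); the sketch below substitutes a simple nearest-centroid classifier on synthetic stand-in data just to show the shape of the task:

```python
# Hypothetical sketch (NOT the study's pipeline): decode action labels
# from fMRI-like feature vectors with a nearest-centroid classifier.
# The voxel counts, labels, and data are synthetic stand-ins.
import numpy as np

rng = np.random.default_rng(0)

ACTIONS = ["sniffing", "eating", "walking"]
N_VOXELS = 200            # stand-in for flattened fMRI voxel features
SAMPLES_PER_ACTION = 30

# Synthesize data: each action gets its own mean activation pattern,
# and individual scans are noisy copies of that pattern.
true_patterns = rng.normal(size=(len(ACTIONS), N_VOXELS))
X, y = [], []
for i, _ in enumerate(ACTIONS):
    X.append(true_patterns[i] + 0.5 * rng.normal(size=(SAMPLES_PER_ACTION, N_VOXELS)))
    y += [i] * SAMPLES_PER_ACTION
X = np.vstack(X)
y = np.array(y)

def fit_centroids(X, y, n_classes):
    """The 'decoder': mean activation pattern per action class."""
    return np.vstack([X[y == c].mean(axis=0) for c in range(n_classes)])

def predict(X, centroids):
    """Assign each scan to the action class with the nearest centroid."""
    dists = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
    return dists.argmin(axis=1)

centroids = fit_centroids(X, y, len(ACTIONS))
accuracy = (predict(X, centroids) == y).mean()
print(f"decoding accuracy: {accuracy:.2f}")
```

Reported accuracies like the 75–88% for the dogs’ action classifiers come from exactly this kind of comparison between predicted and true labels, though the real model is far more sophisticated.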

Bhubo, shown with his owner Ashwin, getting ready for his video viewing session in an fMRI scanner. The dog’s ears are taped to hold earplugs that muffle the sound of the fMRI scanner. Credit: Emory Canine Cognitive Neuroscience Lab.

“We humans are very object-oriented,” Berns says. “There are 10 times more nouns than verbs in English because we have a particular obsession with naming objects. Dogs seem less concerned with who or what they see and more concerned with the action itself.”

Dogs only see in shades of blue and yellow, but have a slightly higher density of visual receptors designed to detect movement.

“It makes perfect sense that dogs’ brains are first and foremost highly sensitive to actions,” Berns adds. “Animals need to be very concerned about what is going on in their environment to avoid being eaten or to watch out for animals they might want to hunt. Action and movement are essential.”

Phillips thinks that understanding how animals perceive the world is important for her own research into how the reintroduction of predators in Mozambique may impact ecosystems.

“Historically, there hasn’t been much overlap between computing and ecology,” she says. “But machine learning is a growing field that is beginning to find broader applications, including in ecology.”
