
Scientists read birds’ brain signals to predict what they’ll sing next


Signals in the brains of birds have been read by scientists, in a breakthrough that could help develop prosthetics for people who can no longer speak.

In the study, silicon implants recorded the firing of brain cells as adult male zebra finches went through their full repertoire of songs.

By feeding the brain signals through artificial intelligence, the team at the University of California San Diego was able to predict what the birds would sing next.

The breakthrough opens the door to new devices that could, for the first time, convert the thoughts of people who cannot speak into real spoken words.

Current state-of-the-art implants allow the user to generate text at a rate of about 20 words per minute, but this technique could enable a completely natural ‘new voice’.

Co-author of the study, Timothy Gentner, said he envisioned a voice prosthesis for people without a voice that would allow them to communicate naturally with speech.

Illustration of the experimental workflow. As a male zebra finch sings his song – which consists of the sequence “1, 2, 3,” – he thinks of the next syllable he will sing (“4”)

In the study, silicon implants recorded the firing of brain cells as adult male zebra finches went through their full repertoire of songs (stock image)

HOW THEY PREDICTED THE BIRDSONG

Researchers implanted silicon electrodes into the brains of adult male zebra finches and recorded the birds’ neural activity as they sang.

They studied a specific set of electrical signals called local field potentials.

These signals were registered in the part of the brain necessary for learning and producing song.

The researchers found that these signals translate into specific syllables of the birdsong, and predict when the syllables will occur during singing.

First author Daril Brown, a PhD student in computer engineering, said the bird brain work “paves the way for the greater cause” of giving the voiceless a voice.

‘We are studying birdsong in a way that helps us get one step closer to designing a brain machine interface for vocalization and communication.’

Birdsong and human speech have many characteristics in common, including the fact that both are learned behaviors and are more complex than other animal sounds.

Among the signals coming from the birds’ brains, the team focused on a set of electrical signals called “local field potentials.”

These are necessary for learning and producing songs.

These signals have already been heavily studied in humans, and here they were used to predict the vocal behavior of zebra finches.

Co-project leader Professor Vikash Gilja said: ‘Our motivation for exploring local field potentials was that most of the complementary human work on developing speech prostheses has targeted these types of signals.

‘In this paper we show that there are many similarities in this type of signaling between the zebra finch and humans, as well as other primates.

“With these signals, we can begin to decipher the brain’s intent to generate speech.”

Distinct features of the signals translated into specific ‘syllables’ of the birdsong and showed when they would occur, enabling predictive algorithms.

“Using this system, we can predict the onset of a songbird’s vocal behavior — in what order the bird will sing and when it will sing,” explains Brown.

The system even anticipated variations in the order of the syllables, down to the individual syllable.
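The decoding step can be pictured with a minimal sketch. This is not the study’s actual pipeline; the data, feature counts, and the nearest-centroid decoder are all invented here purely to illustrate the idea of predicting an upcoming syllable from a window of neural activity:

```python
import numpy as np

rng = np.random.default_rng(0)

# Invented stand-in for windowed local field potential (LFP) features:
# assume each of four song syllables has a distinct neural signature.
n_syllables, n_features, n_trials = 4, 8, 50
signatures = rng.normal(0.0, 1.0, (n_syllables, n_features))

# Noisy feature windows "recorded" just before each syllable is sung.
X = np.vstack([sig + rng.normal(0.0, 0.3, (n_trials, n_features))
               for sig in signatures])
y = np.repeat(np.arange(n_syllables), n_trials)

# Nearest-centroid decoder: the predicted "next syllable" is whichever
# syllable's average neural signature the current window lies closest to.
centroids = np.stack([X[y == s].mean(axis=0) for s in range(n_syllables)])

def predict_next_syllable(window):
    distances = np.linalg.norm(centroids - window, axis=1)
    return int(np.argmin(distances))

accuracy = np.mean([predict_next_syllable(x) == label
                    for x, label in zip(X, y)])
print(f"decoding accuracy on simulated data: {accuracy:.2f}")
```

Because the simulated signatures are well separated, even this simplest possible decoder recovers the upcoming syllable reliably; the real study works with far messier recordings and more sophisticated models.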


ELON MUSK'S NEURALINK LETS A MONKEY PLAY PONG WITH HIS MIND

Elon Musk’s Neuralink has shown off its latest brain implant by having a monkey play Pong with his mind.

The brain computer interface was implanted in a nine-year-old macaque named Pager.

The device in his brain recorded information about the neurons firing as he played the game.

Musk said on Twitter: “Soon our monkey will be on twitch & discord.”

Last month, the tech tycoon told a Twitter user that he was working with the US Food and Drug Administration to get approval to start human trials.

Suppose the bird’s song is built on a repeating set of syllables, “1, 2, 3, 4”, and every now and then the order changes to something like “1, 2, 3, 4, 5” or “1, 2, 3.”

Features in the local field potentials reveal these changes, the researchers found.

‘These forms of variation are important for us to test hypothetical speech prostheses, because a person does not just repeat one sentence over and over,’ says Prof Gilja.

“It’s exciting that we have found parallels in the brain signals recorded and documented in human physiological studies with our study in songbirds.”

Conditions associated with loss of speech or language functions range from head injuries to dementia and brain tumors.

Project co-leader Prof Timothy Gentner said: ‘In the longer term, we want to use the detailed knowledge we gain from the songbird brain to develop a communication prosthesis that can improve the quality of life of people suffering from a variety of diseases and conditions.’

SpaceX founder Elon Musk and Facebook CEO Mark Zuckerberg are currently working on brain-reading devices that can send texts through thoughts.

The study is published in PLOS Computational Biology.

HOW DO ARTIFICIAL INTELLIGENCE NEURAL NETWORKS LEARN?

AI systems rely on artificial neural networks (ANNs), which attempt to simulate the way the brain works for learning.

ANNs can be trained to recognize patterns in information — including speech, text data, or visual images — and are the foundation for many advances in AI over the years.

Conventional AI trains an algorithm on a particular topic by feeding it massive amounts of information.

AI systems rely on artificial neural networks (ANNs), which attempt to simulate the way the brain works for learning. ANNs can be trained to recognize patterns in information – including speech, text data or visual images

Practical applications include Google’s translation services, Facebook’s facial recognition software, and Snapchat’s image-changing live filters.

The process of entering this data can be extremely time-consuming and is limited to one type of knowledge.

A new breed of ANNs called Adversarial Neural Networks pit the minds of two AI bots against each other, allowing them to learn from each other.

This approach is designed to accelerate the learning process and refine the output of AI systems.
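As a toy illustration of how a network refines itself from labeled examples, the sketch below trains a tiny two-layer network on the XOR pattern with plain NumPy. This is a generic textbook example, not code from the study or from any product mentioned above; all sizes and learning rates are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(1)

# XOR: a pattern a single linear unit cannot learn,
# but a network with one hidden layer can.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Random starting weights for a 2 -> 8 -> 1 network.
W1, b1 = rng.normal(0, 1, (2, 8)), np.zeros(8)
W2, b2 = rng.normal(0, 1, (8, 1)), np.zeros(1)

def forward(inputs):
    hidden = sigmoid(inputs @ W1 + b1)
    return hidden, sigmoid(hidden @ W2 + b2)

_, out = forward(X)
initial_loss = float(np.mean((out - y) ** 2))

# Training loop: nudge every weight in the direction that shrinks the
# gap between the network's guesses and the known answers.
lr = 1.0
for _ in range(5000):
    hidden, out = forward(X)
    d_out = (out - y) * out * (1 - out)
    d_hidden = (d_out @ W2.T) * hidden * (1 - hidden)
    W2 -= lr * hidden.T @ d_out
    b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_hidden
    b1 -= lr * d_hidden.sum(axis=0)

_, out = forward(X)
final_loss = float(np.mean((out - y) ** 2))
print(f"loss before training: {initial_loss:.3f}, after: {final_loss:.3f}")
```

The same principle, comparing predictions against known answers and adjusting weights to reduce the error, scales from this four-example toy up to the speech- and image-recognition systems described above.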
