By scanning a person's brainwaves, an AI developed by Facebook's owner Meta may be able to "hear" what that person is hearing.
Here's what you need to know:
Meta Researchers Are Working on a New AI that Aims to Study Brainwaves
Meta announced on Aug. 31 that researchers working in its AI lab have developed artificial intelligence (AI) that, by analyzing brainwaves, can "hear" what people are hearing.
The researchers are developing a new approach to understanding what goes on in people's minds.
In the Meta AI study, according to a news story by Time.com, 169 healthy adults listened to stories and words read aloud while various equipment (such as electrodes attached to their heads) monitored their brain activity.
In an effort to uncover patterns, researchers then fed the data into an AI model. Based on the electrical and magnetic activity in participants' brains, they wanted the algorithm to "hear," or ascertain, what the participants were listening to.
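The article does not detail the model itself, but the general recipe it describes (training a model to match windows of brain activity to the speech people were hearing) can be sketched. Below is a minimal, hypothetical illustration in Python using PyTorch; the BrainEncoder class, sensor counts, and contrastive objective are assumptions made for illustration only, not Meta's actual code.

```python
# Illustrative sketch only (not Meta's code): learn an encoder that maps short
# windows of brain activity into the same embedding space as the speech audio
# the person was hearing, then decode by picking the best-matching segment.
import torch
import torch.nn as nn
import torch.nn.functional as F

class BrainEncoder(nn.Module):
    """Toy encoder: flattens a window of sensor readings into an embedding."""
    def __init__(self, n_sensors=208, n_timesteps=360, embed_dim=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Flatten(),
            nn.Linear(n_sensors * n_timesteps, 512),
            nn.ReLU(),
            nn.Linear(512, embed_dim),
        )

    def forward(self, x):
        return F.normalize(self.net(x), dim=-1)

# Fake batch: 8 brain-activity windows plus the 8 matching speech embeddings
# (in practice the speech side would come from a pretrained audio model).
brain = torch.randn(8, 208, 360)
speech = F.normalize(torch.randn(8, 128), dim=-1)

encoder = BrainEncoder()
brain_emb = encoder(brain)

# Contrastive objective: each brain window should be most similar to its own
# speech segment and dissimilar to the other segments in the batch.
logits = brain_emb @ speech.t()
loss = F.cross_entropy(logits, torch.arange(8))
loss.backward()

# At decode time, "hearing" what someone heard amounts to ranking candidate
# speech segments by similarity to the brain embedding.
predicted = logits.argmax(dim=-1)
```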
This AI software is still in its very early stages.
Possible Uses of this AI Technology
Although Meta has no immediate intentions to turn this technology into a commercial product, it might be a first step toward reading a nonverbal person's thoughts to enable them to communicate more effectively.
According to Jean Remi King, a research scientist at the Facebook Artificial Intelligence Research (FAIR) Lab, this would be particularly beneficial for people with conditions that impair communication, such as traumatic brain injury and anoxia.
Technology with this application already exists: by placing electrodes on the motor regions of a patient's brain, researchers can decode activity and help the patient communicate with the outside world. However, implanting electrodes inside someone's brain is rather invasive.
Therefore, researchers at FAIR are seeking another method that can record this brain activity without probing the brain with electrodes, which requires surgery.
Challenges Associated With Meta's AI Research
Meta's AI research may sound complicated. Well, that's because it really is (at least while the study is in its early stages).
King noted that one of the major challenges in this research is that the signals detected as brain activity are very "noisy," because the sensors sit quite far from the brain.
Unlike current technology, where an electrode is placed on the brain to study brainwaves, Meta's research does not require actually opening up someone's skull.
Because of this, the skull and the skin can corrupt the signal that the researchers pick up from the brain. King noted that picking up brainwaves with just a sensor requires "super advanced technology."
The second major issue is more conceptual in nature: it is very challenging to translate brainwaves into actual language.
In other words, even if the researchers at FAIR succeed in eliminating the "noise" captured by the sensors, the next major problem lies in how scientists decode that brain activity into words.