Ann Johnson, a teacher, volleyball coach, and mother from Regina, Saskatchewan, suffered a debilitating stroke in 2005 that left her unable to speak. At her wedding reception two years earlier, Johnson's gift for speech was on full display as she delivered a lively 15-minute toast; the catastrophic stroke paralyzed her and robbed her of her ability to communicate verbally. Now, a groundbreaking study has shown remarkable progress in helping her, and others in similar situations, regain the ability to speak.

In this milestone achievement combining neuroscience and artificial intelligence, researchers implanted electrodes into Johnson's brain to decode her neural signals as she silently attempted to speak sentences. These signals were then converted into written and vocalized language, allowing an avatar on a computer screen to articulate her words and display facial expressions. The study, published in the journal Nature, represents the first time spoken words and facial expressions have been synthesized directly from brain signals. Johnson personally selected the avatar, which bore a resemblance to her, and researchers used a recording of her wedding toast as the basis for the avatar's voice.

The ultimate goal of the research is to help people who have lost the ability to speak because of strokes or conditions like cerebral palsy. Johnson's implant is currently connected to a computer by a cable, but researchers are actively developing wireless versions. They envision a future in which people who have lost the ability to speak can hold real-time conversations through computerized representations of themselves that convey tone, inflection, and emotion. As Johnson's story shows, the field is progressing rapidly.
Just two years ago, the same team published research showing how a paralyzed man named Pancho was able to produce basic words using a simpler implant and algorithm. Johnson's implant, with nearly double the number of electrodes, can detect a much broader range of speech-related neural signals. The artificial intelligence used in the study was trained to recognize phonemes, the sound units from which any word can be built, rather than individual words. The system allowed Johnson to communicate at 78 words per minute with a significantly larger vocabulary than in previous studies. While that rate is still well below typical conversational speech of about 160 words per minute, the progress is remarkable. The researchers initially did not plan to use an avatar or audio in Johnson's case, but the promising results encouraged them to tackle more challenging aspects of communication. They developed an algorithm to decode brain activity into audio waveforms, producing vocalized speech, and worked with a company specializing in facial animation to program the avatar with data on muscle movements. Johnson then practiced making various facial expressions, which were captured and conveyed through the avatar.
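The advantage of decoding phonemes rather than whole words can be illustrated with a toy sketch (this is an assumption-laden simplification, not the study's actual model; the lexicon entries and function names below are hypothetical): a classifier over a few dozen phonemes can in principle spell out any word, whereas a word-level classifier is locked to its training vocabulary.

```python
# Toy sketch of phoneme-based decoding (NOT the study's model).
# English needs only ~39 phonemes, so a phoneme classifier can compose
# words it never saw, unlike a fixed word-level classifier.

# Hypothetical ARPAbet-style lexicon, trimmed to a few demo entries.
LEXICON = {
    "hello": ["HH", "AH", "L", "OW"],
    "world": ["W", "ER", "L", "D"],
    "think": ["TH", "IH", "NG", "K"],
}

def phonemes_to_word(decoded):
    """Map a decoded phoneme sequence to a word via exact lexicon lookup."""
    for word, phones in LEXICON.items():
        if phones == decoded:
            return word
    return None  # no match; a real system would consult a language model

print(phonemes_to_word(["HH", "AH", "L", "OW"]))  # -> hello
```

A real decoder predicts a phoneme probability distribution at each time step and searches the lexicon with a language model; the exact-lookup step here only stands in for that final mapping.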
Through this groundbreaking technology, Johnson was able to tell her husband phrases like "I think you are wonderful" and "What do you think of my artificial voice?" and to converse on a range of topics. The rapid progress has experts speculating that wireless versions of these communication systems could receive federal approval within the next decade, with different approaches likely optimized for specific patients. For instance, another study published in Nature used electrodes implanted deeper in the brain to detect the activity of individual neurons; that approach achieved a decoding rate of 62 words per minute for its participant, who has amyotrophic lateral sclerosis (ALS). Neither approach was completely accurate, with decoding errors in roughly a quarter of cases, but both studies showed significant potential. Observers were generally able to interpret the avatar's facial expressions, although understanding its spoken words proved more challenging, and researchers are working to improve word recognition through prediction algorithms.

It's important to note that these systems are not reading people's minds or thoughts. They interpret neural signals to predict and generate speech, much as a baseball batter reads a pitcher's movements to anticipate a pitch. Researchers acknowledge, however, that mind reading may become possible in the future, which would raise ethical and privacy concerns.

Johnson's path to this research began when she contacted Dr. Edward Chang's team after reading an article about their work with Pancho. Despite living far from the lab in San Francisco, her persistence paid off, and she joined the study. Determination has always been a defining characteristic of her personal and professional life.
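One way a prediction algorithm can reduce decoding errors is by snapping a noisy decoded string to the most similar word in a known vocabulary. The sketch below is a hedged illustration of that idea only; the vocabulary, function name, and use of a similarity ratio are assumptions, not the study's method.

```python
# Toy error-correction sketch (an assumption, not the study's algorithm):
# a noisy decoder output is replaced by the most similar vocabulary word,
# the way a prediction model can repair individual decoding mistakes.
from difflib import SequenceMatcher

VOCAB = ["wonderful", "artificial", "voice", "think"]

def correct(decoded_word):
    """Return the vocabulary word most similar to the noisy decoded string."""
    return max(VOCAB, key=lambda w: SequenceMatcher(None, decoded_word, w).ratio())

print(correct("wonderfol"))  # -> wonderful
```

Production systems use far stronger priors, such as large language models over whole sentences, but the principle is the same: contextual prediction recovers words the raw neural decoder gets wrong.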
Even after her stroke, she pursued opportunities such as taking online counseling courses to help trauma survivors. Since the stroke, her communication has been severely limited; she has relied on assistive systems, such as glasses with a reflective dot that let her select letters and words on a computer screen. With this new technology, she has regained some expressive capability and feels a renewed sense of purpose. The research continues to push boundaries, offering hope that people like Johnson will one day communicate fully again. Each patient's needs must be carefully considered, and further research is needed to optimize these approaches for long-term use. The ultimate goal is to restore a sense of identity and enable individuals to engage in meaningful conversation once again.
Google's DeepMind, a prominent artificial intelligence research lab, has introduced a groundbreaking AI system called AlphaCode that demonstrates the capability to write computer code at a level comparable to human programmers.
During a House Committee on Foreign Affairs hearing today, witnesses cautioned lawmakers that permitting China to buy advanced U.S. artificial intelligence (AI) chips would pose significant national security threats.
Hospital issues warning after fake videos claiming doctor endorsements
Victoria Cook, London
A hospital trust in south London has raised an alert following the circulation of fraudulent videos online falsely claiming that its staff endorse weight loss products.
Vista Social, a leading social media management platform, has announced a significant advancement by integrating ChatGPT technology to greatly enhance content creation and user engagement capabilities.
Incorporating artificial intelligence (AI) into your search engine optimization (SEO) strategy can greatly boost your website’s performance and improve its search engine rankings.
A 25-person marketing team rapidly expanded by adding over 100 AI teammates within six months, revolutionizing their operations.