Ann Johnson, a teacher, volleyball coach, and mother from Regina, Saskatchewan, suffered a debilitating stroke in 2005 that left her unable to speak. At her wedding reception two years earlier, her gift for speech had been on full display in a lively 15-minute toast. Then the stroke paralyzed her and robbed her of her ability to communicate verbally. Now, a groundbreaking study has shown remarkable progress in helping her, and others in similar situations, regain their ability to speak.

In this milestone achievement combining neuroscience and artificial intelligence, researchers implanted electrodes into Johnson's brain to decode her neural signals as she silently attempted to speak sentences. These signals were then converted into written and vocalized language, allowing an avatar on a computer screen to articulate her words and display facial expressions. The study, published in the journal Nature, represents the first time spoken words and facial expressions have been synthesized directly from brain signals. Johnson personally selected the avatar, which bore a resemblance to her, and researchers used her wedding toast as the basis for the avatar's voice.

The ultimate goal of this research is to help people who have lost the ability to speak due to conditions like strokes or cerebral palsy. Currently, Johnson's implant is connected to a computer via a cable, but researchers are actively developing wireless versions. They envision a future in which people who have lost the ability to speak can hold real-time conversations through computerized representations of themselves that convey tone, inflection, and emotion. As Johnson's story shows, the field is progressing rapidly.
Just two years ago, the same team published research showing how a paralyzed man named Pancho was able to produce basic words using a simpler implant and algorithm. Johnson's implant, with nearly double the number of electrodes, detects a much broader range of speech-related neural signals. The artificial intelligence used in the study was trained to recognize phonemes, the sound units from which any word can be built, rather than whole words. The implant allowed Johnson to communicate at a rate of 78 words per minute, using a significantly larger vocabulary than in previous studies. While this rate is still well below the typical conversational rate of about 160 words per minute, the progress is remarkable. The researchers initially did not plan to use an avatar or audio in Johnson's case, but the promising results encouraged them to tackle more challenging aspects of communication. They developed an algorithm to decode brain activity into audio waveforms, effectively producing vocalized speech, and worked with a company specializing in facial animation to drive the avatar with data on muscle movements. Johnson then practiced making various facial expressions, which were captured and conveyed through the avatar.
Through this groundbreaking technology, Johnson was able to communicate phrases like "I think you are wonderful" and "What do you think of my artificial voice?" with her husband, and to engage in conversations on a range of topics. The rapid progress in this field has experts speculating that wireless versions of these communication systems could obtain federal approval within the next decade. Different approaches will likely be optimized for specific patients. For instance, another study published in Nature used electrodes implanted deeper in the brain to detect the activity of individual neurons, achieving a decoding rate of 62 words per minute for its participant, who had amyotrophic lateral sclerosis (ALS). Neither approach was completely accurate: roughly 25% of words were decoded incorrectly. Even so, both studies showed significant potential. Observers were generally able to interpret the facial expressions conveyed by the avatar, although interpreting the spoken words proved more challenging, and researchers are working to improve word-recognition accuracy with prediction algorithms. It is important to note that these systems are not reading people's minds or thoughts. They interpret neural signals to predict and generate speech, much as a baseball batter reads a pitcher's movements to anticipate a pitch. Researchers do acknowledge, however, that mind reading could become possible in the future, which would raise ethical and privacy concerns. Johnson's path to this research began when she contacted Dr. Edward Chang's team after reading an article about their work with Pancho. Despite living far from the lab in San Francisco, her persistence paid off, and she was able to join the study. Determination has always been a defining characteristic of her personal and professional life.
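Figures like "errors in roughly 25% of words" are conventionally reported as word error rate (WER): the number of substituted, inserted, and deleted words divided by the length of the reference sentence, computed with a word-level edit distance. The sketch below shows the standard metric, not the study's own evaluation code; the example sentences are adapted from the article's quoted phrase purely for illustration.

```python
# Word error rate (WER): the standard metric behind "X% of words decoded
# incorrectly". Computed via a word-level Levenshtein edit distance.

def word_error_rate(reference, hypothesis):
    """Return (substitutions + insertions + deletions) / reference length."""
    ref, hyp = reference.split(), hypothesis.split()
    # Dynamic-programming edit-distance table.
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,          # deletion
                          d[i][j - 1] + 1,          # insertion
                          d[i - 1][j - 1] + cost)   # substitution or match
    return d[len(ref)][len(hyp)] / len(ref)

# One wrong word out of eight gives a WER of 0.125, i.e. 12.5%.
print(word_error_rate("what do you think of my artificial voice",
                      "what do you think of my official voice"))
# 0.125
```

A 25% WER means roughly one word in four is wrong, which is why prediction algorithms that constrain the decoder toward likely word sequences are a major avenue for improvement.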
Even after her stroke, she pursued opportunities like taking counseling courses online to help trauma survivors. Since the stroke, her ability to communicate has been severely limited: she has relied on assistive systems such as glasses fitted with a reflective dot that lets her select letters and words on a computer screen. With this new technology, she has regained some expressive capability and feels a renewed sense of purpose. The research continues to push boundaries, offering hope that people like Johnson will one day communicate fully again. Experts are confident that wireless versions of these systems will become available within the next decade, though each patient's needs must be carefully considered and further research is needed to optimize the approaches for long-term use. The ultimate goal is to restore a sense of identity and enable individuals to engage in meaningful conversations once again.