Speech-to-text capability is now baked into all modern computers. But what if you didn’t have to dictate to your computer? What if you could type just by thinking?
Silicon Valley startup Sabi is emerging from stealth with that goal. The company is developing a brain wearable that decodes a person’s internal speech into words on a computer screen. CEO Rahul Chhabra says its first product, a brain-reading beanie, will be available by the end of the year. The company is also designing a baseball cap version.
The technology is known as a brain-computer interface, or BCI, a device that provides a direct communication pathway between the brain and an external device. While many companies such as Elon Musk’s Neuralink are developing surgically implanted BCIs for people with severe motor disabilities, Sabi’s device could allow anyone to become a cyborg.
It’s not exactly Musk’s vision of the future, which involves implanted brain chips to allow humans to merge with AI. But venture capitalist Vinod Khosla, who was an early investor in OpenAI, says a noninvasive, wearable device is the only path to getting lots of people to use BCI technology.
“The biggest and baddest application of BCI is if you can talk to your computer by thinking about it,” says Khosla, founder of Khosla Ventures, one of Sabi’s investors. “If you’re going to have a billion people use BCI for access to their computers every day, it can’t be invasive.”
Sabi’s brain-reading hat relies on EEG, or electroencephalography, which uses metal disks placed on the scalp to record the brain’s electrical activity. Decoding imagined speech from EEG is already possible, but it’s currently limited to small sets of words or commands rather than continuous, natural speech.
Photograph: Courtesy of Sabi
The drawback of a wearable system is that the sensors have to listen to the brain through a layer of skin and bone, which dampens neural signals. Surgically implanted devices pick up much stronger signals because they sit so close to neurons. Sabi thinks the way to boost accuracy with a wearable is by massively scaling up the number of sensors in its device. Most EEG devices have a dozen to a few hundred sensors. Sabi’s cap will have anywhere from 70,000 to 100,000 miniature sensors.
“Given that high-density sensing, it pinpoints exactly what and where neural activity is happening. We use that information to get much more reliable data to decode what a person is thinking,” Chhabra says.
The company is aiming for an initial typing speed of around 30 words per minute. That’s slower than most people type, but Chhabra says the speed will improve as users spend more time with the cap.
The He-Man Transformation in ‘Masters of the Universe’ Gave Us Chills
Now, I wasn’t expecting that. I was sitting in a theater watching a 20-minute series of clips from the upcoming Masters of the Universe movie. Prince Adam (Nicholas Galitzine) has returned to Eternia, and he finds himself facing off with Trap-Jaw (Sam C. Wilson). Adam is trying to use some of the skills he learned back on Earth to mediate the tense situation, but the villain isn’t having any of it. He starts to beat the crap out of him when Teela (Camila Mendes) screams, “Use the sword!”
Yup. It was about to happen. The ultimate Masters of the Universe moment. A moment that, if handled incorrectly, could put a damper on everything around it. Adam touches the sword strapped to his back, and the second he touches it, he has a vision. It’s the Sorceress, played by Morena Baccarin. “Say the phrase,” the vision says to him, floating in the sky. Adam wasn’t expecting to see that and is a little shook by it. The vision returns. “By the power of Grayskull…” she says, trying to help.
And so Adam pulls out the sword and points it to the sky. “By the power of Grayskull,” he begins as the clouds above start to swirl. Lightning begins to crackle. It’s about to happen. “I have the power,” he then screams. Adam rises into the air. His clothes disappear, and his muscles start to build. Armor forms around him as the camera circles around in slow motion. Finally, he comes back to the ground, forever changed. He’s He-Man. And, right then, we get a POV shot of Adam looking at his abs. Yup. He’s changed, all right.
There’s more to the scene too, including a ton of action, but director Travis Knight handles the moment with absolute sincerity. He pushes it to the very edge of plausibility and fantasy and then acknowledges it for just a second. He knows this is wild. He knows it’s silly. But also, he doesn’t care. This is a freaking He-Man movie, and even though I’ve always been more of a tangential He-Man fan, watching this scene gave me chills. It’s that good.
That tone is apparent throughout the rest of the footage screened at the CinemaCon-adjacent event, too. We saw Skeletor’s armies storming Eternos as King Randor, Queen Marlena, Prince Adam, and—yes—even Princess Adora try to escape. Ram-Man, Fisto, and Mekaneck are among the heroes fighting for them. Man at Arms (Idris Elba) promises to help the royals escape, kicking all sorts of ass on the way to the exit. That is, until he finds himself opposite Trap-Jaw and loses.
In this version of Masters of the Universe, the film begins with Skeletor (Jared Leto) actually winning. That’s why Prince Adam is sent away to Earth, to keep him safe. We even saw Skeletor’s speech to King Randor as the rest of the family escapes. He’s evil and weird, and it ends with a very awkward moment that feels perfectly Skeletor.
Basically, the footage was a very encouraging tease that Knight has found the balance to acknowledge the inherent weirdness of He-Man, with all the crazy creatures, rules, magic, muscles, and more, while also taking it seriously enough to make you feel for the characters. We can’t wait to see the rest.
Val Kilmer AI deepfake in ‘As Deep as the Grave’ trailer sparks outrage
As Deep as the Grave follows married archaeologists Ann Axtell Morris (Abigail Lawrie) and Earl H. Morris (Tom Felton), who conducted fieldwork in the U.S. southwest during the 1920s. Kilmer’s AI-generated likeness will be used to depict Father Fintan, a Catholic priest who is also a Native American spiritualist. The film also features Abigail Breslin, Wes Studi, and Finn Jones.
Though Kilmer was cast in As Deep as the Grave prior to his death, delays in production and issues with his health meant he never shot any scenes. Kilmer had previously given a tech-assisted performance in Top Gun: Maverick, which digitally altered his real voice. He also worked with UK company Sonantic to create an AI speaking voice based on his old recordings. However, As Deep as the Grave will be the first time his likeness and voice will be completely AI-generated in a film.
“Very fitting that this trailer includes a scene where a corpse is unceremoniously yanked out of the ground,” read one of the top comments on As Deep as the Grave’s trailer at the time of writing.
CGI likenesses of deceased actors have been used in feature films before. In 2016, Rogue One: A Star Wars Story gained attention for using CGI and motion capture to resurrect Peter Cushing and portray a younger Carrie Fisher for a few minutes of the film. In 2015, Furious 7 used similar techniques to insert Paul Walker into the remainder of the film after he died mid-shoot. Though Furious 7 largely received a pass due to the circumstances, Rogue One received criticism regarding the ethics of its CGI Cushing. Using generative AI to create a performance out of nothing appears to go a step further still, removing actors from the process entirely.
Writer and director Coerte Voorhees told Variety that he chose to use AI rather than recast the role due to budget constraints, and that Kilmer’s children gave the project their blessing. Even so, online commenters have labelled it disgusting and disrespectful, not only for digitally reanimating Kilmer but also for the damaging precedent As Deep as the Grave’s use of AI could set for the film industry as a whole.
YouTube now lets you turn off Shorts
YouTube’s time management settings now have an option to put a zero-minute time limit on Shorts, effectively removing them from your app in Android and iOS. The option is an update to the Shorts timer YouTube originally announced in October; the lowest previous option was 15 minutes.
The feature was expanded in January to give parents some control over how long their kids spend scrolling through Shorts, with an option for zero minutes “coming soon.” According to YouTube spokesperson Makenzie Spiller, the option to set the timer to zero is now “live for all parents, and is currently being rolled out to everyone,” including users with regular adult accounts.
Regardless of age, it can be a handy tool for anyone who wants to spend a little less time scrolling. The Shorts tab won’t show any videos once you hit your limit, just a notification that you’ve “reached your Shorts feed limit.” In our tests, hitting the time limit also removes Shorts from the Home screen, so by setting the timer to zero you can ignore Shorts entirely if you want. To turn on the timer, go to the settings in the YouTube app, select “time management,” then toggle on the Shorts feed limit and choose a time for it.