
This Beanie Is Designed to Read Your Thoughts

Speech-to-text capability is now baked into all modern computers. But what if you didn’t have to dictate to your computer? What if you could type just by thinking?

Silicon Valley startup Sabi is emerging from stealth with that goal. The company is developing a brain wearable that decodes a person’s internal speech into words on a computer screen. CEO Rahul Chhabra says its first product, a brain-reading beanie, will be available by the end of the year. The company is also designing a baseball cap version.

The technology is known as a brain-computer interface, or BCI, a device that provides a direct communication pathway between the brain and an external device. While many companies such as Elon Musk’s Neuralink are developing surgically implanted BCIs for people with severe motor disabilities, Sabi’s device could allow anyone to become a cyborg.

It’s not exactly Musk’s vision of the future, which involves implanted brain chips to allow humans to merge with AI. But venture capitalist Vinod Khosla, who was an early investor in OpenAI, says a noninvasive, wearable device is the only path to getting lots of people to use BCI technology.

“The biggest and baddest application of BCI is if you can talk to your computer by thinking about it,” says Khosla, founder of Khosla Ventures, one of Sabi’s investors. “If you’re going to have a billion people use BCI for access to their computers every day, it can’t be invasive.”

Sabi’s brain-reading hat relies on EEG, or electroencephalography, which uses metal disks placed on the scalp to record the brain’s electrical activity. Decoding imagined speech from EEG is already possible, but it’s currently limited to small sets of words or commands rather than continuous, natural speech.
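For a sense of what that small-vocabulary decoding looks like in practice, here is a minimal, hypothetical sketch of a common baseline from the BCI literature: log band-power features from each electrode feeding a linear classifier over a handful of command words. The data is synthetic and every name in it is illustrative; this is not Sabi's pipeline.

```python
# Minimal sketch: decoding a 4-word imagined-speech vocabulary from EEG.
# All data here is synthetic and the pipeline is a generic baseline
# (log band-power features + a linear classifier), not Sabi's method.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
N_TRIALS, N_CHANNELS, N_SAMPLES = 200, 32, 256  # 1 s windows at 256 Hz
WORDS = ["yes", "no", "stop", "go"]             # tiny command vocabulary

# Synthetic trials: imagining each word boosts a different channel's power.
labels = rng.integers(0, len(WORDS), N_TRIALS)
eeg = rng.normal(0.0, 1.0, (N_TRIALS, N_CHANNELS, N_SAMPLES))
for i, word_idx in enumerate(labels):
    eeg[i, word_idx] += np.sin(np.linspace(0, 20 * np.pi, N_SAMPLES))

# Feature per channel: log power (variance) of the windowed signal.
features = np.log(eeg.var(axis=2))

# A linear classifier is the standard baseline for small EEG vocabularies.
accuracy = cross_val_score(LinearDiscriminantAnalysis(),
                           features, labels, cv=5).mean()
print(f"decoding accuracy over {len(WORDS)} words: {accuracy:.0%}")
```

Scaling a classifier like this from four commands to continuous, natural speech is the unsolved part, and it is the leap Sabi is claiming to make.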

A very small chip shown on the pad of a finger, illustrating its tiny scale.

Photograph: Courtesy of Sabi

The drawback of a wearable system is that the sensors have to listen to the brain through a layer of skin and bone, which dampens neural signals. Surgically implanted devices pick up much stronger signals because they sit so close to neurons. Sabi thinks the way to boost accuracy with a wearable is by massively scaling up the number of sensors in its device. Most EEG devices have a dozen to a few hundred sensors. Sabi’s cap will have anywhere from 70,000 to 100,000 miniature sensors.
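The physics intuition behind that sensor count can be shown with a toy model: if many sensors record the same underlying signal plus independent noise, averaging them shrinks the noise amplitude by roughly the square root of the sensor count. The sketch below is an idealization assumed for illustration; real scalp noise is correlated across electrodes and each one sees a different mixture of sources, so actual gains are smaller.

```python
# Toy model: why tens of thousands of sensors could help. Averaging n
# noisy recordings of the same signal cuts noise amplitude by ~sqrt(n),
# so noise power falls by ~n. Idealized on purpose: real EEG noise is
# correlated across the scalp, which limits the benefit.
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 250)
signal = np.sin(2 * np.pi * 10 * t)  # a 10 Hz "neural" rhythm

def snr_after_averaging(n_sensors: int) -> float:
    """SNR (power ratio) of the mean of n noisy copies of the signal."""
    avg = np.zeros_like(signal)
    for _ in range(n_sensors):       # accumulate to keep memory flat
        avg += signal + rng.normal(0.0, 3.0, t.size)
    avg /= n_sensors
    noise = avg - signal
    return signal.var() / noise.var()

for n in (16, 256, 70_000):          # typical EEG counts vs. Sabi's claim
    print(f"{n:>6} sensors -> SNR ~ {snr_after_averaging(n):.1f}")
```

In this idealized model the signal-to-noise ratio grows linearly with the sensor count; correlated noise in real recordings caps the benefit well below that, which is why the spatial pinpointing Chhabra describes matters as much as raw averaging.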

“Given that high-density sensing, it pinpoints exactly what and where neural activity is happening. We use that information to get much more reliable data to decode what a person is thinking,” Chhabra says.

The company is aiming for an initial typing speed of 30 or so words per minute. That’s slower than most people type, but Chhabra says the speed will improve as users spend more time with the cap.


Val Kilmer AI deepfake in ‘As Deep as the Grave’ trailer sparks outrage

Val Kilmer AI deepfake film As Deep as the Grave has just released its first trailer. The internet has responded with overwhelming disgust.

A widely recognised actor known for his roles in films such as Top Gun, Batman Forever, and Kiss Kiss Bang Bang, Kilmer died from pneumonia last April at 65 years old. Upcoming film As Deep as the Grave has now used generative AI to create a digital puppet in Kilmer’s likeness, having it portray a character appearing in “a significant part” of the historical film.

As Deep as the Grave follows married archaeologists Ann Axtell Morris (Abigail Lawrie) and Earl H. Morris (Tom Felton), who conducted fieldwork in the U.S. southwest during the 1920s. Kilmer’s AI-generated likeness will be used to depict Father Fintan, a Catholic priest who is also a Native American spiritualist. The film also features Abigail Breslin, Wes Studi, and Finn Jones.

Though Kilmer was cast in As Deep as the Grave prior to his death, delays in production and issues with his health meant he never shot any scenes. Kilmer had previously given a tech-assisted performance in Top Gun: Maverick, which digitally altered his real voice. He also worked with UK company Sonantic to create an AI speaking voice based on his old recordings. However, As Deep as the Grave will be the first time his likeness and voice will be completely AI-generated in a film.

“Very fitting that this trailer includes a scene where a corpse is unceremoniously yanked out of the ground,” read one of the top comments on As Deep as the Grave‘s trailer at time of writing.

CGI likenesses of deceased actors have been used in feature films before. In 2016, Rogue One: A Star Wars Story gained attention for using CGI and motion capture to resurrect Peter Cushing and portray a younger Carrie Fisher for a few minutes of the film. In 2015, Furious 7 used similar techniques to insert Paul Walker into the remainder of the film after he died mid-shoot. Though Furious 7 largely received a pass due to the circumstances, Rogue One drew criticism over the ethics of its CGI Cushing. Using generative AI to create a performance out of nothing goes a step further still, removing actors from the process entirely.

Writer and director Coerte Voorhees told Variety that he chose to use AI rather than recast the role due to budget constraints, and that Kilmer’s children gave the project their blessing. Even so, online commenters have labelled it disgusting and disrespectful, not only for digitally reanimating Kilmer but also for the damaging precedent As Deep as the Grave‘s use of AI could set for the film industry as a whole.



YouTube now lets you turn off Shorts

YouTube’s time management settings now have an option to put a zero-minute time limit on Shorts, effectively removing them from the app on Android and iOS. The option is an update to the Shorts timer YouTube originally announced in October; the lowest previous option was 15 minutes.

The feature was expanded in January to give parents some control over how long their kids spend scrolling through Shorts, with an option for zero minutes “coming soon.” According to YouTube spokesperson Makenzie Spiller, the option to set the timer to zero is now “live for all parents, and is currently being rolled out to everyone,” including users with regular adult accounts.

Regardless of age, it can be a handy tool for anyone who wants to spend a little less time scrolling. The Shorts tab won’t show any videos once you hit your limit, just a notification that you’ve “reached your Shorts feed limit.” In our tests, hitting the time limit also removes Shorts from the Home screen, so by setting the timer to zero you can ignore Shorts entirely if you want. To turn on the timer, open the YouTube app’s settings, select “time management,” then toggle on the Shorts feed limit and choose a duration.
