At the end of the WWDC 2025 keynote address on Monday, Apple literally serenaded its app developers, as singer-songwriter Allen White humorously turned customers’ positive App Store reviews into song lyrics for a tune titled “6 out of 5 stars.”
“Best app I’ve ever set my sorry eyes upon,” he crooned. “This is not an app. It’s a piece of art.”
What Apple didn’t mention, however, was anything related to the tumultuous past few years for its developer community, or more broadly, why its developers should continue to put their trust in a company that’s fighting them for every nickel and dime while failing them in other ways.
In recent years, the Cupertino-based tech giant has put its app developer community through the wringer as it stringently fought against regulation, lawsuits, and any other efforts to rein in its alleged App Store monopoly by nations, lawmakers, and the courts.
Meanwhile, it has yet to deliver on some of the key technology advances that could modernize developers’ apps for the AI era.
In March, it delayed its “more personalized,” AI-powered Siri, demonstrated at last year’s WWDC. Apple only briefly acknowledged this fumble during this year’s keynote address, when SVP of Software Engineering Craig Federighi said the work “needed more time” to meet Apple’s high bar.
At WWDC, Apple’s scattered AI mentions centered on features that Google’s AI already offers, such as AI-powered translation, though Apple tried to differentiate itself by offering lyrics translation in Apple Music. Apple’s AI-powered Visual Intelligence feature was even demonstrated by tapping into Google’s app for image search results, something that feels more native on Android devices with innovations like Circle to Search, Lens’s multisearch, visual search in videos, and more.
Elsewhere, Apple appeased developers’ demand for AI with further integrations with OpenAI technology, like the addition of ChatGPT in Apple’s Image Playground app and for coding assistance in Xcode. But no deals with other AI providers were announced, despite rumors that Google Gemini integration was on the way and that Apple was teaming up with Anthropic on an AI-powered coding assistance tool.
Apple also made its scripting and automations app Shortcuts easier to use with the addition of AI features, but this ended up feeling more like a stop-gap to tide over power users until an AI Siri could take actions in their apps for them.
Then there was the deafening silence over the increasingly controversial App Store commissions.
In the U.S., for instance, Apple is fresh off a key loss in its battle with Fortnite maker Epic Games, one that now forces it to let U.S. developers point to alternative payment mechanisms on the web, where Apple can’t take a commission.
Yet Apple didn’t spend time during its hour-and-a-half-long keynote to talk about how its App Store is the best place to build an app business, improvements it’s made to payment processing systems, or how it’s weeding out scams. (It touted some of its developer benefits in the days leading up to WWDC, where it focused on its anti-fraud features and developer revenues.)
Apple also launched a standalone Games app, but the keynote address focused on the consumer benefits — Challenges, social features, easy access to Apple’s own gaming store Arcade — not on what it could do for mobile developers.
Nor, as some had hoped, did Apple announce a reduction in its App Store commissions across the board for all developers, finally putting the question to rest as to whether Apple’s in-app payments system is worth its price.
While more developer-focused improvements will arguably roll out this week at WWDC, through the Platforms State of the Union keynote and various sessions, Apple missed a chance to surprise its developer community with some acknowledgment that it understands the past few years have been tough, but that it is ultimately on developers’ side.
Instead, the only reference so far to the changing market dynamics of the App Store ecosystem was a small update to the App Review Guidelines, which swapped out the wording “alternative app marketplace” for “alternative distribution” — a subtle reminder that, in Apple’s view, the only app “marketplace” that should exist is its own App Store.
As for boosting developers’ businesses, Apple seemed to be thinking of itself and its own coffers first. In the initial developer beta of iOS 26, the App Store opens up to its Search page by default, meaning that Apple is pushing developers to spend more on its App Store Search ads for discovery.
Other changes point to Apple seeing developers as just another lever to be pulled to make the company more money, while it focuses on delighting consumers with new bells and whistles, like its interface design overhaul dubbed Liquid Glass.
Though Liquid Glass is clearly inspired by the Vision Pro headset, Apple didn’t offer developers an explanation as to why they should make over their perfectly functional apps to meet the new design guidelines. The company could have at least hinted that Liquid Glass looks like an obvious precursor to an operating system that will eventually extend beyond smartphones and tablets to new computing platforms, like AR glasses.
But Apple’s cultural preference for secrecy, despite years of comprehensive leaks, largely from Bloomberg’s Mark Gurman, kept it from suggesting that Liquid Glass is anything more than a fresh look to wow consumers.
In wrapping the event by singing cheery App Store reviews (a system now beset by bots and fake reviews, as developers know), Apple tried to lighten the mood. It knew that this year’s event would let down its developer community with its AI delays, amid an aggressive pursuit of developer revenue.
The end result made the song feel like a performative act of developer appreciation — yes, an actual performance! — rather than a true reflection of how valuable developers are to Apple’s ability to ship more iPhones and make consumers happy.
Reggie Fils-Aimé says Amazon once asked Nintendo to break the law
“Literally, we stopped selling to Amazon, and it’s because I wasn’t going to do something illegal. I wasn’t going to do something that would put at risk the relationship we have with other retailers. But it also set the stage to say, look, you’re not going to push me around. This is the way we do business. And so that’s how, over time, you build respect.”
In Harvard study, AI offered more accurate emergency room diagnoses than two human doctors
A new study examines how large language models perform in a variety of medical contexts, including real emergency room cases — where at least one model seemed to be more accurate than human doctors.
The study was published this week in Science and comes from a research team led by physicians and computer scientists at Harvard Medical School and Beth Israel Deaconess Medical Center. The researchers said they conducted a variety of experiments to measure how OpenAI’s models compared to human physicians.
In one experiment, researchers focused on 76 patients who came into the Beth Israel emergency room, comparing the diagnoses offered by two internal medicine attending physicians to those generated by OpenAI’s o1 and 4o models. These diagnoses were assessed by two other attending physicians, who did not know which ones came from humans and which came from AI.
“At each diagnostic touchpoint, o1 either performed nominally better than or on par with the two attending physicians and 4o,” the study said, adding that the differences “were especially pronounced at the first diagnostic touchpoint (initial ER triage), where there is the least information available about the patient and the most urgency to make the correct decision.”
In Harvard Medical School’s press release about the study, the researchers emphasized that they did not “pre-process the data at all” — the AI models were presented with the same information that was available in the electronic medical records at the time of each diagnosis.
With that information, the o1 model managed to offer “the exact or very close diagnosis” in 67% of triage cases, compared to one physician who had the exact or close diagnosis 55% of the time, and to the other who hit the mark 50% of the time.
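For readers curious about what sits behind percentages like these, here is a minimal, purely illustrative Python sketch (not the study’s actual code; the class, field names, and toy data are invented) of how blinded “exact or very close” judgments get tallied into per-diagnoser accuracy rates:

```python
from dataclasses import dataclass

@dataclass
class TriageCase:
    o1_hit: int           # 1 if blinded reviewers judged o1's diagnosis exact or very close, else 0
    physician_a_hit: int  # same judgment for the first attending physician
    physician_b_hit: int  # same judgment for the second attending physician

def exact_or_close_rate(cases, field):
    """Share of cases where the given diagnoser was judged exact or very close."""
    return sum(getattr(c, field) for c in cases) / len(cases)

# Toy stand-in data; the real study scored 76 triage cases this way.
cases = [TriageCase(1, 1, 0), TriageCase(1, 0, 1), TriageCase(0, 1, 0), TriageCase(1, 1, 1)]

for label, field in [("o1", "o1_hit"),
                     ("physician A", "physician_a_hit"),
                     ("physician B", "physician_b_hit")]:
    print(f"{label}: {exact_or_close_rate(cases, field):.0%} exact-or-close at triage")
```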
“We tested the AI model against virtually every benchmark, and it eclipsed both prior models and our physician baselines,” said Arjun Manrai, who heads an AI lab at Harvard Medical School and is one of the study’s lead authors, in the press release.
To be clear, the study didn’t claim that AI is ready to make real life-or-death decisions in the emergency room. Instead, it said the findings show an “urgent need for prospective trials to evaluate these technologies in real-world patient care settings.”
The researchers also noted that they only studied how models performed when provided with text-based information, and that “existing studies suggest that current foundation models are more limited in reasoning over nontext inputs.”
Adam Rodman, a Beth Israel doctor who’s also one of the study’s lead authors, warned the Guardian that there’s “no formal framework right now for accountability” around AI diagnoses, and that patients still “want humans to guide them through life or death decisions [and] to guide them through challenging treatment decisions.”
In a post about the study, Kristen Panthagani, an emergency physician, called it “an interesting AI study that has led to some very overhyped headlines,” especially since it compared AI diagnoses to those from internal medicine physicians, not ER physicians.
“If we’re going to compare AI tools to physicians’ clinical ability, we should start by comparing to physicians who actually practice that specialty,” Panthagani said. “I would not be surprised if an LLM could beat a dermatologist at a neurosurgery board exam, [but] that’s not a particularly helpful thing to know.”
She also argued, “As an ER doctor seeing a patient for a first time, my primary goal is not to guess your ultimate diagnosis. My primary goal is to determine if you have a condition that could kill you.”
This post and headline have been updated to reflect the fact that the diagnoses in the study came from internal medicine attending physicians, and to include commentary from Kristen Panthagani.
Xiaomi 17 Review: I Took It to Thailand for a Real Camera Test
Xiaomi phones are a little tough to judge. After all, these guys do everything, from making phones to laptops and sometimes even record-breaking electric SUVs. The Xiaomi 17 is a bit like the quiet kid that never gets noticed, simply because its bigger brother, the 17 Ultra, is on a streak of collecting all the best smartphone camera awards. But here’s the thing: most people won’t ever splurge that much money on a non-Samsung or Apple Ultra flagship. The main sales driver will always be the base model, and that raises the question I had in mind: can the Xiaomi 17 go head-to-head with the OPPO Find X9 and the vivo X300, especially since it’s more expensive than both? (You can thank AI for that.)
To answer that question, I got the Xiaomi 17 for review and took it with me on a work trip to Phuket, Thailand. There, I used the phone to capture about 500 photos in the summer heat, with temperatures soaring to 40°C, and relied on constant GPS navigation to put the Snapdragon 8 Elite Gen 5 SoC through its paces. Spoiler alert: I really do love this phone, but there are a few quirks, too. Here’s why.
Xiaomi 17 Review
Hisan Kidwai
Summary
The Xiaomi 17 brings a lot to the table: best-in-class performance that’s miles ahead of the competition, a design that’s understated yet premium, battery life that can easily stretch to two full days, and cameras that, instead of being same same but different, lend a distinct character to every photo. Of course, it’s not perfect; I’d like the camera bugs fixed and the ultrawide improved, but overall, the Xiaomi 17 gets my recommendation.
Design & Hardware
If Apple ever made an Android phone, it would probably look something like the Xiaomi 17. I wouldn’t describe the build as flashy, but it’s elegant and reminiscent of past Xiaomi flagships. I talked about this in my X300 Pro review: building a brand identity that can stand up to Samsung and Apple is crucial, and Xiaomi has listened. While I was using it daily, many friends and family asked which Xiaomi phone I was using — note the wording, “Xiaomi phone,” meaning they recognized the brand, and that matters. The company says it thought about every curve, and I’ll say it straight: the 17 is the best-feeling compact phone I’ve held this year.
The corners are crafted to perfection, the width is spot on, and the way the aluminum frame blends into the glass without an abrupt edge makes the phone genuinely enjoyable to carry. The back glass is frosted so the phone doesn’t slide off glass surfaces, and the side frame doesn’t shed its color inside a case.
Speaking of color, you get plenty of options, but my favorite is definitely the blue variant, as it has that breezy summer vibe. The buttons are tactile and positioned where your hand would naturally rest.
Moving to the camera module, Xiaomi has taken the iPhone route of individual, stove-top-style camera cutouts. There are four of them (one houses the flash), and aside from the fact that dust is hard to clean out from between them, I quite like the look. The ultrasonic fingerprint scanner sits right where your thumb naturally rests; I used it on the beach with wet hands, and it worked perfectly fine. The phone is also IP69-rated for dust and water resistance, meaning it should technically survive a swim. Did I dare take it into the sea? Absolutely not, because the IP rating covers only fresh water, and seawater can cause irreversible damage.
Display
I’ve said before that all flagship displays are essentially the same, and that holds true for the Xiaomi 17, too. The phone features a 6.3-inch, 1220 x 2656 OLED panel with an adaptive 120Hz refresh rate. This time, Xiaomi has trimmed the bezels even further for a more premium look, and I’m a fan. The panel is exceptionally color-accurate and vibrant for content consumption, as evidenced by my three-hour run of The Pitt season 2 on the flight to Thailand. The HDR performance is excellent, too.
Xiaomi claims a peak brightness of 3,500 nits. I don’t have a light meter to put that claim to the test, but from my experience using the panel under the midday sun at Phi Phi Island, it’s plenty bright for outdoor use. Text stayed legible, and I could use the phone for GPS navigation without squinting.
When it comes to durability, I usually prefer to rely on user reports rather than test it myself. However, I accidentally dropped the Xiaomi 17, without a case, from a tripod at chest height onto a concrete floor. While the phone was in the air, all sorts of scary thoughts ran through my head, including how much the repair was going to cost. Thankfully, the phone escaped with only minor damage to the frame, a surprisingly good result.
Performance & Software
Performance is what makes or breaks the smartphone experience, and it’s no surprise that the Xiaomi 17 delivers top-of-the-line performance. The Snapdragon 8 Elite Gen 5 is the best Android processor on the market, and it’s paired with 12GB of LPDDR5X RAM and up to 512GB of UFS 4.1 storage. The result? The Xiaomi 17 is an absolute joy to use. It flies through the UI, and there’s ample headroom for virtually any task. That said, the phone runs HyperOS 3, which, for the uninitiated, is a heavily customized version of Android that resembles iOS more than stock Android.
I don’t have a problem with the look, especially since HyperOS is one of the smoothest Android skins, with silky animations and a lot of customization. My issue is that, unlike other Chinese skins that allow you to tone down the iOS-ness, Xiaomi doesn’t.
For example, the notification shade is split into two sections, quick controls and notifications. I don’t like that, and when I went digging in the settings for a way to merge them, there wasn’t one. The back gesture also stays active over the keyboard, so when I tried deleting long stretches of text, it would often send me back a screen instead.
There are a few touches I wish others would copy from HyperOS, chief among them the lock screen customizations: there are plenty of options, and every one of them looks gorgeous. As this is 2026, there’s also a host of AI features, such as object eraser, image upscaling, and inpainting. I tried them all, and they work exactly as you’d expect. The company also promises about six years of major software updates and security patches, which beats vivo’s five.
Benchmarks & Gaming
As this is a review, I also ran a series of benchmarks to test the Snapdragon 8 Elite Gen 5’s limits. The phone scored 3,415 in Geekbench’s single-core test and 10,008 in the multi-core test. These are insane numbers, especially when compared with the likes of the vivo X300 Pro and the Find X9, which score about 20%-30% lower in multi-core tests. The story remained similar on AnTuTu, where the Xiaomi 17 handsomely beat its Chinese rivals, scoring 3,423,349.
As expected, this performance translates extremely well in gaming. I’m a former PUBG (BGMI) eSports player, and my results were exceptional. The phone maintained 120 FPS gameplay even at high settings without a hint of stutter. I also like Xiaomi’s thermal management, which kept the phone from overheating during both gaming and photo capture in Thailand’s hot summer.
Battery Life & Charging
After all the chatter about the small form factor, you might expect the Xiaomi 17 to compromise on battery life, as comparable Apple and Samsung phones do. You couldn’t be more wrong: the Xiaomi 17 packs a bigger battery than even the 17 Ultra, 6,330mAh to be precise, and the results are fantastic. On the morning of my Thailand flight, I unplugged the phone at 5 am and used it for the rest of the day, including three hours of The Pitt on the flight and map navigation after landing at Phuket airport. By 3 am the next morning, I still had 20% remaining. For a more typical user, you’re looking at two days of use without a hitch.
When it was finally time to charge, the bundled 100W fast charger, which Xiaomi, unlike Samsung, includes in the box, took the phone from 20% to 80% in just 30 minutes. You also get 50W of reverse wireless charging, though that requires a specific charger.
Cameras
If a phone doesn’t fold in half or sport dual screens, the only way for it to stand out is through its cameras. They’re the main reason people gravitate toward a particular brand, and lately both OPPO and vivo have been killing it. I think there’s room for a third contender, though: the Xiaomi 17. Like its rivals, it houses a triple-sensor array, led by a 50MP LightFusion 950 main sensor, a 50MP JN1 60mm telephoto, and a 50MP OV50M ultrawide. Color tuning is handled by Leica, and that’s the Xiaomi 17’s main strength: the photos it takes, with the various Leica filters, have a character you won’t find anywhere else. Every phone takes similar photos these days, and it’s these color profiles that matter most.
Still, if you’re not a fan of poking around with the cameras, the default Leica Authentic profile produces colors that are very close to natural, with highlights and shadows handled extremely well. The details are crisp and plenty, the HDR performance is mostly spot on, and the contrast is slightly on the boosted side, which is what I like. Beyond the default camera profile, there are a myriad of filters, such as Negative, Positive, Sepia, Natural, Vibrant, and Blue. Each has a different style of capturing the colors and subject, and I really did find myself going through each and every one of them to decide which actually serves the scene the best. And the results speak for themselves. Every photo tells a different story, and that’s the Xiaomi 17’s biggest strength.
The 2.5x telephoto deserves similar praise. It serves as the main portrait camera, and its images deliver stellar detail, with excellent subject separation and improved, natural skin tones free of the infamous beautification. Xiaomi doesn’t lean heavily on AI processing, though, so zooming past 5x-6x results in blurry photos; keep that in mind. The ultrawide hasn’t changed from the previous generation, so it still lacks autofocus for macro photography. It works great in ample light, but I saw a significant drop in quality at night.
Speaking of night, both the main and telephoto sensors benefit from Xiaomi’s mature image processing, which retains shadow detail without making the image muddy or introducing noise. Videos, which can be shot at up to 8K, carry similar detail in all lighting conditions, and I’m a fan. Sadly, it’s not all perfect: in Thailand’s heat, some of the videos I captured were choppy, even indoors at the hotel. The problem carried over to India, where the first few seconds of every video would stutter. I’ve flagged the issue to the Xiaomi team, so a fix could be imminent. Overall, though, I love the Xiaomi 17’s cameras.
Verdict
Sure, the ₹89,999 price tag might feel like a bit much, considering it’s higher than the vivo and OPPO competition. But the Xiaomi 17 brings a lot to the table: best-in-class performance that’s miles ahead of the competition, a design that’s understated yet premium, battery life that can easily stretch to two full days, and cameras that, instead of being same same but different, lend a distinct character to every photo. Of course, it’s not perfect; I’d like the camera bugs fixed and the ultrawide improved, but overall, the Xiaomi 17 gets my recommendation.