Google Launches a New Medical App—Outside the US

The AI Database →

Application: Prediction, Personal services
Company: Alphabet, Google
End User: Consumer
Sector: Health care
Source Data: Images
Technology: Machine vision

Billions of times each year, people turn to Google’s web search box for help figuring out what’s wrong with their skin. Now, Google is preparing to launch an app that uses image recognition algorithms to provide more expert and personalized help. A brief demo at the company’s developer conference last month showed the service suggesting several possible skin conditions based on uploaded photos.

Machines have matched or outperformed expert dermatologists in studies in which algorithms and doctors scrutinize images from past patients. But there’s little evidence from clinical trials deploying such technology, and no AI image analysis tools are approved for dermatologists to use in the US, says Roxana Daneshjou, a Stanford dermatologist and researcher in machine learning and health. “Many don’t pan out in the real world setting,” she says.

Google’s new app isn’t clinically validated yet either, but the company’s AI prowess and its recent buildup of a health care division make the dermatology tool notable. Still, the skin service will start small, and far from Google’s home turf and largest market in the US. The service is not likely to analyze American skin blemishes anytime soon.

At the developer conference, Google’s chief health officer, Karen DeSalvo, said the company aims to launch what it calls a dermatology assist tool in the European Union as soon as the end of this year. A video of the app suggesting that a mark on someone’s arm could be a mole featured a caption saying it was an approved medical device in the EU. The same note added a caveat: “Not available in the US.”


The company’s America-not-first strategy highlights how it can be easier to win approval for medical apps in Europe than in the US. A Google spokesperson said the company would like to offer the service in the US but didn’t have a timeline on when it might cross the Atlantic; they declined to comment on whether Google has talked with the US Food and Drug Administration about the app but acknowledged that the agency’s approval process can take longer than Europe’s.

That flips the traditional Silicon Valley view of Europe as a red-tape-strewn landscape hostile to new ideas. Between 2012 and 2018, Facebook did not offer face-recognition suggestions in the EU after an audit by Ireland’s data regulator forced the company to deactivate the feature and delete its stockpile of European faceprints. Since 2014, Google has been required to allow EU citizens to request that old links about them be scrubbed from the company’s search engine under the “right to be forgotten.”

Google says its skin app has been “CE marked as a Class I medical device in the EU,” meaning it can be sold in the bloc and in other countries that recognize that standard. The company would have faced relatively few hurdles to secure that clearance, says Hugh Harvey, managing director at Hardian Health, a digital health consultancy in the UK. “You essentially fill in a form and self-certify,” he says. Google’s conference last month took place a week before tighter EU rules took effect; Harvey says those rules require many health apps, likely including Google’s, to show evidence of effectiveness, among other things. Preexisting apps have until 2025 to comply with the new rules.

Last month’s demo was brief, and the app’s design is not final, but US experts on AI health software say that Google could face a more involved process from the FDA if it brings its skin app home. A spokesperson for the FDA declined to comment on Google’s service but said software that claims to be used for “diagnosis, cure, prevention, or treatment of people” may be considered a medical device and require agency approval. To make that call, the spokesperson said the agency generally needs to “review the software’s intended use and the claims made for the product.” The spokesperson added that the agency has issued guidance encouraging collection of data from diverse populations.

The design shown in the demo requires a person to snap three photos of their blemish from different angles and distances. The user can optionally add information such as the body part affected and how long they’ve had the problem. Tapping “Submit” zips the images off to Google. The app then displays a list of “Suggested conditions,” each illustrated with images. Tapping on one brings up key information such as symptoms, contagiousness, and treatment options. Google says the app was trained on “hundreds of thousands of skin images” and can identify 288 conditions, including skin cancers, covering roughly 90 percent of common dermatology web searches.
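The flow described above maps onto a simple request-and-response pattern: a few photos plus optional context go in, and a ranked list of suggested conditions, not a single diagnosis, comes out. The sketch below is a hypothetical illustration of that shape in Python; every name, field, and score here is an assumption for illustration, not Google’s API or model.

```python
# Hypothetical sketch of the submit-photos -> suggested-conditions flow
# described above. All names, fields, and scores are illustrative; this is
# not Google's service.
from dataclasses import dataclass


@dataclass
class SkinQuery:
    photos: list[bytes]            # three photos of the blemish, different angles/distances
    body_part: str | None = None   # optional user-supplied context
    duration_weeks: int | None = None


@dataclass
class SuggestedCondition:
    name: str
    score: float                   # model confidence, not a diagnosis
    info: str                      # symptoms, contagiousness, treatment summary


def suggest_conditions(query: SkinQuery, top_k: int = 3) -> list[SuggestedCondition]:
    """Stand-in for the image-recognition step: a real service would score the
    photos against a classifier covering hundreds of conditions and return the
    highest-ranked matches rather than one definitive answer."""
    if len(query.photos) != 3:
        raise ValueError("expected three photos taken from different angles")
    # Dummy output in place of real model predictions.
    candidates = [
        SuggestedCondition("benign mole (nevus)", 0.61, "usually harmless; monitor for changes"),
        SuggestedCondition("seborrheic keratosis", 0.22, "common, noncancerous growth"),
        SuggestedCondition("melanoma", 0.08, "potentially dangerous; see a dermatologist"),
    ]
    return sorted(candidates, key=lambda c: c.score, reverse=True)[:top_k]


if __name__ == "__main__":
    query = SkinQuery(photos=[b"img1", b"img2", b"img3"], body_part="forearm", duration_weeks=6)
    for c in suggest_conditions(query):
        print(f"{c.name} ({c.score:.0%}): {c.info}")
```

Returning several ranked suggestions rather than one answer mirrors the search-engine framing Google’s spokesperson used, and it is part of what makes the regulatory classification debatable.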


The FDA exempts some health software it deems “lower risk”—such as “wellness” advice like diabetes management or information about health symptoms—from medical device approvals. It requires approval for others, such as apps that offer specific diagnoses or that function as medical devices, like a stethoscope. The line between apps that need clearance and those that do not is hard to pinpoint because medical software and the rules governing it are relatively new.

Bradley Thompson, a regulatory lawyer with Epstein Becker Green, asks clients a handful of key questions when trying to determine whether they’ll need FDA sign-off. They include how the software’s output is presented to a person and whether a company makes specific medical claims.

Google’s app does not highlight a single possible skin condition in response to a person’s photos, and it displays a warning that “suggested conditions listed here aren’t a medical diagnosis.” A company spokesperson likened the app to a search engine displaying results for a person to peruse and draw their own conclusions about.

Yet Google has also emphasized the skin app’s medical chops. DeSalvo, the health chief, said Google developed the app because there aren’t enough skin specialists to help every person with skin conditions. Google’s blog post links the app to peer-reviewed studies in which the company’s technology was compared with doctors, saying “our AI system can achieve accuracy that is on par with US board-certified dermatologists.”

That boast caught Thompson’s lawyerly eye. “That really is suggesting this is at least comparable to what a human physician can do,” he says—the type of claim that might interest the FDA.

Daneshjou, the Stanford dermatologist and researcher, also thinks Google’s app could appear to consumers and regulators to be offering medical expertise, not just search results. She says the app might be considered a “high-risk” device, requiring FDA approval, since some skin conditions, such as melanoma, can be dangerous.

Daneshjou contributed to a recent study raising concerns about how thoroughly the FDA vets AI health software, and she says it may be too early to throw open AI dermatology tools to consumers. “If a patient believes this algorithm is working as well as a board-certified dermatologist, they may have more confidence in it,” she says. That could lead people to seek unnecessary biopsies or treatment, or to skip a crucial visit to a doctor.

Google should also disclose more about how it has tested its technology on different skin tones, Daneshjou says. The company’s AI dermatology studies so far have involved relatively few people with darker skin.

Google says those publications did not represent its latest data or image recognition models, which have been improved. The spokesperson said the dermatology app’s design and disclaimers were informed by user experience studies; additional studies are underway, and the company will also research how people use the service after it is made available in Europe.

Google has faced practical challenges when deploying other promising AI health software outside the lab. In 2018 the company began testing a system capable of detecting eye disease in people with diabetes at clinics in Thailand. In 2020 the company published a study of the rollout reporting that the system rejected more than 20 percent of patient images because of problems like variable lighting, and that it placed practical constraints on nurses.
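To make concrete what that kind of rejection can look like in practice, here is a minimal sketch of a brightness gate that flags photos taken under poor lighting. The thresholds and method are assumptions for illustration, not details from Google’s Thailand system.

```python
# Minimal illustration of an image-quality gate that rejects photos taken in
# poor lighting, similar in spirit to the screening described above.
# Thresholds are arbitrary assumptions, not values from Google's system.
from PIL import Image, ImageStat


def acceptable_lighting(path: str, low: float = 40.0, high: float = 215.0) -> bool:
    """Return False when the photo's mean brightness falls outside a usable range (0-255)."""
    gray = Image.open(path).convert("L")            # grayscale copy
    mean_brightness = ImageStat.Stat(gray).mean[0]  # average pixel value
    return low <= mean_brightness <= high


# Usage: photos that fail the check would be sent back to be retaken,
# e.g. acceptable_lighting("patient_photo.jpg")
```

A gate like this protects the downstream model, but, as the Thailand study showed, it also shifts work onto the people taking the photos.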

Updated, 6-23-21, 11:30am ET: This article has been updated with a comment from the FDA.
