Apple's favourite newspaper the New York Times said that the man took pictures of his son's infected groin so that he could send them to the doctor. The doctor prescribed some antibiotics and that should have been the end of the story.
The problem was that the pictures and email were sent to the Google cloud, which is regularly patrolled by Google's AI looking for naked pictures of kiddies shared by paedophiles.
Two days later, the bloke's Gmail and other Google accounts, including Google Fi, which provides his phone service, were disabled over “harmful content” that was “a severe violation of the company’s policies and might be illegal.” Google's AI had not stopped there: it had flagged another video he had on his phone, and the San Francisco police department opened an investigation into him.
Human coppers soon cleared the whole thing up, but Google said it had full faith in its AI and continued to block the bloke's accounts.
A Google spokesperson, Christa Muldoon, said that the company follows US law in defining what constitutes CSAM and uses a combination of hash-matching technology and artificial intelligence to identify it and remove it from its platforms.
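The hash-matching half of that system is worth unpacking, because it is the part that cannot misfire in this way. A minimal sketch, assuming a simple cryptographic hash set (real systems use proprietary perceptual hashes so that near-duplicates still match; the hash values and function names here are purely illustrative, not Google's actual code):

```python
import hashlib

# Hypothetical database of hashes of known illegal images.
# Illustrative only: the entry below is just sha256(b"foo").
KNOWN_BAD_HASHES = {
    "2c26b46b68ffc68ff99b453c1d30413413422d706483bfa0f98a5e886266e7ae",
}

def flag_image(image_bytes: bytes) -> bool:
    """Return True if the image's hash matches a known-bad hash."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in KNOWN_BAD_HASHES

print(flag_image(b"foo"))  # matches the hash above, so True
print(flag_image(b"bar"))  # no match, so False
```

The point is that hash matching can only catch images already in the database. A brand-new photo, like the father's, cannot match anything, so it can only be flagged by the AI classifier, which is exactly the error-prone half of the pipeline.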
Muldoon added that Google staffers who review such cases were trained by medical experts to look for rashes or other issues. However, they were not medical experts themselves, and medical experts were not consulted when reviewing each case, she said.
So basically, the AI's decision was checked by someone who was not a medical expert and who assumed that the picture was porn. One would have thought that being cleared by a police investigation might have moved Google to reinstate the account.
To be fair, Google's AI-based searches are not as silly as those being spat out by Facebook's AI censors. Facebook has a policy of blocking the content and then, when you appeal, saying it does not have enough humans on its staff due to COVID to hear the appeal. One wonders how either company can risk losing customers and reputations (not to mention the occasional defamation case) by putting such faith in technology which is clearly not cooked yet.