Elon Musk has encouraged the public to upload their medical information to his AI platform, Grok, claiming the tool could give a 'second opinion' on health problems. The suggestion has stirred concern, especially since Grok has a history of generating offensive and unsafe content.

Former intelligence officer Travis Akers warned that sharing sensitive health data with AI is very risky. Experts say users could lose control of private information. They also stress that AI is not a replacement for doctors. Privacy and safety should come before convenience when it comes to healthcare.

Grok has been criticised for generating harmful content, including antisemitic statements and inappropriate images, some depicting underage people. Despite this, Musk has praised the AI publicly. He recently suggested on X that users could 'take a picture of your medical data or upload the file to get a second opinion from Grok.'


Some users say they have received accurate diagnoses for rare conditions. But experts caution that the risks are far greater than the benefits.

According to LADbible, Bradley Malin, a professor of biomedical informatics at Vanderbilt University, said: 'Posting personal information to Grok is more like, "Wheee! Let's throw this data out there, and hope the company is going to do what I want them to do."'

It is also worth noting that AI companies are not bound by the same privacy rules as hospitals and doctors.

Travis Akers, a former intelligence officer, issued a sharp warning. He wrote on X: 'Nobody, and I repeat, absolutely nobody should ever upload their medical information into an AI platform. I am telling you this as a former intelligence officer.'


Akers explained that sensitive data could be stolen or shared without consent. Unlike doctors, AI companies are not held to strict legal obligations over patient data. Files could be stored indefinitely or fall into the wrong hands, leaving users exposed to having their most private health details seen by strangers.

Source: International Business Times UK