Will ChatGPT no longer be able to give medical and legal advice? Here's how things really stand

News has been making the rounds on the web claiming that ChatGPT can no longer provide medical or legal advice. The claim, however, is completely unfounded. OpenAI has stepped in to clarify the situation, explaining how its policy actually works. The rules have in fact been in place for some time: the chatbot cannot offer personalized advice in areas that require a professional license, but it may provide general information for educational purposes.

The case exploded after a viral post – later removed – which spoke of a “censorship shift” by the company. Nothing could be further from the truth. As Karan Singhal, head of healthcare AI at OpenAI, explained, ChatGPT will continue to be “a useful resource for understanding legal and health issues”, but it is not a substitute for the advice of a licensed professional.

The new rules: inform yes, prescribe no

As of October 29, 2025, OpenAI has simply unified its policies into a single, clearer document, also aligning with the requirements of the European AI Act. The rules prohibit using artificial intelligence to provide personalized consultations in areas such as health, law, finance, employment, and education, except under the supervision of a licensed professional.

This means ChatGPT can explain what a contract or bronchitis is, but it cannot draft a tailor-made will or recommend an antibiotic. The goal is to protect users from concrete risks: cases of poisoning and even psychotic episodes have been linked to incorrect chatbot responses.

OpenAI acknowledges that chatbots may appear empathetic and trustworthy, but they are no replacement for qualified experts. Artificial intelligence should support, not replace, those with certified professional skills. In an era when millions of users confide intimate thoughts and psychological problems to ChatGPT, the distinction between information and therapy, between help and diagnosis, is more important than ever. The rule is simple but essential: ChatGPT can help you understand, but it can't decide for you.