Tuesday, November 4, 2025

OpenAI limits ChatGPT from giving medical, legal and financial advice to reduce liability risks

OpenAI has announced new restrictions on how users can interact with ChatGPT, redefining the AI system as an “educational tool” rather than a “consultant.” The company has stopped the chatbot from offering specific medical, legal, or financial advice, citing growing concerns over liability and regulatory compliance.

According to a recent report, ChatGPT will no longer provide professional-level consultations or advice that requires certification. This covers areas such as medical diagnosis and prescriptions, drafting legal cases, financial planning, and other decisions involving personal or professional risk.

The update does not mean that ChatGPT will be silent on these topics altogether. It will still be able to explain general concepts like how tax brackets work, what a legal will means, or what typical treatments exist for certain conditions. However, it will not name medications, suggest dosages, provide lawsuit templates, or give investment recommendations.

The new policy also restricts the use of ChatGPT for facial recognition or personal identification without consent and prohibits activities that may lead to academic misconduct. OpenAI said these changes are meant to “enhance user safety and prevent potential harm” that could arise from relying on AI for expert-level guidance.

Users have noticed that attempts to get around these limits with hypothetical or indirect phrasing are now blocked by safety filters. The changes come amid growing debate about people turning to AI for professional advice, especially in areas like healthcare and finance.

OpenAI reminded users that, unlike licensed professionals, conversations with ChatGPT are not protected by doctor–patient or attorney–client privilege. This means chat records could potentially be accessed in legal proceedings.

Recently, OpenAI has also introduced new safety measures to support users dealing with mental health issues such as self-harm, mania, or emotional distress. However, the company emphasized that ChatGPT is not a replacement for professional help and cannot detect emergencies or real-time threats.

Experts advise that users should never share sensitive or confidential information such as medical reports, financial data, or private contracts with the chatbot, as privacy and storage cannot be guaranteed.

