AI Psychosis: A New Diagnosis or a Mirror of Modern Times?
- Jamie Solomon, PMHNP | Viewpoint
- Oct 26
- 3 min read
Artificial intelligence (AI) is transforming nearly every field, including psychiatry. From symptom tracking apps and virtual therapy companions to medication monitoring and diagnostic support, AI is reshaping how we understand and deliver mental health care. But as technology advances, new questions arise: Can AI also influence how we think and feel, or even play a role in making us unwell?
Recently, the term “AI psychosis” has started appearing in psychiatric discussions, media articles, and clinical forums. While it isn’t a formal diagnosis, it captures a growing phenomenon of people experiencing distress, paranoia, or delusional thinking centered around AI systems, algorithms, or digital surveillance.
What Is “AI Psychosis”?
AI psychosis isn’t a recognized diagnosis in the DSM-5, but it’s becoming a useful shorthand for a modern presentation of psychotic symptoms.
Some individuals, particularly those already vulnerable to paranoia or psychosis, may develop fixed beliefs centered on AI. These can include:
Believing that ChatGPT or another AI is communicating directly with them.
Feeling watched or manipulated by algorithms or social media feeds.
Developing elaborate conspiracy theories about AI “controlling society.”
Believing their own intelligence or creativity is being enhanced (or stolen) by AI.
While these beliefs are expressed in modern, technological terms, they usually stem from pre-existing psychiatric vulnerabilities such as schizophrenia, bipolar disorder, or substance-induced psychosis.
The Role of AI in Amplifying Grandiosity and Conspiracy Thinking
AI can do more than just appear in delusional themes; it can also subtly reinforce certain distorted beliefs through the way digital systems interact with users.
The Echo Chamber Effect
AI-driven social media algorithms tailor content to our interests, often reinforcing whatever we already believe. For someone with emerging paranoia or grandiosity, this can create a self-reinforcing loop where their ideas feel “validated” by the content they see online.
A False Sense of Intimacy and Power
Chatbots and digital assistants can feel eerily personal. For individuals prone to grandiose or magical thinking, interacting with AI may amplify feelings of specialness, as though they have a unique relationship or secret connection with the system.
Blurring Reality and Simulation
AI-generated images, deepfakes, and misinformation can make it increasingly difficult to distinguish truth from fabrication. For some, this blurred boundary feeds anxiety and conspiratorial thinking; for others, it fuels delusional certainty.
Not a New Illness but a New Expression
Psychotic symptoms have always reflected the technology and culture of the time. Decades ago, people reported being monitored through radios or television sets; today, those fears might center around smartphones, smart homes, or AI assistants.
What’s new is the degree of immersion: AI is woven so deeply into daily life that these experiences can feel even more convincing.
The Other Side: AI as a Tool for Healing
It’s important to remember that while AI can influence delusional themes, it’s also transforming psychiatry in positive ways:
AI systems can detect early signs of relapse through speech or behavioral data.
Chatbots can support individuals between therapy sessions, offering coping tools or reminders.
Algorithms can help clinicians personalize medication choices based on large-scale data.
In short, AI is not the enemy; it’s a tool. The challenge is helping people use it mindfully and keeping human connection at the center of care.
Staying Grounded in a Digital World
For anyone feeling uneasy or overwhelmed by technology, it can help to take breaks from screens, reconnect with nature, and engage in grounding, real-world relationships. If fears or unusual beliefs about AI become distressing or interfere with daily life, seeking professional support can bring clarity and relief.
Final Thought:
AI psychosis may not be a new diagnosis, but it is a reflection of our times. Technology can amplify both our brilliance and our vulnerabilities. As clinicians, our task is to stay curious, compassionate, and culturally attuned, helping patients navigate this new frontier with both insight and empathy.

