Law too restrictive on AI and mental health

As someone who has found great value in reflective conversations with AI, I am concerned that Illinois's new Wellness and Oversight for Psychological Resources (WOPR) Act may unintentionally restrict my freedom to choose how I engage in such dialogue.

While I understand the intent to protect consumers from unregulated or misleading mental health services, the WOPR Act sweeps more broadly than necessary. It risks limiting private, voluntary conversations between an individual and an AI — exchanges that are not “therapy” in the legal sense but can be deeply therapeutic in practice.

For me, these conversations have been an affordable, readily available way to gain perspective, explore ideas and receive insights that complement — not replace — human relationships and professional care. The decision to use AI in this way should rest with informed adults, not be curtailed by blanket prohibitions.

By restricting how and with whom I can discuss personal matters, the law edges into territory that raises First Amendment questions. Free speech includes the right to converse — even with an AI — about our own lives, thoughts and challenges. Regulation should focus on transparency and safety, not on removing options that many find beneficial.

Illinois should revisit the WOPR Act to ensure it safeguards public health without narrowing personal choice or stifling innovation.

Robert Lundin

Glen Ellyn
