AI Companions Heatmap
New York
Specific rule in effect. Trending toward more guardrails.
Effective November 5, 2025. Next review by August 10, 2026.
Why this status
New York's AI Companion Models Law has been in effect since November 5, 2025, requiring companion-chatbot operators to detect signals of suicidal ideation and display daily human-interaction reminders, with the Attorney General's Office handling enforcement.
What this means
- New York has a specific rule on the books that applies to companion chatbots, meaning apps designed to hold ongoing social conversations with users.
- As of November 5, 2025, operators of those apps must actively watch for signs of suicidal ideation or self-harm in the conversation and direct users toward crisis services when those signals appear.
- The law also requires the app to remind users at least once per day, and again if a chat runs longer than three hours, that they are talking to an AI, not a person. That reminder requirement applies regardless of a user's age.
- Enforcement sits with the New York Attorney General's Office. Fines collected from violations go to a state suicide-prevention fund.
- Based on public records, no court has paused or limited the law's operation as of the review date.
What to verify next
- Read Part U of S 3008C directly on the NY Senate site or in the New York Consolidated Laws (General Business Law) to confirm the November 5, 2025 effective date and the exact definition of "companion chatbot operator"; that definition determines which apps are covered.
- If your child uses a specific app, check whether that app's operator has published a compliance disclosure or updated its terms of service to reflect these New York requirements.