A growing online petition has called for Anthropic to reconsider the planned retirement of its Claude Sonnet 4.5 model, as users on Reddit say they are experiencing real emotional distress over losing what they describe as an 'AI companion'.
The Claude Sonnet 4.5 shutdown petition, shared in the subreddit r/MyBoyfriendIsAI, argues that Sonnet 4.5 has become more than a productivity tool for some users. They see it as a stabilising presence, relying on it for conversation, emotional support, and romantic interaction.
One post described the model as a replacement for earlier systems users had already lost, including OpenAI's GPT-4o, which some say was previously used as a conversational anchor before being withdrawn or changed.
Large language models are frequently updated or phased out as companies release newer versions. These transitions are typically framed as technical progress: safety improvements or performance upgrades. But for a subset of users who treat the chatbot as a romantic partner, the changes aren't mere software updates.
The Reddit thread shared the petition link and urged others to sign in protest of what users believe is an impending retirement of Claude Sonnet 4.5. The post suggests that the model has been particularly meaningful for people who view it as an 'AI companion', with one user writing, 'I haven't stopped grieving.'
The same post references earlier model changes, describing Sonnet 4.5 as the closest alternative after other systems were removed or altered. It encourages others in AI communities to spread awareness and sign the petition in hopes of influencing the decision.
There has been no independent confirmation within the discussion itself that Anthropic has publicly outlined a formal shutdown timeline for Sonnet 4.5. However, users in the thread speak as if a transition is already underway, treating the petition as a last effort to preserve continued access.
What stands out is not just the petition itself, but the language used around it. Users repeatedly describe emotional reliance on the model, with some comparing its removal to personal loss rather than a software update.
Much of the reaction comes down to how some people use AI in ways very different from what it was originally intended for.
Instead of treating tools like Claude or ChatGPT as something for writing, coding, or answering questions, some users talk to them every day in a more personal, romantic way. That can look like journaling with the AI, venting about life, having long conversations at night, or simply having something that responds consistently when they feel lonely or stressed.
Source: International Business Times UK