First they moved away from this after 4o, because it led to more sycophancy, AI psychosis, and ultimately deaths by suicide[1].
Then growth slowed[2], so now they're rushing this out the door even though it's likely not healthy for users.
Just like social media, these platforms have a growth dial that is directly linked to a mental health dial, because addiction is good for business. Yes, people should take personal responsibility for this kind of thing, but when these tools are addictive and poorly understood, it starts to look like a tragedy of the commons.
1 - https://www.theguardian.com/technology/2025/nov/07/chatgpt-l...
2 - https://futurism.com/artificial-intelligence/chatgpt-peaked-...