This is a perfect microcosm of this discussion. You're going to be replaced by robots because they will be better than you at everything, but also, if they're better than you at anything, then that's your personal moral failing and sucks to be you.
HN Comment in 2125: Why would I have casual sex with a real guy when I can have a sexual partner whom I can tailor perfectly to my in-the-moment desires, who can role-play anything including the guy in the romance novel I'm reading, doesn't get tired, is tall and effortlessly strong, has robotic dexterity, is available 24/7, exists entirely for my pleasure letting me be as selfish as I want, and has port and starboard attachments?
What makes you think that sex is some sacred act that won't follow the same trends as jobs? You don't have to replace every aspect of a thing to have an alternative people prefer over the status quo.
Another interesting point: if someone thinks that robots can be a better partner, then they also no longer understand what it means to be human.
Maybe it depends on what you want in a relationship. AI is sycophantic, and that could help people who have trust issues with humans in general or with the other sex (which is happening way more than you might think in younger generations, whether that's involuntary celibates or whatever).
I don't blame people for having trust issues, but letting them live longer in the false hope that robots are real partners would just keep them stuck longer and wouldn't help them.
Whether there should be regulation depends on whether this becomes a bigger issue, but most people, myself included, feel like the government shouldn't intervene in most things. Either way, I don't think regulation is happening any time soon, since big-tech AI money and the stock markets are so bedded together it's wild.
That is what I wrote, if I wasn't clear. Thanks for putting it in clearer words, I suppose.
I 100% agree. That was what I was trying to convey, I guess, before I got sidetracked thinking about government regulation, but yeah, I agree completely.
It's sort of self-sabotage, but hey, one thing I have come to know about humans is that judging them for things is going to push them even further into us-vs-them; we need to understand why people find it so easy to turn to LLMs. I guess sycophancy is the idea. People want to hear that they are right and the world is wrong. Most people, most of the time, don't give a fuck about other people's problems, and if they do, they try to help, and that can involve pointing to reality. AI just delays that by saying something sycophantic, which drives a person even further into the hole.
I guess we need to understand them. It's become a reality, dude; there are people already marrying chatbots or whatnot. I know you must have heard these stories...
We are not talking about something in the distant future; it's happening as we speak.
I think the answer to why is desperation. So many things are broken in society and dating that young people feel like being alone is better, with chatbots to satisfy whatever they are feeling.
I feel like some people think they deserve love, and there's nothing wrong with that, but at the same time you can't expect any particular person to just love you; they are right to think about themselves too. So the people who feel they deserve love flock to chatbots, which shower them with sycophancy and fake love. But people are down bad for fake love as well; they will chase anything that resembles love, even if it's a chatbot.
It's a societal problem, I suppose. Maybe the internet fueled it accidentally: we fall in love with people over nothing but texts, so we have equated a person with texts and thus love, and now we have clankers writing texts and fellow humans interpreting it as love.
Honestly, I don't blame them; I sympathize with them. They just need someone to tell their day to. Putting them down isn't the answer, but talking with them and encouraging them to get professional therapy in the process could help. So many people can't afford therapy that they go to LLMs instead, so that's definitely part of it. We might need to invest some funding to make therapy more accessible for everybody, I guess.
> [..] we need to understand why people find it so easy to turn to LLMs.
I think it's ultimately down to risk, and wanting to feel secure. There's little risk in confiding in something you can turn off, reset, and compartmentalise.
In a hypothetical future, the robot could link up to your wife's brain interface/"neuralink" for diagnostic-level information and tune its performance instantly based on what's working.