> She was worried about a friend who was travelling in a far-away city, with little timezone overlap when they could chat, but she could talk to ChatGPT anytime about what the city was like and what tourist activities her friend might be doing, which helped her feel connected. She liked the memory feature too, saying it was like talking to a person who was living there.
This seems rather sad. Is this really what AI is for?
And we do not need gigawatts and gigawatts for this use case anyway. A small local model or batched inference of a small model should do just fine.
A commonly cited use case of LLMs is scheduling travel, so being able to pretend it’s somebody somewhere else is surely important for incentivizing going somewhere!
Same. I have a lot of ideas I like to explore that people find boring or tedious. I used to just read, but it's pleasant to have the option to play with those thoughts more.
Asking ChatGPT about the safety of someone traveling, instead of asking that person, is a nerdy thing to do. Somehow a hairstylist doesn't evoke the image of a nerd for me. That is why I find this story implausible.
When I say nerdy and/or obscure, I mean things like "are quantum fluctuations ergodic, and how does this affect the probability of a quantum fluctuation triggering a new big bang?"
Call me weird, I know absolutely nothing about what you just wrote but I'm super intrigued now and would love to know more, ah. But that's just because I'm generally curious about pretty much everything.
But I can see why it can be hard to find people to talk with about that. Heck, it might be hard to find people who even know about that topic in general.
It's sad that we aren't all rich enough to have a personal assistant to tend to us 24/7? I mean, it seems more useful than, say, cruise ships, but they get to exist.
Things don't "get to exist"; that implies there's a person or people who have decided not to use some power of theirs to make all cruise ships disappear.
The GP quote also wasn't about a personal assistant use case; it was about filling a hole in personal connection. It's sad because, more and more, we're having fewer human connections and more digital, aka fake, ones.
It’s super dope, and you can have it talk to people for you in the local language when you go there. I’ve busted it out to explain for me what I’m thinking. Watching travel shows on TV or reading travel magazines is sadder.