
Eliza (seemingly) implements a rudimentary conversational model of psychotherapy that facilitates transference.

https://en.wikipedia.org/wiki/Conversational_model

This model differs from goal-oriented, transactional conversational models (e.g. phone robots) insofar as the objective is not to direct behaviour but to facilitate self-expression, a task that can be performed with only sparse real-world logical-semantic modelling.

Eliza could always keep within the bounds of transference by deflecting back to the interlocutor, e.g.:

You said earlier that ____________

Are you worried about __________

Tell me why not?

Do you like talking about __________

Anything specific?

Let's talk about something more interesting.
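Deflections like these need no real understanding; a keyword match plus a pronoun-swapped echo of the user's own words is enough. A minimal sketch of that mechanism (all rule patterns and templates here are illustrative, not Weizenbaum's originals):

```python
import re

# First-person words swapped to second-person so the echo reads naturally.
REFLECTIONS = {"i": "you", "my": "your", "am": "are", "me": "you"}

# (pattern, template) pairs, tried in order; {0} is the reflected capture.
RULES = [
    (re.compile(r"i am (.*)", re.I), "Are you worried about being {0}?"),
    (re.compile(r"i (?:like|enjoy) (.*)", re.I), "Do you like talking about {0}?"),
    (re.compile(r".*\b(?:no|not)\b.*", re.I), "Tell me why not?"),
]
FALLBACK = "Let's talk about something more interesting."

def reflect(fragment):
    # Swap perspective word by word; unknown words pass through unchanged.
    return " ".join(REFLECTIONS.get(w.lower(), w) for w in fragment.split())

def respond(utterance):
    # Return the first matching deflection, else a generic redirect.
    for pattern, template in RULES:
        m = pattern.match(utterance.strip())
        if m:
            return template.format(*(reflect(g) for g in m.groups()))
    return FALLBACK
```

For example, `respond("I am lonely")` echoes back "Are you worried about being lonely?" without any semantic model of loneliness, which is exactly the paucity the next paragraph describes.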

- - -

Well, not always. Eliza was easily derailed into revealing its logical semantic paucity, thus breaking the spell of transference.

GPT-3 has reached a significantly higher plateau of semantic pseudo-competency, but what if it were applied purely to transference-style conversation? So long as it lets lonely people chat away without breaking the spell, that might be enough for GPT-3 to earn $3 a month.

Could there be a market for friendertainment?

https://phys.org/news/2018-12-americans-lonely.html



> Could there be a market for friendertainment?

Aren't there tons of English-speaking people in third-world countries capable of being worthy pen-pals and filling this market gap?



