Microsoft Agent, Microsoft Bob, and Clippy were all based on a tragic misinterpretation of the theories of Clifford Nass and Byron Reeves, and as much as those Microsoft products were mismanaged, mocked, maligned, and abused, their theories and work were actually quite interesting and still relevant, though tragically misunderstood. I saw Nass give a fascinating talk about his work at Ted Selker's "New Paradigms for Using Computers" workshop at IBM Almaden Labs in 1996.
>The theory behind this software came from work on social interfaces by Clifford Nass and Byron Reeves at Stanford's Center for the Study of Language and Information.
>Clifford Ivar Nass (April 3, 1958 – November 2, 2013) was a professor of communication at Stanford University, co-creator of The Media Equation theory, and a renowned authority on human-computer interaction (HCI). He was also known for his work on individual differences associated with media multitasking. Nass was the Thomas M. Storke Professor at Stanford and held courtesy appointments in Computer Science, Education, Law, and Sociology. He was also affiliated with the programs in Symbolic Systems and Science, Technology, and Society.
g4tv.com-video4080: Why People Yell at Their Computer Monitors and Hate Microsoft's Clippy
>Alan Cooper (the "Father of Visual Basic") said: "Clippy was based on a really tragic misunderstanding of a truly profound bit of scientific research. At Stanford University, Clifford Nass and Byron Reeves, two brilliant scientists, had done some pioneering work proving conclusively that human beings react to computers with the same set of emotional reactions that they use to react to other human beings. [...] The work of Nass and Reeves proved that when people talk to computers, when they hit the keyboard and move the mouse, the part of their brain that's being activated is the part that has that emotional reaction to people dealing with people. Here's where the great mistake was made. That's really good research up to that point. But then the great mistake was made, which was: well if people react to computers as though they're people, we have to put the faces of people on computers. Which in my opinion is exactly the incorrect reaction. If people are going to react to computers as though they're humans, the one thing you don't have to do is anthropomorphize them, because they're already using that part of the brain. Clippy was a program based on the research that Nass and Reeves did, and it was a tragic misinterpretation of their work."
Social science research influences computer product design
>STANFORD -- A new home computer product to be introduced with fanfare at the Consumer Electronics Show in Las Vegas Saturday, Jan. 7, is based on research on human-computer interaction conducted at Stanford's Center for the Study of Language and Information.
>"Microsoft's new Bob home computer program is an example of how formerly arcane knowledge about human behavior has become as relevant as computer science to the communication technology marketplace," said John Perry, director of CSLI. The 12-year-old Stanford center does research in the related fields of information, computing and cognition.
>"The interface between humans and computers is where the action in computers is now, and so research on how people think and behave is becoming hot stuff," Perry said.
>Two social scientists, Clifford Nass and Byron Reeves, professors in the Communication Department, provided their theories and research results to Microsoft Corp.'s "social interface" program designers. The program's first product, called Bob, is to be introduced Saturday, Jan. 7 by Microsoft chairman and CEO Bill Gates at the Consumer Electronics Show in Las Vegas. Reeves and Nass are currently serving as consultants to Microsoft.
>Their research can be applied, however, to other forms of information technology, including voicemail and interactive television.
>"Nass' and Reeves' work considers to what extent people react to technology as if it were more real than it is," Perry said. "They have found that to a very considerable extent people treat their computers and other computer-driven technology in the same ways that they treat people - as if the computer possessed reason, feelings, etc. People also treat pictures on screens as real objects, rather than as representations of real objects. This is relevant to anyone who wants to design technology or content that is as effective as it can be," Perry said.
>This work can also be controversial, Nass said. For example, some women have complained about findings, in his and Reeves' experiments with computer voices, that people are prone to gender stereotyping in voice-based technologies. "Female voices are perceived as less effective evaluators and more nurturing than are male-voiced systems. Female-voiced computers are perceived as better teachers of love and relationships and worse teachers of technical subjects than are male-voiced teaching systems," the two reported in CSLI's annual research report.
>"We are not supporting gender stereotyping but we are identifying something that people designing products should be sensitive about," Nass said. "It's an important finding also, because it says that you can't blame women for gender stereotyping because of the way they dress and behave. Here is a black box that doesn't dress or behave differently than men, and it still gets gender stereotyped." [...]
Computers as Social Actors. Clifford Nass. Professor. Stanford University. "New Paradigms for Using Computers" workshop, IBM Almaden Labs, 1996.
>An individual's interaction with technologies is fundamentally, emphasis on fundamentally, social and natural, and in a minute I'll define what I mean by social and natural. I can return to it in questions. The second point you'll see is that these responses are automatic and unconscious. Simply put, all of you in the audience will deny that you would do what you'll see people like you, that is, experienced computer users, do up here. The reason you'll deny it is because these are responses that you're not consciously aware of and that you can't control. So what do I mean by fundamentally social? What I mean is: go to the social science section of the library. The argument of this talk is that the people who know the most by far about human-computer interaction are social scientists. Unfortunately, none of them know that. Little did they know that they had been spending all their time writing deeply about human-computer interaction and just were not aware of it. [...]
>Phil Agre: This may be as big a question as Ken's, Cliff I found your presentation ethically troubling all the way down, I want to ....
>Clifford Nass: It's not my fault (laughs in the background)
>Phil Agre: No, I think it is. At least, it's my concern. Let me just try a scenario on you. In the literature you are talking about, there is a great deal of research on the conditions under which people are more likely to obey instructions. What do you think about embedding those principles in user interfaces? Are you comfortable with that?
>Clifford Nass: Okay, I think I can give you a really short answer. It is critically important and socially valuable to know all the terrible ways that people can be manipulated. That is critically important. That is not to say, nor have I advocated at all in this talk, that we necessarily should use those methods. The discovery that people can be manipulated is one of the most important social findings of the 20th century, and I'm delighted we know that. I'm also delighted that we know we should avoid it; that's good too. There is no ethical component to the discovery that these things exist; there is an ethical component in using them, and I am not advocating which ones you use and which ones you don't. That's for the individual ..
>Ted Selker: Except, except when you are in your consulting role.
>Clifford Nass: Well but even there, I'll give you a really short anecdote: male characters are trusted more than female characters.
>Ted Selker: So the character in use, for example: if I am designing a user interface, I really want to focus and work on tasks and be oriented. Now if I've got this little guy over here, that's disorienting me. I'm sorry guys, that's not really helping with my task. Now when is it appropriate to have an avatar helping me in a task? That probably has to do with when the task is generally social, plus I'm sure we can learn more about that.
>Clifford Nass: No, it's the same thing as sometimes when I want to know what the meaning of a word is, I look in the dictionary. Sometimes I go to the guy next door, not for reasons of speed but because I feel like being social with the guy next door. Even though I may be working on a task, I may just feel like it. Similarly, social things should be there, social manifestations should be there, when you feel like it. With that said, one lesson from Bob is that the characters there were way over the top. They spent their life saying "look at me, I am a character, look at me, I am a character." We don't like that in people, and we certainly don't like it in software either. So social presences that are available when we want them and not when we don't are the people we like the best, and those are the people we should model.
I think Bonzi Buddy was built on Microsoft Agent, so it would make sense.