A recent article in the paper you are now reading, “Chatty robot helps seniors fight loneliness through AI companionship” (PR, page A6, 12/27/2023), triggered a melancholy recollection of a relatively recent visit to an elderly acquaintance (elderly only in the sense that she was, and still is, older than me). She was proudly showing off her newest acquisition: cooing and snuggling with a highly realistic, fuzzy little cat doll that mewed and meowed when you petted it. However, that first somewhat sorrowful reaction to an adult seduced into childlike behavior over what was no more than a souped-up toy doll --- a child’s toy --- quickly mellowed into a feeling of how cool and lifelike the robot acted and reacted.
And the more I thought about it, the more I settled into an attitude of, “Why not?” Who am I to be offended by another’s harmless pleasure (an attitude I find offensive when practiced by others)? Also, how is this enjoyable activity different from interacting with a real live pet (which, by the way, doesn’t have to be fed and cleaned up after)? And the final push off my high horse was to realize that this situation seems not so different from enjoying a good sci-fi novel like the Seven Moons of Mali or watching a hot game on the Sports Channel. As Tevye in “Fiddler on the Roof” famously replied, “I’ll tell you --- I don’t know.” But I will tell you what I do know (within reasonable bounds of certainty): technology, any technology, is born of our curiosity and need for control, in order to make our lives more pleasurable and enjoyable.
All the same, it is useful to consider the admonition that “Enjoyment and Pleasure are terms often used interchangeably, but they are not the same thing. Pleasure happens to you; enjoyment is something that you create through your own effort. Pleasure is the lightheadedness you get from a bit of grain alcohol; enjoyment is the satisfaction of a good wine, properly understood. Pleasure is addictive and animal; enjoyment is elective and human.” (Arthur Brooks, The Atlantic, March 24, 2022)
So using a “chatty robot,” or chatbot, as a companion for the potentially lonely came as not much of a surprise, and it reminded me of the initial reactions to Joseph Weizenbaum’s computer program ELIZA, one of the earliest (mid-1960s), if not the first, attempts of AI research to simulate and stimulate man-machine conversation. All of this is described brilliantly and lucidly in Weizenbaum’s book, “Computer Power and Human Reason --- From Judgment to Calculation,” so I will have no reservations about quoting directly from this work; what I quote is limited to the Introduction. For those who are a bit more technically adventuresome, there are chapters on “Where the Power of the Computer Comes From” and “How Computers Work,” as well as “Artificial Intelligence.”
Early on, Weizenbaum reveals, “I chose the name ELIZA because, like the Eliza of Pygmalion fame, it could be taught to ‘speak’ increasingly well... The script is a set of rules rather like those that might be given to an actor who is to use them to improvise around a certain theme. Thus ELIZA could be given a certain script to enable it to maintain a conversation about cooking eggs or about managing a bank checking account, and so on. Each specific script thus enabled ELIZA to play a specific conversational role. For my first experiment, I gave ELIZA a script designed to permit it to play (I should really say parody) the role of a Rogerian psychotherapist engaged in an initial interview with a patient. The Rogerian psychotherapist is relatively easy to imitate because much of his (sic) technique consists of drawing out his patient by reflecting his patient’s statements back to him.”
The most famous ELIZA transcript was published soon after, in an article Weizenbaum wrote for the January 1966 issue of Communications of the ACM. It is a record of a conversation between the program and a young woman; the program’s responses in the text below are in all caps.
Men are all alike.
IN WHAT WAY
They’re always bugging us about something or other.
CAN YOU THINK OF A SPECIFIC EXAMPLE
Well, my boyfriend made me come here.
YOUR BOYFRIEND MADE YOU COME HERE
He says I’m depressed much of the time.
I AM SORRY TO HEAR YOU ARE DEPRESSED
It’s true. I am unhappy.
DO YOU THINK COMING HERE WILL HELP YOU NOT TO BE UNHAPPY
I need some help, that much seems certain.
WHAT WOULD IT MEAN TO YOU IF YOU GOT SOME HELP
Perhaps I could learn to get along with my mother.
TELL ME MORE ABOUT YOUR FAMILY
There is a bit more dialogue, but you get the idea...
You can try it out for yourself at this site: https://psych.fullerton.edu/mbirnbaum/psych101/eliza.htm
If, as I did, you test it out by starting with the same prompt as in the example above, “Men are all alike,” don’t be surprised to find the conversation taking a different path, as some randomness has been injected into the script at this website.
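For the more technically adventuresome, the “script” mechanism Weizenbaum describes --- pattern rules plus pronoun reflection --- can be sketched in a few lines of Python. The rules and reflection table below are my own illustration of the technique, not Weizenbaum’s original script:

```python
import re

# A tiny ELIZA-style "script": each rule pairs a regex pattern with a
# response template; {0} is filled with the captured fragment of the input.
SCRIPT = [
    (r".*\bI am (.*)", "I AM SORRY TO HEAR YOU ARE {0}"),
    (r".*\bmy (mother|father|family)\b.*", "TELL ME MORE ABOUT YOUR FAMILY"),
    (r".*\balike\b.*", "IN WHAT WAY"),
    (r".*\balways\b.*", "CAN YOU THINK OF A SPECIFIC EXAMPLE"),
]

# Pronoun "reflection" turns the patient's phrasing back on them,
# e.g. "my boyfriend" becomes "your boyfriend".
REFLECTIONS = {"i": "you", "me": "you", "my": "your", "am": "are"}

def reflect(fragment: str) -> str:
    words = fragment.lower().split()
    return " ".join(REFLECTIONS.get(w, w) for w in words)

def respond(line: str) -> str:
    # Try each rule in order; fall back to a neutral prompt, as ELIZA did.
    for pattern, template in SCRIPT:
        m = re.match(pattern, line, re.IGNORECASE)
        if m:
            return template.format(*(reflect(g) for g in m.groups()))
    return "PLEASE GO ON"

print(respond("Men are all alike."))   # IN WHAT WAY
print(respond("He says I am depressed much of the time."))
```

Each “script” in this sense is just the rule table; swap in rules about cooking eggs or checking accounts and the same engine plays a different conversational role, which is exactly the point Weizenbaum makes above.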
Weizenbaum goes on to note, “I was startled to see how quickly and how very deeply people conversing with the computer became emotionally involved and how unequivocally they anthropomorphized it. Once, my secretary, who had watched me work on the program for many months and therefore surely knew it to be a computer program, started conversing with it. After only a few interchanges with it, she asked me to leave the room!”
He was also amazed that a number of practicing psychiatrists seriously believed the program could grow into a nearly completely automatic form of psychotherapy. He quoted from a paper by Colby et al.: “Further work must be done before the program will be ready for clinical use. If the method proves beneficial, then it would provide a therapeutic tool which can be made widely available to mental hospitals and psychiatric centers suffering a shortage of therapists. Because of the time-sharing capabilities of modern and future computers, several hundred patients an hour could be handled by a computer system designed for this purpose. The human therapist, involved in the design and operation of this system, would not be replaced, but would become a much more efficient man (sic!) since his efforts would no longer be limited to the one-to-one patient-therapist ratio as now exists.”
And in a footnote, Weizenbaum adds, “Nor is Dr. Colby alone in his enthusiasm for computer-administered psychotherapy. Dr. Carl Sagan, the astrophysicist, recently commented on ELIZA: ‘No such program is adequate for psychiatric use today, but the same can be remarked about some human psychotherapists. In a period when more and more people in our society seem to be in need of psychiatric counseling, and when time-sharing of computers is widespread, I can imagine the development of a network of computer psychotherapeutic terminals, something like arrays of large telephone booths, in which, for a few dollars a session, we would be able to talk with an attentive, tested, and largely non-directive psychotherapist.’”
In response to Colby and Sagan’s interpretations of ELIZA, Weizenbaum decries, “I had thought it essential, as a prerequisite to the very possibility that one person might help another learn to cope with his emotional problems, that the helper himself participate in the other’s experience of those problems. There are undoubtedly many techniques to facilitate the therapist’s imaginative projection into the patient's inner life. But that it was possible for even one practicing psychiatrist to advocate that this crucial component of the therapeutic process be entirely supplanted by pure technique --- that I had not imagined! What must a psychiatrist think he is doing while treating a patient, that he can view the simplest mechanical parody of a single interviewing technique as having captured anything of the essence of a human encounter?”
Whew... not too difficult to ascertain Weizenbaum’s underlying attitude toward the “science” of psychotherapy! Unfortunately, we still observe this phenomenon today when a person proposes absurd solutions to problems far outside their area of expertise --- for example, a politician proposing a solution to a problem in medical science; drinking Clorox to cure Covid comes to mind.
He contends there are important differences between men and machines as thinkers: “I would argue that, however intelligent machines may be made to be, there are some acts of thought that ought to be attempted only by humans... I believe there are limits to what computers ought to be put to do. One socially significant question I thus intend to raise is over the proper place of computers in the social order... We can count, but we are rapidly forgetting how to say what is worth counting and why.”
And with that last sentiment, it being the very purpose of this column to explore the effects of technology on society and vice versa, I couldn’t agree more.