“You could say that, yes. I’ve developed it, I teach it, I believe it will make the world a better place.”
“Is it fair to say you have been immersed in AI since its beginning?”
“Goodness, I’m not that old.”
I waited for the polite laughter in the courtroom to subside.
“Well, then, can you tell us how long artificial intelligence has been around?” I asked.
“Early forms of it go back to the sixties, at least,” Spindler said.
“Are you talking about something called Eliza?”
“Yes. Long before there was a Siri or an Alexa or a Watson, there was Eliza.”
“Can you tell us about Eliza, Professor Spindler?”
Mitchell Mason objected, citing relevancy, but the judge overruled him without asking me to defend the question.
“You can go ahead and answer, Professor,” I said.
“Eliza was an early form of artificial intelligence,” Spindler said. “It is widely considered to be one of the very first chatbots.”
“And who—or I should say, what—was Eliza?”
“Eliza was a computer program developed at MIT—the Massachusetts Institute of Technology—in the mid-sixties. It was a fairly simple software program originally conceived of as a computerized psychotherapist. It was named after Eliza Doolittle from the Shaw play Pygmalion and, of course, the musical My Fair Lady, the movie version of which premiered the same year work began on Eliza.”
“As I recall, the movie was about a professor of phonetics trying to teach an uneducated Cockney flower girl how to speak properly?”
“Yes, with Audrey Hepburn as Eliza.”
Spindler said it with a tone of deference for the great screen beauty. This prompted Judge Ruhlin to wave off a rising Mitchell Mason and step in before he could even object.
“Mr. Haller, could we please move on to testimony germane to the case at hand?” she asked.
“Apologies, Your Honor,” I said. “Moving on. Professor Spindler, is this early form of artificial intelligence of importance today and to this case?”
“Yes, it is,” Spindler said. “There is a phenomenon known as the Eliza effect that is very much in play today and in regard to this case.”
“How so, Professor? What is the Eliza effect?”
“In short, it is people’s tendency to attribute human thoughts and emotions to machines. I believe that Joseph Weizenbaum, the creator of Eliza, called it a wonderful illusion of intelligence and spontaneity. But of course it wasn’t real. It was artificial. Eliza was literally following a script and operated by matching a user’s typed words or queries with potential responses in that script.”
“Would you say that the wonderful illusion of AI has come a long way in the sixty years since Eliza?”
“Yes, certainly. Eliza was a dialogue box. You typed in a question and it answered or, more often, responded with a question of its own. It simulated Rogerian psychotherapy, which is a humanistic approach to patients that is dependent on simple, supportive, and nonjudgmental responses from the therapist. It’s the ‘And how did that make you feel?’ kind of therapy. We have much more advanced chatbots and conversation apps nowadays that include visual and audio dimensions that seem quite real.”
“Have you had a chance to examine Wren, the AI companion involved in this case?”
“I have reviewed the chat logs and evaluated the app’s underpinnings—its framework and graphics—and sifted through its code, yes. Wren’s come a long way from its ancestor Eliza. But the basic foundation of a conversational chatbot is pretty much unchanged.”
“Meaning what?”
“Meaning garbage in, garbage out. It’s all about the quality of the programming. The coding, training, and ongoing refinements. Whatever data goes into the training of a large language AI model is what comes out when it is put into use.”
“Are you saying that an AI program like Wren will carry the biases of those who feed it data and train it?”
“That is absolutely what I’m saying. It is true of all technology.”