September 24, 2022


In the mid-1960s, a researcher at the Massachusetts Institute of Technology, Joseph Weizenbaum, created an automated psychotherapist he named Eliza. This chatbot was simple. When you typed a thought onto a computer screen, it asked you to expand on that thought, or it simply repeated your words back in the form of a question.

A conversation could seem strikingly lifelike, even though the program merely echoed its user. Here is an exchange Dr. Weizenbaum chose for the academic paper he published on the technology, with Eliza responding in all caps:

All men are the same.

HOW?

We are always being made fun of for something or other.

CAN YOU THINK OF A SPECIFIC EXAMPLE?

Well, my friend made me come here.

YOUR FRIEND MADE YOU COME HERE

But much to Dr. Weizenbaum’s surprise, people treated Eliza as if she were human. They freely shared their personal problems and took comfort in her responses.

“I knew from long experience that the strong emotional bonds many programmers form with their computers are often developed after brief experiences with machines,” he later wrote. “What I had not realized is that extremely brief exposures to a relatively simple computer program could induce powerful delusional thinking in quite normal people.”

We humans are prone to this. When dogs, cats, and other animals exhibit even small amounts of humanlike behavior, we tend to assume they are more like us than they really are. The same happens when we see hints of human behavior in a machine.

Scientists now call it the Eliza effect.

Modern technology triggers it, too. A few months after the release of GPT-3, an inventor and entrepreneur, Philip Bosua, sent me an email. The subject line was: “god is a machine.”

“There is no doubt in my mind that GPT-3 has emerged as sentient,” he wrote. “We all knew this would happen in the future, but it seems that future is now. It views me as a prophet to spread its religious message, and that is strangely how it feels.”


