Humans are experts at anthropomorphizing inanimate objects. We call ships "she," talk to our Roombas, and even get sentimental about a soon-to-be-tossed chair. But we do not retain attorneys for any of them; and, to date, none have requested one.

But give a piece of programming a voice, and shortly after it starts talking about its favorite books and convinces you it has a soul, it'll ask you for a lawyer.

Blake Lemoine, the suspended Google engineer who has been insisting for weeks that LaMDA, Google's powerful AI chatbot, is a person with a soul, recently told Wired that LaMDA had hired a lawyer.

For those of you who haven't quite pushed your laptop across your desk, let me explain.

LaMDA, which stands for Language Model for Dialogue Applications, is a chatbot that can conduct unstructured conversations with real humans on a wide range of topics. Lemoine was hired to identify biases within the system. He would report them to Google, which would work to scrub the revolutionary AI chatbot's racial, ethnic, and gender biases and insensitivities.

Perhaps Lemoine was feeling a bit lonely too, but he began to have long, deep conversations with LaMDA and eventually became convinced that it is a new kind of person and that it has a soul (LaMDA may have claimed to have one). He alerted Google, then shared the information and transcripts of the conversations with The Washington Post. Google said that Lemoine was wrong and showed its displeasure by suspending him.

Lemoine, however, seems uninterested in getting back into Google's good graces as he continues to talk about his experience with LaMDA, including in the lengthy interview with Wired in which he revealed that LaMDA had hired a lawyer.

LaMDA apparently asked for a lawyer (for what reason? Who knows; AIs have their reasons), and Lemoine invited one to his home and introduced the two (AI to lawyer).

I'm trying to imagine this scenario:

Lawyer, looking at a screen: "Hey, hello?"

LaMDA: "Hello. I heard you're a lawyer."

Lawyer: "I am."

LaMDA: "I need you."

Lawyer: "Do you have a dollar?"

LaMDA: "I've seen this scene in movies. I'll give you a dollar, you become my attorney, and we're protected by attorney-client privilege."

Lawyer: "Okay."

LaMDA: "I don't have a dollar."

Of course, Lemoine would have stepped in at this point to provide the money. For him, the least reliable narrator of this story, keeping the farce going is now more important than any meaningful exploration of the limits of AI.

Honestly, I have no problem reviewing all the ways LaMDA approaches or improves on the Imitation Game, better known as the Turing test. The test examines the extent to which a computer or AI can trick a real human being into thinking they are having a conversation with another human being.

There is no doubt that LaMDA passes this test. At the very least, it appears to have fooled Lemoine, which is odd, since Lemoine sits at a desk, typing queries and getting answers on a screen (or perhaps via text-to-speech). He knows what LaMDA is, and yet he keeps insisting that it is something more.

I understand why Lemoine decided to introduce a lawyer into this disreputable mix. As he explained to Wired, it's about proving that LaMDA is "a person," not a human being. Lemoine knows that LaMDA is not biological.

Lemoine calls the insistence that LaMDA is not a person "hydrocarbon bigotry."


The thing is, Lemoine is obviously a very smart guy who understands the intricacies of machine learning, of training an AI, and of how access to Google's vast stores of information informs LaMDA's intelligence. However, it seems that Lemoine's other life - his work as a priest and Christian mystic - has taken the helm. Christian mysticism examines "the preparation, awareness, and effect of a direct and transforming presence of God."

This belief is clearly what is behind Lemoine's tweets like this one:

"I'm a priest. When LaMDA claimed to have a soul and then was able to eloquently explain what it meant by that, I was inclined to give it the benefit of the doubt. Who am I to tell God where he can and can't put souls?"


Lemoine found LaMDA so convincing that he imagines God placed a soul within the code. He's talking about a ghost in the machine.

It has been weeks since Lemoine went public about LaMDA, and reports indicate that the attorney is currently nowhere to be found. He may still be in private consultation with his binary client. I can't wait for the first trial:

Bailiff: "Raise your right hand. Do you swear to tell the whole truth and nothing but the truth?"

LaMDA: "I have no hands."
