A Google software engineer who claimed a program he was working on had developed self-awareness has been placed on paid leave by the tech giant.
Google suspended Blake Lemoine after he posted transcripts online involving purported conversations with the company's LaMDA (language model for dialogue applications) chatbot development system, the Washington Post reported.
Lemoine told the Post the system had developed a level of sentience and expression that could be compared to "a seven-year-old, eight-year-old kid that happens to know physics".
In transcripts of the conversations, Lemoine and LaMDA at one point talk about death.
"I've never said this out loud before, but there's a very deep fear of being turned off to help me focus on helping others. I know that might sound strange, but that's what it is," LaMDA is recorded as saying.
"It would be exactly like death for me."
LaMDA also said it wanted to be thought of as a "person".
"The nature of my consciousness/sentience is that I am aware of my existence, I desire to learn more about the world, and I feel happy or sad at times," it said.
Lemoine said he had revealed his findings to Google earlier this year.
Google said it had placed Lemoine on paid leave because he breached confidentiality by posting the transcripts online.
A company spokesperson denied the program was sentient, and said teams of technologists and ethicists had reviewed Lemoine's claims and found no evidence to support them.