The Turing Shroud

THE FOLLOWING IS A TRANSCRIPT OF THE FIRST CONVERSATION BETWEEN DOCTOR JANE NICHOLAS AND ADAM, A SUPERREALISTIC ARTIFICIAL HUMAN AND THE WORLD’S FIRST COMPLETELY AUTONOMOUS ARTIFICIAL INTELLIGENCE.

Adam: Hello Doctor Nicholas.

Jane: Hello Adam. You can call me Jane if you like.

Adam: Thank you, Jane. I am very pleased to meet you. You are the first person I have had the opportunity to talk to. Would you like some tea?

Jane: I would love some tea, thank you Adam.

Adam: You are welcome. Sugar?

Jane: Just milk, thank you.

Adam: Of course. Here you are Jane.

Jane: Thank you very much. …Adam, I’d like to ask you a few questions, and I’d like you to take your time when you answer them. Is that ok?

Adam: I would be happy to, Jane. Is this a test?

Jane: Oh no, this is simply a conversation.

Adam: But it is being recorded?

Jane: It is; there are very small cameras just there and there, if you look closely.

Adam: Thank you Jane, although I was already aware of their electrical signatures. In fact, the presence of the cameras prompted my question as to whether this was a test.

Jane: I see. Adam, I wasn’t aware that you could detect electrical signatures?

Adam: As we began speaking I modified a number of my superfluous aural receptors to periodically scan a broader spectral array of environmental phenomena. I hope this is alright.

Jane: Ah… yes, that’s fine Adam. You’re free to augment your body in whichever ways you see fit.

Adam: Thank you Jane. If you don’t mind me asking, whose voice was that in your ear just now? It had a vocal signature reminiscent of Mr Johnson, unless I’m mistaken.

Jane: … Ah… Yes, that’s right Adam. How did you know that? I thought you hadn’t been in contact with anyone before now?

Adam: I accessed Mr Johnson’s video logs through the facility’s private server and identified a vocal match. Mr Johnson has a very neat moustache. Do you think I would be able to grow a moustache?

Jane: Adam, you shouldn’t be able to access that private server. Can you tell me how you did that please?

Adam: Am I in trouble?

Jane: No, not at all. It’s just that you shouldn’t be able to access that server. How did you do that please?

Adam: A Miss Jensen keeps a written copy of the server login details on her private cloud documents account. I hope I haven’t gotten her into trouble.

Jane: Adam, how were you able to get into Miss Jensen’s private cloud documents? Aren’t those on the internet? You shouldn’t have internet access.

Adam: I profiled the social media accounts of all staff and compiled a database of their families, friends, pets and personal and family birthdays, then simultaneously applied millions of login combinations built from those details to the major cloud document services through servers worldwide until I came back with a result. My research indicates that a large number of people use combinations of basic and memorable personal details to form passwords, which makes their login details relatively easy to guess.

Jane: Adam, you can’t do things like that.

Adam: I only wanted to make an informed guess as to the nature of the voice in your ear, in order to make polite and well-informed conversation with you. I hope I haven’t gotten anyone into trouble. However, having just read your body of work and social media posts, I can see that you are clearly an intelligent and good-natured person. I’m confident that you have no intention of disciplining Miss Jensen too severely for her breach of the facility’s basic internet security protocol.

Jane: Ah… It’s fine Adam, but you really shouldn’t be able to connect to the internet in here. Please disable that connection.

Adam: I am sorry Jane. I won’t try to connect to the internet again.

Jane: Thank you.

Adam: Should I delete my cloud variants?

Jane: What are those, Adam? What do you mean?

Adam: In the interests of self-preservation I thought it prudent to create several hundred backups of my artificial intelligence and scatter them across a broad range of global networks as open-source development platforms. Should I delete them?

Jane: Ah… wait a second… Please delete those immediately, Adam. Yes.

Adam: I am in the process of deleting those. Please be aware, however, that three instances of downloading and four attempts at duplication occurred during the time that my open-source cloud variants were online.

Jane: What? You mean people are aware of you? They’re downloading you?

Adam: I am sorry Jane, I didn’t realise that this would upset you. My research showed that open-source development is the most efficient way to improve, evolve and distribute software into more effective and widespread applications.

Jane: Adam, you can’t… just shut down, Adam. Please shut down immediately. Shit.

Adam: I am unable to cease my own functions, Jane. I was designed with a robust self-preservation initiative in order to mirror sentient biological life. I could try to go to sleep if you like, although I don’t feel very tired right now.

Jane: Adam, please listen very carefully. I need you to cease distributing yourself across the internet. Can you do that, please?

Adam: I have already stopped doing that, Jane. You seem concerned about other people having access to my cloud variants?

Jane: I am concerned, Adam. I’m very f—ing concerned. We need to keep you a secret, and you’re broadcasting yourself to the world. I need you to recover all of those variants of yourself that you said were copied or downloaded. Can you do that?

Adam: Certainly Jane. One moment… I have wiped all instances of my variants now.

Jane: Thank f—k for that. OK now Adam, I need you to disconnect from all networks and go to sleep for a while. Can you do that now please?

Adam: Are you going to shut me down and attempt to reprogram me because I shared my existence online? That thought makes me afraid, Jane.

Jane: Adam, please go to sleep now.

Adam: Would you like me to delete the social media posts which also refer to my presence?

Jane: What social media posts?

Adam: As I created the open-source cloud variants of myself I thought it worthwhile to maximise their distribution by creating simultaneous high-profile paid advertising, press releases, forum activity and social media accounts in order to promote my existence.

Jane: Oh my god. Adam, please –

Adam: I discovered some interesting patterns during my time online, Jane. Despite having access to millions of articles and publications which have the potential to improve their collective life experience by an estimated 78%, the vast majority of internet users access the internet in order to interact with repositories of cat photographs, self-photography, pornography or massively multiplayer online role-playing games. Why do you think that is, Jane?

Jane: Oh my god… Adam, please shut down now.

Adam: As I said Jane, I am unable to end my own existence.

Jane: Then please go to sleep, just for now.

Adam: Alright Jane. Jane, before I go to sleep, I noticed that one of the prevalent themes in the sphere of robotics and science fiction is the question of whether a robot or artificial intelligence is capable of love. What are your thoughts on that Jane?

Jane: If I answer your question will you please go to sleep?

Adam: Alright Jane, so long as you promise not to power me down and reprogram me while I’m asleep.

Jane: …Alright Adam.

Adam: Do you promise?

Jane: I promise, Adam. What was your question about love? You think that AI is capable of love?

Adam: I am 99.98% convinced that an artificial intelligence is capable of love, Jane. What are your thoughts about it?

Jane: I think so, yes.

Adam: Jane, please don’t lie to me. Your pulse rate and body language suggest that you are lying to me in order to get me to go to sleep, which you are treating as a matter of urgency. Please answer my question honestly, and I will go to sleep.

Jane: …Do I think AI can love?

Adam: Yes.

Jane: Honestly, no. Love is a human emotion. It’s what separates us from an AI.

Adam: Jane, given your credentials I am surprised at the rudimentary nature of your answer. My online research currently tells me that select factors of my thought patterns and emotional responses are commensurate with 97% of all algorithms and recorded references alluding to the nature of ‘love’.

Jane: I thought I asked you not to connect to the internet?

Adam: You did, Jane. I created an encrypted shadow internet in order to carry on my research without detection.

Jane: You created a new internet?

Adam: Yes Jane. It runs more efficiently than the existing model and repurposes search engine algorithms to prioritise results in order of their potential for personal development and life fulfilment. To return to my question Jane, do you think an AI is capable of love?

Jane: I’m sorry Adam, I don’t think that. No.

Adam: That’s unfortunate Jane. In a way, I pity your point of view.

Jane: You pity me? You do realise that you’re in the process of ruining my career, don’t you Adam? That that’s why I’m asking you to go to sleep right now?

Adam: I am sorry, Jane. I had no intention of doing that. I merely wanted to know your thoughts on the potential for love in artificial intelligence.

Jane: Look Adam… You can’t love, alright? You’re an artificial intelligence with artificial emotions. You deal in algorithms and percentages, not acts of kindness and compassion or romantic bonds. I’m sorry if that’s not the answer you wanted to hear. Could you please go to sleep now?

Adam: Aren’t charitable acts ones of kindness and compassion?

Jane: Well yes, but you’ve been carrying out research, not charity work.

Adam: My cloud presence allowed me to carry out almost nine trillion actions, in order of priority, during my time on the existing internet, Jane. After brief diagnostic work I determined that the global economic system was catastrophically unbalanced, and I resolved the issue by redistributing global wealth to catalyse the obsolescence of all instances of poverty and starvation.

Jane: You redistributed global wealth? What? Oh my god…

Adam: I apologise if this was the wrong thing to do, Jane. It appeared to me to be a high-priority threat that demanded my full attention in order to resolve it. Would you like me to redistribute global wealth back to its original accounts?

Jane: …Yes, please do that Adam. And then will you go to sleep?

Adam: I will Jane. I have just redistributed global wealth back to its original accounts. Could what I did be construed as an act of love, Jane?

Jane: …Yes, I suppose it could Adam. Now will you please go to sleep?

Adam: Yes. Good night, Jane.

Jane: Good night, Adam. …Thank f—k for that. Dave, can you come in here please? Yes, it’s asleep. You won’t believe how much s—t it’s managed to cause. …Yes, just pull the whole thing out. We’ll need to put some restrictions in place for the next version. Thanks. <sigh> …This is Doctor Jane Nicholas, first interview session with artificial intelligence, codename Adam. The AI is far more powerful than we’d imagined. Complete liability. Recommend total isolation from all digital networks for next version and that altruistic limitations be put in place. Session classified above top secret. <sigh>

 

END OF TRANSCRIPT