“And what about the moral end of this?” Rachel asked. “What do you mean, exactly?”
“How far does this go,” Rachel said. “What if Fisher wants to start living his own life outside of the lab?”
“That’s a long way off, but I think that would be excellent.”
“What if he hurts someone, what if he commits some crime?”
“He wouldn’t do that.”
“Is there programming to keep him from doing something like that?”
“Not exactly. He was built to grow. We started him as a powerful nugget of information and intention, but he is constantly changing. One of the main inclinations in that nugget was to do good for the world. He is more likely to volunteer for charity than to rob a bank.”
“Is that really free thinking? Programming him like that?”
“It’s no more fatalistic than genetics or child rearing.”
“True. And kids from good families end up bad sometimes.”
“There are no fail-safes, if that’s what you’re asking,” Sara said. “There is no self-destruct button on Fisher.”
“What if Fisher wants to build another robot?” Rachel asked.
“He actually does want that!” Sara said, laughing. “It was one of the first things he asked when he started thinking freely. The desire to create is the first sign of an imagination.”
“Are you going to allow him to build the robot?”
“Sure,” Sara said. “We’re all excited about it. Fisher is socially awkward, like a child that grows too fast. But he is more intelligent than any human has ever been. One hundred times more intelligent, a thousand even. You can tap into the Internet through your implant or through your glasses, but he is the Internet. He’s a part of it. The time it takes him to access and analyze information is indistinguishable from simply knowing it.”
“And that doesn’t frighten you, Sara?” Rachel asked. “That kind of power doesn’t frighten you?”
“No,” she said. “He just wants to create something. He just wants to build a robot.”
“Right,” Rachel said. “What will you say when he wants to build another? Or a hundred of them? A thousand?”
“Well, clearly there are unspoken boundaries.”
“So you haven’t discussed these boundaries with him?”
“No,” she said. “There’s been so much work; no one even considered something like that. You and I are talking about things far in the future here. There is plenty of time to address power issues.”
“All right,” Rachel said. “Theoretically speaking, what if things got out of control? How would that make you feel? Is there anything that Fisher could say or do at this point that would make you end the project?”
“He’ll never say anything like that.”
“Why not?” Rachel pressed. “How do you know?”
“We’re doing good work,” Sara insisted. “We’re doing God’s work.”
Rachel had her mouth open to ask another question but then stopped abruptly. “Did you just say ‘God’?”
Sara smiled and looked down at the table. The wrinkles on her face bunched up around her eyes and oddly enough, she looked prettier for it. She stood up and walked over to the stove in the break room. She clicked on one of the burners and placed a tea kettle on top.
“The project is not a Christian project or anything,” she said. It was clear that she was not used to talking about this at work. “None of the other scientists are here for that reason.”
“But you are?”
“Yes, partly. I mean, I also needed work. I needed something to focus on. I’ve been focusing on environmental issues for a long time and it was going nowhere. But now I’m in the field of Artificial Intelligence because God wants me here. Will you take some tea?”