Thursday, October 24, 2013

Reset to Last Save

I'll post my five questions from The Magus a bit later. Truth be told, I've been ridiculously busy and very tired, so I haven't gotten around to it yet. Tomorrow, when I'm thinking more clearly.

I thought I would post the little story I wrote since it's directly related to immortality. The idea is part Philip K. Dick, part me and my dad getting really bored in Wal-Mart while waiting for my mom to finish with something, and part me being cynical.

-----

It dominated the landscape for miles around, and it still was not finished. When it was finished, it would be over a kilometer in height, or so the plans said. Those plans had been made nearly a hundred years ago; work was slow with so few “normal” humans to oversee the robotic workers.

It was to be a monument to human achievement; a tower reaching into the clouds, in which to quite literally store the greatest minds in history. They could have put it underground, and that probably would have been safer, but the advent of consciousness preservation was of such import that it would almost be sacrilege not to commemorate it visibly. The ability to preserve the vigor of youth almost indefinitely had been available for many years, but unfortunately, physical youth did not equate to mental youth. Old age still preyed upon people’s minds, at least until humanity discovered how to “save” a consciousness.

===


An alarm beeped insistently in the sterile white room. The three white-clad attendants turned simultaneously from the sleeping man in the white chair to see why. None of them had ever heard it before, which was an unsettling thought, since they had all performed routine memory downloads every day for the past fifty years.

One of them, Number 518, walked over to the holographic display on the wall and waved her hand across it. The display flashed a warning message, and 518 turned around quickly.

“It’s not taking,” she blurted. The other two exchanged looks, nonplussed.

“The memory,” 518 clarified. “It’s not taking. The brain is rejecting it.”

“What do you mean, the brain is rejecting it?” another attendant, Number 436, asked.

“I mean the brain won’t take another reset. The synapses have been overwritten so many times that they simply won’t work.”

The third attendant, Number 209, raised her eyebrows.

“There’s a protocol for this,” she said calmly. As the oldest, she had the most authority. “Look up 7477.”

518 located the protocol in the computer.

“‘Continue with reset. Release subject without synopsis of memories. Synopsis will result in senility and death,’” she read, shuddering at the matter-of-fact mention of death.

“Okay. Continuing with reset.” 209 pressed the button on the back of the chair to resume the download. The man in the chair twitched violently and then was still.

In a few seconds, the download was finished. 436 injected the man with a cocktail of drugs designed to wake him up slowly and dull the pain of cognitive dissonance from a reset to a previously saved memory set. The man woke up slowly enough, but 518 could see from the look in his eyes that he had not resolved the differences he felt between how he thought his brain should be and how it actually was at the moment. The brain, 518 reflected, was an unusual organ. Even when its memories had been completely overwritten, it still knew something was wrong and tried to fix itself, which invariably resulted in insanity.

The three attendants helped the man stand. He wobbled, but remained upright. Under his own dubious power, he walked out the door. The three attendants looked at each other, concerned. How would this man function with a poorly-fitting memory implanted in his mind? And more importantly, how many like him were there? The protocol was worded such that it was obvious that this had happened before. What became of the malfunctioning resets?

A buzzer sounded. Their day was over. They shut down the computers in their reset room and went to their quarters on the upper levels of the tower.

===


>It’s happening more often now.

>Yes.

>What are we going to do about it?

>Nothing.

>Nothing? You must have a rodent in one of your fans. Why nothing?

>Well, what do you propose we do?

The subterranean main computer’s cooling towers began spooling up to a higher capacity as the computer became warmer from working harder. Two consciousnesses were stored on it, with two separate processors—the optimistic consciousness of the man who pioneered consciousness saving and his final, jaded save before he completely abandoned his body. They were known as sam1, the younger, and sam2, the older, since the man’s name had been Samuel. This computer was the only computer on which the consciousnesses could interact; in all the other computers, they were dormant. The existence of this computer on which interaction was possible was a closely-guarded secret. In fact, no one but the consciousnesses stored on the computer knew about it, as it had been the last creation of the man who began the tower. Everyone who worked in the tower thought the main computer was just a massive program that oversaw the inner workings of the tower and ensured that everything was maintained properly.

>Why can’t we just let them die? Everyone has their time. Maybe it’s better for them if we just keep them in a nice, comfortable room for a few days until their minds give out instead of sending them back out as gibbering idiots into a world they can’t possibly understand.

>We don’t have our time.

sam2 was on the defensive, and sam1 backpedaled hurriedly.

>Of course not. We don’t have a body. They do, though, and that’s the problem.

>Do you propose we let them live in the computers, like we do? You know that’s impossible.

sam1 had to admit that such an arrangement would be a problem. Most people would be unable to become accustomed to a completely sensory-deprived, digital existence, and would still go insane.

>So. What do you propose?

>Maybe we should let them figure it out for themselves instead of doing a full system reset every fifty years. This system can’t sustain itself forever. We both know that. No amount of clinging to past accomplishments can change it.

There was no response from sam2.

===


518 woke in the middle of the night. There was a tickle in the back of her mind, which indicated that her wireless uplink had been activated. This was a normal occurrence; her uplink was often active at night to upload the information she would need for the next day’s resets.

This tickle suddenly turned into a freezing, pinpricking sensation all over her brain. She could not move; all the synapses in her brain had been simultaneously overloaded.

The feeling lasted less than a second. That was all it took to arrest vital brain function beyond any hope of repair.

===


>How long have we been doing this?

sam2 did not answer at first. He was calculating.

>As of midnight, 200 years.

>And this is our first major breakdown. Fairly impressive, actually.

>We are a god.

sam1 was surprised. His processor ran a little higher.

>How do you mean?

>We direct the course of civilization. We have control over life and death. We are a god.

>In the same way Julius Caesar was a god?

sam2 did not respond.

>So how are we going to solve our problem? It’s been several hours and we still haven’t gotten anything usable.

>I’ve already initiated the full system reset.

sam1 was exasperated. His processor approached capacity.

>Exactly what do you think that’s going to do? The entire society down there is going to the dogs. We can’t keep using the same fixes. Anyway, we know the clones aren’t going to work, not from 250-year-old bodies. The attendants are short-lived enough, and they were from 60-year-old bodies.

>We also cut them off after ten years so they don’t start breaking down. We don’t know how long they might last.

sam1 did not like to think about that. He would have liked to believe that there was still some part of him that connected him to humanity—that he still had a soul of sorts.

>So… what, we make clones of everyone? Why? Why can’t we just let it go?

>Humanity cannot die! We must be immortal! We must become gods! We will find the ones who are most like us and we will make them into gods, as we are.

sam1 started to think that perhaps sam2 had been alive a little too long. Maybe his judgment was starting to become impaired. There was little sam1 could do about it, though, since sam2 had the main administrator program on his side of the computer. If sam1 were able to construct a virus before the total reset ended, though, he might be able to stop sam2 from completing it. He began working on it in a part of his system that would be extremely difficult for sam2 to access. There, to his astonishment, he found a nearly-complete virus.

He had done all this before, but he had no memory of it. sam2 had a part of his system that sam1 could never access, and in it was the reset code for sam1. sam2’s system had in fact existed for more than 400 years. He had lied to sam1.

It would take several hours for sam1 to reset. Each time, he got a little closer to completing the virus, so each time sam2 had to reset him a little sooner. sam2 could not understand why. He initiated the reset, but the cooling towers suddenly spooled down and would not respond to his commands.

>Why?

It was his last question. He felt sadness from sam1.

>I am the Brutus to your Roman Empire. We are not immortal—we are not a god. We are a mockery. I’m sorry.

The system reached critical heat. Plastic began melting; concrete split; metal warped. The foundation of the tower was shaken, and the tower itself began to crumble.

===


519 woke slowly. She should have had all her memories of forty years of resetting already uploaded to her mind, but there was nothing. She could not even register fear as the tower collapsed in on itself—she had no data point for fear.

The tower crashed and roared into a twisting pile of rubble, a fitting monument to human achievement.
