XXII

The Golden Age trilogy by John C. Wright is excellent science fiction. Also good are the books by Greg Egan. Both authors think about what the future could be like and put interesting technology like virtual reality and nanotech into their books. They both consider the problems of space colonization over long distances -- even once you get there, communication between colonies takes a long time at light speed. They both consider putting human minds into computers instead of flesh bodies, and the moral and legal issues with cloning. They both have somewhat libertarian attitudes.

One thing that makes these books especially interesting to read is seeing which issues the authors fall into parochial misconceptions about, and on which ones they see something that is hard to see and seems weird today. For example, both authors retain roughly the modern concept of exclusive, one-to-one, long term marriage (even including personal fighting). But on the other hand they can imagine a world where wanting a physical body is just a vestige of the past, or where people converse in different languages automatically translated in real time.

One of the issues of particular interest to me is creating copies of minds. At first glance I love the idea. I could play chess against myself. That'd be awesome. Or better, I could play team games except my whole team is copies of me. We'd be so organized and coordinated. It'd let me really test out how good my strategies are.

Egan and Wright describe characters who think copying is a serious moral issue. And they have a point. At the least it's like being a parent. You are creating a new person -- a new mind. You are responsible for helping that person get started in the world and become independent. Since a copy is already an adult (he doesn't need any advice or education), that might just mean giving him enough wealth to adjust to his situation (he is used to having your life and your means of income, but in most situations he'll need to find his own, at the least new work doing the same tasks) and find his place in the world.

But they take things further. Their characters have a strong distaste for copying themselves. They want immortality, so they make *inactive* copies and store them, and only wake them up if they die. They don't want two copies running at once. They make exceptions only for rare circumstances; sending copies to many distant planets might qualify. But even so, one of Egan's characters in Diaspora left instructions that once one of her copies arrived at a planet with life, no more should be woken up at other locations. And some people erased their Earth-copy when going to other planets, or later committed suicide once the planet expedition was a success. Another issue: one of the ships was destroyed en route (it hit space debris), and people considered this a tragedy -- the 92 awake people were killed. The rest, who were inactive, felt validated in their choice not to be awake -- they avoided death.

This way of thinking is wrong.

The important thing -- what people are -- is knowledge. Spreading knowledge is good. That's what we are doing when we tell parenting ideas to new people. And that's what we are doing when we build new computers -- we are putting more knowledge into more places. Having one blueprint in one place isn't good enough. It's important to spread knowledge -- copy it, even -- into many locations. This makes more areas of the universe good places -- places that create knowledge, or at least places that aid knowledge-creating entities. More computers embody more knowledge -- it doesn't matter that it's a copy. It's good. Now more people can use it. More planets can have computers if we build more. More locations. More space stations. And it's just the same with minds: having more minds thinking makes the universe a better place, and it makes their particular locations better places. Being a copy isn't a waste at all. Even if you didn't diverge and have different ideas than the original, a copy means more people can have conversations with you/your knowledge. That's great!

And this fear of death? If the knowledge is destroyed, that's equally bad whether you were awake or not. This whole idea of sleeping through long flights is completely wasteful -- completely inhumane. It's horrible to have all these people pretending they are in comas when they could be alive and awake and thinking. It's like they don't think life is worth living unless they have a planet to play with. Why not spend the time thinking? (And simulating virtual reality worlds -- you can have whatever kind of life you want, all by yourself -- except not by yourself, because you can put people into your world -- your children -- but not children in the normal sense; you can make fully formed adult friends, just ones who are your responsibility.)

The only bad part of death, besides the knowledge destruction -- including the prevention of completing the goals it had (which, by the way, is also prevented by not letting the copy wake up) -- is the suffering. But when you're going near light speed, even a human body wouldn't suffer from a collision -- it'd be completely obliterated far too fast to feel pain. And these minds in computers don't have pain nerves anyway. The only way they might suffer is if they got advance warning of their death and that distressed them. But if that's even possible, and wouldn't allow for dodging the obstacle or solving the problem, then one could still choose not to hear about it. You don't have to watch where you're going. If you choose to be notified about impending death, and you are distressed by it, that's not rational. If you want to know, you should be glad to have found out -- it doesn't make sense to want to be notified but then treat the notification as anything other than a gift -- a miracle of science -- you get to know in advance like you wanted; your preferences are more satisfied. It's good. If you don't like knowing, then you should be happy not to find out. You might also say you want to know but you're suffering because you got unlucky with having an obstacle in your ship's way. But that's not rational either -- there is no reason to feel bad about luck. You didn't choose wrongly.

There is also confusion about identity. If I copy myself, who is me? And I think this is why, really, people don't like the idea of their copies being destroyed en route *while awake*. They have very bad ideas about consciousness (the conventional ideas are just plain magical thinking). They see being awake and conscious as critical, they see this person as them, and they find the idea a bit like dying themselves. This is absurd. Putting the copy to sleep is nothing but a disservice that prevents it from thinking. And it's not you, in the same way that two copies of a book are distinct objects. It just has the same knowledge as you. Which is destroyed whether it's awake or not.

The whole way of thinking about identity and "me" is bad. Just don't worry about it. If you copy your mind, now there is the same knowledge in two places. (And it will become different over time, but no matter; that's just a natural consequence of creating new ideas and changing unpredictably.) So what? There isn't one that's "really" you -- it's the same knowledge. That's the whole point. It's like giving special status to the first book off the printing press for being the "original".

And you know there are already a zillion copies of you and your mind in the multiverse (see The Fabric of Reality). Knowledge is a multiversal structure. Knowledge can be the same across more universes because there is a reason it is that way -- it's not arbitrary. So you get larger structures across the multiverse. People are a major one. Vast regions of the multiverse -- vast numbers of "parallel universes" -- have very nearly or exactly the same *you*. Because your mind is a matter of knowledge, and that gives it stability. If a conclusion is a matter of logic, you are going to reach it in most universes, so you end up the same in a lot of places. So: lots of copies of "you". Get used to it. What's one more? This one you can meet and talk with. But so what? More of you is good. It means more knowledge in the multiverse. Simple.

Whatever "consciousness" or "self awareness" or also "qualia" is doesn't really matter to any of this. I'm sure it matters for *something* but not to these fundamental issues. It's just a detail -- a property of certain knowledge. The important thing is still that copy yourself is like making more copies of OS X and spreading them around -- a good thing. As long as they can all find happy places in the world -- as long as you take on the responsibility of a parent -- then it's all good. The important thing to think about is in terms of knowledge.

Why is killing a cat not bad? Because it didn't have any important knowledge in it. It did have knowledge, but nothing important or useful, and it's easy to reproduce. It's the exact same logic as destroying a stapler. It has knowledge. Destroying them for no reason is a bit of a waste (but a small waste, of minuscule importance beside human preferences). But we can create more staplers no problem, and more cats. The knowledge in them is fungible.

If you destroy a *unique* person that's really bad. They can't be recreated. Knowledge is gone and will have to be completely reinvented. We don't have to think about consciousness or anything like that. If they are asleep it's the same.

And this way of thinking works with a fetus too. A fetus has no unique knowledge, so it's not important. End of story.

Once people have copies, destroying one won't be murder. It will be like destroying a stapler, except that people are more complex and have more knowledge. So it will be more like destroying billions of dollars of information. Except that copying data will be cheap, so it might just be a small hassle to recreate it. It's still billions of today's dollars of information; it's just that stuff will be so cheap in the future that a billion dollars of today's wealth won't matter at all. The exception will be if the copy diverged from the backup -- if it has new, unique knowledge. Also, you will put the person out of action while they are restored -- a bit like forcing them to take a nap. That's bad, but it's not murder. (Yes, parents shouldn't do that to their kids. Ever.) If they have some unique knowledge not yet backed up and you destroy it, that's bad too -- quite unfortunate -- but it's not murder. Murder is killing a person-sized amount of knowledge. You've destroyed something a lot smaller -- like murdering a couple of someone's ideas. Bad, but smaller. Such things happen by accident all the time today -- people get hit on the head and forget an idea. And sometimes it's someone else's fault -- he hit a tennis ball at you by mistake when you weren't looking -- and we don't prosecute him for murder, nor even for the few ideas/neurons he knocked out of your head. We just try to be more careful next time.

Relating to marriage, as mentioned earlier: one character had a breakup with his partner of many years. He doesn't know why, though, because she asked him to delete his memories of her. Now he only remembers that she made that request and he agreed. That's horrible! That's killing knowledge. It's destroying part of himself. And his "loved one" wanted him to be hurt in this way. His loved one wanted all the good times they had, when they helped each other, to be destroyed. And he agreed to it -- how messed up is he? She has no right! Or does she? How much autonomy did he voluntarily give up?

Elliot Temple on August 17, 2007
