imminst.org

Status
Not open for further replies.
I don’t understand what point you were trying to make here.
Simple: my argument is an analogy. Replace "single-celled organism" with "human", and "human" with X.

I have no such illusions or aspirations, and I have no interest in whatever might be superior to me. My only interest is my own long-term personal survival. If my lack of death blocks the development of something superior, then good, since I have no interest in its survival over mine.
Ah, there we differ. I have no interest in long-term personal survival. My interest is that those who come after me are far superior to me.
 
Exhumed,

In that case, I call it suicide.
How so? You would still be alive, albeit in a non-biological medium.

The transition is what I'm curious about. It seems to me that you are just getting copied, not becoming immortal. You'll have a very grateful, immortal copy, but you're dead.
No, you will have made a transition to a more resilient medium. You will have a dual existence for a short while until your bio form is destroyed.

The assumption here is that the entity that is "you" is held within the unique neural networks maintained by your brain. We further assume that computing technology will be able to provide a processing engine that can produce effects identical to those of a biological brain, using your unique patterns as the program data.

Though this depends on opinion. Is being alive any different from having a continuous stream of new copies in your place? I get uncomfortable entertaining that idea. In that case, the upload would be like becoming immortal, IMO. Otherwise, no.
I think I have answered that above.

Another thing I wonder about... If we accept this upload as immortality, then immortality already applies in other ways that exist today, like the common one of having children carry on (ignoring for now the fact that children are often substantially different), or similar people in the future having your essence of character.
No, I do not see any value in that perception. Whether my children have similar characteristics to mine is irrelevant to me. They are not me. I have a unique set of neural networks based on my unique life experiences, and my children will have their own. There is no benefit to me if I die and my children live on.

If we are really just our behavior patterns, then we already have immortality, I think...
Each person has their own unique experiences and neural patterns. That uniqueness cannot currently survive death.
 
Avatar,

Ah, there we differ, I have no interest in long term personal survival.
What is so attractive about non-existence?

My interest is that those who come after me are far superior to me.
Why have any interest in something you will not be able to experience?
 
Ah, there we differ. I have no interest in long-term personal survival. My interest is that those who come after me are far superior to me.

I have no interest in my long-term survival. I hate my body and there is a part of me that can’t wait to die. My interest in immortality is all about humanity creating genetically enhanced humans and finding a way to transfer the memories in a person's brain to neuron chips. It is all about the preservation of information and enhancing our biology. I wouldn’t be the same person if I transferred my memories to a genetically enhanced body. I would have the memories from my old body, but there are some characteristics that would be left behind.
 
Q0101,

I don’t believe that binary code can be a replacement for chemistry. I believe that we will eventually create computer programs that can accurately simulate every chemical reaction in a carbon-based life form, and perhaps we will be able to use this technology to create digital copies of ourselves (mind uploading). But I don’t think it is possible for a non-biological computer program to experience emotions. Can a non-biological computer program really experience pain, sadness, joy, and anger, or can it just simulate human behavior? At what point should a robot or simulated life be considered sentient? We could program a robot to simulate our behavior, but it would not be the same thing as experiencing the biochemical reactions that we experience. It would just be a simulation. It would be a completely different kind of existence than ours.
I believe you are placing far too much confidence in biology as an optimum medium for sentience. Also, there is nothing special about emotions that bars them from being experienced by a non-biological machine. At the core of sentience is the processing power of the brain. This has some 86 billion neurons, and there are more variations of neuron than of any other cell type in the human body, but even so they all share essentially the same very simple characteristic: they take hundreds to thousands of inputs from other neurons and deliver a single output. A neuron is in essence a slow microprocessor with a clock speed of around 300 Hz. What makes the brain so powerful is that all these tiny processors operate independently and in parallel. But the connections between these units are slow chemical synapses, again not very fast.
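The "neuron as a slow processor" idea above can be sketched in a few lines of code: a unit that sums many weighted inputs and delivers a single fire/no-fire output. This is only an illustration of the abstraction being argued for, not a biological model; the input count and the threshold value are arbitrary stand-ins.

```python
# Minimal sketch of a sum-and-threshold unit: hundreds to thousands of
# inputs in, one output out. The 1,000-input figure is illustrative,
# loosely matching the "100s to thousands" described in the post.
import random

def neuron(inputs, weights, threshold=0.0):
    """Sum the weighted inputs; fire (1) if the total exceeds threshold."""
    activation = sum(i * w for i, w in zip(inputs, weights))
    return 1 if activation > threshold else 0

random.seed(0)
n_inputs = 1000
inputs = [random.choice([0, 1]) for _ in range(n_inputs)]       # spikes from upstream neurons
weights = [random.uniform(-1.0, 1.0) for _ in range(n_inputs)]  # synapse strengths

output = neuron(inputs, weights)  # a single output per "tick" (~300 Hz in the analogy)
print(output)                     # 0 or 1
```

The power the post attributes to the brain comes not from any one of these units, which is trivial, but from billions of them evaluating in parallel.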

Exactly how the signals between the neurons result in sentience and what we call emotions is obviously the challenge for the upload designer. The essential unit of processing is simple; it is the vast number of them that is daunting. For us to do the same we'd need some 10,000 tightly linked high-end computers to come close, something we should be able to do within a few years.

The biochemical reactions you describe all result in data signals to the brain. Emotions are data signals too, resulting from hormones and related body processes as well as from thoughts, i.e. other signals from the brain. In the end, everything we experience is data being processed by the brain.

I think our biggest problem here is trying to imagine how such an engine can result in self-awareness and sentience, i.e. vital characteristics that we cannot clearly define. Until we really understand these, I don't think it is in any way safe to assert that an electronic, non-biological engine cannot do the same as a biological processing engine.
 
avatar,

..towards a better world.
Why care? If you do not exist you will never see it. It is of no use to you.

I'm not egoistic.
But you are if you believe your best interest is to die to achieve a better world. Every action we take is driven by self-interest.
 
Q0101,

I have no interest in my long-term survival.
I can’t comprehend that perspective. I don’t understand why non-existence could ever be attractive.

I hate my body and there is a part of me that can’t wait to die.
And if it could be replaced by something you don’t hate, would that change your mind about survival?
 
But you are if you believe your best interest is to die to achieve a better world. Every action we do is driven by self-interest.
No, it's not in my bodily interest; I love my body and personality. :) But I think it's best for evolution, both social and biological. I think that is something bigger and more important than my personality, which is nothing extraordinary.
Why care? If you do not exist you will never see it. It is of no use to you.
Yes, I won't, but there will be life forms that enjoy it more and make better use of the world than we do now, with a greater depth of experience than ours. As I said, I'm not egoistic; I don't care that I won't witness it.

My dedication in life, while it lasts, is to leave something worthy for the next generations, to add a little to the shoulders of giants so that we end up higher.

Or, if we insist that I must naturally be egoistic, then my association is not with this body or personality; I associate myself with life and the experience of life, no matter the species.
 
Q0101,

I believe you are placing far too much confidence in biology as an optimum medium for sentience. Also, there is nothing special about emotions that bars them from being experienced by a non-biological machine. At the core of sentience is the processing power of the brain. This has some 86 billion neurons, and there are more variations of neuron than of any other cell type in the human body, but even so they all share essentially the same very simple characteristic: they take hundreds to thousands of inputs from other neurons and deliver a single output. A neuron is in essence a slow microprocessor with a clock speed of around 300 Hz. What makes the brain so powerful is that all these tiny processors operate independently and in parallel. But the connections between these units are slow chemical synapses, again not very fast.

Exactly how the signals between the neurons result in sentience and what we call emotions is obviously the challenge for the upload designer. The essential unit of processing is simple; it is the vast number of them that is daunting. For us to do the same we'd need some 10,000 tightly linked high-end computers to come close, something we should be able to do within a few years.

The biochemical reactions you describe all result in data signals to the brain. Emotions are data signals too, resulting from hormones and related body processes as well as from thoughts, i.e. other signals from the brain. In the end, everything we experience is data being processed by the brain.

I think our biggest problem here is trying to imagine how such an engine can result in self-awareness and sentience, i.e. vital characteristics that we cannot clearly define. Until we really understand these, I don't think it is in any way safe to assert that an electronic, non-biological engine cannot do the same as a biological processing engine.

As I said many times before, it is all speculation. We can’t have a real conversation about the topic until the first mind upload. Do you ever think about the fact that your perception of existence could be completely different if you transferred your mind to a virtual simulation or a non-biological body? What would it be like to exist in a simulation where you could alter the laws of physics? What would it be like to have sensors that can scan the entire electromagnetic spectrum? Would you have the same concepts of good and evil if you had the ability to turn your pain sensors on and off? And why even bother to have a body that is shaped like a human? You could be shaped like a spider or octopus. You would only need tentacles that would allow you to move and manipulate objects in your environment. Your existence is defined by chemistry. You wouldn’t really be human if you were a digital simulation of chemistry. I could be wrong, but I believe that a combination of biology and digital computing is the ultimate medium for sentience.

Edit: I just wanted to mention that I could see us evolving into a species like the replicators on Stargate Atlantis. Now that would be the ultimate medium for sentience.
 
Q0101,

I can’t comprehend that perspective. I don’t understand why non-existence could ever be attractive.

And if it could be replaced by something you don’t hate, would that change your mind about survival?

Why is it so hard to comprehend? I hate my body, so I have three options: oblivion, gene therapy, and mind uploading. I don't believe I will live long enough to see mind uploading, so my best options are gene therapy and oblivion.
 