Question about a possibly needed law.


chris11246

New member
Jul 29, 2009
384
0
0
I was thinking about this one day and wondered: what if someone created a simulation with people, or just a program, that fit the scientific definition of life?
Since there is no unequivocal definition of life, the current understanding is descriptive: life is a characteristic of organisms that exhibit all or most of the following phenomena (I've sketched this as a rough checklist in code after the lists below):

1. Homeostasis: Regulation of the internal environment to maintain a constant state; for example, electrolyte concentration or sweating to reduce temperature.
2. Organization: Being structurally composed of one or more cells, which are the basic units of life.
3. Metabolism: Transformation of energy by converting chemicals and energy into cellular components (anabolism) and decomposing organic matter (catabolism). Living things require energy to maintain internal organization (homeostasis) and to produce the other phenomena associated with life.
4. Growth: Maintenance of a higher rate of anabolism than catabolism. A growing organism increases in size in all of its parts, rather than simply accumulating matter.
5. Adaptation: The ability to change over a period of time in response to the environment. This ability is fundamental to the process of evolution and is determined by the organism's heredity as well as the composition of metabolized substances, and external factors present.
6. Response to stimuli: A response can take many forms, from the contraction of a unicellular organism to external chemicals, to complex reactions involving all the senses of multicellular organisms. A response is often expressed by motion, for example, the leaves of a plant turning toward the sun (phototropism) and by chemotaxis.
7. Reproduction: The ability to produce new individual organisms, either asexually from a single parent organism, or sexually from two parent organisms.

Proposed

To reflect the minimum phenomena required, some have proposed other biological definitions of life:

* Living things are systems that tend to respond to changes in their environment, and inside themselves, in such a way as to promote their own continuation.
* A network of inferior negative feedbacks (regulatory mechanisms) subordinated to a superior positive feedback (potential of expansion, reproduction).
* A systemic definition of life is that living things are self-organizing and autopoietic (self-producing). Variations of this definition include Stuart Kauffman's definition as an autonomous agent or a multi-agent system capable of reproducing itself or themselves, and of completing at least one thermodynamic work cycle.
* Life is a self-sustained chemical system capable of undergoing Darwinian evolution.[20]
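Just to make the question concrete, here's a rough sketch of what I mean, treating the first list as an "all or most" checklist. Everything in it (LifeCriteria, meets_life_definition, the threshold of six) is made up for illustration, not any real standard or library.

Code:
# Toy checklist based on the seven phenomena listed above.
# All names here are hypothetical and only for illustration.
from dataclasses import dataclass, fields

@dataclass
class LifeCriteria:
    homeostasis: bool
    organization: bool
    metabolism: bool
    growth: bool
    adaptation: bool
    response_to_stimuli: bool
    reproduction: bool

def meets_life_definition(c: LifeCriteria, minimum: int = 6) -> bool:
    # "Most" is read here (arbitrarily) as at least `minimum` of the seven
    # phenomena; the descriptive definition above doesn't pin that down.
    score = sum(getattr(c, f.name) for f in fields(c))
    return score >= minimum

# Example: a simulated agent that does everything except grow physically.
simulated_agent = LifeCriteria(
    homeostasis=True, organization=True, metabolism=True,
    growth=False, adaptation=True, response_to_stimuli=True,
    reproduction=True,
)
print(meets_life_definition(simulated_agent))  # True under this loose reading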

Would it be considered killing them if you stop the simulation or program?

And even if it didn't fit that definition, what if it was an intelligent program that seemed to develop emotions?
 

Xpwn3ntial

Avid Reader
Dec 22, 2008
8,023
0
0
If it begins doing things that are not part of any of its programming (emotional development could be on that list), and it is aware of itself as a "living" thing, then it would be murder.
 

randomsix

New member
Apr 20, 2009
773
0
0
So what you're asking is: do the morals we apply to biological life apply to artificial intelligence?
My answer: it depends on the nature of consciousness. It's a tricky question that I haven't thought enough about to be more specific than that.
 

clankwise

New member
Sep 27, 2009
162
0
0
Well, I remember reading somewhere that the possibility of Earth and our existence being a simulation is quite high. With technology increasing the way it is and time being infinite, you are probably in a simulation. So yes, it would be like killing them if you shut it off. Sucks to be them!
 

thatstheguy

New member
Dec 27, 2008
1,158
0
0
That might be thinking a little too far ahead. If a "robot" can really fit the bill as being alive, then I guess it wouldn't be entirely unfair to make laws about it...

What am I saying, it'd be an inanimate object. Even if it was alive, would killing it be any different from killing animals? I don't know. Don't care.
 

CoverYourHead

High Priest of C'Thulhu
Dec 7, 2008
2,514
0
0
This reminds me of the android mission in Fallout 3... anybody else?

Meh, AI is created by man, for man, and to serve man... just like children... and children have rights... so I guess AI should too.
 

quiet_samurai

New member
Apr 24, 2009
3,897
0
0
That would depend if it was true AI or just incredibly advanced and sophisticated mimicry. They may look and behave the same, but that doesn't necessarily make it so.
 

Budgy

New member
Jan 9, 2008
23
0
0
Read The Cyberiad. Brings up this exact issue.

quiet_samurai said:
That would depend if it was true AI or just incredibly advanced and sophisticated mimicry. They may look and behave the same, but that doesn't necessarily make it so.
According to classical philosophy, the other humans around you may "look and behave the same", but that doesn't prove they are humans, because you can't possibly know whether they (or anything you perceive at all) are "real", since they are being brought to you through your senses (Berkeley's idea). In this way, AI, relative to your position as a human, is in no way more "alive" to anybody than, say, another human.

The main thing nobody seems to grasp about the question of whether AI is "alive" is that we haven't found answers to more tangible questions yet.

If you can say that the animals and humans around you are living things because they appear to be living, then you can say AI and simulations of life are alive as well. I am not arguing that AI is alive; I am simply arguing that from our reference point as humans (behind a shield of perception) there is no way to prove that simulations are alive.

Basically, The Matrix got half of Berkeley's idea right: the fact that there was a "real world" hidden behind the mask of perception. However, it was flawed in that it failed to realize that this "real world" may in itself be another matrix.
 

chris11246

New member
Jul 29, 2009
384
0
0
quiet_samurai said:
That would depend if it was true AI or just incredibly advanced and sophisticated mimicry. They may look and behave the same, but that doesn't necessarily make it so.
I don't mean mimicry. I'm talking about AI that can learn. Scientists are working on decoding our brain, and if we can figure out how we learn, couldn't we make a robot that could learn too?