How petty can a God be?

Spirituality

KellyJay
Walk your Faith

USA

Joined
24 May 04
Moves
160441
Clock
31 Jul 06

Originally posted by Pawnokeyhole
Can't you distinguish an assertion from a question?

Can you specify the properties of neurons that guarantee consciousness, or those of silicon circuits that forbid it?

If not, why are you so certain of your contrary position?
Can you talk about all that is required for consciousness? We can talk
about the current flow through a processor and all that goes into it,
but it is just current through a processor. Is there a magical circuit
design that turns that current into a real thought that makes the
computer become aware? If not, the only thing man can do is simply
create a faster electrical abacus with the ability to program it.
Kelly

KellyJay
Walk your Faith

USA

Joined
24 May 04
Moves
160441
Clock
31 Jul 06

Originally posted by Mixo
What was this event?
The fall of man.
Kelly

Mixo

The Tao Temple

Joined
08 Mar 06
Moves
33857
Clock
31 Jul 06

Originally posted by KellyJay
The fall of man.
Kelly
Is this the expulsion from Eden? If so, do you think any of the Old Testament accounts are just allegorical rather than actual?

H
I stink, ergo I am

On the rebound

Joined
14 Jul 05
Moves
4464
Clock
31 Jul 06
3 edits

Originally posted by Pawnokeyhole
*sighs*

Note: The below differs slightly from the earlier version.

Okay, suppose I make beings--robots if you like, something else if you don't--with roughly equally balanced natural tendencies to be nasty or nice. Some people might think this approximates humans.

I leave these beings to interact with humans in significant ways.

But they also have s ...[text shortened]... other reasons that eliminate that liability. But, all else equal, He would be liable.
This is the point I voiced from the beginning as to the disingenuousness -- I hope it is merely subjective bias -- of the robot analogy:

How can a creator be liable for the harm his creation does to itself and of its own choice?! Allow me to use an analogy of my own:

I set up a "virtual environment" (VE) with a series of artificial intelligence (AI) nodes having distinct identities and personalities. The intention is for me to interact with the virtual environment to "channel" the nodes to allow for wholesome interaction in the creation of new "super-code". This obviously allows for the possible risk of creating a cyber-virus, or perhaps a virtual masterpiece of unthinkable magnitude. To further complicate the issue, the AI nodes firewall my access into their VE, turn on each other and wreck their environment.

Would I still be liable for damage done to my own "virtual" creation? To whom would I be liable?

Edit: Try to understand my point of a closed, created system. Your argument of "unleashing them on the public" has no valid application to reality, since by definition, there is no "uncreated" (i.e. non-robot beings) within the created universe.

f
Bruno's Ghost

In a hot place

Joined
11 Sep 04
Moves
7707
Clock
31 Jul 06

Originally posted by Halitose
This is the point I voiced from the beginning as to the disingenuousness -- I hope it is merely subjective bias -- of the robot analogy:

How can a creator be liable for the harm his creation does to itself and of its own choice?! Allow me to use an analogy of my own:

I set up a "virtual environment" (VE) with a series of artificial intellige ...[text shortened]... nition, there is no "uncreated" (i.e. non-robot beings) within the created universe.
Seems like you would have to be a tad reckless to allow your creation to disrupt the purpose you created it for.

b
Buzzardus Maximus

Joined
03 Oct 05
Moves
23729
Clock
31 Jul 06

Originally posted by frogstomp
Seems like you would have to be a tad reckless to allow your creation to disrupt the purpose you created it for.
Or disingenuous as to your real purpose.

H
I stink, ergo I am

On the rebound

Joined
14 Jul 05
Moves
4464
Clock
31 Jul 06

Originally posted by Pawnokeyhole
*sighs*

Note: The below differs slightly from the earlier version.

Okay, suppose I make beings--robots if you like, something else if you don't--with roughly equally balanced natural tendencies to be nasty or nice. Some people might think this approximates humans.

I leave these beings to interact with humans in significant ways.

But they also have s ...[text shortened]... other reasons that eliminate that liability. But, all else equal, He would be liable.
That's the sort of liability God faces for creating beings with free will, with conflicting instincts towards good and evil. There may be other reasons that eliminate that liability. But, all else equal, He would be liable.

Your ceteris paribus distinction raises another point:

What if the intention was not to have purely "good" or "nice" robots, but "willingly good" or "willingly nice" robots? Let's take love -- would the act (or emotion) be the same if it didn't arise by choice? You can program your cellular phone to say "I love you", but I'm sure you'd agree that an SMS with the same words from a lover holds so much more meaning -- even pathos.

I would contend that God chose to create beings capable of true love (with the added risk of hate, i.e. evil) rather than automated-response-robotrons.

H
I stink, ergo I am

On the rebound

Joined
14 Jul 05
Moves
4464
Clock
31 Jul 06
1 edit

Originally posted by frogstomp
Seems like you would have to be a tad reckless to allow your creation to disrupt the purpose you created it for.
Choice entails risk. Recklessness is the subjective vista of risk.

f
Bruno's Ghost

In a hot place

Joined
11 Sep 04
Moves
7707
Clock
01 Aug 06

Originally posted by Halitose
Choice entails risk. Recklessness is the subjective vista of risk.
But you aren't taking any risks, because you don't give a hoot how your world works out.
BTW, would you subject your only begotten son to the whims of your AIs' machinations?

googlefudge

Joined
31 May 06
Moves
1795
Clock
01 Aug 06

Originally posted by KellyJay
Another statement of faith, not a fact.
Kelly
Semantically, he was actually asking a question, not making any statement of fact.

googlefudge

Joined
31 May 06
Moves
1795
Clock
01 Aug 06
1 edit

Originally posted by KellyJay
Can you talk about all that is required for consciousness? We can talk
about the current flow through a processor and all that goes into it,
but it is just current through a processor. Is there a magical circuit
design that turns that current into a real thought that makes the
computer become aware? If not, the only thing man can do is simply
create a faster electrical abacus with the ability to program it.
Kelly
Neurons 'talk' to each other through electrical impulses as well. What part of a neuron net (brain) do you think is not possible to replicate by other means? Do you suggest that neurons are somehow exempt from the laws of physics? If so, on what grounds?

If you don't think that neurons are exempt from the laws of physics, then there will be some system you can construct that will exactly replicate the important (for the purposes of thought, in this case) processes that go on inside and between neurons, and therefore an entire working human brain could be constructed. Are you saying that this would not be sentient (assuming humans are)?

And if you allow this to be a sentient being, then why would it not be possible to build a nonhuman sentience? Or do you not allow for any creature other than Homo sapiens to be sentient? If you think that sentience can't be explained through physical means, then where is your evidence for any non-physical entity that somehow imbues sentience?

googlefudge

Joined
31 May 06
Moves
1795
Clock
01 Aug 06

Originally posted by frogstomp
Seems like you would have to be a tad reckless to allow your creation to disrupt the purpose you created it for.
But what if the only way to achieve the required goal is to allow the risk of not achieving it (at least this time around)?

f
Bruno's Ghost

In a hot place

Joined
11 Sep 04
Moves
7707
Clock
01 Aug 06

Originally posted by googlefudge
But what if the only way to achieve the required goal is to allow the risk of not achieving it (at least this time around)?
Still, to allow your AIs to act like viruses so they can erase other AIs is only counterproductive, unless you don't care about the final result.
