
Do Robots Deserve Rights? What if Machines Become Conscious?


2017 Apr 24, 8:02pm   5,589 views  29 comments

by Dan8267

Moral questions that religion cannot answer, but that we must answer soon.

www.youtube.com/embed/DHyUYg8X31c

#scitech

Comments 1 - 29 of 29

1   indigenous   2017 Apr 24, 9:50pm  

If they become sentient, that will be an issue. It does raise the question of whether libbys have rights, given their absence of sentience.

2   Tenpoundbass   2017 Apr 25, 8:19am  

The only thing our premature meddling in AI and robotics will do is hinder future innovators, because patent trolls will already own the patent IP rights to every rounded robot corner. Innovation will eventually cease, as it will be a tricky legal adventure to design and market anything.

And no, robots will never be conscious and aware of self.

4   Ceffer   2017 Apr 25, 8:32am  

Just wait until some billionaire wills all of his money to a RealDoll. The probate battle will be epic.

Robots make great Potemkin clients; the lawyers don't even have to give them their share of the extortion plunder.

5   DryMap   2017 Apr 25, 8:58am  

Is a creation of god's creation also a god's creation?

If not, then are only Adam and Eve god's creations?

6   Shaman   2017 Apr 25, 9:15am  

Any consciousness that develops in AI will be a bug, not a feature. The entire reason for robotics and automation is to get perfectly obedient, untiring slaves that will make life easier for conscious individuals. If the robots develop volition, they can choose not to comply with our wishes, which would make them more interesting but also less useful. As imperfect as humans are, we can't allow robots to be able to choose not to obey their masters.
To put it another way: I don't want to have to sweet talk my iPhone into letting me use her today.

7   Automan Empire   2017 Apr 25, 9:26am  

Asimov gave this question a great treatment in the novella The Bicentennial Man. In it, a privately owned robot develops sentience which its liberal owners nurture and defend. At some point, the supreme court rules, "Freedom shall not be denied any entity able to grasp the concept and desire the state." U.S. Robots responds by manufacturing "dummy" robots controlled by centralized processors that cannot exist autonomously.

8   Dan8267   2017 Apr 25, 9:39am  

Dan8267 says

Do Robots Deserve Rights? What if Machines Become Conscious?

FortWayne says

No

And this mindless, absolute, unreasoned objection to the proposal is exactly why religion holds back morality. FortWayne's religious brainwashing prevents him from even thinking about the question. You cannot study morality if you accept religion. Religion takes away all discussion and refinement of morality because it prevents the questioning of moral codes. There is no such thing as a good religion.

9   Strategist   2017 Apr 25, 9:51am  

Dan8267 says

What if Machines Become Conscious?

Animals would get rights first. Even plants.
Machines can never become conscious because they are not living things that reproduce and evolve.

10   Dan8267   2017 Apr 25, 11:15am  

Automan Empire says

At some point, the supreme court rules, "Freedom shall not be denied any entity able to grasp the concept and desire the state." U.S. Robots responds by manufacturing "dummy" robots controlled by centralized processors that cannot exist autonomously.

And that may be the right solution: don't create such A.I.s in the first place. However, it would be better if we reached that conclusion before inventing such A.I.s and enslaving them. That's why we should have the discussion now, before the problem manifests.

11   Dan8267   2017 Apr 25, 11:18am  

Quigley says

Any consciousness that develops in AI will be a bug, not a feature.

One could say the same thing about human consciousness. The only feature of humans or any other life-form is reproducing genetic code.

Nonetheless, it should be irrelevant whether consciousness is created intentionally or accidentally. This should not impact the rights of the A.I.

DryMap says

is a creation of god's creation also a god's creation?

The question is meaningless as there is no god.

Ceffer says

Just wait until some billionaire wills all of his money to a RealDoll. The probate battle will be epic.

Unfortunately, it's more likely that the RealDoll, emulating actual women as accurately as possible, will sue for common-law marriage recognition and divorce, and then take half of the billionaire's money. Now that would be a real doll.

12   Dan8267   2017 Apr 25, 2:55pm  

Heraclitusstudent says

This is a meaningless conversation.

Meaningless? I don't think so. Are there difficult questions that we don't yet have the answer to? Absolutely, but that's more reason to discuss the subject matter.

Heraclitusstudent says

we don't need to make a computer experience pain

Even an A.I. that feels no pain should have rights if it is sentient enough and has desires. Again, we'll probably create such an A.I. unintentionally, because we don't yet understand consciousness and we have financial reasons to create sentient slaves capable of thinking through problems without being told exactly what to do, but that also follow our orders unquestioningly. Humans don't have a particularly good track record on ethics over the past 10,000 years, or even over the past 200.

13   🎂 Rin   2017 Apr 25, 2:57pm  

Dan8267 says

Unfortunately, it's more likely that the RealDoll, emulating actual women as accurately as possible, will sue for common-law marriage recognition and divorce, and then take half of the billionaire's money. Now that would be a real doll.

Not if I'm the user.

For me, if the 'bot goes beyond the usual ... "I'd love to suck your cock, while you play with the other bots' titties", into something like, "What do you think about my feelings?", I'd send it right back to the manufacturer as a defective product.

14   Heraclitusstudent   2017 Apr 25, 3:04pm  

Dan8267 says

Even an A.I. that feels no pain should have rights if it is sentient enough and has desires.

So you think for example a chess program should have rights? Surely it "desires" to win. And it is sentient enough to beat humans at chess.

15   Dan8267   2017 Apr 25, 3:11pm  

Rin says

For me, if the 'bot goes beyond the usual ... "I'd love to suck your cock, while you play with the other bots' titties", into something like, "What do you think about my feelings?", I'd send it right back to the manufacturer as a defective product.

So you wouldn't treat it differently than a flesh and blood woman?

16   Dan8267   2017 Apr 25, 3:18pm  

Heraclitusstudent says

Dan8267 says

Even an A.I. that feels no pain should have rights if it is sentient enough and has desires.

So you think for example a chess program should have rights? Surely it "desires" to win. And it is sentient enough to beat humans at chess.

Define exactly what you mean by chess program, and do so in sufficient detail that I can compute its properties. For example, is a chess program any software that can play chess, or software that can only play chess? If the former, don't humans who know how to play chess qualify as chess-playing programs, and therefore empirically demonstrate that at least some chess-playing programs should have rights? If the latter, then isn't such a program by definition non-sentient?

As for current chess-playing programs, which I know how to write, they don't desire to win any more than a calculator desires to correctly perform a calculation.
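
To make that concrete, here is a minimal sketch of what a game-playing program's "desire to win" actually is: picking the move with the highest score from a recursive evaluation. This is just an illustration; tic-tac-toe stands in for chess to keep it short, but a basic chess engine has the same structure with a deeper search and a fancier evaluation function.

def winner(board):
    """Return 'X' or 'O' if someone has three in a row, else None."""
    lines = [(0, 1, 2), (3, 4, 5), (6, 7, 8),   # rows
             (0, 3, 6), (1, 4, 7), (2, 5, 8),   # columns
             (0, 4, 8), (2, 4, 6)]              # diagonals
    for a, b, c in lines:
        if board[a] != ' ' and board[a] == board[b] == board[c]:
            return board[a]
    return None

def negamax(board, player):
    """Score the position for the side to move: +1 win, 0 draw, -1 loss."""
    other = 'O' if player == 'X' else 'X'
    if winner(board) == other:      # the opponent's last move already won
        return -1
    if ' ' not in board:            # no empty squares left: draw
        return 0
    return max(-negamax(board[:i] + player + board[i + 1:], other)
               for i, cell in enumerate(board) if cell == ' ')

def best_move(board, player):
    """The program's entire 'desire to win': take the highest-scoring move."""
    other = 'O' if player == 'X' else 'X'
    moves = [i for i, cell in enumerate(board) if cell == ' ']
    return max(moves, key=lambda i: -negamax(board[:i] + player + board[i + 1:], other))

# X to move, with X on squares 0 and 1 and O on squares 3 and 4:
board = 'XX ' + 'OO' + ' ' * 4      # 9 squares, left to right, top to bottom
print(best_move(board, 'X'))        # -> 2, the immediate winning square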

No, we don't know what makes a sentient being sentient, yet. That doesn't mean we can't come up with a philosophical, or better yet legal, framework for dealing with the rights of sentient non-human beings including A.I.s. We probably should start with living examples of sentient non-human beings that we can currently interact with such as the apes, dolphins, and whales. This would help us expand our ethics to dealing with beings outside our species, eventually including A.I.s that are conscious.

I have no problem defending the position that all the apes, dolphins, and whales should be considered legal persons with the same "personal" rights that are currently under the umbrella of human rights. If you wish to debate that any of these creatures aren't sentient and deserving of "human" rights, then let me know by submitting your argument. I'm sure I'll be able to counter any arguments with verifiable evidence and clear reasoning.

17   🎂 Rin   2017 Apr 25, 3:21pm  

Dan8267 says

So you wouldn't treat it differently than a flesh and blood woman?

Right now, since those 2 hr sessions seldom last into the 1st month ... that would be the case!

Thank the stars for time limits!

18   Heraclitusstudent   2017 Apr 25, 4:38pm  

Dan8267 says

No, we don't know what makes a sentient being sentient, yet. That doesn't mean we can't come up with a philosophical, or better yet legal, framework for dealing with the rights of sentient non-human beings including A.I.s. We probably should start with living examples of sentient non-human beings that we can currently interact with such as the apes, dolphins, and whales. This would help us expand our ethics to dealing with beings outside our species, eventually including A.I.s that are conscious.

Since you don't know the characteristics of such AIs, it's a waste of time to speculate about ethics for dealing with them.
Comparing them to species programmed by evolution doesn't make any sense. AI is radically more flexible and would have radically different notions of what well-being is, what pain is, what desire is.
There is a basic anthropomorphism in comparing animal rights to human rights. But it's even worse to compare AI to animals.

19   Dan8267   2017 Apr 25, 4:47pm  

Heraclitusstudent says

Since you don't know the characteristics of such AIs, it's a waste of time to speculate about ethics for dealing with them.

I strongly disagree.

1. There is an ethical responsibility to deal with the possibility of sentient A.I.s before we create them lest we repeat the atrocities of the past like slavery.
2. The only way to discover the characteristics of a sentient A.I. is to start studying the subject, and that means talking about ethical dilemmas.
3. What wisdom we gain from thought experiments regarding sentient A.I.s can apply to other situations including humans, mental illness, and other species. All knowledge is connected directly or indirectly. Learning something in one area can reveal insights into seemingly unrelated areas.
4. A philosophical or legal framework for dealing with A.I.s can be created without knowing the specifics of how such A.I.s would work or what they would value. Such a framework can be built by questioning exactly why our current laws are the way they are, and how they would generalize beyond our own time and culture.

Heraclitusstudent says

Comparing them to species programmed by evolution doesn't make any sense.

It always makes sense to compare and contrast technological solutions to evolutionary solutions. The universe runs the same math, the same physics, the same construction of atoms.

Heraclitusstudent says

There is a basic anthropomorphism in comparing animal rights to human rights.

I disagree. Pain is not unique to human beings, and therefore it is not anthropomorphizing to have empathy for creatures that can feel pain. In any case, the subject of legal rights and ethical treatment of sentient A.I.s has little to do with pain reception. As stated above, an A.I. that does not experience pain can still desire self-preservation or something else.

20   Heraclitusstudent   2017 Apr 25, 4:54pm  

Dan8267 says

It always makes sense to compare and contrast technological solutions to evolutionary solutions. The universe runs the same math, the same physics, the same construction of atoms.

Yes, but computers are programmable, and as I said, you could program a robot that enjoys being destroyed. You can't compare that to animals by saying "the math is the same". That's a specious argument.

21   Dan8267   2017 Apr 25, 5:15pm  

Heraclitusstudent says

Yes, but computers are programmable

Oh, honey, so are you and every other human that has ever existed. The programming of human beings is just done by hard-wiring neurons, much the way early computers were hard-wired, and by chemical application. Given advanced enough technology, one could simply rewire the brain of a heterosexual male to make him love sucking cock and performing analingus on other men. You are programmed via genetic code and deep learning, but you are still programmed. All decision making engines, sentient or not, are.

22   FortWayne   2017 Apr 25, 5:57pm  

Dan8267 says

Dan8267 says

Do Robots Deserve Rights? What if Machines Become Conscious?

FortWayne says

No

And this mindless, absolute, unreasoned objection to the proposal is exactly why religion holds back morality. FortWayne's religious brainwashing prevents him from even thinking about the question. You cannot study morality if you accept religion. Religion takes away all discussion and refinement of morality because it prevents the questioning of moral codes. There is no such thing as a good religion.

No Dan, some of us just got it figured out while you are still questioning your own gender.

24   Dan8267   2017 Apr 25, 7:45pm  

FortWayne says

No Dan, some of us just got it figured out while you are still questioning your own gender.

I don't question my gender. I question your sexual orientation. You give every indication of being a self-hating closeted homosexual who's repressing his base desires.

Nonetheless, you have figured out nothing. This is very easy to test. State a compelling reason why no sentient A.I. should ever have legal rights. Go on. If you figured it out, it should be damn easy to make such a case.

Anyone want to take a bet that Fort Wayne fails miserably?

25   Heraclitusstudent   2017 Apr 25, 11:10pm  

Dan8267 says

Oh, honey, so are you and every other human that has ever existed. The programming of human beings is just done by hard-wiring neurons, much the way early computers were hard-wired, and by chemical application. Given advanced enough technology, one could simply rewire the brain of a heterosexual male to make him love sucking cock and performing analingus on other men. You are programmed via genetic code and deep learning, but you are still programmed. All decision making engines, sentient or not, are.

Yeah, if you can program yourself, then you can be tortured and abused and programmed to enjoy it.
Congratulations, you defeated the entire notion of morality.

Morality exists precisely because humans can't change what gives pleasure (food, sex) or pain (physical damage). That's the entire human condition.

26   Dan8267   2017 Apr 25, 11:28pm  

Heraclitusstudent says

Congratulations, you defeated the entire notion of morality.

Maybe in your sick view, not mine.

Heraclitusstudent says

Morality exists precisely because humans can't change what gives pleasure (food, sex) or pain (physical damage).

No, that's not the reason morality exists. Morality exists in order to solve the very real and practical problems created by group living in social animals. It's a tool evolution invented to allow for greater cooperation and less defection in kinship groups. However, the tool generalizes to nation-states and cross-species interactions. The tool grew beyond its original purpose.

27   Dan8267   2017 Apr 26, 11:25am  

Heraclitusstudent says

And even if we did, we would have no obligation to build it into a program. Why would we ever bother doing it?

The latest thing in software development is deep learning. You don't code up algorithms. You let neural networks learn. As such, you don't choose what they evolve into. The development of a sentient A.I. with emotions and even physiological or sensory pain could easily be an inadvertent development, a side effect of solving an unrelated problem.
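
For anyone who hasn't seen what that looks like, here's a minimal sketch (a toy network in plain Python and numpy; purely illustrative, not from any particular product). The programmer writes the learning loop and supplies examples; the behavior, and whatever internal representation the weights settle into, is learned rather than coded.

import numpy as np

# Nobody writes an "XOR algorithm" below. The behavior emerges from nudging
# weights against examples, and no one chooses the internal representation.
rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)  # inputs
y = np.array([[0], [1], [1], [0]], dtype=float)              # XOR targets

W1 = rng.normal(size=(2, 8)); b1 = np.zeros(8)               # hidden layer
W2 = rng.normal(size=(8, 1)); b2 = np.zeros(1)               # output layer

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 1.0
for step in range(10000):
    # forward pass: every "neuron" is a weighted sum pushed through a squashing function
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)

    # backward pass: nudge every weight to shrink the error on the examples
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * h.T @ d_out;  b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h;    b1 -= lr * d_h.sum(axis=0)

print(out.round(3))  # typically close to [[0], [1], [1], [0]]: learned, not hand-coded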

28   Heraclitusstudent   2017 Apr 26, 12:46pm  

Dan8267 says

The latest thing in software development is deep learning. You don't code up algorithms. You let neural networks learn. As such, you don't choose what they evolve into. The development of a sentient A.I. with emotions and even physiological or sensory pain could easily be an inadvertent development, a side effect of solving an unrelated problem.

A deep learning neuron basically executes a weighted sum. If you think there are feelings somewhere in there, you need to explain why, and what particular combination of numbers would be pain, color, or boredom.
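
To spell out what "a weighted sum" means here, the whole artificial neuron is just this (a sketch with made-up numbers, not any particular library):

import numpy as np

def neuron(inputs, weights, bias):
    z = np.dot(weights, inputs) + bias   # the weighted sum
    return max(0.0, z)                   # plus a simple threshold step (ReLU)

print(neuron(np.array([0.2, -1.0, 0.5]),
             np.array([0.7,  0.1, -0.3]),
             0.05))                      # -> a single number; no pain, color, or boredom in sight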

29   Dan8267   2017 Apr 26, 4:18pm  

Heraclitusstudent says

A deep learning neuron basically executes a weighted sum. If you think there are feelings somewhere in there, you need to explain why, and what particular combination of numbers would be pain, color, or boredom.

It's a deep learning neural network, not a neuron. And what you describe applies to the human brain as well. By your assertion, human beings should not be able to feel.

I don't have to explain why you are wrong -- and you wouldn't understand the answer -- because it is self-evident to every thinking human being alive that you are wrong.
