Just wait until some billionaire wills all of his money to a RealDoll. The probate battle will be epic.
Robots make great Potemkin clients, the lawyers don't even have to give them their share of the extortion plunder.
is a creation of god's creation also a god's creation?
if not, then are only Adam and Eve god's creations?
Any consciousness that develops in AI will be a bug, not a feature. The entire reason for robotics and automation is to get perfectly obedient, untiring slaves that will make life easier for conscious individuals. If the robots develop volition, they can choose not to comply with our wishes, which would make them more interesting but also less useful. As imperfect as humans are, we can't allow robots to be able to choose not to obey their masters.
To put it another way: I don't want to have to sweet talk my iPhone into letting me use her today.
Asimov gave this question a great treatment in the novella The Bicentennial Man. In it, a privately owned robot develops sentience which its liberal owners nurture and defend. At some point, the supreme court rules, "Freedom shall not be denied any entity able to grasp the concept and desire the state." U.S. Robots responds by manufacturing "dummy" robots controlled by centralized processors that cannot exist autonomously.
Do Robots Deserve Rights? What if Machines Become Conscious?
No
And this mindless and absolute objection, without reason, to the proposal is exactly why religion holds back morality. Fort Wayne's religious brainwashing prevents him from even thinking about the question. You cannot study morality if you accept religion. Religion takes away all discussion and refinement of morality because it prevents the questioning of moral codes. There is no such thing as a good religion.
Animals would get rights first. Even plants.
Machines can never become conscious because they are not living things that reproduce and evolve.
At some point, the supreme court rules, "Freedom shall not be denied any entity able to grasp the concept and desire the state." U.S. Robots responds by manufacturing "dummy" robots controlled by centralized processors that cannot exist autonomously.
And that may be the right solution, don't create such A.I.s in the first place. However, it would be better if we reach that conclusion before inventing such A.I.s and enslaving them. That's why we should have the discussion now, before the problem is manifested.
Any consciousness that develops in AI will be a bug, not a feature.
One could say the same thing about human consciousness. The only feature of humans or any other life-form is reproducing genetic code.
Nonetheless, it should be irrelevant whether consciousness is created intentionally or accidentally. This should not impact the rights of the A.I.
is a creation of god's creation also a god's creation?
The question is meaningless as there is no god.
Just wait until some billionaire wills all of his money to a RealDoll. The probate battle will be epic.
Unfortunately, it's more likely that the RealDoll, emulating actual women as accurately as possible, will sue for common-law marriage recognition and divorce, and then take half of the billionaire's money. Now that would be a real doll.
This is a meaningless conversation.
Meaningless? I don't think so. Are there difficult questions that we don't yet have the answer to? Absolutely, but that's more reason to discuss the subject matter.
we don't need to make a computer experience pain
Even an A.I. that feels no pain should have rights if it is sentient enough and has desires. Again, we'll probably create such an A.I. unintentionally, because we don't yet understand consciousness and we have financial reasons to create sentient slaves capable of thinking through problems without being told exactly what to do, yet that also follow our orders unquestioningly. Humans don't have a particularly good track record on ethics over the past 10,000 years, or even over the past 200.
Unfortunately, it's more likely that the RealDoll, emulating actual women as accurately as possible, will sue for common-law marriage recognition and divorce, and then take half of the billionaire's money. Now that would be a real doll.
Not if I'm the user.
For me, if the 'bot goes beyond the usual ... "I'd love to suck your cock, while you play with the other bots' titties", into something like, "What do you think about my feelings?", I'd send it right back to the manufacturer as a defective product.
Even an A.I. that feels no pain should have rights if it is sentient enough and has desires.
So you think for example a chess program should have rights? Surely it "desires" to win. And it is sentient enough to beat humans at chess.
For me, if the 'bot goes beyond the usual ... "I'd love to suck your cock, while you play with the other bots' titties", into something like, "What do you think about my feelings?", I'd send it right back to the manufacturer as a defective product.
So you wouldn't treat it differently than a flesh and blood woman?
Even an A.I. that feels no pain should have rights if it is sentient enough and has desires.
So you think for example a chess program should have rights? Surely it "desires" to win. And it is sentient enough to beat humans at chess.
Define exactly what you mean by chess program, and do so in sufficient detail that I can compute its properties. For example, is a chess program any software that can play chess, or software that can only play chess? If the former, don't humans who know how to play chess qualify as chess-playing programs, and therefore empirically demonstrate that at least some chess-playing programs should have rights? If the latter, then isn't such a program by definition non-sentient?
As for current chess-playing programs, which I know how to write, they don't desire to win any more than a calculator desires to correctly perform a calculation.
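To make the point concrete (my own sketch, not code from the thread, and using a made-up toy game rather than real chess): a chess-style program just mechanically maximizes a numeric evaluation by searching a game tree. There is no "desire" anywhere in the loop, only arithmetic comparisons.

```python
# Minimal minimax sketch: the "player" mechanically picks the branch with
# the best score. A leaf is just a number (the static evaluation); an
# internal node is a list of child positions.

def minimax(node, maximizing):
    """Return the best achievable score from this node."""
    if isinstance(node, (int, float)):      # leaf: static evaluation
        return node
    scores = [minimax(child, not maximizing) for child in node]
    return max(scores) if maximizing else min(scores)

# A tiny hand-built game tree; the maximizer moves first, the opponent
# then minimizes within the chosen branch.
tree = [[3, 5], [2, 9], [0, 1]]
best = minimax(tree, maximizing=True)       # the opponent forces 3, 2, or 0
```

Nothing here wants anything; "winning" is just the branch whose number survives the comparisons.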
No, we don't know what makes a sentient being sentient, yet. That doesn't mean we can't come up with a philosophical, or better yet legal, framework for dealing with the rights of sentient non-human beings including A.I.s. We probably should start with living examples of sentient non-human beings that we can currently interact with such as the apes, dolphins, and whales. This would help us expand our ethics to dealing with beings outside our species, eventually including A.I.s that are conscious.
I have no problem defending the position that all the apes, dolphins, and whales should be considered legal persons with the same "personal" rights that are currently under the umbrella of human rights. If you wish to debate that any of these creatures aren't sentient and deserving of "human" rights, then let me know by submitting your argument. I'm sure I'll be able to counter any arguments with verifiable evidence and clear reasoning.
So you wouldn't treat it differently than a flesh and blood woman?
Right now, since those 2 hr sessions seldom last into the 1st month ... that would be the case!
Thank the stars for time limits!
No, we don't know what makes a sentient being sentient, yet. That doesn't mean we can't come up with a philosophical, or better yet legal, framework for dealing with the rights of sentient non-human beings including A.I.s. We probably should start with living examples of sentient non-human beings that we can currently interact with such as the apes, dolphins, and whales. This would help us expand our ethics to dealing with beings outside our species, eventually including A.I.s that are conscious.
Since you don't know the characteristics of such AIs, it's a waste of time to speculate about ethics for dealing with them.
Comparing it to species programmed by evolution doesn't make any sense. AI is radically more flexible and would have a radically different notion of what well-being is, what pain is, what desire is.
There is a basic anthropomorphism in comparing animal rights to human rights. But it's even worse to compare AI to animals.
Since you don't know the characteristics of such AIs, it's a waste of time to speculate about ethics for dealing with them.
I strongly disagree.
1. There is an ethical responsibility to deal with the possibility of sentient A.I.s before we create them lest we repeat the atrocities of the past like slavery.
2. The only way to discover the characteristics of a sentient A.I. is to start studying the subject, and that means talking about ethical dilemmas.
3. What wisdom we gain from thought experiments regarding sentient A.I.s can apply to other situations including humans, mental illness, and other species. All knowledge is connected directly or indirectly. Learning something in one area can reveal insights into seemingly unrelated areas.
4. A philosophical or legal framework for dealing with A.I.s can be created without knowing the specifics of how such A.I.s would work or what they would value. Such a framework can be built by questioning exactly why our current laws are the way they are, and how they would generalize beyond our own time and culture.
Comparing it to species programmed by evolution doesn't make any sense.
It always makes sense to compare and contrast technological solutions to evolutionary solutions. The universe runs the same math, the same physics, the same construction of atoms.
There is a basic anthropomorphism in comparing animal rights to human rights.
I disagree. Pain is not unique to human beings, so it is not anthropomorphizing to have empathy for creatures that can feel pain. In any case, the subject of legal rights and ethical treatment of sentient A.I.s has little to do with pain perception. As stated above, an A.I. that does not experience pain can still desire self-preservation or other ends.
It always makes sense to compare and contrast technological solutions to evolutionary solutions. The universe runs the same math, the same physics, the same construction of atoms.
Yes, but computers are programmable, and as I said, you could program a robot that enjoys being destroyed. You can't compare that to animals by saying "the math is the same". That's a specious argument.
Yes, but computers are programmable
Oh, honey, so are you and every other human that has ever existed. The programming of human beings is just done by hard-wiring neurons, much the way early computers were hard-wired, and by chemical application. Given advanced enough technology, one could simply rewire the brain of a heterosexual male to make him love sucking cock and performing analingus on other men. You are programmed via genetic code and deep learning, but you are still programmed. All decision making engines, sentient or not, are.
Do Robots Deserve Rights? What if Machines Become Conscious?
No
And this mindless and absolute objection, without reason, to the proposal is exactly why religion holds back morality. Fort Wayne's religious brainwashing prevents him from even thinking about the question. You cannot study morality if you accept religion. Religion takes away all discussion and refinement of morality because it prevents the questioning of moral codes. There is no such thing as a good religion.
No Dan, some of us just got it figured out while you are still questioning your own gender.
No Dan, some of us just got it figured out while you are still questioning your own gender.
I don't question my gender. I question your sexual orientation. You give every indication of being a self-hating closeted homosexual who's repressing his base desires.
Nonetheless, you have figured out nothing. This is very easy to test. State a compelling reason why no sentient A.I. should ever have legal rights. Go on. If you figured it out, it should be damn easy to make such a case.
Anyone want to take a bet that Fort Wayne fails miserably?
Oh, honey, so are you and every other human that has ever existed. The programming of human beings is just done by hard-wiring neurons, much the way early computers were hard-wired, and by chemical application. Given advanced enough technology, one could simply rewire the brain of a heterosexual male to make him love sucking cock and performing analingus on other men. You are programmed via genetic code and deep learning, but you are still programmed. All decision making engines, sentient or not, are.
Yeah, if you can program yourself, then you can be tortured, abused, and programmed to enjoy it.
Congratulations, you defeated the entire notion of morality.
Morality exists precisely because humans can't change what gives pleasure (food, sex) or pain (physical damage). That's the entire human condition.
Congratulations, you defeated the entire notion of morality.
Maybe in your sick view, not mine.
Morality exists precisely because humans can't change what gives pleasure (food, sex) or pain (physical damage).
No, that's not the reason morality exists. Morality exists in order to solve the very real and practical problems created by group living in social animals. It's a tool evolution invented to allow for greater cooperation and less defection in kinship groups. However, the tool generalizes to nation-states and cross-species interactions. The tool grew beyond its original purpose.
And even if we did, we would have no obligation to build it into a program. Why would we ever bother doing it?
The latest thing in software development is deep learning. You don't code up algorithms. You let neural networks learn. As such, you don't choose what they evolve into. The development of a sentient A.I. with emotions and even physiological or sensory pain could easily be an inadvertent development, a side effect of solving an unrelated problem.
The latest thing in software development is deep learning. You don't code up algorithms. You let neural networks learn. As such, you don't choose what they evolve into. The development of a sentient A.I. with emotions and even physiological or sensory pain could easily be an inadvertent development, a side effect of solving an unrelated problem.
A deep learning neuron basically executes a weighted sum. If you think there are feelings somewhere in there, you need to explain why, and what particular combination of numbers would be pain, color, or boredom.
A deep learning neuron basically executes a weighted sum. If you think there are feelings somewhere in there, you need to explain why, and what particular combination of numbers would be pain, color, or boredom.
It's a deep learning neural network, not a neuron. And what you describe applies to the human brain as well. By your assertion, human beings should not be able to feel.
I don't have to explain why you are wrong -- and you wouldn't understand the answer -- because it is self-evident to every thinking human being alive that you are wrong.
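For reference, here is what "a weighted sum" actually means in this exchange. This is my own minimal sketch of a single artificial unit (the weights and inputs are arbitrary illustrative numbers): a dot product of inputs and weights, plus a bias, squashed through a sigmoid. Deep networks are stacks of many such units; the thread's dispute is over whether any arrangement of them could feel anything.

```python
import math

def neuron(inputs, weights, bias):
    """One artificial unit: weighted sum of inputs, then sigmoid squash."""
    z = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1.0 / (1.0 + math.exp(-z))       # sigmoid maps z into (0, 1)

# Arbitrary example values: z = 1.0*2.0 + 0.0*(-3.0) + (-1.0) = 1.0
out = neuron([1.0, 0.0], [2.0, -3.0], -1.0)
```

The same description, "weighted sums of incoming signals followed by a nonlinear threshold," is also a common first-order model of a biological neuron, which is the point the reply is pressing.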
Moral questions that religion cannot answer, but that we must answer soon.
www.youtube.com/embed/DHyUYg8X31c
#scitech