Posted: Mon May 23, 2005 10:06 pm
lots of morons in here
Foo wrote: lots of morons in here
Keep It Real wrote: Foo wrote: lots of morons in here
We cool g?
Massive Quasars wrote: [xeno]Julios wrote: Perhaps it's because deep down I don't believe in free will.
I don't want to get into a FW discussion here and now, but given my cursory analysis of this issue, I believe that free will exists, just not in an absolute sense. I think free will increases in the direction of greater complexity of life, so that humans would have more free will than apes, and apes more than insects. I don't have a high degree of confidence in this position, simply because I have not given the question enough consideration. However, with degrees of free will, there can be degrees of responsibility.
The problem is, it's very hard to define responsibility.
Foo wrote: Nightshade wrote: I hope to god you never have to ask yourself if you still believe this statement while standing over your child's hospital bed.
..and as soon as you've done that, your society will put you to the chair. Or whatever.
If that happened to my daughter, I would do everything I possibly could to make sure that whoever did it met a BAD end.
Point being, it's just cyclical. If you're not enough of a person to let the buck stop at your door.. well, fuck ya.
It would depend on how I did it, but yes, there is that possibility. Which is another flaw in our justice system.
Nightshade wrote: Why on earth does a piece of pigshit that would beat, rape, and bury a child deserve to be rehabilitated? Two in the chest, one in the head.
:lol: Couldn't agree more. I'm of the exact same opinion.
[xeno]Julios wrote: Perhaps it's because deep down I don't believe in free will.
Scooby Doo wrote: RUH ROW, RAGGY!!
Transient wrote: Rehabilitation is bullshit. The percentage of inmates who get rehabilitated and proceed to stay out of trouble is so small it's laughable. Most end up committing the same crime again, if not a worse crime. Life in prison is bullshit, too. It costs $60,000 to imprison a man per year. The death penalty is just a quick spike in the electrical bill.
Where do you get this statistic from? Somebody convicted and sentenced for murder, and then released, is no more likely to commit another crime than anybody else.
[xeno]Julios wrote: The problem is, it's very hard to define responsibility. Consider a computer that is programmed to make decisions that require a lot of deep analysis. It does so using complex algorithms, or sophisticated connectionist processing. Furthermore, let's say that as a result of one of the computer's decisions, a million people get blown up. We could hold the computer responsible, and shut it down, but would it make sense to be angry at it?
This example is unclear. We don't know the circumstances under which it made the decision that resulted in those deaths.
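A toy version of Julios's thought experiment, for concreteness: a program that scores its options and commits to the best one. The option names, features, and weights below are all invented; the point is just that a "decision" here is nothing but arithmetic.
Code:
# A made-up decision procedure: score each option as a weighted sum of
# its features and pick the highest scorer.

def decide(options, weights):
    def score(features):
        return sum(w * f for w, f in zip(weights, features))
    return max(options, key=lambda opt: score(opt[1]))[0]

# Hypothetical scenario: features are (civilian_risk, strategic_value, cost).
options = [
    ("strike",     (0.9, 0.8, 0.3)),
    ("blockade",   (0.2, 0.5, 0.6)),
    ("stand_down", (0.0, 0.1, 0.1)),
]
weights = (-1.0, 2.0, -0.5)  # how much the machine "cares" about each feature

print(decide(options, weights))  # prints "strike": same inputs, same choice, every time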
Nightshade wrote: I want to say that if you're crazy and don't know any better, then the world is better off without you, but that's an oversimplification.
Sure - perhaps capital punishment is necessary. I personally don't think so, but advocating for CP is not inconsistent with what I'm saying.
Nightshade wrote: If you know the difference between right and wrong, and you still commit the act, buh-bye. Say hello to Uncle Dirtnap.
If someone knows that killing a child is wrong, then why did she commit the crime? Probably because of some desire or urge. Perhaps she wanted to get a Mercedes and needed the money, so she killed the kid and sold the organs. That urge or desire could not be overcome, because of her character. Is she responsible for the nature of her character? Well, the nature of her character is partially a result of experiences and decisions she's made in the past. But those events were a function of her previous character, and so on.
Nightshade wrote: Issues of mental illness are a different story, and a bit more complicated.
Not necessarily. Mental illnesses could be understood as extreme examples of mental functioning that are societally dysfunctional. I would classify the murder of a child to fund a Mercedes as an example of that.
Nightshade wrote: What's the solution? Spend a boatload of taxpayer money to treat someone's illness after they rape and murder someone? Why? So we can all feel better about ourselves and allow some freakzoid scumbag to suck on the public tit for the rest of their state-sponsored lives? Fuck that.
If we can't afford it, we can just jail 'em for life, or kill 'em. That doesn't mean we have to hold them responsible in the way I'm talking about.
Massive Quasars wrote: This example is unclear. We don't know the circumstances under which it made the decision that resulted in those deaths. Regardless, if the AI is at fault for those deaths when they were reasonably avoidable, it will face consequences for its actions. The best course of action would probably be to reprogram it to place greater value on human life, or on the lives of all sentient, conscious beings. Then confine the AI to a simulation where you could test it with a huge number of difficult scenarios and dilemmas where a decision is required. Should it pass, re-release it and monitor its actions for some period afterwards.
We don't have that kind of flexibility with the human brain, yet. If we did, some extreme criminal acts might justify compulsory, state-sanctioned re-engineering of the brain to physically prevent re-offense. Upon release, there might be long-term or lifelong monitoring. If not compulsory, the procedure might be offered as an alternative to life without parole or the death penalty.
At the same time, those at high risk to offend could volunteer for such a procedure, and would then have more say in what is done to them.
Pie in the sky? Maybe.
Regardless of whether we knew the circumstances, or whether or not we could reprogram it, wouldn't it be a bit absurd to get angry at the machine?
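Massive Quasars' simulate-then-release proposal, quoted above, is essentially a test harness. A minimal sketch of the idea - the agent, the dilemma, and the pass criterion are all invented here:
Code:
import random

def run_gauntlet(agent, scenarios, trials_per_scenario=1000):
    # Replay every dilemma many times with repeatable randomness;
    # clear the AI for release only if no run ends in harm.
    for scenario in scenarios:
        for seed in range(trials_per_scenario):
            rng = random.Random(seed)   # deterministic, auditable runs
            if scenario(agent, rng) == "harm":
                return False            # back to reprogramming
    return True                         # release, then keep monitoring

# Hypothetical agent: always picks the lowest-risk option offered.
def cautious_agent(choices, rng):
    return min(choices, key=lambda c: c[1])

# Hypothetical dilemma: "divert" carries random risk below 0.5, "do_nothing" 0.9.
def trolley_variant(agent, rng):
    choices = [("divert", rng.random() * 0.5), ("do_nothing", 0.9)]
    return "harm" if agent(choices, rng)[1] > 0.5 else "ok"

print(run_gauntlet(cautious_agent, [trolley_variant]))  # True: passes the gauntlet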
feedback wrote: Where do you get this statistic from? Somebody convicted and sentenced for murder, and then released, is no more likely to commit another crime than anybody else.
Is that based on cumulative data? I'd tend to disagree if you looked at each convict individually.
[xeno]Julios wrote: Regardless of whether we knew the circumstances, or whether or not we could reprogram it, wouldn't it be a bit absurd to get angry at the machine?
I never said you should get angry at the machine; it's pointless.
Massive Quasars wrote: I never said you should get angry at the machine; it's pointless.
Then perhaps we should never get angry at a human machine. After all, the machine was merely functioning in accordance with the laws of nature: if you were to rewind time a million times, it would make that same decision over and over again. If you had a Laplacian calculator, or a mathematical archangel, you would (barring quantum indeterminacies) be able to predict the computer's decision in advance. Similarly, we could do the same for a human being from the time she was born.
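The rewind-the-tape point is literal for a deterministic program: re-running it with the same inputs is "rewinding time", and "prediction" is just computing the answer early. A sketch, with another invented decision function:
Code:
def decide(threat_level, innocents_at_risk):
    # Same inputs in, same verdict out - no appeal to free will required.
    return "fire" if threat_level > 0.7 and innocents_at_risk < 100 else "hold"

history = [decide(0.9, 42) for _ in range(1_000_000)]  # a million "rewinds"
assert len(set(history)) == 1   # one and the same decision, every single time
print(history[0])               # knowable in advance, Laplace-style: "fire"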
Massive Quasars wrote: Why bar quantum indeterminacies in your example? Doesn't this reality creep in?
It is unlikely that the (alleged) indeterminacies of quantum events play any causal role in decision making. More importantly, even if they did, it is hard to see how responsibility would be accounted for. If these events are random, that gives even more support for arguing against responsibility.
mjrpes wrote: Eye for an eye is the oldest law out there. Old is always best. Just like grandpa always knows best. If you don't go with eye for an eye you're saying that grandpa isn't best. BUT GRANDPA IS ALWAYS BEST!
An eye for an eye leads to an endless circle of death.
riddla wrote: Ryoki wrote: That is government-ordered rape and possibly murder, and I find it difficult to see how you can defend it.
It's quite easy to defend. That child's life is forever ruined. She no longer has her innocence and will probably have mental issues for life. To do such a thing to a child of EIGHT YEARS OLD is one step away from a death sentence. You not getting this is what's disturbing and difficult.
boo hoo piddla, go have a cry mate, the little bitch probably had it coming anyway :lol:
Wood chipper.
[xeno]Julios wrote: We have evolved biologically and societally to express anger and feel a sense of injustice when we are wronged by certain other beings. These beings include other humans. These reactive attitudes have a very important function, but that doesn't mean they are philosophically well grounded.
I'm not sure they stand in need of 'philosophical grounding' to begin with. You've sort of darted through a lot of territory here, Jules... it might help if you briefly gave your argument re: why our common-sense notions of free will and responsibility need to be abandoned, or at least heavily modified.
[xeno]Julios wrote: Then perhaps we should never get angry at a human machine.
Well, it's pointless whether or not one becomes angry. Victims may get angry, but the judicial system attempts to be the rational arbiter that settles disputes (ideally without appealing to emotion).
[xeno]Julios wrote: It is unlikely that the (alleged) indeterminacies of quantum events play any causal role in decision making. More importantly, even if they did, it is hard to see how responsibility would be accounted for. If these events are random, that gives even more support for arguing against responsibility.
I haven't given the issue enough consideration. Don't take this as a cop-out; I'd rather not make a comment and then have to withdraw it after some thought.
[xeno]Julios wrote: Remember, I brought this thought experiment in to qualify the 'degrees of responsibility' that you mentioned.
Yes.
[xeno]Julios wrote: Yes, we can hold the machine responsible - similarly to how we can hold a mountain responsible for an avalanche - but our reactive attitudes towards the machine should reflect reality. And they do, which is why we do not get angry and feel that the machine should be punished. The notion of an 'eye for an eye' is ludicrous.
I'd like to offer more to the table here, but I admitted initially that I did not have strong confidence in my tentative position on FW. I would agree with you, though, on the ludicrousness of 'an eye for an eye' justice.
[xeno]Julios wrote: We have evolved biologically and societally to express anger and feel a sense of injustice when we are wronged by certain other beings. These beings include other humans. These reactive attitudes have a very important function, but that doesn't mean they are philosophically well grounded.
Again, I agree.