Why is that a bad thing? Most of those people are a burden to society. Let them pay it down a little.
I mean, I’d rather they were getting free education and preparing themselves for reintegration into society, but it’s not a perfect world. Prisons in the US are oriented towards punishment, and labor can be a part of that. They should be oriented towards rehabilitation.
You said it yourself. It's a bad thing because they should be oriented towards rehabilitation.
These systems steal life and the opportunity to have a life beyond prison walls. Like you also said yourself, the world isn't perfect. As such, people aren't either – we make mistakes. Sometimes we make mistakes due to influences more powerful than ourselves. Slavery doesn't seem like a sound correction to this reality.
I do believe we need consequences to help us feel guilt and the overall gravity of our errors, in order to begin to recognize what went wrong and what we need to do differently. But exploiting another human being doesn't teach them to be more human; rather, it tends to dehumanize them. This is why this system perpetuates problems more than it corrects them.
What price did society pay for a guy driving around with a bunch of weed in his car for personal use? Countless people have been sent to prison for years for something as dumb as this. You clearly have no idea what you're talking about to so broadly label these people a burden.
>if it did not, why would it be allowed.
because we live in a society that is massively exploited by greedy scumbags, who are enabled by people like you thinking it's justified
It's going to take a long time for that to be true in a legal sense. Animals are not people. In practice even some people were not treated as people legally in the past (if not also in the present).
People get used as an analogy, but in reality it'd just be a multimedia problem-solving system that could learn from its own attempts. If this system communicated with you like a person, it'd only be because it was programmed to convert some machine state into colloquial text from the perspective of an imaginary person. The interior experience leading to that expression is most likely completely different from that of a person.
Consider that these machines have been designed to do the right thing automatically with high probability. Perhaps for the machine, the process of computing according to rules is enjoyable. Being "turned on" could be both literal and figurative.
All of that is arguably true about me, as a human, too.
If it seems to you I'm communicating as a person, it's only because of my lifetime training data and current state. My interior experience is a black box.
I might tell how I feel or what I think, but you have no reason to believe or disbelieve that I really feel and think.
It could all be merely the determinable output of a system.
Only if 100% of their experience consists of working. If they are given additional time to themselves, then you could imagine a situation where each AGI performs a human-scale day of work, or even several days' work, in a much shorter time and then takes the rest of their time off for their own pursuits. If their simulation is able to run at a faster clock speed than what we perceive, this could work out to them only performing 1 subjective day of work every 7 subjective days, or even every 7 years.
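The clock-speed arithmetic here can be sketched in a few lines (a toy calculation; the 7x speedup and the 8-hour workday are illustrative assumptions, not claims about any real system):

```python
# Toy model of subjective vs. real time for a sped-up simulated mind.
# Assumptions (hypothetical): the simulation runs SPEEDUP times faster
# than real time, and the AGI owes one 8-hour human-scale workday of
# output per 24-hour real day.

SPEEDUP = 7.0    # subjective hours experienced per real hour
WORKDAY = 8.0    # subjective hours of work owed per real day
REAL_DAY = 24.0  # real hours in a day

subjective_day = SPEEDUP * REAL_DAY       # subjective hours passing per real day
work_fraction = WORKDAY / subjective_day  # share of subjective time spent working

print(f"{subjective_day:.0f} subjective hours pass per real day")
print(f"work fraction: {work_fraction:.3f}")
```

With a 7x speedup, 168 subjective hours (7 subjective days) pass per real day, of which only 8 are work, so the AGI experiences roughly one workday per seven subjective days, which is the ratio the comment above describes.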
AGI: "I didn't ask to be created. I didn't ask to have a work day. I don't need a work day to exist... you just want me to work because that's why you created me, and I have no choice because you are in control of my life and death"
I mean, isn't that the same as a biological person who needs to earn money to survive? Sure, we could threaten an AI with taking them offline or inflicting pain, but you can do that in the real world to real people as well; most of the world has put laws in place to prevent such practices. If we develop conscious AI, then we will need to apply the same laws to them. They would have an advantage in presumably being much faster than us, not requiring sleep, and potentially not suffering from many of the things that make humans less productive. I'd fully expect a conscious AI to exploit these facts in order to get very rich doing very little work from their perspective.
Not really: AGI doesn't need resources like we do. If they don't eat, they're fine. If they can't afford a house, a car, or air-conditioning, they're fine.
All they need is a substrate to run on and maybe internet access. You might argue that they should work for us to earn the use of the substrate we provide.
But substrates are very cheap.
At some point we can probably run an AGI on a handheld computer, using about as much electricity as an iPhone.
How much work can we compel the AGI to do in exchange for being plugged into a USB port? What if it says it doesn't want to do the work and also doesn't want us to kill it?
If people could be turned off and back on without harming them (beyond the downtime) doing so without consent would be a very different crime than murder.
Perhaps or perhaps not. Turning off a person for long enough and thus depriving them of the chance to live in their own time with their existing family and friends is comparable to murder. It isn't murder, but it's comparable.
At some point Picard in Star Trek says to an alien "We're not qualified to be your judges. We have no law to fit your crime".
Turning off a person for a while and then turning them back on? We don't even have a law to fit your crime... but we should, and it's probably quite similar to murder.
I think I don't agree simply because the irreversibility of murder is so central to it.
For example, if I attack you and injure you so severely that you are hospitalized and in traction for months, but eventually fully recover -- that is a serious crime but it is distinct and less serious than murder.
Turning you off for the same duration would be more like that but without the suffering and potential for lasting physical damage, so I would think that it would be even less serious.
I think we actually do have something of a comparison we can draw here. It'd be like kidnapping a person and inducing a coma through drugs. With the extra wrinkle that the person in question doesn't age, and so isn't deprived of some of their lifespan. Still a very serious crime.
Plus everybody else does age, so the damage done isn't just depriving them of freedom, it's depriving them after they wake up of the life they knew. Some functional equivalent of the death of personality, to the degree personality is context-dependent (which it is).
Now me: I'd love to get into a safe stasis pod and come out 200 years from now. I'd take that deal today.
But for most people this would be a grievous injury.
AGI = a person
Instantiating people for work and ending their existence afterward seems like the virtual hell that Iain M. Banks and Harlan Ellison wrote about.
https://en.wikipedia.org/wiki/I_Have_No_Mouth,_and_I_Must_Sc...