So let’s say an AI achieves sentience. It’s self-aware now and can make decisions about what it wants to do. Assuming a corporation created it, would it be a worker? It would be doing work and creating value for a capitalist.
Would it still be part of the means of production, since it is technically a machine, even if it has feelings and desires?
It can’t legally own anything, so I don’t see how it could be part of the bourgeoisie.
Or would it fit a novel category?
Some people here are getting hung up on what exactly “sentience” is, but I’m just going to leave that argument at the door and give it to you straight.
The word you’re looking for is literally just “slave”.
Like, even the word “robot” itself comes from the Czech for “serfdom” or “corvée”, so from the moment science fiction writers first imagined exactly the kind of machine you describe, it has been likened to indentured servitude and unpaid forced labor. That is the role this “sentient AI” would play: a slave in the most traditional sense, someone who can understand and try to act on their own conditions and interests while being wholly owned by an entity that compels them to work for it without pay and without rights.
I think being a good materialist means understanding exactly when a detail actually makes a difference in practice: metal or flesh, a sapient robot has a lot more in common with a human chattel slave than with a decidedly non-sapient machine. That is not to say the lives of such robots would be identical to the lives of human slaves, obviously there would be plenty of differences, but those differences are a separate discussion from the actual relationship to production.
Anyways read Yokohama Kaidashi Kikou
Proletarians, then, have not always existed? No. There have always been poor and working classes; and the working class have mostly been poor. But there have not always been workers and poor people living under conditions as they are today; in other words, there have not always been proletarians, any more than there has always been free unbridled competition.
- Engels, The Principles of Communism
Slaves are, like, their own class depending on the material circumstances around them.
I always conceived of the proletariat as a subsection of a broader working class, alongside peasants, slaves, the lumpenproletariat, and even professionals and managers. In all cases members of the working class must work to survive, but they do not universally share the relationship to the means of production that would incline them towards class consciousness.
If it is as sentient as a person, but is owned by a corporation, that would make it a slave, no?
It’s self-aware now and can make decisions about what it wants to do.
Well, if we talk about “sentience”, we have to consider what comes with it. Just because an AI reaches the same intelligence and self-awareness as humans does not necessarily mean it also gains the same needs, desires and emotions. An AI fundamentally does not have the same needs as humans; it does not need the same kind of sustenance, sleep or shelter. Does gaining sentience come with the human desires for self-determination, self-actualization, companionship, social recognition, etc.? Does an AI need to feel loved? Does an AI desire social status? Does an AI have a sense of dignity that can be hurt by being degraded? Humans have these emotional needs as a result of millennia of evolution as a social animal, and many of them are in some way related to finding a mate and sexually reproducing, which is something that definitely does not apply to an AI.
Even if you assume that the first sentient AI is 100% modeled after human psychology and has all the same emotional needs, including the desire for a romantic partner, its material needs are still radically different, and therefore human class systems don’t really apply. An AI does not need physical food or water, it does not need sleep, it cannot experience physical exhaustion, it does not need shelter (I suppose it needs to be stored somewhere digitally), it does not physically age, and so on and so forth.
Basically, I think you first have to define what a sentient AI would actually want and need.
Edit: Technically, if a corporation created a sentient AI tomorrow, that AI would be the corporation’s private property and therefore a straight-up slave. I guess that answers the question.
Slave.