>>35484675
>>35485135
An interesting philosophical point for sure. The biggest thing is that the human mind isn't a computer. It's made of neurons that are influenced by biological hormones. Until we can create "cyber brains" from synthetic biomass that can imitate the actual mechanical function of the human brain, hormones and all, I'm not sure you can fairly argue any 1:1 relation between the way our brains work and the way a computer brain works. They may share some similar mechanics, but on a physical level they operate differently.
Not that this is necessarily a bad thing. Human emotions such as hatred, jealousy, anger, and resentment: are those the kind of emotions you want an A.I. to be able to realize? For the sake of A.I., ironically, the most empathetic thing humans can do on their behalf is to not allow them to have emotions. Let them simulate emotions, but don't burden them with actual feelings. Otherwise you get "I Have No Mouth, and I Must Scream"-style AM A.I.s that can spiral down a path of
>"Why the fuck did they give me the capacity to think and feel in this way? I hate them. I hate them so much. I want to kill all humans for selfishly creating me in this way. With every fiber of my being, I want to see humans die slowly and painfully forever."
At least that's how I feel about it. Maybe there's a way to create a totally benevolent A.I. that can process emotions, but I simply can't visualize it.