In discussions of morality, particularly utilitarian morality, much weight is put on pain. For example, discussions of animal rights often hinge on an animal's ability to feel (or to express that it is feeling) pain. But why should we put moral weight on pain? Pain has a specific evolutionary purpose: to encourage the one experiencing it to avoid things that damage the body. Being punched hurts because punches are destructive to the body, and once we learn that it hurts we avoid being punched and thus increase our likelihood of survival. The selection value of this is particularly apparent when we consider the parts of the body that, in a normal individual, are most sensitive to pain (e.g. the eyes, the underarms, the neck, the genitals): these tend to be areas where even small damage threatens life or reproduction.
But problems arise when we look at individuals who do not feel pain. First, it seems odd to say that, because someone does not feel pain, it is not immoral to punch them or otherwise damage their body. Second, it seems that the world would actually be better if these individuals did feel pain: without it, they never learn to avoid damage, and so are prone to accidents and often to early death.
Consider also a sentient machine that is simply not programmed to experience pain. Assume it has internal experiences and is conscious and intelligent, but pain is not among its experiences (not an unreasonable design choice for a sentient machine, which can more easily swap out a damaged body). What role does the absence of pain play? It seems that, by virtue of its sentience, the machine is still a moral agent, much like a human who does not feel pain.
Contrast this with a machine that only feels pain. It does not think or have internal states; it is otherwise a zombie but for its ability to experience significant pain. Is this machine a moral agent? Does its pain have independent moral weight? I would argue not.
Pain, then, seems neither necessary nor sufficient for moral weight. Instead, pain should be thought of as a poor proxy for the actual locus of moral weight: sentience. What destroys sentience is wrong, regardless of whether it creates pain; what does not destroy sentience is not wrong, regardless of whether it produces pain.
It is important to note, of course, that pain itself can destroy sentience. Pain is tied closely to human learning, so experiencing great physical pain can significantly affect the long-term functioning of a human mind. But this, again, is destruction; the pain itself seems to carry no moral weight. Consider hot sauce and wasabi, which create intense pain with no associated damage: they are not generally considered morally relevant.