Saturday, August 16, 2008

When Humans Disappear: Take One

The Qwest Internet service was down--again. A robot asked a series of questions, all heartfelt, of course. I was transferred to "a representative" who was not there either, since it was Saturday. Vacuity upon vacuity, it is.

How does one speak with a bot? One may yell and call names, which are never heard or even processed. Should one be kind, just as Kant said we should be kind to animals--not because animals have rights, but because our bad behavior to animals might spill over to how we treat humans?

Of this I am sure: talking to bots dehumanizes us. We take what is distinctively human--our words, our voice--and direct it (ourselves) to what cannot hear, but pretends to hear, to what cannot respond, but pretends to respond. Data exchange replaces dialogue. Why? The answer is one word, a word that dominates us (usually without our knowledge): efficiency. Bots are faster and cheaper for business. It matters not what they do to us, the humans, the unplugged.


Kevin Winters said...

Since we are beings that are essentially human, a robot cannot dehumanize us. We may believe ourselves to be like robots and try to act accordingly, but that is just living a lie, trying to be something we are not. In other words, we can only be dehumanized when we pretend or are duped into thinking that we are inhuman. Interacting with a computer or a car or a door does not change that.

Douglas Groothuis, Ph.D. said...

I find no logic in thee.

pgepps said...

Which is funny, because I am inclined to agree with both of you. I find that my complicity with a system which treats human interactions as interchangeable with mechanical processes bothers me, and I think it harms me; I think that being God's creature is diminished for me, to me, and sadly by me. At the same time, granted that such diminution is a given on a cursed earth, perhaps I am spitting into the wind when I'm irate at such things. Vanity and vexation of spirit.

Kevin Winters said...

I would think the logic would be quite clear:

1) We are essentially human.
2) Contingent relations do not destroy our humanity (i.e. "dehumanize" us).
3) Therefore, interacting with a robot (a contingent relation) does not destroy our humanity or make it disappear.

Put in more substance-property terms:

1) "Humanness" belongs to me essentially.
2) Contingent relational properties (e.g. relating with objects of all kinds) do not change my essential nature.
3) Ergo, relating with a computer does not make me less human.

If the above is true, and I think anyone who accepts the substance-property metaphysic (and many who do not) would agree, then the dehumanization is a falsehood that you are buying into, that you are being duped by, as your humanity is there 100% whether you are relating to a computer or to your wife. Granted, your relation to your wife is different from your relation to your computer, but neither one changes your humanity.

Kevin Winters said...

To put it one more way: your feeling of dehumanization is not inherent in the relation itself, but comes from your own capacity to be duped into thinking that relating with a robot somehow makes you less human, that relating with a robot must somehow make you more like the robot even though the relation is not equal (i.e. the robot cannot relate with you the same way you relate with it, but then again neither can your car, spoon, or record player).

I'm just going off the basic maxim you've stated many times: words have meaning (though I wouldn't put it quite that way, we are in basic agreement). The word "dehumanizing" is a misnomer given your metaphysic.

Craig Fletcher said...

What was wrong with your service? I assume it was fixed? Otherwise I can help, but you already know that.

Chad said...

"We take what is distinctively human--our words, our voice--and direct it (ourselves) to what cannot hear, but pretends to hear, to what cannot respond, but pretends to respond. Data exchange replaces dialogue."

The same thing can be said of video games and other forms of entertainment.