Latour opens his article by explaining that sociologists are looking for a missing mass: "moral laws that would be inflexible enough to make us behave properly." Latour argues that the missing mass can be found in nonhumans. When humans fail to act properly, a nonhuman can step in and do it for them, and vice versa.
The example Latour uses is that of a door. To get through a wall, humans would have to break the wall down, walk through the hole, and then build the wall back up again. A hinge (a nonhuman) solves this. But then humans must be relied upon to open and close the door, which Latour demonstrates we aren't good at doing. A hydraulic door closer can then be substituted to do the closing for us. The trade-off is that hydraulic closers may "discriminate" against smaller people who aren't strong enough to push the door open or older people who can't move through quickly enough before it closes.
Running throughout the article is a question of morality, with the idea that machines can be "relentlessly" moral while humans cannot. But it seems to me that as nonhumans grow more advanced, the concept of morality grows less solid, especially when we get into "discrimination." Latour notes there are ways around this, such as blocking the door with your foot to keep it from shutting in your face, and programmers can work to create a "smarter" door.
At this point, it seems the burden of morality keeps shifting. First, humans were unreliable, so a hinge was introduced. Humans proved unreliable again, so hydraulics stepped in. But then the nonhuman proved unreliable, and so humans must act once more. This, I think, is what Latour is getting at with the social "missing mass": the moral laws find themselves in balance between humans and nonhumans.
All of this brought to mind the movie I, Robot. Skip ahead in the following clip to 1:10 and watch through 5:10.
What interested me most about the clip is Spooner's story about Sarah. He insists that a human would have known to save Sarah instead of him, pointing to an unreliability in nonhumans even though the robots are built specifically for morality.
In the film, robots are built with three laws:
1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
2. A robot must obey the orders given to it by human beings, except where such orders would conflict with the First Law.
3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Laws.
According to Spooner's story, however, even with these laws, robots lack a moral compass. He believes humans should reduce their dependence on robots. As Latour would probably put it, Spooner believes humans need to balance out the missing mass, the morality, between themselves and their machines.
What is interesting is that Sonny, the main robot in the film, is built so that he can choose to ignore the three laws. This perhaps demonstrates an effort on the part of his creator to balance out the missing mass by creating a robot that can make choices for itself that go beyond logical processes.