Why do people defend the legitimacy of political authority? Most defenders would probably say it's because the idea is rock-solid. And there are a lot of them. Michael Huemer, in his book "The Problem of Political Authority", claims they are wrong. The book deserves a full review on this blog, but this entry focuses on the psychology that supports beliefs about authority, the topic of chapter 6, where Huemer makes his most original contribution. He spends the first five chapters refuting the various philosophical theories developed to support the idea of legitimate authority, so in chapter 6 he faces the question: if there really is no philosophical foundation for government authority, why do so many people accept it as legitimate? "If there is no political authority, it is natural to ask, then how have so many people come to have such a firm belief in it?" Because Huemer's discussion surprised me, I want to summarize and discuss it here. He constructs a sort of descriptive epistemology, showing not how we ought to arrive at our beliefs, but how we actually do.
Huemer first concedes that popular rejection of an idea is a serious criticism of it, though not a decisive one. While popular opinion has been wrong on many occasions, it is usually safer to bet with it than against it. He sets out to show that in the case of political authority, popular opinion is nonetheless wrong.
Huemer describes the Milgram experiment, a famous study that tested the willingness of ordinary people to obey the commands of an authority figure. Subjects were fooled into thinking they were administering dangerous electric shocks to other subjects as punishment for failing a memory test. The experiment produced two shocking results: "65 percent of subjects complied fully, eventually administering the 450-volt shock three times to a silent and apparently lifeless victim", and, for the most part, the participants who obeyed rationalized their behavior as excusable. This is odd because almost no one who merely hears about the experiment thinks the obedience was justified, yet something about being in the situation undermines this attitude and compels subjects to obey. Huemer concludes, "most people's disposition to obey authorities is far stronger than one would have thought at first glance - and far stronger than one could possibly think justified."
Huemer combines the insight from the Milgram experiment with historical experience from the My Lai massacre and the Nuremberg trials to infer that "even if [all governments were illegitimate], it is quite likely that we would still by and large feel bound to obey our governments. [...] even people who are subjected to the clearest examples of illegitimate power still typically feel bound to obey." He suggests that such persons give in to the urge to obey, and then use motivated reasoning to devise excuses for their behavior. (Haidt describes a detailed psychological model of such motivated reasoning in his book, The Righteous Mind.) In other words, systematic bias warps our understanding of our own impulses to obey authority, and our intuitions about authority are not to be trusted.
Huemer adds the concept of cognitive dissonance to the mix. Cognitive dissonance motivates us to change either our behavior or our beliefs when the two conflict. It is much easier to change our beliefs about authority than to overcome the psychological impulse to obey it. Hence any theory that supports the legitimacy of authority, and therefore the rightness of obedience, has a motivated audience. "But whether or not our behavior is motivated by compassion and a sense of duty, it is likely that we would generally wish to believe that it is. To believe this, we must accept a basic doctrine of political obligation, and we must accept the legitimacy of our government."
Two further factors appear in Huemer's account: social proof (the difficulty of disagreeing with a group consensus) and status quo bias (the tendency to adapt to current practice and accept it as good). If you accept the reality of these two phenomena, then "whether or not any governments were legitimate, most of us would have a strong tendency to believe that some governments are legitimate, especially our own and others like it." Perhaps this sounds too strong, as if it makes it impossible for anyone ever to break out of the illusion and oppose authority. But these are biases, tendencies, not absolutes.
Next Huemer describes the power of symbols, rituals, and legalistic jargon to lend authorities an appearance of legitimacy. I would have liked a better explanation of how this is supposed to work, or a more rigorous examination showing that these devices actually have the intended effects. But the very pervasiveness of these manipulative tools suggests that someone thinks they are worth perpetuating.
Finally, Huemer examines attitudes toward authority in light of Stockholm Syndrome (the psychological bond between captors and their victims that can arise under certain circumstances). I've heard this comparison made before, but usually half-jokingly. Huemer is quite serious. I found it fascinating and mind-blowing, though I am not quite sure it isn't my own confirmation bias carrying me off. "Due to the Stockholm dynamic, power has a self-legitimizing tendency: once it becomes sufficiently entrenched, power is perceived as authority."
If you're too cheap to buy this book, find it in a bookstore and just read chapter six. If you are too lazy to read the whole chapter, just read the conclusion, section 6.8. That puts it all together. Moral illusions "are cases in which we have a systematic tendency to see something as right (or wrong) when in fact it is not. Throughout history, our forebears have been subject to widespread moral illusions - for instance, that women were inferior to men or that dark-skinned people were inferior to light-skinned ones. The suggestion that we are still subject to some moral illusions today should therefore surprise no one. We need to reflect on what moral illusions we might be subject to, keeping in mind that, by the nature of the case, they will not seem, on casual consideration, to be illusions." "Human beings come equipped with strong and pervasive pro-authority biases." "Theories of authority devised by political philosophers can plausibly be viewed as attempts to rationalize common intuitions about the need for obedience, where these intuitions are the product of systematic biases." In other words, question authority.