Sunday, November 20, 2016

Morality solves collective action problems (rough notes)

Society defies conscious control. The state rules more by influence than by force. If we understand this process, we may find a way to improve it. Personal morality stands at the center.

Each of us prefers that others act morally, so that we can benefit from a superior collective outcome. A society where everyone knows the rules and doesn't cheat seems, to almost anyone, much preferable to the alternatives, where people either disagree about the rules or cheat. But in any particular case, each of us might benefit individually from cheating. We want to find a way to discourage cheating. Jonathan Haidt's book, The Righteous Mind, examines the hypothesis that morality evolved for this purpose, drawing on the work of various researchers.
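
A rough sketch of this temptation, using my own made-up payoff numbers rather than anything from Haidt: in the standard prisoner's-dilemma setup, cheating is the better reply no matter what the other person does, yet mutual cheating leaves both parties worse off than mutual cooperation.

    # Illustrative payoffs only (the numbers are made up).
    # Each entry is (my payoff, your payoff) for (my move, your move).
    PAYOFFS = {
        ("cooperate", "cooperate"): (3, 3),
        ("cooperate", "cheat"):     (0, 5),
        ("cheat",     "cooperate"): (5, 0),
        ("cheat",     "cheat"):     (1, 1),
    }

    def best_reply(their_move):
        """The move that pays me more, holding the other player's move fixed."""
        return max(("cooperate", "cheat"),
                   key=lambda my_move: PAYOFFS[(my_move, their_move)][0])

    for their_move in ("cooperate", "cheat"):
        print(f"If they {their_move}, my best reply is to {best_reply(their_move)}.")
    # Cheating wins either way, yet (cheat, cheat) pays (1, 1),
    # worse for both than (cooperate, cooperate) at (3, 3).

That gap between the individually best move and the collectively best outcome is the collective action problem in miniature.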

Game theory
Game theory takes the sociopath's perspective. It tries to objectivize value. It places the chooser under impossible cognitive load. It points to Moloch. Most of these are criticisms, but game theory may still offer a few insights into collective action problems. I'm not sure I agree with any of the standard analyses, but I'm not quite ready to toss it out either. Maybe it doesn't really help.

Evolution? 
Did evolution make us judgmental and vindictive? Do our judgmental and vindictive instincts help us keep people from cheating? Didn't evolution also make us empathetic, sympathetic, and sometimes forgiving? Are we perfectly suited to our evolutionary environment? Does the behavior appropriate to that environment still work in our current one? Is it even clear that the old way was the best way in the old environment? Evolution can't tell us what we ought to do or ought to want. It has made some things easier than others, though, and we can use what we learn from it.

Reputation, motivation, conscience, empathy, integrity, purpose, inspiration, flow
Where does each succeed in helping us prevent cheating, and where does it fail? How can we change our situation to help each of them succeed? What will strengthen them?

What sort of influence would unambiguously improve this situation? I used to advocate all sorts of unpopular political ideas (and I have not entirely repudiated them, at least not in the sense of accepting more popular ones instead). Should I try to convince everyone to agree with me?

This approach succeeds only if I am correct. A large, complicated society needs more fault tolerance than that. It cannot depend on the accuracy of a single idea; it can benefit from hedging its bets. (Am I promoting this pluralist meta-idea to the same sort of critical status? Can I justify this?) This approach has the advantage of accepting the current situation as it is: although people aspire to unity, they never truly achieve it. E pluribus unum, or e unum pluribus?

What about the rules themselves? How do we justify the rules we have? Can our understanding of the rules change? What process will help us discover better rules, or better interpretations of our current rules? How do we learn?

I'm tempted to think we don't need to learn about morality, since people have been thinking about it for so long. But we continue to apply it to new circumstances, however ineptly, so learning can help.

We have opportunities to benefit within the existing rules, either by following them or by cheating. But we also have opportunities to improve the rules, or at least to improve our understanding and interpretation of them.

Does any of this help me to step outside of the context and view morality from a different perspective? How can a person learn about this? How can a society learn about this and improve?

A naive sociopath would ignore the rules and other people's feelings and rights. For them it is just a matter of not getting caught. The risk of punishment is just a cost, and life is a cost-benefit analysis. This might lead them to become more sophisticated: to hide within the system, to understand it and exploit its weaknesses. Does a sociopath prefer a system where it is easy to cheat, with lots of competition but little danger of getting caught, or a system where it is difficult to cheat, with more risk of detection but less competition?
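
To make that cost-benefit framing concrete, here is a toy expected-value calculation of my own, with made-up numbers: the naive sociopath cheats whenever the gain outweighs the chance of getting caught times the punishment.

    # Toy calculation; every number here is invented for illustration.
    def expected_value_of_cheating(gain, p_caught, punishment):
        """Naive sociopath's calculus: cheat whenever this comes out positive."""
        return gain - p_caught * punishment

    # Easy-to-cheat system: weak enforcement, but many competing cheaters.
    print(expected_value_of_cheating(gain=100, p_caught=0.05, punishment=500))  # 75.0
    # Hard-to-cheat system: stronger enforcement, but less competition.
    print(expected_value_of_cheating(gain=300, p_caught=0.40, punishment=500))  # 100.0

On these invented numbers the harder system still comes out ahead for the lone cheater, which is one way to read the question above; different numbers would flip it.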

A non-sociopath might look for ways to improve the arrangement, to protect persons from sociopaths, to improve the game while playing it.

Have I lost the insight that made me want to write this? Everyone knows the temptation to cheat. What new implications can I find?

According to Haidt, we often use moral reasoning to rationalize what we have already done, which we may have done without serious thought, simply by following our impulses.

Propaganda and political campaigns use moral reasoning to rouse their followers, to get them charged up. Viewed cynically, this looks like an attempt to control them, or at least to influence them. But a sincere and honest person may also wish to inspire others. Does intent make the difference? Or are there techniques of persuasion that violate reasonable ethics? What separates the inspirational teacher from the demagogue?



I apologize for meandering. 

