Leiderdorp, a.d. XVI Kal. Sep. MMDCCLXVIII A.U.C.,
One of the basic assumptions many economists make is that people are rational (Definition Rationality). By this they mean that people act in their self-interest in a reasonable way. This could be a very narrow self-interest, say an employer who pays the lowest possible wage to his employees, but it could also be a broader self-interest, say a mother who wants her child to be happy and therefore pays for his education. Although I personally believe that, given enough time, education and information, people can make rational decisions, I ultimately think rationality is an assumption that is made too easily and too often by economists. Bounded rationality seems a safer bet in many scenarios, as argued for instance by Kahneman in Thinking, Fast and Slow. He argues that the mind works with two systems: one is intuitive and fast ('system one'), the other deliberate and slow. Sometimes people make intuitive choices, at other times more deliberate, rational ones. Bounded rationality thus puts limits on what humans can possibly do; they have: 1) limited information; 2) limited capacity to process that information; 3) limited time to make decisions (Definition Bounded Rationality).
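The contrast can be made concrete with Simon's notion of 'satisficing': a boundedly rational agent does not survey every option, but searches until it finds one that is good enough or its time runs out. The following is only a minimal illustrative sketch of that idea, with an invented numeric choice problem and invented names (aspiration, budget), not a model from the literature:

```python
def optimize(options):
    # A fully rational agent examines every option and picks the best.
    return max(options)

def satisfice(options, aspiration, budget):
    # A boundedly rational agent examines options one at a time and stops
    # at the first option that meets its aspiration level, or when its
    # search budget (limited time and attention) runs out.
    examined = 0
    best_so_far = None
    for value in options:
        examined += 1
        if best_so_far is None or value > best_so_far:
            best_so_far = value
        if value >= aspiration or examined >= budget:
            break
    return best_so_far

choices = [3, 7, 2, 9, 5]
print(optimize(choices))                            # the true optimum
print(satisfice(choices, aspiration=6, budget=3))   # good enough, found quickly
```

The satisficer may miss the optimum, but it always terminates within its budget, which is exactly the trade-off the three limits above describe.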
I think looking at the crisis simulations of historical events organized by students at LUC The Hague through the lens of bounded rationality might actually provide some useful insights. Although these simulations provide only anecdotal evidence, they may offer insight into the importance of organizations that collect and process information during political, but also economic, crises.
Last year some of my fellow students and I organized a crisis simulation of the Vietnam War. It included several committees, such as the Senate, the National Security Council, the North Vietnamese Politburo and the Government of (South) Vietnam. I think we were highly successful in 'throwing' sufficient amounts of information at people to make them stop being rational actors and lapse into their 'system one' decision-making mode. Rather than taking the time to study the rules of procedure we had provided and basing their actions on them, they used their intuition. It could be argued that a rational actor would first take the time to find more information, or should even be assumed to possess all necessary information. However, in a situation in which time is a scarce resource and rational actors do not have all the information, they simply have to start following their biases, something also argued by Herbert Simon in "Models of Bounded Rationality: Empirically Grounded Economic Reason."
This was also apparent from the reactions of the participants. A fellow student of mine, for instance, was completely stressed out, leading him to say: "Wow, this was so intense. I never thought it would get this intense." This does not sound like a cold, rational actor; it sounds like someone very near to collapsing in despair because of an overload of information. While not presenting any conclusive evidence, this simulation does support Simon's theory that if we feed an individual enough information at once, he will follow his biases and intuition. Rational decision making takes time, and time was a scarce resource during the simulation.
For me it showed the importance of, firstly, reflecting on the intuitive choices I make and, secondly, well-ordered decision-making structures, in which decision makers receive as much relevant information as possible from a pre-structured system set up to provide it (which is the official task of the US National Security Council, one of the councils we simulated). Whereas in a market with repetitive transactions people can probably act rationally and base their assumptions on prior statistical research and experience (as Mikhail Myagkov and Charles R. Plott argued in 1997), in rarer events, such as economic crises, we might expect people to behave according to their intuition, as Kahneman argues and as our simulation suggested. Governments and large companies therefore need a 'crisis' team that can be assembled when an economic crisis hits. Such a team should aggregate information (which, due to the crisis, can no longer be taken from the markets or other usual sources) and present it in a structured way, so that decision makers have the information necessary to overcome their biases without being overloaded, making a relatively accurate assessment of reality possible.
Bottom Line: Human beings are inherently flawed, so we need institutions that prevent and mitigate the negative consequences of our biases and flawed intuition as much as possible.