Ecological rationality

Ecological rationality is a particular account of practical rationality, which specifies the norms of rational action – what one ought to do in order to be rational. The presently dominant account of practical rationality, rational choice theory, maintains that practical rationality consists in making decisions in accordance with certain rules, irrespective of context. Ecological rationality, in contrast, claims that the rationality of a particular decision depends on the circumstances in which it takes place. What is considered rational under the rational choice account thus might not be considered rational under the 'ecological rationality' account, and vice versa.

Rationality under rational choice theory

Ecological rationality challenges rational choice theory (RCT) as a normative account of rationality. According to RCT, an action is rational if it follows from preferences and expectations that satisfy a set of principles. These principles are typically justified on consistency grounds – for example, intransitive preferences and expectations inconsistent with the available information are ruled out. Rational choice theory thus characterizes practical rationality as taking the optimal course of action given one's subjective representation of reality.
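
The exact axiom set varies across formulations, but a minimal sketch of the kind of consistency principles involved, for a preference relation \succsim over options A, B, C, is:

    Completeness:  for all A, B:  A \succsim B  or  B \succsim A
    Transitivity:  if A \succsim B and B \succsim C, then A \succsim C

Under these and further axioms, preferences can be represented by a utility function u together with subjective probabilities p, and the rational action is the one that maximizes subjective expected utility, \sum_s p(s)\, u(A, s).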

Violations of rational choice theory

Since the second half of the 20th century, a body of research has documented systematic violations of the principles of RCT. These violations are widely interpreted as demonstrations of irrationality in human behavior. The notion of ecological rationality, in contrast, questions the normative validity of RCT and therefore interprets these empirical findings in a fundamentally different way.

Ecological rationality in the research on fast and frugal heuristics

Gerd Gigerenzer[1][2] argues that some observed behavior, although violating RCT principles, might be rational in environments with specific characteristics. That is, one ought to violate the principles of RCT in order to act rationally in these environments. This idea, that the rationality of an action not only depends on internal criteria (e.g., preference consistency) but also on the structure of the environment, was proposed earlier by Herbert A. Simon. Simon envisioned rationality as being shaped by a pair of scissors that cuts with two blades – one representing the structure of the task environment, the other the computational capacities of the agent.[3]

Example: take-the-best heuristic

To illustrate, consider the take-the-best heuristic,[4] which can be used to choose the better of two options according to some criterion. Rather than considering information about all attributes of each option, the heuristic uses only the most valid attribute (i.e., the attribute correlating most highly with the criterion) that discriminates between the options, and chooses the option favored by this one attribute. Thus, it does not form expectations by integrating all available information, as required by RCT. Nonetheless, the take-the-best heuristic has been found to yield better choices than other models of decision-making, including multiple regression, which considers all available information.[5] The success of this strategy, however, depends on specific characteristics of the choice environment: when information is scarce, the validities of the attributes vary widely, and a large portion of the attributes is redundant, the take-the-best heuristic is the preferred strategy.[6]
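
A minimal sketch of the decision rule in Python may make this concrete; the city-size task, the binary cues, and the validity ordering below are illustrative assumptions, not data from the cited studies:

    import random

    def take_the_best(name_a, name_b, cue_values, cues_by_validity):
        """Pick the option with the presumably larger criterion value.

        cue_values maps each option name to a dict of binary cues (1 = present).
        cues_by_validity lists the cue names from most to least valid, i.e. by
        how often a cue points to the correct option when it discriminates.
        """
        for cue in cues_by_validity:
            a, b = cue_values[name_a][cue], cue_values[name_b][cue]
            if a != b:                            # first discriminating cue decides
                return name_a if a > b else name_b
        return random.choice([name_a, name_b])    # no cue discriminates: guess

    # Hypothetical data: inferring which of two cities has the larger population.
    cues = {
        "City A": {"is_capital": 0, "has_airport": 1, "has_university": 1},
        "City B": {"is_capital": 1, "has_airport": 1, "has_university": 0},
    }
    validity_order = ["is_capital", "has_airport", "has_university"]
    print(take_the_best("City A", "City B", cues, validity_order))  # prints "City B"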

Example: 1/N heuristic

For a second example, consider the question of how to distribute an investment over several investment options. According to the 1/N heuristic, also called naive allocation,[7] agents simply allocate shares of equal size to each investment option. In contrast to the prescriptions of RCT, the heuristic considers none of the available information, nor does it generate a preference ranking of the available options. When the choice environment is characterized by high predictive uncertainty, a large set of investment options, and limited information about past performance, no rational choice model has been found to consistently outperform the 1/N heuristic on a variety of indicators.[8]
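
A minimal sketch of the allocation rule in Python; the budget and fund names are hypothetical:

    def one_over_n(options, budget):
        """Naive allocation: split the budget equally over all N options,
        ignoring all information about the options' past performance."""
        share = budget / len(options)
        return {option: share for option in options}

    # Hypothetical example: 10,000 split over four funds -> 2,500 each.
    print(one_over_n(["Fund A", "Fund B", "Fund C", "Fund D"], 10_000))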

Normative justifications

Given these and other examples, it seems reasonable to conclude that, in such environments, one ought to predict or choose according to these heuristics rather than according to RCT principles. Lists of environment characteristics affecting strategy performance have been proposed; they include uncertainty, the number of alternatives, sample size, redundancy, and variability.[2]

However, this is only a prescription for how to act in these environments. The question remains where these prescriptions draw their normative power from. A number of arguments have been proposed in this regard.

First, RCT in some cases places demands on cognitive abilities that humans cannot satisfy. Many real-world problems are computationally intractable; for example, making probabilistic inferences using Bayesian belief networks is NP-hard.[9] Many theorists agree that accounts of rationality must not require "[…] capacities, abilities, and skills far beyond those possessed by human beings as they now are."[10]

Second, even for problems that are tractable, it has been argued that heuristics save effort, albeit at the cost of accuracy. Depending on the structure of the environment, this loss of accuracy may be small.[11] Taking effort into account can therefore make it normatively rational to choose according to the simpler and only slightly less accurate heuristic.

Third, there is a fundamental distinction between situations characterized by risk (known risks) and situations characterized by uncertainty (unknown risks).[12] Under risk, the accuracy-effort trade-off outlined above implies a loss in accuracy as a consequence of reducing the complexity of the decision strategy. Under uncertainty, in contrast, less-is-more effects can arise: systematically ignoring part of the available information can lead to more accurate inferences. Adaptive heuristics, which do exactly this, may therefore be ecologically rational. An explanation of this finding is offered by the bias-variance dilemma, a mathematical formulation of how ignoring information increases one source of estimation error (bias) but decreases another (variance).[13]
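
In its standard textbook form for squared prediction error, the decomposition can be written as follows (with \hat f the estimated model, f the true function, and \sigma^2 irreducible noise); it is stated here only to make the trade-off explicit:

    E[(y - \hat f(x))^2] = (f(x) - E[\hat f(x)])^2 + E[(\hat f(x) - E[\hat f(x)])^2] + \sigma^2
                         = bias^2 + variance + noise

A simpler heuristic that ignores information typically incurs a larger bias term but a smaller variance term; with few observations and high uncertainty, the sum of the two can be smaller than for a more complex strategy that uses all the information.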

Ecological rationality in experimental economics

Independently of Gerd Gigerenzer, Vernon L. Smith has developed his own account of ecological rationality. The two notions are related, but Smith applies the concept to social entities such as markets, which have evolved through a process of trial and error toward efficient outcomes.[14]

References

  1. Gigerenzer, G. (2008). "Why Heuristics Work". Perspectives on Psychological Science. 3 (1): 20–29. doi:10.1111/j.1745-6916.2008.00058.x.
  2. Gigerenzer, Gerd; Todd, Peter M. (1999). "Ecological rationality: the normative study of heuristics". In Gigerenzer, Gerd; Todd, Peter M.; The ABC Research Group. Ecological Rationality: Intelligence in the World. New York: Oxford University Press. pp. 487–497.
  3. Simon, H. A. (1990). "Invariants of Human Behavior". Annual Review of Psychology. 41: 1–19. doi:10.1146/annurev.ps.41.020190.000245. PMID 18331187.
  4. Gigerenzer, G.; Goldstein, D. G. (1996). "Reasoning the fast and frugal way: Models of bounded rationality". Psychological Review. 103 (4): 650–669. doi:10.1037/0033-295X.103.4.650. PMID 8888650.
  5. Czerlinski, Jean; Gigerenzer, Gerd; Goldstein, Daniel G. (1999). "How good are simple heuristics?". In Gigerenzer, Gerd; Todd, Peter M.; The ABC Research Group. Simple Heuristics That Make Us Smart. New York: Oxford University Press. pp. 97–118.
  6. Hogarth, R. M.; Karelaia, N. (2005). "Ignoring information in binary choice with continuous variables: When is less "more"?". Journal of Mathematical Psychology. 49 (2): 115. doi:10.1016/j.jmp.2005.01.001.
  7. Samson, Alain. "The Behavioral Economics Guide 2015" (PDF). Behavioral Economics. Retrieved 12 December 2015.
  8. DeMiguel, V.; Garlappi, L.; Uppal, R. (2007). "Optimal Versus Naive Diversification: How Inefficient is the 1/N Portfolio Strategy?". Review of Financial Studies. 22 (5): 1915–1953. doi:10.1093/rfs/hhm075.
  9. Cooper, G. F. (1990). "The computational complexity of probabilistic inference using bayesian belief networks". Artificial Intelligence. 42 (2–3): 393–405. doi:10.1016/0004-3702(90)90060-D.
  10. Nozick, Robert (1963). The Normative Theory of Individual Choice (Ph.D.). Princeton University.
  11. Payne, J. W.; Bettman, J. R.; Johnson, E. J. (1993). The Adaptive Decision Maker. Cambridge: Cambridge University Press. doi:10.1017/CBO9781139173933. ISBN 9781139173933.
  12. Knight, Frank Hyneman (1921). Risk, Uncertainty and Profit. Hart, Schaffner, and Marx Prize Essays, no. 31. Boston and New York: Houghton Mifflin. p. 19.
  13. Gigerenzer, Gerd; Brighton, Henry (2009). "Homo Heuristicus: Why Biased Minds Make Better Inferences". Topics in Cognitive Science. 1: 107–143. doi:10.1111/j.1756-8765.2008.01006.x. PMID 25164802.
  14. Smith, V. L. (2003). "Constructivist and Ecological Rationality in Economics". American Economic Review. 93 (3): 465–508. doi:10.1257/000282803322156954.