Introspection illusion
The introspection illusion is a cognitive bias in which people wrongly think they have direct insight into the origins of their mental states, while treating others' introspections as unreliable. In certain situations, this illusion leads people to make confident but false explanations of their own behavior (called "causal theories"[1]) or inaccurate predictions of their future mental states.
The illusion has been examined in psychological experiments, and suggested as a basis for biases in how people compare themselves to others. These experiments have been interpreted as suggesting that, rather than offering direct access to the processes underlying mental states, introspection is a process of construction and inference, much as people indirectly infer others' mental states from their behavior.[2]
When people mistake unreliable introspection for genuine self-knowledge, the result can be an illusion of superiority over other people, for example when each person thinks they are less biased and less conformist than the rest of the group. Even when experimental subjects are provided with reports of other subjects' introspections, in as detailed a form as possible, they still rate those other introspections as unreliable while treating their own as reliable. Although the hypothesis of an introspection illusion informs some psychological research, the existing evidence is arguably inadequate to decide how reliable introspection is in normal circumstances.[3] Correction for the bias may be possible through education about the bias and its unconscious nature.[4]
Components
The phrase "introspection illusion" was coined by Emily Pronin.[5] Pronin describes the illusion as having four components:
- People give a strong weighting to introspective evidence when assessing themselves.
- They do not give such a strong weight when assessing others.
- People disregard their own behavior when assessing themselves (but not others).
- People weight their own introspections more heavily than those of others. It is not just that people lack access to each other's introspections: they regard only their own as reliable.[6]
Unreliability of introspection
[I]ntrospection does not provide a direct pipeline to nonconscious mental processes. Instead, it is best thought of as a process whereby people use the contents of consciousness to construct a personal narrative that may or may not correspond to their nonconscious states.
Timothy D. Wilson and Elizabeth W. Dunn (2004)[7]
A 1977 paper by psychologists Richard Nisbett and Timothy D. Wilson challenged the directness and reliability of introspection, thereby becoming one of the most cited papers in the science of consciousness.[8][9] Nisbett and Wilson reported on experiments in which subjects verbally explained why they had a particular preference, or how they arrived at a particular idea. On the basis of these studies and existing attribution research, they concluded that reports on mental processes are confabulated. They wrote that subjects had "little or no introspective access to higher order cognitive processes".[10] They distinguished between mental contents (such as feelings) and mental processes, arguing that while introspection gives us access to contents, processes remain hidden.[8]
Although some other experimental work followed from the Nisbett and Wilson paper, difficulties with testing the hypothesis of introspective access meant that research on the topic generally stagnated.[9] A ten-year-anniversary review of the paper raised several objections, questioning the idea of "process" they had used and arguing that unambiguous tests of introspective access are hard to achieve.[3]
Updating the theory in 2002, Wilson admitted that the 1977 claims had been too far-reaching.[10] He instead relied on the theory that the adaptive unconscious does much of the moment-to-moment work of perception and behavior. When people are asked to report on their mental processes, they cannot access this unconscious activity.[7] However, rather than acknowledge their lack of insight, they confabulate a plausible explanation, and "seem" to be "unaware of their unawareness".[11]
The idea that people can be mistaken about their inner functioning is one applied by eliminative materialists. These philosophers suggest that some concepts, including "belief" or "pain", will turn out to be quite different from what is commonly expected as science advances.
The faulty guesses that people make to explain their thought processes have been called "causal theories".[1] The causal theories provided after an action often serve only to justify the person's behavior in order to relieve cognitive dissonance. That is, a person may not have noticed the real reasons for their behavior, even when trying to provide explanations. The result is an explanation that mostly serves to make them feel better. An example might be a man who discriminates against homosexuals because he is embarrassed that he himself is attracted to other men. He may not admit this to himself, instead claiming that his prejudice stems from the belief that homosexuality is unnatural.
A study conducted by philosopher Eric Schwitzgebel and psychologist Russell T. Hurlburt set out to measure the extent of introspective accuracy by gathering introspective reports from a single individual, given the pseudonym "Melanie". Melanie was given a beeper which sounded at random moments, and when it did she had to note what she was currently feeling and thinking. After analyzing the reports, the authors had mixed views about the results, the correct interpretation of Melanie's claims, and her introspective accuracy. Even after long discussion, the two authors still disagreed in their closing remarks, Schwitzgebel being pessimistic and Hurlburt optimistic about the reliability of introspection.[12]
Factors in accuracy
Nisbett and Wilson conjectured about several factors that contribute to the accuracy of introspective self-reports on cognition.[8]
- Availability: Stimuli that are highly salient (either due to recency or being very memorable) are more likely to be recalled and considered for the cause of a response.
- Plausibility: Whether a person finds a stimulus to be a sufficiently likely cause for an effect determines the influence it has on their reporting of the stimulus.
- Removal in time: The greater the distance in time since an event occurred, the less available the event is and the more difficult it is to recall accurately.
- Mechanics of judgment: People do not recognize the influence that judgment factors (e.g., position effects) have on them, leading to inaccuracies in self-reporting.
- Context: Focusing on the context of an object distracts from evaluation of that object and can lead people to falsely believe that their thoughts about the object are represented by the context.
- Nonevents: The absence of an occurrence is naturally less salient and available than an occurrence itself, leading nonevents to have little influence on reports.
- Nonverbal behavior: While people receive a large amount of information about others via nonverbal cues, the verbal nature of relaying information and the difficulty of translating nonverbal behavior into verbal form lead to a lower reporting frequency of this behavior.
- Discrepancy between the magnitudes of cause and effect: Because it seems natural to assume that a cause of a certain size will lead to an effect of similar size, connections between causes and effects of different magnitudes are often not drawn.
Unawareness of error
Nisbett and Wilson provided several hypotheses to explain people's unawareness of their inaccuracies in introspection:[8]
- Confusion between content and process: People are usually unable to access the exact process by which they arrived at a conclusion, but can recall an intermediate step prior to the result. However, this step is still content in nature, not a process. The confusion of these discrete forms leads people to believe that they are able to understand their judgment processes. (Nisbett and Wilson have been criticized for failing to provide a clear definition of the differences between mental content and mental processes.)
- Knowledge of prior idiosyncratic reactions to a stimulus: An individual’s belief that they react in an abnormal manner to a stimulus, which would be unpredictable from the standpoint of an outside observer, seems to support true introspective ability. However, these perceived covariations may actually be false, and truly abnormal covariations are rare.
- Differences in causal theories between subcultures: The inherent differences between discrete subcultures mean that they have somewhat different causal theories for any one stimulus. Thus, an outsider would not have the same ability to discern a true cause as an insider would, again making it seem to the introspector that they can understand the judgment process better than another can.
- Attentional and intentional knowledge: An individual may consciously know that they were not paying attention to a certain stimulus or did not have a certain intent. Again, as insight that an outside observer does not have, this seems indicative of true introspective ability. However, the authors note that such knowledge can actually mislead the individual in the case that it is not as influential as they may think.
- Inadequate feedback: By its nature, introspection is difficult to disconfirm in everyday life, where there are no tests of it and others tend not to question one's introspections. Moreover, when a person's causal theory of reasoning is seemingly disconfirmed, it is easy for them to produce alternative reasons why the evidence is not in fact disconfirmatory at all.
- Motivational reasons: Regarding one's ability to understand one's own reasoning as no better than an outsider's is intimidating and threatens the ego and the sense of control. Thus, people do not like to entertain the idea, instead maintaining the belief that they can accurately introspect.
Choice blindness
Inspired by the Nisbett and Wilson paper, Petter Johansson and colleagues investigated subjects' insight into their own preferences using a new technique. Subjects saw two photographs of people and were asked which they found more attractive. They were given a closer look at their "chosen" photograph and asked to verbally explain their choice. However, in some trials, the experimenter had slipped them the other photograph rather than the one they had chosen, using sleight of hand.[13] A majority of subjects failed to notice that the picture they were looking at did not match the one they had chosen just seconds before. Many subjects confabulated explanations of their preference. For example, a man might say "I preferred this one because I prefer blondes" when he had in fact pointed to the dark-haired woman, but had been handed a blonde.[9] These must have been confabulated because they explain a choice that was never made.[13]
The large proportion of subjects who were taken in by the deception contrasts with the 84% who, in post-test interviews, said that hypothetically they would have detected a switch if it had been made in front of them. The researchers coined the phrase "choice blindness" for this failure to detect a mismatch.[14]
A follow-up experiment involved shoppers in a supermarket tasting two different kinds of jam, then verbally explaining their choice while taking further spoonfuls from the "chosen" pot. The pots were rigged so that, when explaining their choice, the subjects were tasting the jam they had previously rejected. A similar experiment was also done with tea.[15] Another variation involved subjects choosing between two objects displayed on PowerPoint slides, then explaining their choice after the description of what they chose had been altered.[16]
Research by Paul Eastwick and Eli Finkel at Northwestern University also undermined the idea that subjects have direct introspective awareness of what attracts them to other people. These researchers examined male and female subjects' reports of what they found attractive. Men typically reported that physical attractiveness was crucial while women identified earning potential as most important. These subjective reports did not predict their actual choices in a speed dating context, or their dating behavior in a one-month follow-up.[17]
Consistent with choice blindness, Henkel and Mather found that people are easily convinced by false reminders that they chose different options than they actually chose and that they show greater choice-supportive bias in memory for whichever option they believe they chose.[18]
Criticisms
It is not clear, however, to what extent these findings apply to real-life experience, when there is more time to reflect or when actual faces are used (as opposed to gray-scale photos).[19] As Prof. Kaszniak points out: "although a priori theories are an important component of people's causal explanations, they are not the sole influence, as originally hypothesized by Nisbett & Wilson. Actors also have privileged information access that includes some degree of introspective access to pertinent causal stimuli and thought processes, as well as better access (than observers) to stimulus-response covariation data about their own behavior".[20]
Attitude change
Studies that ask participants to introspect upon their reasoning (for liking, choosing, or believing something, etc.) tend to find a subsequent decrease in the correspondence between the participants' attitudes and behavior.[21] For example, in a study by Wilson et al., participants rated their interest in puzzles that they had been given. Prior to the ratings, one group had been instructed to contemplate and write down their reasons for liking or disliking the puzzles, while the control group was given no such task. The amount of time participants spent playing with each puzzle was then recorded. The correlation between the puzzle ratings and the time spent playing with each was much smaller for the introspection group than for the control group.[22]
A subsequent study was performed to show that these results generalize to more "realistic" circumstances. In this study, participants were all involved in a steady romantic relationship. All were asked to rate how well-adjusted their relationship was. One group was asked to list all of the reasons behind their feelings for their partner, while the control group did not do so. Six months later, the experimenters followed up with participants to check whether they were still in the same relationship. Those who had been asked to introspect showed much less attitude-behavior consistency, based upon correlations between the relationship ratings and whether they were still dating their partners. This shows that the introspective ratings were not predictive, though it probably also means that introspecting changed the course of the relationship.[22]
The authors theorize that these effects arise from participants changing their attitudes, when confronted with a need for justification, without changing their corresponding behaviors. The authors hypothesize that this attitude shift results from a combination of factors: a desire to avoid feeling foolish for simply not knowing why one feels a certain way; a tendency to make justifications based on cognitive reasons, despite the large influence of emotion; ignorance of mental biases (e.g., halo effects); and self-persuasion that the reasons one has come up with must be representative of one's attitude. In effect, people attempt to supply a "good story" to explain their reasoning, which often leads to convincing themselves that they actually hold a different belief.[21] In studies wherein participants chose an item to keep, their subsequent reports of satisfaction with the item decreased, suggesting that their attitude changes were temporary, returning to the original attitude over time.[23]
Introspection by focusing on feelings
In contrast with introspection focused on reasoning, introspection that focuses on feelings has actually been shown to increase attitude-behavior correlations.[21] This finding suggests that introspecting on one's feelings is not a maladaptive process.
A priori causal theories
In their classic paper, Nisbett and Wilson proposed that introspective confabulations result from a priori theories, of which they put forth four possible discrete origins:[8]
- Explicit cultural rules (e.g., stopping at red traffic lights)
- Implicit cultural theories, with certain schemata for likely stimulus-response relationships (e.g., an athlete only endorses a brand because he is paid to do so)
- Individual observational experiences that lead one to form a theory of covariation
- Similar connotation between stimulus and response
The authors note that the use of these theories does not necessarily lead to inaccurate assumptions, but that this frequently occurs because the theories are improperly applied.
Explaining biases
Pronin argues that over-reliance on intentions is a factor in a number of different biases. For example, by focusing on their current good intentions, people can overestimate their likelihood of behaving virtuously.[24]
In perceptions of bias
The bias blind spot is an established phenomenon in which people rate themselves as less susceptible to bias than their peer group. Emily Pronin and Matthew Kugler argue that this phenomenon is due to the introspection illusion.[25] In their experiments, subjects had to make judgments about themselves and about other subjects.[26] They displayed standard biases, for example rating themselves above the others on desirable qualities (demonstrating illusory superiority). The experimenters explained cognitive bias, and asked the subjects how it might have affected their judgment. The subjects rated themselves as less susceptible to bias than others in the experiment (confirming the bias blind spot). When they had to explain their judgments, they used different strategies for assessing their own and others' bias.[26]
Pronin and Kugler's interpretation is that when people decide whether someone else is biased, they use overt behavior. On the other hand, when assessing whether or not they themselves are biased, people look inward, searching their own thoughts and feelings for biased motives. Since biases operate unconsciously, these introspections are not informative, but people wrongly treat them as reliable indication that they themselves, unlike other people, are immune to bias.[25]
Pronin and Kugler tried to give their subjects access to others' introspections. To do this, they made audio recordings of subjects who had been told to say whatever came into their heads as they decided whether their answer to a previous question might have been affected by bias. Although subjects persuaded themselves they were unlikely to be biased, their introspective reports did not sway the assessments of observers.[26]
When asked what it would mean to be biased, subjects were more likely to define bias in terms of introspected thoughts and motives when it applied to themselves, but in terms of overt behavior when it applied to other people. When subjects were explicitly told to avoid relying on introspection, their assessments of their own bias became more realistic.[26]
Additionally, Nisbett and Wilson found that when participants were asked whether biases (such as the position effect in the stocking study) had affected their decisions, they denied any such effect, in contradiction with the data.[8]
In perceptions of conformity
Another series of studies by Pronin and colleagues examined perceptions of conformity. Subjects reported being more immune to social conformity than their peers. In effect, they saw themselves as being "alone in a crowd of sheep". The introspection illusion appeared to contribute to this effect. When deciding whether others respond to social influence, subjects mainly looked at their behavior, for example explaining other students' political opinions in terms of following the group. When assessing their own conformity, subjects treated their own introspections as reliable. In their own minds, they found no motive to conform, and so decided that they had not been influenced.[27]
In perceptions of control and free will
Psychologist Daniel Wegner has argued that an introspection illusion contributes to belief in paranormal phenomena such as psychokinesis.[28] He observes that in everyday experience, intention (such as wanting to turn on a light) is followed by action (such as flicking a light switch) in a reliable way, but the processes connecting the two are not consciously accessible. Hence though subjects may feel that they directly introspect their own free will, the experience of control is actually inferred from relations between the thought and the action. This theory, called "apparent mental causation", acknowledges the influence of David Hume's view of the mind.[28] This process for detecting when one is responsible for an action is not totally reliable, and when it goes wrong there can be an illusion of control. This could happen when an external event follows, and is congruent with, a thought in someone's mind, without an actual causal link.[28]
As evidence, Wegner cites a series of experiments on magical thinking in which subjects were induced to think they had influenced external events. In one experiment, subjects watched a basketball player taking a series of free throws. When they were instructed to visualize him making his shots, they felt that they had contributed to his success.[29]
If the introspection illusion contributes to the subjective feeling of free will, then it follows that people will more readily attribute free will to themselves rather than others. This prediction has been confirmed by three of Pronin and Kugler's experiments. When college students were asked about personal decisions in their own and their roommate's lives, they regarded their own choices as less predictable. Staff at a restaurant described their co-workers' lives as more determined (having fewer future possibilities) than their own lives. When weighing up the influence of different factors on behavior, students gave desires and intentions the strongest weight for their own behavior, but rated personality traits as most predictive of other people.[30]
Criticism
Criticism of Wegner's claims regarding the significance of the introspection illusion for the notion of free will has been published.[31]
Criticisms
Research shows that human volunteers can estimate their response times accurately, in effect having knowledge of their "mental processes", although doing so places heavy demands on their attention and cognitive resources (i.e., they are distracted while estimating). Such estimates are likely to be more than post-hoc interpretation and may include privileged information.[32][33] Mindfulness training can also increase introspective capacity in some instances.[34][35][36] Nisbett and Wilson's findings were criticized by psychologists Ericsson and Simon, among others.[37]
Correcting for the bias
A study that investigated the effect of educating people about unconscious biases on their subsequent self-ratings of susceptibility to bias showed that those who were educated did not exhibit the bias blind spot, in contrast with the control group. This finding provides hope that being informed about unconscious biases such as the introspection illusion may help people to avoid making biased judgments, or at least make them aware that they are biased. Other studies on correcting the bias have yielded mixed results. In a later review of the introspection illusion, Pronin suggests that the distinction is that studies that merely provide a warning about unconscious biases do not show a correction effect, whereas those that inform about the bias and emphasize its unconscious nature do yield corrections. Thus, knowledge that bias can operate outside of conscious awareness appears to be the defining factor in leading people to correct for it.[4]
Timothy Wilson has also tried to find a way around the introspection illusion: in his book Strangers to Ourselves, he suggests that observing our own behavior, more than our thoughts, can be one of the keys to clearer introspective knowledge.
See also
- Attitude-behavior consistency
- Choice theory
- Change blindness
- List of cognitive biases
- Self-deception
- Self-perception theory
Notes
- Aronson, Elliot; Wilson, Timothy D.; Akert, Robin M.; Sommers, Samuel R. (2015). Social Psychology (9th ed.). Pearson Education. p. 128. ISBN 9780133936544.
- Wilson 2002, p. 167
- White, Peter A. (1988). "Knowing more about what we can tell: 'Introspective access' and causal report accuracy 10 years later". British Journal of Psychology. British Psychological Society. 79 (1): 13–45. doi:10.1111/j.2044-8295.1988.tb02271.x.
- Pronin 2009, pp. 52–53
- Shermer, Michael (2007). The Mind of the Market: Compassionate Apes, Competitive Humans, and Other Tales from Evolutionary Economics. Times Books. p. 72. ISBN 978-0-8050-7832-9.
- Pronin 2009, p. 5
- Wilson, Timothy D.; Dunn, Elizabeth W. (2004). "Self-Knowledge: Its Limits, Value, and Potential for Improvement". Annual Review of Psychology. 55 (1): 493–518. doi:10.1146/annurev.psych.55.090902.141954. PMID 14744224.
- Nisbett, Richard E.; Wilson, Timothy D. (1977). "Telling more than we can know: Verbal reports on mental processes". Psychological Review. 84 (3): 231–259. doi:10.1037/0033-295x.84.3.231. Reprinted in David Lewis Hamilton, ed. (2005). Social cognition: key readings. Psychology Press. ISBN 978-0-86377-591-8.
- Johansson, P; Hall, L; Sikström, S; Tärning, B; Lind, A (2006). "How something can be said about telling more than we can know: On choice blindness and introspection" (PDF). Consciousness and Cognition. Elsevier. 15 (4): 673–692. doi:10.1016/j.concog.2006.09.004. PMID 17049881.
- Wilson 2002, pp. 104–106
- Wilson, T. D.; Bar-Anan, Y (August 22, 2008). "The Unseen Mind". Science. American Association for the Advancement of Science. 321 (5892): 1046–1047. doi:10.1126/science.1163029. PMID 18719269.
- Schwitzgebel, Eric; Hurlburt, Russell T. (2007). Describing Inner Experience?. MIT Press. ISBN 978-0-262-08366-9.
- Johansson, P; Hall, L; Sikström, S; Olsson, A (October 7, 2005). "Failure to Detect Mismatches Between Intention and Outcome in a Simple Decision Task" (PDF). Science. 310 (5745): 116–119. doi:10.1126/science.1111709. PMID 16210542.
- Hall, Lars; Johansson, Petter; Sikström, Sverker; Tärning, Betty; Lind, Andreas (2008). "Reply to commentary by Moore and Haggard". Consciousness and Cognition. Elsevier. 15 (4): 697–699. doi:10.1016/j.concog.2006.10.001.
- Hall, L.; Johansson, P.; Tärning, B.; Sikström, S.; Deutgen, T. (2010). "Magic at the marketplace: Choice blindness for the taste of jam and the smell of tea". Cognition. 117 (1): 54–61. doi:10.1016/j.cognition.2010.06.010. PMID 20637455.
- Hall, Lars; Johansson, Petter. "Using choice blindness to study decision making and introspection" (PDF). In P. Gärdenfors & A. Wallin (Eds.) (2008). Cognition – A Smorgasbord. pp. 267–283. Retrieved 2009-07-02.
- Eastwick, P. W.; Finkel, E. J. (February 2008). "Sex differences in mate preferences revisited: Do people know what they initially desire in a romantic partner?". Journal of Personality and Social Psychology. American Psychological Association. 94 (2): 245–264. doi:10.1037/0022-3514.94.2.245. PMID 18211175.
- Henkel, L; Mather, M (2007). "Memory attributions for choices: How beliefs shape our memories". Journal of Memory and Language. 57 (2): 163–176. doi:10.1016/j.jml.2006.08.012.
- Johansson, Petter; Hall, Lars; Sikström, Sverker (2008). "From Change Blindness to Choice Blindness" (PDF). Psychologia. 51 (2): 142–155. doi:10.2117/psysoc.2008.142.
- Kaszniak, A. W. (2002). "How well can we know ourselves? — Further Exploration of Introspection". Psychology of Consciousness Class Notes. University of Arizona. Archived from the original on 2009-02-04.
- Wilson, Timothy D.; Dunn, Dana S.; Kraft, Dolores; Lisle, Douglas J. (1989). "Introspection, attitude change, and attitude-behavior consistency: The disruptive effects of explaining why we feel the way we do". Advances in Experimental Social Psychology. 22. pp. 287–343. doi:10.1016/S0065-2601(08)60311-1. ISBN 9780120152223.
- Wilson, Timothy; D. Dunn; J. Bybee; D. Hyman; J. Rotondo (1984). "Effects of analyzing reasons on attitude-behavior consistency". Journal of Personality and Social Psychology. 47: 5–16. doi:10.1037/0022-3514.47.1.5.
- Wilson, Timothy; D. Lisle; J. Schooler; S. Hodges; K. Klaaren; S. LaFleur (1993). "Introspecting about reasons can reduce post-choice satisfaction". Personality and Social Psychology Bulletin. 19 (3): 331–339. doi:10.1177/0146167293193010.
- Pronin, Emily (January 2007). "Perception and misperception of bias in human judgment". Trends in Cognitive Sciences. Elsevier. 11 (1): 37–43. doi:10.1016/j.tics.2006.11.001. PMID 17129749.
- Gilovich, Thomas; Nicholas Epley; Karlene Hanko (2005). "Shallow Thoughts About the Self: The Automatic Components of Self-Assessment". In Mark D. Alicke; David A. Dunning; Joachim I. Krueger. The Self in Social Judgment. Studies in Self and Identity. New York: Psychology Press. p. 77. ISBN 978-1-84169-418-4.
- Pronin, Emily; Kugler, Matthew B. (July 2007). "Valuing thoughts, ignoring behavior: The introspection illusion as a source of the bias blind spot". Journal of Experimental Social Psychology. Elsevier. 43 (4): 565–578. doi:10.1016/j.jesp.2006.05.011.
- Pronin, E; Berger, J; Molouki, S (2007). "Alone in a Crowd of Sheep: Asymmetric Perceptions of Conformity and Their Roots in an Introspection Illusion". Journal of Personality and Social Psychology. American Psychological Association. 92 (4): 585–595. doi:10.1037/0022-3514.92.4.585. PMID 17469946.
- Wegner, Daniel M. (2008). "Self is Magic" (PDF). In John Baer; James C. Kaufman; Roy F. Baumeister. Are we free? Psychology and Free Will. New York: Oxford University Press. ISBN 978-0-19-518963-6.
- Pronin, E; Wegner, D. M.; McCarthy, K; Rodriguez, S (2006). "Everyday Magical Powers: The Role of Apparent Mental Causation in the Overestimation of Personal Influence" (PDF). Journal of Personality and Social Psychology. American Psychological Association. 91 (2): 218–231. doi:10.1037/0022-3514.91.2.218. PMID 16881760. Retrieved 2009-07-03.
- Pronin 2009, pp. 42–43
- E.g., the criticism by H. Andersen in his paper "Two Causal Mistakes in Wegner's Illusion of Conscious Will"; see also "On the alleged illusion of conscious will" by Van Duijn and Sacha Bem. Other critical papers have also been published.
- Marti, Sébastien; Sackur, Jérôme; Sigman, Mariano; Dehaene, Stanislas (2010). "Mapping introspection's blind spot: Reconstruction of dual-task phenomenology using quantified introspection". Cognition. 115 (2): 303–313. doi:10.1016/j.cognition.2010.01.003. PMID 20129603.
- Guggisberg, Adrian G.; Dalal, Sarang S.; Schnider, Armin; Nagarajan, Srikantan S. (2011). "The neural basis of event-time introspection". Consciousness and Cognition. 20 (4): 1899–1915. doi:10.1016/j.concog.2011.03.008. PMC 3161169. PMID 21498087.
- Djikic, Maja; Langer, Ellen J.; Fulton Stapleton, Sarah (June 2008). "Reducing Stereotyping Through Mindfulness: Effects on Automatic Stereotype-Activated Behaviors" (PDF). Journal of Adult Development. 15 (2): 106–111. doi:10.1007/s10804-008-9040-0. Archived from the original (PDF) on 2012-07-29.
- Roberts-Wolfe, D; Sacchet, M. D.; Hastings, E; Roth, H; Britton, W (2012). "Mindfulness training alters emotional memory recall compared to active controls: Support for an emotional information processing model of mindfulness". Frontiers in Human Neuroscience. 6: 15. doi:10.3389/fnhum.2012.00015. PMC 3277910. PMID 22347856.
- Chiesa, Alberto; Calati, Raffaella; Serretti, Alessandro (April 2011). "Does mindfulness training improve cognitive abilities? A systematic review of neuropsychological findings" (PDF). Clinical Psychology Review. 31 (3): 449–464. doi:10.1016/j.cpr.2010.11.003. PMID 21183265.
- Ericsson, K. Anders; Simon, Herbert A. (May 1980). "Verbal reports as data". Psychological Review. 87 (3): 215–251. doi:10.1037/0033-295X.87.3.215.
Sources
- Pronin, Emily (2009). "The Introspection Illusion". In Mark P. Zanna. Advances in Experimental Social Psychology. 41. Academic Press. pp. 1–67. doi:10.1016/S0065-2601(08)00401-2. ISBN 978-0-12-374472-2.
- Wilson, Timothy D. (2002). Strangers to Ourselves: Discovering the Adaptive Unconscious. Belknap Press of Harvard University Press. ISBN 978-0-674-00936-3.
Further reading
- Goldman, Alvin I. (1993). "The Psychology of Folk Psychology". In Alvin I. Goldman. Readings in philosophy and cognitive science (2 ed.). MIT Press. pp. 347–380. ISBN 978-0-262-57100-5.
- Gopnik, Alison (1993). "How We Know Our Own Minds: The Illusion of First-person Knowledge of Intentionality". In Alvin I. Goldman. Readings in philosophy and cognitive science (2 ed.). MIT Press. pp. 315–346. ISBN 978-0-262-57100-5.
- Wilson, Timothy D. (2003). "Knowing When to Ask: Introspection and the Adaptive Unconscious". In Anthony Jack; Andreas Roepstorff. Trusting the subject?: the use of introspective evidence in cognitive science. Imprint Academic. pp. 131–140. ISBN 978-0-907845-56-0.
- Pronin, Emily; Gilovich, Thomas; Ross, Lee (2004). "Objectivity in the Eye of the Beholder: Divergent Perceptions of Bias in Self Versus Others". Psychological Review. 111 (3): 781–799. doi:10.1037/0033-295X.111.3.781. PMID 15250784.
- Gibbs Jr., Raymond W. (2006). "Introspection and cognitive linguistics: Should we trust our own intuitions?". Annual Review of Cognitive Linguistics. John Benjamins Publishing Company. 4 (1): 135–151. doi:10.1075/arcl.4.06gib.
- Johansson, Petter; Hall, Lars; Sikström, Sverker (2008). "From change blindness to choice blindness" (PDF). Psychologia. 51 (2): 142–155. doi:10.2117/psysoc.2008.142.
External links
- Choice Blindness Video on BBC Horizon site
- Choice Blindness Lab at Lund University
- ‘Choice blindness’ and how we fool ourselves by Ker Han, MSNBC.com, October 7, 2005
- Choice blindness: You don't know what you want (Opinion column by Lars Hall and Petter Johansson) New Scientist. April 18, 2009
- “People Always Follow the Crowd. But Not Me!”: The Introspection Illusion blog post by Dr. Giuseppe Spezzano, November 4, 2009
- Do Others Know You Better Than You Know Yourself? blog post by psychology professor Joachim Krueger, September 28, 2012