Journal of Behavioral and Brain Science
Vol. 4, No. 2 (2014), Article ID: 42753, 9 pages. DOI: 10.4236/jbbs.2014.42010
A New Association Evaluation Stage in Cartoon Apprehension: Evidence from an ERP Study
1Department of Psychology, Institute of Education, China West Normal University, Nanchong, China
2Donders Institute for Brain, Cognition and Behaviour, Centre for Cognition, Radboud University Nijmegen, Nijmegen, The Netherlands
3Key Laboratory of Cognition and Personality, Ministry of Education, Chongqing, China
4Faculty of Psychology, Southwest University, Chongqing, China
5School of Psychology, Liaoning Normal University, Dalian, China
Email: firstname.lastname@example.org, email@example.com
Copyright © 2014 Shen Tu et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
Received January 2, 2014; revised January 31, 2014; accepted February 7, 2014
Keywords: Cartoon; Humor Apprehension; Association Evaluation; Event-Related Potentials (ERPs)
The aim of this study was to investigate the temporal cortical activation patterns underlying different stages of humor comprehension (i.e., incongruity detection, incongruity resolution, and the affective stage). Event-related potentials (ERPs) were measured while 16 subjects apprehended cartoon pictures comprising humorous, non-humorous and unrelated items. Results showed that both humorous and unrelated items elicited a more negative ERP deflection (N500-800) than non-humorous ones between 500 - 800 ms, which might reflect detection of the incongruent element during humor apprehension. Then, both humorous and non-humorous items elicited a more positive ERP deflection (P800-1000) than unrelated ones between 800 - 1000 ms, which might reflect a classification process preliminarily evaluating whether there were attainable cues in the pictures that could form a possible association between context and picture (we named this the “association evaluation” stage). Furthermore, between 1000 - 1600 ms, humorous items elicited a more positive slow wave than non-humorous items, which in turn elicited a more positive wave than unrelated items; this component might be involved in the forming of novel associations (resolution of incongruity). Lastly, between 1600 - 2000 ms, humorous items elicited a more positive ERP deflection (P1600-2000) than both non-humorous and unrelated items, which might be related to emotion processing during humor apprehension. Based on these results, we subdivided the second stage (resolution of incongruity) into two stages: association evaluation and incongruity resolution.
Humor is a high-level cognitive activity that plays a crucial role in social life. The ability to comprehend humor is considered by many investigators to be a significant component of what makes us unique as human beings, and a good sense of humor may represent an important coping strategy. Suls proposed an “Incongruity-Resolution theory”, according to which humor processing can be divided into two stages: detection and resolution of incongruity [4,5]. The detection stage refers to the perception of an incongruous element, which is then resolved in the incongruity resolution stage. The resolution stage involves a frame-shifting process, in which the perceiver activates a new frame from long-term memory to reinterpret information already active in working memory. In addition, Gardner et al. (1975) proposed that humor comprehension could be divided into cognitive and affective elements [8-10]. The cognitive element refers to the moments when people attempt to comprehend disparities between punch lines and prior experience. The affective element refers to the moments when people experience purely visceral and emotional responses dependent upon the exhilaration of the experience. Together, humor apprehension can be separated into three sequential processing stages: incongruity detection, incongruity resolution, and affective experience.
Early evidence came from studies of patients [10-13], which showed that some brain regions play a key role in humor comprehension. Further fMRI studies investigated brain activity during the comprehension of humorous information, such as the dissociation between cognitive and affective elements [8,14,15]. However, it is difficult to distinguish detection from resolution of incongruity using fMRI because no clear behavioral marker of the transition exists, except in a recent study using a more elaborate design.
Fortunately, ERPs, which are time-locked to the presentation of an external stimulus, allow more precise examination of the time course of activation for the different stages of humor. Using sentences with either joke endings or equally surprising non-joke endings that did not entail frame-shifting, Coulson and Kutas attempted to differentiate the incongruity-resolution stage from the incongruity-detection stage of joke comprehension using ERPs. Coulson and colleagues also investigated the relationship between handedness, hemispheric asymmetries, and joke or pun comprehension [17-20]. However, their results did not provide a simple mapping onto the two cognitive stages of humor apprehension. Recently, Du et al. designed funny or unfunny endings for stories, and the ERP results suggested a dissociation among the three stages of humor apprehension. In addition, using an “oddball” procedure, Gierych et al. investigated ERP correlates of processing funny pictures that were not preceded by a “context-setting” phase. Results showed that funny pictures elicited more positive ERP waves within broad latency windows, which, they argued, reflected the effects of emotional arousal.
To date, studies have used word jokes [5,19,21,23], episodes of television sitcoms, cartoons [16,22,24,25], or even laughter [26-28] as materials to study the mechanisms of humor, yet the results remain uncertain, as some studies did not target the processing stages of humor directly and some had technical limitations. In the present study, we used cartoon pictures as materials to study the processing stages of humor apprehension, and devised three experimental conditions (humorous/non-humorous/unrelated) so as to differentiate the time courses of the three stages effectively with ERPs. Specifically, subjects were asked to judge whether a cartoon preceded by a context-setting caption was humorous, non-humorous or unrelated. The reasons for using this paradigm are as follows: first, humorous cartoons in real life usually contain captions, and it is the relation between the caption and the cartoon that stirs the humorous feeling. Second, we separated the caption from the picture to avoid conflating word processing with picture processing, so as to analyze the ERPs elicited by humor comprehension effectively. We speculated that, firstly, both the humorous and unrelated conditions would elicit different ERP components than the non-humorous condition because of the detection of incongruity; secondly, the ERPs elicited by the three conditions would be differentiated from each other because of the different extents of cognitive resources involved (resolution of incongruity); and thirdly, the ERPs elicited by the non-humorous and unrelated conditions would be differentiated from the humorous condition because of the exhilaration of personal experience (emotion stage).
2. Experimental Procedures
Sixteen adults (8 women, 8 men) aged 18 - 24 years (mean age, 21.6 years) from Southwest University in China participated in our experiment as paid volunteers. All subjects gave written informed consent, were right-handed, had no history of current or past neurological or psychiatric illness, and had normal or corrected-to-normal vision.
Prior to the experiment, 300 cartoon pictures were selected from the Internet. All pictures were converted to black and white and were slightly altered so that they did not include any captions. First, we selected the 100 most humorous cartoon pictures and adapted/created a caption context (4 - 6 Chinese characters) for each of them, such that the pictures were funny in light of the caption context (humorous condition). Second, we selected 100 cartoon pictures from the remaining 200 and removed any humorous elements. Similarly, we made a caption context for each of these 100 pictures, but here the pictures were logically consistent with their caption context (non-humorous condition). Third, we removed any humorous elements from the remaining 100 pictures and made an unrelated caption context for each picture (unrelated condition). Examples of the three conditions are shown in Figure 1.
Then, 20 additional people, who did not take part in the ERP experiment, rated each cartoon on a scale of 1 to 4 (1, humorous; 2, non-humorous; 3, unrelated; 4, unclear) prior to the formal ERP experiment. These raters were instructed to judge the cartoons together with their context-setting captions. Finally, 60 cartoon pictures that were rated as humorous by more than 14 of the 20 raters were chosen for the humorous condition. In the same way, we chose 60 pictures for each of the other two conditions. All cartoons were 8 cm × 6 cm and were presented centrally, subtending a visual angle of 6.6° in width and 4.9° in height.
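As a rough consistency check (not part of the original analysis), the reported visual angles can be derived from the stated 8 cm × 6 cm picture size and the approximately 70 cm viewing distance given later in the procedure; a minimal sketch:

```python
import math

def visual_angle_deg(size_cm, distance_cm):
    """Full visual angle subtended by a stimulus of the given size,
    viewed from the given distance (standard 2*atan(s/2d) formula)."""
    return math.degrees(2 * math.atan(size_cm / (2 * distance_cm)))

width = visual_angle_deg(8, 70)   # ~6.5 deg, close to the reported 6.6 deg
height = visual_angle_deg(6, 70)  # ~4.9 deg, matching the reported 4.9 deg
```

The small discrepancy in width (≈6.5° versus the reported 6.6°) is likely due to rounding or a slightly shorter effective viewing distance.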
The flow of stimulus presentation in each trial is shown in Figure 2. Subjects were asked to place their left hand on the space bar and their right hand on the numeric keypad. First, a fixation point (+) appeared at the center of the screen for 300 ms; then the caption context was presented. Subjects pressed the space bar once they had understood its meaning, upon which the context disappeared. Then, after an asterisk (*) had appeared for a random duration of 300 - 500 ms, the cartoon picture was presented. Subjects were required to make a “humorous/non-humorous/unrelated” judgment about the picture (the “1”, “2” and “3” keys standing for humorous, non-humorous and unrelated judgments, respectively) based on the relationship between the picture and the caption context. The cartoon disappeared if subjects did not press any key within 6000 ms. Following an interval of 1.5 s, the next trial began. The stimulus-response key assignments (“1”, “2” and “3” keys) were counterbalanced across subjects.
The whole test was divided into two parts. There was a pre-test with six trials to familiarize the subjects with the procedure. Then the formal ERP experiment started. There were 3 blocks, each consisting of 60 trials, 20 trials per condition (humorous/non-humorous/unrelated). The conditions within each block were presented in random order. Between blocks, subjects could take a rest. Subjects were seated in a quiet room facing a screen placed approximately 70 cm from the eyes and were instructed to respond as fast and accurately as possible by pressing the corresponding key on the keyboard. Subjects were asked to minimize movements and eye blinks.

Figure 1. Examples of cartoons for the three conditions (left: humorous; middle: non-humorous; right: unrelated).

Figure 2. The flow of stimulus presentation in each trial.
2.4. ERP Recording and Analysis
Brain electrical activity was recorded from 64 scalp sites using Ag/AgCl electrodes mounted in an elastic cap (Brain Products), with the reference on the left and right mastoids. The vertical electrooculogram (VEOG) was recorded from electrodes placed above and below the right eye, and the horizontal electrooculogram (HEOG) from electrodes placed to the right of the right eye and to the left of the left eye. All interelectrode impedances were kept below 5 kΩ. The EEG and EOG were amplified using a 0.05 - 80 Hz bandpass and continuously sampled at 500 Hz per channel for off-line analysis. Ocular artifacts were rejected offline: trials with EOG artifacts (mean EOG voltage exceeding ±80 μV) and those contaminated with artifacts due to amplifier clipping, bursts of electromyographic activity, or peak-to-peak deflections exceeding ±80 μV were excluded from averaging.
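The rejection criteria above can be sketched as follows. This is a minimal illustration of one plausible reading of the ±80 μV thresholds, not the authors' actual pipeline; the array shapes and the interpretation of a peak-to-peak deflection "exceeding ±80 μV" as a 160 μV span are our assumptions.

```python
import numpy as np

ARTIFACT_UV = 80.0  # rejection threshold from the text (in microvolts)

def reject_artifacts(epochs, eog):
    """Keep epochs whose mean EOG voltage and per-channel peak-to-peak
    deflections stay within the +/-80 uV criteria described in the text.

    epochs: (n_epochs, n_channels, n_samples) EEG in uV
    eog:    (n_epochs, n_samples) EOG in uV
    Returns the retained epochs and a boolean keep-mask.
    """
    # Criterion 1: mean EOG voltage within +/-80 uV
    eog_ok = np.abs(eog.mean(axis=1)) <= ARTIFACT_UV
    # Criterion 2: peak-to-peak deflection within a +/-80 uV span (160 uV)
    p2p = epochs.max(axis=2) - epochs.min(axis=2)   # (n_epochs, n_channels)
    p2p_ok = (p2p <= 2 * ARTIFACT_UV).all(axis=1)
    keep = eog_ok & p2p_ok
    return epochs[keep], keep
```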
The ERP waveforms were time-locked to the onset of the pictures. The averaged epoch was 2900 ms, including a 200-ms pre-stimulus baseline. An item was classified into the humorous, non-humorous or unrelated condition only if it was rated the same way both in the pilot study and in the formal ERP study; the EEG for each condition was then averaged separately. At least 30 trials were available for each condition for each subject. On the basis of the grand-averaged ERPs and the voltage maps of the difference waves (Figures 3-5), the ERP component amplitudes were analyzed in a series of two-way repeated-measures ANOVAs with the factors task type (humorous/non-humorous/unrelated) and electrode site (central-anterior or central-posterior), separately for each ERP component. Because using data from multiple electrode sites may lead to a violation of the sphericity assumption, all ANOVA results were corrected using the Greenhouse-Geisser procedure.
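The epoching and averaging described above (a 2900-ms epoch with a 200-ms pre-stimulus baseline at a 500 Hz sampling rate) can be sketched as follows; a minimal illustration under those stated parameters, not the authors' analysis code:

```python
import numpy as np

def average_erp(eeg, onsets, fs=500, baseline_ms=200, epoch_ms=2900):
    """Average the ERP for one condition, time-locked to picture onset,
    with baseline correction over the pre-stimulus interval.

    eeg:    (n_channels, n_samples) continuous EEG
    onsets: sample indices of picture onsets for one condition
    """
    pre = int(baseline_ms * fs / 1000)      # 100 samples at 500 Hz
    total = int(epoch_ms * fs / 1000)       # 1450 samples per epoch
    # Cut epochs starting 200 ms before each onset
    epochs = np.stack([eeg[:, t - pre: t - pre + total] for t in onsets])
    # Subtract the mean of the pre-stimulus baseline per epoch and channel
    baseline = epochs[:, :, :pre].mean(axis=2, keepdims=True)
    return (epochs - baseline).mean(axis=0)  # (n_channels, total)
```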
3.1. Behavioral Performance
The reaction times (RTs) for humorous, non-humorous and unrelated responses were 3335.9 ± 1272.3 ms, 2973.1 ± 1102.6 ms and 2737.9 ± 1352.8 ms, respectively. The main effect of condition on RTs was significant, F(2, 30) = 17.79, p < 0.05. In addition, post hoc tests showed that the RT differences between all pairs of conditions were significant: humorous vs. non-humorous, F(1, 15) = 12.81, p < 0.05; humorous vs. unrelated, F(1, 15) = 32.71, p < 0.05; non-humorous vs. unrelated, F(1, 15) = 5.88, p < 0.05.
Figure 3. Grand average ERPs at central-anterior sites. Both humorous and unrelated items elicited a more negative ERP deflection (N500-800) than did non-humorous items between 500 - 800 ms; both humorous and non-humorous items elicited a more positive ERP deflection (P800-1000) than did unrelated items between 800 - 1000 ms; humorous items elicited a more positive ERP deflection (P1000-1600) between 1000 - 1600 ms than did non-humorous items, which in turn elicited a more positive wave than unrelated items; humorous items elicited a more positive ERP deflection (P1600-2000) than did both non-humorous and unrelated items between 1600 - 2000 ms.
Figure 4. Grand average ERPs at central-posterior sites, where the ERP effects tended to disappear.
The hit accuracies, defined as the proportion of items rated the same way (humorous, non-humorous or unrelated) in both the pilot study and the formal ERP study, were 0.71 ± 0.19, 0.73 ± 0.17, and 0.76 ± 0.19 for the humorous, non-humorous and unrelated conditions, respectively. The effect of condition on hit accuracy was not significant, F(2, 30) = 0.64, p > 0.05.

Figure 5. Voltage maps of the difference waves, which were located primarily at the central-anterior sites between 500 - 800 ms and 800 - 1000 ms.
3.2. Electrophysiological Scalp Data
From the ERP grand average waveforms and the voltage maps of the difference waves (Figures 3-5), it is evident that the ERPs elicited by the three conditions show clear effects (e.g., between 500 - 800 ms, 800 - 1000 ms, 1000 - 1600 ms and 1600 - 2000 ms) at the central-anterior electrode sites, whereas they are nearly identical, with apparently negligible differences, at the central-posterior electrode sites. Nine electrodes (FPz, AF3, AF4, Fz, F1, F2, FCz, FC1, FC2) at the central-anterior site and seven electrodes (Cz, C1, C2, CPz, CP1, CP2, Pz) at the central-posterior site were selected for analysis. Mean amplitudes in the time windows of 400 - 500 ms, 500 - 800 ms, 800 - 1000 ms, 1000 - 1600 ms and 1600 - 2000 ms were analyzed using two-way repeated-measures ANOVAs, with task type and electrode site as factors. In addition, the peak magnitudes of the N100 (around 100 ms), P170 (around 170 ms), and N250 (around 250 ms) were also analyzed.
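As noted in the analysis section, the ANOVAs were Greenhouse-Geisser corrected for violations of sphericity. For illustration, the Greenhouse-Geisser epsilon can be computed from the double-centered covariance matrix of the repeated measures; a minimal sketch (the authors presumably used a standard statistics package rather than code like this):

```python
import numpy as np

def greenhouse_geisser_epsilon(cov):
    """Greenhouse-Geisser epsilon from the k x k covariance matrix of
    the k repeated measures: double-center the matrix S, then
    epsilon = trace(S)^2 / ((k - 1) * sum(S**2)).
    Epsilon ranges from 1/(k - 1) (maximal violation) to 1 (sphericity)."""
    cov = np.asarray(cov, dtype=float)
    k = cov.shape[0]
    row = cov.mean(axis=1, keepdims=True)
    col = cov.mean(axis=0, keepdims=True)
    s = cov - row - col + cov.mean()   # double-centered covariance matrix
    return s.trace() ** 2 / ((k - 1) * (s ** 2).sum())
```

The corrected degrees of freedom are then epsilon*(k - 1) and epsilon*(k - 1)*(n - 1), which shrink the nominal df when sphericity is violated.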
At the central-anterior electrode sites, the task type effects on peak magnitude did not reach significance: N100, F(2, 30) = 0.73, p > 0.05; P170, F(2, 30) = 1.33, p > 0.05; N250, F(2, 30) = 1.50, p > 0.05; and N400-500, F(2, 30) = 3.19, p > 0.05. Furthermore, none of the interactions between task type and electrode site were significant: N100, F(16, 240) = 0.82, p > 0.05; P170, F(16, 240) = 0.62, p > 0.05; N250, F(16, 240) = 0.33, p > 0.05; N400-500, F(16, 240) = 0.79, p > 0.05. These results indicate that early processing was similar across the three conditions.
Between 500 - 800 ms, there was a significant effect of task type, F(2, 30) = 5.42, p < 0.05. The interaction between task type and electrode site was not significant, F(16, 240) = 0.79, p > 0.05. Post hoc tests showed that the difference between humorous and unrelated items did not reach significance, but the differences between humorous/unrelated items and non-humorous items did: humorous vs. non-humorous, F(1, 15) = 7.29, p < 0.05; humorous vs. unrelated, F(1, 15) = 0.056, p > 0.05; non-humorous vs. unrelated, F(1, 15) = 12.62, p < 0.05.
Between 800 - 1000 ms, there was a significant effect of task type, F(2, 30) = 5.81, p < 0.05. The interaction between task type and electrode site was not significant, F(16, 240) = 1.05, p > 0.05. Post hoc tests showed that the difference between humorous and non-humorous items did not reach significance, but the differences between humorous/non-humorous items and unrelated items did: humorous vs. non-humorous, F(1, 15) = 0.01, p > 0.05; humorous vs. unrelated, F(1, 15) = 6.42, p < 0.05; non-humorous vs. unrelated, F(1, 15) = 13.29, p < 0.05.
Between 1000 - 1600 ms, there was a main effect of task type, F(2, 30) = 16.83, p < 0.05. The interaction between task type and electrode site was not significant, F(16, 240) = 1.51, p > 0.05. Post hoc tests showed that all pairwise differences reached significance: humorous vs. non-humorous, F(1, 15) = 5.30, p < 0.05; humorous vs. unrelated, F(1, 15) = 21.58, p < 0.05; non-humorous vs. unrelated, F(1, 15) = 34.91, p < 0.05.
There was a significant main effect of task type between 1600 - 2000 ms, F(2, 30) = 7.74, p < 0.05. The interaction between task type and electrode site was not significant, F(16, 240) = 0.63, p > 0.05. Furthermore, post hoc tests showed that the difference between non-humorous and unrelated items did not reach significance, but the differences between humorous items and non-humorous/unrelated items did: humorous vs. non-humorous, F(1, 15) = 5.37, p < 0.05; humorous vs. unrelated, F(1, 15) = 13.31, p < 0.05; non-humorous vs. unrelated, F(1, 15) = 3.10, p > 0.05.
In addition, at the seven central-posterior electrode sites (Cz, C1, C2, CPz, CP1, CP2, Pz; see examples in Figure 4), the main effects of task type were not significant in any of these time windows: 500 - 800 ms, F(2, 30) = 3.01, p > 0.05; 800 - 1000 ms, F(2, 30) = 0.01, p > 0.05; 1000 - 1600 ms, F(2, 30) = 0.62, p > 0.05; 1600 - 2000 ms, F(2, 30) = 0.23, p > 0.05. The interactions between task type and electrode site in these time windows were also not significant.
4. Discussion

In the present study, we used cartoon pictures as experimental materials to distinguish the electrophysiological correlates of the processing stages of humor apprehension. The ERP results revealed interesting findings about the neural basis of humor apprehension in the time windows of 500 - 800 ms, 800 - 1000 ms, 1000 - 1600 ms and 1600 - 2000 ms, indicating that three processing stages of cartoon comprehension can probably be distinguished on the millisecond scale with event-related potentials. We discuss the implications of these ERP components below.
First, both humorous and unrelated items elicited a more negative ERP deflection (N500-800) than did non-humorous items between 500 - 800 ms, which might be involved in detecting the incongruent elements in cartoon apprehension. Previous studies have shown that the N400 is a good marker of incongruity and appears when participants respond to incongruous sentence endings [29,30]. Similar components have been found in response to pictures of objects that were semantically unrelated to previously displayed pictures or sentence contexts [31-34]. In these studies, the anomalous final pictures generated a larger N400 than did congruous ones. However, the scalp distribution of the N400 differed between pictures and words; specifically, the N400 effect for pictures was largest over the frontal midline sites rather than posterior sites [31,32,34]. In addition, few studies have examined the relation of the N400 to humor, except that Coulson and Kutas (2001) found that joke sentence endings elicited a more negative N400 than did non-joke sentence endings that were equally unexpected. In the present study, the clues in the non-humorous pictures could be sensed as consistent with the expectation inspired by the preceding context captions, whereas the clues in the humorous and unrelated pictures could be sensed as inconsistent with that expectation, even before subjects had apprehended the pictures in detail. Therefore, the N500-800 might be related to the N400 potential and reflect the registration of surprise in humorous cartoon apprehension.
Second, between 800 - 1000 ms, humorous and non-humorous items elicited a more positive ERP deflection than did unrelated items at the central-anterior electrode sites. We considered this positivity to be a late positive component (LPC). A previous study showed that the LPC was associated with task classification. Other studies have suggested that this positive component, with a latency in the range of 500 - 900 ms post-stimulus (sometimes called the P600 or P800), is related to recollection processes of a more elaborative nature, based on information stored in long-term memory [36,37]. In the present study, the delayed latency might be due to the relation between texts and images, which demands recollection of the previously presented texts, as well as to the complexity of the cartoons. This positivity might reflect a classification process preliminarily evaluating whether there were attainable cues in the pictures to form a possible association between context and picture (association evaluation) before apprehending the relationships in detail. Because it is very difficult for subjects to find any cues in the unrelated condition with which to form an association, they had to pay more attention to ascertain whether any associations existed; the smaller amplitude for the unrelated condition might therefore index the greater attentional resources employed [38,39]. The detailed processing of the pictures and the forming of associations between contexts and pictures might be reflected in the following processing stage.
Third, humorous items elicited a more positive slow wave than did non-humorous items between 1000 - 1600 ms, which might be involved in the forming of novel associations (resolution of incongruity). Resolution of incongruity involves a process of frame-shifting, in which the perceiver activates a new frame from long-term memory to reinterpret the information already active in working memory. Coulson and Kutas (2001) found that ERPs to jokes were more positive over medial posterior sites between 500 and 900 ms post-onset, during which frame-shifting was thought to occur. Previous studies have also indicated that slow waves correlate with rehearsal/retention operations in working memory [40,41]. It has been suggested that a larger slow wave indicates greater processing demands to retain object information in working memory. In our study, following the preliminary association-evaluation stage, subjects might process the pictures in detail and recheck any possibilities of forming an association. Obviously, they needed more cognitive resources to form the novel association required to understand the humorous items than to understand the non-humorous items. Therefore, this slow wave might be related to the extent to which working memory is demanded to form a novel association between context and picture; that is, the larger the amplitude, the more cognitive resources used. In addition, we found that the non-humorous items elicited a more positive slow wave than unrelated items between 1000 - 1600 ms, which might reflect the forming of the consistent association in the non-humorous condition, but only processing of the pictures in the unrelated condition, given the low likelihood of forming associations between contexts and pictures there.
Finally, humorous items elicited a more positive ERP deflection (P1600-2000) than did both non-humorous and unrelated items between 1600 - 2000 ms, which might be related to emotional processing in humorous cartoon apprehension. Many studies have found that emotional pictures (i.e., pleasant and unpleasant ones) elicit a larger late positive potential than neutral pictures, starting around 300 - 400 ms after picture onset and lasting several hundred milliseconds [43-46]. In addition, using a slow time constant, an extended late positive slow wave has been observed that was significantly larger for emotional pictures than for neutral pictures and was sustained over a 6-s picture-viewing period [47,48]. In humor studies, a more positive ERP wave was elicited by funny items than by unfunny items within broad latency windows [21,22], which was correlated with emotional arousal. In our experiment, the humorous condition involved positive emotion compared with the non-humorous/unrelated conditions. Therefore, it is reasonable to postulate that the significant difference between the humorous and non-humorous/unrelated conditions between 1600 - 2000 ms might reflect humorous emotion processing in cartoon apprehension.
Together, the ERP results might indicate electrophysiological correlates of three processing stages in humorous cartoon apprehension. Moreover, these results suggest that the incongruity-resolution stage might be subdivided into two stages, association evaluation and incongruity resolution, yielding a four-stage model of humor apprehension. This speculation is consistent with the finding that a general resolution process is dissociated from the incongruity-resolution process in an fMRI study using the same paradigm. However, our conclusions derive only from the “context-setting” paradigm using cartoons as material. Future studies adopting different kinds of materials and paradigms are necessary to better understand the processing stages of humor apprehension.
We thank J. Y. Yang and S. Xue for their advice on the writing. This research was supported by the Fundamental Research Funds of China West Normal University (13E010) and the National Natural Science Foundation of China (30800293).
- H. H. Brownell and H. Gardner, “Neuropsychological Insights into Humour,” In: J. Durant and J. Miller, Eds., Laughing Matters: A Serious Look at Humour, Longman Scientific & Technical/Longman Group, Harlow, 1988, pp. 17-34.
- B. G. Celso, D. J. Ebener and E. J. Burkhead, “Humor Coping, Health Status, and Life Satisfaction among Older Adults Residing in Assisted Living Facilities,” Aging & Mental Health, Vol. 7, No. 6, 2003, pp. 438-445. http://dx.doi.org/10.1080/13607860310001594691
- J. M. Suls, “A Two-Stage Model for the Appreciation of Jokes and Cartoons: An Information-Processing Analysis,” In: J. H. Goldstein and P. E. McGhee, Eds., The Psychology of Humor: Theoretical Perspectives and Empirical Issues, Academic Press, New York, 1972, pp. 81-100. http://dx.doi.org/10.1016/B978-0-12-288950-9.50010-9
- A. Bartolo, F. Benuzzi, L. Nocetti, P. Baraldi and P. Nichelli, “Humor Comprehension and Appreciation: An FMRI Study,” Journal of Cognitive Neuroscience, Vol. 18, No. 11, 2006, pp. 1789-1798. http://dx.doi.org/10.1162/jocn.2006.18.11.1789
- S. Coulson and M. Kutas, “Getting It: Human Event-Related Brain Response to Jokes in Good and Poor Comprehenders,” Neuroscience Letters, Vol. 316, No. 2, 2001, pp. 71-74. http://dx.doi.org/10.1016/S0304-3940(01)02387-4
- J. Uekermann, S. Channon and I. Daum, “Humor Processing, Mentalizing, and Executive Function in Normal Aging,” Journal of the International Neuropsychological Society, Vol. 12, No. 2, 2006, pp. 184-191. http://dx.doi.org/10.1017/S1355617706060280
- S. Coulson, “Semantic Leaps: Frame-Shifting and Conceptual Blending in Meaning Construction,” Cambridge University Press, Cambridge, 2001. http://dx.doi.org/10.1017/CBO9780511551352
- J. M. Moran, G. S. Wig, R. B. Adams Jr., P. Janata and W. M. Kelley, “Neural Correlates of Humor Detection and Appreciation,” Neuroimage, Vol. 21, No. 3, 2004, pp. 1055-1060. http://dx.doi.org/10.1016/j.neuroimage.2003.10.017
- K. K. Watson, B. J. Matthews and J. M. Allman, “Brain Activation during Sight Gags and Language-Dependent Humor,” Cerebral Cortex, Vol. 17, No. 2, 2007, pp. 314-324. http://dx.doi.org/10.1093/cercor/bhj149
- H. Gardner, P. K. Ling, L. Flamm and J. Silverman, “Comprehension and Appreciation of Humorous Material Following Brain Damage,” Brain: A Journal of Neurology, Vol. 98, No. 3, 1975, pp. 399-412.
- H. H. Brownell, D. Michel, J. Powelson and H. Gardner, “Surprise but Not Coherence: Sensitivity to Verbal Humor in Right-Hemisphere Patients,” Brain and Language, Vol. 18, No. 1, 1983, pp. 20-27. http://dx.doi.org/10.1016/0093-934X(83)90002-0
- A. M. Bihrle, H. H. Brownell, J. A. Powelson and H. Gardner, “Comprehension of Humorous and Nonhumorous Materials by Left and Right Brain-Damaged Patients,” Brain and Cognition, Vol. 5, No. 4, 1986, pp. 399-411. http://dx.doi.org/10.1016/0278-2626(86)90042-4
- P. Shammi and D. T. Stuss, “Humour Appreciation: A Role of the Right Frontal Lobe,” Brain, Vol. 122, No. 4, 1999, pp. 657-666. http://dx.doi.org/10.1093/brain/122.4.657
- V. Goel and R. J. Dolan, “The Functional Anatomy of Humor: Segregating Cognitive and Affective Components,” Nature Neuroscience, Vol. 4, No. 3, 2001, pp. 237-238. http://dx.doi.org/10.1038/85076
- B. Wild, F. A. Rodden, A. Rapp, M. Erb, W. Grodd and W. Ruch, “Humor and Smiling Cortical Regions Selective for Cognitive, Affective, and Volitional Components,” Neurology, Vol. 66, No. 6, 2006, pp. 887-893. http://dx.doi.org/10.1212/01.wnl.0000203123.68747.02
- S. Tu, Y. Ma, G. Zhao, Q. Zhang and J. Qiu, “Dissociation between Incongruity Detection and Resolution in Humor Processing,” Psychological Science, 2014, in Press.
- S. Coulson and C. Lovett, “Handedness, Hemispheric Asymmetries, and Joke Comprehension,” Cognitive Brain Research, Vol. 19, No. 3, 2004, pp. 275-288. http://dx.doi.org/10.1016/j.cogbrainres.2003.11.015
- S. Coulson and E. Severens, “Hemispheric Asymmetry and Pun Comprehension: When Cowboys Have Sore Calves,” Brain and Language, Vol. 100, No. 2, 2007, pp. 172-187. http://dx.doi.org/10.1016/j.bandl.2005.08.009
- S. Coulson and R. F. Williams, “Hemispheric Asymmetries and Joke Comprehension,” Neuropsychologia, Vol. 43, No. 1, 2005, pp. 128-141. http://dx.doi.org/10.1016/j.neuropsychologia.2004.03.015
- S. Coulson and Y. C. Wu, “Right Hemisphere Activation of Joke-Related Information: An Event-Related Brain Potential Study,” Journal of Cognitive Neuroscience, Vol. 17, No. 3, 2005, pp. 494-506. http://dx.doi.org/10.1162/0898929053279568
- X. Du, Y. Qin, S. Tu, H. Yin, T. Wang, C. Yu and J. Qiu, “Differentiation of Stages in Joke Comprehension: Evidence from an ERP Study,” International Journal of Psychology, Vol. 48, No. 2, 2013, pp. 149-157. http://dx.doi.org/10.1080/00207594.2012.665162
- E. Gierych, R. Milner and A. Michalski, “ERP Responses to Smile-Provoking Pictures,” Journal of Psychophysiology, Vol. 19, No. 2, 2005, pp. 77-90. http://dx.doi.org/10.1027/0269-8803.19.2.77
- R. Filik and H. Leuthold, “Processing Local Pragmatic Anomalies in Fictional Contexts: Evidence from the N400,” Psychophysiology, Vol. 45, No. 4, 2008, pp. 554-558. http://dx.doi.org/10.1111/j.1469-8986.2008.00656.x
- A. C. Samson, C. F. Hempelmann, O. Huber and S. Zysset, “Neural Substrates of Incongruity-Resolution and Nonsense Humor,” Neuropsychologia, Vol. 47, No. 4, 2009, pp. 1023-1033. http://dx.doi.org/10.1016/j.neuropsychologia.2008.10.028
- A. C. Samson, S. Zysset and O. Huber, “Cognitive Humor Processing: Different Logical Mechanisms in Nonverbal Cartoons—An fMRI Study,” Social Neuroscience, Vol. 3, No. 2, 2008, pp. 125-140. http://dx.doi.org/10.1080/17470910701745858
- P. Derks, L. S. Gillikin, D. S. Bartolome-Rull and E. H. Bogart, “Laughter and Electroencephalographic Activity,” Humor, Vol. 10, No. 3, 1997, pp. 285-300. http://dx.doi.org/10.1515/humr.1997.10.3.285
- A. Rodden, B. Wild, M. Erb, M. Titze, W. Ruch and W. Grodd, “Humour, Laughter and Exhilaration Studied with Functional Magnetic Resonance Imaging (fMRI),” Neuroimage, Vol. 13, No. 6, 2001, p. 466. http://dx.doi.org/10.1016/S1053-8119(01)91809-9
- D. Shibata and J. Zhong, “Humour and Laughter: Localization with fMRI,” Neuroimage, Vol. 13, No. 6, 2001, p. 476. http://dx.doi.org/10.1016/S1053-8119(01)91819-1
- M. Kutas and S. A. Hillyard, “Reading Senseless Sentences: Brain Potentials Reflect Semantic Incongruity,” Science, Vol. 207, No. 4427, 1980, pp. 203-205. http://dx.doi.org/10.1126/science.7350657
- M. Kutas and K. D. Federmeier, “Electrophysiology Reveals Semantic Memory Use in Language Comprehension,” Trends in Cognitive Sciences, Vol. 4, No. 12, 2000, pp. 463-470. http://dx.doi.org/10.1016/S1364-6613(00)01560-6
- W. C. West and P. J. Holcomb, “Event-Related Potentials during Discourse-Level Semantic Integration of Complex Pictures,” Cognitive Brain Research, Vol. 13, No. 3, 2002, pp. 363-375. http://dx.doi.org/10.1016/S0926-6410(01)00129-X
- G. Ganis, M. Kutas and M. I. Sereno, “The Search for ‘Common Sense’: An Electrophysiological Study of the Comprehension of Words and Pictures in Reading,” Journal of Cognitive Neuroscience, Vol. 8, No. 2, 1996, pp. 89-106. http://dx.doi.org/10.1162/jocn.1996.8.2.89
- G. Ganis and M. Kutas, “An Electrophysiological Study of Scene Effects on Object Identification,” Cognitive Brain Research, Vol. 16, No. 2, 2003, pp. 123-144. http://dx.doi.org/10.1016/S0926-6410(02)00244-6
- K. D. Federmeier and M. Kutas, “Meaning and Modality: Influences of Context, Semantic Memory Organization, and Perceptual Predictability on Picture Processing,” Journal of Experimental Psychology: Learning, Memory, and Cognition, Vol. 27, No. 1, 2001, pp. 202-224. http://dx.doi.org/10.1037/0278-7393.27.1.202
- B. Palmer, V. T. Nasman and G. F. Wilson, “Task Decision Difficulty: Effects on ERPs in a Same-Different Letter Classification Task,” Biological Psychology, Vol. 38, No. 2, 1994, pp. 199-214. http://dx.doi.org/10.1016/0301-0511(94)90039-6
- M. E. Smith, “Neurophysiological Manifestations of Recollective Experience during Recognition Memory Judgments,” Journal of Cognitive Neuroscience, Vol. 5, No. 1, 1993, pp. 1-13. http://dx.doi.org/10.1162/jocn.1993.5.1.1
- M. Besson, M. Kutas and C. Van Petten, “An Event-Related Potential (ERP) Analysis of Semantic Congruity and Repetition Effects in Sentences,” Journal of Cognitive Neuroscience, Vol. 4, No. 2, 1992, pp. 132-149. http://dx.doi.org/10.1162/jocn.1992.4.2.132
- A. Kok, “Event-Related-Potential (ERP) Reflections of Mental Resources: A Review and Synthesis,” Biological Psychology, Vol. 45, No. 1, 1997, pp. 19-56. http://dx.doi.org/10.1016/S0301-0511(96)05221-0
- J. Polich, “Updating P300: An Integrative Theory of P3a and P3b,” Clinical Neurophysiology, Vol. 118, No. 10, 2007, pp. 2128-2148. http://dx.doi.org/10.1016/j.clinph.2007.04.019
- A. Mecklinger and E. Pfeifer, “Event-Related Potentials Reveal Topographical and Temporal Distinct Neuronal Activation Patterns for Spatial and Object Working Memory,” Cognitive Brain Research, Vol. 4, No. 3, 1996, pp. 211-224. http://dx.doi.org/10.1016/S0926-6410(96)00034-1
- J. W. King and M. Kutas, “Who Did What and When? Using Word- and Clause-Level ERPs to Monitor Working Memory Usage in Reading,” Journal of Cognitive Neuroscience, Vol. 7, No. 3, 1995, pp. 376-395. http://dx.doi.org/10.1162/jocn.1995.7.3.376
- S. Berti, H. Geissler, T. Lachmann and A. Mecklinger, “Event-Related Brain Potentials Dissociate Visual Working Memory Processes under Categorial and Identical Comparison Conditions,” Cognitive Brain Research, Vol. 9, No. 2, 2000, pp. 147-155. http://dx.doi.org/10.1016/S0926-6410(99)00051-8
- H. T. Schupp, M. Junghöfer, A. I. Weike and A. O. Hamm, “The Selective Processing of Briefly Presented Affective Pictures: An ERP Analysis,” Psychophysiology, Vol. 41, No. 3, 2004, pp. 441-449. http://dx.doi.org/10.1111/j.1469-8986.2004.00174.x
- M. C. Pastor, M. M. Bradley, A. Löw, F. Versace, J. Moltó and P. J. Lang, “Affective Picture Perception: Emotion, Context, and the Late Positive Potential,” Brain Research, Vol. 1189, 2008, pp. 145-151. http://dx.doi.org/10.1016/j.brainres.2007.10.072
- D. Palomba, A. Angrilli and A. Mini, “Visual Evoked Potentials, Heart Rate Responses and Memory to Emotional Pictorial Stimuli,” International Journal of Psychophysiology, Vol. 27, No. 1, 1997, pp. 55-67. http://dx.doi.org/10.1016/S0167-8760(97)00751-4
- A. Keil, M. M. Bradley, O. Hauk, B. Rockstroh, T. Elbert and P. J. Lang, “Large-Scale Neural Correlates of Affective Picture Processing,” Psychophysiology, Vol. 39, No. 5, 2002, pp. 641-649. http://dx.doi.org/10.1111/1469-8986.3950641
- J. K. Olofsson, S. Nordin, H. Sequeira and J. Polich, “Affective Picture Processing: An Integrative Review of ERP Findings,” Biological Psychology, Vol. 77, No. 3, 2008, pp. 247-265. http://dx.doi.org/10.1016/j.biopsycho.2007.11.006
- B. N. Cuthbert, H. T. Schupp, M. M. Bradley, N. Birbaumer and P. J. Lang, “Brain Potentials in Affective Picture Processing: Covariation with Autonomic Arousal and Affective Report,” Biological Psychology, Vol. 52, No. 2, 2000, pp. 95-111. http://dx.doi.org/10.1016/S0301-0511(99)00044-7