The Neural Correlates of Moral Thinking: A Meta-Analysis
Douglas J. Bryant1, Wang F2, Kelley Deardeuff3, Emily Zoccoli4, Chang S. Nam5*
1 Department of Psychology, University of Oklahoma, Norman, OK, USA.
2 Department of Industrial Engineering, University of Arkansas, Fayetteville, AR, USA.
3 Department of Psychology, University of Oklahoma, Health Sciences Center, Oklahoma City, OK, USA.
4 Department of Psychology, University of Oklahoma, Norman, OK, USA.
5 Department of Industrial and Systems Engineering, North Carolina State University, Raleigh, NC, USA.
*Corresponding Author
Chang S. Nam,
Department of Industrial and Systems Engineering,
North Carolina State University, Raleigh, NC 27695 USA.
Tel: 919-515-8140
Fax: 919-515-5281
E-mail: csnam@ncsu.edu
Received: December 11, 2015; Accepted: June 22, 2016; Published: July 04, 2016
Citation: Douglas J. Bryant, Wang F, Kelley Deardeuff, Emily Zoccoli, Chang S. Nam (2016) The Neural Correlates of Moral Thinking: A Meta-Analysis. Int J Comput Neural Eng. 3(2),28-39. doi: dx.doi.org/10.19070/2572-7389-160005
Copyright: Chang S. Nam© 2016. This is an open-access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution and reproduction in any medium, provided the original author and source are credited.
Abstract
We conducted a meta-analysis to evaluate current research that aims to map the neural correlates of two typical conditions of moral judgment: right-wrong moral judgments and decision-making in moral dilemmas. Utilizing the activation likelihood estimation (ALE) method, we conducted a meta-analysis using neuroimaging data obtained from twenty-one previous studies that measured responses in one or the other of these conditions. We found that across the studies (n = 400), distinct neural circuits correlated with a distinction in the type of moral question used as stimulus. For right-wrong moral judgments, the significantly active regions were identified in the right medial frontal gyrus, the bilateral anterior cingulates, the left inferior frontal gyrus, the right middle temporal gyrus, the left superior frontal gyrus and the left posterior cingulate. When reasoning through moral dilemmas, the significantly active regions included the left cingulate gyrus, the right superior temporal gyrus, the left precuneus, the left inferior temporal gyrus and the right middle frontal gyrus. We further found that the two types of moral judgment share some overlapping regions, including the left medial frontal gyrus, the left middle frontal gyrus, the left middle temporal gyrus and the left superior temporal gyrus. Different moral tasks engage distinct neural correlates of moral thinking that share a common fundamental structure. These correlates parallel those engaged in emotional and intuitive judgments, on the one hand, and those engaged in deliberative assessment, on the other.
2. Introduction
3. Methods
3.1. Literature Search and Selection
3.2. Meta-Analysis Procedure
4. Results
4.1. Moral right-wrong judgment only
4.2. Moral dilemma only
4.3. Moral right-wrong judgment vs. moral dilemma decision-making
5. Discussion
5.1. Particular neural basis of moral right-wrong judgment
5.2. Particular neural basis of decision-making in moral dilemma
5.3. Overlapping results for two types of moral judgments
5.4. Multiple moral brains of moral judgment
5.5. Limitations of this meta-analysis
6. Conclusion
7. References
Keywords
Right-Wrong Moral Judgments; Decision-Making in Moral Dilemmas; Moral Brain; Coordinate-Based Brain Activation.
Introduction
Moral judgments are “evaluations (good versus bad) of the actions or character of a person that are made with respect to a set of virtues held by a culture or subculture to be obligatory” [1]. Both rationalist and, later, intuitionist approaches have sought to explain the mental processes underlying moral judgment [1-3]. More recently, fMRI models have demonstrated a synthesis incorporating distinct neural circuitry corresponding to different types of moral thinking [4,5]. The application of neuroimaging techniques in the search for the neural correlates of moral thinking over the last fifteen years has produced significant overlap in a number of findings and replicable distinctions. Challenging the purely rationalist view of moral judgment dominant in the last century are studies demonstrating abnormal moral judgments resulting from damage to emotion-related brain areas such as the ventromedial prefrontal cortex (VMPFC), while other cognitive abilities remain unaffected [6,7]. A number of functional imaging studies suggest that both conscious reasoning and emotional intuition play critical roles in moral judgment and its post hoc justification [5,8,9]. Greene et al., [4,6,10,11] proposed the dual-process theory of moral judgment based on neuroimaging evidence. This theory postulated two different processing systems involved in moral judgment: an emotion system and a cognition system. It further postulated that these processing systems may conflict or compete with each other when facing a complicated situation such as a moral dilemma. Moll et al., [12] argued that the emotion system and the cognition system are inseparable in forming moral motivations, as some regions of the brain are involved in cognition-emotion interaction.
Investigations of the neural correlates of emotional responses have similarly found that multiple substrates are involved in formulating stimulus responses, some of which may form the basis for the evolution of morality. Among these are immoral judgments accompanying disgust responses to incest, pathogen exposure, and certain food exposures [13]. Borg and colleagues found that while disgust is intimately linked to morality, both are multifaceted concepts in which different types of disgust activate distinct neural substrates. Haidt and colleagues [14] have argued that conceptions of disgust expanded from food-based emotional responses to moral judgments in social spaces, and Lieberman et al., [15] have offered a plausible evolutionary explanation for why such an expansion should occur. Moreover, Lieberman and colleagues [16] demonstrated that childhood co-residence with an opposite-sex child predicts the strength with which one judges incest committed by others to be disgusting. Because individual differences in disgust sensitivity, particularly to incest, vary widely according to the circumstances of one's childhood and one's culture, it is difficult to identify the neural correlates of morality by disgust responses alone. Instead, the present study looks specifically at moral judgment in two conditions: right-wrong moral judgment and moral dilemma.
By correlating neural activity with moral thinking tasks, existing neuroimaging studies show that moral judgment is supported by neural substrates including the medial frontal gyrus, the posterior cingulate, the precuneus, the retrosplenial cortex, the superior temporal sulcus and the inferior parietal lobe, which together constitute the “moral brain” [4]. Moll et al., [12] summarized that brain areas involved in moral cognition include the anterior prefrontal cortex (PFC), the orbitofrontal cortex (OFC), the posterior superior temporal sulcus (STS), the anterior temporal lobes, the insula, the anterior cingulate cortex (ACC) and the limbic regions. Since the contents of moral stimuli in right-wrong judgment tasks are usually straightforward, the corresponding moral judgments are highly rule-based and have clear right or wrong responses with very high agreement [13]. In contrast, moral dilemma tasks require participants to make decisions between two imperfect options, such as “would you sacrifice a few people's lives to save many other people?” or “would you steal food to feed the hungry?” When artfully constructed, a moral dilemma task presents no fully satisfactory answer, thus preventing the participant from straightforwardly relying on existing norms. All available options should incur some moral violation, though to differing degrees. Our analysis revealed a consensus of findings demonstrating that moral dilemma tasks activated distinct neural substrates from those activated in moral right-wrong judgment tasks.
Although the two types of moral judgment rely on some of the same neural correlates [11], the differences between them are instructive and should not be ignored in further investigations. Moral right-wrong judgments are mostly simple rule-based decisions drawn from a knowledge base of clear and consistent concepts and norms that encode accepted and conventional social morality, whereas moral dilemmas force the subject to go beyond these simple rules. Doing so requires additional neural resources to facilitate complex and novel decision-making [4,6,17]. For instance, moral dilemmas may compel the decision maker to compute each option's economic valence and choose the option with higher valence (e.g. minimizing the harmful outcome or maximizing the positive outcome). Furthermore, moral dilemma decisions rely not only on complex reasoning processes, but also on multiple emotional factors, including sympathy, preference, and fairness [17]. Greene and Haidt [4] showed that a number of neural structures are selectively involved in either moral right-wrong conditions or moral dilemmas: the orbitofrontal/ventromedial frontal cortex and temporal poles are specific to moral right-wrong judgment, while the dorsolateral prefrontal cortex (DLPFC) is only associated with moral dilemma reasoning. Eslinger and colleagues [18] investigated the neural correlates of moral development in nine individuals aged 10-17. They compared morally ambiguous decision-making and moral right-wrong judgment and argued that the interpersonal processing of morally ambiguous judgments such as lies for the greater good involves empathy, theory of mind, and intentionality, which results in increased reaction time and lower agreement on the morality of the decisions reached - 57 percent as compared to 98 percent in rule-based right-wrong judgments. Moreover, their fMRI findings indicated that the morally ambiguous condition produced significant activations in the bilateral frontal poles, the bilateral superior parietal lobe, the right middle/superior frontal gyrus, the precuneus, and the fusiform, while activations of the hippocampus, the left temporal pole, the insula, and the superior temporal regions were specific to moral right-wrong judgment. Consistent with Eslinger and colleagues [18], our meta-analysis demonstrated a consensus across multiple studies that additional cognitive resources and their neural correlates are mustered in the moral judgment of morally ambiguous or dilemmatic conditions. Interestingly, Eslinger et al., did not report significant levels of overlap between the two conditions like those our meta-analysis revealed across a number of studies. Apart from these studies, however, very few neural-based studies have drawn direct comparisons between moral dilemma decision-making and moral right-wrong judgment. The aim of our study is to distinguish the two types of moral judgment from a neuroanatomical perspective and to summarize a functional brain map for each. Based on the distinctions discussed above, we expected that moral dilemma decision-making and moral right-wrong judgment engage two different moral pathways, in addition to several common brain regions. We hypothesized that the moral circuitry of moral right-wrong judgment mainly depends on two types of regions: those associated with semantic processing and normative knowledge such as social rules, and those responding to negative emotions aroused by stimuli that present immoral content.
We also expected the neural substrates involved in moral dilemma decision-making to depend on a more complicated structure, that is, on regions associated with more complex decision-making functions.
We found a consensus in the literature that the neural correlates operational in moral right-wrong judgment are distinct from those operational in judgments of moral dilemmas. The neural substrates activated in moral right-wrong judgment correspond to the emotional centers of the brain. These substrates are also active in judgments of moral dilemmas, but their emotive force is suppressed by other brain areas, including the ACC, while other neural substrates undertake more deliberate and rational consideration of the conflict at hand.
In this study, we used a meta-analysis technique, activation likelihood estimation (ALE), to analyze the neuroimaging data from moral judgment studies. ALE is a quantitative meta-analysis technique that pools the foci of brain activity reported by a set of studies and identifies the brain regions with the highest activation probability [19]. ALE analyzes a large quantity of data from previous studies in order to find a result that maximizes inter-study consistency and minimizes the subjectivity of each individual study. Hence, we used ALE to mitigate the bias caused by varying experimental designs and conditions and to seek common findings among different studies within each category of moral judgment.
Methods
Literature Search and Selection
We obtained the literature for analysis by searching PubMed and Google Scholar using the keywords fMRI, PET, neuroimaging, functional imaging, moral judgment, moral dilemma and moral decision making. We also utilized the reference lists of the identified papers as additional sources of usable literature. All articles reporting brain activation foci as 3-D coordinates in stereotactic space were included in our study.
Three selection criteria were employed to identify usable data for analysis. First, the moral judgment task had to include an explicit judgment or decision procedure, so that we could investigate brain activity while subjects were making moral judgments rather than while they were simply reading or viewing moral stimuli. Studies that only asked subjects to passively observe the moral stimuli were therefore excluded. Second, the selected neuroimaging studies had to cover the whole brain, since including partial-brain studies could introduce statistical bias into the meta-analysis. Third, the experimental participants had to be healthy individuals; data from clinical populations (e.g. patients with brain lesions) were not included. Using these criteria, 20 articles were selected and split into two groups: 13 papers on moral right-wrong judgment and 8 papers on moral dilemma (Table 1). One study [18] included data for both types of moral judgment. Within each group, coordinates were collected from between-task comparisons contrasting moral judgment with other types of judgments (e.g. moral judgment vs. factual judgment) and from within-task comparisons such as personal moral dilemma vs. impersonal dilemma. For these within-task comparisons, the activation coordinates from both contrasts were entered as combined data.
Together, the selected articles yielded 225 foci: 100 foci for moral right-wrong judgment and 125 foci for moral dilemma decision-making. The selected articles used two different coordinate systems: the Talairach atlas [36] and the Montreal Neurological Institute (MNI) template. We used the Talairach atlas as our standard coordinate system and transformed all MNI coordinates to Talairach space.
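To make the coordinate conversion concrete, the sketch below applies one widely used piecewise-affine approximation (Matthew Brett's mni2tal transform) to map an MNI coordinate into Talairach space. Note that GingerALE itself applies a different (Lancaster icbm2tal) transform, so the coefficients, function name, and example coordinate here are illustrative assumptions rather than the exact transform used in this study.

```python
# Sketch: converting an MNI coordinate to approximate Talairach space using
# the piecewise-affine Brett (mni2tal) approximation. GingerALE uses the
# Lancaster icbm2tal transform instead; this block is only an illustration.
import numpy as np

# Affine matrices for z >= 0 and z < 0 (Brett approximation).
_ABOVE_AC = np.array([[0.9900,  0.0000, 0.0000],
                      [0.0000,  0.9688, 0.0460],
                      [0.0000, -0.0485, 0.9189]])
_BELOW_AC = np.array([[0.9900,  0.0000, 0.0000],
                      [0.0000,  0.9688, 0.0420],
                      [0.0000, -0.0485, 0.8390]])

def mni_to_talairach(xyz):
    """Map a single MNI (x, y, z) coordinate to approximate Talairach space."""
    xyz = np.asarray(xyz, dtype=float)
    mat = _ABOVE_AC if xyz[2] >= 0 else _BELOW_AC
    return mat @ xyz

# Example: a hypothetical MNI focus at (6, 48, 18).
print(mni_to_talairach([6.0, 48.0, 18.0]))
```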
Meta-Analysis Procedure
The ALE meta-analysis technique was initially proposed by Turkeltaub et al., [19], and later improved by Laird et al., [37] and Eickhoff et al., [38]. The ALE technique treats each peak focus of brain activation as a probability distribution centered on the reported coordinates rather than as an exact single point. Aggregating these distributions produces a probabilistic map of all the relevant brain activity, namely, an ALE map. The ALE technique also provides a subtraction analysis function, which can be used to make a direct statistical comparison between two groups of neuroimaging data. Krain et al., [39] successfully used this method to build ALE maps of brain activity for risky decision-making and ambiguous decision-making, and conducted a comparison between them. Later, Sörös et al., [40] employed the ALE technique to probe differences in neural control between water swallowing and saliva swallowing using the foci from 7 studies of water swallowing and 5 studies of saliva swallowing. More recently, the ALE technique has been widely used in neuroscience literature reviews and is considered a very effective analytical tool for neuroimaging data.
In an ALE meta-analysis, a voxel's ALE value represents the probability that at least one of the active foci occurs in that voxel. The ALE value is therefore computed for every voxel by modeling all coordinates with equal weight using a 3-dimensional Gaussian probability density function filtered with an empirical full-width half-maximum (FWHM) value [38]. The statistical significance of the result is then assessed with a permutation test based on randomly generated foci: five thousand permutations are generated using the same number of foci and the same FWHM as used to generate the ALE map. All computed ALE values are corrected for the false discovery rate (FDR) and then thresholded at a p value of 0.05. In the current study, we set the minimal cluster size at 100 mm3, meaning that any cluster smaller than 100 mm3 was removed from the results. We then overlaid the ALE clusters onto an anatomical template in Talairach space (colin1.1, brainmap.org/ale/colin1.1.nii) using MANGO (www.ric.uthscsa.edu/mango/) to view the significantly active brain regions. Both the ALE computation and the coordinate transformation from MNI to Talairach space were implemented with the GingerALE software (www.brainmap.org/ale).
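As a rough, self-contained illustration of this computation (not the GingerALE implementation), the sketch below builds a toy ALE map: each focus is modeled as a 3-D Gaussian, per-voxel probabilities are combined as the chance that at least one focus is active there, and significance is assessed against randomly generated foci. The grid size, FWHM, focus coordinates, and function names are our own assumptions, and for brevity the toy threshold uses the permutation distribution of the maximum ALE value rather than the FDR correction used in the actual analysis.

```python
# Toy sketch of the core ALE computation: model each focus as a 3-D Gaussian,
# combine per-voxel probabilities as P(at least one focus active), then
# threshold against a null distribution built from randomly placed foci.
import numpy as np

rng = np.random.default_rng(0)
GRID = 40                      # toy 40x40x40 voxel grid (assumed)
FWHM = 10.0                    # Gaussian kernel width in voxels (assumed)
SIGMA = FWHM / (2.0 * np.sqrt(2.0 * np.log(2.0)))

# Voxel coordinates, shape (GRID, GRID, GRID, 3), computed once.
coords = np.stack(np.meshgrid(*[np.arange(GRID)] * 3, indexing="ij"), axis=-1)

def ale_map(foci):
    """ALE per voxel: P(at least one focus active) = 1 - prod_i(1 - p_i)."""
    survival = np.ones((GRID, GRID, GRID))
    for focus in foci:
        d2 = np.sum((coords - np.asarray(focus)) ** 2, axis=-1)
        p = np.exp(-d2 / (2.0 * SIGMA ** 2))   # Gaussian "modeled activation" map
        survival *= 1.0 - p
    return 1.0 - survival

def null_threshold(n_foci, n_perm=200, alpha=0.05):
    """Permutation threshold from the maximum ALE of randomly generated foci."""
    maxima = np.empty(n_perm)
    for i in range(n_perm):
        random_foci = rng.integers(0, GRID, size=(n_foci, 3))
        maxima[i] = ale_map(random_foci).max()
    return np.quantile(maxima, 1.0 - alpha)

# Example with a few made-up foci; a real analysis pools foci across studies.
foci = [(10, 12, 11), (11, 13, 10), (30, 8, 25)]
ale = ale_map(foci)
threshold = null_threshold(len(foci))
print("voxels above threshold:", int((ale > threshold).sum()))
```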
We also conducted a subtraction analysis to statistically compare the two types of moral judgment. In this step, the ALE maps of “moral right-wrong judgment – moral dilemma” and “moral dilemma – moral right-wrong judgment” were generated by subtracting the foci of the latter condition from those of the former. The subtraction ALE maps were thresholded with the same permutation procedure as the individual ALE analyses discussed above.
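Continuing the toy sketch above (it reuses ale_map, coords, and rng from the previous block), the snippet below illustrates one way such a subtraction can be tested: the observed difference between the two groups' ALE maps is compared against a null distribution obtained by randomly reassigning the pooled foci to the two groups. The grouping, permutation count, and thresholding choices are assumptions for illustration and differ in detail from GingerALE's implementation.

```python
def ale_subtraction(foci_a, foci_b, n_perm=100, alpha=0.05):
    """Voxels where group A's ALE exceeds group B's beyond chance (toy version)."""
    observed = ale_map(foci_a) - ale_map(foci_b)        # "A - B" difference map
    pooled = list(foci_a) + list(foci_b)
    n_a = len(foci_a)
    max_null = np.empty(n_perm)
    for i in range(n_perm):
        order = rng.permutation(len(pooled))            # random group reassignment
        shuffled = [pooled[j] for j in order]
        diff = ale_map(shuffled[:n_a]) - ale_map(shuffled[n_a:])
        max_null[i] = diff.max()
    return observed > np.quantile(max_null, 1.0 - alpha)

# Example: contrast two small, made-up groups of foci.
mask = ale_subtraction([(10, 12, 11), (11, 13, 10)], [(30, 8, 25), (29, 9, 26)])
print("significant A > B voxels:", int(mask.sum()))
```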
Results
Moral Right-Wrong Judgment Only
With the 100 input foci of the moral right-wrong judgment condition, the meta-analysis revealed 10 clusters (see Table 2 and Figure 1a) with significant activation likelihoods and 16 extreme coordinates, which constitute a distributed network of brain regions involving the frontal lobe, the temporal lobe and the limbic lobe. The highest activation likelihoods were found in the right medial frontal gyrus (BA 9). The left medial frontal gyrus (BA 10) also showed significant activation likelihood.
Apart from the medial frontal gyrus, the frontal lobe showed significant activation likelihoods in the left inferior frontal gyrus (BA 45; BA 47), the left superior frontal gyrus (BA 8) and the left middle frontal gyrus (BA 46). The temporal lobe showed significant probabilities of activity in the middle temporal gyrus (BA 39; BA 21) and the left superior temporal gyrus (BA 38). In addition, the significant areas included the left posterior cingulate (BA 32) and the right anterior cingulate (BA 30).
Moral Dilemma Only
For the moral dilemma decision condition, the 125 input foci produced 14 clusters with significant activation likelihoods and 14 extreme coordinates (see Table 2 and Figure 1b). The coordinates with the highest activation likelihoods appeared in the left cingulate gyrus (BA 31) and the left superior temporal gyrus (BA 39).
Similar to the result from the moral right-wrong judgment condition, our analysis of moral dilemma identified not only the left superior temporal gyrus, but also the left middle frontal gyrus (BA 10) and the left middle temporal gyrus (BA 21) as significantly active regions. Furthermore, the distributed network of moral dilemma decision-making included the left cingulate gyrus (BA 31; BA 32), the right superior temporal gyrus (BA 39; BA 21), the left inferior temporal gyrus (BA 37), the precuneus (BA 7) and the right middle frontal gyrus (BA 10), all of which differ from the results of the moral right-wrong judgment analysis.
Figure 1. Brain regions showing significant activation likelihoods in (a) moral right-wrong judgment and (b) moral dilemma. The clusters with significant probability of activation are shown as the red-yellow highlighted regions. The coordinate of the cross section in Talairach space is shown above each section. The number beside each highlighted region corresponds to the cluster with the highest local ALE value in Table 2.
Moral Right-Wrong Judgment vs. Moral Dilemma Decision-Making
Using the subtraction analysis, we compared the two types of moral judgment statistically. Direct comparison between the two groups revealed distinct patterns of brain activity for each. Compared with moral dilemma decision-making, moral right-wrong judgment showed greater activation probabilities in 8 clusters (see Table 3 and Figure 2a), including the medial frontopolar gyrus (BA 10), the right middle frontal gyrus (BA 21), the left inferior frontal gyrus (BA 47), the right anterior cingulate (BA 32), the left middle frontal gyrus (BA 46) and the left superior frontal gyrus (BA 8). These regions also appeared in the result of 'moral right-wrong judgment only.' More regions showed greater probabilities of activation in the moral dilemma condition than in the moral right-wrong judgment condition. This is consistent with the more complicated decision-making mechanism involved in considering a moral dilemma relative to making a moral right-wrong judgment. In total, 16 clusters and 18 extreme coordinates were found with significantly higher ALE values for moral dilemma (see Table 3 and Figure 2b). The left cingulate gyrus had the highest ALE value among the 16 regions where moral dilemma decision-making had greater probability of activation relative to moral right-wrong judgment, similar to the result of 'moral dilemma only.'
The precuneus (BA 7), the right cuneus (BA 30) and the left fusiform gyrus (BA 37), which were not found in the individual analyses of either group, showed greater activation probabilities in the moral dilemma condition than in the moral right-wrong judgment condition.
Figure 2. Comparison between two groups: (a) Brain regions showing greater activation likelihood in moral right-wrong judgment than in moral dilemma. (b) Brain regions showing greater activation likelihood in moral dilemma than in moral right-wrong judgment. The clusters with significant probability of activations are shown as the red-yellow highlighted regions. The coordinate of the cross section in Talairach space is shown above each section. The number beside the highlighted region corresponds to the coordinate with the highest local ALE value in Table 3.
Discussion
Results of the meta-analysis showed distinct neural correlates corresponding to the two conditions: moral right-wrong judgment and moral dilemma. The findings also reveal the neural correlates common to both categories of moral judgment. These overlapping regions constitute a fundamental neural substrate of general moral judgment across different task designs and stimuli: some regions are active across a number of studies regardless of experimental design, stimuli, or contrast, and are thus active in both moral judgment conditions. The variety of experimental designs and conditions in previous investigations of the neural correlates of moral judgment has identified a number of neural substrates responding to various stimuli, and these sundry results can obscure the consensus revealed by the current meta-analysis. In brief, moral judgment in both right-wrong and moral dilemma conditions involves overlapping regions that include the emotion centers of the brain. In the moral dilemma condition, moral judgment is further facilitated by the ACC, which regulates emotional responses. This regulation allows more deliberate and cognitive substrates to participate in evaluating the content of the dilemma and arriving at a judgment.
Particular Neural Basis of Moral Right-Wrong Judgment
Significant regions unique to moral right-wrong judgment were identified in the right medial frontal gyrus, the bilateral anterior cingulates, the left inferior frontal gyrus, the right middle temporal gyrus, the left superior frontal gyrus and the left posterior cingulate (Table 2).
Our meta-analysis clearly revealed the critical role of the frontal lobe in basic moral judgment tasks. Three of the six regions significant exclusively for moral right-wrong judgment lie in the frontal lobe: the right medial frontal gyrus, the left inferior frontal gyrus and the left superior frontal gyrus. Moral right-wrong judgment simultaneously activated the bilateral medial frontal gyri. Stuss et al., [41] found that patients with right medial frontal lesions are unable to detect deception by a protagonist, a task that requires mental state attribution. Similarly, the results of Decety and Sommerville's study [42] suggested that the right medial frontal cortex is involved in thinking about and perceiving others' behaviors. Based on these results, we inferred that when people judge whether a protagonist's actions are right or wrong from a moral standpoint, they take into account the protagonist's mental state (e.g. belief and intention). Such a mechanism would activate the medial frontal gyrus, especially on its right side.
The exact function of the cluster in the left inferior frontal gyrus is unclear. The cluster lies in the pars triangularis, part of Broca's area, which is relevant to processing the language of the moral stimuli. This supposition is consistent with our subtraction analysis, which showed that the activation likelihood of the left inferior frontal gyrus was higher in the moral dilemma condition than in the moral right-wrong judgment condition (see Table 3), since most moral dilemma stimuli have more complicated and substantial semantic content. Nevertheless, several studies have argued that activation of the left inferior frontal gyrus might correlate with negative emotions. Stimuli of disgusting and immoral scenes [12] and descriptions of embarrassing scenes [43] were both found to significantly activate the inferior frontal gyrus. Here we cannot determine whether the role of the inferior frontal gyrus is related to language processing or to a negative affective state.
The meta-analysis also showed that the left superior frontal gyrus had a different activation characteristic compared with its right counterpart. Farrow et al., [44] showed overlapping activation of the left superior frontal gyrus in both empathic and forgiving judgments when compared with a baseline task. Furthermore, Takahashi et al., [43] found that the left superior frontal gyrus could be significantly activated by both guilt and embarrassment conditions. These results suggest that the left superior frontal gyrus' function in moral judgment is relevant to monitoring one's own or others' emotions, which is consistent with Ruby and Decety's [45] conclusion that the left superior frontal gyrus is crucial in perspective-taking associated with social emotions.
The neural structure active in moral right-wrong judgments also involves the limbic lobe. Two significant regions, the ACC and the posterior cingulate, were identified in the cingulate cortex. According to Bush et al.,'s meta-analysis [46], the ACC can be subdivided into a cognitive division and an affective division, which are activated by tasks associated with complex cognitive processes and with emotional content, respectively. Table 2 and Figure 1 show that the significant cluster in the ACC (cluster #6 in Table 2) is located in the ventral part, which is the affective division of the ACC. Bush and colleagues also concluded that the affective division is primarily responsible for the assessment of emotional information and the regulation of emotional responses. This result is consistent with Berthoz et al.'s finding [25] that an intentional moral transgression incurred more significant activation of the anterior cingulate gyrus than an accidental moral transgression, which elicited relatively less negative moral emotion.
Greene et al., [6] found the posterior cingulate was activated in all of their experimental comparisons of moral judgment tasks (e.g. personal vs. impersonal, difficult vs. easy, utilitarian vs. non-utilitarian), implying that the posterior cingulate plays a fundamental role in general moral tasks. In addition, a considerable number of studies emphasize the posterior cingulate gyrus' important role in emotional tasks. Vogt et al., proposed that this region may respond to the emotional content of events, especially in a self-relevant condition [47]. Greene and Haidt [4] considered a function of this region to be the integration of emotion, imagery and memory - a conclusion further supported by Harenski and Hamann [48], and Maddock [49].
In addition to the frontal and limbic lobes, moral right-wrong judgment exclusively activates the right middle temporal gyrus in the temporal lobe. The right middle temporal gyrus is activated without its left counterpart when processing negative facial expressions such as fear and disgust [50]. This finding suggests that the middle temporal gyrus is sensitive to negative emotion-arousing content in moral right-wrong judgment, in contrast to the left posterior middle temporal gyrus' role in semantic processing.
Particular Neural Basis of Decision-Making in Moral Dilemma
Moral dilemmas involved five exclusive significant regions: the left cingulate gyrus, the right superior temporal gyrus, the left precuneus, the left inferior temporal gyrus and the right middle frontal gyrus. In comparison with the moral right-wrong judgment condition, the moral dilemma condition's active brain areas are more widely dispersed throughout the brain, involving the limbic lobe, the parietal lobe, the temporal lobe and the frontal lobe.
There are two distinct active areas in the limbic lobe. Table 2 shows that the left cingulate gyrus' four clusters are located in two regions, BA 31 and BA 32, corresponding to the dorsal posterior cingulate gyrus and the anterior cingulate gyrus, respectively. In our subtraction analysis (Table 3: moral dilemma > moral right-wrong judgment), these two areas showed significantly larger activation likelihoods in moral dilemma tasks than in moral right-wrong judgment tasks. The larger activation likelihood of the dorsal posterior cingulate gyrus can be explained by the posterior cingulate gyrus' emotion-related role mentioned above. Since most moral right-wrong judgment studies included both negative and positive moral stimuli, such blends inevitably reduced the statistical significance of a region associated with negative emotions. In contrast, the stimuli of moral dilemma studies were relatively homogeneous in emotional content; they all described unpleasant, dilemmatic situations. The larger activation likelihood of the dorsal posterior cingulate gyrus in moral dilemma judgments is therefore reasonable.
For BA 32, as discussed above, the ACC has two functional partitions: cognitive and affective. The moral dilemma condition's two significant clusters in the ACC (cluster #4 and cluster #11 in Table 2) both lie in the cognitive division. Greene and colleagues have demonstrated the relationship between activation of the ACC and the need for higher cognitive control [6]. Moreover, in some difficult moral dilemma tasks, the ACC was found to be recruited when the subject was making decisions that would incur a strong personal moral violation, suggesting that cognitive conflict calls for an ability to guide one's thought and action according to one's intentions [6]. This conclusion is consistent with the fact that the ACC (BA 32) has a significantly higher activation likelihood in moral dilemma tasks than in moral right-wrong judgment tasks. The temporal lobe encompasses three distinct active areas in moral dilemma judgments, making it a key neural correlate distinguishing moral dilemma processing from moral right-wrong judgment.
Two clusters in the right superior temporal gyrus were found in separate areas: BA 39 and BA 21. With respect to BA 39, Moll et al.'s findings [20] showed that only unpleasant moral statements activated this area, whereas unpleasant nonmoral statements had no effect on its activation; this shows that BA 39 is not merely sensitive to emotional content. Greene and Haidt [4] argued that BA 39 supports representations of socially significant movements (e.g. complex representations of personhood), and a later study demonstrated that BA 39 is more active in processing personal moral dilemmas than impersonal moral dilemmas, and difficult moral scenarios than simple moral scenarios [6]. That finding is consistent with results from Borg and colleagues [13] showing that the activation of BA 39 is closely related to thought-provoking, first-time moral judgments that require executive resources. Taken together, the findings suggest the activation of BA 39 is due to the need for executive resources in navigating complex and competing moral norms, and that this is particularly so when the scenario under consideration concerns oneself. Consistent with these findings, our analysis found greater activation of BA 39 in moral dilemma conditions than in moral right-wrong judgment conditions.
For the cluster in BA 21, we found that it spanned the right middle temporal gyrus and the right superior temporal gyrus, which are divided by the anterior superior temporal sulcus (STS). We posit that this active cluster implies significant activation of the STS, especially its anterior part. Existing studies associate the activation of the anterior STS with a 'theory of mind' mechanism [51,52]. Borg and colleagues [31] identified the right anterior STS's activation in the interaction between morality and intention, showing that responses in a moral judgment task were affected by whether the moral violation was intentional or unintentional. This may explain why the activation likelihood of BA 21 is greater in moral dilemma conditions than in moral right-wrong judgment conditions (Table 3), as moral dilemma tasks involve assuming the position of the decision-maker, making an informed and willful choice, and, at least within the context of the counterfactual scenario, taking on the responsibility of the decision as an autonomous agent.
Another significant temporal lobe cluster active in moral dilemma conditions was located at BA 37, stretching from the inferior temporal gyrus to the fusiform gyrus. Moreover, significant likelihoods for the left inferior temporal gyrus and the left fusiform gyrus both appeared in the result of "moral dilemma > moral right-wrong judgment" (Table 3). Borg and colleagues' [13] study showed that BA 37 was not only more active for incest-related immoral stimuli relative to the nonsexual immoral condition, but also the most significantly activated in a pathogen-related disgust condition. The results revealed that the inferior temporal/fusiform gyrus' activations were positively correlated with the degree of disgust [see also 53].
In addition to the limbic and temporal lobes, the parietal and frontal lobes have significant areas of activation in moral dilemma conditions. Studies by Saxe and colleagues [51,52] not only identified the anterior STS, but also implicated the precuneus in a 'theory of mind' mechanism. In taking on the role of decision-maker in a moral dilemma judgment task, participants engage in theory of mind by taking on the role of a moral agent, even if that agent is themselves, in a counterfactual scenario. Young et al., showed that the precuneus was activated in order to process the moral agent's prior belief when the subject was judging whether the moral agent's behavior was morally permissible or forbidden [54,55]. Cavanna and Trimble's review [56] presented a similar finding, and summarized the precuneus' three relevant functions as visual imagery, episodic memory retrieval and, importantly, self-processing (e.g. taking a first-person perspective). Such first-person perspective-taking is in all probability part of a 'theory of mind' mechanism, namely, imagining "if I were the protagonist in this scenario" in order to evaluate a moral agent's behavior.

Activation of the right middle frontal gyrus reflects brain lateralization in moral judgment. While significant activation likelihood of the left middle frontal gyrus was found in both moral tasks, the right middle frontal gyrus was only identified in moral dilemma conditions. Our ALE subtraction analysis showed that the right middle frontal gyrus' activation likelihood in moral dilemma judgment is significantly greater than that in moral right-wrong judgment. This result may be due to an inhibitory control mechanism that is activated when the subject is facing a forced choice in a dilemmatic scenario. Support for this view comes from Garavan et al., [57], who found right hemispheric dominance of inhibitory control, with the right middle frontal gyrus included in its right-lateralized dominant neural circuit. Activation is thought to indicate resistance to interference from strong emotional impulses when the subject is attempting to make a rational decision. In other words, the activation of the right middle frontal gyrus reflects the conflict and competition between an abstract, rational processing system and an intuitive emotion-processing system.
Overlapping Results for Two Types of Moral Judgments
Our findings also showed that the regions of engagement common to the two types of moral judgment include the left medial frontal gyrus, the left middle frontal gyrus, the left middle temporal gyrus and the left superior temporal gyrus. These overlapping regions constitute a fundamental neural substrate of general moral judgment across different task designs and stimuli.
There are two frontal lobe regions active in both conditions: the left medial frontal gyrus and the left middle frontal gyrus. Following Greene and Haidt [4], we posit that the left medial frontal gyrus' activation indicates processing of both decision-making and emotional factors. Greene and Haidt concluded that activation of the medial frontal gyrus was associated with multiple moral tasks in addition to moral judgment, and considered the likely function of the medial frontal gyrus to be combining emotional factors with the process of decision-making or theory-of-mind function. The role of the medial frontal gyrus in theory of mind processing is further evidenced by Farrow et al.'s [44] study, which found activation in the structure when one is judging others' emotional states and the forgivability of their crimes. This dual role explains why the structure would be active in both conditions, with emotional responses corresponding to moral right-wrong processing and theory of mind to moral dilemma processing.
Another frontal lobe region active in both conditions, the left middle frontal gyrus, significantly overlaps with the DLPFC and is often considered part of the DLPFC [4,6]. Previous studies have demonstrated middle frontal gyrus/DLPFC activity in each moral judgment condition separately. Greene and colleagues found that activity in the DLPFC was positively correlated with a subject's utilitarian judgments in moral dilemma tasks. Schleim and colleagues [35] found that both moral right-wrong judgment and legal judgment activate the left middle frontal gyrus. Interestingly, activation in the legal condition was stronger, a result thought to be due to the more explicit rules and more complicated semantic processing required by the legal stimuli. Though these studies revealed correlations between left middle frontal gyrus activity and both moral dilemma and right-wrong judgments, the exact function of the left middle frontal gyrus in moral judgment remains elusive.
Determination of the precise role of the left superior temporal gyrus in moral judgment likewise requires further investigation. Schleim and colleagues [35] posited that activations of the left superior temporal gyrus contributed to the analysis of the goals and intentions of the moral agent. However, because the left superior temporal gyrus is co-located with Wernicke's area, a region active in processing semantic content, the possibility that the activation is due to the semantic content of the stimulus rather than the moral content cannot be ruled out.
The other significant region located in the temporal lobe is the left middle temporal gyrus, or more precisely, the left posterior middle temporal gyrus. Previous studies associated its activation with semantic perception [58]. Damage to the middle temporal gyrus can result in alexia and agraphia [59,60]. A number of studies show that the left middle temporal gyrus is important in processing motion-related verbs in sentences. Wallentin et al., [61] speculated that the left posterior middle temporal region's activity is a result of actively constructing the mental space set up by the sentence. These findings suggest that the left middle temporal region is engaged in processing the motion verbs in the moral stimuli so as to carry out the depicted action in the mind, rather than in processing any specifically moral content.
Multiple Moral Brains of Moral Judgment
Table 4 summarizes all the clusters with significant activation likelihoods for each type of moral judgment in our meta-analysis. As the column of possible roles in Table 4 shows, the regions active in both moral right-wrong judgment and moral dilemma judgment conditions constitute a fundamental neural substrate for processing moral stimuli and issuing moral judgments. In this fundamental moral circuitry, neural regions are activated to complete distinct cognitive procedures, such as semantic perception (left middle temporal gyrus), intention analysis (left superior temporal gyrus) and the processing of normative and semantic content. Only one overlapping region is associated with emotion-related function (left medial frontal gyrus). In contrast, the brain regions active in only one of the moral judgment conditions reflect differences in the advanced structures of the two types of moral judgment. In other words, the two sets of condition-specific regions constitute two different moral substrates.
Of the six regions particular to moral right-wrong judgment, five are emotion-related. This implies that moral right-wrong judgment consists largely of emotion-based responses. This finding gives some support to emotivist theories that take initial moral judgments to be emotional responses that are only later supported with post hoc rationales [1,3]. The finding is further consistent with anthropological accounts that postulate at least two distinct neural substrates for moral thinking: an advanced and evolutionarily recent human substrate involved in processing semantically complex systems of norms and contexts including dilemmas, and a more primitive, emotion-based substrate held in common with other social animals [6].
The neural substrate active in moral dilemma conditions relies on more complicated structures, especially those associated with complex decision-making functions. As shown in Table 4, we observed that moral dilemma conditions implicate thought-provoking deliberation (i.e. the right superior temporal gyrus), cognitive control (i.e. the left anterior cingulate gyrus), belief and intention reasoning (i.e. the right superior temporal gyrus) and inhibitory control (i.e. the right middle frontal gyrus), most of which play important roles in facilitating decision-making in dilemmatic situations. In particular, the activations of the right middle frontal gyrus and the left anterior cingulate gyrus reflect the conflict and competition between emotional and cognitive reasoning when a subject in a moral predicament is making a hard trade-off to reach a decision. The structures active in making judgments in moral dilemmas are more complex, more recent in terms of evolutionary development, and correlated with higher cognitive functions such as processing semantic content and exercising executive control.

From a neural perspective, a person has multiple, distinct moral substrates which are active in processing specific moral tasks, but which share a common fundamental structure. In our view, the neural correlates of moral right-wrong judgment and moral dilemma share a structure primarily engaged in basic cognitive processing. Beyond that, the neural substrate of moral right-wrong judgment has additional modules for emotion processing, while the neural substrate of moral dilemma judgment has additional modules for complex decision-making. This distinction should inform future experiment design and stimulus selection in investigations that measure neural responses in moral judgment.
Limitations of This Meta-Analysis
The ALE meta-analysis assigned equal weight to each study from which foci were collected, meaning that all included coordinates contribute equally in determining the activation likelihood of a focal point. Hence, the different data selection criteria of different studies may introduce statistical bias. Specifically, different studies may define entirely different thresholds for filtering the reported coordinates. For instance, some moral judgment studies focusing on specific regions of interest adopt a relatively lenient threshold so as not to omit activity in those regions. Such a threshold might result in over-representation of these regions relative to the threshold selected for a whole-brain study. Generally speaking, a low threshold usually yields a larger number of foci and a larger scope of active regions, while a conservative threshold produces fewer foci and a smaller scope. Mixing low-threshold and high-threshold data with equal weight can therefore introduce statistical bias.
Moreover, the number of studies used as data sources was limited by the low number of available neuroimaging studies of moral decision-making that fit the analysis parameters. In addition, the heterogeneity of experimental subjects across the various studies, including differences in gender and age, may affect the results of the meta-analysis. Several previous studies have pointed out the effects of these factors, such as Harenski et al.,'s 2008 [27] investigation of gender differences in the neural mechanisms of moral judgment. A larger sample size would help to offset these effects.
Conclusion
We decomposed the general moral judgment framework into two specific sub-models. Our hypotheses, though supported, were incomplete. We confirmed that moral right-wrong judgments largely activate emotion centers in the brain and that moral dilemma decision-making activates more complex rational decision-making structures. Beyond these hypotheses, we found that moral dilemma decision-making also involves the emotional centers of the brain, but that these emotional responses are countered and suppressed by the activation of neural substrates necessary for emotional control and executive function while more deliberative and rational consideration of competing norms takes place. Since these two types of moral judgment engage considerably distinct neural mechanisms, it is inappropriate to place them under the same theoretical framework. The consensus demonstrated by the meta-analysis shows that there are distinct neural substrates for at least two types of moral judgment, but there are, of course, many types of moral judgment, and many types of moral processing and emotion apart from judgment. While a number of studies have demonstrated the neural correlates of specific moral emotions and responses, including, most notably, disgust, much remains to be done.
The formation of a consensus around the neural correlates of moral judgment demonstrates a broader point with more far-reaching implications; namely, that given the neural basis for moral judgments and, by extension, behaviors, what we take to be aberrations in behavior may be due, more often than we suppose, to neurological differences. In the most drastic cases, where neurological damage is clear, localized, and coextensive with equally evident behavioral changes, conclusions are relatively straightforward. When this is the case, we adjust our measure of responsibility, both legally and morally, to whatever level of control we suppose the subject capable of, replacing legal punishments with clinical treatments where available. Beyond these considerations, we hope to carve out spaces of inclusion for those with neurological differences as an application of neurodiversity. Yet, without more extensive and exacting knowledge of the precise neural substrates of moral processing and judgment, and of the degree of variation between individuals, it is difficult to know when behavioral differences are the result of neurological differences, and to what extent such allowances are warranted.
References
- Haidt J (2001) The emotional dog and its rational tail: A social intuitionist approach to moral judgment. Psychological Review 108(4): 814–834.
- Kohlberg L (1981) Essays on Moral Development: The Philosophy of Moral Development. (1st edn), Harper & Row, San Francisco, CA. 1: 441.
- Kunda Z (1990) The case for motivated reasoning. Psychological Bulletin 108(3): 480-498.
- Greene J D, Haidt J (2002) How (and where) does moral judgment work? TRENDS in Cognitive Science 6(12): 517–523.
- Bonath B, Tegelbeckers J, Wilke M, Flechtner HH, Krauel K (2016) Regional gray matter volume differences between adolescents with ADHD and typically developing controls: further evidence for anterior cingulate involvement. Journal of Attention Disorders. 1-12.
- Greene JD, Nystrom LE, Engell AD, Darley JM, Cohen JD (2004) The neural bases of cognitive conflict and control in moral judgment. Neuron 44(2): 389–400.
- Ciaramelli E, Muccioli M, Làdavas E, di Pellegrino G (2007) Selective deficit in personal moral judgment following damage to ventromedial prefrontal cortex. Social Cognitive and Affective Neuroscience 2(2): 84–92.
- Koenigs M, Young L, Adolphs R, Tranel D, Cushman F, et al., (2007) Damage to the prefrontal cortex increases utilitarian moral judgments. Nature 446(7138): 908–911.
- Pizarro D, Bloom P (2003) The Intelligence of the Moral Intuitions: Comment on Haidt (2001). Psychological Review 110(1): 193–196.
- Greene JD, Sommerville RB, Nystrom LE, Darley JM, Cohen JD (2001) An fMRI investigation of emotional engagement in moral judgment. Science 293(5537): 2105–2108.
- Greene JD (2007) Why are VMPFC patients more utilitarian? A dualprocess theory of moral judgment explains. TRENDS in Cognitive Science 11(8): 322–323.
- Moll J, De Oliveira-Souza R, Zahn R (2008) The neural basis of moral cognition. Annals of the New York Academy of Sciences 1124: 161–180.
- Borg JS, Lieberman D, Kiehl KA (2008) Infection, incest, and iniquity: investigating the neural correlates of disgust and morality. J Cogn Neurosci 20(9): 1529–1546.
- Haidt J, Rozin P, McCauley C, Imada S (1997) Body, psyche, and culture: The relationship between disgust and morality. Psychology & Developing Societies 9(1): 107-131.
- Lieberman D, Tooby J, Cosmides L (2003a) The evolution of human incest avoidance mechanisms: An evolutionary psychological approach. Evolution and the Moral Emotions: Appreciating Edward Westermarck. Stanford University Press, Stanford, CA.
- Lieberman D, Tooby J, Cosmides L (2003b) Does morality have a biological basis? An empirical test of the factors governing moral sentiments relating to incest. Proceedings of the Royal Society of London B: Biological Sciences 270(1517): 819-826.
- Moll J, Zahn R, De Oliveira-Souza R, Krueger F, Grafman J (2005) The neural basis of human moral cognition. Nature Reviews Neuroscience 6: 799–809.
- Eslinger PJ, Robinson-Long M, Realmuto J, Moll J, de Oliveira-Souza R, et al., (2009) Developmental frontal lobe imaging in moral judgment: Arthur Benton’s enduring influence 60 years later. Journal of Clinical and Experimental Neuropsychology 31(2): 158–169.
- Turkeltaub PE, Eden GF, Jones KM, Zeffiro TA (2002) Meta-analysis of the functional neuroanatomy of single-word reading: method and validation. Neuroimage 16(3): 765–780.
- Moll J, Eslinger PJ, Oliveira-Souza R (2001) Frontopolar and anterior temporal cortex activation in a moral judgment task: Preliminary functional MRI results in normal subjects. Arq Neuropsiquiatr 59(3): 657–664.
- Moll J, De Oliveira-Souza R, Eslinger PJ, Bramati IE, Mourao-Miranda J, et al., (2002) The neural correlates of moral sensitivity: A functional magnetic resonance imaging investigation of basic and moral emotions. The Journal of Neuroscience 22(7): 2730–2736.
- Kouichi Takahashi (2004) Multi-algorithm and multi-timescale cell biology simulation: Requirements analysis, algorithm design, and software implementation. 13-193.
- Heekeren HR, Wartenburger I, Schmidt H, Schwintowski HP, Villringer A (2005) Influence of bodily harm on neural correlates of semantic and moral decision-making. Neuroimage 24(3): 887–897.
- Luo Q, Nakic M, Wheatley T, Richell R, Martin A, Blair RJ (2006) The neural basis of implicit moral attitude-An IAT study using event-related fMRI. NeuroImage 30(4): 1449–1457.
- Berthoz S, Grèzes J, Armony JL, Passingham RE, Dolan RJ (2006) Affective response to one's own moral violations. NeuroImage 31(2): 945–950.
- Takahashi H, Kato M, Matsuura M, Koeda M, Yahata N, et al., (2008) Neural correlates of human virtue judgment. Cerebral Cortex 18(8): 1886–1891.
- Harenski CL, Antonenko O, Shane MS, Kiehl KA (2008) Gender differences in neural mechanisms underlying moral sensitivity. Social Cognitive and Affective Neuroscience 3(4): 313–321.
- Prehn K, Wartenburger I, Mériau K, Scheibe C, Goodenough OR, et al., (2008) Individual differences in moral judgment competence influence neural correlates of socio-normative judgments. Social Cognitive and Affective Neuroscience 3(1): 33–46.
- Harenski CL, Antonenko O, Shane MS, Kiehl KA (2010) A functional imaging investigation of moral deliberation and moral intuition. NeuroImage 49(3): 2707–2716.
- Tsukiura T, Cabeza R (2010) Shared brain activity for aesthetic and moral judgments: implications of the beauty-is-good stereotype. Social Cognitive and Affective Neuroscience 6(1): 138–148.
- Borg JS, Hynes C, Van Horn J, Grafton S, Sinnott-Armstrong W (2006) Consequences, action, and intention as factors in moral judgments: An fMRI investigation. Journal of Cognitive Neuroscience 18(5): 803–817.
- Pérez-Alvarez F, Timoneda C (2007) An fMRI study of emotional engagement in decision-making. Transaction on Advanced Research 2: 45–51.
- Cikara M, Farnsworth RA, Harris LT, Fiske ST (2010) On the wrong side of the trolley track: Neural correlates of relative social valuation. Social Cognitive and Affective Neuroscience 5(4): 404-413.
- Shenhav A, Greene J (2010) Moral Judgments Recruit Domain-General Valuation Mechanisms to Integrate Representations of Probability and Magnitude. Neuron 67(4): 667–77.
- Schleim S, Spranger TM, Erk S, Walter H (2011) From moral to legal judgment: the influence of normative context in lawyers and other academics. Social Cognitive Affective Neuroscience 6(1): 48–57.
- Talairach J, Tournoux P (1988) Co-planar Stereotaxic Atlas of the Human Brain: 3-Dimensional Proportional System: An Approach to Cerebral Imaging. (1st edn). New York: Thieme. 132.
- Laird AR, Fox M, Price CJ, Glahn DC, Uecker AM, et al., (2005) ALE meta-analysis: Controlling the false discovery rate and performing statistical contrasts. Human Brain Mapping 25(1): 155–164.
- Eickhoff SB, Laird AR, Grefkes C, Wang LE, Zilles K, et al., (2009) Coordinate-based activation likelihood estimation meta-analysis of neuroimaging data: A random-effects approach based on empirical estimates of spatial uncertainty. Human Brain Mapping 30(9): 2907–2926.
- Krain AL, Wilson AM, Arbuckle R, Castellanos FX, Milham MP (2006) Distinct neural mechanisms of risk and ambiguity: A meta-analysis of decision-making. Neuroimage 32(1): 477–484.
- Sörös P, Inamoto Y, Martin RE (2009) Functional brain imaging of swallowing: an activation likelihood estimation meta-analysis. Human Brain Mapping 30(8): 2426–2439.
- Stuss DT, Gallup GGJr, Alexander MP (2001) The frontal lobes are necessary for theory of mind. Brain 124(2): 279-286.
- Decety J, Sommerville JA (2003) Shared representations between self and others: A social cognitive neuroscience view. Trends in Cognitive Science 7(12): 527–533.
- Takahashi H, Yahata N, Koeda M, Matsuda T, Asai K, Okubo Y (2004) Brain activation associated with evaluative processes of guilt and embarrassment: An fMRI study. Neuroimage 23(3): 967–974.
- Farrow TF, Zheng Y, Wilkinson ID, Spence SA, Deakin JF, et al., (2001) Investigating the functional anatomy of empathy and forgiveness. Neuroreport 12(11): 2433–2438.
- Ruby P, Decety J (2004) How would you feel versus how do you think she would feel? A neuroimaging study of perspective taking with social emotions. Journal of Cognitive Neuroscience 16(6): 988–999.
- Bush G, Luu P, Posner MI (2000) Cognitive and emotional influences in anterior cingulate cortex. Trends in Cognitive Sciences 4(6): 215–222.
- Vogt BA, Vogt L, Laureys S (2006) Cytology and functionally correlated circuits of human posterior cingulate areas. Neuroimage 29(2): 452-466.
- Westen D, Blagov PS, Harenski K, Kilts C, Hamann S (2006) Neural bases of motivated reasoning: An fMRI study of emotional constraints on partisan political judgment in the 2004 US presidential election. Journal of Cognitive Neuroscience 18(11): 1947-1958.
- Maddock RJ, Garrett AS, Buonocore MH (2003) Posterior cingulate cortex activation by emotional words: fMRI evidence from a valence decision task. Human Brain Mapping 18(1): 30–41.
- Phillips ML, Young AW, Scott SK, Calder AJ, Andrew C, et al., (1998) Neural responses to facial and vocal expressions of fear and disgust. Proceedings of the Royal Society - Biological Sciences 265(1408): 1809–1817.
- Saxe R, Kanwisher N (2003) People thinking about thinking people: fMRI investigations of theory of mind. Neuroimage 19(4): 1835–1842.
- Saxe R, Carey S, Kanwisher N (2004) Understanding other minds: Linking developmental psychology and functional neuroimaging. Annual Review of Psychology 55: 87–124.
- Stark R, Zimmermann M, Kagerer S, Schienle A, Walter B, et al., (2007) Hemodynamic brain correlates of disgust and fear ratings. Neuroimage 37(2): 663–673.
- Young L, Cushman F, Hauser M, Saxe R (2007) The neural basis of the interaction between theory of mind and moral judgment. Proceedings of the National Academy of Sciences of the United States of America 104(20): 8235-8240.
- Young L, Saxe R (2008) The neural basis of belief encoding and integration in moral judgment. Neuroimage 40(4): 1912–1920.
- Cavanna AE, Trimble MR (2006) The precuneus: A review of its functional anatomy and behavioural correlates. Brain 129(3): 564–583.
- Garavan H, Ross TJ, Stein EA (1999) Right hemispheric dominance of inhibitory control: an event-related functional MRI study. Proceedings of the National Academy of Sciences of the United States of America 96(14): 8301–8306.
- Martin A, Haxby JV, Lalonde FM, Wiggs CL, Ungerleider LG (1995) Discrete cortical regions associated with knowledge of color and knowledge of action. Science 270(5233): 102–105.
- Kantha SS (1992) Albert Einstein’s dyslexia and the significance of Brodmann area 39 of his left cerebral cortex. Medical Hypotheses 37(2): 119–122.
- Sakurai Y, Mimura I, Mannen T (2008) Agraphia for kanji resulting from a left posterior middle temporal gyrus lesion. Behavioural Neurology 19(3): 93–106.
- Wallentin M, Lund TE, Ostergaard S, Ostergaard L, Roepstorff A (2005) Motion verb sentences activate left posterior middle temporal cortex despite static context. Neuroreport 16(6): 649-652.