Educational Psychology and Interventions

Growth Mindset: What the Meta-Analyses Show

Published: July 23, 2025

Few ideas in education have spread as widely as Carol Dweck’s growth mindset — the proposal that believing intelligence is malleable rather than fixed produces better academic outcomes. Schools, corporate training, and parenting books have built programs around it, often presenting growth mindset as a transformative force. The honest answer to “does it actually work” is more bounded: meta-analyses converge on small average effects (typical Cohen’s d in the 0.05–0.10 range, far below early laboratory estimates), publication bias inflated earlier numbers, and the modest signal that does exist is concentrated among low-achieving students in supportive environments. Growth mindset is a real but minor lever in education, not a substitute for high-quality instruction or for the structural conditions that drive most learning outcomes.

What does growth mindset theory actually claim?

Dweck and Leggett (1988), in Psychological Review, formalized what they called implicit theories of intelligence — the beliefs people hold about whether intellectual ability is fixed or developable. Their framework distinguishes two orientations:

  • Fixed mindset (entity theory): intelligence is a stable trait; you either have it or you don’t. People holding this view tend to avoid challenges that might reveal limits, give up faster on difficult tasks, and read failure as evidence of permanent inadequacy.
  • Growth mindset (incremental theory): intelligence is developable through effort, feedback, and persistence. People holding this view tend to seek challenges, persist through setbacks, and treat failure as informative rather than disqualifying.

The mechanism predicted by the theory is behavioral: students with growth mindsets engage more productively with the learning process — choosing harder tasks, asking for feedback, sustaining effort — and that engagement compounds into better academic outcomes. Mueller and Dweck (1998), in the Journal of Personality and Social Psychology, ran a now-canonical experiment showing that praising children for effort (“you worked hard”) rather than ability (“you’re so smart”) led to greater challenge-seeking and better post-failure performance. Effects in the original laboratory studies were striking, and the framework gained rapid traction in educational psychology.

What did the early meta-analyses show?

The first comprehensive meta-analytic synthesis was Burnette et al. (2013), in Psychological Bulletin. Pooling 113 effect sizes from 28,217 participants, the authors found that growth mindset (incremental implicit theory) was positively associated with self-regulatory processes — goal-setting, monitoring, and effort — with effect sizes in the small-to-moderate range. The paper concluded that mindset “matters” for self-regulation and treated the framework’s empirical foundation as solid.

This early synthesis was the high-water mark for the field. Subsequent meta-analyses, conducted with growing samples and more rigorous attention to publication bias and intervention quality, produced systematically smaller effect estimates.

What did the rigorous large-scale studies find?

Three landmark analyses drove the revision downward. They are worth describing in turn because the headline numbers from each are commonly mis-cited.

Sisk et al. (2018) — Psychological Science. Two meta-analyses in one paper. The first synthesized 273 studies on the cross-sectional correlation between mindset and academic achievement; the average effect was r = 0.10 — small. The second synthesized 43 intervention studies; the average effect of mindset interventions on academic outcomes was Cohen’s d = 0.08 — small in absolute terms and concentrated in studies of at-risk students. Sisk et al. were measured: the effects existed, they were not zero, but they were far smaller than the popular discourse suggested.

Yeager et al. (2019) — Nature: the National Study of Learning Mindsets. The largest randomized trial of a growth mindset intervention ever conducted: 12,490 ninth-graders across 65 nationally representative U.S. high schools, randomly assigned to a brief online mindset module (~50 minutes total) or a control condition. The headline result: the intervention raised core academic GPA by 0.03 grade points across the full sample. The effect was concentrated in lower-achieving students (≈0.10 grade points) and in schools with peer norms supportive of challenge-seeking; it was essentially zero for higher-achieving students and in unsupportive school contexts. Yeager et al. were carefully neutral in their framing: the intervention worked for some students some of the time, in conditions that were specifiable. The broader implication — that a brief, scalable belief-change intervention could close achievement gaps at population scale — was substantially weaker than mid-2010s rhetoric had suggested.

Macnamara & Burgoyne (2023) — Psychological Bulletin. Sixty-three intervention studies, with explicit modeling of publication bias and study-quality moderators. The corrected average effect size was Cohen’s d = 0.05 — closer to zero than to the original Burnette estimate. The authors found that studies with positive results were systematically more likely to be published than studies with null results, that authors with financial ties to mindset interventions reported larger effects than those without, and that earlier meta-analytic estimates had been inflated by these asymmetries. After publication-bias correction, the average effect was statistically detectable but practically negligible.

Burnette et al. (2023) — Psychological Bulletin, same issue. A parallel meta-analysis published alongside Macnamara & Burgoyne, using different methodology and reaching more favorable conclusions. Burnette et al. emphasized the heterogeneity of effects across populations and implementation conditions: their headline framing was not “does growth mindset work” but “for whom, how, and why might such interventions work.” They reported small overall effects but argued these averages obscured meaningful variation — interventions worked better for low-SES students, students with more depressed mood, students at academic transitions, and when implemented with fidelity. The two meta-analyses, published back-to-back in the same journal, reached substantively different bottom-line interpretations of largely overlapping data.

Tipton et al. (2023) — Psychological Bulletin, commentary. A methodological commentary by Tipton, Bryan, Murray, McDaniel, Schneider, and Yeager critiquing both 2023 meta-analyses. They argued that traditional aggregate-then-test meta-analytic methods are poorly suited to interventions like growth mindset, where effects vary substantially across populations and contexts. Their contention: the right question is not “what is the average effect” but “where, for whom, and under what conditions does the effect operate” — and answering that requires multilevel meta-regression rather than the single-effect-plus-moderators approach both 2023 papers used in different forms. The commentary positions the skeptical Macnamara & Burgoyne conclusion as partly an artifact of method choice rather than of the underlying data alone.

| Meta-analysis | k studies | Effect size | Position |
| --- | --- | --- | --- |
| Burnette et al. (2013) — self-regulation | 113 effects | r ≈ 0.20 | Favorable (early) |
| Sisk et al. (2018) — correlational | 273 | r = 0.10 | Modest, with moderators |
| Sisk et al. (2018) — interventions | 43 | d = 0.08 | Modest, larger for at-risk |
| Yeager et al. (2019) — national RCT | 1 (large) | 0.03 GPA avg; 0.10 in low-achievers | Bounded, context-dependent |
| Burnette et al. (2023) — interventions | 53 | small overall, larger in subgroups | “For whom, how, why” |
| Macnamara & Burgoyne (2023) — interventions | 63 | d = 0.05 (bias-corrected) | Skeptical |
| Tipton et al. (2023) — commentary | — | — | Both 2023 papers methodologically inadequate |

The defensible synthesis as of 2024: average effects of growth mindset interventions on academic achievement are small (Cohen’s d generally below 0.10), the field is in an active methodological dispute about how to model that average versus the heterogeneity around it, and even the most favorable interpretations concentrate the meaningful effects in identifiable subgroups (low-achieving students, transitional academic periods, supportive school contexts) rather than in the average student. The most skeptical interpretation (Macnamara & Burgoyne, near-zero after publication-bias correction) and the most favorable (Burnette et al. and Tipton et al., real-but-heterogeneous) are not as far apart as the discourse suggests — both agree that population-average effects are small and that meaningful effects exist in specifiable subgroups. They disagree on whether to lead with the small average or the conditional effects.

To put any of these numbers in context: a Cohen’s d of 0.05–0.10 explains roughly 0.06–0.25% of the variance in academic outcomes. By comparison, IQ explains 25–36% of variance in academic achievement (Schmidt & Hunter, 1998; Strenze, 2007), and educational interventions like high-quality tutoring routinely produce d = 0.40–0.80 — an order of magnitude larger than mindset under any meta-analytic specification.
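The variance figures above follow from the standard conversion between Cohen’s d and a correlation coefficient, r = d / √(d² + 4) for equal-sized groups, with variance explained given by r². A minimal sketch of the arithmetic (the loop values are the d range discussed in this article, not results from any single study):

```python
import math

def d_to_r(d: float) -> float:
    """Convert Cohen's d to a point-biserial r, assuming equal group sizes:
    r = d / sqrt(d^2 + 4)."""
    return d / math.sqrt(d ** 2 + 4)

# The d range discussed above, plus the variance each value explains (r^2).
for d in (0.05, 0.08, 0.10):
    r = d_to_r(d)
    print(f"d = {d:.2f}  ->  r = {r:.3f}, variance explained = {r ** 2:.2%}")
```

Running this reproduces the roughly 0.06–0.25 percent variance-explained range quoted above for d between 0.05 and 0.10.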

Why are the effects so much smaller than the laboratory results?

Several factors explain why effects shrank dramatically as the field moved from controlled experiments to scaled intervention research.

  • Beliefs alone don’t change behavior. Believing intelligence is malleable is necessary but not sufficient for productive learning. Students also need adequate prior knowledge, effective study strategies, environmental support, and time. Brief mindset modules deliver only the belief component without the behavioral scaffolding that would translate it into different study behavior.
  • Dosage is small. Most scalable mindset interventions are 30–90-minute online modules. Asking such minimal exposure to durably shift beliefs that have been forming for years is empirically unrealistic. The laboratory experiments that produced larger effects typically involved more intensive, multi-session protocols.
  • Ceiling effects on baseline mindset. Many students already hold a partially growth-oriented mindset before any intervention. In populations where baseline mindset is already moderate-to-high, the room for intervention-driven change is small.
  • Cognitive ability constraints. Effort and persistence cannot fully compensate for differences in fluid reasoning, working memory, and processing speed. Research on the genetic and environmental origins of cognitive abilities documents the substantial heritability of intelligence — a constraint that bounds how much belief change can reshape outcomes.
  • Publication bias. Macnamara & Burgoyne (2023) found systematic asymmetry in which studies got published. Positive results were over-represented; null results were filed away. This inflated the apparent effect in earlier meta-analyses and was correctable only when bias-modeling techniques were applied.
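The publication-bias mechanism in that last point is easy to demonstrate with a toy simulation. The sketch below assumes an illustrative true effect of d = 0.05 and a selection rule under which every statistically significant study is published but only 30 percent of null results are; all of these numbers are hypothetical, chosen only to show the direction of the distortion, not taken from any cited paper:

```python
import random
import statistics

random.seed(42)

TRUE_D = 0.05        # assumed true effect (illustrative)
N_PER_ARM = 100      # participants per arm in each simulated study
N_STUDIES = 2000     # number of simulated studies
SE = (2 / N_PER_ARM) ** 0.5   # approximate standard error of d

all_estimates, published = [], []
for _ in range(N_STUDIES):
    d_hat = random.gauss(TRUE_D, SE)       # one study's observed effect
    all_estimates.append(d_hat)
    significant = abs(d_hat) / SE > 1.96   # two-tailed test at alpha = .05
    # Selection: significant results always published, nulls 30% of the time.
    if significant or random.random() < 0.30:
        published.append(d_hat)

print(f"mean d, all studies:       {statistics.mean(all_estimates):+.3f}")
print(f"mean d, published studies: {statistics.mean(published):+.3f}")
```

Averaging only the published studies overstates the true effect, because the significant results that always get published are disproportionately the overestimates. Bias-modeling techniques of the kind applied by Macnamara & Burgoyne (2023) attempt to correct for exactly this asymmetry.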

When does growth mindset actually work?

The evidence does not show that growth mindset interventions are useless. It shows that effects are moderated — concentrated in specific student populations and specific contexts.

  • Low-achieving and at-risk students. The most consistent positive effects across Sisk (2018), Yeager (2019), and Macnamara (2023) appear among students who are struggling academically. For these students, a fixed-mindset belief (“I’m not smart enough”) may be functionally debilitating, and shifting that belief can re-engage them with the learning process.
  • Transitional academic periods. Effects tend to be larger during entry into new academic environments (start of high school, first year of college) when students are forming beliefs about their capabilities in the new context.
  • Supportive environments. Yeager et al. (2019) found the intervention worked best in schools with peer norms that supported challenge-seeking — where asking for help was normalized and effort was valued. In environments hostile or indifferent to academic struggle, even a perfect mindset intervention has little to bite on.
  • Combined with strategy instruction. Telling students “you can grow your brain” is more effective when paired with explicit instruction in how to learn — spaced practice, retrieval practice, elaborative encoding. Belief and skill complement each other; belief alone does less. Research on strategic self-control and academic performance documents the complementary role of self-regulatory technique.

The honest summary: growth mindset interventions help some students some of the time, in specifiable conditions, with modest effect sizes. They do not transform educational systems, close achievement gaps at scale, or substitute for cognitive ability and quality instruction.

Why does the popular narrative outrun the evidence?

The mismatch between meta-analytic evidence and popular discourse has structural sources.

  • Egalitarian appeal. Mindset offers an emotionally satisfying alternative to the perception that intelligence is fixed and inegalitarian. The promise that anyone can develop the capacities needed for success — through belief change alone — is far more attractive than the constrained reality.
  • Marketability. A scalable, low-cost intervention that promises substantial achievement gains is highly attractive to schools and districts under pressure to improve outcomes with limited budgets. Even small effects, multiplied across millions of students, justify implementation costs — but they do not justify the hyped framing.
  • Conceptual stretching. “Growth mindset” in popular use has expanded far beyond Dweck’s original construct. It now overlaps with optimism, grit, resilience, and “having a positive attitude” — none of which require any specific belief about the nature of intelligence. When proponents are challenged on the empirical effect of growth mindset specifically, they often retreat to the broader cluster, which has its own (separate) evidence base.
  • Confirmation through testimony. Anecdotes about students whose lives “turned around” after adopting a growth mindset are vivid and emotionally compelling. These cases support the narrative even when population-level data do not.

How does growth mindset compare to other educational interventions?

Calibrating mindset’s effect size against other educational levers is sobering for proponents:

  • High-quality 1:1 or small-group tutoring: d ≈ 0.40–0.80
  • Formative assessment with feedback: d ≈ 0.40–0.70
  • Metacognitive strategy instruction: d ≈ 0.50–0.60
  • Reducing class size meaningfully: d ≈ 0.20–0.30
  • Growth mindset intervention: d ≈ 0.05–0.10

This does not mean mindset is worthless. Its scalability — delivery via brief online modules to large student populations at near-zero marginal cost — means that even small effects can aggregate. A 0.03 GPA shift across a million ninth-graders is a real outcome at population scale. But the comparison clarifies that mindset is one of the smallest available levers, not a primary intervention. Educational systems with finite attention and budget should prioritize tutoring, formative assessment, and skilled instruction first.

What about the broader malleability of intelligence?

One source of confusion in the public discourse: the question “is intelligence malleable” is empirically separate from the question “does growth mindset work.” Intelligence is genuinely malleable in important ways — education raises IQ by roughly 1–5 points per year of additional schooling, the Flynn effect shifted population means by more than a standard deviation across the 20th century, and research on education and intelligence documents real cognitive gains from sustained instruction. None of this depends on students’ beliefs about whether intelligence can change.

Conversely, intelligence is also substantially heritable and stable in adulthood, which means that even strongly held growth-oriented beliefs cannot eliminate ability differences. The framework’s strength is its honest acknowledgment that effort and engagement matter; its weakness is the popular over-interpretation that effort can fully compensate for ability or environmental constraint.

What should educators and parents do?

The actionable summary follows from the evidence:

  • Don’t abandon mindset language — but don’t over-rely on it. Encouraging students to view challenges as growth opportunities is unlikely to harm and may help struggling learners. It should not, however, replace investment in instruction, materials, or structural support.
  • Pair belief with technique. “You can grow your brain” is less effective than teaching specific learning skills — spacing, retrieval, elaboration, self-monitoring. Mindset supports the use of these techniques but does not substitute for them.
  • Avoid promising too much. Telling students that effort alone determines outcomes sets them up for disillusionment when they hit genuine ability or environmental constraints. A more honest message: effort matters, strategy matters, and ability and resources set the context within which effort and strategy operate.
  • Prioritize high-impact interventions. For limited educational budgets, the meta-analytic evidence supports investing first in high-quality instruction, formative assessment with feedback, targeted tutoring for struggling students, and adequate time on task. Mindset programming is at best a complement to these, not a substitute.

Frequently Asked Questions

Does growth mindset really work?

It produces small but real average effects on academic outcomes (Cohen’s d ≈ 0.05–0.10 in recent rigorous meta-analyses), with the largest effects concentrated in low-achieving students in supportive school environments. It does not produce the transformative gains that early popular accounts suggested.

How big is the effect compared to IQ on grades?

IQ correlates roughly r = 0.50–0.60 with academic achievement (Schmidt & Hunter, 1998; Strenze, 2007), explaining 25–36 percent of variance. Growth mindset interventions explain less than 1 percent of variance on average. The two effects operate on different timescales and through different mechanisms; they are not directly comparable, but the order-of-magnitude difference is meaningful for resource allocation decisions.

Why did the early studies show such large effects?

Three reasons: (1) lab studies use intensive, controlled manipulations that don’t scale to the brief online modules used in real schools; (2) publication bias favored studies with positive results, inflating early meta-analytic estimates (Macnamara & Burgoyne, 2023); (3) early experiments often used outcome measures sensitive to mindset specifically rather than general academic achievement.

Don’t different meta-analyses contradict each other?

Two 2023 meta-analyses in Psychological Bulletin reached different bottom-line interpretations of overlapping data. Macnamara & Burgoyne emphasized that publication-bias-corrected average effects approach zero. Burnette et al., in the same issue, emphasized that small averages mask meaningful variation by population and implementation. Tipton et al. (2023) wrote a methodological commentary arguing both papers are methodologically inadequate for the kind of heterogeneous intervention growth mindset is. The honest summary: the field is in an active dispute about how to model the data, but all three positions agree that average effects are small and that meaningful effects exist in identifiable subgroups.

Is growth mindset the same as grit or resilience?

No, though the popular discourse often blurs them. Grit, resilience, conscientiousness, and self-regulation are distinct constructs with their own measurement traditions and evidence bases. Conflating them with growth mindset makes the latter appear more empirically grounded than the specific construct’s evidence supports.

Should schools stop teaching growth mindset?

No, but schools should calibrate expectations. Growth mindset programming is low-cost, generally harmless, and may help some struggling students. It should not be prioritized over high-impact investments in instruction, tutoring, or formative assessment, all of which produce effect sizes that are an order of magnitude larger.

What if my child has a fixed mindset?

Encouraging effort, modeling response to setbacks, and reframing failure as informative are reasonable parenting practices supported by the broader literature on motivation. But “fixing” a child’s mindset will not transform their academic trajectory — adequate instruction, supportive environment, and engagement with challenging material at the right level remain the dominant inputs.

Did Carol Dweck overstate the effects?

The early experimental work was rigorous within its scope. The over-statement came later, as the framework moved into trade books, corporate training, and educational policy with effect-size claims that the experimental literature did not support at scale. Dweck has since acknowledged the importance of the moderating conditions documented in Yeager et al. (2019).

References

  • Burnette, J. L., Billingsley, J., Banks, G. C., Knouse, L. E., Hoyt, C. L., Pollack, J. M., & Simon, S. (2023). A systematic review and meta-analysis of growth mindset interventions: For whom, how, and why might such interventions work? Psychological Bulletin, 149(3-4), 174–205. https://doi.org/10.1037/bul0000368
  • Burnette, J. L., O’Boyle, E. H., VanEpps, E. M., Pollack, J. M., & Finkel, E. J. (2013). Mind-sets matter: A meta-analytic review of implicit theories and self-regulation. Psychological Bulletin, 139(3), 655–701. https://doi.org/10.1037/a0029531
  • Dweck, C. S., & Leggett, E. L. (1988). A social-cognitive approach to motivation and personality. Psychological Review, 95(2), 256–273. https://doi.org/10.1037/0033-295X.95.2.256
  • Macnamara, B. N., & Burgoyne, A. P. (2023). Do growth mindset interventions impact students’ academic achievement? A systematic review and meta-analysis with recommendations for best practices. Psychological Bulletin, 149(3-4), 133–173. https://doi.org/10.1037/bul0000352
  • Mueller, C. M., & Dweck, C. S. (1998). Praise for intelligence can undermine children’s motivation and performance. Journal of Personality and Social Psychology, 75(1), 33–52. https://doi.org/10.1037/0022-3514.75.1.33
  • Schmidt, F. L., & Hunter, J. E. (1998). The validity and utility of selection methods in personnel psychology: Practical and theoretical implications of 85 years of research findings. Psychological Bulletin, 124(2), 262–274. https://doi.org/10.1037/0033-2909.124.2.262
  • Sisk, V. F., Burgoyne, A. P., Sun, J., Butler, J. L., & Macnamara, B. N. (2018). To what extent and under which circumstances are growth mind-sets important to academic achievement? Two meta-analyses. Psychological Science, 29(4), 549–571. https://doi.org/10.1177/0956797617739704
  • Strenze, T. (2007). Intelligence and socioeconomic success: A meta-analytic review of longitudinal research. Intelligence, 35(5), 401–426. https://doi.org/10.1016/j.intell.2006.09.004
  • Tipton, E., Bryan, C., Murray, J., McDaniel, M. A., Schneider, B., & Yeager, D. S. (2023). Why meta-analyses of growth mindset and other interventions should follow best practices for examining heterogeneity: Commentary on Macnamara and Burgoyne (2023) and Burnette et al. (2023). Psychological Bulletin, 149(3-4), 229–241. https://doi.org/10.1037/bul0000384
  • Yeager, D. S., Hanselman, P., Walton, G. M., Murray, J. S., Crosnoe, R., Muller, C., et al. (2019). A national experiment reveals where a growth mindset improves achievement. Nature, 573(7774), 364–369. https://doi.org/10.1038/s41586-019-1466-y


Cite This Article

Sharma, P. (2025, July 23). Growth Mindset: What the Meta-Analyses Show. PsychoLogic. https://www.psychologic.online/growth-mindset-evidence/