In 2020, the McGill University Office for Science and Society published a critique of a CST meta-analysis that had been getting positive attention. The analysis was detailed and worth taking seriously. It raised questions not only about that one study but about some of the foundational assumptions underlying craniosacral therapy itself.
This kind of critical work is part of how scientific understanding develops. When a therapy is being studied, it matters not only whether trials come out positive, but whether they're designed well enough to answer the question. The McGill analysis raised doubts on both fronts.
At the same time, scientific critique doesn't settle whether CST helps people. Many find genuine benefit, and that's real, whatever the mechanism turns out to be. Holding both — honest scientific concern alongside felt benefit — is where a thoughtful read of CST lands.
What the McGill analysis found
The McGill critique focused on a CST meta-analysis and found that 8 of the 10 studies it included were either negative in their findings or so poorly designed as to be unreliable. A meta-analysis is only as strong as its underlying studies. If the inputs are weak, combining them doesn't produce a strong result. McGill argued that's exactly what happened.
They also flagged comparators. Several included studies compared CST to no treatment at all rather than to a sham or placebo. This matters because any treatment, even one with no active ingredient, tends to produce some improvement when compared to nothing. Attention, touch, a warm room, the expectation of help: all contribute to outcomes in ways that make no-treatment comparisons hard to interpret.
The reliability of the craniosacral assessment itself was also called into question. A 1994 inter-rater reliability study, in which three trained CST examiners independently assessed 12 patients, found essentially no agreement on the perceived craniosacral pulse rate. The intraclass correlation was -0.02: agreement no better than chance, and marginally worse. That's a striking finding, and one the McGill critique weighted heavily.
The biological plausibility question
Beyond study design, McGill raised questions about biological plausibility: whether the mechanism CST proposes could actually be happening in the body. The core concern is that adult skulls are largely fused. Sutures close over time, and research has estimated that moving a skull bone by even 1 millimetre would require around 20 kilograms of force. The light touch used in CST is typically 5 to 10 grams, some two to four thousand times less. A feather's weight.
If the bones don't move meaningfully under that touch, the question becomes what the practitioner is perceiving, and what mechanism could produce change. CST proponents have offered various responses: micro-movement, fascial dynamics, or the nervous system's response to touch rather than cranial bone movement itself. But McGill presented the skull-fusion question as a real and unresolved challenge to the theoretical model.
Biological plausibility is a reasonable scientific criterion, though not the only one. Plenty of treatments in conventional medicine work through mechanisms that weren't understood when first used. Lack of a confirmed mechanism doesn't, by itself, mean a treatment has no effect.
What this means if you're considering CST
The McGill analysis is worth reading if you want to understand the scientific critique in some depth. It's not an emotional attack on the therapy or its practitioners. It's a methodological critique of research quality and theoretical coherence, raising points that haven't been fully resolved.
There's a meaningful difference between "the evidence is weak" and "the therapy doesn't work." Weak evidence means we can't confidently say from population studies that CST produces a reliable effect. It doesn't mean someone with a grinding tension headache who walks out feeling lighter hasn't experienced something real. Population statistics and individual experience answer different questions.
Many people find CST deeply helpful. Some come back for years because it seems to support their wellbeing. That experience doesn't vanish because a meta-analysis has methodological problems. What scientific critique invites is honest engagement: clear expectations, appropriate scepticism of dramatic claims, and judging the work primarily by your own experience rather than by certainty about mechanism.
Scientific criticism of CST is real and worth knowing about. The McGill analysis raises legitimate questions about research quality and biological plausibility. None of that erases what people have found in sessions. It does call for honesty about what we know and what stays uncertain.