A study looking at the effects of the drug rapamycin on a variety of health metrics, and assessing its safety, has finally wrapped up. That trial is called the PEARL trial, and its results conflict with much of the other work that's been done on rapamycin. I'd like to take a few moments to explain why and what it means for the future of rapamycin as a longevity drug.
If you aren't familiar with rapamycin, it's a molecule best known for inhibiting a master protein within your cells called mTOR. mTOR is implicated in several diseases and is associated with reduced lifespan. Rapamycin therefore seemed promising because, well, it inhibits mTOR and has been shown in non-human studies to extend lifespan.
So naturally it seemed reasonable to conduct a larger human study to see if rapamycin could offer benefit. The researchers recruited men and women, 115 of whom completed the study, and assigned each person to one of three groups: 5 milligrams of rapamycin once per week, 10 milligrams once per week, or a placebo. Participants took their assigned dose once per week for 48 weeks.
So quite a long time. The researchers then made a series of measurements. So what happened? Considering the pre-clinical data from other studies, I think the expectation among much of the public was to see some noticeable, noteworthy results. Scientists, however, struck a more tempered tone, and for good reason, as I'll get into.
First, the results. On two notable metrics, bone mineral content in men and muscle mass in women, the 10-milligram rapamycin condition showed improvements. Other than that, and apart from one or two obscure measures, there were no differences across the 20-plus measurements.
In addition, if we look at the data and focus on the effect size, that is, the amount of change, the effects are very small considering the duration of the study: a few percent difference.
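To make that concrete, here's a minimal sketch of how a percent difference relates to a standardized effect size (Cohen's d). The numbers are made up purely for illustration and are not data from the PEARL trial.

```python
# Illustrative only: made-up numbers, not data from the PEARL trial.
import numpy as np

rng = np.random.default_rng(0)

# Pretend measurements of some body-composition metric (arbitrary units).
placebo = rng.normal(loc=100.0, scale=10.0, size=50)
rapamycin = rng.normal(loc=102.0, scale=10.0, size=50)  # mean ~2% higher

percent_change = (rapamycin.mean() - placebo.mean()) / placebo.mean() * 100

# Cohen's d: difference in means divided by the pooled standard deviation.
pooled_sd = np.sqrt((placebo.var(ddof=1) + rapamycin.var(ddof=1)) / 2)
cohens_d = (rapamycin.mean() - placebo.mean()) / pooled_sd

print(f"Percent difference: {percent_change:.1f}% | Cohen's d: {cohens_d:.2f}")
# A few percent difference against typical person-to-person variability
# generally works out to a small standardized effect (d well under 0.5).
```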
So is rapamycin a bust, then? No, far from it, and for several reasons. One, the trial was primarily a safety trial. That matters because studies are designed to collect enough data for their main outcomes of interest, which means they are sometimes inadequate to effectively measure other, ancillary outcomes, known as secondary endpoints.
As a safety trial, the main outcome was whether rapamycin led to serious adverse effects. There were no serious adverse effects that could be linked to rapamycin, so that's actually a positive. In addition, the researchers couldn't run a proper power calculation, the step where, before a study begins, researchers estimate how much data they'll need to detect an effect, if there is one. Since there's little data on rapamycin in humans to estimate from, that just wasn't feasible, leaving the researchers to take a stab in the dark. Finally, although there are some other considerations that I'm leaving out, the study was dosed at 5 and 10 milligrams of rapamycin per week, but due to an error in how the rapamycin was packaged, the actual doses were about 30% of the intended dose, roughly 1.5 and 3 milligrams per week. So in effect, they accidentally nerfed the trial.
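To show what a power calculation actually looks like, here's a minimal sketch using the statsmodels library. The effect size, power, and significance level are assumptions chosen for illustration, not values from the PEARL trial.

```python
# Minimal sketch of a power calculation for a two-group comparison.
# All numbers are illustrative assumptions, not PEARL trial values.
from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()

# Suppose we expect a small effect (Cohen's d = 0.3), want 80% power,
# and use the conventional 5% significance level.
n_per_group = analysis.solve_power(effect_size=0.3, power=0.80, alpha=0.05)

print(f"Participants needed per group: {n_per_group:.0f}")
# For a small effect like this, the answer is on the order of ~175 per group,
# far more than the 115 total completers in PEARL. Without a reliable
# effect-size estimate for rapamycin in humans, the researchers had no solid
# number to plug in, hence the "stab in the dark."
```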
The researchers actually explained the tricky position they were in here. We didn't know at the time. It was never brought to our attention by any of the advisors, or by any of the articles we reviewed, that there was a potential difference in bioavailability. It was only about one third of the way into enrollment, and I think credit goes to Ross Pelton for bringing it up, that we learned that along the way rapamycin became encapsulated to improve its bioavailability. And then we later found out that the ITP had trouble with absorption with the non-modified rapamycin. So we actually paused enrollment and collected some preliminary data.
We enrolled a few volunteers and did a head-to-head comparison, and there were clues that there was bioavailability, it was just lower. And then the larger study that we published seemed to indicate there was roughly a one-to-three ratio of absorption. So then we had to make a decision.
Do we stop the trial and start all over again with a higher dose? Or do we just continue and hope that the lower availability will still give us some results? So they opted for the latter.
And that's just the way that it goes in science. You can only move forward, especially when you have a limited budget, which this study did. In that panel discussion, they mentioned the two graphs that I showed you here.
They also pointed out that some individuals had seemingly great responses. Dr. Kaeberlein, an expert on rapamycin, had this to say on how to interpret the study: There were no side effects over the time frame of the study.
That's good. And there was at least some decent evidence, in my view, for improvements in body composition, particularly preservation of lean mass in women taking rapamycin. There were some more speculative potential benefits in terms of perceived quality of life.
And I think other than that, there wasn't a lot that we can hang our hats on from this study. Now, I certainly respect his position on the matter, and I agree on the side effects, but I'm even more conservative in my interpretation of this data. I'd say that this study just didn't indicate any effect that I'd even raise an eyebrow at.
In their panel discussion, the researchers offered some insight into some of the effects, with a fair amount of speculation, but again, judging strictly by what we see here, I think this study simply fails to show anything exciting. That said, I don't think it's a fair comparison, for the reasons we outlined, not to mention that many of the individuals enrolled were already quite healthy.
So I think we've learned one thing: rapamycin at very low doses is likely safe, and we need to repeat this study using a fair, effective dose to see whether there are any effects or not. Unfortunately, as Dr. Kaeberlein emphatically put it, that's easier said than done.
I know a lot of people will be frustrated by the fact that we're saying it's limited what we can interpret from this study and we need to do more studies. I don't know what else to tell you other than deal with it. If the people who can and should fund the larger trials for efficacy would get off their asses and step up and do it, we'd be able to answer these questions definitively.
But there's no point in complaining until somebody is going to step up and fund definitive clinical trials on rapamycin. It is an ongoing source of frustration to us all. We get it. Again, the funders have got to step up and do what they should have done 15 years ago and fund these trials.
If you're interested in listening to the researchers discuss the study, I'll link their video in the description, as well as Dr. Kaeberlein's discussion on the topic. And if you're interested in other longevity-related research videos, I've got some right here for you.