Another beta-alanine study. Don’t buy the hype.

Some day, I’m going to preface a review with a sentence like, “Today’s study was really well done. I was impressed by the comprehensiveness of the reporting, the concise data analysis and the practical relevance of the trial.”

Today is not going to be that day.

I really try hard not to be antagonistic in these write-ups. And I really hate the fact that I sound like a broken record in every review (too many significance tests, no primary research question, inappropriate statistics, inadequate reporting of trial details–can you see the glaringly obvious trend here?), but I don’t make this stuff up. I don’t think I could if I tried. Quite honestly, these kinds of studies just make me angry.

In keeping with the beta-alanine theme, here’s my angry review of:

Hoffman J, Ratamess N, Kang J, Mangine G, Faigenbaum A, Stout J. Effect of creatine and beta-alanine supplementation on performance and endocrine responses in strength/power athletes. International Journal of Sport Nutrition and Exercise Metabolism. 16: 430-446, 2006.

Part of the reason why I keep bringing the same points up is because we’ve seen Hoffman and Stout before in previous reviews. If anyone is looking for a methodologist, I’m available–and at pretty cheap rates too! Who am I kidding…I’m not making any friends with these reviews….

The purpose as stated by the authors was, “…to compare the effects of creatine plus beta-alanine to creatine alone on strength, power and body compositional changes during a 10-week resistance training program in collegiate football players. In addition, a secondary purpose of this study was to examine the effect of creatine and creatine plus beta-alanine supplementation on the hormonal responses to resistance training.”

Unfortunately, this is not what they actually did. Or maybe they did, but it’s not what they reported. Read on.

Methods:

Thirty-three male “strength/power athletes” were recruited for this study. I guess football players are also strength/power athletes–whatever that means. Subjects were not permitted to use any other nutritional supplement (again, not sure what goes in this category), and did not use steroids or other “anabolic agents”. All subjects had at least 2 years’ experience with resistance training.

Subjects were randomly assigned to one of three groups. One group got creatine plus beta-alanine (10.5 g/day of creatine monohydrate and 3.2 g/day of beta-alanine); another group got creatine alone (10.5 g/day of creatine monohydrate); and the last group got a placebo (10.5 g of dextrose). All subjects had one drink in the morning (the powder packet mixed in 8-10 oz of water), and a second drink within 1 hour of their workout, or in the late afternoon or evening if it was a non-gym day.

[Broken record note: There is no description of the method by which the randomization sequence was generated, who assigned people to their groups, who had access to the sequence, how allocation concealment was performed, nor how blinding was performed (they mentioned the “double-blind” word, but who was the other blinded party apart from the subjects? Why thirty-three subjects? Is this another one of those, “Oh, ten subjects per group should be enough,” things?]

All subjects received the same workout program (roughly a 4-day upper/lower body split). All workouts were supervised by study personnel. The training program ran for 10 weeks.

Variables:

Strength: Subjects were tested for 1RM for squat and bench press.

Power: Subjects underwent Wingate anaerobic power testing. Wingate testing is a standard interval cycling protocol (a 5-minute warm-up including some 5-second sprints, followed by 3 minutes steady state, a 30-second all-out sprint, 3 minutes steady state, a 30-second sprint, 3 minutes steady state and a final 30-second sprint). In addition to the Wingate test, subjects also did an unvalidated “jump power” test, which involved performing 20 consecutive jumps off a portable force plate with the instruction to maximize the height of each jump while minimizing time in contact with the force plate. Subjects had to keep their hands on their waist at all times. The data were processed by some unknown means, reported only as, “Computer analysis was used to calculate peak power, mean power and a fatigue index.”

[What the *bleep* is this fatigue index??]
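For what it's worth, the fatigue index on a standard Wingate test is conventionally the percent drop from peak to minimum power over the sprint. Whether the authors used that formula here, or how (if at all) they adapted it to the 20-jump test, isn't reported; this is just a sketch of the usual calculation:

```python
def fatigue_index(power_samples):
    """Conventional Wingate-style fatigue index: percent drop from peak
    to minimum power. Whether Hoffman et al. computed it this way (or
    how they applied it to the 20-jump test) is not reported."""
    peak = max(power_samples)
    lowest = min(power_samples)
    return (peak - lowest) / peak * 100

# Hypothetical power outputs (watts) across an all-out effort
print(fatigue_index([980, 940, 870, 790, 700, 640]))  # ~34.7% drop
```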

Body composition: Body composition was measured using DEXA.

Biochemical outcomes: Blood was drawn as fasting samples (except for 9 subjects who, because of class scheduling, had their levels drawn 2 hours post-prandially–after a meal, that is). They tested serum testosterone, growth hormone, IGF-I, sex hormone binding globulin, and cortisol levels.

[Why they bothered to test for cortisol levels if the 9 subjects weren’t fasting and first-thing in the morning is beyond me. This pretty much invalidates any cortisol analysis in this study–one third of subjects would have had an inappropriate, invalid cortisol level.]

Diet: Subjects used a 3 day recall method to track diet.

Statistics:

The authors used repeated-measures ANOVAs to make comparisons within groups. They used regular ANOVAs to make comparisons between groups. They also calculated an effect size, “…to determine the magnitude of treatment effects, and [the effect sizes] are reported with all statistically significant results as a measure of practical significance.”
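(The paper doesn't say which effect size statistic was calculated. My guess, and it is only a guess, is something in the Cohen's d family, along these lines:)

```python
import statistics

def cohens_d(pre, post):
    """Cohen's d for a pre/post comparison: difference in means divided
    by a pooled standard deviation. This is my assumption about what the
    authors calculated; the paper only says 'an effect size'."""
    mean_diff = statistics.mean(post) - statistics.mean(pre)
    pooled_sd = ((statistics.stdev(pre) ** 2 + statistics.stdev(post) ** 2) / 2) ** 0.5
    return mean_diff / pooled_sd

# Hypothetical 1RM bench press values (kg), pre and post training
print(round(cohens_d([100, 110, 120, 105], [110, 118, 130, 112]), 2))
```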

[Pet Peeve number 1: In a randomized controlled trial, we are not generally interested in whether the groups did better compared to themselves. It’s usually interesting, but not very important, because the question we’re trying to answer here is, “Does beta-alanine with creatine do better than creatine alone or a placebo?” not, “Does beta-alanine with creatine, creatine or dextrose improve strength/body composition/etc.?” The reason why it’s interesting but not important is that even if dextrose doesn’t improve anything and beta-alanine with creatine improves everything, if their effects are not substantially different from one another, the fact that dextrose did nothing and BA with creatine did something is irrelevant.

Pet Peeve number 2: You should, as a researcher, never use a statistic (like a calculated effect size) to determine whether something is practically relevant or not. The statistic doesn’t know what’s important; it’s just a number. If we were looking at an always-fatal disease and found that a therapy could save 20% of patients’ lives, that would be practically relevant, despite an abysmal effect size. I refer you back to a previous entry, “Different kinds of important.”

Broken record note: Lots of variables means lots of tests. And a higher chance of making a type I error. Pick a primary variable already.]
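To put a number on that broken record: if every test is run at alpha = 0.05 and the null hypothesis is actually true for all of them, the chance of at least one false positive climbs quickly with the number of tests. A quick back-of-the-envelope calculation (assuming independent tests, which is generous):

```python
# Family-wise type I error rate for k independent tests, each at alpha = 0.05:
# P(at least one false positive) = 1 - (1 - alpha)^k
alpha = 0.05
for k in (1, 5, 10, 20, 30):
    print(k, round(1 - (1 - alpha) ** k, 2))
# 1 test -> 0.05, 10 tests -> 0.40, 20 tests -> 0.64, 30 tests -> 0.79
```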

Results:

[Broken record note: There was no mention of how many subjects dropped out of the study or why. When I count the little dots on the horrible individual-data graph (more on that below), I count 30 dots. Whether this means that 3 people dropped out, or were lost to follow-up, or whether it means that 3 people’s dots overlap 3 other dots, I have no idea.]

Apart from the weaknesses I’ve pointed out above, the results section in this paper was…hellish to read. Let me try to sort this out by variable.

Diet: No statistically significant difference was detected between any of the three groups for caloric intake. What’s evident is that there was MASSIVE variation within each group in terms of caloric intake. The mean caloric intake of the placebo group was 2991 kcal (SD 809 kcal); the BA+creatine group ate an average of 3222 kcal (SD 856 kcal); and the creatine group ate 2999 kcal (SD 546 kcal).

[In a normal distribution (which is implied when you report a mean and standard deviation–although not everyone knows this), about 68% of all the observed values lie within one standard deviation of the mean, and about 95% lie within 2 standard deviations. So, for the BA+creatine group, 95% of the subjects in that group ate anywhere between 1510 and 4934 calories a day! That said, the groups don’t look like they differed from one another very much, so we can be relatively assured that this doesn’t really affect the final conclusion of the study–but that’s still a very wide spread!]
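If you want to check that arithmetic yourself, using the group means and SDs reported in the paper:

```python
# Rough 95% range (mean +/- 2 SD) of daily caloric intake per group,
# using the means and SDs reported in the paper
groups = {"placebo": (2991, 809), "creatine": (2999, 546), "BA+creatine": (3222, 856)}
for name, (mean, sd) in groups.items():
    print(f"{name}: {mean - 2 * sd} to {mean + 2 * sd} kcal/day")
# BA+creatine: 1510 to 4934 kcal/day
```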

Body composition: No significant differences were detected within groups from PRE to POST with respect to total body mass. Redundant testing of the change in total body mass also did not reveal any significant differences within groups.

This is where the reporting goes all wonky.

The authors report that significant differences were found for change in percent body fat, but only cite two numbers: -1.21 (SD 1.12) vs. 0.25 (SD 1.53). Err…there were THREE groups in this study. What’s even more baffling is that they don’t tell us which two of the three groups these numbers belong to! The same goes for the reported significant difference between two groups for change in lean body mass: 1.74 (SD 1.72) vs. -0.44 (SD 1.62). What the hell is that?

There are also three graphs of individual data points. The graph for change in lean body mass has got to be one of the most useless graphs I have seen in a paper. The only thing it tells me is that the BA+creatine group (CA) isn’t normally distributed–they’re all clustered up in the 2.5-3 kg range–which is good if you’re trying to show BA+creatine helps to build lean body mass, but bad if you’re using summary statistics (i.e. mean and standard deviation) that are meant for normal distributions when your distribution clearly isn’t normal. It’s also bad if you’re using parametric statistical tests on non-parametric data. Just because a test is robust against violations of its assumptions doesn’t mean you should just go ahead and use it. The reason we even bothered to figure out how robust it was, was because it was clear that a crapload of investigators were using inappropriate statistics and we didn’t want the research to go to COMPLETE waste.
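For what it's worth, checking normality and falling back to a rank-based test is not exactly heavy lifting. A minimal sketch with made-up numbers (nothing here comes from the paper), just to show the kind of check I mean:

```python
from scipy import stats

# Made-up lean-mass changes (kg) for two groups; NOT the study's data
group_a = [2.6, 2.7, 2.8, 2.9, 2.5, 2.8, 2.7, 3.0, 2.6, 2.9]
group_b = [0.1, -0.5, 1.2, -1.0, 0.4, 0.8, -0.2, 0.0, 1.5, -0.9]

# Shapiro-Wilk normality check for each group...
for label, data in (("A", group_a), ("B", group_b)):
    w, p = stats.shapiro(data)
    print(f"group {label}: Shapiro-Wilk p = {p:.3f}")

# ...and a rank-based comparison that doesn't assume normality
u, p = stats.mannwhitneyu(group_a, group_b)
print(f"Mann-Whitney U p = {p:.4f}")
```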

[What gets my goat, though, is that they found a significant difference between groups in the change in percent body fat, but didn’t find one in the change in fat mass–yet one of the “major” findings is that beta-alanine+creatine caused a greater decrease in percent body fat. So, basically, they picked the “significant p-value” to report as a major finding, while ignoring the disparate result that they failed to find a difference in the change in ACTUAL fat mass. So does beta-alanine+creatine help you burn fat or not? Apparently it does if you use one number, but not if you use a different one–even though they’re supposed to reflect the SAME THING. I suppose it’s possible that BA+creatine increases lean body mass to the point where you’ll see a difference in relative fat mass (which is what percent body fat is), but I don’t see any results for “percent lean body mass”.]
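To make that last point concrete, here's a toy example (the numbers are invented for illustration, not taken from the paper) of how percent body fat can fall while actual fat mass doesn't budge, simply because lean mass goes up:

```python
# Hypothetical subject: fat mass stays constant, lean mass increases by 2 kg
fat_kg = 15.0
lean_pre_kg, lean_post_kg = 70.0, 72.0

pct_fat_pre = fat_kg / (fat_kg + lean_pre_kg) * 100    # ~17.6%
pct_fat_post = fat_kg / (fat_kg + lean_post_kg) * 100  # ~17.2%
print(round(pct_fat_pre, 1), round(pct_fat_post, 1))
# Percent body fat drops even though not a single gram of fat was lost
```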

Strength: All three groups showed strength improvements in comparison to their initial 1RMs for both squat and bench press. Both the creatine and the BA+creatine groups did statistically better than the placebo group. The authors did not comment on how they did compared to each other. I can only assume that they failed to find a statistically significant difference between the two groups (at least that’s how it looks in their graphs).

Power: No significant differences were detected within or between groups for any of the power tests.

Blood chemistry: No significant differences were detected between groups for any of the biochemical markers. They did find a significant difference within the creatine group for resting testosterone levels, but that’s after more than 20 statistical tests. And again, no difference between groups, so it’s pretty moot anyways.

Workout intensity/volume: The authors also did an unplanned analysis of workout intensity and volume. Intensity was measured by how close subjects lifted (for squat and bench press) to their 1RM. Volume was measured as the total weight lifted for bench press or squat. They found that both the creatine-only and the BA+creatine groups tended to have more intense workouts with higher volumes for squats, but only higher volumes for bench press.

[Unplanned analyses can be tricky. They’re good for generating questions for future studies, but should never be taken at face value when the research question isn’t being addressed.]

Discussion by the authors:

Before I delve into the authors’ conclusions, I want to preface this section by saying I lost count of the number of significance tests performed on the data. All of the authors’ conclusions are based on significant values observed in the midst of a multitude of tests. There was virtually no effort on the part of the authors to explain the non-significant results. Instead, much of the focus was placed on the significant findings, despite the fact that they sit on a background of a plethora of tests, unadjusted for multiple testing. Additionally, even though the original stated purpose of the study was to compare creatine to creatine+beta-alanine, there was very little comment directed at that comparison.

The major points by the authors were:

1) “The use of creatine and creatine with beta-alanine appeared to provide for a higher quality workout, and the addition of beta-alanine to creatine appeared to enhance training volume more so than supplementing with creatine alone.”

If I ignore the background of multiple tests, then I could agree that both creatine and creatine with beta-alanine appeared to make for a better workout. However, there’s nothing in this paper to support the idea that the addition of beta-alanine enhanced workout “quality” more than creatine alone.

2) “In addition, beta-alanine supplementation appeared to have the greatest effect on lean tissue accruement and improvements in body fat composition.”

I think I’ve already been through this one above.

3) “It does appear that the addition of beta-alanine to creatine provides an additive benefit in reducing fatigue rates during training sessions compared to creatine alone.”

I have absolutely no idea where this conclusion comes from. It’s true that failing to find a significant difference between groups doesn’t mean one doesn’t exist in the larger population, but there’s no evidence in this study to back up this conclusion at all.

4) “No significant changes were seen during the 10wk training program in any of the power performance measures…”

The authors attribute this finding to the fact that the subjects did not train specifically for the Wingate test (i.e. they lifted weights instead of doing interval cycling training), which begs the question of why they chose to use the test in the first place. Specificity is not exactly a new concept in training…

5) “A significant elevation in total testosterone concentrations was seen in this 10wk study for creatine only. In addition, a trend (P=0.056) was seen for a greater free testosterone index in creatine as well, suggesting a greater availability of testosterone to interact with androgen receptors. It is difficult to explain why resting testosterone concentrations were elevated for creatine but not for creatine with beta-alanine…”

If I thought I could swear on this blog without sounding unprofessional, I would (for those of you who will be at the JP Summit this weekend, you will likely get the full sensory experience that is “Bryan loses it”). a) It’s not difficult to explain at all, if you consider how many tests were performed. Could this be a type I error? (The answer is most probably, “YES.”) b) A p-value that is close to the critical level (in this case 0.05) does NOT, I repeat, NOT indicate a “trend”. You either meet the critical value or you don’t. There is no meaning to the phrase, “The data approached p=0.05.” At any rate, if we actually corrected for multiple tests, a p=0.056 wouldn’t even come close to the adjusted critical alpha level. So this discussion point is moot.
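For the sake of illustration, here's what a crude Bonferroni correction does to the critical alpha level (this is my example, not anything the authors did):

```python
# Crude Bonferroni adjustment: per-test alpha needed to hold the
# family-wise error rate at 0.05 across k tests
alpha_family = 0.05
for k in (10, 20, 30):
    print(k, round(alpha_family / k, 4))
# 10 -> 0.005, 20 -> 0.0025, 30 -> 0.0017
# p = 0.056 already misses the unadjusted 0.05 cutoff; against any of
# these adjusted thresholds it isn't remotely close.
```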

The discussion rambles on, but I don’t think I can stand picking apart the rest of it because it just makes me angry.

Both this study and the Stout study (from last week) come from the same research group, and both are funded by EAS. I’m not saying that invalidates the study (there’s so much more that does that already), but I’m glad to see that the funding is disclosed.

The Bottom Line:

If I ignore a lot of bad stuff, here’s what I take away from this study:

1) There is weak evidence from this study that creatine and creatine-with-beta-alanine supplementation can benefit body composition, strength and workout quality.

2) There is NO evidence from this study to show that the addition of beta-alanine to creatine supplementation has any additional benefit to anything that was measured in this study.

3) I despair at the future of fitness research on a weekly basis.

So, if you’re thinking of buying beta-alanine, stop. Seriously. Just stop. I don’t know how to get this point across any more effectively. Stick with your creatine and you’ll be fine.

I really need a flashy anti-ad to compete with the beta-alanine hype.

