# Editorial Standards & Review Methodology
How we research, score, and rate every supplement we review — and the principles that guide our editorial decisions.
## Our Review Process
Every review published on What's In It? follows a standardized process designed to produce consistent, comparable, and evidence-based evaluations:
1. Product selection. We identify products based on market popularity, search demand, and category gaps — not brand requests. No brand can request, pay for, or influence the timing of a review.
2. Label analysis. We obtain the full supplement facts panel and ingredient list. We document every ingredient, its form, and its per-serving dose.
3. Literature review. For each ingredient, we research the clinical evidence — effective doses established in peer-reviewed studies, bioavailability differences between forms, and any known safety considerations.
4. Dose evaluation. We compare each ingredient's listed dose to clinically effective ranges from the research literature. Ingredients below effective thresholds are flagged.
5. Formula assessment. We evaluate the formula holistically — ingredient synergies, potential redundancies, the overall nutritional profile, and how the formula serves its stated purpose.
6. Scoring. We apply our standardized scoring criteria (detailed below) to assign a rating out of 10.
7. Review & publication. The complete review is fact-checked and published with full transparency about our scoring rationale.
## Scoring Criteria
Our overall score is a weighted assessment across the following dimensions:
| Criterion | Weight | What We Evaluate |
|---|---|---|
| Formula & Dosing | 35% | Are ingredients dosed at clinically effective levels? Are forms bioavailable? Is the formula well-designed for its stated purpose? |
| Ingredient Quality | 25% | Sourcing standards (grass-fed, pasture-raised, organic where relevant), processing method (freeze-dried vs. heat-processed), third-party testing. |
| Transparency | 15% | Full ingredient disclosure (no proprietary blends), clear labeling, honest marketing claims, accessible supplement facts. |
| Value | 15% | Cost per serving relative to formula quality. A higher-priced product can still score well if the formula justifies the premium. |
| Brand & Trust | 10% | Manufacturing standards (GMP, third-party audits), return policy, company transparency, consistency of claims across channels. |
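To make the weighting concrete, here is a minimal sketch of how the overall score combines the five dimensions. Only the weights come from the table above; the dimension keys and the example per-dimension scores are illustrative placeholders, not a real review.

```python
# Weights from the scoring table; each dimension is scored 0-10.
WEIGHTS = {
    "formula_dosing": 0.35,
    "ingredient_quality": 0.25,
    "transparency": 0.15,
    "value": 0.15,
    "brand_trust": 0.10,
}

def overall_score(dimension_scores: dict) -> float:
    """Weighted average of per-dimension scores, rounded to one decimal."""
    # Sanity check: weights must sum to 100%.
    assert abs(sum(WEIGHTS.values()) - 1.0) < 1e-9
    total = sum(WEIGHTS[d] * dimension_scores[d] for d in WEIGHTS)
    return round(total, 1)

# Illustrative (made-up) per-dimension scores for a hypothetical product.
example = {
    "formula_dosing": 8.0,
    "ingredient_quality": 7.0,
    "transparency": 9.0,
    "value": 6.0,
    "brand_trust": 7.0,
}
```

With these example inputs, a strong formula score (weighted at 35%) pulls the overall rating up more than any other single dimension — which is the intent of the weighting.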
## Rating Scale
Our 10-point scale is designed to be meaningful and differentiated — not inflated. An average supplement should score around 5, not 8.
- 9.0–10: Exceptional. Best-in-class formula with clinically effective doses, excellent sourcing, full transparency, and strong value.
- 7.5–8.9: Very Good. Strong formula with minor gaps. Recommended for most people in the target demographic.
- 6.0–7.4: Above Average. Decent formula with notable gaps in dosing, transparency, or value. Better options likely exist.
- 4.0–5.9: Below Average. Significant issues with dosing, transparency, or ingredient quality. Not recommended for most people.
- 2.0–3.9: Poor. Major deficiencies — underdosed ingredients, misleading claims, poor transparency, or serious value concerns.
- 0–1.9: Unacceptable. Potentially harmful, deceptive, or entirely ineffective at stated doses.
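The bands above map directly to score thresholds. A minimal sketch (band names from the list; the threshold-comparison logic assumes scores are reported to one decimal place, as in our reviews):

```python
def rating_label(score: float) -> str:
    """Map a 0-10 score to its rating band from the scale above."""
    if score >= 9.0:
        return "Exceptional"
    if score >= 7.5:
        return "Very Good"
    if score >= 6.0:
        return "Above Average"
    if score >= 4.0:
        return "Below Average"
    if score >= 2.0:
        return "Poor"
    return "Unacceptable"
```

Checking against the midpoint of the scale: a score of 5.0 lands in "Below Average" — consistent with our position that an average supplement should score around 5, not 8.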
## What We Don't Do
- We do not accept payment, samples, or sponsorship from supplement brands.
- We do not use affiliate links or earn commissions from product purchases.
- We do not provide medical advice or recommend specific products for medical conditions.
- We do not test products in a laboratory (we analyze published ingredient lists and doses against clinical literature).
- We do not penalize brands for being new or small — formula quality is what matters, not brand recognition.
A note on lab testing: Unlike services such as ConsumerLab that perform laboratory testing to verify label accuracy, our analysis focuses on formula design — whether the stated ingredients and doses are clinically meaningful. Both approaches provide value: lab testing answers "is what's on the label actually in the bottle?" while our analysis answers "even if the label is accurate, is this a well-designed formula?"
## Sources & Research
Our reviews cite clinical research from peer-reviewed journals, government databases (NIH, USDA), and established nutritional science references. We prioritize:
- Randomized controlled trials (RCTs) over observational studies
- Human studies over animal models
- Meta-analyses and systematic reviews where available
- Primary research over secondary summaries
We link to specific studies when making claims about effective doses or ingredient benefits. If the evidence for a particular ingredient is limited or mixed, we say so explicitly.
## Corrections & Updates
We take accuracy seriously. When we get something wrong — whether it's a factual error, an outdated formula, or a mischaracterization — we correct it promptly.
- Minor corrections (typos, formatting, non-material clarifications) are made directly without notation.
- Material corrections (factual errors, score changes, formula updates) are noted at the top of the relevant review with a dated correction notice.
- Formula changes: When a brand reformulates a product, we update the review to reflect the current formula and note the change.
To report an error, email us at hello@whatsinit.co with specific details and supporting sources. We investigate every factual dispute.
## Editorial Independence
Every editorial decision at What's In It? — what to review, how to score it, what to say about it — is made independently by our team. No external party has the ability to preview, approve, modify, or suppress any content on this site.
This independence is non-negotiable and foundational to our mission. If we can't be trusted to tell the truth about a supplement formula, we have no reason to exist.