Buying Guide · March 23, 2026

How We Aggregate Running Shoe Reviews

Our process for collecting, analyzing, and synthesizing expert reviews into clear recommendations — without bias or affiliate influence.


The methodology behind every rating, review, and "best of" pick on The Shoe Nerds.

Most running shoe reviews tell you what one person thinks after a few weeks in a pair. We do something different. Every review, rating, and category pick on this site is built by aggregating what dozens of reviewers, testers, and real runners are actually saying — then synthesizing that into one clear, trustworthy verdict.

Here's exactly how it works.


Why Aggregation?

A single reviewer — even a great one — brings their own biomechanics, preferences, and blind spots to every shoe they test. A heel striker may underrate a shoe that shines for forefoot runners. A lightweight runner may miss durability issues that heavier runners encounter. A professional tester in controlled conditions may miss what happens at mile 400.

No single reviewer can cover all of this. But dozens of them, taken together, can.

The Shoe Nerds exists to do the work of reading all of them so you don't have to.


Our Sources

Before writing anything, we search across four categories of sources for every shoe:

Expert and lab reviewers — sources that test shoes under controlled conditions, measure energy return, stack height, torsional rigidity, and outsole grip. These include RunRepeat, Doctors of Running, WearTesters, and Sole Review.

Enthusiast reviewers — experienced runners who log real miles and write detailed, opinionated breakdowns. These include Believe in the Run, Road Trail Run, Running Shoes Guru, OutdoorGearLab, and others.

YouTube reviewers — video reviewers who often show wear patterns, flex tests, and on-foot footage that written reviews miss. We pull from Kofuzi, Ben Parkes, The Run Testers, EddBud, Fordy Runs, Seth James DeMoor, and more.

Real runners — user ratings and review threads from Running Warehouse, Fleet Feet, Dick's Sporting Goods, REI, Zappos, and Amazon. We also pull from Reddit communities including r/RunningShoeGeeks, r/running, and r/Ultramarathon, as well as LetsRun forums.

We do not write about a shoe until we have meaningful coverage from several of these source types. If a shoe only has early first-impression reviews, we wait.


How We Write Aggregated Reviews

Once we've gathered sources, we identify three things:

Trends — what do most reviewers agree on? If eight out of ten sources call a foam "lively and responsive," that's a trend. If two call it "too soft," that's a caveat worth noting.

Repeated praise — specific things that multiple reviewers love, often in similar language. These become the shoe's genuine strengths.

Recurring complaints — the most important signal. A complaint that appears across multiple reviewers, across different running styles and body types, is a real issue. A complaint that appears once may just reflect that reviewer's preferences.
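The sorting above is essentially frequency counting across sources. Here's a minimal sketch of that logic in Python. The function name, the 50% trend threshold, and the two-source caveat minimum are illustrative assumptions, not the site's published cutoffs:

```python
from collections import Counter

def classify_observations(observations, trend_threshold=0.5, caveat_min=2):
    """Group reviewer observations by how often they recur.

    observations: list of (source, tag) pairs, where `tag` is a
    normalized observation like "lively foam" or "too soft".
    Thresholds here are hypothetical, for illustration only.
    """
    total_sources = len({src for src, _ in observations})
    counts = Counter(tag for _, tag in observations)

    trends, caveats, one_offs = [], [], []
    for tag, n in counts.items():
        if n / total_sources >= trend_threshold:
            trends.append(tag)      # most reviewers agree: a trend
        elif n >= caveat_min:
            caveats.append(tag)     # recurring across sources: worth noting
        else:
            one_offs.append(tag)    # may just reflect one reviewer's preferences
    return trends, caveats, one_offs
```

With ten sources, eight tagged "lively foam" and two tagged "too soft", the first lands in trends and the second in caveats, matching the example above.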

We then write the review to reflect those trends honestly — including the caveats. A shoe that earns a 9.2/10 from us still has a "The caveat" section. That's not a bug. It's the point.

We do not claim firsthand testing. Every review on this site carries a disclaimer making this explicit. We are analysts, not testers. Our value is synthesis and objectivity — not personal experience.


How We Calculate Aggregated Ratings

Every shoe in our database has a single aggregated rating out of 10. Here's how we arrive at it.

We search broadly across all source types — lab reviewers, enthusiast reviewers, YouTube reviewers, Reddit threads, and retail user ratings. We then calculate a weighted score based on:

  • How many sources cover the shoe
  • The consistency of sentiment across sources
  • The depth of testing (long-term reviews weighted higher than first impressions)
  • Retail user ratings, which represent the largest sample of real-world runners

We weight the score based on what the shoe is actually for. A race shoe is rated on how good it is as a race shoe — not on whether it's comfortable for recovery runs. A stability shoe is rated on how well it guides overpronators — not on how fast it feels at tempo pace.

Scores are rounded to one decimal place. We only publish a rating when we have enough source coverage to be confident in it. If coverage is thin, we note that.
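A weighted score like the one described can be sketched as a weighted mean over sources. The depth weights and the log-scaled sample-size weighting below are assumptions for illustration; the site's actual formula isn't published:

```python
import math

def aggregate_rating(reviews):
    """Weighted mean of source ratings on a 0-10 scale.

    reviews: list of dicts with keys:
      score - the source's rating, normalized to 0-10
      depth - "long_term" or "first_impression"
      n     - sample size behind the score (1 for a single reviewer,
              the review count for a retail user rating)

    Weights are illustrative, not the site's published values.
    """
    # Long-term reviews count more than first impressions.
    DEPTH_WEIGHT = {"long_term": 1.0, "first_impression": 0.5}

    weighted_sum = weight_total = 0.0
    for r in reviews:
        # Larger samples count more, with diminishing returns.
        w = DEPTH_WEIGHT[r["depth"]] * math.log1p(r["n"])
        weighted_sum += r["score"] * w
        weight_total += w
    return round(weighted_sum / weight_total, 1)  # one decimal place
```

For example, a long-term 9.0 and a first-impression 8.0 (each from a single reviewer) combine to 8.7 rather than a flat 8.5, because the long-term review carries twice the weight.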


How We Pick the "Best Of"

Our "Best Of" articles — Best Daily Trainers of 2026, Best Race Shoes, and so on — follow a separate research process. For each category, we search across the same source list with category-specific queries ("best daily trainer 2026," "best marathon shoe 2026," etc.) and identify which shoes appear most frequently across top-pick lists.

The winner is the shoe that:

  1. Appears most frequently across expert top-pick lists
  2. Has the highest aggregated rating across sources
  3. Is currently available — not discontinued or superseded
  4. Has strong user review consensus — not just press attention

We pick one shoe per category. If the consensus is genuinely split between two shoes, we pick the one with the higher aggregated rating and explain the tradeoff in the article.
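The selection rules above amount to a filter plus a ranked sort. A minimal sketch, assuming a hypothetical 4.0-of-5 cutoff for "strong user review consensus" (the actual bar isn't specified):

```python
def pick_best(candidates):
    """Pick one winner per category from aggregated shoe data.

    candidates: list of dicts with keys:
      name       - shoe name
      mentions   - appearances across expert top-pick lists
      rating     - aggregated rating (0-10)
      available  - still sold, not discontinued or superseded
      user_score - retail user review consensus (0-5)

    The 4.0 user-score cutoff is an illustrative assumption.
    """
    in_play = [c for c in candidates
               if c["available"] and c["user_score"] >= 4.0]
    # Most top-pick mentions wins; the aggregated rating breaks ties,
    # mirroring the split-consensus rule described above.
    return max(in_play, key=lambda c: (c["mentions"], c["rating"]))
```

A discontinued shoe never wins no matter how often it's mentioned, and two shoes tied on mentions are separated by their aggregated ratings.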


What We Don't Do

We don't accept payments from brands to influence reviews or picks. We don't write about shoes we can't find meaningful independent coverage for. We don't fabricate sources or invent reviewer quotes. If we link to a source, it's a source we actually consulted.

We use AI to help synthesize information across large numbers of sources — but every article is reviewed and edited by human running-shoe analysts before publication. The AI speeds up the research. The humans make sure it's accurate.


Why This Matters

The running shoe market is noisy. New shoes launch every month. Brands spend heavily on marketing. Review sites face pressure from affiliate relationships and sponsored content. It's genuinely hard to know what to believe.

Our goal is to be the source you can trust precisely because we don't have a personal stake in any particular shoe winning. We just read what everyone else is saying, find the signal in the noise, and tell you what it means.

That's The Shoe Nerds.


Take the Quiz

Not sure where to start? Our shoe finder quiz asks you a few quick questions about how you run and gives you personalized recommendations based on your actual needs.
