The people behind the vault
We got tired of
reviews written by people
who never touched the product.
“The product that got this started? A $65 ceramic spray that destroyed the clear coat on a panel we’d spent two weeks preparing. The Amazon listing had 1,400 five-star reviews.”
Garage Vault started out of frustration, not ambition. In 2022, after we wasted close to $300 on automotive accessories over six months (a phone mount that cracked in direct sun, a detail spray that hazed a black hood, a jump starter that failed to start a cold 5.0 in January), it became clear that the Amazon review system in the automotive accessories space had broken down. The products with the most reviews weren't the best products. They were the best-marketed ones.
The first Garage Vault piece was a comparison of three ceramic detail sprays, bought on the same afternoon, applied to three panels of the same car, and photographed at 30 and 60 days under the same conditions. Two of the three, including the one carrying the Amazon's Choice badge, showed clear degradation by week six. That piece went live with the specific brand names, the specific lot numbers, and the photos. It got shared. People wanted more.
The first few months were rough in ways that aren’t usually talked about in the origin stories affiliate sites tell about themselves. We bought 22 products. We published 9 reviews. The rest either failed testing outright or didn’t produce findings clear enough to commit to a verdict. That’s a burn rate that doesn’t make business sense on a short timeline — but it makes sense if you’re trying to build something you’d actually read yourself. Which was always the only goal.
“We published reviews of 9 of the 22 products we tested in the first four months. The other 13 were our cost of figuring out what an honest bar looks like.”
80+ products tested since then. 3 previously recommended products delisted. Roughly a quarter of everything tested has never been published. Those numbers don’t make the site look efficient — but they make the recommendations worth something.
Mission — Day to Day
Buy it. Use it. Publish what’s true.
Every week, at least one automotive product is purchased at retail price, used under real driving or garage conditions, and evaluated alongside at least one competitor at the same price point. We track build quality, actual performance metrics where measurable, and whether the product holds up past the point where most first-impression reviews stop paying attention. If a product fails at six weeks, the review reflects that. If something we already published fails at six months, the page gets updated and the recommendation gets pulled.
Vision — The Longer Game
A version of Amazon where the best product wins — not the best-reviewed one.
The automotive accessories category rewards review farming, white-labeling, and paid placements. It penalizes products that are actually better but harder to market. We want to be the resource that slowly, category by category, makes the better product more visible — not by gaming the system, but by doing the documentation work the system itself refuses to do. When enough people make buying decisions based on real test data instead of manufactured social proof, the incentive structure shifts. That’s the long game.
How we operate
The six rules
we actually follow.
The 6-Week Floor
No verdict before six weeks of use. Not negotiable.
The window between “first impression” and “this is actually how it works” is where bad products hide. A floor mat looks great at two weeks. By week seven, you know whether it’s curling at the heels and trapping water under the edges. We don’t publish before we have that answer.
Own Money Only
We buy everything we test. No review samples. No brand loans.
A brand that sends you a free product has already bought something from you — your independence. Even when we receive unsolicited samples (it happens), we don’t review them. If we want to cover a product, we buy it from the same place you would. Same listing, same variation, same shipping. No special treatment.
Name the Competitor
Every winner gets tested against at least one other option at the same price.
“The best dash cam under $100” is a meaningless statement without knowing what it beat. We buy competing products simultaneously, run them under the same conditions, and publish the comparison data in full, including the tests where the losing product performed better on specific metrics. Context isn't optional.
Kill Your Darlings
Previously recommended products get pulled if long-term testing says so.
Three products have been delisted since launch. In each case, the initial review was accurate at the time. Six months later, reality disagreed. When that happens, we update the page with a delisting notice and the specific reason: adhesive failure, capacity degradation, material breakdown. The page stays up. The recommendation comes down.
No Paid Spots. Ever.
Position on this site cannot be purchased, sponsored, or arranged.
We don’t have sponsored review programs, promoted placement tiers, or brand partnership structures. The brands we cover don’t know they’re being covered until after the piece goes live. We’ve received emails from PR reps offering “collaboration opportunities” after publishing. The answer is always no. Affiliate commissions from Amazon are the only financial relationship we have with the products we cover.
Show the Losing Data
The products that didn’t make it get published too — just not recommended.
When a product doesn't clear our testing bar, we don't just quietly exclude it. In most comparison pieces, you'll see the full performance data for every product tested, including the ones that lost. That data is useful. Knowing that a $90 jump starter produced worse cold-cranking results than the $55 unit next to it tells you more than the winner's spec sheet does.
How a review actually gets made
From cart to verdict:
what the process looks like.
Most affiliate-site reviews are written from Amazon listings and manufacturer spec sheets. Here's what our process actually involves, including the parts that cost us money and time before a word gets published.
Before We Buy
Category mapping and competitor selection
We start by identifying which products in a category are actually distinct versus which share the same factory origin under different branding. We cross-reference FCC IDs, internal model numbers visible in packaging photos, and known white-label manufacturer databases. This step determines how many products we buy — and occasionally reveals that five “different” products are actually one.
Day One
Retail purchase, baseline inspection, first-use documentation
Everything is purchased from Amazon at standard retail price, the same way you would buy it. On arrival: packaging condition, contents checked against the listing, and a first physical inspection for material quality and build tolerances. About 15% of products get returned at this stage because the gap between the listing photos and the actual product is significant enough to document on its own.
Weeks 1–6
Real-condition use — not staged, not ideal circumstances
Products get used the way you’d actually use them. Dash cams run through daily commutes, highway miles, overnight parking, and temperature variation. Detail products get applied to real panels under real environmental conditions, not climate-controlled demo environments. We specifically look for failure conditions — not to destroy products, but because that’s where the meaningful data lives.
The Verdict
A committed position — then follow-up at 3 months and 6 months
The published verdict names a winner, explains the margin, and includes the full comparison data, losing products included. No “it depends on your needs” hedging unless the use-case split is genuinely significant. Then it gets revisited. At 3 months and 6 months, we re-examine the products that made it to publication. Some get confirmed. Three have been pulled. Both outcomes get documented.
You’ve read this far. Good.
The reviews are waiting.
So is the mailing list.
If you found your way to this page, you probably already know what you’re looking for — a site that treats you like you can handle real information, including the unflattering kind. That’s what Garage Vault is. The reviews are organized by category. The newsletter goes out when there’s something actually worth sending. Neither will waste your time.
Skeptical? Good. That’s exactly the right starting point.
