
# Before vs after: what changes when ads get author comments

A clean way to measure what really changes after proactive Page comments are added: CTR, CVR, quality signals, and how to avoid fake wins from spend shifts.

Most “before vs after” comparisons are useless because people compare different ads, different weeks, and different budgets.

If you want a real answer, treat author comments like a small change to the ad unit and measure it the same way you’d measure any performance change: tight cohorts, stable spend, and a control group.

The thread under an ad isn’t decoration. It’s where users look for confirmation, answers, and red flags. When you shape that environment early (with a short, helpful Page comment), you often change behavior in two places: the click decision and the conversion decision.

## What usually changes after adding author comments

### 1) Faster trust, fewer drop-offs

People don’t buy because they saw your headline. They buy when they believe you and understand the next step.

A good first comment compresses the decision into: proof + clarity + path. It makes the ad feel “alive” and reduces the time it takes for a user to decide whether your offer is real.

### 2) Cleaner click path

A lot of users do not click the primary CTA immediately. They scan the thread, look for legitimacy, and hunt for a direct link.

If the first thing they see is a one-liner plus the correct landing page, you remove friction. That can show up as a CTR lift, especially on mobile.

### 3) More stable quality perception

Messy threads create “this feels sketchy” signals. Helpful threads create “this is normal” signals.

You can’t control what everyone says, but you can control your first move: short, human, accurate, and not spammy.

## The clean before/after setup (don’t cheat)

You’re trying to answer one question: **Does adding proactive author comments improve outcomes versus similar ads that didn’t change?**

Use these rules:

- Compare ads with the same offer, market, optimization goal, and landing page type.
- Keep spend stable (no scaling-week vs low-spend-week comparisons).
- Use the same time window (seasonality is real).
- Keep a control group (a slice of ads with no author comments).

Without a control group, you’ll accidentally measure budget movement and call it “comment impact.” Humans love doing that.

## Metrics to track (simple, not cute)

### Core performance

- Link CTR (or outbound CTR)
- CVR (landing page conversion rate)
- CPA / cost per purchase

### Thread health

- Negative feedback trend (hides/reports, if you track them)
- Objection frequency (what users keep asking)
- Comment link clicks (UTMs help)
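The core performance metrics above are three simple ratios. A minimal sketch of the arithmetic, assuming you export per-ad counts of impressions, link clicks, conversions, and spend (field names here are illustrative, not a specific ads API schema):

```python
def funnel_metrics(impressions: int, link_clicks: int,
                   conversions: int, spend: float) -> dict:
    """Compute link CTR, CVR, and CPA; None when the ratio is undefined."""
    ctr = link_clicks / impressions if impressions else None   # clicks per impression
    cvr = conversions / link_clicks if link_clicks else None   # conversions per click
    cpa = spend / conversions if conversions else None         # cost per conversion
    return {"ctr": ctr, "cvr": cvr, "cpa": cpa}

# Example: 50,000 impressions, 600 link clicks, 30 purchases, $450 spend
m = funnel_metrics(50_000, 600, 30, 450.0)
print(m)  # → {'ctr': 0.012, 'cvr': 0.05, 'cpa': 15.0}
```

Guarding the zero-denominator cases matters in practice: low-spend ads in a cohort will regularly have zero clicks or zero conversions on a given day.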

## What a real “win” looks like

A real win is not “more comments.” A real win is:

- CTR improves (clearer next step)
- CVR improves (less confusion on arrival)
- CPA drops or stabilizes (money)

If only CTR goes up but CVR drops, your comment might be pushing low-intent clicks. That’s not a win; that’s a bill.
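The win criteria above can be encoded as a small classifier, so before/after results get labeled consistently instead of by vibes. This is my own illustration; the dict keys and labels are assumptions:

```python
def classify(before: dict, after: dict) -> str:
    """Label a before/after result: a real win needs CTR and CVR
    up (or flat) and CPA down (or flat). Dicts hold 'ctr', 'cvr', 'cpa'."""
    ctr_up = after["ctr"] >= before["ctr"]
    cvr_up = after["cvr"] >= before["cvr"]
    cpa_ok = after["cpa"] <= before["cpa"]
    if ctr_up and cvr_up and cpa_ok:
        return "win"
    if ctr_up and after["cvr"] < before["cvr"]:
        return "low-intent clicks"  # CTR up but CVR down: not a win
    return "no clear win"

before = {"ctr": 0.010, "cvr": 0.050, "cpa": 18.0}
after  = {"ctr": 0.013, "cvr": 0.040, "cpa": 21.0}
print(classify(before, after))  # → low-intent clicks
```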

## Two failure modes that ruin results

### 1) Wrong link, wrong market, wrong language

This kills trust instantly. At high volume, a small error rate becomes constant damage.

### 2) Spam vibes

Repetitive, overly salesy comments look like automation. Users respond with annoyance, not purchases.

## The repeatable playbook

1) Pick 2–3 stable markets.
2) Create a matched cohort of ads.
3) Add author comments to half.
4) Run 7–14 days.
5) Compare CTR, CVR, CPA.
6) Review thread health (questions, sentiment, negative feedback).
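The comparison step above amounts to pooling raw counts per cohort, then computing relative lift of the commented cohort against the control. A sketch under those assumptions (field names and sample numbers are invented for illustration):

```python
def aggregate(ads: list[dict]) -> dict:
    """Pool raw counts across a cohort, then compute ratio metrics.
    Pooling counts first avoids averaging ratios across unequal-size ads."""
    imp = sum(a["impressions"] for a in ads)
    clk = sum(a["clicks"] for a in ads)
    cnv = sum(a["conversions"] for a in ads)
    spd = sum(a["spend"] for a in ads)
    return {"ctr": clk / imp, "cvr": cnv / clk, "cpa": spd / cnv}

def relative_lift(test: dict, control: dict) -> dict:
    """Per-metric relative change of test vs control.
    For CTR/CVR positive is good; for CPA negative is good."""
    return {k: (test[k] - control[k]) / control[k] for k in test}

commented = [{"impressions": 40_000, "clicks": 520, "conversions": 26, "spend": 390.0}]
control   = [{"impressions": 42_000, "clicks": 500, "conversions": 22, "spend": 410.0}]
lift = relative_lift(aggregate(commented), aggregate(control))
print(lift)
```

Note the direction flip on CPA: a negative lift there is the money metric moving the right way, which is exactly the "shows up in money metrics" check.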

If the lift is real, it shows up in money metrics, not in vibes.

## Bottom line

Author comments work when they act like a tiny in-feed FAQ and a trust anchor: short, accurate, and market-correct.

The advantage at scale is consistency: correct link, correct language, correct tone, every time.

Sources (for further reading):

- Meta Business Help Center: Ad Relevance Diagnostics and ad quality concepts
- Meta Business Help Center: comment moderation tools and best practices
- Spiegel Research Center (Northwestern / Medill): how reviews influence conversion behavior
- Nielsen: research on trust in advertising and recommendations
