Leasebox · Trinitas · Confidential
Campaign review

Reflex Blu campaign review

Review of the 82-day Reflex Blu flight for Atlas at Richland Road and Current on Center, compared against GA4 site data and industry benchmarks.

Flight
Jan 9 — Mar 31, 2026
Properties
Atlas · Current on Center
Impressions reviewed
~5.35 M
Prepared
Apr 14, 2026

What's in this review

§ 02

Three sources, one flight window. Each answers a different question; together they show a consistent picture.

Source 1

Reflex Blu campaign report

Their Mar 31 deck: impressions, clicks, CTR, frequency, and the conversion figures they used to characterize the flight.

Source 2

Atlas exchange placement file

51,028 rows covering every placement across 7,437 unique domains and apps. Shows where the impressions actually ran.

Source 3

GA4 data for both sites

Reflex Blu traffic is UTM-tagged blu/*, so it can be isolated and compared against every other source on the site.

§ 10 tests Reflex Blu's own explanation for the short-session volume. The closing slides summarize and list next steps.


What Reflex Blu reported

§ 03

Aggregated across both properties and both channels from the Mar 31 Reflex Blu campaign review.

Impressions
5.35M
Meta + Display, both sites
Clicks
6,515
Blended CTR 0.12%
Applicant approvals
4
3 Current · 1 Atlas
Leasing-office visits
177
RFID geofence signal
Line item | Impressions | Clicks | CTR | Reported outcome
Current · Meta — Prospecting | 234,749 | 606 | 0.26% | 3 approvals
Current · Meta — RFID | 431,933 | 1,176 | 0.27% | —
Current · Display — Prospecting | 758,872 | 354 | 0.047% | —
Current · Display — RFID | 1,508,636 | 1,235 | 0.082% | 63 visits
Atlas · Meta — Prospecting | 115,223 | 341 | 0.30% | 1 approval
Atlas · Meta — RFID | 433,340 | 1,358 | 0.31% | —
Atlas · Display — Prospecting | 339,880 | 153 | 0.045% | —
Atlas · Display — RFID | 1,531,317 | 1,292 | 0.084% | 114 visits

Source: Trinitas Campaign Review 03312026.pdf.
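The blended totals can be re-derived from the eight line items above. A minimal sketch, with figures copied from the table:

```python
# Re-derive the blended totals from the eight Reflex Blu line items.
# (impressions, clicks) pairs copied from the Source 1 table above.
line_items = {
    "Current · Meta — Prospecting":    (234_749, 606),
    "Current · Meta — RFID":           (431_933, 1_176),
    "Current · Display — Prospecting": (758_872, 354),
    "Current · Display — RFID":        (1_508_636, 1_235),
    "Atlas · Meta — Prospecting":      (115_223, 341),
    "Atlas · Meta — RFID":             (433_340, 1_358),
    "Atlas · Display — Prospecting":   (339_880, 153),
    "Atlas · Display — RFID":          (1_531_317, 1_292),
}

impressions = sum(imps for imps, _ in line_items.values())
clicks = sum(clk for _, clk in line_items.values())
blended_ctr = clicks / impressions

print(f"{impressions:,} impressions · {clicks:,} clicks · CTR {blended_ctr:.2%}")
```

The sums land on the deck's headline figures: ~5.35 M impressions, 6,515 clicks, 0.12% blended CTR.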


Industry benchmarks for comparison

§ 04

The Reflex Blu deck cites its own benchmarks. Public industry benchmarks sit higher. Both are shown below.

Channel | Reflex Blu reference | Industry reference | Flight result
Display — standard | 0.03 – 0.05% | 0.08 – 0.10% | 0.045 – 0.06% CTR
Display — RFID audience | 0.03 – 0.05% | 0.08 – 0.10% | 0.082 – 0.15% CTR
Meta (Facebook / Instagram) | 0.16 – 0.20% | ~0.99% (real estate) | 0.26 – 0.35% CTR
The Meta comparison is direct — the WordStream benchmark measures the same link-click CTR cited in the deck. The display range is a central estimate; it varies by source.

Sources: WordStream Facebook Ads Industry Benchmarks 2024 (real-estate vertical); Basis / Centro / Databox programmatic display reports 2024.


Where the ads ran

§ 05

The Atlas exchange report covers 1.74 M impressions across 7,437 unique domains and apps. Most of the top publishers are mobile games and legacy webmail.

Publisher | Impressions | CTR
Yahoo Mail | 66,981 | 0.004%
Happy Color (game) | 46,240 | 0.052%
Wordscapes | 42,626 | 0.040%
Solitaire Associations | 39,101 | 0.041%
iFunny (meme app) | 37,750 | 0.029%
Woodoku | 34,233 | 0.023%
KakaoTalk messenger | 29,056 | 0.021%
Vita Mahjong | 28,960 | 0.086%
AOL Mail | 26,717 | 0.004%
Jigsawscapes puzzles | 21,795 | 0.133%
Scrabble GO | 20,178 | 0.005%
Grindr (dating app) | 16,137 | 0.099%
The higher-CTR rows are mobile games with interstitial ads — a format known for accidental-tap clicks. The low-CTR rows (Yahoo Mail, AOL, Nativo 0.007%) are traditional display surfaces.

Source: Trinitas_-_Atlas_Richland_Rd_-_ExchangeReport_012826-032526.xlsx.
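A quick way to characterize this inventory mix is the impression-weighted CTR across the top placements. The sketch below uses the twelve rows listed above; per-placement clicks are back-computed from impressions × CTR, so treat the result as approximate:

```python
# Impression-weighted CTR across the top twelve placements from the
# exchange-file excerpt. Clicks are back-computed (imps × CTR), so the
# result is approximate.
top_placements = [
    ("Yahoo Mail", 66_981, 0.004),   ("Happy Color", 46_240, 0.052),
    ("Wordscapes", 42_626, 0.040),   ("Solitaire", 39_101, 0.041),
    ("iFunny", 37_750, 0.029),       ("Woodoku", 34_233, 0.023),
    ("KakaoTalk", 29_056, 0.021),    ("Vita Mahjong", 28_960, 0.086),
    ("AOL Mail", 26_717, 0.004),     ("Jigsawscapes", 21_795, 0.133),
    ("Scrabble GO", 20_178, 0.005),  ("Grindr", 16_137, 0.099),
]

total_imps = sum(imps for _, imps, _ in top_placements)
est_clicks = sum(imps * ctr / 100 for _, imps, ctr in top_placements)
weighted_ctr = 100 * est_clicks / total_imps  # percent

print(f"{total_imps:,} imps · weighted CTR ≈ {weighted_ctr:.3f}%")
```

The weighted figure for these placements sits inside the 0.03 – 0.05% display band, with the games pulling it up and the webmail surfaces pulling it down.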


Current on Center — GA4 traffic by source

§ 06

GA4 property 484378882, full flight, sources ordered by session volume. Reflex Blu traffic is UTM-tagged blu/*.

Source / medium | Sessions | Engaged | Engagement rate | Key events
direct / (none) | 2,718 | 1,808 | 66.5% | 3,034
google / cpc | 2,630 | 1,698 | 64.6% | 2,393
google / organic | 1,756 | 1,299 | 74.0% | 2,016
gro_marketing / facebook | 5,020 | 1,126 | 22.4% | 459
blu / display (Reflex Blu) | 3,362 | 163 | 4.8% | 5
blu / social | 486 | 49 | 10.1% | 13
blu / paid_social | 421 | 35 | 8.3% | 13
Reflex Blu's three channels combined: 4,269 sessions, 31 key events. Google CPC at fewer sessions: 2,393 key events.

Atlas at Richland Rd — GA4 traffic by source

§ 07

GA4 property 484357026. Atlas is informative because blu/display and google/cpc arrived at nearly identical session volumes — a near-controlled comparison.

Source / medium | Sessions | Engaged | Engagement rate | Key events
direct / (none) | 3,303 | 2,102 | 63.6% | 3,098
google / cpc | 2,502 | 1,676 | 67.0% | 2,025
google / organic | 1,734 | 1,242 | 71.6% | 1,868
gro_marketing / facebook | 937 | 326 | 34.8% | 310
blu / display (Reflex Blu) | 2,497 | 25 | 1.0% | 1
blu / paid_social | 137 | 37 | 27.0% | 8
blu / social | 72 | 13 | 18.1% | 3
zillow / cpc | 92 | 74 | 80.4% | 97
Same session volume as Google CPC. One key event versus 2,025. A Zillow CPC line 27× smaller produced 97 key events.

Key events per 1,000 sessions (Atlas)

§ 08

For every 1,000 sessions a channel delivers, how many produce a tracked key event. Computed from the previous slide.

google / organic: 1,077
zillow / cpc: 1,054
direct: 938
google / cpc: 809
gro_marketing / fb: 331
blu / paid_social: 58
blu / social: 42
blu / display: 0.4
blu/display: 0.4 key events per 1,000 sessions. Google CPC: 809. Same property, same window. That gap is what this review is trying to explain.
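These rates fall straight out of the § 07 table. A minimal sketch, with (sessions, key events) pairs copied from that slide:

```python
# Key events per 1,000 sessions for Atlas, from the § 07 GA4 table.
# Values are (sessions, key events) per source/medium.
atlas = {
    "google / organic":   (1_734, 1_868),
    "zillow / cpc":       (92, 97),
    "direct":             (3_303, 3_098),
    "google / cpc":       (2_502, 2_025),
    "gro_marketing / fb": (937, 310),
    "blu / paid_social":  (137, 8),
    "blu / social":       (72, 3),
    "blu / display":      (2_497, 1),
}

per_1k = {src: 1_000 * events / sessions
          for src, (sessions, events) in atlas.items()}

for src, rate in sorted(per_1k.items(), key=lambda kv: -kv[1]):
    print(f"{src:<22}{rate:>8.1f}")
```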

Site totals: before and during the flight

§ 09

Each property vs its own 82-day baseline (Oct 19 – Jan 8). Sessions rose; per-session quality softened.

Property / period | Sessions | Users | Avg duration | Bounce rate | Key events
Current · baseline | 6,847 | 4,434 | 233 s | 40.7% | 4,885
Current · flight | 18,985 | 12,590 | 171 s | 60.7% | 9,892
Atlas · baseline | 12,410 | 7,806 | 177 s | 37.6% | 7,261
Atlas · flight | 13,874 | 9,883 | 156 s | 50.4% | 9,697
Average duration down 27% at Current and 12% at Atlas; bounce up 20 and 13 points. Remove blu/* and the other channels sit at baseline.

Sources: GA4 properties 484378882 (Current on Center) and 484357026 (Atlas). Run-report API, Apr 14, 2026.
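As a check, the duration and bounce shifts can be recomputed from the table above:

```python
# Per-property change between baseline and flight. Each value pair is
# (avg session duration in seconds, bounce rate in %), from the table above.
periods = {
    "Current": {"baseline": (233, 40.7), "flight": (171, 60.7)},
    "Atlas":   {"baseline": (177, 37.6), "flight": (156, 50.4)},
}

deltas = {}
for prop, p in periods.items():
    (d0, b0), (d1, b1) = p["baseline"], p["flight"]
    deltas[prop] = (100 * (d1 - d0) / d0, b1 - b0)  # (% duration change, bounce pts)
    print(f"{prop}: duration {deltas[prop][0]:+.1f}%, bounce {deltas[prop][1]:+.1f} pts")
```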


Geographic filter: testing the noise hypothesis

§ 10

Reflex Blu's explanation: DSPs validate click-through URLs from datacenter IPs, creating short "sessions" that aren't real users. Testing it by filtering GA4 to each property's local metro.

Channel · property | Sessions | In-market | In-market % | In-market key events
google / cpc · Current | 2,880 | 998 | 34.7% | 968
google / cpc · Atlas | 2,895 | 599 | 20.7% | 537
gro_marketing / fb · Atlas | 944 | 487 | 51.6% | 202
gro_marketing / fb · Current | 5,150 | 2,283 | 44.3% | 260
direct · Current | 3,075 | 483 | 15.7% | 584
direct · Atlas | 3,651 | 390 | 10.7% | 512
blu / display · Current | 3,326 | 385 | 11.6% | 1
blu / display · Atlas | 2,494 | 18 | 0.7% | 0
Partially supported: blu/display has the lowest in-market share of any paid channel (0.7% Atlas, 11.6% Current), so some volume is plausibly validator traffic. But in-market sessions produced 0 and 1 key events, versus 537 and 968 for google/cpc. The filter shrinks the denominator without changing the outcome.

In-market = GA4 city field inside the local metro (NW Arkansas / Auburn–Opelika) with matching state. IP-derived and imperfect; for student housing it errs toward under-counting legitimate remote prospects, which makes this a conservative test.
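The "shrinks the denominator without changing the outcome" point can be restated numerically from the table above. A sketch for the two channels being compared:

```python
# In-market share vs in-market conversion rate, from the § 10 table.
# Value triples are (total sessions, in-market sessions, in-market key events).
rows = {
    "google / cpc · Current":  (2_880, 998, 968),
    "google / cpc · Atlas":    (2_895, 599, 537),
    "blu / display · Current": (3_326, 385, 1),
    "blu / display · Atlas":   (2_494, 18, 0),
}

summary = {}
for name, (sessions, in_mkt, events) in rows.items():
    share = 100 * in_mkt / sessions
    rate = events / in_mkt if in_mkt else 0.0  # key events per in-market session
    summary[name] = (share, rate)
    print(f"{name:<26} in-market {share:5.1f}% · events/in-market session {rate:.3f}")
```

Google CPC converts at roughly 0.9 – 1.0 key events per in-market session; blu/display stays near zero even after the filter.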


What's measured, what isn't

§ 11

The deck's conversion figures and GA4's figures describe different things. Both are legitimate with different error modes.

In the deck

4 applicant approvals

Back-matched from Trinitas's leasing system against Reflex Blu's audience lists. Useful but depends on PII matching and has no control group.

In the deck

177 leasing-office visits

RFID geofence pings for device IDs in the audience who appeared near the leasing office. A proximity signal — captures incidental pedestrians alongside walk-ins.

Not yet installed

No Meta Pixel or conversion pixel on the sites

Reflex Blu's "Next Steps" slide lists pixel installation as a future task. Until then the DSP can optimize toward clicks but not toward on-site behavior.

In GA4

Clickstream attribution via UTMs

Because blu/* is tagged on every ad URL, GA4 reports exactly what those clicks do after they arrive. This is what §§ 06–07 show.
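Concretely, GA4 derives source / medium by parsing the UTM parameters on the landing URL. A minimal sketch (the URL and campaign value here are hypothetical, not taken from the actual creative):

```python
# How a blu/* row is isolated: GA4 reads utm_source and utm_medium from the
# tagged landing URL. The URL and campaign value below are hypothetical.
from urllib.parse import urlparse, parse_qs

url = ("https://example.com/floorplans"
       "?utm_source=blu&utm_medium=display&utm_campaign=rfid_flight")

params = parse_qs(urlparse(url).query)
source_medium = f"{params['utm_source'][0]} / {params['utm_medium'][0]}"
print(source_medium)  # blu / display
```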


Explaining the gap

§ 12

Three things appear to be happening at once on the blu/display line. None require bad intent; together they explain the gap.

Factor A

DSP validator and link-test noise

Pre-bid URL checks and ad-verification scanners hit landing pages from datacenter IPs that don't resolve to Auburn or Fayetteville. The geo filter confirms this: blu/display has the lowest in-market share of any channel. It explains a slice of the short-session volume — the part Reflex Blu described.

Factor B

Accidental taps on mobile-game interstitials

The exchange file's top publishers are Wordscapes, Happy Color, Woodoku, Solitaire, Scrabble GO. Interstitial formats produce elevated CTR from close-button fat-fingers. The resulting sessions bounce in under two seconds. In GA4 they look identical to Factor A and make up a larger share of the volume.

Factor C

No pixel → optimize toward click

Without a conversion pixel on the Trinitas sites, the DSP has no on-site signal to learn from. Its only feedback loop is CTR, so it concentrates spend on whatever inventory produces the cheapest clicks — which is the inventory in Factor B.

Why the filter doesn't rescue it

Even in-market, the channel stays flat

Current in-market blu/display: 385 sessions, 1 key event. Atlas in-market: 18 sessions, 0 key events. If Factor A were the main story, filtering would recover meaningful engagement on the remaining slice. It does not — because Factors B and C affect in-market taps equally.

The deck and GA4 are both telling the truth about different events. It's the predictable output of a media setup without on-site measurement running against uncurated inventory.


Next steps

§ 13

Four actions. 1 and 2 are no-regret regardless of how the rest of this lands.

01

Install conversion tracking on both sites

Meta Pixel and GA4 conversion events. Until this is done, every media partner — Reflex Blu included — is flying partially blind. One-day task.

02

Ask Reflex Blu for spend and CPMs by line item

The deck doesn't include dollars. Cost per key event (their invoices combined with the GA4 figures in §§ 06–07) is the number that makes the display-line decision obvious either way.

03

Discuss inventory curation with Reflex Blu

Specifically: app-category exclusions for casual games, interstitial-format exclusions, and a reviewed publisher allowlist. A typical multifamily programmatic buy runs on a much shorter list than 7,437 placements.

04

Run a holdout test once the pixel is live

Pause Reflex Blu display on one property for 30 days while keeping it on the other. Compare key events on each. This is the only clean way to test whether the display line contributes incrementally.
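One way to read out such a holdout, comparing each property to its own pre-test period so the on/off ratio controls for the level difference between the two sites. The counts below are hypothetical placeholders, not projections:

```python
# Sketch of a holdout readout with HYPOTHETICAL key-event counts. The real
# inputs would be GA4 key events per property for the 30-day test window
# and an equal-length pre-test window.
def holdout_lift(test_on, pre_on, test_off, pre_off):
    """Ratio-of-growth (difference-in-differences style) incremental lift."""
    return (test_on / pre_on) / (test_off / pre_off) - 1

# Hypothetical counts: property keeping display vs property pausing it.
print(f"incremental lift ≈ {holdout_lift(1_050, 1_000, 980, 1_000):+.1%}")
```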


Sources and methodology

§ 14 · End

Every figure in this deck traces to a file or an API call that can be rerun.

Source | Scope | Content
Trinitas Campaign Review 03312026.pdf | Reflex Blu self-report, 8 line items, 1/9 – 3/31/26 | Impressions, clicks, CTR, reach, frequency
Atlas Exchange Report 012826–032526.xlsx | Placement-level data for Atlas display | 51,028 rows · 7,437 unique placements
GA4 Data API · property 484378882 | Current on Center · flight + baseline | Sessions, users, engagement, key events, source/medium, geo
GA4 Data API · property 484357026 | Atlas at Richland Rd · flight + baseline | Sessions, users, engagement, key events, source/medium, geo
WordStream Facebook Benchmarks 2024 | Real-estate vertical link-click CTR | Industry avg ~0.99%
Basis / Centro / Databox display benchmarks | Programmatic display CTR | Industry avg 0.08 – 0.10%

Prepared by Leasebox for Trinitas Ventures leadership. Questions, methodology requests, or re-runs of any table — reach out directly. All GA4 queries are rerunnable on file.

April 14, 2026 · Confidential

End of review · Leasebox