Lenses and Reports
A) User Behavior
Traffic Quality presents a normalized view of source effectiveness on a 0–100 scale so you can judge performance beyond raw session counts; typical output includes an overall quality score and per-source scores used throughout the module (e.g., Email 82/100, Organic 78/100, Direct 72/100).
Engagement Quality by Source breaks down average session duration, pages per session, bounce rate, engaged-session share, and scroll depth so teams see how users actually interact before conversions shift (e.g., Email with longer sessions and a higher engaged share than Social).
Time-Based Trends summarize weekly sessions and users to spot acceleration or drag early, with clean week-over-week lines for each source.
Peak Performance Windows call out the best day and hour by channel, making it straightforward to schedule campaigns and content (e.g., Email Thursday 9:00 AM; Paid Monday 11:00 AM).
Cohort & Device Breakdowns reveal quality gaps between new vs. returning users and across desktop, mobile, and tablet, helping prioritize where to optimize first.
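As an illustration, a minimal sketch of how a 0–100 quality score might blend these engagement signals. The weights, caps, and the `SourceEngagement` shape are assumptions for the example, not the module's actual formula:

```python
from dataclasses import dataclass

@dataclass
class SourceEngagement:
    avg_session_sec: float   # average session duration in seconds
    pages_per_session: float
    bounce_rate: float       # 0.0-1.0
    engaged_pct: float       # 0.0-1.0 share of engaged sessions

def quality_score(e: SourceEngagement) -> int:
    """Blend engagement signals into a 0-100 score (illustrative weights)."""
    duration = min(e.avg_session_sec / 180.0, 1.0)   # cap credit at 3 minutes
    depth = min(e.pages_per_session / 4.0, 1.0)      # cap credit at 4 pages
    score = 100 * (0.30 * duration
                   + 0.25 * depth
                   + 0.25 * e.engaged_pct
                   + 0.20 * (1.0 - e.bounce_rate))   # lower bounce scores higher
    return round(score)

email = SourceEngagement(avg_session_sec=210, pages_per_session=3.4,
                         bounce_rate=0.32, engaged_pct=0.71)
print(quality_score(email))
```

Normalizing each signal before weighting is what keeps channels with very different raw volumes comparable on one scale.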
Geographic Variations list top countries with their best-performing source to guide localization and regional spend (e.g., UK led by Organic; US led by Email).
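Picking the best-performing source per country is a simple reduction over scored rows; the sample rows and scores below are illustrative, not real module output:

```python
# Rows of (country, source, quality_score); keep the best source per country.
rows = [
    ("UK", "Organic", 81), ("UK", "Email", 76),
    ("US", "Email", 84), ("US", "Paid", 70),
]

best: dict[str, tuple[str, int]] = {}
for country, source, score in rows:
    if country not in best or score > best[country][1]:
        best[country] = (source, score)

print(best)
```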
Behavioral analytics from GA4 (bounce, exit, scroll depth, average time, and average time to conversion) appear in one consolidated chart per campaign, with metric and duration selectors, a last-synced timestamp, hover tooltips, and AI summaries beneath the graph for quick interpretation. The view supports CSV export and shows a connect prompt if GA/GTM isn't linked.
B) Ads Intelligence
Creative Lifecycle flags performance decay and refresh timing by tracking creative curves over time, preventing overspend on fatigued assets.
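One hedged sketch of decay detection: compare the trailing window's average CTR against the best window seen so far and flag when it falls below a threshold. The window length and 30% drop threshold are assumptions, not the product's tuned values:

```python
def needs_refresh(daily_ctr: list[float],
                  window: int = 7, drop: float = 0.30) -> bool:
    """Flag a creative whose recent CTR has decayed from its peak.

    Compares the trailing `window`-day average against the best
    `window`-day average observed; thresholds are illustrative.
    """
    if len(daily_ctr) < 2 * window:
        return False  # not enough history to judge
    means = [sum(daily_ctr[i:i + window]) / window
             for i in range(len(daily_ctr) - window + 1)]
    recent, peak = means[-1], max(means)
    return recent <= peak * (1 - drop)

steady = [0.031, 0.030, 0.032, 0.031, 0.030, 0.031, 0.032,
          0.030, 0.031, 0.030, 0.031, 0.032, 0.030, 0.031]
fatigued = [0.030] * 7 + [0.024, 0.022, 0.020, 0.019, 0.018, 0.017, 0.016]
print(needs_refresh(steady), needs_refresh(fatigued))
```

Averaging over a window rather than reacting to single-day dips is what prevents premature refreshes on noisy data.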
Message–Market Fit correlates messages with audience response so you can pair the right headlines and CTAs with the right segments.
Audience Fatigue highlights oversaturation early and suggests rotation windows.
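A common proxy for oversaturation is average frequency (impressions divided by unique reach); a minimal sketch, assuming a frequency cap of 4 that is illustrative rather than the module's actual rule:

```python
def fatigue_flag(impressions: int, reach: int, freq_cap: float = 4.0) -> bool:
    """Flag oversaturation when average frequency exceeds a cap (assumed 4.0)."""
    return reach > 0 and impressions / reach > freq_cap

# 52,000 impressions over 10,000 people = 5.2 average frequency -> fatigued
print(fatigue_flag(impressions=52_000, reach=10_000))
```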
Platform Synergy compares how platforms work together so integrated plans can re-balance spend to combinations that lift each other.
C) Competition
Trending Topics surfaces emerging themes early so you can ship pages and assets while the window is open.
Keyword Gaps identifies queries competitors win that you don’t, with practical bid/content guidance to capture missed demand.
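At its core, a keyword gap is a set difference between competitor-ranked queries and your own; the sample keyword sets below are hypothetical:

```python
our_keywords = {"crm software", "sales pipeline", "lead scoring"}
competitor_keywords = {"crm software", "sales pipeline",
                       "pipeline automation", "deal forecasting"}

# Queries a competitor wins that we don't cover yet
gaps = sorted(competitor_keywords - our_keywords)
print(gaps)
```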
Ad Copy Intelligence summarizes winning patterns (emotional triggers, CTA structures) seen in competitor ads, mapped to estimated performance.
Audience Targeting shows segments where rivals succeed and you under-index, with expansion ideas.
Landing Page Benchmarks distill UX and conversion tactics from competitor pages linked from their ads, to inform your own page updates.
D) User Segments
Audience Quality Clustering groups users by behavior to reveal high-value archetypes that deserve personalization or tests.
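A tiny k-means over two behavioral features gives the flavor of this clustering; a real pipeline would use many more features and a library implementation, and the user points below are made up:

```python
import random

def kmeans(points: list[tuple[float, float]], k: int = 2,
           iters: int = 20, seed: int = 0) -> list[int]:
    """Minimal k-means over (sessions/week, engaged minutes) pairs."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    labels = [0] * len(points)
    for _ in range(iters):
        # Assign each user to the nearest center.
        labels = [min(range(k),
                      key=lambda c: (p[0] - centers[c][0]) ** 2
                                    + (p[1] - centers[c][1]) ** 2)
                  for p in points]
        # Recompute centers as member means.
        for c in range(k):
            members = [p for p, lab in zip(points, labels) if lab == c]
            if members:
                centers[c] = (sum(m[0] for m in members) / len(members),
                              sum(m[1] for m in members) / len(members))
    return labels

users = [(1, 2), (2, 3), (1.5, 2.5),     # casual visitors
         (8, 20), (9, 22), (7.5, 19)]    # high-value archetype
labels = kmeans(users)
print(labels)
```

The two well-separated groups land in different clusters, surfacing the high-value archetype worth personalizing for.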
Device & Location Analytics compares performance across device types and geographies to guide prioritization (e.g., Mobile lagging for Paid Search).
Content Gaps highlights topics and formats that specific segments engage with but your site under-serves, a direct feed into content and CRO backlogs.
Churn Risk flags cohorts with falling engagement so lifecycle plays can be triggered before attrition shows up in revenue.
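A hedged sketch of a falling-engagement flag: compare a cohort's latest two weeks of engaged-session share against the prior two and flag a sharp relative drop. The 25% threshold and two-week windows are assumptions for illustration:

```python
def churn_risk(weekly_engaged: list[float],
               drop_threshold: float = 0.25) -> bool:
    """Flag a cohort whose recent 2-week engagement fell sharply
    versus the prior 2 weeks (threshold is an assumption)."""
    if len(weekly_engaged) < 4:
        return False  # need two comparable windows
    prior = sum(weekly_engaged[-4:-2]) / 2
    recent = sum(weekly_engaged[-2:]) / 2
    return prior > 0 and (prior - recent) / prior >= drop_threshold

# Engaged-session share by week for one cohort
print(churn_risk([0.62, 0.60, 0.41, 0.38]))
```

Flagging on relative rather than absolute decline keeps the rule meaningful across cohorts with very different baseline engagement.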
Outputs teams rely on
Key Performance Cards with sessions, users, and overall quality plus trend deltas for the period.
Traffic Source Matrix summarizing sessions, users, trend vs. last period, and quality score for each channel.
Engagement Quality Table consolidating session duration, pages/session, bounce, engaged %, and scroll depth by source.
AI Actionable Insights that call out anomalies, opportunities, budget shifts, patterns, and benchmark context, followed by a Performance Summary & Next Steps with concrete actions (e.g., test paid search landing pages, fix mobile speed, adjust budget, schedule content in top windows).