Augmenting A/B Testing Insights with AI at AT&T

Summary

While embedded on AT&T’s A/B testing team, I identified an opportunity to improve how our team interpreted and acted on raw experimentation data. I led the project to build a custom interface powered by a local large language model (LLM) fed with Adobe Analytics exports, allowing designers and strategists to query results, explore data-driven insights, and even generate new testing hypotheses.

The Challenge

AT&T’s marketing and design teams ran frequent A/B tests across the website — but the insights often arrived as raw exports from Adobe Analytics with minimal context or accessible framing.

Pain points included:

  • Slow interpretation cycles from analysts to design teams

  • Difficulty surfacing repeatable patterns or overlooked anomalies

  • Lack of creative ideation inspired by live test data

My Role

  • Interfaced with the analytics team to understand data structure and exports

  • Set up a lightweight local large language model (LLM) pipeline fed with historical testing data

  • Designed a simple queryable UI to index test data and explore the model’s outputs

  • Added functionality for the LLM to:

    • Summarize test results

    • Highlight anomalies or trends (see the sketch after this list)

    • Suggest new test ideas based on behavioral patterns
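
As one illustration of the anomaly-highlighting capability, here is a minimal sketch of the kind of heuristic the tool could pair with an LLM summary: a simple z-score flag on click-through rate. The column name and threshold are illustrative assumptions, not the production logic.

    import pandas as pd

    def flag_ctr_anomalies(tests: pd.DataFrame, threshold: float = 2.0) -> pd.DataFrame:
        """Flag tests whose CTR sits more than `threshold` standard deviations
        from the mean across all logged tests (assumes a numeric `ctr` column)."""
        z = (tests["ctr"] - tests["ctr"].mean()) / tests["ctr"].std()
        return tests.assign(ctr_z_score=z)[z.abs() > threshold]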

Process Overview

1. Data Exploration

  • Reviewed Adobe Analytics exports and metadata structures

  • Identified key variables: click-through rate (CTR), bounce rate, CTA placement, time on page, and device type (a loading sketch follows below)
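
To make those exports comparable across tests, each CSV needed consistent field names and numeric types. A minimal loading sketch, assuming hypothetical header names (real Adobe Analytics exports vary by report suite):

    import pandas as pd

    # Hypothetical header names; real exports vary by report suite.
    RENAMES = {
        "Click-Through Rate": "ctr",
        "Bounce Rate": "bounce_rate",
        "CTA Placement": "cta_placement",
        "Avg. Time on Page": "time_on_page",
        "Device Type": "device_type",
    }

    def load_export(path: str) -> pd.DataFrame:
        """Load one Adobe Analytics CSV export with normalized names and rates."""
        df = pd.read_csv(path).rename(columns=RENAMES)
        # Percent strings like "3.2%" become floats so tests can be compared.
        for col in ("ctr", "bounce_rate"):
            if df[col].dtype == object:
                df[col] = df[col].str.rstrip("%").astype(float) / 100
        return df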

2. LLM + UI Build

  • Created a sandboxed environment to feed test data into a local LLM

  • Prompt-tuned model outputs (a prompting sketch follows this list) for:

    • Executive summaries

    • Test prioritization suggestions

    • Pattern surfacing ("Tests with similar device drop-off")
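
A sketch of that prompting layer, assuming the local model sits behind an HTTP endpoint such as Ollama's (the actual serving stack, model name, and prompt wording here are all illustrative, not the production setup):

    import requests

    LLM_URL = "http://localhost:11434/api/generate"  # assumed: local model via Ollama

    SUMMARY_PROMPT = """You are an A/B testing analyst. In three sentences, summarize
    this test for a design audience: what was tested, which variant won and by how
    much, and the single most actionable behavioral pattern.

    Test data:
    {test_data}
    """

    def summarize_test(test_data: str, model: str = "llama2") -> str:
        """Generate an executive summary of one test from the local model."""
        resp = requests.post(
            LLM_URL,
            json={"model": model,
                  "prompt": SUMMARY_PROMPT.format(test_data=test_data),
                  "stream": False},
            timeout=120,
        )
        resp.raise_for_status()
        return resp.json()["response"]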

3. UI Design

  • Designed a simple front-end interface to:

    • Search test logs by goal, date, variant, and device (sketched below)

    • Generate insights on-demand

    • Log and rank potential future test ideas
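
A minimal sketch of the search endpoint behind that interface, assuming the test logs live in a flat CSV with goal, date, variant, and device_type columns (the real storage and front end are not detailed here):

    from flask import Flask, jsonify, request
    import pandas as pd

    app = Flask(__name__)
    # Assumed flat log of past tests; fields kept as strings for simple matching.
    test_log = pd.read_csv("test_log.csv", dtype=str)

    @app.get("/search")
    def search_tests():
        """Filter logged tests by any combination of goal, date, variant, device."""
        results = test_log
        for field in ("goal", "date", "variant", "device_type"):
            value = request.args.get(field)
            if value:
                results = results[results[field] == value]
        return jsonify(results.to_dict(orient="records"))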

4. Internal Sharing

  • Shared the tool with design and experimentation teams

  • Received positive feedback on speed, clarity, and ideation support

Impact

  • Reduced time-to-insight for designers and strategists

  • Sparked more data-inspired design experimentation

  • Encouraged broader team adoption of AI tooling in analysis workflows

  • Helped teams identify overlooked patterns that were previously buried in raw CSVs

Reflection

This project showed me how design thinking, data, and AI can meet to do more than automate — they can augment. By turning raw analytics into a conversation, we unlocked creativity that was buried in spreadsheets.
