Aligned with research frameworks from
Stanford IO · Columbia Journalism Review · Harvard Shorenstein Center · DFRLab
Trusted by communications, security, and research teams
The data your methods section will reference
Authenticity metadata, platform coverage flags, and coordinated-activity signals — structured for reproducibility and peer review from day one.
Capabilities
Citation-grade authenticity scoring, reproducible exports, and methodology documentation written for methods sections — so your findings hold up.
Cross-platform signal data
Normalized data from eight platforms — X, Reddit, YouTube, Facebook, Instagram, Threads, Bluesky, and LinkedIn — in a single schema. No platform-by-platform collection scripts, no ad hoc normalization work — Rolli handles the pipeline so you can focus on analysis.
Reproducible exports
Structured JSON exports with documented schema versions, platform_coverage flags, and field-level provenance metadata. Every export can be fully described in a methods section.
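As a rough sketch of what such an export might look like when loaded for analysis — the field names (`schema_version`, `platform_coverage`, provenance metadata) follow the descriptions above, but the exact schema is hypothetical and would be defined by Rolli's methodology guide:

```python
import json

# Hypothetical export record, illustrating the documented-schema idea.
# Actual field names and structure are defined by the product's schema docs.
export = json.loads("""
{
  "schema_version": "1.0",
  "collected_at": "2024-01-15T09:30:00Z",
  "platform_coverage": {"x": true, "reddit": true, "youtube": false},
  "items": [
    {
      "platform": "reddit",
      "text": "example item",
      "provenance": {"source": "api", "retrieved_at": "2024-01-15T09:29:41Z"}
    }
  ]
}
""")

# The facts a methods section would cite: schema version, collection
# timestamp, and which platforms were actually covered during collection.
covered = [p for p, ok in export["platform_coverage"].items() if ok]
print(export["schema_version"], export["collected_at"], covered)
```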
Longitudinal narrative tracking
Track how a narrative evolves across time and platforms. Configure retention windows appropriate for your study timeline and receive consistent, timestamped data snapshots.
Authenticity metadata for methodology
Every item includes authenticity_score, coordinated_flag, and velocity_score — fields that enable methodological controls for inorganic amplification in your study design.
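A minimal sketch of using these fields as a study control, assuming items carry the `authenticity_score`, `coordinated_flag`, and `velocity_score` fields named above; the records and the 0.5 threshold here are illustrative, not part of Rolli's actual methodology:

```python
# Illustrative items; real data would come from a Rolli export.
items = [
    {"id": 1, "authenticity_score": 0.91, "coordinated_flag": False, "velocity_score": 0.2},
    {"id": 2, "authenticity_score": 0.34, "coordinated_flag": True,  "velocity_score": 0.8},
    {"id": 3, "authenticity_score": 0.72, "coordinated_flag": False, "velocity_score": 0.6},
]

# Exclude likely-inorganic amplification from the primary sample, but keep
# it as a separate stratum so the exclusion itself can be reported.
organic = [i for i in items
           if not i["coordinated_flag"] and i["authenticity_score"] >= 0.5]
excluded = [i for i in items if i not in organic]
print([i["id"] for i in organic], [i["id"] for i in excluded])
```

Keeping the excluded stratum, rather than silently dropping it, is what makes the control reportable in a methods section.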
Multi-platform ingestion
One schema. Eight platforms. Zero normalization work.
Research Outcomes
What researchers actually get
Peer-reviewable output
Every Rolli export is designed to survive academic scrutiny — structured for citation, methods documentation, and reproducibility.
- Citation-ready data with schema version and collection timestamp
- Structured JSON/CSV exports with field-level provenance metadata
- Methodology documentation written for methods sections
- Reproducible study configurations for multi-researcher replication
Who Uses Rolli
From university labs to policy think tanks — one truth layer built for academic rigor.
Academic researchers
Social scientists, communications scholars, and computational researchers studying narrative diffusion, political communication, and online discourse at scale.
Request access →
Policy analysts
Government-adjacent analysts and non-profit policy researchers tracking legislative narrative environments, public opinion signals, and influence operations affecting policy outcomes.
Request access →
Think tanks
Research organizations producing reports on media ecosystems, platform governance, and information integrity that require cross-platform, citable data sources.
Request a demo →
AI safety researchers
Teams studying LLM-generated narrative amplification, AI-assisted influence operations, and the detection of machine-generated content at social scale.
Talk to an engineer →
Get your first narrative intelligence brief in 24 hours
Talk to a researcher who built this. No sales cycle required.
Methodology Readiness
What your methods reviewer will ask — and what Rolli gives you to answer
Methodology documentation built to cite
Field-level provenance, versioned schema, and methodology docs purpose-built for methods sections — used by ICFJ fellows and university researchers.
See how researchers use Rolli →
How academic researchers use Rolli
A sample query output — showing the structured, reproducible format built for methods sections and peer review.
All fields documented in methodology guide · Versioned schema · Field-level provenance metadata included
Methodology, data quality, and access
Authoritative research on coordinated inauthentic behavior
Rolli's methodology is grounded in academic and platform research on coordinated inauthentic behavior detection.