Viewing Document Usage
Key Metrics per Document
Understanding these metrics helps you identify which documents are performing well and which need attention.

Retrieval Count
How many times this document was retrieved across all traces.

| Count Range | What It Means | What To Do |
|---|---|---|
| High (100+) | Core document that’s frequently needed | Keep this up to date—it’s critical to your AI’s performance |
| Medium (10-100) | Regularly used and important | Monitor for accuracy and keep content fresh |
| Low (1-10) | Occasionally useful for niche topics | Verify the content is still needed and relevant |
| Zero | Never retrieved, potentially irrelevant | Consider removing or improving embeddings |
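The bucketing above can be sketched as a small helper. The `(doc_id, query)` event shape is a hypothetical input pulled from your traces, not a fixed API:

```python
from collections import Counter

def bucket_retrieval_counts(retrieval_events):
    """Tally retrievals per document and map each count onto the ranges above.

    Note: documents with zero retrievals never appear in the event stream,
    so finding them requires comparing against your full document list.
    """
    counts = Counter(doc_id for doc_id, _query in retrieval_events)

    def bucket(n):
        if n >= 100:
            return "high"
        if n >= 10:
            return "medium"
        return "low"

    return {doc_id: (n, bucket(n)) for doc_id, n in counts.items()}

# Hypothetical event stream: (doc_id, query) pairs pulled from traces.
events = [("pricing-policy", "what does the pro plan cost?")] * 120
events += [("holiday-policy", "when are we closed?")] * 3
print(bucket_retrieval_counts(events))
# {'pricing-policy': (120, 'high'), 'holiday-policy': (3, 'low')}
```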
Unique Queries
How many different queries retrieved this document. A document with 100 retrievals from only 2 unique queries is less valuable than one with 100 retrievals from 50 unique queries.
| Uniqueness Level | What It Means | Use Case |
|---|---|---|
| High Uniqueness | Broadly applicable content | Answers many different questions and serves diverse use cases |
| Low Uniqueness | Narrow use case | Same questions repeatedly, limited applicability |
For example:

| Document | Retrievals | Unique Queries | Assessment |
|---|---|---|---|
| Document A | 200 | 80 | Versatile content serving many use cases |
| Document B | 200 | 5 | Narrow focus, repetitive queries |
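Computing this breadth is straightforward, again assuming a hypothetical stream of `(doc_id, query)` pairs from your traces:

```python
from collections import defaultdict

def query_breadth(retrieval_events):
    """Return (total retrievals, unique queries) per document."""
    seen = defaultdict(list)
    for doc_id, query in retrieval_events:
        seen[doc_id].append(query)
    return {doc: (len(qs), len(set(qs))) for doc, qs in seen.items()}

# Mirrors the Document A / Document B comparison above.
events = [("doc-a", f"question {i % 80}") for i in range(200)]
events += [("doc-b", f"question {i % 5}") for i in range(200)]
print(query_breadth(events))
# {'doc-a': (200, 80), 'doc-b': (200, 5)}
```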
Average Relevance Score
Mean relevance score when this document was retrieved.

| Score Range | Meaning | What It Tells You |
|---|---|---|
| > 0.8 | Strong match | Highly relevant, excellent semantic alignment |
| 0.6 - 0.8 | Good match | Relevant and useful for the queries |
| 0.4 - 0.6 | Weak match | Marginally relevant, may not be helpful |
| < 0.4 | Very weak match | Likely not helpful, poor semantic match |
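A minimal sketch of averaging a document's scores and applying the ranges above (the label strings are my shorthand for the table rows):

```python
def relevance_label(scores):
    """Average a document's relevance scores and apply the ranges above."""
    avg = sum(scores) / len(scores)
    if avg > 0.8:
        label = "strong match"
    elif avg >= 0.6:
        label = "good match"
    elif avg >= 0.4:
        label = "weak match"
    else:
        label = "very weak match"
    return round(avg, 3), label

print(relevance_label([0.91, 0.85, 0.88]))  # (0.88, 'strong match')
```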
Last Retrieved
When this document was most recently used.

| Time Range | Status | What It Indicates |
|---|---|---|
| Recent (< 7 days) | Actively used | Currently in use for ongoing operations |
| Moderate (7-30 days) | Regularly used | Still relevant and useful |
| Old (30-90 days) | Infrequently used | Worth reviewing for continued relevance |
| Very Old (> 90 days) | Rarely/never used | Possibly outdated or no longer relevant |
User Feedback Correlation
How users rated traces that used this document. This metric shows you the quality of your documents from your users’ perspective:

Thumbs Up Count
Traces using this document that received positive feedback
Thumbs Down Count
Traces using this document that received negative feedback
Satisfaction Rate
Percentage of positive feedback overall
| Feedback Pattern | What It Might Mean | Recommended Action |
|---|---|---|
| High Thumbs Down | Document contains wrong/outdated information, is unclear or confusing, or doesn’t answer the questions users are asking | Review and update immediately |
| High Thumbs Up | Document is valuable and helpful to users | Keep it well-maintained, consider creating similar high-quality content, and use it as a template for other documents |
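A minimal sketch of deriving the satisfaction rate from the two counts (the `None` handling for unrated documents is my assumption, not documented behavior):

```python
def satisfaction_rate(thumbs_up, thumbs_down):
    """Share of rated traces with positive feedback; None when nothing is rated."""
    total = thumbs_up + thumbs_down
    if total == 0:
        return None
    return thumbs_up / total

print(satisfaction_rate(42, 8))  # 0.84
print(satisfaction_rate(0, 0))   # None
```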
Document Details Page
Click on any document to access comprehensive information and insights.

Full Content
The complete text of this document chunk, exactly as stored in your vector database. This lets you see precisely what your AI has access to when this document is retrieved.

Metadata
All fields synced from your vector database provide context about the document:

- Source — Original file or URL where this content came from
- Last updated — When the source was last modified
- Tags — Categories or labels for organization
- Custom fields — Any other metadata you’ve configured
Usage Timeline
A graph showing document retrievals over time reveals important patterns:

| Pattern Type | What It Looks Like | What To Investigate |
|---|---|---|
| Trending Documents | Sudden increase in usage | Something made this document more relevant—find out what changed |
| Declining Documents | Used to be popular, now ignored | Investigate why usage dropped—may be outdated or replaced |
| Seasonal Patterns | Spikes at certain times | Plan updates around predictable usage patterns |
Recent Traces
List of the most recent traces that retrieved this document:

- Trace ID — Link to the full trace for detailed analysis
- Query — What the user asked
- Timestamp — When this retrieval happened
- User feedback — Thumbs up/down if available
Related Documents
Documents that are frequently retrieved alongside this one reveal important relationships:

| Insight Type | What It Reveals | Action To Take |
|---|---|---|
| Document Clusters | Related topics that are often retrieved together | Ensure these documents are well-maintained as a group |
| Coverage Gaps | Missing related documents that users need | Add new content to fill the gaps |
| Redundancy Check | Multiple docs with similar content | Consolidate duplicates for clarity |
Sorting and Filtering Documents
Find exactly the documents you need to review with powerful sorting and filtering options.

Sort Options
| Sort By | What It Shows You | When To Use |
|---|---|---|
| Most Retrieved | Your core documents that power your AI | Finding high-impact content to maintain |
| Least Retrieved | Unused content that may need attention | Identifying candidates for removal |
| Highest Relevance | Best semantic matches across retrievals | Finding well-matched, quality documents |
| Lowest Relevance | Poorly matched documents needing improvement | Identifying documents to fix or remove |
| Most Recent | Recently added or updated documents | Tracking new content performance |
| Oldest | Long-standing content that may need refresh | Finding potentially outdated documents |
Filter Options
| Filter Type | Example Use | What It Helps You Find |
|---|---|---|
| Retrieval Count Range | Documents with 10-100 retrievals | Medium-usage documents for review |
| Relevance Threshold | Only docs with >0.7 average relevance | High-quality, well-matched documents |
| Date Range | Documents retrieved in last 30 days | Recently active content |
| User Feedback | Filter by positive or negative feedback | Documents linked to user satisfaction |
| Source Filter | Filter by original file or URL | Content from specific sources |
| Custom Metadata | Filter using your custom metadata fields | Documents with specific tags or attributes |
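The same filters can be reproduced offline over exported metrics; a sketch over a list of dicts whose field names are assumptions, not a guaranteed schema:

```python
docs = [
    {"doc_id": "a", "retrievals": 50,  "avg_relevance": 0.82},
    {"doc_id": "b", "retrievals": 500, "avg_relevance": 0.55},
    {"doc_id": "c", "retrievals": 4,   "avg_relevance": 0.91},
]

# Medium-usage documents (10-100 retrievals) with >0.7 average relevance.
selected = [d for d in docs
            if 10 <= d["retrievals"] <= 100 and d["avg_relevance"] > 0.7]
print([d["doc_id"] for d in selected])  # ['a']
```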
Use Cases
Here’s how to apply document usage analytics to solve real problems.

Find High-Impact Documents to Update
Goal: Focus your limited time on updating what matters most.

Result: Maximum impact from your update efforts—you’re improving the documents that matter most.
Remove Unused Documents
Goal: Clean up your vector database for better performance.

Result: Leaner database, faster retrievals, and better-quality results.
Investigate Low-Quality Documents
Goal: Fix documents that are retrieved but not helpful.

Filter for problematic docs
Find documents with high retrieval count + low average relevance (< 0.6)
Diagnose the issue
Determine if the problem is:
- Content is wrong/outdated → Update the content
- Content is fine but embeddings are bad → Re-embed the document
- Retrieval parameters are too loose → Adjust similarity threshold
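The filter step above can be sketched with a simple impact score; the 0.6 floor follows the relevance table earlier, while the retrievals-times-shortfall weighting is just one illustrative way to order the backlog:

```python
def triage_low_quality(docs, relevance_floor=0.6):
    """Flag documents below the relevance floor, ordering busy ones first."""
    flagged = [d for d in docs if d["avg_relevance"] < relevance_floor]
    return sorted(
        flagged,
        key=lambda d: d["retrievals"] * (relevance_floor - d["avg_relevance"]),
        reverse=True,
    )

docs = [
    {"doc_id": "faq",       "retrievals": 300, "avg_relevance": 0.45},
    {"doc_id": "changelog", "retrievals": 12,  "avg_relevance": 0.30},
    {"doc_id": "pricing",   "retrievals": 90,  "avg_relevance": 0.85},
]
print([d["doc_id"] for d in triage_low_quality(docs)])  # ['faq', 'changelog']
```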
Result: Better retrieval quality and more relevant documents in your results.
Track Content Gaps
Goal: Find topics where you need more documentation.

Result: Comprehensive knowledge base with fewer “I don’t know” responses.
Measure Update Impact
Goal: Verify that updating a document improved performance.

Result: Data-driven confirmation that your updates actually helped.
Debug Wrong Answers
Goal: Fix AI responses that provide incorrect information.

Diagnose the issue
Determine if:
- Documents contain incorrect information
- Correct documents weren’t retrieved
- Wrong documents were prioritized
Fix the root cause
- If documents are wrong, update the source files
- If documents are missing, add new content
- If retrieval is broken, adjust parameters
Result: Trace errors back to source data and fix them systematically.
Compliance and Audit Trail
Goal: Demonstrate data provenance for compliance requirements.

Result: Full provenance chain for regulatory compliance and auditing.
Exporting Document Usage Data
Export document metrics for external analysis and reporting.

CSV Export Contents
The exported CSV includes all key metrics:

- Document ID
- Content (truncated or full based on your selection)
- Retrieval count
- Unique queries
- Average relevance score
- Last retrieved date
- User feedback statistics
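The export can be analyzed with nothing but the standard library; the column names below are assumptions that mirror the list above, not a guaranteed schema:

```python
import csv
import io

# Stand-in for the downloaded file, with hypothetical column names and data.
export = io.StringIO(
    "document_id,retrieval_count,unique_queries,avg_relevance,last_retrieved\n"
    "doc-a,200,80,0.82,2024-05-01\n"
    "doc-b,0,0,,\n"
)

rows = list(csv.DictReader(export))
never_used = [r["document_id"] for r in rows if int(r["retrieval_count"]) == 0]
print(never_used)  # ['doc-b']
```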
Common Use Cases for Exports
| Use Case | What You Can Do | Who Benefits |
|---|---|---|
| Custom Dashboards | Create visualizations in Excel, Tableau, or other BI tools | Data analysts and leadership teams |
| Team Collaboration | Share metrics with content teams and stakeholders | Content managers, product teams, and editors |
| Archival & Reporting | Maintain historical records and generate periodic reports | Compliance teams and auditors |
| Trend Analysis | Track changes in document performance over time | Data scientists and ML engineers |
| Content Planning | Identify gaps and prioritize content creation efforts | Documentation teams and content strategists |
Setting Up Alerts
Get notified about important changes to stay proactive.

Critical Document Alerts
Monitor your most important documents for issues:

Example Alert: “Pricing Policy Doc” retrieval count dropped 80% in the last week → investigate potential retrieval issue or content problem.
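The example alert boils down to a week-over-week drop check; a sketch, with the 80% threshold taken from the example:

```python
def retrieval_drop_alert(last_week, this_week, threshold=0.8):
    """True when weekly retrievals fell by at least `threshold` (default 80%)."""
    if last_week == 0:
        return False  # nothing to drop from
    return (last_week - this_week) / last_week >= threshold

print(retrieval_drop_alert(150, 20))   # True  (~87% drop)
print(retrieval_drop_alert(150, 140))  # False (~7% drop)
```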
New Content Alerts
Track newly added documents to ensure they’re working properly.

Best Practices
Follow these practices to get the most value from document usage analytics.

Review Top 20 Documents Monthly
| Review Task | What To Check | Why It Matters |
|---|---|---|
| Check Accuracy | Verify your most-used documents contain correct information | Errors in frequently-used docs impact many users |
| Verify Freshness | Confirm documents are up to date with current information | Outdated content leads to wrong AI responses |
| Review Feedback | Read user feedback on traces that used these documents | User feedback reveals quality issues |
| Update When Needed | Make improvements based on your findings | Continuous improvement maintains quality |
Investigate Zero-Retrieval Documents
Every month, check documents that are never retrieved:

| Question To Ask | What To Look For | Possible Action |
|---|---|---|
| Are they truly irrelevant? | Does this content apply to your AI’s use cases? | Remove if genuinely not needed |
| Are embeddings broken? | Is the content findable with semantic search? | Regenerate embeddings if broken |
| Should they be removed? | Is this taking up space without value? | Delete to optimize database |
Correlate with User Feedback
When users give thumbs down to AI responses, check which documents those traces retrieved and review them for accuracy.

Track Seasonal Patterns
Some documents may have seasonal usage patterns that affect your update planning:
- Tax documents — Retrieved heavily in April
- Holiday policy docs — Spike in November-December
- Budget documents — Active at fiscal year end
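Seasonality is easy to spot by bucketing retrieval timestamps per calendar month; a sketch assuming you have them as `datetime.date` objects:

```python
from collections import Counter
from datetime import date

def monthly_retrievals(timestamps):
    """Count retrievals per calendar month to surface seasonal spikes."""
    return Counter(d.strftime("%Y-%m") for d in timestamps)

# Hypothetical: a tax document retrieved heavily in April.
hits = [date(2024, 4, day) for day in range(1, 16)] + [date(2024, 5, 2)]
counts = monthly_retrievals(hits)
print(counts["2024-04"], counts["2024-05"])  # 15 1
```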
Use Related Documents Feature
When creating new content:

| Task | What To Do | Benefit |
|---|---|---|
| Check Related Docs | Review related documents for similar topics | Understand existing content landscape |
| Avoid Duplication | Ensure new content doesn’t duplicate existing docs | Prevent confusion and redundancy |
| Fill Gaps | Create content that bridges gaps between related documents | Provide comprehensive coverage |
