
Know what your users are actually asking

Once your assistant is live, real users start asking real questions. Analytics gives you a window into those conversations — so you can understand what’s working, what’s not, and where to improve. Droog has two levels of analytics:
  • Workspace Analytics — a bird’s-eye view across all your assistants combined
  • Bot-Level Analytics — a detailed breakdown for each individual assistant

Workspace Analytics

The Workspace Analytics page gives you a combined view of activity across all your assistants in one place. This is the right place to start when you want to understand the overall health of your Droog deployment.
Workspace Analytics Dashboard

How to open it

  1. Go to your main Dashboard
  2. Click Analytics in the left navigation

You can filter by date range using the From and To date pickers, and toggle between Day and Month views to zoom in or out on trends. Use Active Bots Only to filter out inactive assistants from the view.

Summary metrics

The five cards at the top give you an instant health check across your entire workspace:
| Metric | What it means |
| --- | --- |
| Total Sessions | The number of individual conversations started by users across all your assistants |
| Total Messages | The total number of messages exchanged — both user questions and assistant replies |
| Avg Latency | The average time your assistants take to reply to a message |
| P95 Latency | The response time for the slowest 5% of messages — useful for spotting occasional delays |
| Escalation Rate | The percentage of conversations where the assistant couldn’t fully resolve the user’s query on its own |

A high Escalation Rate means users are frequently hitting dead ends. This is a strong signal to review your knowledge base and add missing content.
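As a rough sketch of how numbers like these are typically derived (the field names, sample data, and the nearest-rank percentile method below are illustrative assumptions, not Droog’s internal implementation):

```python
import math

def percentile(values, pct):
    """Nearest-rank percentile: the value at or below which pct% of samples fall."""
    ordered = sorted(values)
    rank = max(0, math.ceil(pct / 100 * len(ordered)) - 1)
    return ordered[rank]

# Hypothetical per-message reply latencies, in milliseconds
latencies_ms = list(range(100, 1100, 10))  # 100, 110, ..., 1090

avg_latency = sum(latencies_ms) / len(latencies_ms)   # mean of all replies
p95_latency = percentile(latencies_ms, 95)            # slowest 5% cutoff

# Hypothetical sessions: True means the assistant had to escalate
escalated = [False] * 8 + [True] * 2
escalation_rate = 100 * sum(escalated) / len(escalated)

print(avg_latency, p95_latency, escalation_rate)  # 595.0 1040 20.0
```

Note how the P95 figure (1040 ms) sits well above the average (595 ms): that gap is exactly why the dashboard shows both, since a healthy average can hide a slow tail.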

Below the summary cards is a chart that plots messages, sessions, and average latency over your selected date range — all in one view.
Usage and Performance Trends Chart
Use this chart to spot:
  • Spikes in sessions or messages — often linked to campaigns, product launches, or events bringing traffic to your site
  • Latency increases — if the latency line climbs sharply on a particular day, check if large knowledge base changes were made around that time
  • Quiet periods — consistently low engagement may mean users aren’t finding the chat widget easily

Bot Usage Breakdown

At the bottom of the Workspace Analytics page is a table breaking down activity per assistant — so you can see which bots are doing the most work.
Bot Usage Breakdown Table
| Column | What it tells you |
| --- | --- |
| Bot Name | The name of the assistant |
| Status | Whether the assistant is currently active or inactive |
| Sessions | Number of conversations started on this assistant |
| Messages | Total messages exchanged on this assistant |
| Storage (MB) | How much knowledge base storage this assistant is using |

Bot-Level Analytics

For a deeper look at any individual assistant, open that assistant from your dashboard. The overview panel shows detailed metrics for that bot alone — broken into four sections.
Bot Analytics Detail Panels

Performance & Cost

This panel covers the speed and operational cost of your assistant: how fast it responds and what it costs to run.
| Metric | What it means |
| --- | --- |
| Avg Latency | The average time taken to respond to a user message |
| P95 Latency | Response time for the slowest 5% of messages — helps identify occasional delays |
| Cost / Session | Estimated cost per conversation |
| Cost / Message | Estimated cost per individual message |

Cost metrics help you understand usage at scale: as your assistant handles more conversations, these numbers make it easier to forecast spend and choose the right plan.

User Engagement

How actively users interact with the bot. This panel shows how users are actually behaving inside conversations.
| Metric | What it means |
| --- | --- |
| Total Users (Est.) | Estimated number of unique users who have interacted with the assistant |
| Active Users | Users who sent at least one message in the selected period |
| Avg Msg / Session | Average number of messages per conversation — higher means users are engaging more deeply |
| Dropoff Rate | Percentage of users who opened the chat but didn’t send a message |

A high Dropoff Rate may mean users are unsure what to ask when they open the chat. Consider adding a welcome message or example questions to give users a clear starting point.
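For intuition, the engagement numbers reduce to simple ratios over session records. A minimal sketch, assuming hypothetical `messages` (all messages exchanged) and `user_messages` (messages the visitor sent) fields that are not part of Droog’s actual data model:

```python
# Each record is one chat session (field names are illustrative only)
sessions = [
    {"messages": 9, "user_messages": 4},
    {"messages": 1, "user_messages": 0},  # saw the welcome message, never replied
    {"messages": 5, "user_messages": 2},
    {"messages": 0, "user_messages": 0},  # opened the widget, typed nothing
]

# Avg Msg / Session: total messages exchanged divided by session count
avg_msg_per_session = sum(s["messages"] for s in sessions) / len(sessions)

# Dropoff Rate: share of sessions where the user never sent a message
dropoff_rate = 100 * sum(
    1 for s in sessions if s["user_messages"] == 0
) / len(sessions)

print(avg_msg_per_session, dropoff_rate)  # 3.75 50.0
```

In this toy data, half the visitors opened the chat and left without typing, which is the dead-end behavior a welcome message or example questions is meant to fix.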

User Feedback

How users rate the quality of your assistant’s responses. Users can rate individual responses with a thumbs up or thumbs down directly inside the chat. This panel aggregates that feedback.
| Metric | What it means |
| --- | --- |
| Thumbs Up | Number of responses users found helpful |
| Thumbs Down | Number of responses users found unhelpful |
| Avg Rating | Overall satisfaction score out of 5 |

Your Thumbs Down responses are your most valuable improvement signal. Browse those specific conversations to find exactly which answers frustrated users — then update the relevant documents in your knowledge base.

Automation Health

The bot’s ability to handle conversations on its own — without needing to escalate.
| Metric | What it means |
| --- | --- |
| Total Escalations | Number of conversations where the assistant couldn’t resolve the query independently |
| Escalation Rate | Percentage of all conversations that ended in an escalation |

A low escalation rate means your assistant is self-sufficient. A high rate is a clear signal that your knowledge base has gaps that need filling.

Using analytics to improve your assistant

Analytics isn’t just a report — it’s a feedback loop. Here’s how to act on what you see:

High escalation rate?

Review conversations that escalated. Identify topics the assistant couldn’t answer and upload content to cover those gaps.

Low feedback scores?

Filter for Thumbs Down responses. Find the weak answers and improve the relevant documents in your knowledge base.

Latency spikes?

Check whether latency increases coincide with large knowledge base uploads or high-traffic days, and contact support if spikes persist.

High dropoff rate?

Add a friendly opening message or example questions to your assistant to give users a clear starting point.

A simple weekly review routine

A 10-minute weekly check is enough to keep your assistant improving consistently.
  1. Open Workspace Analytics → check total sessions and escalation rate at a glance
  2. Review the trends chart → spot any unusual spikes or drops in the past week
  3. Open Bot-Level Analytics for your most active assistant → check feedback scores and dropoff rate
  4. Act on one thing → upload a missing document, adjust a constraint, or improve the welcome message
Small, consistent improvements compound quickly. An assistant reviewed weekly will be significantly sharper after a month than one left untouched.

What’s next?

Update your knowledge base

Add content to cover the gaps your analytics revealed.

Test your changes

After uploading new content, test to confirm the improvements are working.
Have a question? Reach us at hi@droog.io