Smartlook Surveys Integration

Push Responsly survey responses into Smartlook as session events. Filter recordings by NPS score, CSAT rating, or specific answers — then watch the experience that produced the feedback.

See exactly what users did before and after giving feedback

Responsly connects survey responses to Smartlook session recordings by pushing each submission as a custom event on the visitor’s session timeline. Your team filters recordings by feedback score or answer, watches the experience that led to the response, and acts on behavioral evidence — not assumptions.

For product, UX, and CX teams, this integration answers the question that survey dashboards can’t: “What actually happened?” A low score stops being a mystery when you can watch the session that produced it.
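Under the hood, a submission handler along these lines could push each response via Smartlook's client-side `track` API. This is a minimal sketch: the event name `responsly_survey_submitted` and the property names are illustrative assumptions, not the official schema.

```javascript
// Hypothetical handler run when a Responsly survey is submitted on-page.
// `window.smartlook` is the global injected by the Smartlook snippet.
function pushSurveyEvent(response) {
  if (typeof window === 'undefined' || typeof window.smartlook !== 'function') {
    // Smartlook is not active here; the response is still stored in Responsly,
    // it just won't appear on a session timeline.
    return false;
  }
  window.smartlook('track', 'responsly_survey_submitted', {
    survey_name: response.surveyName,
    nps_score: response.npsScore,
    answer_text: response.answerText,
    submitted_at: new Date().toISOString(),
  });
  return true;
}
```

Because the event fires on the visitor's current session, it lands on the same timeline as the clicks and page views that preceded the submission.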

Closing the gap between sentiment and behavior

Survey responses are opinions. Session recordings are evidence. Separately, each tells an incomplete story: surveys say “I’m frustrated” without showing why; recordings show behavior without explaining how the user felt about it.

Together, they form a complete feedback loop:

  • a detractor’s NPS score becomes a specific product issue when you watch their session and see a payment form error,
  • a feature request becomes urgent when the recording shows the user spent four minutes trying a workaround that doesn’t exist,
  • a positive CSAT score confirms which design patterns are working when you watch promoters navigate effortlessly.

Teams that connect feedback to behavior fix the right problems first. For a deeper framework, see the guide on customer experience strategy.

Watching what happened before a detractor NPS score

A fintech product team runs a post-login NPS survey. Detractor submissions appear as custom events in Smartlook. The team filters for scores 0–6 and watches the five most recent detractor sessions.

They discover three patterns:

  • two detractors encountered a timeout error on the transaction history page — the page loaded but the data request failed silently, showing an empty table,
  • one detractor spent 90 seconds searching for the account settings page, clicking three wrong navigation items before finding it,
  • two detractors completed their task successfully but experienced noticeable lag on every page transition (3–4 second load times).

Each pattern maps to a different team: infrastructure fixes the API timeout, design relocates account settings in the navigation, and performance engineers investigate the rendering lag. Without session context, the NPS survey would have produced an action item like “improve product experience” — too vague to act on. Read about structuring NPS programs in our survey alternatives comparison.

Identifying UI friction from low-CSAT session replays

An e-commerce platform triggers a CSAT survey after order completion. Low scores (1–2 stars) are investigated by watching the corresponding sessions.

Session replay reveals:

  • 38% of low-CSAT sessions include at least one rage click — rapid repeated clicks on an element that doesn’t respond,
  • the most common rage-click target is the “Apply Coupon” button, which has a 1.5-second delay before confirmation,
  • 22% of low-CSAT sessions show users scrolling past the shipping options section and returning to it — the default selection isn’t visible without scrolling.

The team fixes the coupon button responsiveness and repositions shipping options above the fold. Follow-up CSAT shows a 0.9-point improvement within one release cycle. Checkout completion rate rises 7%.

Matching exit survey reasons to page interactions

A B2B SaaS company triggers an exit survey when visitors navigate away from the pricing page: “What’s preventing you from signing up today?”

Responses sort into categories — “pricing unclear,” “need to talk to sales,” “comparing options,” “missing feature.” Smartlook recordings add context to each category:

  • “pricing unclear” respondents hover over the pricing table for an average of 45 seconds but never click “See full comparison” — the link is too small and positioned below the fold,
  • “missing feature” respondents navigate to the features page and use Ctrl+F to search for a specific term — revealing exactly which feature they need,
  • “comparing options” respondents switch between the pricing page and a competitor’s tab (visible in the URL bar during screen transitions).

The team enlarges the pricing comparison link, adds the three most-searched features to the pricing page, and creates a competitive comparison section. Pricing page conversion rate improves from 3.1% to 4.6% over two months. Learn more about survey techniques in our guide to WordPress survey makers.

Bug reports enriched with session recordings

A product team adds a feedback widget on their web app: “Did you encounter any issues?” with a yes/no toggle and an optional description field. “Yes” responses fire as high-priority events in Smartlook.

Over 30 days:

  • 127 bug reports are submitted through the widget,
  • 89 have a matching Smartlook session recording,
  • the engineering team resolves 34 bugs in the first sprint — each ticket includes the user’s description, the exact session recording, and a timestamp of the error moment.

Average time to reproduce a reported bug drops from 45 minutes to under 5 minutes. The backlog of unactionable bug reports (“it just didn’t work”) shrinks by 60% because engineers can see exactly what happened. Use skip logic to branch from the yes/no toggle into a detailed description field only when needed.
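The widget flow above can be sketched as a single handler that applies the skip logic and fires the high-priority event in one place. Names here are illustrative assumptions; `track` stands in for the Smartlook tracking call.

```javascript
// Sketch of the yes/no issue widget: a "yes" answer branches into the
// optional description field (skip logic) and fires a high-priority event;
// a "no" answer ends the survey without tracking.
function handleIssueToggle(answeredYes, track) {
  if (!answeredYes) {
    return { next: 'thank_you', tracked: false };
  }
  track('responsly_issue_reported', { priority: 'high' });
  return { next: 'issue_description', tracked: true };
}
```

Keeping the branch and the event in one handler guarantees that every "yes" that reaches engineering has a matching event on the session timeline.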

What data is sent to Smartlook

Each survey submission pushes a custom event containing:

  • survey name and campaign identifier,
  • question text and full answer content for each completed question,
  • response type (NPS score, CSAT rating, text, multiple-choice selection),
  • response ID and submission timestamp,
  • completion status (full or partial submission).

Events are searchable in Smartlook’s filter system, visible on session timelines, and usable in funnels and retention analysis.
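As a sketch, the payload described above might be assembled like this. Field names are assumptions for illustration, not Responsly's exact schema.

```javascript
// Builds a Smartlook-ready event payload from a survey submission.
// Mirrors the fields listed above: survey identity, per-question answers,
// response metadata, and completion status.
function buildSurveyEventPayload(submission) {
  return {
    survey_name: submission.surveyName,
    campaign_id: submission.campaignId,
    answers: submission.answers.map((a) => ({
      question: a.questionText,
      answer: a.answerContent,
      type: a.responseType, // e.g. 'nps', 'csat', 'text', 'multiple_choice'
    })),
    response_id: submission.responseId,
    submitted_at: submission.submittedAt,
    completion: submission.isComplete ? 'full' : 'partial',
  };
}
```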

Start watching the sessions behind your feedback

Connect Responsly to Smartlook, deploy your first on-site survey, and stop guessing why users gave the scores they did. Watch the experience, identify the problem, and fix it with evidence — not speculation.

Smartlook Integration FAQ

How are survey responses attached to session recordings?

When a visitor submits a Responsly survey on a page where Smartlook is active, the response is fired as a custom event on the visitor's session timeline. The event contains the survey name, answers, and score.

Can I filter Smartlook recordings by NPS score range?

Yes. Survey scores are stored as event properties. Filter for detractors (0–6), passives (7–8), or promoters (9–10) to watch only the sessions behind a specific sentiment group.
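The standard NPS banding behind these filters can be expressed as a small helper. This is just a sketch of the scoring convention; Smartlook's filter UI applies the same ranges for you.

```javascript
// Standard NPS banding: detractors 0-6, passives 7-8, promoters 9-10.
function npsSegment(score) {
  if (!Number.isInteger(score) || score < 0 || score > 10) {
    throw new RangeError('NPS score must be an integer from 0 to 10');
  }
  if (score <= 6) return 'detractor';
  if (score <= 8) return 'passive';
  return 'promoter';
}
```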

What happens if a user submits a survey on a page without Smartlook?

The survey response is still captured in Responsly. However, it will not appear as a session event in Smartlook because no active recording session exists to attach it to. Deploy both tools on the same pages for full linking.

Can I share a specific recording with a linked survey response to my team?

Yes. Smartlook lets you share recordings via direct link. When presenting feedback to product or engineering teams, attach the session recording of a detractor — it communicates the problem faster than any written report.

How does this differ from just reading survey comments?

Written comments describe problems in the user's words. Session recordings show the exact interactions, hesitations, rage clicks, and errors that happened. Combined, you understand both what the user felt and what they experienced.

Can I use survey events in Smartlook funnels and retention analysis?

Yes. Survey submission events can be added as funnel steps or retention milestones. Track what percentage of users who reach a specific page also submit feedback, or measure retention among users who gave specific scores.

Does the event data include open-ended text responses?

Yes. The full text of open-ended answers is included in the event payload. You can search recordings by keywords within survey responses — for example, finding all sessions where a user mentioned 'checkout' in their feedback.
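A keyword match like the 'checkout' example could be approximated over exported responses as follows. This is a sketch; the `answerText` field name is an assumption, and Smartlook's own search handles this inside the app.

```javascript
// Returns the responses whose open-ended text mentions the keyword,
// case-insensitively; responses without text are skipped.
function responsesMentioning(responses, keyword) {
  const needle = keyword.toLowerCase();
  return responses.filter((r) => (r.answerText || '').toLowerCase().includes(needle));
}
```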

  • 62%

    62% of our surveys are opened on mobile devices. Responsly forms are well optimized for phones and tablets.

  • 2x

    Responsly gets 2x more answers than other popular tools on the market.

  • 98%

    Responsly earns an average satisfaction score of 98%.


Enterprise-grade security

  • GDPR compliant

    We comply with the General Data Protection Regulation (GDPR), which governs how businesses operating in Europe process personal data.

  • CCPA compliant

    We follow the California Consumer Privacy Act (CCPA), the US state of California's law defining how users' personal data must be handled.

  • SSL & 2-Factor Authentication

    All connections are protected by TLS 1.2 and AES with a 256-bit key. Enable 2-Factor Authentication for even better security.

  • SSO

    Sign in users with Single Sign-On (SSO) and manage their access within your team. Set permissions and resource access.

“Responsly platform helps us to manage customer satisfaction and communication within our organization.”

Alicja Zborowska, Administration Specialist, Red Bull

“We automated the product experience management process.”

Bayer

“Managing customer experience is made easy with Responsly.”

KraftHeinz

“Our suppliers are surveyed quickly and efficiently.”

Danone
Feel the Responsly advantage over other products

Talk to us!