
Segment Visual Tagger


Visual Tagger is a no-code tool that helps Segment customers set up website tracking quickly by simply pointing and clicking on their own websites.






Role
Product Designer
Company
Segment
Duration
8 months (September 2019 - April 2020)


Responsibilities
End-to-end process, including user research, ideation, user flows, prototyping, and execution
Team
5 Engineers, 1 EM, 1 PM
XFN partners
Research Lead, Data Analyst, PMMs, Sales






Business Objectives

A collective dream at Segment is that every company in the world uses good data to power their growth in a customer-centric way. The first step in making this dream a reality is to enable all of our customers to get clean, relevant, and useful data into Segment. Unfortunately, due to the technical effort involved in instrumenting applications, companies of all sizes find friction in this phase of their Segment journey.

Visual Tagger was developed to reduce friction for customers onboarding to Segment. It is a no-code tool that helps Segment customers set up website tracking quickly by simply pointing and clicking the elements they want to track on their own websites.



Problem Statement

Originally, the product was built as a Chrome Extension with poor usability. Customers struggled to use the product, which resulted in a flood of Zendesk tickets. User satisfaction was low, and the product was unable to meet the needs of our customers. First, the team had to find out what the problems were, prioritize those problems and new features, and build a roadmap toward our vision.





Role

I was recruited to lead the product redesign as we rebuilt it inside the Segment web app while validating product-market fit. At the time, the product had only basic functionality. Since then, I led the design, tested new features with our users, and validated the business impact with our PM, EM, engineers, data analysts, and researcher.



Before




After






Desired Business Objectives
Increase the activation rate by 30% and improve customer satisfaction.
Scope
We had rapid development cycles and a short timeline to take the product from Alpha release through Beta release to GA launch within 8 months.
Constraints
Technology-wise, we bind website tracking to CSS selectors and support only client-side tracking, excluding server-side tracking.
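As a rough illustration of this constraint, a "tag" can be thought of as a CSS selector paired with an event name. The names below (`makeTag`, `trackCallsFor`) are hypothetical, not Segment's actual internals; only the `analytics.track(event, properties)` call they ultimately feed is Segment's public client-side API.

```javascript
// Hypothetical sketch of "binding website tracking to a CSS selector".
// makeTag and trackCallsFor are illustrative names, not Segment internals.

// A tag pairs a CSS selector with the event it should fire.
function makeTag(selector, eventName, properties = {}) {
  return { selector, eventName, properties };
}

// Given the set of tags and a predicate answering "does the clicked
// element match this selector?", return the track calls to emit.
// The predicate stands in for Element.closest(selector) so the logic
// can run (and be tested) outside a browser.
function trackCallsFor(tags, matches) {
  return tags
    .filter((tag) => matches(tag.selector))
    .map((tag) => ({ event: tag.eventName, properties: tag.properties }));
}

const tags = [
  makeTag('button.add-to-cart', 'Product Added', { page: 'product' }),
  makeTag('a.checkout', 'Checkout Started'),
];

// Simulate a click on an element matching 'button.add-to-cart':
const calls = trackCallsFor(tags, (sel) => sel === 'button.add-to-cart');
// calls: [{ event: 'Product Added', properties: { page: 'product' } }]
```

In a browser, a delegated click listener would feed real DOM matches into the same logic and forward each result to `analytics.track`. Because everything hangs off CSS selectors in the rendered page, server-side tracking is out of scope by construction.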




Discovery / User Research

1. Our users are less technical and are looking for analytics guidance.

By conducting numerous user interviews at the beginning, we were able to define our user personas: marketers, product managers, and business owners who don’t necessarily have technical knowledge about coding and/or analytics. Our users want to learn analytics by doing it, and they want to enable analytics for their companies without relying on or waiting for developers, who are a scarce resource for them.

“The vision for building a no-code tool did not just suddenly come to us one day. It was actually based on what we learned about our users.”

After gaining this insight, we started looking for solutions built around friendly user interfaces and friendlier analytics language.

2. The mental models were mismatched.

Users had a hard time using the product because, fundamentally, it did not work the way they expected. There were many problems in the UIs and the original product experience. Through numerous iterations and rounds of testing, we gradually found solutions to bridge the gaps by redesigning the user flow (many times over), the navigation, and the interfaces. We also drastically simplified the UIs and explained things more clearly in the product with friendlier language and better onboarding experiences.






Ideation

Original product experience as Chrome Extension



The first assumption the team made:
“Switching between modes” is a natural mental model for our users.

As a user, you would click “Navigate” in the bottom-left corner to navigate to the web pages where you want to track visitor behavior. You would then click “Event Tagger” followed by “Tag an event” to start implementing the tracking.

By now, you may have already noticed how poorly this experience was designed: the positions of the buttons do not flow naturally as users interact with the tool, the concept of “switching between modes” is not clearly visualized in the UI, and so on.

A few things I wanted to do as I onboarded to the team:
1. Fix obvious UI/UX problems based on best practices and past experience.
2. Test the assumptions with our customers.
3. Outline user personas -- who are our primary users?




One of the very first design iterations | Aligning user mental models
[insert flows here]





We want to test:

First assumption: Users would be familiar with “switching between modes” to perform the task of tracking visitor behavior on their own websites.

Second assumption: Users want to track a website funnel easily and quickly as if they were website visitors themselves.


If our users want to track a funnel, they can implement tracking as if they were website visitors themselves. Take Allbirds as an example: pretend we all work as PMs at Allbirds and want to track the funnel conversion from a visitor landing on the Allbirds homepage, to viewing the Women’s Wool Runner product page, to clicking the “Add to Cart” button. To implement all of this, we could just turn on “creation mode” and start clicking around, and we would see two to three trackings appear on the right-hand side as we click.
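That funnel can be sketched as the sequence of Segment calls a tagged site would emit. `analytics.page` and `analytics.track` are Segment's real analytics.js methods; the stub, event names, and properties below are illustrative, just to show the shape of the data.

```javascript
// Minimal stub of the analytics.js interface so the sketch runs anywhere;
// in production, `analytics` is the object Segment's snippet loads on the page.
const recorded = [];
const analytics = {
  page: (name, properties = {}) => recorded.push({ type: 'page', name, properties }),
  track: (event, properties = {}) => recorded.push({ type: 'track', event, properties }),
};

// The three funnel steps, as the calls a tagged Allbirds site might emit
// (event names and properties are hypothetical):
analytics.page('Home');
analytics.page('Product', { name: "Women's Wool Runner" });
analytics.track('Add to Cart Clicked', { product: "Women's Wool Runner" });

// `recorded` now holds the funnel in order: Home -> Product -> Add to Cart.
```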

So that was our assumption about our users’ mental model. When we tested with users, they could figure out how it worked, but it was a disaster. Why?

Because implementing tracking is a job that needs to be done carefully. There are consequences to implementing tracking poorly: your data gets messed up. This experience made it far too easy for users to end up adding a bunch of poor trackings. It also misaligned with our users’ mental model; they did not feel guided through the experience and often found themselves lost.





That led to many more iterations.

This is what we landed on, and it is also how the product looks today.

We focused on fixing the interaction of creating trackings. We removed the model of switching modes and designed the experience to make users feel more guided, by letting them focus on and perform only one or a few actions at a time.

Now say you come to the Segment app: you learn a little about analytics and what event tracking is, you open Visual Tagger, and you click the blue “Add Event” button.

You now see that you can track a button or a link. You click “Button,”

and you can hover over the elements you want to track.

In this case, you click the “learn more” button

A modal then pops up. You see controls on the right-hand side where you can enter the name you want; you don’t see any code. You do see properties, but you can click “learn more” to fill the knowledge gap if you don’t know what those are.

We focus users on one or a few tasks at a time.

At the same time, we improve users' mental models by explaining things better and adding tips as users hover around, so that their models more accurately reflect our system.

The result is that Zendesk tickets related to usability issues dropped dramatically; I believe the count fell close to zero.







[first iteration]
- user flow
- wireframes. lo-fi
[final iteration]
- user testing
- how we get here?
- pre-launch validation













Other Challenges

  1. Visual Tagger is a highly interactive product that enables our users to achieve a task that traditionally could only be done with code. How might we design the product elegantly, with simple interfaces that offer numerous complicated technical capabilities?
  2. The product experience could be very different from the rest of the Segment web app given its interactivity. How might we bridge that expectation gap?
  3. The product was at such an early phase that we were still trying to validate product-market fit. How might we navigate the ambiguity and build a Minimum Lovable Product?







Impact

80% CSAT score

  1. Pre-launch usability research deep dive and validation
  2. Post-launch product research and business value evaluation






Reflection

- Improving misaligned mental models
- Radically simplifying the product












︎   ︎   ︎   ︎  ︎

© QIAO HUANG. ALL RIGHTS RESERVED