Good Mobile Experience Changes In Real Time

Mobile analytics is a daily or hourly pursuit, not just numbers in a spreadsheet

Since user experience is vital to the success of a web or mobile system, it is critical to establish feedback loops within the system to inform potential changes. There are multiple methods for building effective feedback loops, both manual and automated; the end result of either is establishing how users felt about their experience.

Although user experience as a whole is somewhat subjective, metrics and analytics are one method that can provide objective feedback and insight into how the user experience is trending. Analytics has traditionally been used for marketing purposes, informing strategy, implementation, and trend analysis; increasingly, however, user experience teams rely on this quantitative data source to support project research and design. Modern system analytics are key to tracking the user experience and should be reviewed daily when available. Used effectively and efficiently through a well-designed system, the data can change a user’s experience for the better, in some cases in real time.

This principle in practice:

In order to analyze metrics and data effectively, success objectives must first be established. How do you define success? To begin, here are some example goals for establishing a project’s user experience success (a sketch for checking the first two against exported data follows this list):

  1. 100 new registered users per month
  2. 70% average user retention rate per month
  3. Mobile experience consistent across all supported mobile devices
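
As a rough illustration, the sketch below checks the first two goals against a simple export of user records. It is a minimal Python example; the record shape and field names (user_id, registered_on, last_active_on) are assumptions for illustration, not the output of any particular analytics product.

```python
from datetime import date

# Hypothetical user records exported from an analytics or account system.
# Field names (user_id, registered_on, last_active_on) are illustrative only.
users = [
    {"user_id": 1, "registered_on": date(2023, 4, 3), "last_active_on": date(2023, 5, 20)},
    {"user_id": 2, "registered_on": date(2023, 5, 2), "last_active_on": date(2023, 5, 28)},
    {"user_id": 3, "registered_on": date(2023, 5, 10), "last_active_on": date(2023, 5, 11)},
]

def new_registrations(users, year, month):
    """Goal 1: count users who registered in the given month."""
    return sum(1 for u in users
               if u["registered_on"].year == year and u["registered_on"].month == month)

def retention_rate(users, year, month):
    """Goal 2: share of previously registered users still active in the given month."""
    existing = [u for u in users
                if (u["registered_on"].year, u["registered_on"].month) < (year, month)]
    if not existing:
        return 0.0
    active = [u for u in existing
              if (u["last_active_on"].year, u["last_active_on"].month) >= (year, month)]
    return len(active) / len(existing)

print(new_registrations(users, 2023, 5))        # 2 new users registered in May
print(f"{retention_rate(users, 2023, 5):.0%}")  # 100% of earlier users were active in May
```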

Initially, one effective strategy for measuring the success of a project’s designed user experience is to select a few key metrics and focus on them over time. One of the most prominent issues with analytics is that, without proper direction, they can become a distraction or just numbers without context or actionable interpretation. Mobile analytics is data, and it provides valuable information only when used in a way that furthers an agency’s goals. The question then comes down to: what set(s) of analytics data are relevant in the context of the project? Why is the project gathering and tracking metrics to begin with? These are perfectly reasonable questions and key to establishing an effective user experience feedback loop across any project.

Goals like these are high-level objectives established and agreed upon by the project’s organizational stakeholders as measures of success. With success measures established, here are some example categories of relevant user experience analytics implementations:

  • Notifications of blockers in the system’s essential flows.
  • Identification of the root causes of those blockers.

Notifiers provide continuously monitored information on an hourly, daily, or weekly basis. These analytics can be used to flag specific issues or to provide supporting insight into user behavior. Examples of potential notification analytics (a sketch for deriving them from raw event data follows this list):

  • Unique page views over a defined period of time
  • Average time on page per session
  • User actions (defined and tracked: clicking a button, downloading a PDF, filling out a form, etc.)
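
As a concrete illustration, here is a minimal Python sketch that derives these three notification metrics from a flat export of raw events. The event fields (session_id, page, action, seconds_on_page) and the sample values are assumptions for illustration only, not a specific analytics schema.

```python
from collections import defaultdict

# Hypothetical raw events exported from an analytics pipeline.
# Fields (session_id, page, action, seconds_on_page) are illustrative only.
events = [
    {"session_id": "a1", "page": "/home",   "action": "pageview", "seconds_on_page": 40},
    {"session_id": "a1", "page": "/signup", "action": "pageview", "seconds_on_page": 95},
    {"session_id": "a1", "page": "/signup", "action": "form_submit", "seconds_on_page": 0},
    {"session_id": "b2", "page": "/home",   "action": "pageview", "seconds_on_page": 15},
    {"session_id": "b2", "page": "/guide",  "action": "pdf_download", "seconds_on_page": 0},
]

# Unique page views: one count per (session, page) pair.
unique_views = len({(e["session_id"], e["page"]) for e in events if e["action"] == "pageview"})

# Average time on page per session.
time_per_session = defaultdict(int)
for e in events:
    if e["action"] == "pageview":
        time_per_session[e["session_id"]] += e["seconds_on_page"]
avg_time = sum(time_per_session.values()) / len(time_per_session)

# Defined user actions (button clicks, PDF downloads, form submissions, ...).
action_counts = defaultdict(int)
for e in events:
    if e["action"] != "pageview":
        action_counts[e["action"]] += 1

print(unique_views)          # 3 unique page views
print(avg_time)              # 75.0 seconds per session on average
print(dict(action_counts))   # {'form_submit': 1, 'pdf_download': 1}
```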

Identifiers are analytics implemented and used in conjunction with notifiers to better understand overall user behavior and experience. Identifiers fall into several categories, including but not limited to: traffic issues, technical issues, content issues, navigation issues, and UI design issues. Some examples are below with their equivalent Google Analytics (GA) implementations as a starting point (a sketch of the traffic breakdown follows this list):

  • Traffic: determine whether a single traffic source is responsible for any changes in page views.
    • GA: Pages – Uniform Resource Identifier (URI) with Source as secondary dimension
  • Technical: determine whether any UI elements are broken or non-functional.
    • GA: Event Pages – tracking of the pages with defined user actions (see above)
  • Content: determine whether taxonomies or other media are confusing users.
    • GA: In-Page Analytics – percentages of links clicked by users
  • Navigation: determine whether specific buttons or menus are not being used as intended.
    • GA: Pages – Uniform Resource Identifier (URI) with Navigation Summary as secondary dimension
  • UI Design: determine whether fonts, UI elements, buttons, view layouts, or other media are disrupting intended user flows.
    • GA: In-Page Analytics – percentages of links clicked by users, or heat maps of user clicks
  • Users: don’t forget that user comments and feedback can also tell you how users feel about your work – this is more qualitative but can be valuable in the feedback process.
    • Sources: comments on app review pages, emails to the help desk, etc.
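
To show the idea behind the traffic identifier, here is a minimal Python sketch that breaks page views down by traffic source, mirroring a “Pages with Source as secondary dimension” report. The field names and sample rows are assumptions for illustration.

```python
from collections import Counter

# Hypothetical page-view rows with their traffic source attached.
# Field names (page, source) are illustrative only.
pageviews = [
    {"page": "/signup", "source": "newsletter"},
    {"page": "/signup", "source": "newsletter"},
    {"page": "/signup", "source": "search"},
    {"page": "/home",   "source": "search"},
    {"page": "/home",   "source": "direct"},
]

# Count views per (page, source) pair to see whether a single source
# is responsible for a change in a page's traffic.
by_page_and_source = Counter((row["page"], row["source"]) for row in pageviews)

for (page, source), views in by_page_and_source.most_common():
    print(f"{page:<10} {source:<12} {views}")
# /signup    newsletter   2
# /signup    search       1
# /home      search       1
# /home      direct       1
```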

Once a targeted core set of analytics has been implemented and monitored, a basic feedback loop is in place. The project team can then use the analytics and data to evaluate opportunities to improve the user experience. Does the experience need to change? How should it change? Does the available information warrant additional user testing? Specifically, does it call for A/B testing?
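
If A/B testing is warranted, the comparison ultimately comes down to whether the difference between two variants’ conversion rates is larger than chance would explain. The sketch below is a minimal two-proportion z-test in Python using only the standard library; the visitor and conversion counts are hypothetical.

```python
import math

def ab_test_z(conversions_a, visitors_a, conversions_b, visitors_b):
    """Two-sided two-proportion z-test comparing variant conversion rates."""
    p_a = conversions_a / visitors_a
    p_b = conversions_b / visitors_b
    pooled = (conversions_a + conversions_b) / (visitors_a + visitors_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / visitors_a + 1 / visitors_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return p_a, p_b, z, p_value

# Hypothetical numbers: variant B's new layout vs. variant A's current layout.
p_a, p_b, z, p = ab_test_z(conversions_a=120, visitors_a=2400,
                           conversions_b=156, visitors_b=2400)
print(f"A: {p_a:.1%}  B: {p_b:.1%}  z={z:.2f}  p={p:.3f}")
# A small p-value (commonly < 0.05) suggests the difference is unlikely to be noise.
```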