The Art of Measuring Remote End User Experience

“To me, when we say mobile first, it’s not the mobility of the device, it’s actually the mobility of the individual experience,” Microsoft CEO Satya Nadella said in November 2014 at a UK event. When remoting into Windows desktops and applications is part of your enterprise mobility strategy, acceptable end user experience is an important success factor. Unfortunately, traditional benchmarking parameters, such as frame rates and system performance counters, do not entirely represent the perceived end user experience (EUX) on a remote client. But what are the parameters and metrics that allow you to judge remote EUX and manage customer expectations?

There are predictions that within the next couple of years, customer experience (CX) or end user experience (EUX) will overtake price and product as the key brand differentiator. But how can the quality of CX and EUX be measured in remote environments, allowing reliable statements about what end users can expect? To be very clear, remote CX and EUX are NOT about application usability or user interface design. They are about the performance an end user perceives when interacting with a remote system. Benchmarking remote EUX is therefore more reminiscent of performance and load testing for online games.

But what exactly are the tests about? Performance tests evaluate the speed of individual areas of an environment. A test tool must therefore be able to simulate many simultaneous accesses in order to determine the respective metrics, and the results can be integrated into the corresponding service level agreements. With this type of test it is possible, for example, to determine the application startup and dialog response times of a server-hosted application under different conditions.
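
As an illustration, the sketch below shows what such a startup-time measurement could look like in Python. It is only a minimal example, not our actual tooling; the `window_is_ready()` readiness probe is a hypothetical placeholder for whatever mechanism a real test tool would use (UI automation, pixel matching on the client screen, etc.) to detect that the application accepts input.

```python
# Minimal sketch of an application startup-time measurement.
# window_is_ready() is a hypothetical readiness probe; replace it with
# UI automation, pixel matching or another detection mechanism.

import subprocess
import time


def window_is_ready() -> bool:
    """Hypothetical: return True once the application accepts input."""
    raise NotImplementedError


def measure_startup(command: list[str], timeout: float = 60.0) -> float:
    """Return the seconds between launching the command and readiness."""
    start = time.monotonic()
    subprocess.Popen(command)            # launch the application under test
    while time.monotonic() - start < timeout:
        if window_is_ready():
            return time.monotonic() - start
        time.sleep(0.05)                 # poll at 50 ms resolution
    raise TimeoutError(f"{command!r} not ready within {timeout} s")


if __name__ == "__main__":
    print(f"Startup time: {measure_startup(['notepad.exe']):.2f} s")
```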

A load test subjects the environment to the kind of access and usage rate expected in routine operation. This requires defining maximum response times for the individual system components and aligning them with the current state of the technical implementation. The corresponding measurements need realistic load samples simulated by the test tool. The results show whether the response times are within the expected limits.
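
A minimal load-test sketch along these lines could look as follows, assuming a hypothetical `run_user_workload()` function that replays a scripted user session and returns the measured per-action response times. The aggregation and SLA check are the interesting part; the workload itself depends entirely on the environment under test.

```python
# Sketch: replay a realistic workload for N simulated users in parallel
# and check whether response times stay within an SLA limit.
# run_user_workload() is a hypothetical stand-in for the scripted session
# (logon, start application, type, scroll, ...).

import statistics
from concurrent.futures import ThreadPoolExecutor


def run_user_workload(user_id: int) -> list[float]:
    """Hypothetical scripted session; returns per-action response times (s)."""
    raise NotImplementedError


def load_test(users: int = 50, sla_limit: float = 0.5) -> None:
    with ThreadPoolExecutor(max_workers=users) as pool:
        results = list(pool.map(run_user_workload, range(users)))

    samples = [t for per_user in results for t in per_user]
    p95 = statistics.quantiles(samples, n=20)[-1]    # 95th percentile
    print(f"median {statistics.median(samples):.3f} s, p95 {p95:.3f} s")
    print("SLA met" if p95 <= sla_limit else "SLA violated")
```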

Performance and load tests should not be confused with scalability or stress tests, which describe the system behavior in relation to predefined criteria, such as UI response times, as the number of users is increased. The main goal of a stress test is to find out when an environment starts generating errors or unacceptable latencies under excessive loads, and whether it reverts to normal after extreme stress.
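
The ramp-up logic of such a stress test can be sketched in a few lines. Again, `measure_step()` is a hypothetical probe that runs one workload iteration at a given user count; the point of the sketch is the loop that keeps increasing the load until errors or unacceptable latencies appear.

```python
# Sketch of a stress-test ramp: keep adding simulated users until the
# environment produces errors or unacceptable latencies, then report the
# breaking point. measure_step() is a hypothetical measurement probe.


def measure_step(users: int) -> tuple[float, int]:
    """Hypothetical: return (95th-percentile latency in s, error count)."""
    raise NotImplementedError


def find_breaking_point(max_users: int = 500, step: int = 25,
                        latency_limit: float = 1.0):
    for users in range(step, max_users + 1, step):
        p95, errors = measure_step(users)
        print(f"{users:4d} users: p95 {p95:.2f} s, {errors} errors")
        if errors > 0 or p95 > latency_limit:
            return users          # first load level that breaks the limits
    return None                   # environment survived the full ramp
```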

The following list shows the most important remote EUX performance and load test criteria:

  • User session initialization time, measuring the time between a user starting the logon process and being able to interact with the remote system
  • Application startup delays, measuring how long it takes for an application to reach a state where user input is accepted
  • User interface response times, measuring delayed appearance of graphics primitives and UI elements after user-initiated triggers
  • Performance impacts caused by client capabilities and media redirection
  • Image quality and compression artefacts related to limited or changing network conditions
  • Correct individual media timing according to real-time reference clock (animation and video lag or stuttering)
  • Synchrony or asynchrony of multiple media streams (sometimes also referred to as “lip-sync”)

In accordance with this list, remote EUX performance and load metrics can include logon time, reconnect time, latency at the client, multimedia performance, application startup time and user interface response time. Measuring application launch and user interface response times in a reproducible manner is relatively simple. Collecting other metrics is harder, but screen recordings are a viable option here. A frame grabber can be used to record the client screen without influencing the overall performance of a remoting test sequence. By attaching the frame grabber to a dedicated recorder PC, the process of recording videos of test sequences is completely decoupled from the environment under test. Analyzing the videos at a later time allows times, delays and image quality to be measured frame by frame. Recorded videos also allow for visual side-by-side comparisons – something that even non-technical executives understand easily.
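
As a rough illustration of the frame-by-frame analysis, the following sketch uses OpenCV to derive a user interface response time from such a recording: it takes the frame at which a user action was triggered and searches for the first subsequent frame in which a region of interest visibly changes. The trigger frame index, region coordinates and difference threshold are test-specific assumptions, not part of our actual tooling.

```python
# Sketch: derive a UI response time from a screen recording made with a
# frame grabber at a constant frame rate, by finding the first frame after
# a known trigger frame in which the region of interest changes visibly.

import cv2
import numpy as np


def response_time(video_path: str, trigger_frame: int,
                  roi=(0, 0, 400, 300), diff_threshold=5.0) -> float:
    cap = cv2.VideoCapture(video_path)
    fps = cap.get(cv2.CAP_PROP_FPS)
    cap.set(cv2.CAP_PROP_POS_FRAMES, trigger_frame)

    ok, reference = cap.read()                  # frame at the trigger moment
    if not ok:
        raise ValueError("trigger frame not found in recording")
    x, y, w, h = roi
    reference = cv2.cvtColor(reference[y:y+h, x:x+w], cv2.COLOR_BGR2GRAY)

    frame_no = trigger_frame
    while True:
        ok, frame = cap.read()
        if not ok:
            raise ValueError("no visible change before end of recording")
        frame_no += 1
        gray = cv2.cvtColor(frame[y:y+h, x:x+w], cv2.COLOR_BGR2GRAY)
        mean_diff = float(np.mean(cv2.absdiff(gray, reference)))
        if mean_diff > diff_threshold:           # first visibly changed frame
            return (frame_no - trigger_frame) / fps


# Example: print(f"{response_time('sequence.avi', trigger_frame=120):.3f} s")
```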

About five years ago, together with Shawn Bass and later with other members of TeamRGE, I began creating a remoting protocol benchmarking suite, collecting screen recordings and systematically analyzing the test results. All this resulted in an archive of over 5,000 recorded test sequences, and the number of videos is still growing as we have never stopped testing new product versions, protocols and use cases. It’s a rather simple test methodology, but it works flawlessly if done right.

Unfortunately, five years ago there was no commonly accepted and adequately designed benchmarking tool focused on measuring EUX performance in individual user sessions, so we had to build our own. And without any doubt, there is a need for such a tool. A nice statement by Citrix CEO Mark Templeton at Citrix Summit 2015 explains why: “Quality products are the lure, great services keep them around, and memorable experiences build loyalty.” As a result, only proper remote EUX benchmarking and comparison helps vendors, system integrators, consultants and customers to manage end users’ expectations and predict acceptance rates – and thus helps to build long-term loyalty.
