
EUC Score Toolset

Benny Tritsch, 2 September 2024

This article was first published on LinkedIn.

When digital workplace users complain about insufficient performance, identifying the reason can be challenging. The first step is to replace user anecdotes with reproducible facts gathered through performance tests. The EUC Score toolset measures and visualizes the perceived user experience of a digital workplace. Its underlying scientific methodology is based on simulated user workloads, telemetry data collectors and endpoint screen recordings. Combined with a unique analysis and visualization component, EUC Score produces objective, reliable and reproducible performance information. Anecdotes and assumptions can be either confirmed or debunked. Consensus on results is achieved through expert reviews and iterative revision of concepts and theories.

In this context, “trust the science” is a synonym for the rational process of systematic observation, question, hypothesis, experiment, and data analysis. The EUC Score toolset includes sophisticated components that collect system configuration settings before test runs and telemetry data during them. The intuitive data analysis tool Sync Player plays back screen recordings and animates the telemetry data generated during EUC Score test sequences in a synchronized fashion. The advantage of this scientific approach and the unprecedented data visualization is that decision makers, infrastructure designers and product managers can rely on unbiased EUC Score test results. In a previous article, I published the quality criteria for end-user computing benchmarking tests.
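To illustrate the general idea of time-aligned telemetry (this is a minimal sketch, not the actual EUC Score collector, whose internals are not described here), the following Python example samples basic host metrics against a shared time reference so they could later be replayed alongside a screen recording that starts at the same moment. The chosen metrics, CSV format and psutil dependency are assumptions made purely for illustration.

# Minimal sketch of time-aligned telemetry sampling (illustration only,
# not the EUC Score collector). Requires the third-party psutil package.

import csv
import time

import psutil


def collect_telemetry(output_path: str, duration_s: int = 60, interval_s: float = 1.0) -> None:
    """Write one CSV row per sample: elapsed seconds, CPU percent, memory percent."""
    start = time.monotonic()
    with open(output_path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["elapsed_s", "cpu_percent", "mem_percent"])
        elapsed = 0.0
        while elapsed < duration_s:
            # Timestamps are relative to the shared start, so the rows can be
            # animated in sync with a screen recording that began at t = 0.
            writer.writerow([
                round(elapsed, 3),
                psutil.cpu_percent(interval=None),
                psutil.virtual_memory().percent,
            ])
            time.sleep(interval_s)
            elapsed = time.monotonic() - start


if __name__ == "__main__":
    # Start this together with the screen recording so both share t = 0.
    collect_telemetry("telemetry.csv", duration_s=30)

A synchronized player can then step through the recording and look up the telemetry row with the closest elapsed timestamp, which is the essence of the playback concept described above.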

Why would you want to use EUC Score? The most popular use cases are guided proofs of concept in digital workplace migration projects, quality assurance in EUC software development processes, physical and virtual desktop design comparisons, Desktop-as-a-Service troubleshooting, and remoting protocol benchmarking.

Here are the most important EUC Score links: