- GameBench Staff
- 06. August 2020
Clients of GameBench services, including both game makers and device makers, are already familiar with our performance rating system. They use our simple, color-coded badges on a regular basis, often monthly or quarterly, to help them visualize key data and make vital decisions by answering questions such as:
- How does my game perform on the most important devices compared to similar games in its category?
- How does my device perform compared to the competition when running the most popular mobile games in the market?
- Where does my team need to focus resources to prevent poor gaming performance from hurting commercial performance?
In fact, our badges have become so important to so many decisions, from development all the way through to marketing, that it's about time we revisited what the badges mean and how they're evolving.
What Are GameBench Badges?
GameBench badges are simple visualizations which reveal whether a gaming experience happened the way it was intended.
A great thing about badges is that they take a gamer’s perspective rather than an engineer’s. They help our clients see whether an experience met certain healthy thresholds, without worrying prematurely about the causes of any problems or how readily fixable they are.
Every gaming experience is the result of many different variables, including multiple products working together in harmony (or disharmony), so starting with questions of causality isn’t always helpful or conducive to good prioritization.
For example, cloud gaming can easily be affected by at least four different products: which device you use, which game you choose, which streaming service you subscribe to and which WiFi or cellular network you connect through. Different engineering teams in this chain might prioritize issues that they can directly fix, and deprioritize those perceived as being outside of their domain. They might also deprioritize issues where the range of uncontrolled variables means a cause can’t immediately be pinned down. Our badges deliberately push against this way of thinking.
Yes, we keep a log of all key variables and products involved in a particular test, because this context is vital and a different context could result in a different badge. However, the first step should be to look at whether the end result did or didn’t work well. This then highlights the issues most likely to cause a gamer to abandon an experience, even if these issues had complex causes or required multiple companies to cooperate towards a fix. This sort of common language and set of standards has been missing from the gaming industry.
What Does Each Badge Mean?
Our badges summarise objective performance metrics that would otherwise need to be presented in a dense spreadsheet. These metrics cover many different aspects of a gaming experience, from visual fluidity through to input latency, image consistency, battery drain and so on. Each badge represents just one of these aspects, and it is color-coded to make it quick to understand whether this aspect was within desired thresholds. Based on client feedback, we’re continually improving both the design of our badges and the thresholds they represent. We’ve recently decided to change the names of some of our badges as shown in the table below:
| Badge Name | Badge Meaning |
| --- | --- |
| Ultra | The underlying metrics indicate that the experience would have satisfied the most demanding gamer, such as a hardcore competitive or eSports gamer. |
| Enthusiast (previously "Good") | The underlying metrics indicate that the experience would have satisfied an enthusiast gamer who knows what good performance looks and feels like, even though they don’t rely on it as much as a hardcore competitive gamer. |
| Casual (previously "Basic") | The underlying metrics indicate that the experience was good enough to satisfy someone looking for a fun experience, without taking things too seriously. |
| Out-of-Threshold / OOT (previously "Poor") | The underlying metrics did not meet any of GameBench’s thresholds. Depending on how they’re designed, not all games necessarily need to meet our threshold for things like frame rate or latency. It’s not correct to assume or imply that such games are “poor,” so we’re switching to a more neutral badge. |
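To make the renaming concrete, here’s a minimal sketch of how the four tiers and their legacy names could be modelled in your own reporting scripts. It’s purely illustrative: the class, names and ordering below are not part of any GameBench tool or data format.

```python
# Illustrative only: one possible way to model the badge tiers and the
# legacy names listed in the table above. Not a GameBench API or schema.
from enum import IntEnum

class BadgeTier(IntEnum):
    OUT_OF_THRESHOLD = 0  # previously "Poor"
    CASUAL = 1            # previously "Basic"
    ENTHUSIAST = 2        # previously "Good"
    ULTRA = 3             # unchanged

# Legacy badge names from older reports mapped onto the renamed tiers.
LEGACY_NAMES = {
    "Poor": BadgeTier.OUT_OF_THRESHOLD,
    "Basic": BadgeTier.CASUAL,
    "Good": BadgeTier.ENTHUSIAST,
    "Ultra": BadgeTier.ULTRA,
}

# Because the tiers are ordered, simple comparisons work, e.g. filtering
# for results that reached at least the Enthusiast tier:
assert LEGACY_NAMES["Good"] >= BadgeTier.ENTHUSIAST
```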
How GameBench Badges Work
Although GameBench badges summarise objective metrics, they also contain an element of subjectivity in the form of the thresholds they represent. In other words, how high or low does a particular metric need to be in order to be considered "good"?
Given this element of subjectivity, our approach is to always make our thresholds transparent and to remain open to discussing and refining them. What matters most is that they continue to serve gamers’ interests, even as mobile gaming continually changes.
To determine whether a threshold makes sense, we use the following sources of data:
- We are gamers and we have been participating in community discourse around gaming for many years. This discourse constitutes a valid source of data and has already established many meaningful thresholds that we do not necessarily need to reinvent. This includes 30fps as a minimum frame rate for enthusiast gamers. Another example would be 133ms as a threshold for enthusiast-grade latency -- something gaming sites like Digital Foundry have been discussing for more than a decade.
- Thanks to GameBench tools, we have seven years' worth of performance data covering thousands of devices and tens of thousands of games, so we know what gamers are experiencing in the wild.
We have also aggregated millions of user reviews of these experiences, so we have been able to correlate and mine the two datasets to find interesting thresholds. For example, when a gaming experience depletes a phone battery by more than one third in one hour, we've found that complaints about battery life become far more likely. That is why our "Casual" badge for battery life requires battery drain to be no higher than 33% per hour; this threshold, along with the frame rate and latency examples above, appears in the sketch after this list.
- Client discussions and feedback. GameBench is in the unique position of being in the middle of the gaming ecosystem. Our clients include chip designers, phone makers, platform holders (including OS and engine makers), studios and publishers, and all of these have contributed to discussions about the value of our metrics and their associated thresholds. The most common feedback is that our thresholds are “too harsh”, which tells us that we are doing something right in defending the interests of end users.
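To pull the example thresholds above together, here’s a small, purely illustrative sketch of how one session’s metrics could be checked against tier thresholds. The 30fps frame-rate floor, the 133ms latency ceiling and the 33%-per-hour battery-drain ceiling come from the examples above; every other cut-off in the sketch, and all of the names, are invented placeholders rather than published GameBench thresholds.

```python
# Illustrative only: a toy threshold check built around the example numbers
# mentioned in this post (30 fps, 133 ms, 33%/hour). The remaining cut-offs
# are invented placeholders, NOT GameBench's published thresholds.
from dataclasses import dataclass

@dataclass
class SessionMetrics:
    median_fps: float            # frames per second
    input_latency_ms: float      # milliseconds
    battery_drain_pct_hr: float  # percent of battery per hour

def frame_rate_badge(fps: float) -> str:
    # 30 fps is cited above as a minimum for enthusiast gamers;
    # the 60 fps and 20 fps cut-offs are made-up examples.
    if fps >= 60:
        return "Ultra"
    if fps >= 30:
        return "Enthusiast"
    if fps >= 20:
        return "Casual"
    return "Out-of-Threshold"

def latency_badge(latency_ms: float) -> str:
    # 133 ms is cited above as enthusiast-grade latency;
    # the other cut-offs are made-up examples.
    if latency_ms <= 83:
        return "Ultra"
    if latency_ms <= 133:
        return "Enthusiast"
    if latency_ms <= 200:
        return "Casual"
    return "Out-of-Threshold"

def battery_badge(drain_pct_per_hour: float) -> str:
    # The Casual battery badge requires no more than 33% drain per hour;
    # the stricter tiers here are made-up examples.
    if drain_pct_per_hour <= 15:
        return "Ultra"
    if drain_pct_per_hour <= 25:
        return "Enthusiast"
    if drain_pct_per_hour <= 33:
        return "Casual"
    return "Out-of-Threshold"

if __name__ == "__main__":
    session = SessionMetrics(median_fps=48, input_latency_ms=120, battery_drain_pct_hr=29)
    print(frame_rate_badge(session.median_fps))          # Enthusiast
    print(latency_badge(session.input_latency_ms))       # Enthusiast
    print(battery_badge(session.battery_drain_pct_hr))   # Casual
```

Real badges summarise far richer metric sets than this toy example, but the principle is the same: transparent thresholds applied to measured data.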
How GameBench Badges Change Over Time
Our badges are based on thresholds which constantly need to evolve and adapt to gamers' changing expectations. This is essential, because gaming is progressing rapidly and being delivered in new and exciting ways that are not always directly comparable to what came before.
For us, the goal is not to create and defend permanent thresholds, but to curate and evolve meaningful thresholds over time, while always being transparent about how and why we're doing this, and continually discussing these topics with our clients.
Deliver seamless gaming experiences
No matter where your company or your product sits in the gaming ecosystem, GameBench badges can help you get a clear view of how you compare to competitors and determine where your strongest and weakest points lie.
Once this is done, our Labs service can go further, identifying the specific causes of specific issues and helping you prioritize and track efforts towards fixing them.