When we started GameBench back in 2013, it seemed that mobile gaming would forever be a distinct category. The way we played games on our phones was totally different to how we played on PC and console, and this was seen as an inevitable consequence of the limitations of mobile hardware. But now, at least from a gamer's perspective, that distinction is starting to blur to the point where it may soon disappear altogether. Gamers increasingly focus on the game they want to invest in, based very often on what their friends are playing, and consider only as an afterthought which "screen" they'll use to access it. Why is this happening?
Clients of GameBench services, including both game makers and device makers, are already familiar with our performance rating system. They use our simple, colour-coded badges on a regular basis, often monthly or quarterly, to help them visualise key data and answer vital questions such as:
- How does my game perform on the most important devices compared to similar games in its category?
- How does my device perform compared to the competition when running the most popular mobile games in the market?
- Where does my team need to focus resources to prevent poor gaming performance from hurting commercial performance?
In fact, our badges have become so important to so many decisions, from development all the way through to marketing, that it's about time we revisited what the badges mean and how they're evolving.
There are plenty of tools available that let app creators check whether poor connectivity is hurting their users. However, such app-centric solutions have rarely been tailored to the needs of game studios or gamer-focused OEMs, especially now that network-heavy games (including streamed games) have become more popular.
At GameBench Labs, we've now worked on numerous projects for our clients that involve monitoring and rating the network resilience of gaming products, so we'd like to share a few things we've learned along the way.
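To make the idea of "rating network resilience" concrete, here is a minimal sketch of how a run of latency probes might be summarised into a pass/fail verdict. The function name, the budget thresholds, and the scoring rule are all illustrative assumptions for this article, not GameBench's actual criteria.

```python
import statistics

def rate_network_resilience(rtt_ms, loss_pct,
                            rtt_budget_ms=80.0,
                            jitter_budget_ms=15.0,
                            loss_budget_pct=1.0):
    """Summarise round-trip-time samples from a gameplay session.

    rtt_ms:   per-probe round-trip times in milliseconds
    loss_pct: percentage of probes that got no reply
    All budget thresholds are hypothetical examples.
    """
    median_rtt = statistics.median(rtt_ms)
    # Variability matters as much as the average: a steady 60 ms feels
    # better in-game than an erratic 40 ms.
    jitter = statistics.pstdev(rtt_ms)
    verdict = (median_rtt <= rtt_budget_ms
               and jitter <= jitter_budget_ms
               and loss_pct <= loss_budget_pct)
    return {"median_rtt_ms": median_rtt,
            "jitter_ms": round(jitter, 1),
            "loss_pct": loss_pct,
            "pass": verdict}

# A mostly stable connection with one large spike: the median still looks
# healthy, but the single outlier blows the jitter budget, so it fails.
samples = [42.0, 45.0, 41.0, 44.0, 43.0, 120.0, 42.0, 44.0]
print(rate_network_resilience(samples, loss_pct=0.0))
```

The point of the spiky example is that averages and medians hide exactly the stalls that gamers notice, which is why any resilience rating has to account for variability, not just central tendency.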
GameBench Labs regularly conducts confidential performance tests for its clients. Occasionally, and with the explicit consent of those clients, we will publish those results for the benefit of the gaming community. This article is one such example where Huawei, a GameBench client, has allowed the results of commissioned tests to be shared.
DISCLAIMER: Huawei is a GameBench client, like many device and game makers, and it sent us pre-release devices and a list of popular Tencent games to test. However, Huawei had no control over our test methodology, results or performance ratings, which are standardised across all our testing projects.
If you've been following the launch of the P40 Pro, you might have spotted that this new Huawei phone sports a 90Hz display rather than the 60Hz that is typical of iPhones and older Androids. A higher refresh rate is potentially of interest to gamers, promising smoother and more responsive gameplay, but only if games actually animate consistently at that higher speed. And that's a big 'if'.
In fact, ever since the first 90Hz and 120Hz phones started coming out a couple of years ago, we've found that the overwhelming majority have failed to maintain their marketed rates of animation for any length of time, particularly in high-fidelity competitive games that would benefit most from the technology. The optimisation between hardware and software just wasn't there.
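To illustrate why "maintaining" a refresh rate is a stricter test than simply averaging close to it, here is a small sketch that judges a capture of per-frame times against a 90Hz budget. The function, the 5% tolerance, and the 95% hit-ratio threshold are hypothetical values chosen for this example, not GameBench's scoring formula.

```python
def refresh_rate_held(frame_times_ms, target_hz=90,
                      tolerance=0.05, min_hit_ratio=0.95):
    """Judge whether per-frame render times sustain a target refresh rate.

    frame_times_ms: gaps between consecutive frames, in milliseconds.
    A frame "hits" the target if it lands within `tolerance` of the
    ideal frame budget (1000/90 ≈ 11.1 ms at 90Hz).
    """
    budget = 1000.0 / target_hz
    hits = sum(1 for t in frame_times_ms if t <= budget * (1 + tolerance))
    hit_ratio = hits / len(frame_times_ms)
    avg_fps = 1000.0 / (sum(frame_times_ms) / len(frame_times_ms))
    return {"avg_fps": round(avg_fps, 1),
            "hit_ratio": hit_ratio,
            "sustained": hit_ratio >= min_hit_ratio}

# A capture that drops every tenth frame to 22 ms still averages near
# 82 fps, yet fails the sustained-rate check: average fps alone can
# make a stuttery game look smooth.
stuttery = ([11.1] * 18 + [22.2] * 2) * 5
print(refresh_rate_held(stuttery))
```

This is essentially the gap between "marketed rate" and "experienced rate": a phone can hit 90fps on paper while still delivering visible stutter, which is what our sustained-performance testing is designed to catch.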
Fortunately for gamers, the P40 Pro is different. It achieves our GameBench Ultra 90 badge with both Game for Peace and QQ Speed, reflecting solid 90fps animation even after prolonged gameplay. Read on to see our test data.