Android’s performance gap relative to iOS is nothing new, but the nature of this gap is changing in subtle ways that we didn’t predict when we first reported on the problem back in 2015. In this blog, we’ll highlight a couple of typical examples which demonstrate why Android gamers need to be as vigilant as ever to ensure they're getting the same quality of experience as iPhone owners, and why we at GameBench are building new tools to help maintain this vigilance. Please read on for more…
To be fair, Android gaming has improved significantly over the past few years, especially in terms of animation fluidity (i.e., frame rates). This is thanks to consumer pressure, often expressed through negative Play Store reviews and eventual abandonment of Android games, combined with better-coordinated responses from studios, game engines, phone makers and chip designers. Overall, Android is becoming a great platform for gaming, but it still has plenty of issues. For every good example of a cross-platform experience, there’s usually a counter example that we can easily dig out from our in-house test rig or crowdsourced data. Often, these examples show that there's a continuing gap in image quality rather than frame rates.
iPhone games deliver high-fidelity visual effects with great consistency. That's not always the case on Android. Put an Android and iOS device side-by-side while running a real-world game or gaming benchmark and you'll often see that the Android device is missing some visual effects. These effects might have been deliberately removed (e.g., by an Android phone maker trying to get a higher benchmark score), or they might simply be broken because the Android experience was too difficult to optimise. Whatever the cause, the result is the same: worse graphics.
If you play PUBG Mobile on a Huawei Mate 20 Pro, Mate 20 X or Honor View 20, for instance, you can request High Dynamic Range (HDR) mode in the graphics menu and yet not see any HDR during gameplay. Meanwhile, the iPhone version of the game does show HDR effects, as do some other Android devices such as the Samsung Galaxy Note 9. Take a look at the screenshots below and you'll see differences in the sun bloom effect cast over the player character and trees, as well as in how the sun stops being visible through the clouds when HDR is removed:
Not everyone will see this as a big deal. Competitive gamers might not be fussed about HDR effects, and indeed our own PUBG tests and ratings have deliberately avoided HDR mode in order to maximise the frame rate. But that's not really the point. Games are works of art as well as technology; they operate at the subconscious as well as the conscious level, so visual effects should be displayed as the artist intended.
We’ve written to Huawei about this HDR issue, and the company says it's a bug that it's working to fix. This won't be easy to do without reducing performance, because HDR is computationally heavy. We'll therefore reserve judgement until we get a chance to test the fix and confirm that the compromise hasn't simply been shifted to some other aspect of the game. In the meantime, our rating methodology is clear: we can't give top ratings to game experiences that sacrifice image quality to improve frame rate (or vice versa).
Optimisations that lead to a reduction in graphical quality should be signposted and transparent to the gamer. By this standard, however, even the most established Android manufacturers often come up short. For example, Samsung's software downgrades the resolution of popular games like Modern Combat 5 and Arena of Valor on Galaxy devices even when the user has explicitly requested maximum graphical detail. Here's an example caught by our test rig:
The Note 9 is rendering at a noticeably worse 720p, rather than the 1080p you get on all recent iPhones as well as on many Android devices. According to Samsung, this is because of a bug in its automatic optimisation software. The bug isn't causing the resolution reduction as such, but it's preventing the user from disabling the optimisation in order to play the game at full resolution if they wish. Until Samsung rolls out a fix later this year and we get a chance to test it, it's hard to classify this resolution downgrade as an "optimisation" in any positive sense of the word.
At GameBench Labs, we produce verified ratings – Ultra, Smooth, Basic or Poor – for the performance of mobile products. For these ratings to have meaning, it’s essential that we spot differences in quality between different games, devices or platforms. Currently we’re doing this manually, using expert testers and a well-equipped lab that has video editing suites and other tools for comparing different outputs. When we find a significant issue, we communicate it to the companies involved and to the gaming community in the hope that this helps to speed up a fix.
For now, this manual approach is just about enough. But as we grow to cover tens of thousands of games and devices, we need to scale things up. That's why we’re investing in the development of automatic image quality analysis tools, including machine learning, to help us spot differences as consistently and as quickly as possible.
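To give a feel for what automated comparison involves, here is a minimal, purely illustrative sketch of one building block: flagging a pair of captured frames for human review when they diverge beyond a tolerance. This is not GameBench's actual tooling; the function names, the tiny synthetic "frames" and the threshold are all hypothetical, and a real pipeline would work on full-resolution captures with perceptual metrics such as SSIM rather than a raw pixel difference.

```python
def mean_abs_diff(frame_a, frame_b):
    """Mean absolute per-pixel difference between two grayscale frames (0-255)."""
    if len(frame_a) != len(frame_b):
        raise ValueError("frames must have the same dimensions")
    return sum(abs(a - b) for a, b in zip(frame_a, frame_b)) / len(frame_a)

def flag_quality_gap(frame_a, frame_b, threshold=10.0):
    """Flag a pair of captures for human review when they diverge noticeably."""
    return mean_abs_diff(frame_a, frame_b) > threshold

# Illustrative 4-pixel "frames": identical except where a bloom highlight is missing.
ios_frame = [200, 210, 215, 190]
android_frame = [200, 150, 160, 190]  # darker where the HDR bloom is absent

print(mean_abs_diff(ios_frame, android_frame))   # 28.75
print(flag_quality_gap(ios_frame, android_frame))  # True
```

Even this toy version shows why automation helps at scale: a simple numeric gate can triage thousands of capture pairs and reserve expert eyes for the genuinely suspicious ones.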
Whether manual or automatic, our primary goal, as always, is to represent the interests of the gamer and push for better quality across both mobile operating systems. If you’d like to know more about our GameBench Labs service or methodology, or receive some of our latest product rating reports, please get in touch!