The Unsettling Side of Realism in Graphics and Why Stylized Works Better


Article by Yuri Ilyin
For a very long time, 'good' graphics in video games meant 'realistic'. As technologies progressed, the ever-moving target of realism seemingly remained out of reach. But when videorealistic titles appeared, they made people feel unsettled. Holy Grail or Zabriskie Point?
From Humble Beginnings
Video games started way before PCs or gaming consoles were at all capable of delivering anything even remotely resembling photorealism. Now the technology is there, but it's not necessarily what people actually want.
Reference: The Long Dark
Steam, or any other digital game distribution platform these days, is thick with titles featuring highly stylized and often intentionally lo-fi visuals, and not just because nostalgic Gen Xers and Millennials have the deepest pockets.
It seems counterintuitive and strange that there's so much demand for pixelated and/or Gouraud-shaded graphics after decades of developers striving to make video game visuals look realistic.
It wasn't always that way, though. There was little in the way of computer graphics to speak of until probably the late 1970s, when the first home consoles started to emerge (like the Fairchild Channel F and the Atari VCS).
Reference: Space Invaders on Atari 2600
With the advent of home computers in the early 1980s, various vendors strove to expand the color options of their systems - with varying degrees of success.
While today's personal computer platforms have very limited diversity, back in the late 1970s and early 1980s there was a real zoo of competing systems.
For example, modern Windows-based systems originate from IBM's PC standard, established in 1981. That same year, IBM produced its first graphics card - the Color/Graphics Monitor Adapter (CGA). It supported only four colors simultaneously (although the full palette included 16 colors), and the most remembered combo was Black, White, Cyan, and Magenta. Not exactly eye candy.
Reference: Wikimedia Commons
Just one year later, the ZX Spectrum with 15-color support and the Commodore 64 with a 16-color palette emerged. IBM's own 16-color Enhanced Graphics Adapter (EGA) arrived only in 1984, and the 256-color VGA (Video Graphics Array) came into existence in 1987.
Meanwhile, the Amiga 1000, which arrived in 1985, delivered up to 4096 colors, and the entire Amiga line of computers was widely used for multimedia tasks that IBM-compatibles wouldn't handle properly until the 1990s.
Apple's Macintosh II (1987) supported a total palette of 16 million colors, but could only display 256 of them simultaneously on screen.
Game developers had to keep all these differences in mind when making their games. The same titles might look radically different on the ZX Spectrum, Amiga, PC, and, say, the Acorn Archimedes.
Besides, there was always the issue of resolution. Until at least the mid-90s, video game artists were often confined to low resolutions, like 320x200 pixels or less, depending on the platform and time period. As a rule, the larger the palette, the lower the resolutions that were actually available.
As the IBM-compatible PC became the apex predator, so did the standard VGA card. It could deliver the full range of 256 colors, but only at a 320x200 resolution (stretched over the entire display). For quite a few years, VGA remained the baseline that DOS and early Windows games were expected to support, no matter what.
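To make that tradeoff concrete, here is a quick back-of-the-envelope sketch in Python (purely illustrative; the mode list is just a sample of common standard modes) showing how much memory a single frame needs at different resolutions and color depths.

```python
# A rough back-of-the-envelope illustration of why bigger palettes pushed
# resolutions down: more bits per pixel means more video memory per frame,
# and early adapters had very little of it. Sample of standard modes only.

MODES = [
    ("CGA 320x200, 4 colors",   320, 200, 2),  # 2 bits per pixel
    ("EGA 640x350, 16 colors",  640, 350, 4),  # 4 bits per pixel
    ("VGA 320x200, 256 colors", 320, 200, 8),  # 8 bits per pixel
    ("VGA 640x480, 16 colors",  640, 480, 4),
]

for name, width, height, bpp in MODES:
    bytes_per_frame = width * height * bpp // 8
    print(f"{name:26s} -> {bytes_per_frame / 1024:6.1f} KB per frame")
```

The 256-color 320x200 frame comes out to 64,000 bytes, which fits in a single 64 KB memory segment on DOS-era PCs - one reason that mode became such a comfortable workhorse for game programmers.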
With limitations like that, the achievements of video game artists from that era are amazing. But there wasn't much room for any sort of realism in games back then, not until support for full 24-bit (16.7 million color) palettes became a commodity, and truly advanced real-time 3D graphics arrived.
Or was there?
A Moving Target
First of all, it's necessary to establish what realism even is. And it's a slippery, always-in-motion fish. If we speak of photo- or videorealistic visuals alone, that's something developers have strived to achieve for decades, by various means.
Some elements of realism were available early on and were sometimes very cleverly used (more on that below), but in general...
Actually, before the advent of advanced 3D graphics, there were attempts to make games "video" in quite a literal sense: by using full-motion video (FMV) with live actors as the foundation of the entire game.
The trend started with 1990's Western-styled rail shooter Mad Dog McCree, continued with such titles as Night Trap, The 7th Guest, Phantasmagoria, and The Daedalus Encounter, and somewhat culminated with Wing Commander III and IV, which featured Hollywood dignitaries such as Mark Hamill, Tom Wilson, John Spencer, and Malcolm McDowell.
Reference: Wing Commander IV: The Price of Freedom
All of these featured live-action mixed, to varying degrees, with computer graphics. Unfortunately, quite a few were ahead of their time: due to the limited capabilities of IBM-compatible PCs in the early 1990s, the quality of in-game FMV was usually very poor. There was also a purely financial issue: shooting video was an expensive endeavor, and only larger companies could afford it.
So by the time the technology was capable of delivering clean imagery, FMV had already fallen out of favor. Although there were later examples like Red Alert 3, with its Hollywood ensemble cast, and even some more recent indie efforts.
Reference: Red Alert 3
Along Came 3D Graphics
When real-time 3D graphics finally arrived, they sent tremors through the whole industry. The gaming press occasionally lauded Quake for its "realism", at least in relation to its physics. Quake still plays great, but not because of its realism.
Players were wowed by the dynamic sky and atmospheric effects of Morrowind, and most of all by its realistic-looking pixel-shaded water. This was enough to distract from less appealing visual and gameplay aspects, such as the character models and unimpressive vegetation. Today, it's barely playable without facelifting mods.
Graphics improvements in the subsequent games - Oblivion and Skyrim - were mind-blowing at the time. But just a few years later, their visuals already looked dated. Dragon Age: Inquisition arrived only three years after Skyrim, yet was vastly superior graphics-wise. That said, modders continue to facelift Skyrim, and their visual achievements are extraordinary - at times even very close to videorealism.
Reference: Skyrim NOLVUS V6 mod
But that's a different story, out of this article's scope.
All in all, photorealism or videorealism is like the horizon - it keeps receding as you chase it.
Zabriskie Point?
On the other hand, titles like Bodycam and Unrecord, upcoming tactical shooters, may halt that chase for a while, courtesy of Unreal Engine 5's capabilities.
Reference: Bodycam
These games, however, use third-party assets, likely the products of photogrammetry and 3D scanning. Modern asset stores carry these in droves, enough to populate any game striving for realistic environments. Even large swaths of pristine terrain are now available - and widely used. It remains to be seen whether Bodycam and Unrecord will live up to their presumed potential as games.
But at least some people are already feeling unsettled by this level of realism. Apparently, there is a point where it becomes just too much - the Zabriskie Point of video game graphics - at which the whole appeal is lost.
Meanwhile, highly stylized graphics hold up, and for a very long time.
Stylish Looks
Even disregarding the high demand for all things retro and nostalgic, some very old gaming titles with their schematic but clean graphics still feel good.
You don't expect any kind of visual realism from, say, SimCity 2000 or the original Syndicate (both from 1993). Are they still playable? Oh, they are indeed.
Moreover, there seems to be no reason for any kind of visual facelifting. Hard to believe, but Syndicate's gameplay visuals use only 16 colors; it was the clever use of dithering that made such a small palette work at a high resolution on the main screen.
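Dithering, in a nutshell, trades pixels for shades: neighboring pixels drawn from a small palette are interleaved so that, viewed from a distance, they blend into in-between tones. Below is a minimal, purely illustrative sketch of classic ordered (Bayer) dithering in Python - a generic demonstration of the technique, not Syndicate's actual code or palette.

```python
# A toy illustration of ordered (Bayer) dithering: a smooth horizontal ramp
# is reduced to just two "ink" levels, yet still reads as a gradient from afar.
# Generic example of the technique only - not Syndicate's actual code.

BAYER_4X4 = [
    [ 0,  8,  2, 10],
    [12,  4, 14,  6],
    [ 3, 11,  1,  9],
    [15,  7, 13,  5],
]

WIDTH, HEIGHT = 64, 8

for y in range(HEIGHT):
    row = ""
    for x in range(WIDTH):
        shade = x / (WIDTH - 1)                      # 0.0 .. 1.0 ramp
        threshold = (BAYER_4X4[y % 4][x % 4] + 0.5) / 16
        row += "#" if shade > threshold else "."
    print(row)
```

The same idea scales up to color: a fixed 16-color palette plus a threshold matrix can fake far more shades than it actually contains, which is exactly the kind of trick that let low-color games look richer than their hardware budget would suggest.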
Dune II from 1992 feels timeless too, much more so than its 1998 reimplementation (not that Dune 2000 was bad, though).
Then there are two essential titles that can't be omitted here: the original Prince of Persia and Another World. These were legendary from day one, thanks to incredibly smooth and lifelike character animations, while the graphics overall were very much stylized.
Reference: Prince of Persia
Jordan Mechner, the creator of Prince of Persia, quite openly said that he wasn't very good at drawing (yet), so he resorted to a technique known as rotoscoping. The titular character's movements are, in fact, those of Mechner's younger brother, who was filmed running, jumping, and climbing around in white clothes, to be traced over for use in the game.
Reference: Prince of Persia - Behind the Scenes
And the sword duels' primary rotoscoping source was the final duel between Errol Flynn and Basil Rathbone in The Adventures of Robin Hood (1938). Unlike the movements, Prince of Persia's environment is completely non-realistic. Was anybody bothered by it? Nope.
Éric Chahi's Another World was clearly inspired by Prince of Persia, but in many ways moved further - toward an even more cinematic feel. Chahi demonstrated the power of cleverly used vector graphics, creating an experience that was both engaging and visually appealing. The game heavily referenced various sci-fi works and, in more than one way, was a technical masterpiece too.
Reference: Another World
Chahi did use some rotoscoping to create the characters' fluid animations, but it was just one part of the game's appeal.
Shaken, Not Stirred
And this is probably how it works best: the recipe calls for some degree of realism mixed into a generally non-realistic environment, provided that the blend retains an organic feel. FMV titles like the aforementioned Phantasmagoria and The Daedalus Encounter (featuring Tia Carrere) were great for their time, but the seam between computer-generated imagery and live action shot against a blue screen was rough and impossible to smooth over.
Reference: The Daedalus Encounter
It's cool to have good, realistic-looking character and creature animations (i.e., ones that move the way you expect), but that's not paramount for a game to be good. Anime-styled titles do just fine with non-realistic motion.
It's good to have natural-looking, naturally animated vegetation, but a non-natural look works too, as long as there is stylistic integrity. Cinematic lighting can be emulated without turning the end user's GPU into a raging furnace that requires liquid nitrogen to keep going.
Games are games, a form of complex art. Game graphics are art in their own right. And art is not a full re-implementation of reality, nor does it need to be. We play games to take a break from reality, where there are no 'save' buttons or healing potions, magic is commonly believed to be non-existent, and manned spacecraft are still tin cans circling the pale blue dot.
Let the games be games.