How Perfectly Can Reality Be Simulated?

On a warm afternoon last fall, Steven Caron, a technical artist at the video-game company Quixel, stood at the edge of a redwood grove in the Oakland Hills. “Cross your eyes, kind of blur your eyes, and get a sense for what’s here,” he instructed. There was a circle of trees, some logs, and a wooden fence; two tepee-like structures, made of sticks, slumped invitingly. Quixel creates and sells digital assets—the objects, textures, and landscapes that compose the scenery and sensuous elements of video games, movies, and TV shows. It has the immodest mission to “scan the world.” In the past few years, Caron and his co-workers have travelled widely, creating something like a digital archive of natural and built environments as they exist in the early twenty-first century: ice cliffs in Sweden; sandstone boulders from the shrublands of Pakistan; wooden temple doors in Japan; ceiling trim from the Bożków Palace, in Poland. That afternoon, he just wanted to scan a redwood tree. The ideal assets are iconic, but not distinctive: in theory, any one of them can be repeated, like a rubber stamp, such that a single redwood could compose an entire forest. “Think about more generic trees,” he said, looking around. We squinted the grove into lower resolution.

Quixel is a subsidiary of the behemoth Epic Games, which is perhaps best known for its blockbuster multiplayer game Fortnite. But another of Epic’s core products is its “game engine”—the software framework used to make games—called Unreal Engine. Video games have long bent toward realism, and in the past thirty years engines have become more sophisticated: they can now render near-photorealistic graphics and mimic real-world physics. Animals move a lot like animals, clouds cast shadows, and snow falls more or less as expected. Sound bounces, and moves more slowly than light. Most game developers rely on third-party engines like Unreal and its competitors, including Unity. Increasingly, such engines are also used to build other types of imaginary worlds, becoming a kind of invisible infrastructure. Recent movies like “Barbie,” “The Batman,” “Top Gun: Maverick,” and “The Fabelmans” all used Unreal Engine to create virtual sets. In 2022, Epic gave the Sesame Workshop a grant to scan the sets for “Sesame Street.” Architects now make models of buildings in Unreal. NASA uses it to visualize the terrain of the moon. Some Amazon warehouse workers are trained in part in gamelike simulations; most virtual-reality applications rely on engines. “It’s really coming of age now,” Tim Sweeney, the founder and C.E.O. of Epic Games, told me. “These little ‘game engines,’ as we called them at the time, are becoming simulation engines for reality.”
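
Even the detail about sound trailing light reduces to simple arithmetic that an engine can schedule. As a rough illustration (this is not Epic’s code, just a sketch of the idea in Python), the flash and the bang of a distant in-game explosion arrive at different times:

```python
# A toy sketch, not Unreal Engine code: the kind of arithmetic an engine
# uses to make sound arrive later than light, as described above.

SPEED_OF_SOUND = 343.0          # metres per second, in air at about 20 C
SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def arrival_delays(distance_m: float) -> tuple[float, float]:
    """Return (light_delay_s, sound_delay_s) for a listener standing
    distance_m metres from an event such as an explosion."""
    return distance_m / SPEED_OF_LIGHT, distance_m / SPEED_OF_SOUND

light_s, sound_s = arrival_delays(686.0)  # an event 686 metres away
print(f"flash arrives after {light_s * 1e6:.1f} microseconds")
print(f"bang arrives after {sound_s:.1f} seconds")  # two seconds later
```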

Quixel got its start helping artists create the textures for digital models, a practice that historically relied on sleight of hand. (Online, a small subculture has formed around “texture archaeology”: for Super Mario 64, released in 1996, reflective surfaces would have been too inefficient to render, so a metal hat worn by Mario was made with a low-resolution fish-eye photograph of flowers against a blue sky, which created an illusion of shininess.) It soon became clear that the best graphics would be created with high-resolution photographs. In 2011, Quixel began capturing 3-D images of real-world objects and landscapes—what the company calls “megascans.” “We have, to a great extent, mastered our ability to digitize the real world,” Teddy Bergsman Lind, who co-founded Quixel, said. He particularly enjoyed digitizing Iceland. “Vast volcanic landscapes, completely barren, desolate, alienlike, shifting from pitch-black volcanic rock to the most vivid reds I’ve ever seen in an environment to completely moss-covered areas to glaciers,” he said. “There’s just so much to scan.”
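
The hat trick is an instance of what graphics programmers call environment, or sphere, mapping: rather than computing a true reflection, the renderer picks a pixel from a fixed photograph according to which way the surface faces. A minimal sketch of the lookup, written here in Python rather than in the Nintendo 64’s rendering hardware:

```python
import numpy as np

def sphere_map_uv(normal_view: np.ndarray) -> tuple[float, float]:
    """Map a surface normal (in camera space) to coordinates in a fixed
    environment photo: fragments facing the camera sample the centre of
    the image, glancing fragments sample its edges, so the photo slides
    across the model as it moves and reads as a reflection."""
    n = normal_view / np.linalg.norm(normal_view)  # ensure unit length
    u = n[0] * 0.5 + 0.5  # normal's x component -> horizontal position
    v = n[1] * 0.5 + 0.5  # normal's y component -> vertical position
    return float(u), float(v)

# A fragment pointing straight at the camera reflects the photo's centre:
print(sphere_map_uv(np.array([0.0, 0.0, 1.0])))  # -> (0.5, 0.5)
```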

Digitizing the real world involves the tedium of real-world processes. Three-dimensional models are created using lidar and photogrammetry, a technique in which hundreds or thousands of photographs of a single object are stitched together to produce a digital reproduction. In the redwood grove, as Caron set up his equipment, he told me that he had spent the past weekend inside, under, and atop a large “debris box”—crucially, not a branded Dumpster, which might not pass legal review—scanning it from all angles. The process required some nine thousand photographs. (“I had to do it fast,” he said. “People illegally dump their stuff.”) Plants and leaves, which are fragile, wavery, and short-lived, require a dedicated vegetation scanner. Larger elements, like cliff faces, are scanned with drones. Reflective objects, such as swords, demand lasers. Lind told me that he loved looking at textures up close. “When you scan it, a metal is actually pitch-black,” he said. “It holds no color information whatsoever. It becomes this beautiful canvas.” But most of Quixel’s assets are created on treks that require permits and months of planning, by technical artists rucking wearable hard drives, cameras, cables, and other scanning equipment. Caron had travelled twice to the I’on Swamp, a former rice paddy on the outskirts of Charleston, South Carolina, to scan cypress-tree knees—spiky, woody growths that rise out of the water like stalagmites. “They look creepy,” he said. “If you want to make a spooky swamp environment, you need cypress knees.”
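
Photogrammetry pipelines differ in their details, but the first computational step is usually the same: finding points that overlapping photographs have in common, so that camera positions and the object’s geometry can be triangulated from them. A toy version of that step, sketched with the OpenCV library (the redwood file names are hypothetical):

```python
import cv2  # OpenCV; install with: pip install opencv-python

# Toy first step of photogrammetry: find points that two overlapping
# photos of the same object share. Real pipelines chain this into
# camera-pose estimation and dense 3-D reconstruction across thousands
# of images. The file names below are hypothetical.
img1 = cv2.imread("redwood_001.jpg", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("redwood_002.jpg", cv2.IMREAD_GRAYSCALE)

orb = cv2.ORB_create(nfeatures=5000)          # corner-like features
kp1, des1 = orb.detectAndCompute(img1, None)  # keypoints + descriptors
kp2, des2 = orb.detectAndCompute(img2, None)

# Pair up features whose descriptors agree in both directions, then
# rank the pairs by descriptor distance (lower means a closer match).
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)

print(f"{len(matches)} candidate correspondences between the two photos")
```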

The company now maintains an enormous online marketplace, where digital artists can share and download scans of props and other environmental elements: a banana, a knobkerrie, a cluster of sea thrift, Thai coral, a smattering of horse manure. A curated collection of these elements labelled “Abattoir” includes a handful of rusty and sullied cabinets, chains, and crates, as well as twenty-seven different bloodstains (puddle, archipelago, “high velocity splatter”). “Medieval Banquet” offers, among other sundries, an aggressively roasted turnip, a rack of lamb ribs, wooden cups, and several pork pies in various sizes and stages of consumption. The scans are detailed enough that when I examined a roasted piglet—skin leathered with heat and torn at the elbow—it made me feel gut-level nausea.

Assets are incorporated into video games, architectural renderings, TV shows, and movies. Quixel’s scans make up the lush, dappled backgrounds of the live-action version of “The Jungle Book,” from 2016; recently, watching the series “The Mandalorian,” Caron spotted a rock formation that he had scanned in Moab. Distinctive assets run the risk of being too conspicuous: one Quixel scan of a denuded tree has become something of a meme, with gamers tweeting every time it appears in a new game. In Oakland, Caron considered scanning a wooden fence, but ruled out a section with graffiti (“DAN”), deeming it too recognizable.

Epic creates detailed simulations of people as part of a project called MetaHumans. (Source: Epic Games)

After a while, he zeroed in on a qualified redwood. Working in visual effects had given him a persnickety lens on the world. “You’re just trained to look at things differently,” he said. “You can’t help but look at clouds when you’ve done twenty cloudscapes. You’re hunting for the perfect cloud.” He crouched down to inspect the ground cover beneath the tree and dusted a branch of needles—distractingly green—out of the way. Caron’s colleagues sometimes trim grass, or snap a branch off a tree, in pursuit of an uncluttered image. But Caron, who is in his late thirties and grew up exploring the woods of South Carolina, prefers a leave-no-trace approach. He hoisted one of the scanning rigs onto his back, …