How Perfectly Can Reality Be Simulated? (2024)

Digitizing the real world involves the tedium of real-world processes. Three-dimensional models are created using lidar and photogrammetry, a technique in which hundreds or thousands of photographs of a single object are stitched together to produce a digital reproduction. In the redwood grove, as Caron set up his equipment, he told me that he had spent the past weekend inside, under, and atop a large “debris box”—crucially, not a branded Dumpster, which might not pass legal review—scanning it from all angles. The process required some nine thousand photographs. (“I had to do it fast,” he said. “People illegally dump their stuff.”) Plants and leaves, which are fragile, wavery, and have a short shelf life, require a dedicated vegetation scanner. Larger elements, like cliff faces, are scanned with drones. Reflective objects, such as swords, demand lasers. Lind told me that he loved looking at textures up close. “When you scan it, a metal is actually pitch-black,” he said. “It holds no color information whatsoever. It becomes this beautiful canvas.” But most of Quixel’s assets are created on treks that require permits and months of planning, by technical artists rucking wearable hard drives, cameras, cables, and other scanning equipment. Caron had travelled twice to the I’on Swamp, a former rice paddy on the outskirts of Charleston, South Carolina, to scan cypress-tree knees—spiky, woody growths that rise out of the water like stalagmites. “They look creepy,” he said. “If you want to make a spooky swamp environment, you need cypress knees.”

The company now maintains an enormous online marketplace, where digital artists can share and download scans of props and other environmental elements: a banana, a knobkerrie, a cluster of sea thrift, Thai coral, a smattering of horse manure. A curated collection of these elements labelled “Abattoir” includes a handful of rusty and sullied cabinets, chains, and crates, as well as twenty-seven different bloodstains (puddle, archipelago, “high velocity splatter”). “Medieval Banquet” offers, among other sundries, an aggressively roasted turnip, a rack of lamb ribs, wooden cups, and several pork pies in various sizes and stages of consumption. The scans are detailed enough that when I examined a roasted piglet—skin leathered with heat and torn at the elbow—it made me feel gut-level nausea.

Assets are incorporated into video games, architectural renderings, TV shows, and movies. Quixel’s scans make up the lush, dappled backgrounds of the live-action version of “The Jungle Book,” from 2016; recently, watching the series “The Mandalorian,” Caron spotted a rock formation that he had scanned in Moab. Distinctive assets run the risk of being too conspicuous: one Quixel scan of a denuded tree has become something of a meme, with gamers tweeting every time it appears in a new game. In Oakland, Caron considered scanning a wooden fence, but ruled out a section with graffiti (“DAN”), deeming it too unique.

Epic creates detailed simulations of people as part of a project called MetaHumans. Source: Epic Games

After a while, he zeroed in on a qualified redwood. Working in visual effects had given him a persnickety lens on the world. “You’re just trained to look at things differently,” he said. “You can’t help but look at clouds when you’ve done twenty cloudscapes. You’re hunting for the perfect cloud.” He crouched down to inspect the ground cover beneath the tree and dusted a branch of needles—distractingly green—out of the way. Caron’s colleagues sometimes trim grass, or snap a branch off a tree, in pursuit of an uncluttered image. But Caron, who is in his late thirties and grew up exploring the woods of South Carolina, prefers a leave-no-trace approach. He hoisted one of the scanning rigs onto his back, clipped in a hip belt to steady it, and picked up a large digital camera. After making a series of tweaks—color calibration, scale, shooting distance—he began to slowly circle the redwood, camera snapping like a metronome. An hour passed, and the light began to change, suboptimally. On the drive home, I considered the astonishing amount of labor involved in creating set pieces meant to go unnoticed. Who had baked the pork pies?

Sweeney, Epic’s C.E.O., has the backstory of tech-founder lore—college dropout, headquarters in his parents’ basement, posture-ruining work ethic—and the stage presence of a spelling-bee contestant who’s dissociating. He is fifty-three years old, and deeply private. He wears seventies-style aviator eyeglasses, and dresses in corporate-branded apparel, like an intern. He is mild and soft-spoken, uses the word “awesome” a lot, and tweets in a way that suggests either the absence of a communications strategist or a profound understanding of his audience. (“Elon Musk is going to Mars and here I am debugging race conditions in single-threaded JavaScript code.”) He likes fast cars and Bojangles chicken. Last year, he successfully sued Google for violating antitrust laws. Epic, which is privately held, is currently valued at more than twenty-two billion dollars; Sweeney reportedly is the controlling shareholder.

When we spoke, earlier this spring, he was at home, in Raleigh, North Carolina, wearing an Unreal Engine T-shirt and drinking a soda from Popeyes. Behind him were two high-end Yamaha keyboards. We were on video chat, and the lighting in the room was terrible. During our conversation, he vibrated gently, as if shaking his leg; I wondered if it was the soda. “It’s probably going to be in our lifetime that computers are going to be able to make images in real time that are completely indistinguishable from reality,” Sweeney told me. The topic had been much discussed in the industry, during the company’s early days. “That was foreseeable at the time,” he said. “And it’s really only starting to happen now.”

Sweeney grew up in Potomac, Maryland, and began writing little computer games when he was nine. After high school, he enrolled at the University of Maryland and studied mechanical engineering. He stayed in the dorms but spent some weekends at his parents’ house, where his computer lived. In 1991, he created ZZT, a text-based adventure game. Players could create their own puzzles and pay for add-ons, which Sweeney shipped to them on floppy disks. It was a sleeper hit. By then, he had started a company called Potomac Computer Systems. (It took its name from a consulting business he had wanted to start, for which he had already purchased stationery.) It operated out of his parents’ house. His father, a cartographer for the Department of Defense, ran its finances. Sweeney renamed the company Epic MegaGames—more imposing, to his ear—and hired a small team, including the game designer Cliff Bleszinski, who was still a teen-ager. “In many ways, Tim Sweeney was a father figure to me,” Bleszinski told me. “He showed me the way.”

Cartoon by Suerynn Lee

Sweeney’s lodestar was a company called id Software. In 1993, id released Doom, a first-person shooter about a husky space marine battling demons on the moons of Mars and in Hell. Doom was gory, detailed, and, crucially, fast: its developers had drawn on military research, among other things. But id also took the unusual step of releasing what it called Doom’s “engine”—the foundational code that made the game work. Previously, games had to be built from scratch, and companies kept their code proprietary: even knowing how to make a character crouch or jump gave them an edge. Online, Doom “mods” proliferated, and game studios built new games atop Doom’s architecture. Structurally, they weren’t a huge departure. Heretic was a fantastical first-person shooter about fighting the undead; Strife was a fantastical first-person shooter about fighting robots. But they were proofs of concept for a new method and philosophy of game-making. As Henry Lowood, a video-game historian at Stanford, told me, “The idea of the game engine was ‘We’re just producing the technology. Have at it.’”

Sweeney thought that he could do better. He soon began building his own first-person shooter, which he named Unreal. He recalled looking through art reference books and photographs to better understand shadows and light. When you spend hours thinking about computer graphics, he told me, the subject “tends to be unavoidable in your life. You’re walking through a dark scene outdoors at night, and it’s rainy, and you’re seeing the street light bounce off of the road, and you’re seeing all these beautiful fringes of color, and you realize, Oh, I should be able to render this.” Unreal looked impressive. Water was transparent, and flames flickered seemingly at random. After screenshots of the game were published, before its release, developers began contacting Sweeney, asking to use his engine for their own games.
