Remote Sensing in Synthetic Landscapes

UPDATED: 4:15, MAY 7, 2017

[Screenshot: No Man's Sky_20180506144039]

Buried, ruined, player-made structure in No Man’s Sky.

The abandoned, ruined structure on the planet Horner in the procedurally generated universe of No Man’s Sky was player-made. I know this because the structure is marked with a flag, which indicates a base; players have the option to build bases creatively on their adopted worlds. The bases I am exploring for my PhD (really) were built in the version 1.2 (Pathfinder) era (March 8, 2017–August 10, 2017), when bases could be publicly shared for the first time. Version 1.3 (Atlas Rises) cataclysmically changed the geomorphology and climate of every planet in the game, leaving ruined bases partially or completely buried. Thanks to the Terrain Manipulator tool introduced in v1.3, one can now excavate the past quite literally.

I could be forgiven for thinking that the structure pictured above was the complete structure. Many abandoned bases now lack their required base unit, a small, round, computerized hut, and I thought this base was one of those. But one thing tipped me off that this might not be the case.

When investigating a landscape as an archaeologist, one can actively engage with the place through sight, sound, and smell, identifying contours and other natural features, as well as natural resources that might encourage both movement and settlement. Ground-penetrating radar can reveal what lies below the surface, identifying walls and other evidence without the need for digging. Aerial photography can often show other features of past occupation through crop-mark patterns or looter holes.

Aerial reconnaissance of a possible area of archaeological interest in No Man’s Sky is no different from that in the natural world: I can fly and film, looking for patterns. But because this is a synthetic world, I have learned that sight and sound are not the only indicators that there is more to the landscape than meets the eye. Observe:

[Video: aerial recon flight over the structure]

I filmed the above video during my initial recon of this built heritage site and noticed that time slowed almost to a standstill when flying close to the structure. Players of video games know this phenomenon as “lag” or “latency”: complex elements of a digital game collide computationally, stopping the on-screen action (sometimes to the point of a complete system freeze) or slowing things down enough to grossly affect the outcome of MMO boss fights and raids. Most of the ruined, abandoned bases I’d found so far were floating above the ground, with no overburden to remove. I decided to investigate further on foot.
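The lag I noticed by feel could, in principle, be measured. As a hypothetical sketch (not code from the game, and the timestamps are invented), one could log frame timestamps with a capture tool and flag frames that take far longer than the expected frame interval, treating those spikes as the "signal":

```python
# Hypothetical sketch: treating frame-time spikes as a remote-sensing signal.
# Frame timestamps (in seconds) would come from a capture tool; these are made up.
timestamps = [0.000, 0.016, 0.033, 0.049, 0.150, 0.260, 0.276, 0.292]

# Intervals between consecutive frames.
intervals = [b - a for a, b in zip(timestamps, timestamps[1:])]

# Baseline: a smooth 60 fps game renders a frame roughly every 16.7 ms.
BASELINE = 1 / 60

# Flag frames that took more than 3x the baseline -- candidate "lag" events
# that, in this survey method, would hint at dense hidden geometry nearby.
spikes = [i for i, dt in enumerate(intervals) if dt > 3 * BASELINE]
print(spikes)  # → [3, 4]
```

The threshold of 3× baseline is an arbitrary choice for illustration; a real survey would need to calibrate it against the console's normal frame-time variation on an empty stretch of terrain.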

[Video: walking survey around the structure]

In this video you can see how my walking slows to a crawl, again indicating something beneath the surface. I decided to dig a test pit outside the western wall to see if I could identify foundations, or something else. As you can see, I did. This led ultimately to the partial excavation of the structure to a depth of around 10 m. The screenshot below shows the view down from the top of the hole to the level of the base unit, which I was able to recover.

[Screenshot: No Man's Sky_20180506155458]

The lag (sluggishness) of both aerial and ground movement around the aboveground structure indicated subterranean features, but I was unable to determine their extent because my PlayStation 4 console crashed completely. In No Man’s Sky, the console’s memory caches the procedurally generated (ProcGen) landscape as well as the elements both in and under it. This is a lot of information. Combine that with the presence of a massively complex, player-built base, and that’s a recipe for instability.

When the PS4 reboots, all open applications and games shut down completely and must be reopened after logging back in to the PlayStation Network. As a result of the crash, my NMS save point reverted to my first point of contact: touching down on Horner’s surface about one minute’s walk from the ruins. Cresting the hill again, I noticed with horror that the ruins were gone without a trace. Nearby player communication stations, however, remained. Panicked, I returned to space in my ship to re-scan Horner from orbit. To my relief, the scan returned the signal for the base, albeit three hours to the southwest of its original location. This is what I saw when I arrived:

[Screenshot: No Man's Sky_20180506203600]

The entire base was restored in full to its original player-made design, fully excavated, sitting atop the surface as if it had been built there originally. I know it’s the same base because the base unit inside bears the player-given name “‘Tohoulvaldou’-Außenposten”, and the tops of the towers of the two structures match exactly.

[Video: the fully restored base]

Fully restored, one can see the full extent of the building, the majority of which had been hidden underground, its only indication the severe slowing of my movement. Latency/lag can be used as a remote sensing tool in No Man’s Sky, and possibly in other games, where massive concentrations of data affect how we experience the landscape of software.

This is a new way of experiencing a synthetic landscape, and it differs fundamentally from experiencing a landscape in the natural world. When I visited a landscape in Kansas, time did not slow down for me and my movement went unhindered, even though I was fieldwalking in a place of ancient settlement that we would later excavate. In No Man’s Sky, time did appear to slow down, and my movement was restricted. I experienced the landscape in a new way, one that allowed me to interpret it archaeologically, something I would not necessarily have done without that clue. This is not a glitch. The latency is itself an artifact of the game’s code, and it manifests in how players interact with the environment.

UPDATED PART: This reminds me of the idea of noise (in the mathematical sense). Traveling through a world, natural or synthetic, one encounters a fairly static state of noise, but as one approaches things of interest, I think (I need to do the actual math here, so this is presumption for now) that noise accumulates and achieves some kind of order, perhaps even a fractal order; there is less chaos. The noise of the data underground creates latency, adding to the data already there. In No Man’s Sky the coded voxels carry the same bits of information all over the universe, but player-made constructs interfere with that order, and we physically feel it (or at least our avatars do) when we cross into that human-oriented area. In messaging with Shawn Graham this afternoon, we talked a bit about how this works in the natural world: how a lot of data does cause a lag and indicates things of statistical importance. So maybe latency in a game is not a new thing for understanding a landscape, because landscape is data, too. Topography can create latency on walks, and human-introduced data (by way of settlements, etc.) slows the archaeologist’s pace to a crawl. Less data, more speed. That’s true of any discipline, and it’s true of travel: we travel fast across the hardpan of the Alvord Desert, but the topographic data of Steens Mountain slows the traveler down with its physical manifestation of data. To explore a landscape of any kind, we have not only our “normal” senses but time as well.
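Pending the actual math, the intuition above can at least be sketched as a toy model. Every number here is invented: assume the procedural terrain carries a roughly uniform baseline of data per step of a walking transect, that a buried player-made base adds a local bump of extra data, and that the time to process each step grows with the data present. The "survey" then consists of flagging steps that run noticeably slower than baseline:

```python
# Hypothetical toy model: latency as a function of local data density along a
# walking transect. All quantities are invented for illustration.
BASELINE_VOXELS = 100  # procedural terrain carries roughly uniform data

# Extra data at each step of the transect; the bump marks a buried player base.
extra_data = [0, 0, 5, 400, 650, 500, 10, 0]

def step_time(extra, base=BASELINE_VOXELS, cost_per_unit=0.0001):
    # Simulated time to process one step: baseline data plus per-datum cost.
    return (base + extra) * cost_per_unit

times = [step_time(e) for e in extra_data]

# A simple anomaly test: flag steps noticeably slower than an empty-terrain step.
baseline_time = step_time(0)
anomalies = [i for i, t in enumerate(times) if t > 2 * baseline_time]
print(anomalies)  # → [3, 4, 5]
```

The flagged indices recover the location of the simulated "base" purely from timing, which is the core of the argument: the player-made concentration of data perturbs an otherwise uniform signal, and that perturbation is felt as slowness.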

I look forward to testing this elsewhere in NMS and with other digital built environments.

—Andrew Reinhard, Archaeogaming

*Thank you to Prof. Sara Perry for helping me to clarify my thinking on the differences between landscapes in natural and synthetic spaces.
