Photogrammetry of In-Game Artifacts: A Skyrim VR Test Case

Ancient Nord Pickaxe (image: Elder Scrolls wiki)

I have always wanted to know whether I could 3D-print items from my inventory in the games I play. I finally tried it, with some success, though I have a long way to go.

One of the more useful ideas for archaeologists in digital built environments is to conduct photogrammetry of the artifacts they find and then export them for 3D printing. For my test case, I used a static artifact from a 100% designed game, Skyrim, meaning that the object was completely created by one of the game’s artists and looks the same for anyone else playing the game anywhere in the world. Success here means that one could use similar methods and software tools to extract procedurally generated artifacts from future synthetic worlds, artifacts created by algorithms instead of by explicit design. In a game such as No Man’s Sky, a player could scan and print a plant, animal, or building that had never been created by a person. In a designed game like Skyrim, a player using a PC could easily locate the image file for the artifact or, as in the case of this test, scan the artifact through the gaming console to achieve usable output. In both cases, we are extracting something 2D (with the appearance of being 3D) and then creating an actual 3D replica of it that we can hold in our hands.

This post will walk you through the steps I took during my first attempt to extract an imaginary video game object and turn it into something I could actually print.

Step 1: In-Game Photogrammetry

I hold an “ancient Nord pickaxe” in my player inventory in Skyrim (pictured above in a crisp, original image taken from the actual Nord pickaxe game file, DLC2RR03NordPickaxe). Its handle is incised with vine-like decoration. To scan it for printing, I needed to approach a flat surface in the game’s environment (e.g., a wall or the side of a hill) to guarantee a featureless, black background, not unlike a finds photographer using a black velvet backdrop to create a defined silhouette. Because the game’s data windows are translucent, trying to scan inventory in the middle of a field, for example, would create a lighter, textured background that would interfere with the 3D-rendering tool used later in this process. Once flush with the wall, I opened my inventory screen and selected the item to scan, in this case a pickaxe (for archaeology, of course). On my handheld PS4 controller, I started recording video (a double-tap of the Share button). I then activated the zoom feature (right trigger, or R2 button) to make the pickaxe fill the screen. Next I rotated the pickaxe on the Y-axis a few times, and repeated on the X-axis using the right stick, maintaining consistent pressure to ensure that the rotation speed was constant. This mirrors what happens when one uses a turntable to support an object for photogrammetry. After rotating the pickaxe, I released the R2 button and stopped the video recording with another double-tap of the Share button. I then exported the MP4 movie file to a USB drive through the PS4’s Capture Gallery.

Here is the original video recording of the pickaxe, prior to converting it for 3D rendering:

[embedded video]

Step 2: Rendering the 3D Image

I copied the MP4 file from my USB drive onto my computer, and then navigated to a free online file-conversion utility that split the MP4 moving image into dozens (or hundreds) of JPG files. For this example, I used FileZigZag, but there are other free online tools and standalone software apps that work as well (e.g., ffmpeg; see the sketch below). Once I had the 71 JPGs, I imported them into the open-source Regard3D app (thanks to Shawn Graham for introducing me to this program and helping me troubleshoot; see his detailed Regard3D documentation), which, as the following images show, computed the matches across the collection of images to produce a point cloud, the points of which were then triangulated in advance of creating a printable surface. Once the surface was generated, I exported it to OBJ and MTL files for cleanup prior to printing.
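If you would rather not upload a capture to an online converter, the frame-splitting can be scripted locally. Below is a minimal sketch in Python that drives ffmpeg, assuming ffmpeg is installed and on your PATH; the file names and frame rate are placeholders rather than the exact settings I used.

    # Minimal sketch: split a PS4 capture (MP4) into JPG frames with ffmpeg.
    # Assumes ffmpeg is installed and on the PATH; "pickaxe.mp4", the output
    # folder, and the frame rate are placeholders, not settings from this post.
    import pathlib
    import subprocess

    pathlib.Path("frames").mkdir(exist_ok=True)  # ffmpeg will not create the folder
    subprocess.run([
        "ffmpeg",
        "-i", "pickaxe.mp4",      # video exported from the PS4 Capture Gallery
        "-vf", "fps=4",           # ~4 stills per second; raise for more images
        "frames/frame_%03d.jpg",  # numbered JPGs for import into Regard3D
    ], check=True)

A lower fps yields fewer, more distinct frames; a higher fps yields more images for matching at the cost of processing time.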

Regard3D matching images to begin creating the points for the resulting mesh

Setting parameters to create a dense, high-quality point cloud

The resulting pickaxe-shaped point cloud, including noise from the original video recording

The pickaxe with its surface applied, exported for cleanup in Meshlab

Note: Because the Skyrim pickaxe was recorded in VR, the resulting 3D image is curved. Skyrim VR projects its images onto a curved surface, and that curve is reflected in the imported image files. Also, even though the artifact is recorded in the round through rotation, Regard3D treats the final scan as a curved, uniface image rather than as a true 3D object, and the resulting 3D-printed artifact will reflect this. It is as yet unclear to me whether this will happen in other games/environments, whether the issue is unique to VR, or whether it is an issue with the software.

Step 3: Cleaning and Printing the Artifact

Most 3D scans need to be touched up before they are printed, eliminating any digital “cruft.” Cleaning can be done with the vertex-select and vertex-delete tools in the open-source app Meshlab (a scripted alternative is sketched below). Once the 3D model has been cleaned, one can save it, at which point it can be brought to a makerspace (as in this example) for printing. I am still fighting with this step, as it is my first time using Meshlab and my first 3D surface mesh. It doesn’t look the way I was expecting, and I have miles to go before I am ready to print it. Any help would be appreciated.

Cleaning up the pickaxe mesh in Meshlab
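Meshlab’s selection tools are interactive, but for those who prefer to script this step, the same kind of cruft removal can be roughed out in code. The sketch below is an assumption-laden starting point, not my working method: it reads the exported OBJ, drops any vertex farther than a chosen radius from the mesh’s centroid (treating far-flung outliers as noise), and rewrites the faces accordingly. The file names and radius are hypothetical, and the script discards texture/normal references on faces.

    # Rough sketch (hypothetical file names and threshold): strip stray
    # vertices from an OBJ exported by Regard3D and remap the face indices.
    # Note: only the position part of each face token is kept, so any
    # texture/normal references (v/vt/vn) are discarded.
    import math

    INFILE, OUTFILE, KEEP_RADIUS = "pickaxe.obj", "pickaxe_clean.obj", 2.0

    verts, faces, other = [], [], []
    with open(INFILE) as f:
        for line in f:
            if line.startswith("v "):
                verts.append([float(t) for t in line.split()[1:4]])
            elif line.startswith("f "):
                faces.append(line.split()[1:])
            else:
                other.append(line)  # comments, mtllib references, etc.

    # Treat vertices far from the centroid as noise from the video capture.
    centroid = [sum(v[i] for v in verts) / len(verts) for i in range(3)]
    keep = [math.dist(v, centroid) <= KEEP_RADIUS for v in verts]

    # OBJ face indices are 1-based; map each old index to its new index.
    remap, new_verts = {}, []
    for i, (v, kept) in enumerate(zip(verts, keep), start=1):
        if kept:
            new_verts.append(v)
            remap[i] = len(new_verts)

    with open(OUTFILE, "w") as out:
        out.writelines(other)
        for v in new_verts:
            out.write("v {} {} {}\n".format(*v))
        for face in faces:
            idx = [int(tok.split("/")[0]) for tok in face]
            if all(i in remap for i in idx):  # drop faces touching removed verts
                out.write("f " + " ".join(str(remap[i]) for i in idx) + "\n")

Choosing KEEP_RADIUS takes some trial and error; inspecting the point cloud’s extents in Meshlab first helps.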

Variations: Note that the above works for Skyrim, but the “turntable” feature will differ from game to game. In some cases, inventory will rotate, and you can film the rotation. In other cases you might need to “fly” around something in order to film it. The more images that result from your filming, the higher the quality of the resulting model. Note also that when you record video through a console or computer, there is no actual camera, so sometimes the 3D-rendering software you choose will not be able to assign a camera focal length. You may need to supply one by hand to “spoof” the camera settings, in this case a sensor width of 5.8 and a focal length of 3.9. This is most easily done by opening the Regard3D R3D project file in a simple text editor and conducting a find-and-replace on FocalLength and sensorWidth (see the sketch below).
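That hand edit is also easy to script. Below is a minimal sketch in Python that performs the find-and-replace, assuming each value appears as the first number after its FocalLength or sensorWidth label in the project file; the file name is a placeholder, and it is worth backing up the .r3d file and checking its actual layout before running anything like this.

    # Minimal sketch (file name and field layout are assumptions): spoof the
    # camera intrinsics in a Regard3D project file by replacing the first
    # number that follows each label, per the find-and-replace described above.
    import re

    PROJECT = "pickaxe.r3d"  # placeholder; use your project's actual file

    with open(PROJECT) as f:
        text = f.read()

    # Values from this post: sensor width 5.8, focal length 3.9.
    text = re.sub(r"(sensorWidth\D{0,20}?)[\d.]+", r"\g<1>5.8", text)
    text = re.sub(r"(FocalLength\D{0,20}?)[\d.]+", r"\g<1>3.9", text)

    with open(PROJECT, "w") as f:
        f.write(text)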

I look forward to finishing this test, but wanted to publish what I’ve done so far. It’s clear I need some assistance, but I am ecstatic that I was able to extract an in-game artifact and prepare it (almost) for printing, bringing it into the natural world for the first time.

—Andrew Reinhard, Archaeogaming
