NeRF: From 2D Photos to Drone Footage Using Just an iPhone

ARShopia
7 min read · Sep 1, 2023


Luma AI iPhone application:

The free Luma AI iPhone application is one of the best tools for producing fully rendered video from 2D images using NeRF technology. It lets you capture and edit NeRFs: 3D representations of objects, landscapes, and scenes that can be viewed from any angle. You can also create cinematic videos of your NeRFs using a camera path editor, or by moving through the NeRF in AR preview mode. Luma AI is a great way to create realistic 3D scenes with your iPhone, using the power of AI and NeRF.
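For readers curious what "NeRF technology" means under the hood: a NeRF is a neural network that maps a 3D point to a density and a color, and a frame is produced by compositing those values along each pixel's camera ray. Below is a minimal NumPy sketch of that volume-rendering step, with a toy function standing in for the trained network; it illustrates the published NeRF idea, not Luma AI's actual implementation.

```python
import numpy as np

def render_ray(origin, direction, field, near=0.0, far=4.0, n_samples=64):
    """Composite a pixel color along one camera ray (NeRF-style)."""
    # Sample points at evenly spaced depths along the ray.
    t = np.linspace(near, far, n_samples)
    points = origin + t[:, None] * direction        # (n_samples, 3)
    density, rgb = field(points)                    # (n,), (n, 3)

    # Alpha compositing: per-segment opacity, then the probability
    # that the ray reaches each segment without being absorbed.
    delta = (far - near) / n_samples                # even spacing
    alpha = 1.0 - np.exp(-density * delta)
    transmittance = np.cumprod(np.concatenate([[1.0], 1.0 - alpha[:-1]]))

    weights = transmittance * alpha
    return (weights[:, None] * rgb).sum(axis=0)     # final pixel color

def toy_field(points):
    """Stand-in for the trained network: a fuzzy orange sphere."""
    r = np.linalg.norm(points, axis=-1)
    density = np.where(r < 1.0, 5.0, 0.0)
    rgb = np.tile([0.9, 0.4, 0.2], (len(points), 1))
    return density, rgb

print(render_ray(np.array([0., 0., -3.]), np.array([0., 0., 1.]), toy_field))
```

Rendering a full video repeats this for every pixel of every frame, which helps explain why the heavy lifting happens on Luma AI's servers rather than on the phone itself.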

You can create videos with impossible camera moves for social networks, or simply relive the moment. Luma AI is compatible with iPhones as old as the iPhone 11 and does not require a LiDAR sensor or fancy capture equipment. With just your iPhone, you can create incredible, lifelike videos of whatever you want, wherever you are.

Using the app, you can record a video of your surrounding scene, or orbit a specific object (recording at three different heights, aimed toward the object's center). The app takes care of the rest by uploading the video to Luma AI's servers. Within a few minutes, an ultra-smooth orbit around the NeRF can be downloaded as a video, and a 3D textured object can be generated. You can also share the generated NeRF as an interactive web view.
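Purely as an illustration of that capture pattern (the function, units, and values below are hypothetical, not part of the app), the three rings of viewpoints aimed at the object's center could be laid out like this:

```python
import numpy as np

def orbit_waypoints(center, radius, heights=(0.5, 1.2, 1.9), n=24):
    """Three rings of capture positions around an object, each
    viewpoint aimed at the object's center (units are meters;
    all values here are hypothetical, for illustration only)."""
    angles = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
    waypoints = []
    for h in heights:                      # one ring per capture height
        for a in angles:
            pos = center + np.array([radius * np.cos(a), h, radius * np.sin(a)])
            look = center - pos            # direction toward the object
            waypoints.append((pos, look / np.linalg.norm(look)))
    return waypoints

rings = orbit_waypoints(center=np.array([0.0, 0.0, 0.0]), radius=1.5)
print(len(rings))   # 72 viewpoints: 3 heights x 24 angles
```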

How to use the Luma AI iPhone application:

To use the Luma AI iPhone application, follow these steps:

  • Download Luma AI from the App Store and sign up with your Apple or Google account.
  • Choose the object or scene that you want to capture in 3D. Make sure it has enough detail, texture, and lighting, and avoid transparent, reflective, or thin objects.
  • Tap on the Capture button and follow the AR instructions on the screen. Try to move slowly and steadily, and avoid occluding the object or scene with your hand or body.
  • After you finish capturing, you can preview your scene or 3D model.
  • You can either use a preset path or create your own for total control of the “flight” path by tapping Reshoot, then Custom. This opens a camera path editor based on keyframes. Move your viewpoint to a new position and tap the Add Keyframe button to record that position. Pinch and drag to move to another location and camera angle, then tap the Plus button on the right to add another keyframe. The Luma AI app places keyframes two seconds apart by default, and pressing the Play button shows how it smoothly moves the virtual camera through those two points. You can continue to add keyframes to build a drone-like path. It’s laborious compared to walking and moving around in AR mode (more on this in the next section below), but using keyframes allows for greater precision and a smoother path (see the interpolation sketch after this list).
  • Dragging the purple, diamond-shaped keyframes below the camera view lets you fine-tune the timing. The stopwatch icon lets you adjust the total length to speed up or slow down different sections of the video. Tap the Overview tab at the top of the screen to see the camera path as a purple line with a camera icon at each keyframe location.
  • When you’re satisfied with your flight path, use the options at the bottom of the screen to set the aspect ratio and other settings. When you first tap the Render button, you can select the video resolution (SD, HD, or Full HD) and the number of frames per second; then tap Render again, and voilà!
Reshoot path: Custom Camera path editor based on keyframes
  • The Luma AI app will upload these details to Luma AI’s servers to create the video. Your iPhone isn’t needed for this step; you can close the app or record another video. When the render is complete, you can download the video.
  • You can share a web link of your NeRF through Luma AI’s website, where anyone can view it in 3D.
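Conceptually, a keyframe editor like this has to interpolate a smooth camera trajectory between keyframes spaced two seconds apart. Here is a small sketch of one common technique, a Catmull-Rom spline through the keyframe positions; it illustrates the general idea, not Luma AI's implementation, and camera orientation (typically interpolated separately) is omitted.

```python
import numpy as np

def catmull_rom(p0, p1, p2, p3, u):
    """One Catmull-Rom segment: a smooth curve passing through p1 and p2."""
    return 0.5 * ((2 * p1)
                  + (-p0 + p2) * u
                  + (2 * p0 - 5 * p1 + 4 * p2 - p3) * u**2
                  + (-p0 + 3 * p1 - 3 * p2 + p3) * u**3)

def sample_path(keyframes, seconds_per_key=2.0, fps=30):
    """Return one smoothed camera position per output video frame.

    `keyframes` are the 3D positions recorded in the editor; the
    endpoints are duplicated so the spline covers every segment.
    """
    pts = [keyframes[0]] + list(keyframes) + [keyframes[-1]]
    steps = int(seconds_per_key * fps)       # 2 s apart -> 60 frames/segment
    frames = []
    for i in range(1, len(pts) - 2):
        for s in range(steps):
            frames.append(catmull_rom(pts[i - 1], pts[i], pts[i + 1],
                                      pts[i + 2], s / steps))
    frames.append(np.asarray(keyframes[-1], dtype=float))
    return np.array(frames)

path = sample_path([np.array([0.0, 1.5, 4.0]),    # hypothetical keyframes
                    np.array([2.0, 1.8, 2.0]),
                    np.array([0.0, 2.2, -3.0])])
print(path.shape)   # (121, 3): two segments x 60 frames, plus the last pose
```

Because the spline passes smoothly through every recorded position, adding more keyframes refines the path without introducing jerky transitions, which is why the keyframe approach yields smoother results than hand-held AR recording.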

The following video is a recent NeRF scan, captured on a beautiful sunny day ☀️.

AR Mode:

Luma AI also has an AR mode, which lets you travel through a NeRF (Neural Radiance Field) with your iPhone, revisiting places that you have previously captured. You can simulate a drone fly-through with your iPhone, creating impossible camera moves and perspectives. Luma AI’s AR mode is a new way to explore and relive your memories in 3D.

When you first open the app, you will find a gallery of previously captured NeRFs. Tapping any item places you in the Cinematic Render view. Tapping the AR button at the top right places the NeRF within your physical space. In your real space, you can see either the extracted object resting on the floor, or the entire NeRF scene filling your surroundings as an immersive view. You can even record a new video of the scene while navigating in AR mode; it’s as if you’re revisiting the same place again! Please note that you must stay aware of your surroundings, because your physical space doesn’t match the NeRF-generated environment.

The following video is recorded in AR Mode:

Stabilization of the recorded video

When you first tap Render, you will see more options, such as HD/Full HD resolution or the number of frames per second; then tap Render again.

Select FPS & Resolution of the video

The video will process for a few seconds (or longer, depending on the length of the video), but you can continue doing other tasks while it renders.

You can refer to Luma AI’s X (Twitter) post from when they first introduced this AR feature.

The limitations of NeRFs:

When using the Luma AI app, we sometimes see “floaters,” the NeRF equivalent of noise or artifacts. Moving slowly and steadily enough while capturing minimizes them, and the details come out much sharper.

When reshooting a camera path for a video, there’s a limit to how far and where you can move without these distortions affecting the video. As you move further from the center of the NeRF, the scene distorts and blurs due to a lack of captured detail.

We hope this will improve as NeRF and AI models evolve; Luma AI has already significantly enhanced the quality of its AI models.

Luma AI Capturing Best Practices:

For best results, move the phone slowly and try to avoid rapid movements, especially rotation, as much as possible.

For best results, the object or scene should be captured from as many unique viewpoints as possible. Additionally, it is better to move the phone around (in 3D space) rather than rotating it from a stationary position when capturing. Standing in the same place and capturing outwards in a sphere typically does not work well.
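The reason is parallax: depth can only be triangulated when the camera actually translates between views, while rotating in place yields no baseline at all. A quick sketch of the geometry, with hypothetical positions in meters:

```python
import numpy as np

def triangulation_angle(cam_a, cam_b, point):
    """Angle (degrees) subtended at `point` by two camera positions.
    More parallax (a larger angle) means better-constrained depth."""
    ra, rb = cam_a - point, cam_b - point
    cos = ra @ rb / (np.linalg.norm(ra) * np.linalg.norm(rb))
    return np.degrees(np.arccos(np.clip(cos, -1.0, 1.0)))

target = np.array([0.0, 0.0, 3.0])                    # a point on the subject
walk = triangulation_angle(np.array([0.0, 0.0, 0.0]),
                           np.array([1.0, 0.0, 0.0]), target)  # step 1 m sideways
spin = triangulation_angle(np.array([0.0, 0.0, 0.0]),
                           np.array([0.0, 0.0, 0.0]), target)  # rotate in place
print(f"walking: {walk:.1f} deg, spinning in place: {spin:.1f} deg")
# walking: 18.4 deg of parallax; spinning: 0.0 deg, so depth is unobservable
```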

Currently, the app struggles with complex reflections (e.g., curved mirror-like surfaces), curved transparent objects (e.g., car windows or plastic water bottles), and very large textureless surfaces (e.g., white walls). Most other materials work well.

The app can capture objects in most lighting conditions as long as textures can still be identified (i.e., not washed out or completely dark). Lighting conditions will be baked in, so the scene should be lit however you would like it to appear in the final result.

Any movement in the scene during capture may degrade the quality of the final result. For example, tree leaves moving in the wind may result in loss of detail, and people moving around in the background could introduce artifacts.

If using video capture, it is very important that video stabilization is off, since it causes the frames to have unstable camera intrinsics; this is particularly important on Android devices. Also avoid using the “HDR video” option on iOS.
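To see why stabilization causes trouble: photogrammetry pipelines assume one fixed intrinsics matrix K for the entire video, but digital stabilization crops, shifts, and re-scales every frame, which is equivalent to giving each frame a different effective K. A small sketch with made-up numbers:

```python
import numpy as np

def intrinsics(fx, fy, cx, cy):
    """Pinhole intrinsics K (focal lengths and principal point, in pixels)."""
    return np.array([[fx, 0., cx],
                     [0., fy, cy],
                     [0., 0., 1.]])

K = intrinsics(fx=1600., fy=1600., cx=960., cy=540.)   # nominal 1920x1080 camera

def stabilized_K(K, zoom, shift_px):
    """Model one stabilized frame: a 2D scale-and-shift warp applied to K."""
    warp = np.array([[zoom, 0.,   shift_px[0]],
                     [0.,   zoom, shift_px[1]],
                     [0.,   0.,   1.]])
    return warp @ K

# Each stabilized frame effectively comes from a *different* camera,
# violating the constant-intrinsics assumption of the reconstruction.
print(stabilized_K(K, zoom=1.05, shift_px=(12., -7.)))
```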

Please refer here for more details.

Flythroughs iPhone App:

Luma AI has recently packaged the drone-footage feature into a new iPhone app called “Flythroughs,” marketed explicitly to the real estate industry, which anyone can download from the App Store.

According to Luma AI, the app can replace expensive professional interior shots for real estate sales.

Luma AI shows high-quality examples of flythroughs generated with the app, like this one:

Please note that exporting the video is a paid feature in this App.

Summary:

NeRFs have been around for some time, but they were difficult to implement and use. Now the technology has finally become accessible to anyone. Luma AI makes NeRFs almost as easy as shooting a video. We can capture a scene in three dimensions, then reshoot it as a video later using a virtual camera.

The possibilities are endless, simplifying the creation of 3D objects and scenes.

Please share your experiences with NeRFs with us, and write any information you would like to add in the comments section.

Originally published at https://www.arshopia.com.
