
I used an app to create 3D models with my iPhone, and it’s shockingly great

The pace of innovation in artificial intelligence image generation is phenomenal. One company — Luma Labs — provides an excellent example of a practical, yet hugely entertaining use of the latest technology applied to 3D images.

Luma AI is in beta testing on the iPhone and eventually will be made available on Android as well. I got into the beta test group and can share some information about what this amazing app does and how easy it is to get incredible results.

What is Luma AI?

Alan Truly captures a 3D model of a figurine with an iPhone 13 Pro Max
Photo by Tracey Truly

Luma AI is an app and a service developed by Luma Labs. It captures three-dimensional images using a technique known as Neural Radiance Fields (NeRF). It’s similar to the ray-tracing technique that makes the graphics in high-end gaming look so realistic.

NeRFs have been around for a few years but, until very recently, existed primarily in research labs. With the explosion of AI image generation, headlined by photorealistic DALL-E renderings, NeRFs are beginning to be explored by a much broader audience. The first wave of NeRF software required some developer skills: installing packages from GitHub, then training the AI on a set of photos. It was a bit much for the average person.
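If you're curious what's happening under the hood, a NeRF is essentially a learned function that takes a 3D point and a viewing direction and returns a color and a density, which a renderer then composites along each camera ray. Here's a rough toy sketch of that idea in Python; the hard-coded "field" (a glowing sphere) stands in for the neural network a real NeRF would train from your photos:

```python
import numpy as np

def toy_radiance_field(xyz, view_dir):
    """Stand-in for the trained network at the heart of a NeRF.

    A real NeRF learns this mapping from photos; here we hard-code a
    glowing sphere of radius 1 at the origin so the renderer below has
    something to draw. Returns (rgb, density) per 3D sample point.
    """
    dist = np.linalg.norm(xyz, axis=-1)
    density = np.where(dist < 1.0, 5.0, 0.0)           # opaque inside the sphere
    rgb = np.broadcast_to([1.0, 0.5, 0.2], xyz.shape)  # constant orange color
    return rgb, density

def render_ray(origin, direction, n_samples=64, near=0.0, far=4.0):
    """Classic volume rendering: sample points along the ray, query the
    field, and alpha-composite front to back."""
    t = np.linspace(near, far, n_samples)
    delta = t[1] - t[0]
    points = origin + t[:, None] * direction           # (n_samples, 3)
    rgb, density = toy_radiance_field(points, direction)
    alpha = 1.0 - np.exp(-density * delta)             # opacity of each segment
    trans = np.cumprod(np.concatenate([[1.0], 1.0 - alpha[:-1]]))  # transmittance
    weights = alpha * trans
    return (weights[:, None] * rgb).sum(axis=0)        # final pixel color

# A ray fired at the sphere picks up its color; one that misses stays black.
hit = render_ray(np.array([0.0, 0.0, -3.0]), np.array([0.0, 0.0, 1.0]))
miss = render_ray(np.array([0.0, 5.0, -3.0]), np.array([0.0, 0.0, 1.0]))
print(hit, miss)
```

A real NeRF replaces `toy_radiance_field` with a trained neural network and renders millions of rays, but the compositing math follows the same basic recipe.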

Luma Labs is about to make the process dramatically simpler with its Luma AI app. From start to finish, the entire process can be managed from an iPhone, and the end result is more accessible as well.

Luma AI iPhone compatibility

Someone holding the iPhone 14 Pro Max.
Joe Maring/Digital Trends

Since Apple made a point of demonstrating the 3D depth-measuring capabilities of its LiDAR sensors, you might expect Luma AI to require the more expensive iPhone 14 Pro or iPhone 14 Pro Max to capture 3D models. However, the clever developers at Luma Labs rely on artificial intelligence instead, which makes the technology compatible with iPhones as old as the iPhone 11.

In the future, the app will become available on Android and there’s already a web version in beta testing as well. In an interview, Luma Labs CEO Amit Jain said the iPhone app is expected to be ready for public release in a few weeks.

How to use Luma AI

The rear cameras on the iPhone 14 Pro Max.
Joe Maring / Digital Trends

To use Luma AI, you simply circle slowly around an object at three different heights. An AR overlay guides you through the process, which takes a few minutes and becomes easier with practice. Before long, you'll be able to capture a medium-sized object like a chair in a couple of minutes.

Objects of any size can be handled because, to Luma AI, the subject is just a series of images, no matter how big it is. Whether you circle a cup, a statue, or a building, the general idea remains the same.

The app will let you know when it has enough images, and when that happens, a Finish button will appear. You can also keep circling to fill in gaps in the AR cloud of rings and rectangles that represents the photos taken so far. The app automatically stops the capture once an ideal number of photos has been collected. There's also a freeform mode that lets you capture even more photos at different angles and distances. You can see the process in the YouTube video I created below. It's an iPhone app, so it's a portrait video.

Luma AI beta demo for Digital Trends

Processing happens next, on Luma Labs' servers. After an hour or so, the finished NeRF becomes available in the app in several forms. The first is a generated video showing a fly-by of the object in its natural environment. Next is an interactive version that lets you spin the view by dragging a finger or a mouse across the image.

Most impressive of all, the subject of the capture, extracted from the background, is also available. With this representation, you can pivot the 3D object on any axis and zoom in for a closer look. The sharpness depends on how many images were collected and how slowly and steadily you moved during the capture process.

Getting better all the time

Luma Labs is updating the app and service at a remarkable pace. Within a week of my receiving the beta test invitation, two powerful new features arrived that greatly expand the possibilities. The first is a web upload option that lets you capture video without the app, then upload it to the Luma Labs website for processing. The results appear online and in the app.

This means it’s possible to use any of the iPhone’s camera modes, capture video with a dedicated camera, or even record video with AR glasses like Ray-Ban Stories. For example, a drone video becomes even more epic when you can smooth the motion and change direction after you’ve already landed. Luma Labs shared a good example showing an aerial view of autumn leaves in this tweet.

Fall in Palo Alto is gorgeous! 🍂

— Luma AI (@LumaLabsAI) October 22, 2022

The other new feature opens up 3D editing, painting, and 3D printing opportunities. The 3D meshes can be exported with textures in OBJ or glTF format. They aren't optimized, but they can be viewed with textures intact even in an online viewer such as the free, open-source website Online3DViewer.
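For the curious, OBJ is a simple plain-text format, so it's easy to peek inside an exported mesh. Here's a toy Python parser run on a made-up three-vertex triangle (not Luma's actual output), showing the three kinds of records you'll find: vertices, texture coordinates, and faces that index into them:

```python
# Minimal peek inside the OBJ format that apps like Luma AI can export.
# OBJ is plain text: "v" lines are vertices, "vt" lines are texture
# coordinates, and "f" lines are faces indexing into those lists (1-based).
# This is a toy parser for illustration, not a full OBJ implementation.

SAMPLE_OBJ = """\
v 0.0 0.0 0.0
v 1.0 0.0 0.0
v 0.0 1.0 0.0
vt 0.0 0.0
vt 1.0 0.0
vt 0.0 1.0
f 1/1 2/2 3/3
"""

def parse_obj(text):
    vertices, uvs, faces = [], [], []
    for line in text.splitlines():
        parts = line.split()
        if not parts:
            continue
        if parts[0] == "v":
            vertices.append(tuple(float(x) for x in parts[1:4]))
        elif parts[0] == "vt":
            uvs.append(tuple(float(x) for x in parts[1:3]))
        elif parts[0] == "f":
            # each face corner looks like "vertex_index/texcoord_index"
            faces.append(tuple(int(c.split("/")[0]) for c in parts[1:]))
    return vertices, uvs, faces

verts, uvs, faces = parse_obj(SAMPLE_OBJ)
print(len(verts), len(uvs), faces)   # 3 3 [(1, 2, 3)]
```

A scanned figurine produces the same kinds of lines, just hundreds of thousands of them, plus a reference to the texture image that viewers like Online3DViewer wrap around the mesh.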

A Luma AI capture of an art figurine is being refined in MeshLab.
Sprout Sprite Fairy Figurine

It’s also possible to open the 3D files in a mesh editor like the free, open-source MeshLab to delete any stray artifacts that appear as floating blobs, as well as to clean up and simplify the model before exporting it in a variety of formats. The figurine featured above is about three inches tall and was sculpted by my wife, Tracey, for her business, ALittleCharacter. Luma AI captured a remarkable amount of detail in the sculpture and the log it was resting on. The log could have been selected and removed in MeshLab as well.
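That "floating blobs" cleanup boils down to a connected-components pass: faces that share vertices belong to the same piece of geometry, and small, disconnected clusters are usually artifacts. Here's a rough Python sketch of that idea on a made-up toy mesh; MeshLab's actual filters are far more sophisticated:

```python
# Hedged sketch of the "delete floating blobs" cleanup a mesh editor like
# MeshLab performs: group faces into connected components (faces sharing a
# vertex are connected) and keep only the largest one. The toy mesh below
# is invented for illustration.

from collections import defaultdict

def largest_component(faces):
    """faces: list of vertex-index tuples. Returns the faces in the
    biggest connected component; everything else is a 'floating blob'."""
    parent = {}                      # union-find over vertex indices

    def find(v):
        parent.setdefault(v, v)
        while parent[v] != v:
            parent[v] = parent[parent[v]]   # path halving
            v = parent[v]
        return v

    def union(a, b):
        parent[find(a)] = find(b)

    for face in faces:
        for v in face[1:]:
            union(face[0], v)        # all corners of a face join one set

    groups = defaultdict(list)
    for face in faces:
        groups[find(face[0])].append(face)
    return max(groups.values(), key=len)

# Main mesh: a strip of 3 triangles; blob: one disconnected triangle.
mesh = [(0, 1, 2), (1, 2, 3), (2, 3, 4), (10, 11, 12)]
print(largest_component(mesh))   # [(0, 1, 2), (1, 2, 3), (2, 3, 4)]
```

Real cleanup tools also let you set a size threshold instead of keeping only the single largest piece, which is handy when a scan legitimately contains several separate objects.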

The highs and lows of 3D scanning

Kyle Russell shared a dessert display from a party, mentioning he asked the adults to wait for their treats so he could capture it as a digital diorama.

Used @LumaLabsAI at a birthday party last night, made a bunch of adults not eat dessert so I could circle the table with my phone to make a 3D AI dream of the setup like a very cool person

— Kyle Russell (@kylebrussell) October 30, 2022

Although Luma AI can process video, it relies on still images to construct a three-dimensional scene. That means that if the subject moves, the quality or clarity of the capture might suffer. A 3D image of a person who is seated, as shown in Albert Bozesan's tweet, comes out well. In the same tweet, the second capture, of a sculpture, shows what happens when there's movement within the scene: people who walked near the subject appear in the background as distorted shapes.

Took two @LumaLabsAI #NeRFs by a Bavarian lake today. Great way to capture memories, feels like Minority Report. #Tegernsee

— Albert Bozesan (@AlbertBozesan) October 30, 2022

Luma AI price and availability

Luma AI is currently in beta testing, and invitations are periodically given out via the company's Twitter account. If you have a compatible iPhone and an interest in this technology, you might be able to get early access. There's also a waitlist on Luma Labs' website.

Luma Labs CEO Jain indicated that pricing is yet to be determined and depends on how broad the user base turns out to be and how the results of the scans are used. Based on these statements, there might be a professional subscription with more advanced features and a less expensive personal subscription. For the time being, the service remains free to use.

Alan Truly
Computing Writer