I used an app to create 3D models with my iPhone, and it’s shockingly great

The pace of innovation in artificial intelligence image generation is phenomenal. One company — Luma Labs — provides an excellent example of a practical, yet hugely entertaining use of the latest technology applied to 3D images.

Luma AI is in beta testing on the iPhone and will eventually come to Android as well. I got into the beta test group and can share what this amazing app does and how easy it is to get incredible results.

What is Luma AI?

Alan Truly captures a 3D model of a figurine with an iPhone 13 Pro Max
Photo by Tracey Truly

Luma AI is an app and a service developed by Luma Labs. It captures three-dimensional scenes using a technique known as Neural Radiance Fields (NeRF). It's similar to the ray-tracing technique that makes the graphics in high-end games look so realistic.

NeRFs have been around for a few years but existed primarily in research labs until very recently. With the explosion of AI image generation, headlined by photorealistic DALL-E renderings, NeRFs are beginning to reach a much broader audience. The first wave of NeRF software required developer skills: installing packages from GitHub, then training the AI on a set of photos. It was a bit much for the average person.
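For readers curious about what's under the hood, the core NeRF idea (as described in the original research, not anything specific to Luma AI) is a neural network that maps a 3D point and viewing direction to a color and density, and renders each pixel by integrating those values along the camera ray:

```latex
% A NeRF is a network F_\Theta mapping a 3D position x and view
% direction d to an emitted color c and a volume density \sigma:
F_\Theta : (\mathbf{x}, \mathbf{d}) \mapsto (\mathbf{c}, \sigma)

% Each pixel's color is rendered by integrating along its camera ray r(t):
\hat{C}(\mathbf{r}) = \int_{t_n}^{t_f} T(t)\, \sigma(\mathbf{r}(t))\, \mathbf{c}(\mathbf{r}(t), \mathbf{d})\, dt,
\quad T(t) = \exp\!\left(-\int_{t_n}^{t} \sigma(\mathbf{r}(s))\, ds\right)
```

Training amounts to tuning the network so that rendered rays match the photos you captured, which is why circling the subject from many angles matters so much.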

Luma Labs is about to make the process dramatically simpler with its Luma AI app. From start to finish, the entire process can be managed from an iPhone, and the end result is more accessible as well.

Luma AI iPhone compatibility

Someone holding the iPhone 14 Pro Max.
Joe Maring / Digital Trends

Since Apple made a point of demonstrating the 3D depth-measuring capabilities of its LiDAR sensor, you might expect Luma AI to require the more expensive iPhone 14 Pro or iPhone 14 Pro Max to capture 3D models. However, the clever developers at Luma Labs use artificial intelligence instead, which makes the technology compatible with iPhones as old as the iPhone 11.

In the future, the app will become available on Android and there’s already a web version in beta testing as well. In an interview, Luma Labs CEO Amit Jain said the iPhone app is expected to be ready for public release in a few weeks.

How to use Luma AI

The rear cameras on the iPhone 14 Pro Max.
Joe Maring / Digital Trends

To use Luma AI, you simply circle slowly around an object at three different heights. An AR overlay guides you through the capture, which takes a few minutes and becomes easier after a few tries. Before long, you'll be able to capture a medium-sized object like a chair in a couple of minutes.

Objects of any size can be handled because, to Luma AI, a capture is just a series of images, no matter how big the subject is. Whether you circle a cup, a statue, or a building, the general idea remains the same.

The app lets you know when it has enough images, and when that happens, a Finish button appears. You can also keep circling to fill in gaps in the AR cloud of rings and rectangles that represents the photos taken so far. The app automatically stops the capture once an ideal number of photos has been collected. There's also a freeform mode that lets you capture even more photos, at different angles and distances. You can see the process in the YouTube video I created below. It's an iPhone app, so it's a portrait video.

Luma AI beta demo for Digital Trends

Processing happens next, on Luma Labs' servers. After an hour or so, the finished NeRF becomes available in the app in several forms. The first is a generated video showing a fly-by of the object in its natural environment. Next is an interactive version that lets you spin the view by dragging a finger or a mouse across the image.

Most impressive of all, the subject of the capture, extracted from the background, is also available. With this representation, you can pivot the 3D object on any axis and zoom in to see it more closely. The sharpness depends on how many images were collected and how slowly and steadily you moved during the capture.

Getting better all the time

Luma Labs is updating the app and service at a remarkable pace. Within a week of my receiving the beta invitation, two powerful new features arrived that greatly expand the possibilities. The first is a web upload option that lets you capture video without the app, then upload it to the Luma Labs website for processing. The results appear online and in the app.

This means it’s possible to use any of the iPhone’s camera modes, capture video with a dedicated camera, or even record video with AR glasses like Ray-Ban Stories. For example, a drone video becomes even more epic when you can smooth the motion and change direction after you’ve already landed. Luma Labs shared a good example showing an aerial view of autumn leaves in this tweet.

Fall in Palo Alto is gorgeous! 🍂

— Luma AI (@LumaLabsAI) October 22, 2022

The other new feature opens up 3D editing, painting, and 3D printing opportunities. The 3D meshes can be exported with textures in OBJ or glTF format. They aren't optimized, but they can be viewed with textures intact even in a free, open-source online viewer such as Online3DViewer.
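Because OBJ is a plain-text format, it's easy to peek inside an export yourself. Here's a minimal sketch of reading one; the sample mesh data is hypothetical, standing in for a real Luma AI export:

```python
# Minimal OBJ reader: extracts vertex positions ("v" lines) and
# triangular faces ("f" lines). OBJ face indices are 1-based.
def parse_obj(text):
    vertices, faces = [], []
    for line in text.splitlines():
        parts = line.split()
        if not parts:
            continue
        if parts[0] == "v":
            vertices.append(tuple(float(p) for p in parts[1:4]))
        elif parts[0] == "f":
            # Each face entry may look like "3", "3/1", or "3/1/2";
            # the vertex index is always the first field.
            faces.append(tuple(int(p.split("/")[0]) - 1 for p in parts[1:4]))
    return vertices, faces

# A hypothetical two-triangle mesh, standing in for an actual export.
sample = """\
v 0.0 0.0 0.0
v 1.0 0.0 0.0
v 0.0 1.0 0.0
v 1.0 1.0 0.0
f 1 2 3
f 2 4 3
"""

verts, tris = parse_obj(sample)
print(len(verts), len(tris))  # 4 vertices, 2 triangles
```

Real exports also carry texture coordinates and a companion material file, which is why viewers like Online3DViewer can show the mesh with its textures intact.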

A Luma AI capture of an art figurine is being refined in MeshLab.
Sprout Sprite Fairy Figurine

It’s also possible to open the 3D files in a mesh editor like the free, open-source MeshLab to delete stray artifacts that appear as floating blobs, clean up the mesh, and simplify the model before exporting it in a variety of formats. The figurine featured above is about three inches tall and was sculpted by my wife, Tracey, for her business, ALittleCharacter. Luma AI captured a remarkable amount of detail in the sculpture and the log it was resting on. The log could have been selected and removed in MeshLab as well.
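The "floating blob" cleanup boils down to finding the connected components of the mesh and discarding the small ones. Here's a rough sketch of that idea in pure Python, using toy data rather than MeshLab's actual implementation:

```python
from collections import Counter

# Keep only the largest connected component of a triangle mesh,
# discarding small floating "blobs". Faces are triples of vertex indices.
def largest_component(num_vertices, faces):
    parent = list(range(num_vertices))

    def find(a):
        while parent[a] != a:
            parent[a] = parent[parent[a]]  # path halving
            a = parent[a]
        return a

    def union(a, b):
        ra, rb = find(a), find(b)
        if ra != rb:
            parent[rb] = ra

    # Vertices that share a face belong to the same component.
    for a, b, c in faces:
        union(a, b)
        union(a, c)

    # Count faces per component, then keep the faces of the biggest one.
    face_roots = [find(f[0]) for f in faces]
    biggest = Counter(face_roots).most_common(1)[0][0]
    return [f for f, r in zip(faces, face_roots) if r == biggest]

# Toy mesh: a main body of two triangles plus an isolated "blob" triangle.
faces = [(0, 1, 2), (1, 3, 2), (4, 5, 6)]
print(largest_component(7, faces))  # [(0, 1, 2), (1, 3, 2)]
```

MeshLab additionally lets you pick a size threshold, so small-but-legitimate pieces of the model can be kept while genuine scanning noise is dropped.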

The highs and lows of 3D scanning

Kyle Russell shared a dessert display from a party, mentioning he asked the adults to wait for their treats so he could capture it as a digital diorama.

Used @LumaLabsAI at a birthday party last night, made a bunch of adults not eat dessert so I could circle the table with my phone to make a 3D AI dream of the setup like a very cool person

— Kyle Russell (@kylebrussell) October 30, 2022

Although Luma AI can process video, it relies on still images to construct a three-dimensional scene. That means any movement of the subject can reduce the quality or clarity of the capture. A 3D image of a seated person, as shown in Albert Bozesan's tweet, holds up well. The second capture in the same tweet, of a sculpture, shows what happens when there's movement within the scene: people who walked near the subject appear in the background as distorted shapes.

Took two @LumaLabsAI #NeRFs by a Bavarian lake today. Great way to capture memories, feels like Minority Report. #Tegernsee

— Albert Bozesan (@AlbertBozesan) October 30, 2022

Luma AI price and availability

Luma AI is currently in beta testing, and invitations are given out periodically via the company's Twitter account. If you have a compatible iPhone and an interest in this technology, you might be able to get early access. There's also a waitlist on the Luma Labs website.

Luma Labs CEO Jain indicated that pricing is yet to be determined and depends on how broad the user base turns out to be and how the scan results are used. Based on these statements, there might be a professional subscription with more advanced features and a less expensive personal subscription. For the time being, the app remains free to use.

Alan Truly
Alan is a Computing Writer living in Nova Scotia, Canada. A tech-enthusiast since his youth, Alan stays current on what is…