I used an app to create 3D models with my iPhone, and it’s shockingly great

The pace of innovation in artificial intelligence image generation is phenomenal. One company — Luma Labs — provides an excellent example of a practical, yet hugely entertaining use of the latest technology applied to 3D images.

Luma AI is in beta testing on the iPhone and will eventually be available on Android as well. I got into the beta test group and can share what this amazing app does and how easy it is to get incredible results.

What is Luma AI?

Alan Truly captures a 3D model of a figurine with an iPhone 13 Pro Max
Photo by Tracey Truly

Luma AI is an app and a service developed by Luma Labs. It captures three-dimensional images using a technique known as Neural Radiance Fields (NeRF). It’s similar to the ray-tracing technique that makes the graphics in high-end gaming look so realistic.

NeRFs have been around for a few years but existed primarily in research facilities until very recently. With the explosion of AI image generation, headlined by photorealistic DALL-E renderings, NeRFs are beginning to be explored by a much broader audience. The first wave of NeRF software required some developer skills: installing packages from GitHub, then training the AI on a set of photos. It was a bit much for the average person.
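To give a rough idea of how a NeRF turns photos into a 3D scene: a neural network learns, for every point in space, how dense and what color that point is, and new views are rendered by blending samples along each camera ray. The single-ray sketch below shows the standard NeRF compositing formula; it's a toy illustration of the math, not Luma's actual implementation.

```python
import math

def composite_ray(densities, colors, deltas):
    """Alpha-composite samples along one camera ray, following the NeRF
    volume-rendering equation: C = sum_i T_i * (1 - exp(-sigma_i * delta_i)) * c_i,
    where T_i is the transmittance (light not yet absorbed) before sample i."""
    color = 0.0
    transmittance = 1.0
    for sigma, c, delta in zip(densities, colors, deltas):
        alpha = 1.0 - math.exp(-sigma * delta)  # opacity contributed by this sample
        color += transmittance * alpha * c
        transmittance *= 1.0 - alpha            # light remaining after the sample
    return color

# One ray with a nearly opaque sample in the middle: the final color is
# dominated by that sample's color (0.9).
print(composite_ray([0.0, 50.0, 0.0], [0.2, 0.9, 0.1], [0.1, 0.1, 0.1]))
```

A real NeRF does this for millions of rays, with the densities and colors coming from the trained network rather than hand-written lists.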

Luma Labs is about to make the process dramatically simpler with its Luma AI app. From start to finish, the entire process can be managed from an iPhone, and the end result is more accessible as well.

Luma AI iPhone compatibility

Someone holding the iPhone 14 Pro Max.
Joe Maring / Digital Trends

Since Apple made a point of demonstrating the 3D depth-measuring capabilities of LiDAR sensors, you might expect Luma AI to require the more expensive iPhone 14 Pro or iPhone 14 Pro Max to capture 3D models. However, the clever developers at Luma Labs use artificial intelligence instead. That makes this technology compatible with iPhones as old as the iPhone 11.

In the future, the app will become available on Android and there’s already a web version in beta testing as well. In an interview, Luma Labs CEO Amit Jain said the iPhone app is expected to be ready for public release in a few weeks.

How to use Luma AI

The rear cameras on the iPhone 14 Pro Max.
Joe Maring / Digital Trends

To use Luma AI, you simply circle slowly around an object at three different heights. An AR overlay guides you through the process, which takes a few minutes and becomes easier with practice. Before long, you'll be able to capture a medium-sized object like a chair in a couple of minutes.
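Luma Labs hasn't published how the app chooses its viewpoints, but the geometry of the guided capture, three rings of photos at different heights, can be sketched like this (a toy illustration, not the app's code):

```python
import math

def orbit_positions(radius, heights, shots_per_ring):
    """Generate capture positions on horizontal rings around an object
    centered at the origin: one ring per height, like the three passes
    the AR overlay guides you through."""
    positions = []
    for h in heights:
        for i in range(shots_per_ring):
            angle = 2 * math.pi * i / shots_per_ring
            positions.append((radius * math.cos(angle),
                              radius * math.sin(angle),
                              h))
    return positions

# Three rings of 12 shots each around a chair-sized subject (units in meters).
poses = orbit_positions(radius=1.5, heights=[0.4, 0.9, 1.4], shots_per_ring=12)
print(len(poses))  # 36 viewpoints
```

The point of the multiple heights is coverage: every part of the surface should appear in several photos taken from different directions.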

An object of any size can be handled because, to Luma AI, the capture is just a series of images, no matter how big the subject is. Whether you circle a cup, a statue, or a building, the general idea remains the same.

The app will let you know when it has enough images, and when that happens, a Finish button will appear. You can also keep circling to fill in gaps in the AR cloud of rings and rectangles that represents the photos taken so far. The app automatically stops the capture once an ideal number of photos has been collected. There's also a freeform mode that lets you capture even more photos at different angles and distances. You can see the process in the YouTube video I created below. It's an iPhone app, so it's a portrait video.

Luma AI beta demo for Digital Trends

Processing comes next, and it happens on Luma Labs' servers. After an hour or so, the finished NeRF becomes available in the app in several forms. The first is a generated video showing a fly-by of the object in its natural environment. Next is an interactive version that lets you spin the view by dragging a finger or mouse across the image.

Most impressive of all, the subject of the capture, extracted from the background, is also available. With this representation, you can pivot the 3D object on any axis and zoom in to see it more closely. The sharpness depends on how many images were collected and how slow and stable you were during the capture process.

Getting better all the time

Luma Labs is updating the app and service at a remarkable pace. Within a week of my receiving the beta test invitation, two powerful new features arrived that greatly expand the possibilities. The first is a web upload option that allows you to capture video without the app, then upload it to Luma Labs' website for processing. The results appear online and in the app.

This means it’s possible to use any of the iPhone’s camera modes, capture video with a dedicated camera, or even record video with AR glasses like Ray-Ban Stories. For example, a drone video becomes even more epic when you can smooth the motion and change direction after you’ve already landed. Luma Labs shared a good example showing an aerial view of autumn leaves in this tweet.

Fall in Palo Alto is gorgeous! 🍂 https://t.co/EwNkiv0DQV pic.twitter.com/hdd7iBLYgV

— Luma AI (@LumaLabsAI) October 22, 2022

The other new feature opens up 3D editing, painting, and 3D printing opportunities. The 3D meshes can be exported with textures in OBJ or glTF format. They aren't optimized, but they can be viewed with textures intact even in an online viewer such as the free, open-source website Online3DViewer.
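Part of what makes OBJ exports so portable is that the format is plain text: vertex positions on `v` lines, faces on `f` lines. As a rough illustration of what a viewer has to parse, here's a deliberately minimal reader that ignores materials and normals:

```python
import os
import tempfile

def load_obj(path):
    """Minimal Wavefront OBJ reader: collect vertex positions and faces.
    Face entries like '1/1/1' carry vertex/texture/normal indices; we keep
    only the (1-based) vertex index."""
    vertices, faces = [], []
    with open(path) as f:
        for line in f:
            parts = line.split()
            if not parts:
                continue
            if parts[0] == "v":
                vertices.append(tuple(float(x) for x in parts[1:4]))
            elif parts[0] == "f":
                faces.append(tuple(int(p.split("/")[0]) for p in parts[1:]))
    return vertices, faces

# A single textured triangle, written the way an exporter might emit it.
sample = "v 0 0 0\nv 1 0 0\nv 0 1 0\nvt 0 0\nf 1/1 2/1 3/1\n"
with tempfile.NamedTemporaryFile("w", suffix=".obj", delete=False) as tmp:
    tmp.write(sample)
verts, tris = load_obj(tmp.name)
print(len(verts), len(tris))  # 3 vertices, 1 face
os.remove(tmp.name)
```

Real exports from Luma AI also reference texture images and material files, which is why a full-featured viewer like Online3DViewer is the practical way to inspect them.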

A Luma AI capture of an art figurine is being refined in MeshLab.
Sprout Sprite Fairy Figurine

It's also possible to open the 3D files in a mesh editor like the free, open-source MeshLab to delete any stray artifacts that appear as floating blobs, as well as clean up and simplify the model before exporting it in a variety of formats. The figurine featured above is about three inches tall and was sculpted by my wife, Tracey, for her business, ALittleCharacter. Luma AI captured a remarkable amount of detail in the sculpture and the log it was resting upon. The log could have been selected and removed in MeshLab as well.
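The "floating blob" cleanup can be approximated by keeping only the largest connected piece of the mesh, which is essentially what MeshLab's remove-isolated-pieces filter does. A toy sketch of that idea, assuming faces are tuples of vertex indices (real tools also handle textures, normals, and size thresholds):

```python
def largest_component_faces(faces):
    """Keep only the faces in the largest connected component of a mesh.
    Two faces are connected when they share a vertex; stray 'floating
    blobs' end up in small components and get dropped."""
    parent = {}

    def find(a):
        root = a
        while parent.get(root, root) != root:
            root = parent[root]
        parent[a] = root  # path compression for the starting node
        return root

    def union(a, b):
        parent[find(a)] = find(b)

    # Union all vertices within each face so shared vertices link faces.
    for face in faces:
        for v in face[1:]:
            union(face[0], v)

    # Group faces by the component of their first vertex.
    groups = {}
    for face in faces:
        groups.setdefault(find(face[0]), []).append(face)
    return max(groups.values(), key=len)

# A connected patch of three triangles plus one detached "blob" triangle.
mesh = [(0, 1, 2), (1, 2, 3), (2, 3, 4), (10, 11, 12)]
print(largest_component_faces(mesh))  # the three connected triangles survive
```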

The highs and lows of 3D scanning

Kyle Russell shared a dessert display from a party, mentioning he asked the adults to wait for their treats so he could capture it as a digital diorama.

Used @LumaLabsAI at a birthday party last night, made a bunch of adults not eat dessert so I could circle the table with my phone to make a 3D AI dream of the setup like a very cool person pic.twitter.com/sP0vVPB3yx

— Kyle Russell (@kylebrussell) October 30, 2022

Although Luma AI can process video, it relies on still images to construct a three-dimensional scene. That means a moving subject can reduce the quality or clarity of the capture. A 3D image of a person who is seated, as shown in Albert Bozesan's tweet, holds up well. In the same tweet, the second capture, of a sculpture, shows what happens when there's movement within the scene: people who walked near the subject appear in the background as distorted shapes.

Took two @LumaLabsAI #NeRFs by a Bavarian lake today. Great way to capture memories, feels like Minority Report. #Tegernsee pic.twitter.com/HLC0ekF7uD

— Albert Bozesan (@AlbertBozesan) October 30, 2022

Luma AI price and availability

Luma AI is currently in beta testing, and invitations are periodically given out via the company's Twitter account. If you have a compatible iPhone and an interest in this technology, you might be able to get early access. There's also a waitlist on Luma Labs' website.

Luma Labs CEO Jain indicated that pricing is yet to be determined and depends on how broad the user base turns out to be and how the results of the scans are used. Based on these statements, there might be a professional subscription with more advanced features and a less expensive personal subscription. For the time being, the app will remain free to use.

Alan Truly
Computing Writer