Here I'm demoing the different detail levels of Apple's new Photogrammetry API in Xcode 13 beta 1: raw, reduced, preview, medium, and full. Full is currently bugged and will not generate a model.
You can run this on either an M1 or an Intel-based Mac, but the M1 produces better results, faster.
I took 234 pictures of Smough from the Dark Souls board game with an iPhone XR, which does not have LiDAR.
Apple’s downloadable playground: [ Link ]
Apple’s 3D models keynote: [ Link ]
If you'd like to support my work to help me improve video quality and bring more content, please consider donating to my PayPal:
[ Link ]
There are a few requirements to consider before trying this:
1. You need to be on macOS Monterey 12.0 beta or later
2. You need to have Xcode 13 beta or later installed
3. You need a decently specced Mac (preferably an M1, as they complete the process roughly 4x faster than an Intel-based Mac)
4. You need a camera or phone to take pictures of your model; I used an iPhone XR
5. You need a model that doesn’t have too many reflective surfaces
6. You need to put your model on a dark, neutral background that doesn’t have texture
7. Preferably use a lazy Susan while taking photos, and some kind of arm to keep your camera steady
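Once the photos are on your Mac, the workflow shown in the video can be sketched in a few lines of Swift. This is a minimal sketch using RealityKit's PhotogrammetrySession from the macOS 12 / Xcode 13 beta; the folder path, output filename, and choice of the .medium detail level are placeholders, not values from the video:

```swift
import Foundation
import RealityKit

// Folder containing the photos taken around the subject
// (e.g. the 234 iPhone XR shots of Smough). Placeholder path.
let input = URL(fileURLWithPath: "/path/to/photos", isDirectory: true)

// Create a session over the image folder with the default configuration.
let session = try PhotogrammetrySession(
    input: input,
    configuration: PhotogrammetrySession.Configuration())

// Request a USDZ model at one of the detail levels from the video:
// .preview, .reduced, .medium, .full, or .raw.
try session.process(requests: [
    .modelFile(url: URL(fileURLWithPath: "smough.usdz"), detail: .medium)
])

// Listen for progress, completion, and error messages.
Task {
    for try await output in session.outputs {
        switch output {
        case .requestProgress(_, let fraction):
            print("Progress: \(fraction)")
        case .processingComplete:
            print("Model generated.")
        case .requestError(_, let error):
            print("Request failed: \(error)")
        default:
            break
        }
    }
}
```

Apple's downloadable playground linked above wraps essentially this flow in a command-line tool, so you can also use it directly instead of writing your own.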
Subscribe for more content, as I will be creating more tutorial and how-to videos for coding in Swift, as well as longer tutorials on building macOS apps.
Thank you for watching!
Interested in anything Mac related? Check out the channels below for more info!
Mr. Macintosh
[ Link ]
The Apple Ninja
[ Link ]