You'd totally do it for the Snap if all your lenses looked like this.
Look, I’m impressed with what app developers and third-party partners have created for Instagram, Facebook, and Snapchat using the middling front-facing cameras of most phones — but it's about to get way more interesting.
Apple's iPhone X and its TrueDepth camera are about to transform these lenses (or filters) we know and love into something even more powerful and, potentially, enriching.
The name "TrueDepth camera" is a bit of a misnomer, since the technology, which sits in the notched-out space at the top of the iPhone X's spectacular OLED screen, is not just a camera but a collection of sensors.
Included in that quarter-inch by 1.5-inch dark space are a 7-megapixel front-facing camera, a dot projector, an infrared camera, and a flood illuminator, all of which work together to give the iPhone X the ability to see your face in three dimensions.
The camera marries that depth information with a live picture of you, then uses augmented reality algorithms and the iPhone X's A11 Bionic chip to create a symphony of real-time facial special effects.
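For developers, that depth signal isn't magic: Apple exposes it through public frameworks. Here's a minimal sketch of how an app could stream per-pixel depth from the TrueDepth camera using AVFoundation's AVCaptureDepthDataOutput. The class name and session wiring below are my own illustration, not Snapchat's or Apple's actual pipeline.

```swift
import AVFoundation

// A minimal sketch, not Snapchat's actual pipeline: the class name and
// wiring are assumptions for illustration. It streams per-pixel depth
// from the front TrueDepth camera via AVFoundation.
final class DepthReader: NSObject, AVCaptureDepthDataOutputDelegate {
    private let session = AVCaptureSession()
    private let depthOutput = AVCaptureDepthDataOutput()

    func start() throws {
        // The whole TrueDepth module shows up as one front-facing device.
        guard let device = AVCaptureDevice.default(.builtInTrueDepthCamera,
                                                   for: .video,
                                                   position: .front) else { return }
        session.addInput(try AVCaptureDeviceInput(device: device))
        session.addOutput(depthOutput)
        depthOutput.isFilteringEnabled = true  // smooths holes in the depth map
        depthOutput.setDelegate(self, callbackQueue: .main)
        session.startRunning()
    }

    // Called for every depth frame the sensor produces.
    func depthDataOutput(_ output: AVCaptureDepthDataOutput,
                         didOutput depthData: AVDepthData,
                         timestamp: CMTime,
                         connection: AVCaptureConnection) {
        // depthDataMap is a CVPixelBuffer of per-pixel distances; an effects
        // pipeline could sample it to wrap a lens around facial contours.
        let depthMap: CVPixelBuffer = depthData.depthDataMap
        _ = depthMap
    }
}
```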
In these early days, as I tested the iPhone X for my monster review, the utility of the TrueDepth camera was limited to a few crucial and fun features. There's Face ID, the iPhone X's ability to pick my face out of all others and let me unlock the phone with nothing more than a glance; Portrait Mode selfies; and the wildly entertaining Animoji, iMessage character animations driven entirely, and in perfect sync, by your face.
Most intriguing, though, for a Snapchat fan like me was the special preview I got of a Snapchat update that takes full advantage of the TrueDepth camera's ability to see every curve, line, and movement of your face.
Snap Inc. is excited about the new technology. A company spokesperson told me that the lenses I've tested are more realistic thanks to faster and more accurate tracking.
In this very limited Snapchat beta, I got exactly four lenses: luchador face paint, what looks like a porcelain mask, feathers with a jeweled tiara, and a flower wreath.
Each lens hugs my face and follows my expressions and head movements to an extraordinary degree. For example, I can still see the lines on my face through the green luchador lens, giving it the appearance of real face paint.
The beta lenses also read the room's light to give these filters an almost unprecedented realism.
Snap told me that realism comes courtesy of facial depth data provided by the TrueDepth camera, which personalizes each lens to the contours of my face.
What Snapchat still can't do is tell the difference between my face and someone else's. It's combining object recognition with depth information to draw the lenses on each face, but it will do the same thing for any face-shaped object. At one point I had it put one of the new lenses on an Einstein robot toy.
These depth-information-powered Snapchat lenses are not perfect. The TrueDepth camera doesn't work if your face is too close to the phone. I also noticed almost as much judder (the lens shifting on my face as I moved) on these lenses as I did on ones I've used in Instagram and the full version of Snapchat.
Even so, as more developers get their hands on the face-tracking configuration in Apple's ARKit API and start to test it with the iPhone X and its TrueDepth camera, we'll see more never-before-imagined ways of bringing our own faces into third-party applications. It's going to open entirely new vistas of application interaction.
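That configuration is a real, public piece of ARKit: ARFaceTrackingConfiguration. Here's a minimal sketch of how a developer might tap into it. The bare session setup and the print statement are illustrative assumptions; the class names, the face mesh, and the blend-shape coefficients come straight from Apple's API.

```swift
import ARKit

// A minimal sketch of ARKit face tracking. ARFaceTrackingConfiguration,
// ARFaceAnchor, its geometry, and its blendShapes are Apple's public API;
// this bare session setup and the print statement are my own illustration.
final class FaceTracker: NSObject, ARSessionDelegate {
    private let session = ARSession()

    func start() {
        // Face tracking requires the TrueDepth camera, so check support first.
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let face as ARFaceAnchor in anchors {
            // A live triangle mesh of the user's face, updated every frame;
            // the raw material a lens would be draped over.
            let vertexCount = face.geometry.vertices.count
            // Blend shapes are 0-to-1 coefficients for dozens of expressions,
            // the same kind of signal that drives Animoji.
            let jawOpen = face.blendShapes[.jawOpen]?.floatValue ?? 0
            print("mesh vertices: \(vertexCount), jawOpen: \(jawOpen)")
        }
    }
}
```

That isSupported check is the telling detail: only the TrueDepth hardware can run this configuration, which is why the richest versions of these lenses need an iPhone X.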
As Apple’s global VP of marketing told me recently, when it comes to AR, “pretty much every new developer is thinking about incredible ways to do things that they could never do before.”
Stay tuned for the full Snapchat update, but remember, your lens experience might not be as good without the iPhone X and its TrueDepth camera.