Thanks to the recent release of Apple’s ARKit, front-end developer Frances Ng has created a point-and-translate app. That’s right: simply point your phone at the item you want to translate, and if the item is recognized, associated language options are displayed.
Here it is in action.
Whether you’re learning a new language from the comfort of your own home or navigating a foreign land in need of a helping hand, the app is a great example of all-too-rare AR utility.
The UX magic comes from combining Apple’s ARKit with an existing database of about 1,000 learned objects that Ng ported into the app. Ng says her app is just a demo and she has no immediate plans to take it to market. What’s remarkable is the speed: while companies like Microsoft have spent many years and dollars mastering object recognition, Ng was able to build her app in a weekend, simply because it builds on so much past work that is now freely available and baked into platforms like Apple’s.
Why It’s Hot
With the new AR development kits, we should see a run of useful apps like this (along with, no doubt, plenty of non-useful examples).
It’s also an example of how more and more development is being done on top of robust platforms, increasing speed to market and standardizing patterns so consumers can orient themselves quickly.