Apple Ups the Ante in Augmented Reality at WWDC 2017

How Apple are helping you to put a virtual sofa in your living room and visualise new shoes on your feet

Each year Apple announces a host of new developments at its Worldwide Developers Conference (WWDC) in June, targeted at the enthusiastic developer community who hope to drive innovation and research by developing new apps and other services.

This year there were some very interesting announcements that could take some time for the world to fully appreciate. That’s not to say that there aren’t some great things happening in Silicon Valley, but that they are primarily changing what’s possible for specialists rather than for less-specialised developers.

So we’re tempering our usual over-exuberant rush to start prototyping, and instead we’re going to watch the Apple space very closely in the coming weeks to really get to know these new offerings. We are curious to explore the developments in the newly announced iOS 11, to follow the evolution of machine learning’s integration into our everyday lives, and we are intrigued by the augmented reality (AR) work coming from Apple.

CoreML Opens Up Apple’s Machine Learning Features to All Apps

Machine learning is a topic we are hearing about more and more (take a look at our beginner’s guide article if you’re still a little unsure what it actually is). Apple devices already employ some machine learning, such as face detection and recognition in Photos. What Apple do well, and what makes this feel like something developers can really own and get their teeth into, is running machine learning on the device itself rather than in the cloud. WWDC brought us CoreML, a machine learning framework that allows us to integrate high-performance, pre-trained machine learning models into our apps. Whilst there might be some cool things we could do with this, machine learning remains a very complex field, so we are keen to see what the more specialist developers make of this new kit and just how usable it really is. We hope ‘very’, of course.
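To give a flavour of the API, here is a minimal sketch of on-device inference with CoreML. The model name (“Classifier”) and its feature names (“image” and “label”) are hypothetical, chosen purely for illustration; they depend entirely on the model you bundle with your app.

```swift
import CoreML
import CoreVideo

// A minimal sketch of on-device inference with CoreML (iOS 11+).
// "Classifier.mlmodelc" is a hypothetical compiled model bundled with the
// app; the feature names ("image", "label") depend on the model you train.
func classify(pixelBuffer: CVPixelBuffer) throws -> String? {
    guard let url = Bundle.main.url(forResource: "Classifier",
                                    withExtension: "mlmodelc") else { return nil }
    let model = try MLModel(contentsOf: url)

    // Wrap a camera frame as the model's input feature.
    let input = try MLDictionaryFeatureProvider(
        dictionary: ["image": MLFeatureValue(pixelBuffer: pixelBuffer)]
    )

    // The prediction runs entirely on the device, with no network round trip.
    let output = try model.prediction(from: input)
    return output.featureValue(for: "label")?.stringValue
}
```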

Augmented Reality Blends Closely With the Real World

The announcement that wowed us a little more with its demo, and which intrigues us most, is the launch of ARKit. Apple are relative latecomers to the field of augmented reality, but that gives them the huge advantage of learning from others and bringing us an improved solution. As they are proud to profess, “It’s not about being first. It’s about being the best.”

So with ARKit, Apple have upped the ante in terms of quality. They are taking advantage of the high-performance technology in iOS devices, such as the camera, CPU, GPU and motion sensors, and putting these together with the new software. Whilst current AR platforms rely on the device continuously reprocessing the camera image, and doing so incredibly frequently, Apple’s approach is subtly but powerfully different. The camera image is still analysed, but less often, with built-in iPhone and iPad motion sensors such as the accelerometer and gyroscope filling in the gaps between frames. This frees up the device’s processor, translating to more power for the overlaying AR app to do its job better. This is exciting for developers, as it enables much higher-quality AR apps and further blurs the line between reality and the virtual world, which is, after all, AR’s purpose.
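In practice, getting hold of that tracking takes remarkably little code. Below is a minimal sketch, assuming an ARSCNView wired up in a storyboard, of starting a world-tracking session with horizontal plane detection; the sensor fusion described above happens entirely inside the framework.

```swift
import UIKit
import ARKit

// A minimal sketch of an ARKit world-tracking session (iOS 11+). The
// framework fuses camera frames with motion-sensor readings, so the app
// gets stable tracking without reprocessing every pixel itself.
class ARViewController: UIViewController {
    @IBOutlet var sceneView: ARSCNView!  // assumed to be connected in a storyboard

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = .horizontal  // find surfaces such as tables and floors
        sceneView.session.run(configuration)
    }

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        sceneView.session.pause()
    }
}
```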

Visualise Furniture in Your Home

And this isn’t just about games such as Pokémon GO (though it does mean those pesky little critters now appear to be firmly planted on the ground in front of you). There are already a few great examples of Apple’s ARKit being used to develop apps that truly fulfil a customer need. Ikea announced at WWDC that they are developing an AR app to help customers visualise what products will look like in their own home before making a purchase. It means that, even if you live right by the store and have been and sat on the sofa (something fewer of us actually do these days), you can be far more certain that it will look right in your living room and occupy the space your tape measure indicates as you plot out a 2D area on the floor. How many of us have thought “ooh yes, that modest-sized Christmas tree would look great in the hallway”, only to wrestle the monstrous beast into the car (a task that should have given us our first clue to the ensuing reality), watch it shed most of its “no drop” needles as you and your neighbours force it through the door, and then discover that it occupies not only your hallway but also your stairs and doorways: far more space than you’d imagined in the now noticeably vast, open expanse of the garden centre? So there is enormous value in being able to visualise how a fairly substantial purchase, such as an item of furniture, will actually change the landscape of your home.
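For the curious, placing an item like this boils down to a hit test against the planes ARKit has detected. Here is a hypothetical sketch, assuming a SceneKit asset named “sofa.scn” bundled with the app; a real retailer’s app would of course use accurately measured product models.

```swift
import ARKit
import SceneKit

// A hypothetical sketch: place a piece of virtual furniture where the user
// taps, by hit testing against the horizontal planes ARKit has detected.
// "sofa.scn" is an assumed SceneKit asset shipped with the app.
func placeSofa(at tapPoint: CGPoint, in sceneView: ARSCNView) {
    let results = sceneView.hitTest(tapPoint, types: .existingPlaneUsingExtent)
    guard let hit = results.first,
          let scene = SCNScene(named: "sofa.scn"),
          let sofa = scene.rootNode.childNodes.first else { return }

    // The hit's world transform pins the node to the real surface, so the
    // sofa stays put, at true scale, as you walk around the room.
    let position = hit.worldTransform.columns.3
    sofa.position = SCNVector3(position.x, position.y, position.z)
    sceneView.scene.rootNode.addChildNode(sofa)
}
```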

Virtual Interactions in a Real Scene

The keynote presentation at WWDC brought us a sophisticated demonstration of virtual objects blending seamlessly with the real scene being viewed. iOS devices are able to recognise the edge of a table and work out its size (pretty important for avoiding that over-sized Christmas tree problem!). There is even impressive interaction between the virtual objects: in the demo we saw a virtual table lamp being moved around the real table, with the shadow cast by a virtual cup moving accordingly. The introduction of dual cameras with the iPhone 7 Plus suggests that future iterations of the iPhone and iPad may refine the depth perception possible, which can only enhance this already sophisticated tool.
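Part of what makes that blending convincing is lighting. ARKit supplies a per-frame estimate of the ambient light, and a common pattern, sketched below (and not necessarily what the demo itself used), is to feed that estimate into SceneKit so virtual objects brighten and dim with the room.

```swift
import ARKit
import SceneKit

// A minimal sketch: match the virtual scene's lighting to ARKit's per-frame
// estimate of the real-world ambient light, so objects and their shadows sit
// convincingly in the camera image. Implemented as an SCNSceneRendererDelegate
// callback, invoked once per rendered frame.
func renderer(_ renderer: SCNSceneRenderer, updateAtTime time: TimeInterval) {
    guard let sceneView = renderer as? ARSCNView,
          let estimate = sceneView.session.currentFrame?.lightEstimate else { return }

    // 1000 lumens is ARKit's neutral reading; scale the environment to match.
    sceneView.scene.lightingEnvironment.intensity = estimate.ambientIntensity / 1000.0
}
```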

There are other great apps already out there, such as the shoe sampler from footwear brand Converse, that are helping shoppers make choices. The possibilities are opening up as genuinely sophisticated results become achievable for brands seeking to convert passing interest into custom.

No Need for New Headgear, Millions Already Own the Hardware

From a user’s point of view, Apple have surely unveiled something that we might actually take up. Whilst Microsoft and Google have their HoloLens and Tango technologies respectively, we just don’t see (excuse the pun) ourselves or our clients having a use for these in the near future. ARKit, on the other hand, is affordable and accessible: the iPhone celebrated its 10th birthday this year, over a billion iPhones have been sold, and the iPad is the market-leading tablet. Opening up ARKit to developers means sophisticated technology implemented on very capable devices that millions already own. Other competitors in the AR field, such as Snapchat and Microsoft, need to build apps that run on top of iOS and Android, as they don’t have popular phones of their own.

One potential shortfall of the Apple offering is the reliance on a hand-held device rather than a headset, which means it is only going to be comfortable for short periods at a time. However, how many applications of AR require long sessions? We’d wager that if you need those, you’re really getting into VR territory.

So our guess is that take-up by developers will be much higher with ARKit than with competitors’ previous offerings.

The Future of Apps is Looking Visually and Functionally Impressive

With the launch of the new 10.5in iPad Pro replacing the 9.7in version, and a slight improvement to the 12.9in iPad Pro with the addition of a True Tone display, the hardware is keeping pace too. We are eager to see how the apps we build start to change now that there are toolkits that let us really take advantage of iOS devices and create some stunning AR- and clever ML-based apps.