It’s always an exciting time of year when we reach Apple’s flagship developer event, the Worldwide Developers Conference. In the past we have seen new devices introduced to the world, an operating system funeral and the birth of some hugely important features in the Apple ecosystem, including the App Store.


So this year was a little underwhelming when it came to new features, and perhaps it should have been. The largest players (Google and Microsoft, to be specific) very often push constant innovation rather than fixing what already exists, and all this feature loading means that older devices become obsolete (something Apple found itself in a spot of bother over in late 2017). That being said, there were still some great announcements at this event that we think are highly important for our clients.



Siri and Siri Shortcuts

Alexa and Google Assistant have been progressing constantly. They provide an ecosystem for developers to build on top of, and as a result we have seen a large uptake of physical hardware supporting these wonderful pieces of software. Whilst it is a little underwhelming that Siri still won’t be fully part of this ecosystem in the way its closest rivals are (embedded in third-party hardware, for example), we can now see developers create intricate interactions with Siri.

These interactions allow users to integrate certain applications directly within Siri, and with Siri Shortcuts (surely the first developer-facing appearance of Workflow since Apple’s acquisition of the company last year) we can add our applications to a user’s day-to-day routines. Siri Shortcuts works on a simple premise: you ask Siri to start a process, and as you reach certain points (time, location or context) in that process, things simply happen. A quick example: using your phone to unlock your front door when you arrive home, which shuts down your car’s engine, turns on your kettle so you can make a cup of tea, turns on your oven and sets your lighting and heating.
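As a rough sketch of how an application can take part in this (assuming iOS 12), an app “donates” an activity to the system so Siri can suggest it at the right time or place. The activity type and titles below are hypothetical placeholders of our own, not a real product’s identifiers:

```swift
import Foundation

// Sketch of a Siri Shortcuts donation via NSUserActivity.
// "com.example.kettle.boil" and the title are hypothetical placeholders.
let activity = NSUserActivity(activityType: "com.example.kettle.boil")
activity.title = "Boil the kettle"
activity.isEligibleForSearch = true
activity.isEligibleForPrediction = true  // lets Siri suggest this action (iOS 12+)

// In a real app you would attach the activity to the current view controller;
// the system then records ("donates") it each time that screen is shown:
// viewController.userActivity = activity
```

Apps with more structured actions can go further with custom intents, but the activity-based route above is the lowest-friction way to appear in a user’s shortcut suggestions.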

It is evident that Siri is becoming very important in the Apple ecosystem, so we advise considering where your application can sit within it.


ARKit 2

Probably the biggest feature announcement of Craig Federighi’s time on stage was this one. AR/MR/VR is clearly becoming prevalent throughout the market, and more and more people are presenting us with briefs for it. ARKit 2 brings some great additional functionality out of the box that allows us to build much more engaging experiences for end users.

The first feature that really shows the power of ARKit 2 is the ability for multiple users to share a single AR experience. At the moment it is a very novel execution, but one with the potential to be very powerful and carry an unbelievable narrative. The on-stage demo, by the innovation team from Lego, was utterly captivating. Business uses are definitely going to emerge, and this is something we are proactively exploring with our clients.

The ability to store AR state is another advantage. This demo also came from Lego, who stored an entire AR scene against a marker object in the shape of a Lego house. Allowing a user to continue their journey where they left off is likely to be a game changer, with the real estate and construction markets potentially reaping the benefits (for example, fitting out your kitchen before moving in and placing the order directly from your iPad, while always being able to go back to see, explore and modify that scene when you return).
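To illustrate at a code level, here is a sketch only, assuming an existing ARKit session on iOS 12: persistence works by serialising the session’s `ARWorldMap` and feeding it back in as the initial world map of a later session.

```swift
import ARKit

// Sketch: persist an ARKit 2 session so the user can resume it later.
// Capture the current world map and archive it to disk.
func saveWorldMap(from session: ARSession, to url: URL) {
    session.getCurrentWorldMap { worldMap, _ in
        guard let map = worldMap,
              let data = try? NSKeyedArchiver.archivedData(withRootObject: map,
                                                           requiringSecureCoding: true)
        else { return }
        try? data.write(to: url)
    }
}

// Later, load the map and run a new session starting from the saved state.
func restoreWorldMap(into session: ARSession, from url: URL) {
    guard let data = try? Data(contentsOf: url),
          let map = try? NSKeyedUnarchiver.unarchivedObject(ofClass: ARWorldMap.self,
                                                            from: data)
    else { return }
    let configuration = ARWorldTrackingConfiguration()
    configuration.initialWorldMap = map  // resume where the user left off
    session.run(configuration, options: [.resetTracking, .removeExistingAnchors])
}
```

The same serialised map, sent over the network instead of written to disk, is what underpins the shared multi-user experiences described above.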

Other improvements around performance (face tracking and object rendering) and the file format (USDZ, a format that can be used across devices and software packages) mean that Apple has clearly banked on making augmented reality a real winner in the future.



CarPlay

A light touch on this one; however, Apple has now added support for third-party map and navigation apps. With a little tweaking you can now submit your application through Apple to be deployed in cars, giving your customers much more screen real estate with which to interact with your application. This could be just the start of a better ecosystem of sustainable apps for CarPlay, and it is an exciting development.
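For context, CarPlay navigation support is gated behind an entitlement that has to be requested from Apple; once granted, the app’s entitlements file carries a single extra key. A minimal sketch (the surrounding plist structure is standard boilerplate):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN"
  "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <!-- CarPlay navigation entitlement, granted by Apple on request -->
    <key>com.apple.developer.carplay-maps</key>
    <true/>
</dict>
</plist>
```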



iOS 12

As mentioned previously, performance was the majority of Apple’s focus. There are some bold claims: native features working much faster, battery life no longer degrading with new releases and, ultimately, your application being supportable across five generations of Apple devices (increasing to six when we reach the major Apple presentation in September). This also means that keeping your app and development life cycles up to date is more important than ever, ensuring your app is designed to support the numerous different viewports.



The best of the rest

There were lots of nice announcements, but some aren’t particularly relevant to the development of client applications. That being said, we see opportunities in the likes of Create ML and Core ML 2. Apple is clearly looking to move machine learning on-device and wants us to integrate it, but right now it is too early to tell exactly how the ecosystem of machine learning will work within applications outside the Apple ecosystem.

iOS 12 also introduces some great digital health tools to help you disconnect from your devices. This acknowledgement of device addiction feels like a good signal to design applications for specific use cases rather than for repeated and excessive usage.

watchOS also receives a hefty update, allowing third-party developers to play background audio within their applications. We think third-party applications that use rich media are still some way off, but that is the logical next step for a device that is fast becoming a huge part of the Apple ecosystem.

macOS sees some great updates too. In 2019 it will be even more important to understand your application ecosystem, as some elements of iOS development are shared with macOS. This means ports of your iOS application to the macOS desktop could realistically happen, although we imagine this will be limited whilst Apple maintains that the two operating systems and their viewports have their own uses and limitations, and for good reason. High-level information on what can and can’t be done is already available, and it is worth keeping in mind for your future roadmaps.

Finally, tvOS sees some great updates too, though more on the consumer side, with the ability to deliver content using Dolby Atmos. That said, delivering content to a smart device in 4K with Atmos can result in a highly immersive experience that shouldn’t be underestimated (particularly handy for events, for example).

We’re really excited to share with you some of our experiments in the coming months. If you can’t wait that long, get in touch with us today. We’d love to talk with you about how we can help you create and maintain successful applications in the next generation of Apple’s software ecosystem.
