
Earlier this month, a handful of iOS practitioners from Deloitte Digital studios across the globe flew to San Jose to attend Apple’s 2018 Worldwide Developers Conference (WWDC). They spent the week learning about new technology and attending hands-on labs to help them implement the latest APIs, frameworks, and services for our clients.
Find out which new features they are most excited about and get a glimpse of what they learned with a roundup of their individual, high-level takeaways.
Thomas Hack, Studio Lead, Duesseldorf, Germany
Animoji tongue and eye detection. This might be one of the most underrated new features, but beyond letting you make silly faces with your tongue out, Animojis pack all of the technology of a powerful motion-sensing input device into a fraction of the size. It’s impressive! Animojis are essentially a demonstration of what the new face-tracking engine can accomplish. I see huge potential for using it in medical or educational contexts. Although it was not mentioned in the keynote, I learned in a session that eye movement detection will also be improved in the next version.
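As a rough illustration of what drives this, here is a minimal sketch of reading ARKit 2’s new tongue blend shape and per-eye transforms. It assumes an ARSCNView session running an ARFaceTrackingConfiguration on a TrueDepth-camera device, and the 0.5 threshold is purely illustrative:

```swift
import ARKit

// Minimal sketch: react to the tongueOut blend shape that Animoji uses.
class FaceTracker: NSObject, ARSCNViewDelegate {
    func renderer(_ renderer: SCNSceneRenderer,
                  didUpdate node: SCNNode,
                  for anchor: ARAnchor) {
        guard let faceAnchor = anchor as? ARFaceAnchor else { return }

        // Blend shape coefficients range from 0.0 (neutral) to 1.0 (fully expressed).
        let tongueOut = faceAnchor.blendShapes[.tongueOut]?.floatValue ?? 0
        if tongueOut > 0.5 {                    // illustrative threshold
            print("Tongue detected sticking out")
        }

        // iOS 12 also exposes per-eye transforms, useful for gaze estimation.
        let leftEye = faceAnchor.leftEyeTransform
        let rightEye = faceAnchor.rightEyeTransform
        _ = (leftEye, rightEye)
    }
}
```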
Fluid interfaces. The session Designing Fluid Interfaces showed how thoughtfully Apple designs and implements user-interface interactions. In my opinion, this is how Apple separates itself from competitors: by investing many weeks of work in highly detailed, highly functional interfaces that respond like the physical objects we already know from daily life. Apple is willing to put forth a great deal of effort to build smooth animations, even when the implementation is complex and requires significant performance optimization.
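One of the techniques highlighted in that session is making animations interruptible, so a gesture can redirect motion that is already in flight. A minimal sketch of that idea (the view and target frame are placeholders) might look like this:

```swift
import UIKit

// Minimal sketch: an interruptible, spring-driven animation that can be
// retargeted mid-flight instead of jumping to a new position.
final class CardPresenter {
    private var animator: UIViewPropertyAnimator?

    func move(_ card: UIView, to target: CGRect) {
        // Stop any in-flight animation so the new one takes over
        // from the card's current, on-screen position.
        animator?.stopAnimation(true)

        let spring = UISpringTimingParameters(dampingRatio: 0.8)
        animator = UIViewPropertyAnimator(duration: 0.5, timingParameters: spring)
        animator?.addAnimations { card.frame = target }
        animator?.startAnimation()
    }
}
```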
Backwards compatibility. Apple’s update philosophy is a big advantage. Contrary to public belief, Apple does not force customers to buy new devices with each software release; instead, new releases bring fresh life to older devices that are still fast enough for another year of use (e.g. iPhone 5s or iPad Air 1). In fact, Apple continues to support devices that are nearly five years old.
Rinse Mallee, Enterprise Mobile Architect, Amsterdam, The Netherlands
Health Records API. Apple’s Health app will now be enriched with a Health Records API for developers. The Health Records API makes it possible to aggregate health records from multiple institutions alongside patient-generated data, creating a more holistic view of a user’s health; a short code sketch follows the list below. According to Apple, the Health Records API:
- Is built with the industry-standard HL7 FHIR
- Creates a seamless connection
- Is encrypted and secure
- Is designed to protect privacy
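For a sense of how this looks in practice, here is a minimal sketch, assuming an app with the HealthKit clinical records entitlement, that requests access to allergy records and queries them:

```swift
import HealthKit

// Minimal sketch: request read access to allergy records and query them.
let store = HKHealthStore()

if let allergyType = HKObjectType.clinicalType(forIdentifier: .allergyRecord) {
    store.requestAuthorization(toShare: nil, read: [allergyType]) { success, _ in
        // `success` means the request completed, not that access was granted;
        // clinical record authorization status is intentionally not inspectable.
        guard success else { return }

        let query = HKSampleQuery(sampleType: allergyType,
                                  predicate: nil,
                                  limit: HKObjectQueryNoLimit,
                                  sortDescriptors: nil) { _, samples, _ in
            for case let record as HKClinicalRecord in samples ?? [] {
                // Each record carries the underlying FHIR resource.
                print(record.displayName,
                      record.fhirResource?.resourceType.rawValue ?? "")
            }
        }
        store.execute(query)
    }
}
```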
Siri improvements. Siri Shortcuts will let any app expose quick actions to Siri. Through the Shortcuts app, end users will be able to create their own phrases that trigger a specific feature of an app. Siri will also proactively suggest a shortcut at a certain time of day based on prior usage, a sort of Siri AI. Shortcuts work across iOS, HomePod, and Apple Watch.
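The simplest way for an app to participate is to donate an activity to the system. A minimal sketch, where the activity type "com.example.coffee-order" is an illustrative identifier that would also need to be listed under NSUserActivityTypes in Info.plist:

```swift
import UIKit
import Intents

// Minimal sketch: donate a shortcut via NSUserActivity so Siri can
// suggest it and the user can attach a custom voice phrase to it.
func donateCoffeeOrderShortcut(from viewController: UIViewController) {
    let activity = NSUserActivity(activityType: "com.example.coffee-order")
    activity.title = "Order my usual coffee"
    activity.isEligibleForSearch = true
    activity.isEligibleForPrediction = true            // new in iOS 12
    activity.suggestedInvocationPhrase = "Coffee time"  // hint shown when recording a phrase

    // Attaching the activity to the visible view controller donates it.
    viewController.userActivity = activity
    activity.becomeCurrent()
}
```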
Group FaceTime. FaceTime will now support group chats with up to 32 people in a single call. How is this possible? Group FaceTime makes smart use of the screen by automatically enlarging participants’ tiles as they speak. The new Messages camera effects also carry over to FaceTime, so you will be able to add a Memoji layer over your face!
ARKit 2. ARKit 2 adds a number of new features, including:
- A new USDZ file format for storing AR content in a single file, which can be used across the web and multiple apps to display realistic AR objects. This feature has high potential for the retail market.
- A new multi-user mode that lets several devices view a single augmented world and act together within it (see the sketch after this list).
- 3D object recognition, which makes it possible to scan real-world objects and recognize them again later.
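The shared and persistent experiences are built on the new ARWorldMap type. A minimal sketch of capturing a world map on one device and restoring it on another; how the archived data travels between devices (for example via MultipeerConnectivity) is left out:

```swift
import ARKit

// Minimal sketch: capture the current world map for sharing or persistence.
func captureWorldMap(from session: ARSession,
                     completion: @escaping (Data?) -> Void) {
    session.getCurrentWorldMap { worldMap, _ in
        guard let map = worldMap else { completion(nil); return }
        // Archive the map so it can be stored or sent to another device.
        let data = try? NSKeyedArchiver.archivedData(withRootObject: map,
                                                     requiringSecureCoding: true)
        completion(data)
    }
}

// Minimal sketch: relocalize a second device into the same augmented world.
func restoreSession(on session: ARSession, from data: Data) {
    guard let map = try? NSKeyedUnarchiver.unarchivedObject(ofClass: ARWorldMap.self,
                                                            from: data) else { return }
    let configuration = ARWorldTrackingConfiguration()
    configuration.initialWorldMap = map
    session.run(configuration, options: [.resetTracking, .removeExistingAnchors])
}
```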
CoreML. Core ML will also be updated, not only with Create ML as a straightforward way to build and train models, but also with a new Natural Language framework for analyzing text and deducing its language-specific metadata.
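A minimal sketch of the new Natural Language framework, detecting the dominant language of a string and tagging its parts of speech:

```swift
import NaturalLanguage

let text = "WWDC brought a lot of exciting announcements this year."

// Language identification.
if let language = NLLanguageRecognizer.dominantLanguage(for: text) {
    print("Detected language: \(language.rawValue)")   // "en"
}

// Part-of-speech tagging.
let tagger = NLTagger(tagSchemes: [.lexicalClass])
tagger.string = text
tagger.enumerateTags(in: text.startIndex..<text.endIndex,
                     unit: .word,
                     scheme: .lexicalClass,
                     options: [.omitWhitespace, .omitPunctuation]) { tag, range in
    if let tag = tag {
        print("\(text[range]): \(tag.rawValue)")       // e.g. "WWDC: Noun"
    }
    return true
}
```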
Josh Mann, Senior Software Engineer, Denver, Colorado, U.S.
With Apple’s new updates, Siri is about to get a lot smarter in iOS 12 in three big ways: Siri Shortcuts, personalized shortcuts, and new workflow capabilities.
Siri shortcuts. The new Siri Shortcuts feature will automatically draw on calendar events, location, and common tasks, and will suggest shortcuts that automate them for you.
Personalized shortcuts. With personalized shortcuts, you will be able to perform tasks in third-party applications by creating your own shortcuts for them.
Workflow capabilities. Finally, using the new workflow capabilities, you will be able to arrange these new shortcuts to simplify tasks to a single voice command. One common use case is my morning commute. I’ll be able to use Siri to check my calendar for upcoming meetings, map my route to work to avoid traffic, start my audiobook for the ride, and order my coffee as I approach the office.
Yishai Roodyn, iOS Engineer, London, U.K.
With WWDC comes the new iOS 12, and I think there are three new features that will have a large impact.
Siri shortcuts. Apps will now be able to donate shortcuts to Siri; essentially, Apple has allowed developers to extend Siri to reach into their own apps. For instance, a restaurant app will be able to donate a shortcut that orders a takeaway, and this will appear as a Siri suggestion in Spotlight search, on the lock screen, and on the Siri watch face. Users will also be able to record custom voice phrases with Siri to activate these donated shortcuts.
AR USDZ files. The arrival of augmented reality is exciting to both consumers and businesses. Apple worked with several partners to create a new file format and editor. Creative designers will now be able to produce augmented reality-ready designs in a “what you see is what you get” editor. These can be easily added to an app or website, allowing users to see objects in their own surroundings. With this technology, you will be able to check out all angles of a new pair of trainers or place new furniture in the actual room it will be in.
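On iOS, showing such a model can be as simple as handing a USDZ file to AR Quick Look. A minimal sketch, where "sneaker.usdz" is a placeholder asset assumed to be bundled with the app:

```swift
import UIKit
import QuickLook

// Minimal sketch: present a bundled USDZ model with AR Quick Look.
// Keep a strong reference to this object while the preview is on screen.
final class ProductARViewer: NSObject, QLPreviewControllerDataSource {
    func present(from viewController: UIViewController) {
        let preview = QLPreviewController()
        preview.dataSource = self
        viewController.present(preview, animated: true)
    }

    func numberOfPreviewItems(in controller: QLPreviewController) -> Int { 1 }

    func previewController(_ controller: QLPreviewController,
                           previewItemAt index: Int) -> QLPreviewItem {
        // AR Quick Look lets the user place the model in their surroundings.
        return Bundle.main.url(forResource: "sneaker", withExtension: "usdz")! as NSURL
    }
}
```

On the web, the same file can be linked with rel="ar" so Safari on iOS 12 opens it in AR Quick Look directly.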
CreateML. Machine learning allows our computers to become even more powerful tools than they already are. Whereas this was previously an expensive and difficult task within the remit of academics and domain experts, Apple is now opening it up to all developers. The Create ML tool takes a dataset and produces an easily usable model. This means we can add custom machine learning to our apps to help with tasks like image recognition and natural language processing.
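A minimal sketch of training an image classifier with Create ML in a macOS Swift playground; the folder paths are placeholders, and each subfolder of the training directory is expected to be named after its label:

```swift
import CreateML
import Foundation

// Minimal sketch: train an image classifier from labeled folders of images.
let trainingURL = URL(fileURLWithPath: "/Users/me/Datasets/TrainingImages")

let classifier = try MLImageClassifier(trainingData: .labeledDirectories(at: trainingURL))

// Export a Core ML model that can be dropped straight into an Xcode project.
try classifier.write(to: URL(fileURLWithPath: "/Users/me/Datasets/Sneakers.mlmodel"))
```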
Jordan Stone, Senior Architect, Denver, Colorado, U.S.
Apple has seemingly doubled down on its commitment to lead the charge in the health care space. iOS 11.3 brought Apple Health Records, which supported 12 hospital systems and was contained inside Apple’s private ecosystem. Now boasting over 500 hospital partners and an entire API, Apple Health Records could revolutionize not only how developers use and build upon previously inaccessible kinds of health information, but also how users view, interact with, and act on their own health data.
Apple Health Records supports only the FHIR Argonaut Implementation Guide, and some data types in the Argonaut spec may not make it into iOS 12. Data is made available in JSON format, so for anyone who has worked with Electronic Health Records before, this should feel familiar but be much easier to interact with.
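To make that concrete, here is a minimal sketch of reading the raw FHIR JSON attached to a record, assuming `record` is an HKClinicalRecord returned by a query like the one shown earlier in this roundup:

```swift
import HealthKit

// Minimal sketch: inspect the FHIR JSON payload of a clinical record.
func printFHIRPayload(for record: HKClinicalRecord) {
    guard let resource = record.fhirResource else { return }

    // The payload is plain JSON conforming to the FHIR Argonaut profiles.
    if let json = (try? JSONSerialization.jsonObject(with: resource.data)) as? [String: Any] {
        print("Resource type:", resource.resourceType.rawValue)
        print("Identifier:", resource.identifier)
        print("Keys:", json.keys.sorted())
    }
}
```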
In a lot of ways, Health Records behaves similarly to HealthKit. In typical Apple fashion, Health Records data is stored and encrypted locally on the device. However, unlike HealthKit, Health Records connections are not shared across devices, and there is a longer, three-step process required to grant access to third-party apps. Apple did this intentionally to make sure users are fully aware of the kinds of data they are agreeing to share.
We are working with Apple to better understand this new API and have already begun building against it in the iOS 12 beta for clients in the health care space. We are excited to combine the power of our Apple alliance, our deep health care experience, and our commitment to user-centered design to bring innovative solutions to our clients and to improve the lives of their customers.
Andreas Tielmann, iOS Engineer, Duesseldorf, Germany
For me, the WWDC 2018 announcements with the most impact are:
ARKit 2. This version greatly improves upon the initial ARKit announcement from last year and adds features to make it more powerful and easier to use. For example, AR experiences can now be persisted and shared between multiple users with different devices.
AR USDZ files. The new Universal Scene Description format "USDZ" ties into these new capabilities and allows our clients to easily embed 3D AR objects into their mobile apps and websites.
New framework. Another exciting announcement was the sneak peek Apple gave into a framework that will allow us to easily port iOS apps to run on the Mac. However, this framework is not yet available to developers and is currently only used internally at Apple.
