ARKit provides a way to situate stories in the viewer’s real environment using augmented reality (AR). More specifically, it is a suite of developer tools from Apple (first released in 2017) that simplifies the creation of AR apps for iOS. These apps currently run on the iPhone and iPad, with some features that require more advanced sensors available only on the latest Apple devices. ARKit includes a software development kit (SDK) that streamlines AR development by providing templates with boilerplate code for different kinds of scenes, as well as a high-level API that lets programmers use features such as face tracking without needing to understand the nitty-gritty details. ARKit is available for free as part of Apple’s Xcode.
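To give a sense of how high-level the API is, here is a minimal sketch of face tracking in Swift; the class and method names (`ARSCNView`, `ARFaceTrackingConfiguration`, the `ARSCNViewDelegate` callback) come from ARKit’s public API, while the view controller itself is an illustrative assumption, not code from the original text:

```swift
import ARKit

// A minimal face-tracking sketch: a few lines of setup are enough,
// because ARKit handles camera access, tracking, and anchor detection.
class FaceTrackingViewController: UIViewController, ARSCNViewDelegate {
    let sceneView = ARSCNView()

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.frame = view.bounds
        view.addSubview(sceneView)
        sceneView.delegate = self
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        // Face tracking requires a TrueDepth front camera, so check support first.
        guard ARFaceTrackingConfiguration.isSupported else { return }
        sceneView.session.run(ARFaceTrackingConfiguration())
    }

    // ARKit calls this when it detects a face and adds an anchor for it.
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard anchor is ARFaceAnchor else { return }
        // Attach geometry or an overlay to the tracked face here.
    }
}
```

The developer never touches the underlying computer-vision pipeline; the configuration object and delegate callback are the whole interface.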
Fundamentally, ARKit offers the two key capabilities needed to produce AR apps: motion tracking and scene understanding. Tracking determines where the device, such as an iPhone, is in real time so that the AR object can be displayed from the correct perspective. Scene understanding means analyzing the real environment the AR object will be placed in, including detecting surfaces such as floors and tables and estimating the direction of lighting. Both are key to placing and lighting the AR object in a realistic manner. Apple continues to improve ARKit, which now includes features such as face tracking, depth information (for devices that include a LiDAR scanner), and collaborative sessions for shared AR experiences.
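In code, both capabilities are switched on through a single configuration object. The fragment below is a hedged sketch using ARKit’s public API (`ARWorldTrackingConfiguration`, `ARPlaneAnchor`); it assumes an existing `sceneView` of type `ARSCNView` whose delegate implements the callback shown:

```swift
import ARKit

// Enable scene understanding on a world-tracking session.
let configuration = ARWorldTrackingConfiguration()
configuration.planeDetection = [.horizontal, .vertical]  // find floors, tables, walls
configuration.isLightEstimationEnabled = true            // estimate real-world lighting

// Running the session starts motion tracking; ARKit then delivers
// an ARPlaneAnchor to the delegate for each surface it discovers.
sceneView.session.run(configuration)

// Delegate callback: each detected surface arrives as an ARPlaneAnchor,
// giving the app a place to set AR objects down realistically.
func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
    guard let plane = anchor as? ARPlaneAnchor else { return }
    print("Detected a \(plane.alignment) plane at \(plane.center)")
}
```

The same session that tracks the device’s position also reports the detected planes and estimated lighting, which is why the two capabilities work together to anchor and shade virtual objects convincingly.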
Storytellers might use ARKit to create 3D objects that viewers can interact with using their iPhones. These AR objects can also be tied to geographic locations, so viewers moving through a physical space can discover them by scanning different spots with their iPhones (much like Pokémon Go). Read more about how Quartz is using ARKit to “help people understand objects in the news” such as spacecraft.
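Tying an AR object to a place can be done with ARKit’s geotracking APIs (`ARGeoTrackingConfiguration` and `ARGeoAnchor`, added in ARKit 4 and available only on supported devices and in supported cities). The sketch below uses those real APIs, but the specific coordinate is a placeholder chosen for illustration, not a location from the original text:

```swift
import ARKit
import CoreLocation

// Sketch: pin AR content to a real-world latitude/longitude.
let session = ARSession()

// Geotracking is limited to certain devices and regions, so check first.
ARGeoTrackingConfiguration.checkAvailability { available, error in
    guard available else { return }
    session.run(ARGeoTrackingConfiguration())

    // Placeholder coordinate; ARKit will place the anchor in the scene
    // when a viewer points a device at that real-world spot.
    let coordinate = CLLocationCoordinate2D(latitude: 40.7484, longitude: -73.9857)
    session.add(anchor: ARGeoAnchor(coordinate: coordinate))
}
```

A storyteller could scatter anchors like this across a neighborhood so that viewers discover each object only by physically visiting its location.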