In the first GolfAR augmented reality blog post (read it here), I discussed why augmented reality is coming into its own and highlighted the GolfAR app I was building to learn more about the technology involved.
Learning the latest tools from Apple helped me build an experience that is interactive and fun. SwiftUI, ARKit 5, RealityKit 2, and Reality Composer were all used in the build. Continue reading to understand the role each of these technologies plays and why you should add them to your tool belt.
SwiftUI

SwiftUI is Apple’s UI framework for building amazing applications across every platform Apple provides. From tvOS and macOS to iOS and iPadOS, SwiftUI’s declarative syntax is not only enjoyable to write but super powerful under the covers.
Using the MVVM pattern alongside SwiftUI, the interface and the code that connects data to the interface were easy to document and organize. The ability to implement the UIViewRepresentable protocol and assign a Coordinator to the class allows extreme flexibility without needing to break away from the SwiftUI approach. Many of the views within GolfAR use this pattern to keep a clean, maintainable code base that is decoupled and extensible.
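To make the pattern concrete, here is a minimal sketch (not GolfAR's actual source) of wrapping RealityKit's ARView for SwiftUI with UIViewRepresentable. The Coordinator receives UIKit callbacks, such as tap gestures, so SwiftUI state can react to them without leaving the declarative world:

```swift
import SwiftUI
import UIKit
import RealityKit

// Hypothetical wrapper: exposes RealityKit's ARView to a SwiftUI view tree.
struct ARViewContainer: UIViewRepresentable {
    func makeUIView(context: Context) -> ARView {
        let arView = ARView(frame: .zero)
        // Route taps through the Coordinator so app logic can respond.
        let tap = UITapGestureRecognizer(target: context.coordinator,
                                         action: #selector(Coordinator.handleTap(_:)))
        arView.addGestureRecognizer(tap)
        context.coordinator.arView = arView
        return arView
    }

    func updateUIView(_ uiView: ARView, context: Context) {
        // Push SwiftUI state changes into the view here if needed.
    }

    func makeCoordinator() -> Coordinator { Coordinator() }

    class Coordinator: NSObject {
        weak var arView: ARView?

        @objc func handleTap(_ sender: UITapGestureRecognizer) {
            // A real app would ray-cast from the tap location and place
            // an entity; omitted to keep the sketch short.
        }
    }
}
```

Because the Coordinator owns the UIKit-facing logic, the SwiftUI layer stays declarative and the view model never has to know about gesture recognizers.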
ARKit 5

ARKit was initially released in 2017 and hit the market with a splash. Apple gave its developers an amazing API for building immersive, interactive applications that blend the real world with your imagination. Each year has brought improvement after improvement to ARKit, and now, with ARKit 5, it’s easier than ever to get started in the AR space.
ARKit takes advantage of the hardware in your phone to understand your surroundings and properly position and light the objects you place in view. ARKit is the backbone of GolfAR’s ability to detect horizontal planes, place objects (ball markers) on those planes, and calculate the distance between AR anchors in the world around you. GolfAR uses the ARWorldTrackingConfiguration to detect the scene and analyze how objects should interact. This takes only a few lines of code, leaving more time to focus on the experience the user interacts with.
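The configuration itself really is a few lines: create an ARWorldTrackingConfiguration, enable horizontal plane detection, and run it on the view's session. The distance calculation is then plain geometry, since each anchor's transform carries a world-space translation in meters. Here is a framework-free sketch of that math (the types and names are hypothetical, not GolfAR's source):

```swift
import Foundation

// Each ARAnchor's 4x4 transform holds its world-space translation in the
// fourth column. Modeling just that translation lets us compute
// anchor-to-anchor distance without any ARKit dependency.
struct WorldPosition {
    var x: Float
    var y: Float
    var z: Float

    // Euclidean distance between two positions, in meters
    // (ARKit's world coordinates are metric).
    func distance(to other: WorldPosition) -> Float {
        let dx = x - other.x
        let dy = y - other.y
        let dz = z - other.z
        return (dx * dx + dy * dy + dz * dz).squareRoot()
    }
}

// Example: a ball marker 4 m to the right and 3 m in front of the tee
// (ARKit's -z axis points away from the camera).
let tee = WorldPosition(x: 0, y: 0, z: 0)
let marker = WorldPosition(x: 4, y: 0, z: -3)
print(tee.distance(to: marker)) // prints 5.0
```

In the app itself, those positions would come from each anchor's transform rather than hand-written literals; the distance math is identical.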
RealityKit 2 and Reality Composer
RealityKit hit the scene in 2019 and, alongside Reality Composer, offered one of the easiest ways to build 3D objects and work with ARKit to seamlessly integrate the virtual objects you create into the real world.
RealityKit 2 brings an essential class to your development toolbox: ARView. ARView integrates ARKit and MetalKit capabilities, supports many new 3D rendering formats, and offers an easy-to-approach API that brings the power of animation, gestures, spatial audio, and more to your AR apps.
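The gesture support in particular takes very little code. This sketch (again an illustrative stand-in, not GolfAR's source) anchors a simple sphere to a detected horizontal plane and turns on RealityKit's built-in move, rotate, and scale gestures:

```swift
import RealityKit

// Assumes an iOS device running an AR session.
let arView = ARView(frame: .zero)

// A simple sphere as a stand-in for a ball marker. Collision shapes are
// required for the entity to receive gestures.
let marker = ModelEntity(mesh: .generateSphere(radius: 0.02),
                         materials: [SimpleMaterial(color: .white, isMetallic: false)])
marker.generateCollisionShapes(recursive: true)

// Anchor the marker to the first horizontal plane ARKit finds.
let anchor = AnchorEntity(plane: .horizontal)
anchor.addChild(marker)
arView.scene.addAnchor(anchor)

// Built-in gestures: drag to move, two-finger twist to rotate, pinch to scale.
arView.installGestures([.translation, .rotation, .scale], for: marker)
```

That is the whole interaction layer; RealityKit handles hit testing, plane tracking, and gesture math behind the scenes.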
Reality Composer is a standalone application Apple developed for anyone who needs an easy-to-use tool to build 3D objects and use those creations within RealityKit. It’s simple yet powerful, and really approachable even for the non-artist. In GolfAR, all of the ball markers you interact with were designed in Reality Composer.
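Bringing those creations into code is straightforward: when you add a Reality Composer project to an Xcode target, Xcode generates a Swift loader for each scene. A sketch of that flow, assuming a hypothetical project file named "Experience.rcproject" containing a scene called "BallMarker" (both names are placeholders, not GolfAR's actual assets):

```swift
import RealityKit

// `Experience.loadBallMarker()` is the loader Xcode generates for a scene
// named "BallMarker" in an "Experience.rcproject" file.
let arView = ARView(frame: .zero)
do {
    let markerScene = try Experience.loadBallMarker()
    arView.scene.anchors.append(markerScene)
} catch {
    print("Failed to load Reality Composer scene: \(error)")
}
```

Any behaviors you author in Reality Composer, such as tap triggers or animations, come along with the loaded scene for free.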
As you can see, there is a lot of technology packed into such a simple application. Knowledge of Apple’s ecosystem of tools and APIs allows us to build some amazing things, from simple ball marker apps to powerful AR experiences in healthcare, construction, and retail. We are at the forefront of this technology, and I expect we will soon be experiencing AR as a normal part of life.
These experiments are important in ensuring that our current and future clients can take advantage of these technologies in their custom applications. Learning more about this ecosystem allows Airship to provide a deeper level of creativity to the problems our clients bring us every day. In newer versions of GolfAR you will see how we track body movement and use Location Anchors to provide even more value to the golfers and AR enthusiasts out there.