Earlier in September, we saw the roll-out of iOS 11, which put augmented reality (AR) into the hands of millions of users. With its release, we've been thinking about creative ways Foursquare's location intelligence can help you build experiences that wow your users. Before diving into a demo, let's discuss AR and ARKit in a bit more depth.
"Augmented reality is the enhancement of a real-time view of one's surroundings by overlaying computer-generated imagery on top of it."
How does ARKit work?
ARKit uses a technique called Visual Inertial Odometry (VIO), combined with some 2D plane detection. VIO means that the software tracks your position in space in real time. It does this by estimating your pose, fusing data from the camera system with readings from the accelerometer and gyroscope (via Core Motion). Just as the odometer in your car tracks the distance the car has traveled, the VIO system tracks the distance your iPhone has traveled, with six degrees of freedom (6DoF): 3D of xyz motion (translation), plus 3D of pitch/yaw/roll (rotation).
One of the core features of ARKit is plane detection. You need it so you have somewhere, such as the ground or a tabletop, to anchor your content; otherwise it would appear to float awkwardly in space. Planes are calculated from the feature points detected by the optical system.
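To make this concrete, here is a minimal sketch of enabling world tracking (which drives the VIO system) with horizontal plane detection. It assumes an `ARSCNView` named `sceneView` wired up in the storyboard; the class and method names beyond ARKit's own API are illustrative.

```swift
import ARKit

class ARViewController: UIViewController, ARSCNViewDelegate {
    @IBOutlet var sceneView: ARSCNView!

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        // World tracking enables 6DoF VIO; plane detection is opt-in.
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = .horizontal
        sceneView.session.run(configuration)
    }

    // ARKit calls this when it detects a new horizontal plane,
    // delivered as an ARPlaneAnchor.
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard let planeAnchor = anchor as? ARPlaneAnchor else { return }
        print("Detected plane with extent \(planeAnchor.extent)")
    }
}
```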
Now that we know more about AR and ARKit, let's dig into a quick demo:
To get you started, we've built an ARKit-powered app that demonstrates the power of location intelligence. In the example detailed below, you'll be able to move your phone around to view nearby places and their proximity to you using the Foursquare Places API. We'd also encourage you to look at our Pilgrim SDK, which lets you understand, communicate with, and engage users in real time.
git clone firstname.lastname@example.org:garethpaul/foursquare-ar-camera-ios.git
Once you have cloned the repo and added your API keys, you can build and run on any iOS device. We recommend using a physical device, since ARKit does not run in the iOS Simulator.
Before we begin…
Building ARKit + Foursquare
At a high level, the example performs three main functions:
- Determine location
- Find some places
- Add Places to AR
Step 1 — Determine Location
We use Core Location to determine the device's basic location from sensor data. The framework draws on all the available onboard hardware, including Wi-Fi, GPS, Bluetooth, magnetometer, barometer, and cellular radios, to gather data.
The LocationManager class conforms to CLLocationManagerDelegate and handles retrieving the location and direction from CoreLocation.
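A minimal sketch of such a LocationManager is below; the property names are illustrative rather than the repo's exact API.

```swift
import CoreLocation

// Wraps CLLocationManager and keeps the latest location and heading.
class LocationManager: NSObject, CLLocationManagerDelegate {
    private let manager = CLLocationManager()
    var currentLocation: CLLocation?
    var currentHeading: CLHeading?

    override init() {
        super.init()
        manager.delegate = self
        manager.desiredAccuracy = kCLLocationAccuracyBest
        manager.requestWhenInUseAuthorization()
        manager.startUpdatingLocation()
        manager.startUpdatingHeading()
    }

    func locationManager(_ manager: CLLocationManager,
                         didUpdateLocations locations: [CLLocation]) {
        currentLocation = locations.last
    }

    func locationManager(_ manager: CLLocationManager,
                         didUpdateHeading newHeading: CLHeading) {
        currentHeading = newHeading
    }
}
```

Remember that requesting authorization also requires the `NSLocationWhenInUseUsageDescription` key in your Info.plist.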
In our example, the main ViewController conforms to SceneLocationViewDelegate. Delegation is a simple and powerful pattern in which one object acts on behalf of, or in coordination with, another. The delegating object keeps a reference to its delegate. The main value of delegation here is that it lets us customize the AR behavior of several objects in one central object.
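The wiring looks roughly like this; the callback shown is illustrative, as the real SceneLocationViewDelegate in the repo declares its own set of methods.

```swift
import SceneKit
import UIKit

// Illustrative delegate protocol; the repo's SceneLocationViewDelegate
// defines its own callbacks.
protocol SceneLocationViewDelegate: class {
    func sceneLocationViewDidSetupSceneNode(sceneLocationView: SceneLocationView,
                                            sceneNode: SCNNode)
}

class ViewController: UIViewController, SceneLocationViewDelegate {
    let sceneLocationView = SceneLocationView()

    override func viewDidLoad() {
        super.viewDidLoad()
        // The view keeps a reference back to us, its delegate.
        sceneLocationView.locationDelegate = self
        view.addSubview(sceneLocationView)
    }

    // One central place to customize AR behavior for every node.
    func sceneLocationViewDidSetupSceneNode(sceneLocationView: SceneLocationView,
                                            sceneNode: SCNNode) {
        // e.g. adjust scale, animations, or materials here.
    }
}
```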
Step 2 — Find Popular Places
Once the view has loaded, we use Foursquare's Places API to find venues near the location reported by the SceneLocation's LocationManager. For simplicity we've added this to the main ViewController, but we'd recommend creating a separate class for RESTful services (model and controller).
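A sketch of such a request against the Places API `venues/search` endpoint is below. `CLIENT_ID` and `CLIENT_SECRET` are placeholders for your own API keys, and the completion handler's shape is illustrative.

```swift
import CoreLocation
import Foundation

// Searches for venues near the given location via the Foursquare
// v2 venues/search endpoint and returns the raw venue dictionaries.
func getFoursquareLocations(for location: CLLocation,
                            completion: @escaping ([[String: Any]]) -> Void) {
    var components = URLComponents(string: "https://api.foursquare.com/v2/venues/search")!
    components.queryItems = [
        URLQueryItem(name: "ll",
                     value: "\(location.coordinate.latitude),\(location.coordinate.longitude)"),
        URLQueryItem(name: "client_id", value: "CLIENT_ID"),
        URLQueryItem(name: "client_secret", value: "CLIENT_SECRET"),
        URLQueryItem(name: "v", value: "20170901")  // API version date
    ]

    URLSession.shared.dataTask(with: components.url!) { data, _, error in
        guard let data = data, error == nil,
              let json = (try? JSONSerialization.jsonObject(with: data)) as? [String: Any],
              let response = json["response"] as? [String: Any],
              let venues = response["venues"] as? [[String: Any]] else {
            completion([])
            return
        }
        completion(venues)
    }.resume()
}
```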
Step 3 — Add Places to AR
Once we have our list of locations from `getFoursquareLocations`, we can add a LocationNode (an SCNNode subclass) for each venue or object. We also add the locations to a compass to help provide spatial awareness within the AR environment.
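Turning one venue into an AR node can be sketched as follows; this assumes the ARCL-style `SceneLocationView`/`LocationAnnotationNode` types used in the repo, and a pin image named "pin" in the asset catalog (both assumptions).

```swift
import CoreLocation
import UIKit

// Places a single venue in the AR scene as an annotated node.
func addVenueToScene(latitude: Double, longitude: Double) {
    let coordinate = CLLocationCoordinate2D(latitude: latitude,
                                            longitude: longitude)
    // Altitude is illustrative; raising the node keeps it above eye level.
    let location = CLLocation(coordinate: coordinate, altitude: 50)
    guard let image = UIImage(named: "pin") else { return }

    let annotationNode = LocationAnnotationNode(location: location, image: image)
    sceneLocationView.addLocationNodeWithConfirmedLocation(locationNode: annotationNode)
}
```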
After completing these steps we have a fun way to find a venue and customize experiences based on location and proximity.
We look forward to seeing many more location-powered AR apps in the future and are here to help. Given the complexity of all these steps, we'd encourage you to check out the GitHub repo for this project. Finally, feel free to reach out to us @FoursquareAPI or via Stack Overflow for questions, tips, and tricks!