Augmented Reality iOS Tutorial: Location Based
In this augmented reality tutorial, you’ll learn how to use your iOS users location to create compelling augmented reality experiences. By Jean-Pierre Distler.
Warning, Math Inside!
You're still here, so you want to learn more about the math behind `HDAugmentedReality`. That's great! Be warned, however, that it's a bit more complicated than standard arithmetic. In the following examples, we assume that there are two given points, A and B, that hold the coordinates of a specific point on the earth.
A point’s coordinates consist of two values: longitude and latitude. These are the geographic names for the x- and y-values of a point in the 2D Cartesian system.
- Longitude specifies whether a point is east or west of the reference meridian in Greenwich, England. Values range from -180° to +180°.
- Latitude specifies whether a point is north or south of the equator. Values range from 90° at the north pole to -90° at the south pole.
If you have a look at a standard globe, you'll see lines of longitude that run from pole to pole – these are also known as meridians. You'll also see lines of latitude that circle the globe – these are called parallels. You can read in geography books that the distance between two parallels is around 111 km, and the distance between two meridians is also around 111 km.
There are 360 meridians, one for every degree of longitude, and 180 parallels. With this in mind, you can calculate the distance between two points on the globe: multiply the difference in latitude by 111 km for one side, and the difference in longitude by 111 km for the other.
This gives you the distances for latitude and longitude, which are two sides of a right triangle. Using the Pythagorean theorem, you can now calculate the hypotenuse of the triangle to find the distance between the two points:
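As a sketch, this flat-earth approach looks like the following in Swift (the helper name `naiveDistance` is made up for illustration and is not part of `HDAugmentedReality`):

```swift
import Foundation

// Naive flat-earth distance: treats every degree of latitude and
// longitude as roughly 111 km, then applies the Pythagorean theorem.
func naiveDistance(latA: Double, lonA: Double,
                   latB: Double, lonB: Double) -> Double {
    let kmPerDegree = 111.0
    let distanceLat = (latA - latB) * kmPerDegree   // one side of the right triangle
    let distanceLon = (lonA - lonB) * kmPerDegree   // the other side
    // The hypotenuse is the (approximate) distance between the two points.
    return sqrt(distanceLat * distanceLat + distanceLon * distanceLon)
}
```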
That’s quite easy but unfortunately, it's also wrong.
If you look again at your globe, you’ll see that the distance between the parallels is almost equal, but the meridians meet at the poles. So the distance between meridians shrinks when you come closer to the poles, and is zero on the poles. This means the formula above works only for points near the equator. The closer the points are to the poles, the bigger the error becomes.
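You can see how quickly a degree of longitude shrinks toward the poles with a small sketch; the width of one degree falls off with the cosine of the latitude (again, the helper name is made up for illustration):

```swift
import Foundation

// One degree of longitude spans roughly 111 km at the equator, shrinks
// with the cosine of the latitude, and reaches zero at the poles.
func kmPerDegreeOfLongitude(atLatitude latitude: Double) -> Double {
    let kmPerDegreeAtEquator = 111.0
    return kmPerDegreeAtEquator * cos(latitude * .pi / 180.0)
}
```

At 60° latitude a degree of longitude is already only about half as wide as at the equator, which is why the flat-earth formula's error grows away from the equator.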
To calculate the distance more precisely, you can determine the great-circle distance. This is the distance between two points on a sphere and, as we all know, the earth is a sphere. Well OK, it is nearly a sphere, but this method gives you good results. With a known latitude and longitude for two points, you can use the following formula to calculate the great-circle distance.
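One common form of the great-circle formula is the spherical law of cosines; assuming that is the variant intended here, a Swift sketch looks like this (the earth radius and helper name are illustrative, not library code):

```swift
import Foundation

// Great-circle distance via the spherical law of cosines, assuming a
// spherical earth with mean radius 6,371 km.
func greatCircleDistance(latA: Double, lonA: Double,
                         latB: Double, lonB: Double) -> Double {
    let earthRadiusKm = 6371.0
    let toRadians = Double.pi / 180.0
    let phi1 = latA * toRadians
    let phi2 = latB * toRadians
    let deltaLambda = (lonB - lonA) * toRadians

    // Cosine of the central angle between the two points on the sphere.
    let cosAngle = sin(phi1) * sin(phi2) + cos(phi1) * cos(phi2) * cos(deltaLambda)
    // Clamp to [-1, 1] to guard against floating-point drift before acos.
    return earthRadiusKm * acos(max(-1.0, min(1.0, cosAngle)))
}
```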
This formula gives you the distance between two points with an accuracy of around 60 km, which is quite good if you want to know how far Tokyo is from New York. For points closer together, the result will be much better.
Phew – that was hard stuff! The good news is that `CLLocation` has a method, `distance(from:)`, that does this calculation for you. `HDAugmentedReality` also uses this method.
Why HDAugmentedReality
You may be thinking to yourself, "Meh, I still don't see why I should use `HDAugmentedReality`." It's true: grabbing frames and showing them is not that hard, and you can read about it on this site. You can calculate the distance between points with a method from `CLLocation` without much pain.
So why did I introduce this library? The problem comes when you need to calculate where to show the overlay for a POI on the screen. Assume you have a POI that is to the north of you and your device is pointing to the northeast. Where should you show the POI – centered or to the left side? At the top or bottom?
It all depends on the current orientation of the device. If the device is pointing a little toward the ground, you must show the POI nearer to the top of the screen. If it's pointing to the south, you should not show the POI at all. This could get complicated quickly!
And that's where `HDAugmentedReality` is most useful. It grabs all the information it needs from the gyroscope and compass and calculates where the device is pointing and its degree of tilt. Using this knowledge, it decides if and where a POI should be displayed on the screen.
Plus, without needing to worry about showing live video and doing complicated and error-prone math, you can concentrate on writing a great app your users will enjoy using.
Start Coding
Now have a quick look at the files inside the HDAugmentedReality\Classes group:
- `ARAnnotation`: This class is used to define a POI.
- `ARAnnotationView`: This is used to provide a view for a POI.
- `ARConfiguration`: This is used to provide some basic configuration and helper methods.
- `ARTrackingManager`: This is where the hard work is done. Luckily, you don't have to deal with it.
- `ARViewController`: This controller does all the visual things for you. It shows a live video and adds markers to the view.
Setting Up the AR View
Open ViewController.swift and add another property below the `places` property:

```swift
fileprivate var arViewController: ARViewController!
```
Now find `@IBAction func showARController(_ sender: Any)` and add the following to the body of the method:
```swift
arViewController = ARViewController()
//1
arViewController.dataSource = self
//2
arViewController.maxVisibleAnnotations = 30
arViewController.headingSmoothingFactor = 0.05
//3
arViewController.setAnnotations(places)
self.present(arViewController, animated: true, completion: nil)
```
1. First, the `dataSource` for the `arViewController` is set. The `dataSource` provides views for visible POIs.
2. This is some fine-tuning for the `arViewController`. `maxVisibleAnnotations` defines how many views are visible at the same time. To keep everything smooth, you use a value of 30, but this also means that if you live in an exciting area with lots of POIs around you, maybe not all of them will be shown. `headingSmoothingFactor` is used to move the views for the POIs about the screen. A value of 1 means there is no smoothing, and views may jump from one position to another as you turn your iPhone. Lower values mean the movement is animated, but the views may lag a bit behind the device's movement. You should play with this value to find a good compromise between smooth movement and responsiveness.
3. This presents the `arViewController`.
You should have a look at ARViewController.swift for some more properties, like `maxDistance`, which defines a range in meters within which views are shown; everything beyond this distance will not be shown.
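For example, assuming `maxDistance` works as described above, you could limit the view to nearby POIs with one extra line in `showARController(_:)` (the 500-meter value is just an illustration):

```swift
// Only annotations within 500 meters of the user will get a view.
arViewController.maxDistance = 500
```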