Augmented reality wayfinding

14 Dec. 20

What is Augmented Reality?

Augmented reality takes its name from the word "augment", which means to enhance something by adding to it.

Augmented reality adds 2D or 3D objects to the live view coming from the device's camera.

Augmented reality is commonly used indoors, for example for indoor path mapping, interior design, wayfinding in airports and hospitals, and games.

Some popular applications that use AR include Lenskart, Pokémon GO, Real Strike and many more.

You can create many kinds of AR applications using either the front or rear camera of the device.

ARKit is supported on iPhone 6s and later devices, i.e. those with the A9 chip or newer. Please note that ARKit is not available in the Xcode Simulator.
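Because of this hardware requirement, it is worth checking support at runtime before presenting any AR screen. A minimal sketch (the helper name is ours, not from the original project):

```swift
import ARKit

/// Returns true when this device can run world-tracking AR.
/// ARWorldTrackingConfiguration.isSupported is false on devices
/// older than the A9 chip, and in the Xcode Simulator.
func canRunAR() -> Bool {
    return ARWorldTrackingConfiguration.isSupported
}
```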

What is ARNode?

An AR node is used to add objects to the live view. It is positioned along the x, y and z axes.

We need to set all three axes for any object. We can also attach multiple child nodes to a parent node.

For example, if you are creating a car game, you can add the tyres and seats as children of the parent car node.
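To illustrate the parent/child idea, here is a minimal SceneKit sketch; the car and tyre nodes and all sizes are illustrative, not taken from the original project:

```swift
import SceneKit

// Parent node, positioned with explicit x/y/z values (in metres).
let carNode = SCNNode()
carNode.position = SCNVector3(x: 0, y: 0, z: -1)

// Child node: its position is relative to the parent,
// so moving carNode moves the tyre along with it.
let tyreNode = SCNNode(geometry: SCNCylinder(radius: 0.1, height: 0.05))
tyreNode.position = SCNVector3(x: -0.3, y: -0.2, z: 0)
carNode.addChildNode(tyreNode)
```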

To implement AR on iOS, we need to import the ARKit framework into the project.

Step 1:

Initialize CocoaPods for the project by running the commands below in the terminal.

pod init
open Podfile

Add the following two pods to the Podfile.

pod 'GoogleMaps'
pod 'GooglePlaces'

Save the Podfile and run this command in the terminal.

pod install

Then open your project's .xcworkspace file.

Step 2:

Add the required usage-description keys to the Info.plist file.
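The original post does not list the keys themselves. An app that uses both the camera (for AR) and location (for the map) typically needs at least the following usage-description entries; the description strings here are placeholders:

```xml
<key>NSCameraUsageDescription</key>
<string>The camera is used for the AR navigation view.</string>
<key>NSLocationWhenInUseUsageDescription</key>
<string>Your location is used to draw the route on the map.</string>
```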


Step 3:

Add the Swift packages used later in this tutorial (FocusNode and SmartHitTest) under Project target -> Swift Packages.

Step 4:

Create a Google API key and add it to the project as a global constant.
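The later snippets reference a `GOOGLE_APIKEY` constant. One simple way to make the key available globally is a top-level constant; the value here is a placeholder:

```swift
// Constants.swift
// A top-level constant is visible throughout the module.
let GOOGLE_APIKEY = "YOUR_GOOGLE_API_KEY"
```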

Step 5:

Set the Google API key when the app launches by adding the code below to AppDelegate.swift.

import GoogleMaps
import GooglePlaces

func application(
    _ application: UIApplication,
    didFinishLaunchingWithOptions launchOptions: [UIApplication.LaunchOptionsKey: Any]?
) -> Bool {
    GMSServices.provideAPIKey(GOOGLE_APIKEY)
    GMSPlacesClient.provideAPIKey(GOOGLE_APIKEY)
    return true
}

Step 6:

Add a UIView and a button to the ViewController in the storyboard, and set the UIView's class to GMSMapView.

Step 7:

Create a MapViewController.swift file and assign it to the view controller in the storyboard. We will set up the Google Maps related functionality in this file.

import GoogleMaps
import GooglePlaces

class MapViewController: UIViewController {
    @IBOutlet weak var mapView: GMSMapView!

    let locationManager = CLLocationManager()
    var currentLocation: CLLocation?
    var routeCoordinates = [CLLocationCoordinate2D]()
    var resultsViewController: GMSAutocompleteResultsViewController?
    var searchController: UISearchController?
    var resultView: UITextView?
    var selectedPin: GMSMarker?

    // MARK: Go button action
    @IBAction func navigateButton(_ sender: Any) {
        let vc = ViewController()
        self.navigationController?.pushViewController(vc, animated: true)
    }

Step 8:

Set up the map view when the view controller loads.

override func viewDidLoad() {
    super.viewDidLoad()
    // Configure location manager
    locationManager.delegate = self
    locationManager.desiredAccuracy = kCLLocationAccuracyBest
    locationManager.requestWhenInUseAuthorization()
    // Configure mapView
    mapView.delegate = self
    mapView.isUserInteractionEnabled = true
    // Start with a default camera; it is replaced once the user's location arrives
    let camera = GMSCameraPosition.camera(withLatitude: 0.01, longitude: 0.01, zoom: 15) = camera
    // Hide back button
    self.navigationItem.setHidesBackButton(true, animated: false)
    resultsViewController = GMSAutocompleteResultsViewController()
    resultsViewController?.delegate = self
    searchController = UISearchController(searchResultsController: resultsViewController)
    searchController?.searchResultsUpdater = resultsViewController
    // Put the search bar in the navigation bar.
    navigationItem.titleView = searchController?.searchBar
    // When UISearchController presents the results view, present it in
    // this view controller, not one further up the chain.
    definesPresentationContext = true
    // Prevent the navigation bar from being hidden when searching.
    searchController?.hidesNavigationBarDuringPresentation = false
}

Step 9:

We will get the location coordinates of the source and destination, and use them to draw a path in the method below.

// MARK: Draw polyline route from source to destination.
func getPolylineRoute(from source: CLLocationCoordinate2D, to destination: CLLocationCoordinate2D) {
    let config = URLSessionConfiguration.default
    let session = URLSession(configuration: config)
    let urlString = "https://maps.googleapis.com/maps/api/directions/json?origin=\(source.latitude),\(source.longitude)&destination=\(destination.latitude),\(destination.longitude)&mode=driving&key=\(GOOGLE_APIKEY)"
    guard let url = URL(string: urlString) else { return }
    let task = session.dataTask(with: url) { (data, response, error) in
        if let error = error {
            print("Directions request failed: \(error.localizedDescription)")
            return
        }
        guard let data = data else { return }
        do {
            if let json = try JSONSerialization.jsonObject(with: data, options: .allowFragments) as? [String: Any],
               let routes = json["routes"] as? [[String: Any]],
               let route = routes.first,
               let overviewPolyline = route["overview_polyline"] as? [String: Any],
               let points = overviewPolyline["points"] as? String {
                // Collect the step coordinates of the first leg for later use.
                if let legs = route["legs"] as? [[String: Any]],
                   let steps = legs.first?["steps"] as? [[String: Any]] {
                    for step in steps {
                        if let start = step["start_location"] as? [String: Any],
                           let lat = start["lat"] as? Double,
                           let lng = start["lng"] as? Double {
                            self.routeCoordinates.append(CLLocationCoordinate2D(latitude: lat, longitude: lng))
                        }
                    }
                }
                DispatchQueue.main.async {
                    // Path is available now, let's show it on the map.
                    self.showPath(polyStr: points)
                    let bounds = GMSCoordinateBounds(coordinate: source, coordinate: destination)
                    let update = GMSCameraUpdate.fit(bounds, with: UIEdgeInsets(top: 170, left: 30, bottom: 30, right: 30))
                    self.mapView.animate(with: update)
                }
            }
        } catch {
            print("error in JSONSerialization")
        }
    }
    task.resume()
}

// MARK: Show path on map
func showPath(polyStr: String) {
    let path = GMSPath(fromEncodedPath: polyStr)
    let polyline = GMSPolyline(path: path)
    polyline.strokeWidth = 3.0
    polyline.strokeColor = .blue // stroke colour assumed = mapView // Your map view
}

Step 10:

We will check whether location permission is granted. If it is, we will start location updates and track the user's position.

// MARK: CLLocationManagerDelegate method(s)
extension MapViewController: CLLocationManagerDelegate {
    func locationManager(_ manager: CLLocationManager, didChangeAuthorization status: CLAuthorizationStatus) {
        if status == .authorizedWhenInUse {
            locationManager.startUpdatingLocation()
        }
    }

    func locationManager(_ manager: CLLocationManager, didUpdateLocations locations: [CLLocation]) {
        currentLocation = locations[0]
        let marker = GMSMarker()
        marker.position = (currentLocation?.coordinate)!
        marker.icon = UIImage(named: "current") = mapView
        let camera = GMSCameraPosition.camera(withTarget: (currentLocation?.coordinate)!, zoom: 15) = camera
        print("Current location altitude: \(String(describing: currentLocation?.altitude))")
    }

    func locationManager(_ manager: CLLocationManager, didFailWithError error: Error) {
        print("Error finding location: \(error.localizedDescription)")
    }
}

Step 11:

GMSAutocompleteResultsViewControllerDelegate is used to show place autocomplete predictions in a table view.

// MARK: GMSAutocompleteResultsViewControllerDelegate method(s)
extension MapViewController: GMSAutocompleteResultsViewControllerDelegate {

    func resultsController(_ resultsController: GMSAutocompleteResultsViewController,
                           didAutocompleteWith place: GMSPlace) {
        searchController?.isActive = false
        // Do something with the selected place.
        print("Place name: \(String(describing:")
        print("Place address: \(String(describing: place.formattedAddress))")
        print("Place attributions: \(String(describing: place.attributions))")
        if ("Let's Nurture")) == true {
            DispatchQueue.main.asyncAfter(deadline: .now() + 2) {
                // Draw the route from the current location to the selected place.
                if let current = self.currentLocation?.coordinate {
                    self.getPolylineRoute(from: current, to: place.coordinate)
                }
            }
        }
    }

    func resultsController(_ resultsController: GMSAutocompleteResultsViewController,
                           didFailAutocompleteWithError error: Error) {
        // TODO: handle the error.
        print("Error: ", error.localizedDescription)
    }

    // Turn the network activity indicator on and off again.
    func didRequestAutocompletePredictions(_ viewController: GMSAutocompleteViewController) {
        UIApplication.shared.isNetworkActivityIndicatorVisible = true
    }

    func didUpdateAutocompletePredictions(_ viewController: GMSAutocompleteViewController) {
        UIApplication.shared.isNetworkActivityIndicatorVisible = false
    }

    func resultsController(_ resultsController: GMSAutocompleteResultsViewController, didSelect prediction: GMSAutocompletePrediction) -> Bool {
        return true
    }
}

Step 12:

GMSMapViewDelegate is used to place markers on the map view.

// MARK: GMSMapViewDelegate method(s)
extension MapViewController: GMSMapViewDelegate {
    func mapView(_ mapView: GMSMapView, markerInfoWindow marker: GMSMarker) -> UIView? {
        return UIView()
    }
}

Step 13:

Create a ViewController.swift file and add the code below to it. We use two packages in this file:

1. FocusNode - used to add a focus node to the live scene.
2. SmartHitTest - used to estimate the position of the anchor, e.g. finding the best position based on what we know about the detected planes in the scene.

import ARKit
import FocusNode
import SmartHitTest

extension ARSCNView: ARSmartHitTest {}

class ViewController: UIViewController {
    var sceneView = ARSCNView(frame: .zero)
    let focusSquare = FocusSquare()
    var hitPoints = [SCNVector3]() {
        didSet {
            self.pathNode.path = self.hitPoints
        }
    }
    var pathNode = SCNPathNode(path: [])

    override func viewDidLoad() {
        super.viewDidLoad()
        // Add a button on the right side of the navigation bar
        let btn = UIBarButtonItem(title: "Add/Clear", style: .plain, target: self, action: #selector(newRouteDraw))
        self.navigationItem.rightBarButtonItem = btn
        self.sceneView.frame = self.view.bounds
        self.sceneView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        self.view.addSubview(self.sceneView)
        // Set the view's delegate
        self.sceneView.delegate = self
        self.focusSquare.viewDelegate = self.sceneView
        self.setupGestures()
        // The next few lines just make the path look nicer
        let pathMat = SCNMaterial()
        pathMat.diffuse.contents = UIImage(named: "path_with_fade")
        self.pathNode.materials = [pathMat]
        self.pathNode.position.y += 0.05
        self.pathNode.width = 0.5
        // Load any previously saved route
        if let data = loadData() {
            hitPoints = data
        }
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = [.horizontal, .vertical]

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        self.sceneView.session.pause()
    }

Step 14:

We will draw a new route, save it into user defaults for future use, and clear the existing points.

// MARK: Draw a new route and save it into user defaults.
@objc func newRouteDraw() {
    let defaults: UserDefaults = UserDefaults.standard
    defaults.removeObject(forKey: "kLetsNurtureRoute")
    self.hitPoints.removeAll()
}

Step 15:

Fetch any existing route from user defaults.

// MARK: Get data from user defaults
func loadData() -> [SCNVector3]? {
    let defaults: UserDefaults = UserDefaults.standard
    if let data = defaults.object(forKey: "kLetsNurtureRoute") as? Data,
       let userinfo = try? NSKeyedUnarchiver.unarchiveTopLevelObjectWithData(data) as? [SCNVector3] {
        return userinfo
    }
    return nil
}

Step 16:

Render the scene and update the focus node on every frame.

// MARK: Update AR view at a specific time interval
extension ViewController: ARSCNViewDelegate {
    func renderer(_ renderer: SCNSceneRenderer, updateAtTime time: TimeInterval) {
        DispatchQueue.main.async {
            // Reposition the focus square based on the latest hit test.
            self.focusSquare.updateFocusNode()
        }
    }
}

Step 17:

Add a tap gesture to the camera view; on each tap, add a path point at the focus node's position.

// MARK: Setup tap gesture on camera view
extension ViewController: UIGestureRecognizerDelegate {
    func setupGestures() {
        let tapGesture = UITapGestureRecognizer(target: self, action: #selector(handleTap(_:)))
        tapGesture.delegate = self
        self.sceneView.addGestureRecognizer(tapGesture)
    }

    // Handle tap gesture functionality
    @IBAction func handleTap(_ gestureRecognizer: UITapGestureRecognizer) {
        guard gestureRecognizer.state == .ended else {
            return
        }
        if self.focusSquare.state != .initializing {
            // Add the focus square's current position as the next path point.
            self.hitPoints.append(self.focusSquare.position)
            let defaults: UserDefaults = UserDefaults.standard
            if let data: Data = try? NSKeyedArchiver.archivedData(withRootObject: self.hitPoints, requiringSecureCoding: false) {
                defaults.set(data, forKey: "kLetsNurtureRoute")
            }
        }
    }
}

Step 18:

Add or update a node when the scene renders a plane anchor, and set the plane geometry accordingly.

// MARK: Add/update node at the time of rendering
extension ViewController {
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        if let planeAnchor = anchor as? ARPlaneAnchor, planeAnchor.alignment == .vertical, let geom = ARSCNPlaneGeometry(device: MTLCreateSystemDefaultDevice()!) {
            geom.update(from: planeAnchor.geometry)
            geom.firstMaterial?.colorBufferWriteMask = .alpha
            node.geometry = geom
        }
    }

    func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {
        if let planeAnchor = anchor as? ARPlaneAnchor, planeAnchor.alignment == .vertical, let geom = node.geometry as? ARSCNPlaneGeometry {
            geom.update(from: planeAnchor.geometry)
        }
    }
}



Posted by Lets Nurture
