How to Create a Mobile AR Navigation App
Let’s explore a use case: developing a mobile app with an augmented reality-based navigation feature, including the process flow, team composition, and tech stack.
The system is going to have:
- A backend database to store the venues’ data
- An admin panel
- A mobile app
From the end user’s perspective, the app delivers information about bars and restaurants. Users whip out their phones and instantly learn the scuttlebutt about nearby places: they can see and submit ratings and reviews, and get information about opening hours, locations, and directions.
This is obviously a heavily saturated section of the market. How does a product owner cut through the noise and differentiate his app from all the rest? Adding AR functionality can do the job, attracting media attention as the offering becomes the first bars-and-restaurants app to leverage the power of augmented reality.
AR in this situation is an add-on feature in the software. A linear workflow for the augmented reality development process would look like this:
1. Pre-Contract Stage
Before anything too big gets started, the product owner and the development team need to iron out the basics. There needs to be a technical strategy, and the product owner’s goals also have to be enumerated.
An NDA should be signed at this stage to protect the IP rights of the product owner. It’s also common for some amount of consultancy to be provided at this stage, especially if the vision for the project isn’t entirely production-ready. At this point the product owner may be asked questions like:
- Will a completely new app be written, or can AR be added to an existing one?
- What devices, and which iOS and Android versions, does the target audience actually use?
- How do you picture the user actually utilizing the AR component?
- How accurate does navigation have to be?
- Can third-party systems, such as Google or Apple Maps, be incorporated?
- Where will you get the data about the venues? Will venues sign up, will users submit it, or will you scrape the web for existing information?
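The navigation-accuracy question can be made concrete with a little geometry. Consumer GPS is typically accurate only to within several meters outdoors, and the haversine formula tells you what ground distance such an error represents. A minimal sketch (Kotlin, since the Android module would be written in Kotlin anyway):

```kotlin
import kotlin.math.*

// Haversine distance between two GPS coordinates, in meters.
// Useful for reasoning about how much GPS error (typically 3-10 m
// outdoors) an AR overlay can tolerate before arrows drift visibly.
fun haversineMeters(lat1: Double, lon1: Double, lat2: Double, lon2: Double): Double {
    val r = 6_371_000.0 // mean Earth radius in meters
    val dLat = Math.toRadians(lat2 - lat1)
    val dLon = Math.toRadians(lon2 - lon1)
    val a = sin(dLat / 2).pow(2) +
            cos(Math.toRadians(lat1)) * cos(Math.toRadians(lat2)) * sin(dLon / 2).pow(2)
    return 2 * r * asin(sqrt(a))
}

fun main() {
    // Two points 0.001 degrees of latitude apart lie roughly 111 m apart.
    println(haversineMeters(48.0, 2.0, 48.001, 2.0).roundToInt())
}
```

If venues sit tens of meters apart, a 5-meter GPS error is tolerable; if they share a building entrance, it is not, and that constraint shapes the whole AR module.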
2. Business Analysis / Technical Analysis
Synthesizing all the answers into a vision for the project is critical. The product owner and the project managers will need to impose constraints. This means arriving at conclusions about a budget, any deadlines, and what technologies will be employed to produce the app. A business analyst can be involved.
Technical analysts provide an overview of the available development tools to narrow down the technology stack. For the AR feature, that means choosing an augmented reality software development kit (SDK) and a 3D engine; in our case, ARKit with SceneKit for iOS and ARCore with Sceneform for Android.
Having an AR feature in the application doesn’t impact the choice of back-end technology because the data is transmitted in JSON format via API. So it may be Node.js, Python, Ruby, or any other technology based on the project requirements.
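As a sketch of what that JSON might look like, here is a minimal venue payload in Kotlin. The field names are illustrative assumptions, not a real API contract, and the hand-rolled serializer keeps the example dependency-free; a real project would use kotlinx.serialization, Moshi, or similar.

```kotlin
// A minimal sketch of a venue payload the backend might serve.
// Field names are assumptions for illustration only.
data class Venue(
    val id: Int,
    val name: String,
    val lat: Double,
    val lon: Double,
    val rating: Double,
    val openHours: String
)

// Hand-rolled serialization to keep the sketch self-contained;
// swap in a real JSON library in production code.
fun Venue.toJson(): String =
    """{"id":$id,"name":"$name","lat":$lat,"lon":$lon,"rating":$rating,"openHours":"$openHours"}"""

fun main() {
    val v = Venue(1, "Blue Bar", 48.8584, 2.2945, 4.5, "10:00-23:00")
    println(v.toJson())
}
```

Because the AR layer only consumes this payload, the backend behind it can be written in any of the technologies mentioned above.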
Making the right choice here is essential because it dictates everything that follows. In particular, the product owner must choose between developing natively or using a cross-platform architecture. In our case, the optimal solution is native development with Swift for iOS and Kotlin for Android. Cross-platform development is possible, but it could add complexity to the system architecture and its support, because the AR module must be developed natively anyway to ensure the best app performance.
When developing augmented reality apps for iOS, ARKit is used in conjunction with SceneKit. For the Android platform, the go-to tools are ARCore and Sceneform. An AR app development team should have the skills needed to process 3D content, handle image-processing tasks, and work with most major mobile technologies. To assemble a team for this project, professionals would take on the following roles:
- iOS/Android developer for mobile AR
- Back-end developer
- Front-end designer
- UI/UX designer
- 3D artist
- Project manager
- QA engineer
If the project doesn’t call for 3D models, or the models are unlikely to be complex, the 3D artist’s job may be assumed by the UI/UX designer.
It’s crucial to ensure that the development team uses up-to-date tools, technologies, and approaches. Older tools pose a real risk: they may become outdated and unsupported in the not-too-distant future, and outdated software can complicate the implementation of new features later down the road. The stack may work perfectly now, but you must think of the future.
3. Design Stage
At this point, a prototype with simplified 3D models is created. The models don’t have to look amazing yet; it’s sufficient to operate with simple navigation boxes and spheres. Later they will be replaced with attractive 3D signs and arrows created by the UI/UX designer.
4. Development Stage
Not much differentiates this stage from typical mobile app development, with one major exception: QA. With an outdoor navigation app that uses AR, QA can be challenging. The QA engineer and the developers will need to walk around with several smartphones to judge what works, plus a laptop for debugging. Logs are dumped to the laptop and studied to determine where misfires occurred and why.
An AR navigation feature can either be added to an existing application as an add-on module or built into the architecture of an app from scratch. On average, developing the application from scratch takes about four to six months, with one month spent on the AR outdoor navigation module.
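The core of an outdoor AR navigation module is translating GPS coordinates into the local coordinate system of the AR scene. A minimal, framework-free sketch of that conversion, using an equirectangular approximation that is adequate at walking distances:

```kotlin
import kotlin.math.*

// Approximate east/north offsets (meters) from the user's position to a
// venue, suitable for placing an AR marker in a local scene whose origin
// is at the user. Equirectangular approximation: fine for short distances.
fun localOffsetMeters(
    userLat: Double, userLon: Double,
    venueLat: Double, venueLon: Double
): Pair<Double, Double> {
    val r = 6_371_000.0 // mean Earth radius in meters
    val east = Math.toRadians(venueLon - userLon) * cos(Math.toRadians(userLat)) * r
    val north = Math.toRadians(venueLat - userLat) * r
    return east to north
}

fun main() {
    // Venue due north of the user: east offset ~0 m, north offset ~111 m.
    val (east, north) = localOffsetMeters(48.0, 2.0, 48.001, 2.0)
    println("east=${east.roundToInt()} north=${north.roundToInt()}")
}
```

In a scene aligned with compass north (for example, an ARKit session configured with gravity-and-heading alignment, where x points east and -z points north), these offsets map directly onto the horizontal scene axes; Sceneform setups follow a similar pattern.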
5. Deployment and Support Stage
The product owner is now ready to show the world his new AR app! In addition to typical mobile app concerns, such as fixing bugs and listening to customer feedback, the app also needs to be kept current: Apple and Google release new versions of their ARKit and ARCore SDKs up to several times a year, and updates to the app should be rolled out with each new release. Ongoing support and maintenance ensure the app stays up to date.
Tada! That’s it. The product owner has now delivered a new AR-enabled mobile app. Next, we’ll take a deeper dive into the augmented reality development process and talk about its cost.
How to Choose an AR Tech Stack
An AR SDK powers augmented reality development by providing developers with a set of tools, libraries, relevant documentation, code samples, and guides. The SDK determines the app’s spatial awareness and virtual-object rendering, underpinning its features and functionality, so it’s essential to choose the right platform based on the project requirements.
Besides the above-mentioned ARKit and ARCore, which are free SDKs, there are a number of popular paid SDKs, such as Vuforia and Wikitude, that offer great compatibility with an array of augmented reality development platforms. The cross-platform route is also constrained: hybrid apps (Flutter-based and others) still have technological limitations, including performance issues, when an AR feature is added, so Vuforia or Wikitude could be the choice for cross-platform AR development. For graphics-heavy AR apps, a good choice is Unity 3D with Vuforia, or Unity with the ARKit plugin and ARCore. Compatibility with more than twenty platforms makes Unity 3D an optimal cross-platform tool, too.
One of the main reasons it’s important to outline so many technical aspects during the pre-contract phase is that the SDK choice can be narrowed down by looking at the project requirements. If you’re building an iOS-only app, you can quickly settle on ARKit and supporting SDKs; if it’s Android-only, you cut the list of SDKs down to ARCore.
Though Google and Apple offer out-of-the-box solutions for augmented reality development, sometimes a business owner doesn’t need a complex app, just an MVP to showcase the product’s basic purpose and functionality. In this case, it doesn’t make sense to wrestle with complex SDKs to get an AR feature into the app, especially if the app will become part of a larger ecosystem with a host of related products based on a variety of technologies.
To prevent discrepancies, a custom AR SDK can be developed to simplify integrating the AR app with other products. The custom AR SDK controls the AR features and allows easy integration of an augmented reality experience into various products; it also makes it easier to update AR features across an array of products. Generally, custom AR SDK development is a choice for business owners who have two or more products to integrate with AR features.
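A custom AR SDK of this kind usually amounts to a thin facade over the platform AR frameworks, so host products depend on one stable API rather than on ARKit or ARCore directly. A hedged sketch of what that surface might look like; the interface and method names are purely illustrative:

```kotlin
// Hypothetical facade a custom AR SDK might expose to host products.
// Each platform module (ARKit- or ARCore-backed) supplies an implementation;
// host apps depend only on this interface.
interface ArNavigation {
    fun start()
    fun showRouteTo(lat: Double, lon: Double)
    fun stop()
}

// A trivial implementation that records calls: stands in for a real
// platform-backed module and is handy in unit tests.
class LoggingArNavigation : ArNavigation {
    val log = mutableListOf<String>()
    override fun start() { log += "start" }
    override fun showRouteTo(lat: Double, lon: Double) { log += "route:$lat,$lon" }
    override fun stop() { log += "stop" }
}

fun main() {
    val nav: ArNavigation = LoggingArNavigation()
    nav.start()
    nav.showRouteTo(48.0, 2.0)
    nav.stop()
    println((nav as LoggingArNavigation).log)
}
```

When the underlying AR frameworks change, only the implementations behind the facade need updating, which is what makes multi-product maintenance cheaper.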
3D Graphics and UI/UX Design for AR Development
Adding life to static objects is one of the killer features of AR apps. The graphic design process entails:
- Figuring out how the information will be presented and estimating the UI/UX design phase
- Designing a wireframe of the app
- Establishing the user’s pathway for exploring the app
- Adding 3D models and 3D interface elements
To create this experience, the graphics engine has to seamlessly merge the real-world scene with 3D content. On the iOS platform, Apple’s SceneKit is the go-to solution; Google’s Sceneform is the complementary solution for Android apps.
The UI/UX designer deviates from a user’s regular 2D screen perception and designs a reality that the user is inside of, not watching from the sidelines. This is where the rubber meets the road as far as users are concerned: the augmented reality app interface has to be simple and intuitive, or they’ll give up and move on to the next app in a matter of minutes. Likewise, the 3D graphics have to look fabulous.
As previously mentioned, Unity 3D is a hugely popular choice for graphics-intensive augmented reality apps. The Unity Asset Store offers plenty of free and commercial models, animations, and textures. If your AR app doesn’t have strict requirements for unique 3D graphics created from scratch, you may use these existing libraries for UI elements and 3D models.
QA for Augmented Reality Applications
Adding AR and MR components means taking on a relatively time-consuming QA process. Not only does the AR app have to play nicely with all the devices it will be deployed on, but the user experience also has to be thoroughly evaluated. In particular, the QA engineer has to verify that people “get” how to use the augmented reality app. This still fits within a traditional software testing pyramid.
Augmented reality app testing initially follows the standard scheme; then, after the product specifications and usage conditions are set, a testing checklist is created that covers various screen orientations, Internet connection speed and loss of connectivity, low memory, and battery consumption.
Adding augmented reality to mobile apps introduces many tricky scenarios. For example, what happens when an AR app is used in a moving vehicle, on a train, or on a plane? Standard testing scenarios apply to static scenes only; for non-static ones, a custom algorithm is needed to compensate for vibration and the mismatch between visual and motion data.
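Such a compensation algorithm is project-specific, but the simplest building block is a low-pass filter over incoming sensor or position readings. A minimal illustrative sketch; real apps fuse visual and inertial data with far more sophisticated filters (a Kalman filter, for instance), so treat this only as a starting point:

```kotlin
// Exponential moving average low-pass filter: damps high-frequency
// jitter (vibration) in a stream of sensor readings before the values
// reach the AR layer. alpha in (0, 1]: smaller means heavier smoothing.
class LowPassFilter(private val alpha: Double) {
    private var state: Double? = null

    // Blend the new reading with the previous smoothed value.
    fun filter(value: Double): Double {
        val s = state?.let { it + alpha * (value - it) } ?: value
        state = s
        return s
    }
}

fun main() {
    val f = LowPassFilter(alpha = 0.2)
    // A noisy step: the raw readings jump to 10, the filtered output
    // approaches 10 gradually instead of snapping.
    listOf(0.0, 10.0, 10.0, 10.0).forEach { println(f.filter(it)) }
}
```

The same idea applies per axis to accelerometer or position data; QA for moving-vehicle scenarios then checks that the smoothed output stays stable while the raw input oscillates.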
Testing HoloLens applications requires a specific approach that considers the user’s position while sitting, standing, and moving, as well as different lighting conditions, moving objects, and interior changes.
It’s essential to test an AR app on all devices listed in the project documentation, as emulators cannot substitute for real devices when it comes to finding and eliminating issues in real-world physical environments. Murphy’s Law applies: every scenario that can be imagined must be tested.
Augmented reality is still a young, fast-moving area where industry standards are yet to be established. Not many software development teams have invested time in research and gained substantial AR development experience, so partnering with an experienced remote development team helps ensure delivery of AR software that balances quality and service costs.
Published at DZone with permission of Andrew Makarov. See the original article here.