Immersive Multiexperience: A New Generation of Enterprise Application Development Takes Flight
See how a new generation of enterprise application development platforms will draw on the immersive, cinematic experiences made possible by game engines to address a wide range of enterprise use cases.
Abstract
In this article, see how a new generation of enterprise application development platforms will draw on the immersive, cinematic experiences made possible by game engines to address a wide range of enterprise use cases, including real-time 3D visualization solutions developed from photographs and other 2D data feeds, role-based multiexperience data visualizations for cross-functional collaboration, employee and customer training via digital twinning, and more.
The article discusses the convergence of market conditions and technological advances that are spurring demand for immersive, cinematic enterprise applications, exploring the potential of AI-, AR-, and VR-infused multiexperience applications, how game engines may enable some of these use cases, and the limitations game engines inherit from their heritage as entertainment development tools. Bill makes the case that the right development platform, one with its roots in game engines but purpose-built for enterprise use cases and free of any dependency on specialized developers, can push enterprise application development into an entirely new age.
Immersive multiexperience is not entertainment; it is content. In the words of Edward Tufte, “The best design gets out of the way between the viewer’s brain and the content.” Good enterprise applications help users understand the context of a situation, interpret data readily, and become more efficient and effective in their decision making and workflow management. Immersive multiexperience for the enterprise is where application development is headed.
Let me give a brief analogy. I have flown airplanes since I was sixteen years old. Flying requires real-time awareness of one’s location in three dimensions, vertical and horizontal velocity, weather systems and terrain along the flight route, proximate aircraft, fuel status, and engine operating parameters, among other things. Misinterpreting any of this information can have dire consequences.
My first airplane had a traditional panel crowded with analog instruments.
It required the pilot to monitor eight analog flight and navigation instruments, ten analog engine and system status instruments, five different radios, and a few dozen other switches. In all, the pilot had to synthesize 29 different, siloed sources of information. Courses and radio frequencies had to be looked up on paper charts and entered into the navigation radios by turning knobs.
Much of a pilot’s training focused on the “scan” — how to integrate and interpret all of the data as one flew through the clouds. Few pilots knew exactly where they were all the time, and all pilots were guessing at the locations of hazards like bad weather systems.
Today my instrument panel is a modern glass cockpit.
Four main flat-screen panels integrate all of the information offered by the 24 analog displays in the old panel, plus 2D and 3D visualizations that provide real-time information about weather, traffic, terrain, flight planning, and navigation. Pilots enter flight planning data via a touch screen, by voice recognition, or by pairing with an iPad. No more paper charts. The pilot can easily plan a route around bad weather simply by dragging the course line on the touch screen away from the radar image of a storm. The panel provides early warnings about potentially deadly problems like terrain conflicts, converging traffic, or engine abnormalities. As a result, there is much less guessing about the situation and much less digging through paper charts for information, which greatly enhances the pilot’s ability to manage the risks of a flight.
Interesting, perhaps, but what does this mean for enterprise applications?
Companies capture a lot of disparate information in the course of a typical business day, including ERP data, sales data, asset management data, website and eCommerce data, social media data, and more. Some types of information may not be captured effectively, such as the activity of the field sales team, or the expertise in the heads of experienced employees. And much of the available information still exists in silos and is not accessible by employees or customers in real time or in a coherent format.
Every CEO I worked with in my former role as a strategy consultant asked for an enterprise version of the new airplane cockpit described above: an application that would integrate information from multiple sources, put it in context (time, location, user role), and present it in an easily understandable, visual format to support timely decision making. This is what good application design means today. Typically, however, building this type of multiexperience application to support digital transformation is complex and challenging, requiring multiple dev tools, multiple teams, and a lot of coordination.
Highly integrated airplane cockpits were developed over decades at a cost of hundreds of millions of dollars. Today’s multiexperience, modular application development platforms draw on years of investment in technologies such as the game engines that bring immersive gaming to life. These software development platforms enable collaborative teams to build immersive enterprise applications quickly and cost-effectively; working prototypes can often be ready in a few hours or days, allowing the design to be refined with user input. The development process is design-led rather than development-led, and the efficiency with which the application delivers content to the user stays at the center of the process.
An immersive, multiexperience application development platform includes several key components:
- A library of modular components that can rapidly be assembled in a multiuser visual editor. These include modules such as indoor and outdoor location services, online and offline mapping, readers for Bluetooth beacons, RFID tags, and barcodes, IoT dashboards, and gesture and voice control, as well as more typical BPM components. Custom components are easily created and added to the library (a sketch of how such modules might compose follows this list).
- An “experience engine,” built on game engine technology, that enables 3D animations and avatars, and efficient authoring of AR and VR content.
- A multi-user content delivery system, real-time APIs for connectivity to other enterprise systems, authentication and logging of data access, and cloud services orchestration.
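The platform described here is commercial and its API is not published, but the compositional idea behind the component library can still be illustrated. The sketch below is a hypothetical Python rendering of that idea; every name in it (Component, Experience, beacon_reader, indoor_map) is invented for illustration and is not the platform’s actual API.

```python
# Hypothetical sketch only: the platform described in this article is
# commercial and its API is not published. Every name below is invented
# to illustrate the "library of modular components" idea.
from dataclasses import dataclass
from typing import Any, Callable, Dict, List


@dataclass
class Component:
    """A reusable module, e.g., a beacon reader or an IoT dashboard feed."""
    name: str
    handler: Callable[[Dict[str, Any]], Dict[str, Any]]


class Experience:
    """Composes registered components into a simple processing pipeline."""

    def __init__(self) -> None:
        self._components: List[Component] = []

    def add(self, component: Component) -> "Experience":
        self._components.append(component)
        return self  # fluent chaining, as a visual editor might generate

    def run(self, context: Dict[str, Any]) -> Dict[str, Any]:
        # Each component reads the shared context and contributes to it.
        for component in self._components:
            context.update(component.handler(context))
        return context


# Stand-ins for library modules named in the list above.
beacon_reader = Component("beacon_reader", lambda ctx: {"zone": "dock-7"})
indoor_map = Component("indoor_map", lambda ctx: {"position": (12.4, 3.1)})

app = Experience().add(beacon_reader).add(indoor_map)
print(app.run({"user_role": "technician"}))
# -> {'user_role': 'technician', 'zone': 'dock-7', 'position': (12.4, 3.1)}
```

The design point is that each module exposes a uniform interface, so a visual editor can chain beacon readers, maps, dashboards, and custom components without hand-written glue code.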
The advantage of this type of platform is that it allows users (developers and non-technical colleagues within the enterprise) to rapidly build software solutions that integrate a wide range of data sources, present data in visually compelling ways, and let users interact in the most efficient way to support critical workflows.
Many types of data are more easily understood using 3D visualization, especially location information or IoT data from distributed sensors. Training can be much more effective if it is presented in a VR scenario that the trainee actively participates in. Touchscreens are not great input devices for applications like kiosks in a time of pandemic – providing a gesture or voice control option is important. Features like these are easily incorporated into projects using pre-built modules.
Here are some examples of the types of projects that have already been built using this kind of immersive, multiexperience application development platform:
SmartDiagnostic: Health status management has become a key issue during the COVID-19 crisis and will remain crucial after the threat of the pandemic has passed. SmartDiagnostic uses an avatar of a “digital doctor” to interact with an individual and ask health history questions, while self-sterilizing sensors measure vital signs. In a hospitality setting such as a cruise line, using low-cost in-room hardware, this solution provides a global view of health status onboard and flags any potential risks for follow-up, either directly with a telemedicine service or with the onboard medical department.
Indoor location services: The solution captures a 3D “fingerprint” of RF signals in a building. Location trackers sample the RF signals around them and an AI algorithm running in the cloud can pinpoint a location in a building with 99% accuracy. No hardware installation is required. The location of critical assets or employees requiring urgent attention or assistance gets mapped on a dashboard that can include a 3D model of the building.
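The article does not say which AI algorithm performs the matching, but a common baseline for RF fingerprinting is k-nearest-neighbor classification over received-signal-strength (RSSI) vectors. The following sketch assumes that baseline and uses scikit-learn; the access points, readings, and zone names are all made up for illustration and are not the vendor’s implementation.

```python
# A common baseline for RF-fingerprint indoor positioning: k-nearest-
# neighbor matching over RSSI vectors. Illustrative sketch only; the
# article does not disclose the algorithm the platform actually uses.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

# Offline "fingerprint" phase: RSSI readings (dBm) from three access
# points, sampled at known locations. Values are invented.
fingerprints = np.array([
    [-42, -67, -80],   # lobby
    [-45, -63, -78],   # lobby
    [-70, -48, -60],   # lab
    [-72, -50, -58],   # lab
    [-81, -75, -41],   # loading dock
    [-79, -77, -44],   # loading dock
])
locations = ["lobby", "lobby", "lab", "lab", "dock", "dock"]

model = KNeighborsClassifier(n_neighbors=3)
model.fit(fingerprints, locations)

# Online phase: a tracker samples the RF environment around it and the
# model pinpoints the most likely zone.
live_sample = np.array([[-44, -65, -79]])
print(model.predict(live_sample))  # -> ['lobby']
```

In practice the offline fingerprint database would be far denser, and a regression variant (predicting coordinates rather than zones) could feed the 3D building-model dashboard described above.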
VR Training: Training scenarios for situations ranging from agitated customer interactions to critical plant safety procedures can be rapidly created using the platform’s VR scenario builder. The scenario can include real-time data from enterprise systems displayed in the 3D virtual scenario. Trainers can modify the scenario during the session to create impactful learning moments for participants.
A growing number of application users have come to expect immersive experiences, in part because of the rapid growth of gaming. Enterprise applications are evolving from what were essentially paper forms recreated on an electronic screen into applications that use multiple sensors to capture data and 2D or 3D imaging to present it. This is not about entertainment but about efficiency in analyzing and interpreting content and in giving the user context. Sensors can provide location and capture information via voice, machine vision, or LIDAR scanning. Applications can be made available to employees based on their location or time of day. Many of these data streams can be interpreted by AI frameworks that flag anomalies for users, as the sketch below illustrates.
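As one concrete illustration of flagging anomalies in a sensor stream, the sketch below applies a rolling z-score threshold. The window size, threshold, and simulated temperature feed are arbitrary assumptions; the article does not specify which AI framework or method a given platform would actually use.

```python
# One simple way to flag anomalies in a sensor stream: a rolling
# z-score threshold. Illustrative only; real platforms may use far
# more sophisticated AI frameworks.
from collections import deque
from statistics import mean, stdev


def flag_anomalies(stream, window=20, threshold=3.0):
    """Yield (index, value) for readings far outside the recent norm."""
    history = deque(maxlen=window)
    for i, value in enumerate(stream):
        if len(history) == window:
            mu, sigma = mean(history), stdev(history)
            if sigma > 0 and abs(value - mu) / sigma > threshold:
                yield i, value
        history.append(value)


# Example: a steady temperature feed with one spike injected.
readings = [21.0 + 0.1 * (i % 5) for i in range(100)]
readings[60] = 35.0  # simulated sensor fault
print(list(flag_anomalies(readings)))  # -> [(60, 35.0)]
```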
As with the modern airplane cockpit, immersive applications provide users with better-contextualized information to facilitate efficient decision-making and manage workflows. The immersive, multiexperience, enterprise application is ready for takeoff. Platforms that support the efficient creation of these immersive applications provide the engines that will make them soar.