
Building a Floating HoloLens Info Screen: Adding C# Behaviors


Diving into Unity and HoloLens development, let's create a floating, fading info screen with audio cues to help users figure your app out.


In a previous blog post, we created the UI of an info screen. Now we are going to build the dynamics.

  • First, we will make the app recognize a speech command.
  • Then we will add the dynamics to make the screen appear and disappear when we want.
  • Then we will make the close button work.
  • As a finishing touch, we will add some spatial sound.

Adding a Speech Command: The Newest New Way

For the second or maybe even the third time since I have started using the HoloToolkit, the way speech commands are supposed to work has changed. The keyword manager is now obsolete — you now have to use SpeechInputSource and SpeechInputHandler.

First, we add a Messenger, as already described in this blog post, to the Managers game object. It sits in Assets/HoloToolkitExtensions/Scripts/Messaging.

Then, since we are good scouts that like to keep things organized, create a folder, “Scripts”, under “Assets/App”. In “Scripts”, we add a “Messages” folder, where we create the following highly complicated message class.

public class ShowHelpMessage
{
}


In Scripts, we create the SpeechCommandExecutor, which is simply this:

using HoloToolkitExtensions.Messaging;
using UnityEngine;

public class SpeechCommandExecutor : MonoBehaviour
{
    public void OpenHelpScreen()
    {
        Messenger.Instance.Broadcast(new ShowHelpMessage());
    }
}


Add this SpeechCommandExecutor to the Managers game object. Also, add a SpeechInputSource script from the HoloToolkit, click the tiny plus-button on the right, and add “show help” as the keyword.

Also, select a key in “Key Shortcut”. Although the Unity3D editor supports voice commands, you can now also use the keyboard to test the flow. And believe me – your colleagues will thank you for that. Although lots of my colleagues are now quite used to me talking to devices and gesturing in empty air, repeatedly shouting at a computer because it is not possible to determine whether there’s a bug in the code or the computer just did not hear you… is still kind of frowned upon.

Anyway. To connect the SpeechCommandExecutor to the SpeechInputSource, we need a SpeechInputHandler. That is also in the HoloToolkit, so drag it out of there onto the Managers object. Once again, you have to click a very tiny plus-button, and then the workflow is as follows:

  1. Check the “Is Global Listener” checkbox (that is there because of a pull request by Yours Truly)
  2. Select the plus-button under “Responses”
  3. Select “Show help” from the keyword dropdown
  4. Drag the Managers object from the Hierarchy to the box under “Runtime only”
  5. Change “Runtime only” to “Editor and Runtime”
  6. Select “SpeechCommandExecutor” and then “OpenHelpScreen” from the right dropdown.

To test that you have wired everything up correctly: in Assets/App/Scripts, double-click SpeechCommandExecutor. This will open Visual Studio on the SpeechCommandExecutor. Set a breakpoint on

Messenger.Instance.Broadcast(new ShowHelpMessage());


Hit F5 and return to Unity3D. Click the play button and press “0”, or shout “Show help” if you think that’s funny (on my machine, speech recognition in the editor does not work on most occasions, so I am very happy with the key option).

If you have wired up everything correctly, the breakpoint should be hit. Stop Visual Studio and leave Unity Play Mode again. This part is done.
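If you would rather verify the broadcast without attaching a debugger, you can also add a temporary listener that just logs the message. This is a minimal sketch using the same Messenger API the app uses later on; ShowHelpLogger is a hypothetical name of my own invention – add it to any game object and remove it when you are done:

```csharp
using HoloToolkitExtensions.Messaging;
using UnityEngine;

// Hypothetical test helper: logs whenever a ShowHelpMessage is broadcast.
public class ShowHelpLogger : MonoBehaviour
{
    void Start()
    {
        // Subscribe to the same message the SpeechCommandExecutor broadcasts.
        Messenger.Instance.AddListener<ShowHelpMessage>(OnShowHelp);
    }

    private void OnShowHelp(ShowHelpMessage message)
    {
        Debug.Log("ShowHelpMessage received");
    }
}
```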

Making the Screen Follow Your Gaze

Another script from my HoloToolkitExtensions that I already mentioned in some form is MoveByGaze. It looks like this:

using UnityEngine;
using HoloToolkit.Unity.InputModule;
using HoloToolkitExtensions.SpatialMapping;
using HoloToolkitExtensions.Utilities;

namespace HoloToolkitExtensions.Animation
{
    public class MoveByGaze : MonoBehaviour
    {
        public float MaxDistance = 2f;

        public float DistanceTrigger = 0.2f;

        public float Speed = 1.0f;

        public BaseRayStabilizer Stabilizer = null;

        public BaseSpatialMappingCollisionDetector CollisonDetector;

        private float _startTime;
        private float _delay = 0.5f;

        private bool _isJustEnabled;
        private bool _isBusy;

        private Vector3 _lastMoveToLocation;

        // Use this for initialization
        void Start()
        {
            _startTime = Time.time + _delay;
            _isJustEnabled = true;
            if (CollisonDetector == null)
            {
                CollisonDetector = new DefaultMappingCollisionDetector();
            }
        }

        void OnEnable()
        {
            _isJustEnabled = true;
        }

        // Update is called once per frame
        void Update()
        {
            if (_isBusy || _startTime > Time.time)
                return;

            var newPos = LookingDirectionHelpers.GetPostionInLookingDirection(MaxDistance,
                Stabilizer);
            if ((newPos - _lastMoveToLocation).magnitude > DistanceTrigger || _isJustEnabled)
            {
                _isJustEnabled = false;
                var maxDelta = CollisonDetector.GetMaxDelta(newPos - transform.position);
                if (maxDelta != Vector3.zero)
                {
                    _isBusy = true;
                    newPos = transform.position + maxDelta;
                    LeanTween.moveLocal(gameObject, newPos,
                        2.0f * maxDelta.magnitude / Speed).setEaseInOutSine().setOnComplete(MovingDone);
                    _lastMoveToLocation = newPos;
                }
            }
        }

        private void MovingDone()
        {
            _isBusy = false;
        }
    }
}


This is an updated, LeanTween-based (instead of iTween) version of a thing I already described before in this post, so I won’t go over it in detail. You will find it in the Animation folder of the HoloToolkitExtensions in the demo project. It uses the helper classes BaseSpatialMappingCollisionDetector, DefaultMappingCollisionDetector, and SpatialMappingCollisionDetector, which are also described in the same post – these are in the HoloToolkitExtensions/SpatialMapping folder of the demo project.
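The contract these detectors fulfill is a single method: given an intended movement delta, return the delta the object is actually allowed to move. A hedged sketch – not the demo project’s actual code, and PassThroughCollisionDetector is a hypothetical name – of what a no-op detector could look like, assuming the base class exposes GetMaxDelta as an overridable member:

```csharp
using UnityEngine;
using HoloToolkitExtensions.SpatialMapping;

// Hypothetical detector that performs no collision checking at all:
// it simply allows the full requested movement, like the fallback
// MoveByGaze creates when no detector is assigned.
public class PassThroughCollisionDetector : BaseSpatialMappingCollisionDetector
{
    public override Vector3 GetMaxDelta(Vector3 delta)
    {
        // No spatial mapping consulted; the whole delta is allowed.
        return delta;
    }
}
```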

The short workflow, for if you don’t want to go back to that article:

  • Add a SpatialMappingCollisionDetector to the Plane in the HelpHolder
  • Add a MoveByGaze to the HelpHolder itself
  • Drag the InputManager on top of the “Stabilizer” field in the MoveByGaze script
  • Drag the Plane on top of the “Collision Detector” field


I would suggest updating “Speed” to 2.5, because although the screen moves nicely and fluidly, the default value is a bit slow for my taste. If you press the play button in Unity now, you will see the screen already following the gaze cursor as you move around with the mouse or the keyboard.

The only thing is that it is not always aligned with the camera. For that, we have the LookAtCamera script I already wrote about in October in part 3 of the HoloLens airplane tracker app, but I will show it here anyway:

using UnityEngine;

namespace HoloToolkitExtensions.Animation
{
    public class LookatCamera : MonoBehaviour
    {
        public float RotateAngle = 180f;

        void Update()
        {
            gameObject.transform.LookAt(Camera.main.transform);
            gameObject.transform.Rotate(Vector3.up, RotateAngle);
        }
    }
}


The only change between this and the earlier version is that you can now set the rotation angle in the editor. Drag it on top of the HelpHolder, and the screen will now always face the user after moving to a place right in front of them.

Fading In/Out the Help Screen

In the first video, you can see the screen fade nicely in on the voice command and out when it’s clicked. The actual fading is done by no fewer than three classes, two of which are inside the HoloToolkitExtensions. First is this simple FadeInOutController, which is actually usable all by itself:

using UnityEngine;

namespace HoloToolkitExtensions.Animation
{
    public class FadeInOutController : MonoBehaviour
    {
        public float FadeTime = 0.5f;

        protected bool IsVisible { get; private set; }

        private bool _isBusy;

        public virtual void Start()
        {
            Fade(false, 0);
        }

        private void Fade(bool fadeIn, float time)
        {
            if (!_isBusy)
            {
                _isBusy = true;
                LeanTween.alpha(gameObject, fadeIn ? 1 : 0, time).setOnComplete(() 
                    => _isBusy = false);
            }
        }

        public virtual void Show()
        {
            IsVisible = true;
            Fade(true, FadeTime);
        }

        public virtual void Hide()
        {
            IsVisible = false;
            Fade(false, FadeTime);
        }
    }
}


So this is a pretty simple behavior that fades the current game object in or out over a configurable timespan, and it makes sure the fade does not get interrupted while it is running. Also – notice that it initially fades the game object out in zero time, so any game object with this behavior will start out invisible.
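Because Show and Hide are public, you can exercise FadeInOutController on its own before wiring up any messaging. A minimal sketch, using a hypothetical FadeToggleTester class of my own invention that toggles the fade with the T key in the editor:

```csharp
using UnityEngine;
using HoloToolkitExtensions.Animation;

// Hypothetical test behavior: put it on the same game object as a
// FadeInOutController and press T in play mode to toggle the fade.
public class FadeToggleTester : MonoBehaviour
{
    private FadeInOutController _controller;
    private bool _shown;

    void Start()
    {
        _controller = GetComponent<FadeInOutController>();
    }

    void Update()
    {
        if (Input.GetKeyDown(KeyCode.T))
        {
            _shown = !_shown;
            if (_shown)
            {
                _controller.Show();
            }
            else
            {
                _controller.Hide();
            }
        }
    }
}
```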

Next up is BaseTextScreenController, which is a child class of FadeInOutController:

using UnityEngine;
using System.Collections;
using System.Collections.Generic;
using System.Linq;

namespace HoloToolkitExtensions.Animation
{
    public class BaseTextScreenController : FadeInOutController
    {
        private List<MonoBehaviour> _allOtherBehaviours;

        // Use this for initialization
        public override void Start()
        {
            base.Start();
            _allOtherBehaviours = GetAllOtherBehaviours();
            SetComponentStatus(false);
        }

        public override void Show()
        {
            if (IsVisible)
            {
                return;
            }
            SetComponentStatus(true);
            var a = GetComponent<AudioSource>();
            if (a != null)
            {
                a.Play();
            }
            base.Show();
        }

        public override void Hide()
        {
            if (!IsVisible)
            {
                return;
            }
            base.Hide();
            StartCoroutine(WaitAndDeactivate());
        }

        IEnumerator WaitAndDeactivate()
        {
            yield return new WaitForSeconds(0.5f);
            SetComponentStatus(false);
        }
    }
}


So this override, on start, gathers all other behaviors, then de-activates components (this will be explained below). When Show is called, it first activates the components, then tries to play a sound, then calls the base Show to unfade the control. If Hide is called, it first calls the base fade, then after a short wait starts to de-activate all components again.

So what is the deal with this? The other two missing routines are like this:

private List<MonoBehaviour> GetAllOtherBehaviours()
{
    var result = new List<Component>();
    GetComponents(result);
    var behaviors = result.OfType<MonoBehaviour>().Where(p => p != this).ToList();
    GetComponentsInChildren(result);
    behaviors.AddRange(result.OfType<MonoBehaviour>());
    return behaviors;
}

private void SetComponentStatus(bool active)
{
    foreach (var c in _allOtherBehaviours)
    {
        c.enabled = active;
    }
    for (var i = 0; i < transform.childCount; i++)
    {
        transform.GetChild(i).transform.gameObject.SetActive(active);
    }
}


As you can see, the first method simply finds all behaviors in the game object – the screen – and its immediate children, except for this behavior itself. If you supply “false” for “active”, the second method first disables all those behaviors, and then sets all child game objects to inactive.

The point of this is that a lot is happening in this screen: it is following your gaze, checking for collisions, spinning a button, and waiting for clicks – all in vain while the screen is invisible. So this setup makes the whole screen dormant, and it can also bring everything back ‘to life’ again by supplying ‘true’. It is important to do things in the right order (first the behaviors, then the game objects), and to gather the behaviors at the start, because once game objects are deactivated, you can no longer get to their behaviors.

The final class does nearly nothing – but this is the only app-specific class:

using HoloToolkitExtensions.Animation;
using HoloToolkitExtensions.Messaging;

public class HelpTextController : BaseTextScreenController
{
    public override void Start()
    {
        base.Start();
        Messenger.Instance.AddListener<ShowHelpMessage>(ShowHelp);
    }

    private void ShowHelp(ShowHelpMessage arg1)
    {
        Show();
    }
}


Basically, the only thing this does is make sure the Show method is called when a ShowHelpMessage is received. If you drag this HelpTextController on top of the HelpHolder and press the Unity play button, you will see an empty scene instead of the help screen. But if you press 0 or yell “show help”, the screen will pop up.

Closing the Screen by a Button Tap

So now the screen is initially invisible, and it appears on a speech command – now how do we get rid of it again? With this very simple script, the circle is closed:

using HoloToolkit.Unity.InputModule;
using UnityEngine;

namespace HoloToolkitExtensions.Animation
{
    public class CloseButton : MonoBehaviour, IInputClickHandler
    {
        private void Start()
        { }

        void Awake()
        {
            gameObject.SetActive(true);
        }

        public void OnInputClicked(InputClickedEventData eventData)
        {
            var h = gameObject.GetComponentInParent<BaseTextScreenController>();
            if (h != null)
            {
                h.Hide();
                var a = gameObject.GetComponent<AudioSource>();
                if (a != null)
                {
                    a.Play();
                }
            }
        }
    }
}


This is a standard HoloToolkit IInputClickHandler – when the user clicks, it tries to find a BaseTextScreenController in its parent and calls the Hide method, effectively fading out the screen. And it tries to play a sound, too.

Some Finishing Audio Touches

Two behaviors – the CloseButton and the BaseTextScreenController – try to play a sound when they are activated. As I have stated multiple times before, having immediate audio feedback when a HoloLens app ‘understands’ a user-initiated action is vital, especially when that action’s execution may take some time. At no point do you want the user to have a ‘huh, it’s not doing anything’ feeling.

In the demo project, I have included two audio files I use quite a lot – “Click” and “Ready”. “Click” should be added to the Sphere in HelpHolder. That is easily done by dragging it from App/Scripts/Audio onto the Sphere. That will automatically create an AudioSource.

The following settings are important:

  • Check the “Spatialize” checkbox
  • Uncheck the “Play on awake” checkbox
  • Move the “Spatial Blend” slider all the way to the right
  • In the 3D sound settings section, set “Volume Rolloff” to “Custom Rolloff”

Finally, drag “Ready” on top of the HelpHolder itself, where it will be picked up by the HelpTextController (which is a child class of BaseTextScreenController), and apply the same settings. You might consider not using spatial sound here, though, because this is a general confirmation sound that is not particularly attached to a location.
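The same inspector settings can also be applied from code, which is handy if you create AudioSources at runtime. A hedged sketch, assuming a hypothetical SpatialAudioSetup behavior placed on the same game object as the AudioSource; the properties used are standard Unity AudioSource members:

```csharp
using UnityEngine;

// Hypothetical helper that mirrors the inspector settings described above.
public class SpatialAudioSetup : MonoBehaviour
{
    void Awake()
    {
        var audioSource = GetComponent<AudioSource>();
        audioSource.spatialize = true;                      // the "Spatialize" checkbox
        audioSource.playOnAwake = false;                    // uncheck "Play on awake"
        audioSource.spatialBlend = 1.0f;                    // slider all the way to the right (fully 3D)
        audioSource.rolloffMode = AudioRolloffMode.Custom;  // "Volume Rolloff" set to "Custom Rolloff"
    }
}
```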

Conclusion

To be honest, a 2D-ish help screen feels a bit like a stopgap. You could also show a kind of video, or play an audio message, telling the user about the options that are available. Ultimately, you can think of an intelligent virtual assistant that teaches you the intricacies of an immersive app. With the advent of ‘intelligent’ bots and things like LUIS, it might actually become possible to have an app guide you through its own functionality via a simple question-and-answer conversation. I had quite an interesting discussion about this subject at Unity Unite Europe. But then again, since Roman times, we have pointed people in the right direction or conveyed commercial messages using traffic signs and street signs – essentially 2D signs in a 3D world. Sometimes we used painted murals or even statue-like things. KISS sometimes just works.

The completed demo project can be downloaded here.



Published at DZone with permission of Joost van Schaik, DZone MVB. See the original article here.

