
Livecoding Recap #44: Dipping My Toes in AR.js [Video]


Watch a live coding video showing how to get a basic demo of an augmented reality app running on iOS 11 using AR.js and React.


This is a Livecoding Recap - an almost-weekly post about interesting things discovered while livecoding. Usually shorter than 500 words. Often with pictures. Livecoding happens almost every Sunday at 2pm PDT on multiple channels. You should follow my YouTube channel to catch me live.


This Sunday, we dipped our toes in AR.js. We didn't get very far with anything real, but fun was had and things were learned.

AR.js is an open-source library by Jerome Etienne that promises to bring "Efficient Augmented Reality for the Web - 60fps on mobile!" And it did. Right after I updated my phone to a beta version of iOS 11.

Why iOS 11?

Because augmented reality requires access to your camera. To get at your camera, AR.js uses WebRTC's getUserMedia. iOS doesn't support WebRTC in any browser until iOS 11, which is due out in the next three months.
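Under the hood, that camera access boils down to a getUserMedia call. Here's a rough sketch of the API AR.js relies on - not its actual internals:

// A sketch of the WebRTC call AR.js depends on - not its actual internals.
navigator.mediaDevices
    .getUserMedia({ video: { facingMode: 'environment' } })
    .then(stream => {
        // Pipe the camera feed into a <video> element for the AR canvas.
        const video = document.querySelector('video');
        video.srcObject = stream;
        return video.play();
    })
    .catch(err => console.error('No camera for you:', err));

If the browser doesn't implement this (any iOS browser before iOS 11), AR.js has nothing to work with.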

You can get the beta version until it's ready. So I did. Now I can AR. AR.js in the browser, ARKit in native. Gonna have to play with that, too.

iOS 11 looks great, btw! My iPhone SE is a little small for the new fluffy design, but I can appreciate that things are easier to see. Does that mean I'm getting old?

Here's What We Did to Get a Basic Demo Running

We used create-react-app to bootstrap an app. This proved more trouble than it was worth because AR.js doesn't work well with import or require() statements. There's been some work to modularize it, but it hasn't landed yet.
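For reference, the bootstrap step went something like this (the app name is made up):

npm install -g create-react-app
create-react-app ar-experiment
cd ar-experiment
npm start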


To get around the import problem, we loaded AR.js and aframe as <script> tags in public/index.html instead. This works well enough, but discards all the optimizations Webpack can do for us.
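Concretely, the workaround looks something like this in public/index.html. The CDN URLs and versions here are assumptions - use whichever builds you have:

<!-- Load aframe and AR.js as globals, bypassing Webpack entirely. -->
<script src="https://aframe.io/releases/0.5.0/aframe.min.js"></script>
<script src="https://jeromeetienne.github.io/AR.js/aframe/build/aframe-ar.js"></script>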

aframe gives us WebVR support. It registers special HTML elements - <a-scene>, <a-entity>, and friends - that we can use to build VR and AR scenes. They're built on custom elements, so they do count as web components.

AR.js gives us... I'm still not sure where aframe ends and AR.js starts. But all the demo code we used comes from the AR.js project, so I'm sure it's doing a lot.

aframe-minecraft is a demo of a dancing Minecraft figure that Jerome uses in some of his videos. We aaaalmost got it working.

We render the scene in src/App.js using strange HTML elements I have never seen before.

import React, { Component } from 'react';
import { Scene } from 'aframe-react';

class App extends Component {
    render() {
        return (
            <div className="App">
                <Scene artoolkit={{ sourceType: 'webcam', trackingMethod: 'best' }}>
                    <a-anchor hit-testing-enabled="true">
                        {/* the dancing Minecraft figure from aframe-minecraft */}
                        <a-entity
                            minecraft
                            minecraft-head-anim="yes"
                            minecraft-body-anim="hiwave"
                            material="opacity: 0.5"
                        />
                        <a-box position="0 0 0.5" material="opacity: 0.5;"></a-box>
                    </a-anchor>
                    {/* full-screen webcam feed, anchored to the "hiro" marker */}
                    <a-camera-static preset="hiro" />
                </Scene>
            </div>
        );
    }
}

export default App;

App is a React component that renders stuff. We'll add more functionality next Sunday.

Inside the render function, we use a combination of aframe-react, which is a thin wrapper on Aframe, and custom HTML elements coming from, I guess, AR.js.

<Scene> creates a new WebAR/WebVR scene. a-entity is some sort of AR entity, whatever that means. In this case, a waving Minecraft figure that is supposed to be colorful and fun, but is instead pure black.

Now that I think of it, we're probably missing texture files.

a-box creates a white semi-transparent box. For some reason, this was necessary to make the Minecraft figure visible. I don't know why... maybe something to do with those textures.

a-camera-static renders a full-screen webcam view using the "hiro" image as an AR marker.
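For comparison, the same scene in plain Aframe markup, without React, would look roughly like this - a sketch pieced together from AR.js's aframe examples, not code we ran:

<a-scene artoolkit="sourceType: webcam; trackingMethod: best;">
    <a-anchor hit-testing-enabled="true">
        <a-box position="0 0 0.5" material="opacity: 0.5;"></a-box>
    </a-anchor>
    <a-camera-static preset="hiro"></a-camera-static>
</a-scene>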

AR.js is a marker-based augmented reality engine, which means it needs a recognizable image to attach itself to the real world. This means you can't render your stuff on top of any random object the camera sees. You need a specific marker.

Like this → [image: the default "hiro" marker]

The experiment code is on GitHub.

Here's What We Learned About AR.js

AR.js is great, but it's early days for WebAR and augmented reality on the web. The experience was hacky and cool.

You need markers, which limits usability. We can potentially improve this with on-the-spot deep learning that turns recognized objects into markers on the fly.

You need a desktop browser, which doesn't need AR, because why would it? Are you going to move your laptop around to look at augmented reality? Prob not.

You need either an Android phone or iOS 11. In a few months, everybody's phone is going to support AR.js. This is huge.

WebRTC requires https. This makes development annoying because a local dev server doesn't have https, so you have to deploy to a real server if you want to test on your phone.
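If you're using create-react-app, one partial workaround is its built-in https mode, which serves the dev build over https with a self-signed certificate (your phone may still balk at the certificate, but it can work for quick tests):

HTTPS=true npm start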

5/7 would hack again.

You should follow me on Twitter here.



Published at DZone with permission of Swizec Teller, DZone MVB. See the original article here.

