- Selected by Google as WebGL VR experience at Chrome Experiments.
- Highlighted by Mozilla as one of the first VR featured projects for the web.
The aim was to tell a story interactively, with a small puzzle the user had to solve in order to move forward in the experience. To tell a story, the first thing you need is a hero. Ours had to find and then place a series of mysterious lights, thus illuminating his tiny world.
I N S P I R I T may seem to be a story about light, but it is really more about people. People who enlighten us and complete us; because I like to think that there are people whose presence lights us up inside.
The original idea was for the lights to create a circle along which our hero was moving. Each light would emit color and a different sound when placed on its pulpit.
In the end, I decided to add mountains and elements behind the road that the character is moving on. This helps in virtual reality, where remote objects appear to move less when you change your point of view. Elements stay longer in the frame and you are able to appreciate them much better.
Different ways to experience it
There are four different ways to enjoy it. That is because one of the goals of the project is to demonstrate to a potential client my ability to tell an interactive story or create a 3D experience on multiple levels, where each viewer can choose how to view it. Perhaps only a few visitors will use the VR or Cardboard options, but if you are able to offer them, you're saying a lot about how far you can take a message.
For this reason, you can view it:
- On your computer (the mouse cursor is locked when the experience begins).
- On a smartphone with a gyroscope, or on a WebGL-ready tablet (particularly for the many iPad users).
- With Cardboard, for the half million owners of this headset (at this point it was key for me to get it to work for iPhone users).
- In VR, with an Oculus Rift device.
The main challenge was learning to develop WebVR on top of WebGL, and discovering the particular quirks of creating a project for Cardboard (it had to be optimized for smartphones) or Oculus. Beyond that, the project was an opportunity to model and animate a character, to work with 3D audio in the scene, and to use different particle effects throughout the experience.
Animated character as a lead actor
Working with Blender as a 3D tool, the first step was to model a simple character (1075 vertices) and to create a skeleton for his movements to be based on user actions.
After that, I created 12 looped animations of about 2 to 3 seconds each. Most of them were exported as skeletal animations for THREE.js with the exporter that comes with the library.
[Before exporting you must always remember to remove the Armature modifier]
And finally, when it was time to animate, the BlendCharacter class by Michael Guerrero that appears in the THREE.js examples was very useful once adapted to the project’s needs.
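The idea behind that kind of blending can be sketched as a simple crossfade: while switching animations, the outgoing clip's influence fades out as the incoming one fades in. This is not the project's actual code; `crossfadeWeights` is a hypothetical stand-in for the per-clip weights a BlendCharacter-style class feeds to the animation system.

```javascript
// Hypothetical sketch of crossfade weights when switching between two
// looped skeletal animations (e.g. "idle" -> "walk").
function crossfadeWeights(elapsed, duration) {
  // Clamp progress to [0, 1] so weights stay valid after the fade ends.
  var t = Math.min(Math.max(elapsed / duration, 0), 1);
  return {
    from: 1 - t, // outgoing animation fades out...
    to: t        // ...while the incoming one fades in.
  };
}

// Halfway through a 0.5 s fade, both clips contribute equally.
var w = crossfadeWeights(0.25, 0.5);
console.log(w.from, w.to); // 0.5 0.5
```

In a render loop you would call this every frame with the time since the transition started and assign the two weights to the corresponding animations.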
Since I was focused on VR, using 3D sound would enrich the scene and make the experience much more immersive. My curiosity was first piqued by this post by Felix Turner, and then last year by Jaume Sanchez's presentation "Everybody dance now with Web Audio API". Attending that presentation, I thought that someday I would implement it in a project.
I used THREE.Audio and THREE.AudioListener as the base, and then integrated them to coexist with SoundJS and PreloadJS, creating a class ( Audio3DControls.js ) that adds the sound buffer loaded with SoundJS to a previously created instance of THREE.Audio. This made it possible to create an Analyser to get the frequencyBinCount, or to work with the panner, gain or context of these audios.
One of the barriers you run into with the Web Audio API is the difference between iOS and other browsers: on iOS, audio stays muted until the user interacts with the page. Also, if you constantly modify the position of the listener, it doesn't sound right on iOS. So in my case, rather than moving the listener all the time, it only updates when the camera orientation changes by more than 5°.
Another aspect to consider is the position of the listener. I created an invisible object halfway between the camera and the protagonist's path, so the listener stayed close to where the sounds were coming from while still changing position. With a fixed camera, placing the listener at (0, 0, 0) (even while playing with the audio panner) did not give the desired effect.
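These two listener tricks can be sketched as plain functions. This is an illustrative reconstruction, not the project's code: the 5° threshold comes from the text, and the positions are plain `{x, y, z}` objects standing in for THREE.Vector3.

```javascript
// Only report a listener update when the camera yaw has turned more
// than 5° since the last update (iOS copes better with infrequent moves).
var THRESHOLD_DEG = 5;
var lastYawDeg = 0;

function shouldUpdateListener(yawDeg) {
  if (Math.abs(yawDeg - lastYawDeg) > THRESHOLD_DEG) {
    lastYawDeg = yawDeg;
    return true;
  }
  return false;
}

// Listener position: the midpoint between the camera and the character,
// so it sits near where the sounds originate.
function listenerPosition(cam, hero) {
  return {
    x: (cam.x + hero.x) / 2,
    y: (cam.y + hero.y) / 2,
    z: (cam.z + hero.z) / 2
  };
}
```

In the render loop you would call `shouldUpdateListener` with the current camera yaw and, only when it returns true, move the AudioListener to `listenerPosition(camera.position, hero.position)`.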
Audio interacting with the scene
I had great hopes for this part of the project. I asked Guillermo Laporta, who collaborated on the music, to create a tune divided into 7 parts: a base and 6 tracks, one for each of the pulpits to unlock. Each light activated a new sound motif.
In the end I managed to work with the mean value of the FFT, read from the analyser for each sound. Depending on the intensity, we modified the parameters of various graphic elements in the scene.
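The idea can be sketched in two small functions: average the frequency bins an AnalyserNode fills in, then map that mean onto a visual parameter. The analyser calls in the comment are the standard Web Audio API; the mapping range and the scaled element are assumptions for illustration.

```javascript
// Mean of the byte frequency data (each bin is 0-255).
function averageFrequency(bins) {
  var sum = 0;
  for (var i = 0; i < bins.length; i++) sum += bins[i];
  return bins.length ? sum / bins.length : 0;
}

// Map the mean intensity onto a [min, max] range, e.g. a scale factor.
function intensityToScale(avg, min, max) {
  return min + (avg / 255) * (max - min);
}

// In the render loop (browser only), per sound:
//   analyser.getByteFrequencyData(data);  // data is a Uint8Array
//   var s = intensityToScale(averageFrequency(data), 1, 2);
//   light.scale.set(s, s, s);             // pulse with the music
```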
It was also very effective to synchronize the playbackRate of the footsteps loop with the character's speed, so the sound of the footsteps automatically matched his movement.
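That sync reduces to one ratio. This is a hedged sketch, not the project's code; `baseSpeed` (the speed the loop was recorded for) is an assumed parameter.

```javascript
// playbackRate is 1 when the character moves at the loop's base speed;
// walking faster pitches and speeds the loop up proportionally.
function footstepsRate(speed, baseSpeed) {
  return speed / baseSpeed;
}

// With Web Audio (browser only):
//   stepsSource.playbackRate.value = footstepsRate(heroSpeed, 2);
```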
Unifying the experience for different devices
Another of the project’s challenges was to get the same navigation to work well on very different devices. You had to be able to use a computer mouse, a mobile phone’s gyroscope, a mobile with Cardboard or Oculus… And it was also important to have a computer as a fallback for quick testing.
For this project I had to modify the DeviceOrientationControls, MouseOrientationControls and VRControls classes so that they emitted an alpha angle based on where the user was looking, in order to allow interaction via a circular cursor.
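The shared alpha angle can be derived from the horizontal look direction, whatever device produced it. This is a hypothetical version of that computation, not the modified classes themselves; it assumes the THREE.js convention of the camera looking down -Z by default.

```javascript
// Horizontal angle of the look direction, normalised to [0, 360).
// 0 deg = looking down -Z; angles grow clockwise when seen from above.
function alphaFromDirection(dirX, dirZ) {
  var deg = Math.atan2(dirX, -dirZ) * 180 / Math.PI;
  return (deg + 360) % 360;
}
```

A circular cursor can then consume this single angle regardless of whether it came from the mouse, the gyroscope or the Oculus tracker.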
Additionally, for each configuration, the FOV and the height of the camera changed to suit each device’s field of vision settings.
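A per-device settings table makes that switch explicit. The values below are illustrative assumptions, not the project's actual numbers, and the camera here is a plain object standing in for a THREE.PerspectiveCamera.

```javascript
// Assumed per-mode camera settings: field of view (deg) and height (m).
var cameraSettings = {
  desktop:   { fov: 60,  height: 1.6 },
  gyroscope: { fov: 70,  height: 1.6 },
  cardboard: { fov: 90,  height: 1.5 },
  oculus:    { fov: 100, height: 1.5 }
};

function configureCamera(camera, mode) {
  var s = cameraSettings[mode] || cameraSettings.desktop; // safe fallback
  camera.fov = s.fov;
  camera.position = camera.position || {};
  camera.position.y = s.height;
  // With a real THREE camera, the projection must be rebuilt after a FOV change.
  if (camera.updateProjectionMatrix) camera.updateProjectionMatrix();
  return camera;
}
```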
Lessons learned developing for VR
Designing for VR
One of the main lessons is that things that look attractive on a computer screen can turn out to be annoying in VR. In VR, you have to avoid making the viewer constantly refocus their attention. Avoid moving several elements in separate directions at once; it quickly tires the eyes out from trying to follow them all.
One of the main articles dealing with this issue is Quick VR Mockups with Illustrator, by Josh Carpenter. His approach to the human field of vision for VR is very valuable. It is also interesting to read the part of this VR Interface Design Manifesto that explains how human visual perception works.
Additionally, this interview with the creator of Temple Run, who is now creating a VR version, gives some tips on what to avoid when designing an experience for VR.
"Be keenly aware of people's sensitivity to motion and avoid doing things in-game that might cause people to get motion sick."
"Overall, anytime you cause camera movement that isn't directly initiated by the player, you are asking for trouble."
Although I used a system where the user chooses the type of experience up front, Boris Smus has created a responsive solution: the WebVR Boilerplate is a good starting point for creating different types of navigation based on the user's device.
Especially when optimizing for use with Cardboard, I ran into two problems to solve:
Managing the device's sleep timeout. When viewing the experience with Cardboard, you do not touch the screen for a long time. On iOS this hack still works, but on Android there is no equivalent and the user would have to change the Sleep settings manually; the webvr-manager.js of the WebVR Boilerplate includes a method that solves this problem for both iOS and Android devices.
The other is how to enter fullscreen and landscape orientation to watch the experience. On Android, using screen.orientation.lock('landscape') combined with the Fullscreen API makes this achievable in one click. But on iOS we can only display a notification asking the user to rotate when the device is in portrait mode.
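The branching above can be sketched with simple capability detection. This is an illustrative sketch, not the project's code: `showRotateNotice` is a hypothetical helper for the iOS fallback, and the screen/element objects are passed in so the logic stays testable.

```javascript
// Lock to landscape where the Screen Orientation API exists (Android
// Chrome); otherwise fall back to a "rotate your device" notice (iOS).
function enterLandscape(screenObj, element, showRotateNotice) {
  if (screenObj.orientation && screenObj.orientation.lock) {
    if (element.requestFullscreen) element.requestFullscreen();
    screenObj.orientation.lock('landscape');
    return 'locked';
  }
  // No lock available: just ask the user to rotate the device.
  showRotateNotice();
  return 'notice';
}

// In the browser: enterLandscape(screen, document.documentElement, showNotice);
```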
Development was greatly accelerated thanks to tools like Grunt used to automate publishing tasks for both desktop and mobile, and Bower, to manage library packages.
We generally used libraries like jQuery, TweenMax, SoundJS or PreloadJS to make programming more comfortable. And we took advantage of the benefits of js-signals for event handling, and of handlebars for templates.
As in art books or Pixar shorts, I created a Color Script with screenshots for different moments of the experience. I like to think that because of the interactivity, every viewer will see a different order of colors.
In the end, the result can be seen at inspirit.unboring.net. I hope you try it and enjoy the experience.
The world of 3D web and the emergence of VR give us the opportunity to offer new ways of interacting with content. Our goal is to create innovative projects that demonstrate that through the Web you can get amazing results.
For future projects, we want to use textures in our characters, optimize and reuse characters by dynamically customizing their dimensions, and integrate Leap Motion in the experiences.
Would you like to work with us in achieving these goals? If you found this article interesting and would like to pursue a 3D / VR web project, we'd be happy to hear from you. Here you can find different ways to contact or find out more about us.
Arturo Paracuellos (@arturitu) - Direction, Design & Development
Guillermo Laporta (guillermolaporta.com) - Sound Design