Mixed Reality: the future of computer interaction

When a game-changing technology comes along, so does a whole new world of opportunity. In the case of Mixed Reality, that opportunity is a whole new range of multidimensional applications living in, and reacting to, your real-world environment. Earlier applications developed around a keyboard-and-mouse interface, followed by a touchscreen interface. Today's applications are beginning to exploit a largely unexplored interface: your personal environment. Physically holding and rotating an object lets you examine it independently, and engages a level of understanding far beyond flat diagrams or simple animations. This is the introduction of digital information, viewed through a lens, into our personal environment.

The opportunity to enter the application market has recently seemed somewhat limited. Anything short of a brilliant and original idea has been done in some form or another, although this was not always the case. Before smartphones, web-based applications had a limited reach, use, and capability. The smartphone unlocked gyroscopes, GPS, touch interaction, microphones, cameras, even an on-demand flashlight, and each of these opened the door to entirely new possibilities for applications. In the near future, Mixed Reality devices will have a similar effect on what is possible.

What creates a Mixed Reality experience is a lens in front of your eyes; viewed through a phone screen, the same program would be an AR (Augmented Reality) experience. The term Mixed Reality was championed by Microsoft with the introduction of the HoloLens, a device that provides one of the most impressive MR experiences to date because it can scan your room and maintain stable 3D geometry within your space.

This mixing of realities allows your device to see what you see, understand what it is seeing, and add digital information to your space. These devices can put a model volcano on a desk, cut it in half to show the inner workings, or change the material of a surface from carpet to what looks like hardwood. They can even create a ball in front of you and let it drop to your floor. As for Ascend Reality, we can visualise an entire apartment complex inside a sales office.
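
To make the surface-sensing part concrete, here is a minimal sketch of the general pattern: the device casts a ray into the real world and hands the application a pose on a physical surface, where content can then be anchored. It is written against the browser's WebXR hit-test API rather than the HoloLens's own tooling, purely as an illustration; placeVolcanoAt is a hypothetical rendering hook, not part of any real API.

```ts
// Minimal sketch: finding a real surface with the WebXR hit-test API and
// placing content on it. Assumes a WebXR-capable browser and headset
// (plus WebXR type definitions for TypeScript).
declare function placeVolcanoAt(p: DOMPointReadOnly): void; // hypothetical render hook

async function startPlacement(): Promise<void> {
  const session = await navigator.xr!.requestSession('immersive-ar', {
    requiredFeatures: ['hit-test'],
  });

  // Poses are reported relative to where the session started.
  const localSpace = await session.requestReferenceSpace('local');

  // Cast a ray from the centre of the user's view into the real world.
  const viewerSpace = await session.requestReferenceSpace('viewer');
  const hitTestSource = await session.requestHitTestSource!({ space: viewerSpace });

  session.requestAnimationFrame(function onFrame(_time, frame) {
    const hits = frame.getHitTestResults(hitTestSource);
    if (hits.length > 0) {
      // The nearest hit is a point on a real surface, e.g. the desk.
      const pose = hits[0].getPose(localSpace);
      if (pose) placeVolcanoAt(pose.transform.position); // hypothetical
    }
    session.requestAnimationFrame(onFrame);
  });
}
```

The same pattern underlies putting a volcano on a desk or letting a ball find your floor: the device supplies a pose on a real surface, and the application anchors its content there.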

Early applications lived on a computer monitor and, as mentioned previously, were controlled by a keyboard and mouse, so their interfaces developed within those constraints. When applications were designed for the touchscreen, the interfaces changed to leverage new functionality: our fingers replaced the mouse, and pinch-to-zoom replaced the scroll wheel. With Mixed Reality, the interface will change again. Microsoft already uses the direction of your gaze as a 3D cursor and tracks your hands to let you "click" or open a menu. The Meta 2 tracks each of your hands so you can grip the sides of a digital object, resizing and relocating digital content within arm's reach. The possibilities here are immeasurable, and they will develop as the technology is adopted. Much like pinch-to-zoom, which seems obvious now but was not discovered the moment the enabling technology was released, MR interaction will develop over time and draw on many inputs, from your hands to your eyes to, possibly, the voice inside your head. See MIT's AlterEgo for a mind-blowing development in that direction.
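
For a flavour of what gaze-as-cursor looks like in code, here is a sketch using WebXR's input model (not Microsoft's or Meta's actual SDKs); handleSelectAlong is a hypothetical application function.

```ts
// Sketch: treating gaze as a 3D cursor in a WebXR session. When the headset
// reports a 'select' gesture (an air tap on HoloLens, for example), read the
// pose of the gaze ray and act on whatever it points at.
declare function handleSelectAlong(ray: XRRigidTransform): void; // hypothetical

function wireUpGazeInput(session: XRSession, refSpace: XRReferenceSpace): void {
  session.addEventListener('select', (event: XRInputSourceEvent) => {
    const source = event.inputSource;

    // 'gaze' means the target ray originates at the user's head, as opposed
    // to a tracked hand or controller ('tracked-pointer').
    if (source.targetRayMode === 'gaze') {
      const pose = event.frame.getPose(source.targetRaySpace, refSpace);
      if (pose) handleSelectAlong(pose.transform); // e.g. raycast into the scene
    }
  });
}
```

Notably, the same 'select' event fires whether the "click" comes from an air tap, a controller trigger, or a tracked hand, which is exactly the kind of input abstraction these new interfaces are converging on.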

The obvious power of Mixed Reality is the ability to see a 3D object as if it were really in the room with you. A picture in a book, or on a website, is still a 2D representation and requires a reference to imply scale. But a scale 3D object sitting on your coffee table is something you can move around and view from any angle. A running model of an engine can be stripped down in front of you while its moving parts keep demonstrating how they work. Free from the weight of reality, a user can lift that engine up to see what is going on underneath, spin it around, and view a cross section at will. Gone are the days of confusing diagrams and expensive physical models; education will be changed forever.
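
One common way to render a "cross section at will" is with a clipping plane. The sketch below uses three.js, purely as an illustration of the technique under that assumption; a simple box stands in for a loaded engine model.

```ts
import * as THREE from 'three';

// Sketch: a cross section rendered with a clipping plane in three.js.
// Everything on one side of the plane is cut away, exposing the interior.
const renderer = new THREE.WebGLRenderer();
renderer.localClippingEnabled = true; // per-material clipping must be switched on

// Normal (-1, 0, 0) with constant 0 clips away everything with x > 0.
const crossSection = new THREE.Plane(new THREE.Vector3(-1, 0, 0), 0);

const material = new THREE.MeshStandardMaterial({
  clippingPlanes: [crossSection],
  side: THREE.DoubleSide, // render interior faces so the cut looks solid
});
const engineMesh = new THREE.Mesh(new THREE.BoxGeometry(1, 1, 1), material);

// Sliding the plane's constant moves the cut through the model,
// the digital equivalent of stripping the engine down layer by layer.
function setCutDepth(depth: number): void {
  crossSection.constant = depth;
}
```

Animating setCutDepth sweeps the cut through the model while the rest of it keeps running, which is the effect described above.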

 

Jesse Dombowsky