Yesterday I met with Abdul, my co-founder at Advimu. He showed me the progress of his current research and development in augmented reality and brain-computer interfaces.
He has been working on combining augmented reality technology with brain-computer interface technology. The result: you can remotely control things in your environment simply by looking at them or by thinking about them.
One example: you want to turn off the lights in your living room? Simply look at the “turn off” label shown in your AR glasses and the lights will turn off automatically.
What exactly is BCI?
A BCI, or brain-computer interface, is essentially a way for a computer to read and measure your brainwaves. This can allow a connected brain to remotely control a robotic arm, among other things. Today the technology is still primarily used to improve or restore damaged hearing, sight, and movement.
Paraplegics can control and move Smart Objects with BCI
Now you might ask yourself what this is good for. There are a few cases where this technology could be genuinely helpful. The first and most important use case is paraplegia. If you are disabled and cannot talk, walk, or even move your fingers, everyday tasks become a serious problem.
You cannot simply turn off the lights in your room. If you are lying in a hospital bed, you cannot simply raise or lower it. For nearly everything you need somebody’s help, and even that is a problem, because calling for help is hard when you can neither move your fingers nor talk.
You may not need both BCI and AR in the future
Brain-computer interfacing (BCI) could be a very helpful solution for disabled or elderly people who usually rely on the help of others. Interestingly, augmented reality glasses are not even that important for this use case. Today they are used to stimulate your brain visually and to detect objects in your environment, so the AR part may become redundant in the future. With BCI technology alone you could control your smart home simply by thinking. Think “turn all lights off” and all lights will turn off. In the future this will not necessarily require AR glasses, although it will probably still be a combination of both.
Do we need BCI to control smart objects?
Controlling your smart home or everyday objects does not necessarily require a brain-computer interface. I think the most promising approach to controlling connected objects will be based on AR technology alone.
First of all, I don’t think BCI technology will win the approval of the majority of people anytime soon. Connecting your brain to a computer is still a scary idea, and let’s be honest: at least for me, it will always be a little bit scary. The good thing: most of us luckily have the ability to speak, look, and move our hands and fingers. These abilities prove very helpful, especially when combined with recent technologies, for controlling smart objects in an easy and convenient way.
Using our Voice to Control Smart Objects
Voice controls combined with smart assistants are already on sale: Amazon Echo, Amazon Echo Dot, and Google Home. You can already use these smart assistants to remotely control your smart home. Connect your lights, heater, air conditioner, washing machine, etc. to your smart assistant, and you can simply say: “Alexa, turn off all lights.” This is not only the easiest way of controlling smart objects today, it is the best.
Voice recognition software has become excellent in recent years, and it does not stop at simple voice recognition. Microsoft recently launched a developer API that allows us humans to have (more or less) real conversations with computers. This makes it possible to say: “It’s a little bit dark here.” Your smart assistant will understand the real meaning behind your statement and respond: “I agree with you, Marius. I am turning a few lights on.”
Voice Control combined with Augmented Reality Glasses
If we look five years into the future, it is very likely that AR technology will still be glasses-based. Magic Leap, Microsoft, Epson, and many other companies are working on glasses-based AR technology. So how can we control smart objects with AR glasses? There are several ways. The first and easiest will be a combination of AR glasses and voice recognition. Say you want to turn off your lights: you simply say “turn off the lights” or, as we just learned, “it’s pretty dark inside.” The microphone in your AR glasses will understand you and remotely control the lights in your room. The job of the AR glasses is to show you additional information about your smart objects and smart home.
Let’s say you enter a lift outside of your home. As soon as you step in, your AR glasses recognize the lift and show you the floor options:
“1st floor: Dr. Miller, 2nd floor: Company XYZ, 3rd floor: Hotel Radisson.” Now you can simply say “3rd floor” or “Hotel Radisson” and the lift will take you there.
Eye- and Voice Detection to Control Smart Objects
Eye detection is another way AR technology might help us control smart objects. While your AR glasses show you the different options inside the lift, instead of speaking your choice out loud you simply look at it. Your AR glasses display the three options, and while you look at the third one, the glasses detect it and take you there. Eye detection technology recognizes exactly where you are looking and what you intend to do.
Combine it for Greatness
I think the best way to control smart objects in the future is a combination of different technologies, probably all built around AR glasses or similar devices. Some may opt for AR glasses with an integrated brain-computer interface; others will choose the more convenient and affordable option of AR glasses with voice recognition and eye detection.
If you need to choose among one to six simple options, it will be faster and simpler to glance at the right option in your AR headset. When things get more complicated, it will be faster and easier to say what you want.
Glancing at “3rd floor” is the easier way to ride a lift, but it is faster to say: “Turn off all lights, start the washing machine, and set the heater to 20°C after I leave home.”
One thing is sure: interacting with smart objects will become easier and faster. While most smart home devices today still require an app, future devices will be controlled with a glance of your eye.