Although Google’s Project Glass has already raised over $75 million in funding, it still has a long way to go before becoming a household name like Apple’s iPhone or BlackBerry’s PlayBook. Many skeptics dismiss it as nothing more than another gimmick, yet Google’s new eyewear is already garnering positive feedback from a surprising number of consumers. Some of the more futuristic wearable-glasses prototypes were abandoned soon after their half-baked introductions, never reaching a consumer-ready form the way Google Glass has. While the big tech companies work out how to bring head-mounted electronics into the mainstream by way of apps and handheld devices, an army of startups is entering the market, pairing touch-sensitive see-through screens with heart-rate and fitness-monitoring applications.
One such startup is MetaSense, which is currently testing an augmented-reality headwear solution. It uses high-definition cameras built into ordinary-looking glasses to track the wearer’s movement and deliver detailed information about where the wearer is in real time. A handful of gyroscopes and other sensors have already been incorporated into MetaSense’s solution so that it can tell when the wearer’s hands are in motion and which items are on the plate in front of them. By tracking a person’s motion through images of his hands, MetaSense can work out where he is in relation to items in his kitchen, whether he is moving towards the sink or towards the door, and even which items are blocking his view.
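At its simplest, the kind of camera-based movement tracking described above boils down to comparing successive video frames and deciding whether enough pixels changed. A minimal sketch in plain Python (the frame size, threshold values, and `detect_motion` function are illustrative assumptions, not MetaSense’s actual pipeline):

```python
def detect_motion(prev_frame, curr_frame, threshold=25, min_changed_fraction=0.01):
    """Return True if enough pixels differ between two grayscale frames.

    A crude stand-in for camera-based motion tracking; a real headset
    would fuse this with gyroscope data and object recognition.
    """
    changed = total = 0
    for prev_row, curr_row in zip(prev_frame, curr_frame):
        for p, c in zip(prev_row, curr_row):
            total += 1
            if abs(c - p) > threshold:
                changed += 1
    return changed / total >= min_changed_fraction

def blank_frame(width=64, height=64):
    """A synthetic all-black grayscale frame as a list of pixel rows."""
    return [[0] * width for _ in range(height)]

# Two synthetic frames: a bright 10x10 "hand" patch shifts five pixels right.
prev_frame = blank_frame()
curr_frame = blank_frame()
for y in range(20, 30):
    for x in range(10, 20):
        prev_frame[y][x] = 200
    for x in range(15, 25):
        curr_frame[y][x] = 200

print(detect_motion(prev_frame, curr_frame))  # True: the patch moved
print(detect_motion(prev_frame, prev_frame))  # False: nothing changed
```

Frame differencing like this only says *that* something moved; recognizing *what* moved (a hand versus a shadow) is the harder vision problem these startups are competing on.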
Another team, Vuzix, came up with an interesting idea: integrating Google Glass with their existing mobility solution, the Vuzix Blade. The two technologies share a lot of similarities; both are touch-sensitive, powered by voice recognition, and GPS/RFID-enabled. However, they differ greatly in implementation. Because the two technologies work so well together, the developers at Vuzix wanted to incorporate Glass technology into their product line, so they built a hands-free device that acts as an “augmented reality viewer,” combining the power of Google Glass with mobile connectivity. A future model may let consumers control augmented-reality scenes through their smartphones.
There’s also Jambool, which wants to help you take photos hands-free. Its device, the Jambool+, lets you use your phone’s camera to capture your own shots without holding the phone. With the help of a remote user interface, the captured images can be instantly shared on social networks, emailed to a recipient, or uploaded to a photo service like Flickr.
Smartphone augmented-reality companies like Augmented Reality Labs want to be the ones to make Google Glass popular. They have already been working with Samsung to develop the Meuyi smartphone, which is equipped with Google Glass functionality. The goal is for the Meuyi to become the world’s first wearable, fully integrated smartphone built on Google’s technology. Jambool and other players are likewise looking for ways to take mobile augmented reality to the next level.
Other smartphone manufacturers are working on devices that can not only take photos but also surf the web, reply to emails, track flight speeds, and check the weather. Samsung already has its own vision with the Galaxy Gear, and other companies are following suit. Garmin, for instance, has announced plans to launch a new type of smartphone that combines Google’s Glass technology with data from its own GPS navigation system. The upcoming device, called the omniwear GPS, will feature built-in GPS and let users surf the web, reply to emails, and track flight speeds; it will also control music and video and keep the weather up to date.
In the future, we may see augmented reality take shape alongside more advanced mobile technology. One such device, the Google Phone, already has some AR functionality. The phone has a front-facing camera that can digitally enhance images taken by the user. It also includes image-stabilization technology, so you won’t have to hold your arm perfectly still to take a picture or use your fingers to adjust the focus controls on the screen. The phone also uses an AR element called ProjectKnow, which allows its users to instantly connect with people around the globe. This is a strongly differentiating feature compared with the other major smartphone brands, like Samsung and LG, which still haven’t fully embraced the idea.
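The “digital enhancement” mentioned above can be as simple as stretching an image’s contrast so its pixel values span the full 0–255 range. A toy sketch in plain Python (the `enhance` function and the tiny 2×2 sample image are illustrative assumptions, not Google’s actual camera pipeline):

```python
def enhance(image):
    """Stretch grayscale pixel intensities to the full 0-255 range.

    A toy stand-in for camera-side enhancement; real pipelines also
    perform denoising, stabilization, and sharpening.
    """
    lo = min(min(row) for row in image)
    hi = max(max(row) for row in image)
    if hi == lo:  # flat image: nothing to stretch
        return [row[:] for row in image]
    return [[round((px - lo) * 255 / (hi - lo)) for px in row] for row in image]

# A dull image whose values cluster between 100 and 130 gains full contrast.
dull = [[100, 110], [120, 130]]
print(enhance(dull))  # [[0, 85], [170, 255]]
```

Contrast stretching is the cheapest of these operations, which is why it can run on-device as the photo is taken rather than in post-processing.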
Another upcoming application from the manufacturer, called moive, takes advantage of the multi-touch gestures of capacitive touchscreens, such as those on iPhone and Android devices, to let you browse through an app in a “flow”-like manner. You might swipe left to right to go to the next section of the app, or tap the centre of the screen to jump straight to the menu. You can also tap the sides of the screen to back out of an app, much as you would return to the home screen on an iPhone or tap the menu option on an Android handset. The trick is not to swipe too fast, to avoid jarring your finger against the screen. These types of multi-touch gestures are only available on the fully functioning Google Glass, so it’s possible that we won’t see these features offered on regular glasses anytime soon.
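The gesture scheme described above amounts to a small state machine that maps recognized gestures to navigation actions. A minimal sketch in Python (the gesture names, section list, and `GestureNavigator` class are our illustrative assumptions, not moive’s actual API):

```python
class GestureNavigator:
    """A toy model of 'flow'-style gesture navigation: swipes move
    between sections, a centre tap opens the menu, a side tap backs out."""

    def __init__(self, sections):
        self.sections = sections  # ordered app sections to flow through
        self.index = 0            # current section
        self.in_menu = False      # whether the menu overlay is showing

    def handle(self, gesture):
        """Apply one gesture and return the screen now being shown."""
        if gesture == "swipe_right":   # swipe left-to-right: next section
            self.index = min(self.index + 1, len(self.sections) - 1)
        elif gesture == "swipe_left":  # swipe right-to-left: previous section
            self.index = max(self.index - 1, 0)
        elif gesture == "tap_center":  # jump straight to the menu
            self.in_menu = True
        elif gesture == "tap_side":    # back out, like the home button
            self.in_menu = False
        return "menu" if self.in_menu else self.sections[self.index]

nav = GestureNavigator(["news", "photos", "weather"])
print(nav.handle("swipe_right"))  # photos
print(nav.handle("tap_center"))   # menu
print(nav.handle("tap_side"))     # photos
```

Clamping the index at both ends (rather than wrapping around) mirrors the “don’t swipe too fast” advice: an over-eager swipe simply stops at the last section instead of jumping somewhere unexpected.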