Virtual & Augmented Reality News
Advancements in 3D are made every day. No longer do users have to suffer through those red and cyan glasses to watch 3D movies. Some implementations nowadays, like the Nintendo 3DS, don't require any fancy headgear at all. It seems as though the future of 3D isn't that far off and the Star Trek holodeck technology may no longer remain a fantasy.
Sharp Labs Europe are leading the development of a new display technology that aims to make 3D images indistinguishable from reality through holography. Some brush 3D off as just a gimmick, but others see it as only the first step. "The ultimate goal is to make a holographic display and what I mean by that is a display that shows images that are indistinguishable from reality," Mather says.
The first step, however, is to make it as commonplace as driving. "I think a sensible target is for 3D displays to become a natural part of modern life," Mather explains to Humans Invent. "Home cinema systems showing 3D movies, computer games played in an immersive environment and holiday photos presented with depth."
Just how far off from this goal are we? And what is the goal? Mather explains: "The ultimate 3D display is a holographic display. Many people don't realise but there is one thing missing from today's stereoscopic 3D displays."
Computex 2012 - I'm sure you're aware of our relationship with NVIDIA, and yesterday we walked through their stand in Nangang at Computex. I noticed my badge was getting looked at by virtually everyone; I don't know if the staff were told to look out for certain media, but it felt like it. I thought I'd ask for a t-shirt, lanyard, etc. - as I'm still a fan of the company - but all I got was a lanyard. We decided to visit again today and got the same looks, but had more time there as it wasn't so wall-to-wall crazy busy. First up we have the GeForce GTX 680 and some awards it won.
Next up we hit the ASUS ROG PCs, where a StarCraft II session was happening with some pro-gamers; check out the shots below.
A nice panoramic so you can get a feel of the scale of the event itself.
That little project being worked on by Google is gathering more publicity. In a recent interview with California Lieutenant Governor Gavin Newsom, Google co-founder Sergey Brin let Newsom put the glasses on to see a picture Brin had taken with them. When asked, Brin wouldn't say how the picture was taken, but we do now know that there is a touchpad on the side, behind the display.
In the picture above, you can see Mr. Brin operating the touchpad with his finger while looking for the picture he had taken previously. Once it was found, Newsom got to wear the glasses and commented on them: "You can easily forget you have them on, and sense the capacity of use in the future."
Newsom expressed that he was impressed with the quality of the image, especially since a stage isn't an ideal place to demo a display. He even commented that the "image was remarkably clear." Brin also let everyone know that the glasses are a rough prototype: "I have some hopes to maybe get it out sometime next year, but that's still a little bit of a hope."
It's likely that the final product will be a fair bit different from the current version, but with the prototype being so well received, it speaks volumes for how polished the finished product could be. Newsom expressed that the glasses are "a heck of a lot further along than people have imagined." Brin explained Google's view on the glasses: "The idea is that you want to be free to experience the world without futzing with a phone."
Google is sure throwing quite a bit of money and support at the Project Glass augmented reality glasses. They are trying to cram so much technology into a tiny, and hopefully fashionable, package that the glasses should become every geek's dream gadget. We still don't know much about the project, other than that it's being publicly tested by Google executives.
We've seen some pictures released that have been taken using the glasses and, to be honest, they aren't that great. Most modern smartphones could easily outpace the resolution and quality, but the glasses do have one advantage: Point-of-view. Pictures can be taken hands-free and are from the perspective of the wearer.
Google has now released a 15-second video that was taken while the user was jumping on a trampoline. This really is where these glasses start to shine. It would have been near impossible to take a video like this without these glasses. The quality of the video isn't stunning by any means, but for users who video blog their lives, these glasses are an invaluable piece of technology.
Intel building Skynet: launches research into technology that mimics the human brain and "learns" about its user
Reuters is reporting that Intel are launching research in Israel into technology that will mimic the human brain, with devices that will hopefully "learn" about their user. Intel's Chief Technology Officer, Justin Rattner, told reporters in Tel Aviv:
Machine learning is such a huge opportunity. Despite their name, smartphones are rather dumb devices. My smartphone doesn't know anything more about me than when I got it. All of these devices will come to know us as individuals, will very much tailor themselves to us.
The research will be carried out by the Intel Collaborative Research Institute for Computational Intelligence, along with specialists from the Technion in Haifa and the Hebrew University in Jerusalem, and is aimed at enabling new applications, as well as small, wearable computers that can enhance or help with daily life. One example cited: if you habitually leave your keys in the house, the wearable system would remember where you left them and learn the pattern. By the second week, it would remind you to pick up the keys before you leave the house.
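The keys example boils down to a simple idea: count repeated observations of a behaviour, and start reminding once the pattern is established. Here's a minimal, purely illustrative sketch of that idea in Python (this is not Intel's actual system; the class name, threshold, and event strings are all invented for the example):

```python
# Illustrative sketch only, not Intel's implementation: a wearable that
# "learns" a habit by counting repeated observations, then starts
# reminding once the pattern has been seen often enough.

from collections import Counter

class HabitLearner:
    def __init__(self, threshold=7):
        # threshold: how many observations before we treat it as a habit
        # (e.g. seen daily for a week, per the article's example)
        self.threshold = threshold
        self.observations = Counter()

    def observe(self, event):
        """Record one occurrence of an event, e.g. 'keys left on hall table'."""
        self.observations[event] += 1

    def reminder_for(self, event):
        """Return a reminder string once the habit is established, else None."""
        if self.observations[event] >= self.threshold:
            return f"Reminder: {event}"
        return None

learner = HabitLearner()
for _ in range(7):  # a week of leaving the keys in the same spot
    learner.observe("keys left on hall table")

print(learner.reminder_for("keys left on hall table"))
```

A real system would of course infer the events from sensors and time-of-day context rather than explicit strings, but the learn-then-remind loop is the core of what Rattner is describing.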
News on Google's augmented reality Glasses hasn't been plentiful since they were teased a few weeks ago, but CEO Larry Page was spotted rocking the Glasses in London. The pictures you see below come courtesy of a Google employee, who posted them on Google+.
The employee wrote alongside his pictures: "My life is now complete - met Larry Page today! Thank you for visiting EMEA". Google's Glasses project is quite big, as it's the company's first jump into cutting-edge hardware.
On top of this, thanks to yesterday's acquisition of Motorola, we should see Google's hardware division strengthened considerably. The best picture is the one above, where I like to think Page is laughing at Facebook's share price tanking, displayed as a layer in front of his eyes thanks to the Glasses, but that's just my guess.
Where do I begin to even categorize this? Under 'Augmented Reality', I guess? Well, Microsoft have been testing their home automation software, dubbed HomeOS, over the past few months. HomeOS can control quite a lot of gadgets, more than you probably think.
HomeOS sees smartphones, printers and air conditioners as network peripherals, all controlled by a dedicated gateway computer. HomeOS even sports some apps, which perform functions such as energy monitoring, remote surveillance and face-recognition. This is all thanks to Microsoft testing the suite out in 12 homes over the past few months.
This list will only grow, and the apps are made available through a portal called "HomeStore". These apps will surely turn into something magical over the years if HomeOS takes off.
We've talked about this a few times, but the latest news to float to the surface of the sea that is the Internet concerns Valve's hiring of hardware engineers, which I, like other tech sites, presumed was for their unannounced Steam Box home console. I was wrong. It seems Valve are hiring for something quite different: wearable computing.
Google are doing it, Apple will wait a few years, do it, and call it revolutionary, but it seems Valve are also getting into the mix. GamesIndustry reports on a recent blog post by Valve developer Michael Abrash, in which he revealed that Valve are hiring for wearable computing. The project is inspired by Neal Stephenson's novel Snow Crash, and Abrash has taken it upon himself to try to shrink computers down to the point where you can have one on you at all times.
The on-going multiple wars that the United States are knee-deep in probably won't stop anytime soon, and will most likely only get worse from here on out if everything we're being fed on the news is right: Iran, North Korea and those pesky "terrorists" that the U.S. government fund left, right and centre. But the latest step that the United States Department of Defense is working on is something very interesting indeed.
The U.S. DoD have signed a contract with Innovega, a Washington-based firm, for the development and testing of its dual-focus contact lenses. The technology would make it possible to project a HUD (heads-up display) onto the center of each lens, while keeping it in focus regardless of where the wearer is looking.
This would really just give the soldier a game-like HUD, where he could view his health stats, armor, and XP. On a serious note, the soldier could see details like notes from superiors, real-time maps, satellite views, and more. We're already seeing the consumer-level version of this technology from Google, in the form of their Project Glass.
It's interesting to see where the future of technology is heading: we're really looking at merging a cyber lifestyle with our bodies and persons. It simultaneously scares me and makes me want it oh so much.
An enterprising gamer has done what gamers do best: thought outside the box. He grabbed a bunch of components and built a virtual reality, motion-controlled Skyrim experience. The equipment used was: Skyrim, a Sony HMZ-T1, a Kinect, a TrackIR 5, a TrackClip Pro, Shoot [software], and FAAST 0.9 [also software]. The results? This:
The artist goes by the name of 'Awesome Man', and from the video's comments we have:
I've setup the Sony HMZ-T1 head mounted display to use Stereoscopic 3D as well as attached the TrackClip Pro on it for head tracking. I had to place the TrackIR 5 on a wire hanging from the ceiling as it needed to be around head level to track my head movements properly.
The Kinect was setup on the PC using PrimeSense's OpenNI drivers. I used FAAST 0.9 with a custom script to map certain gestures with the keyboard, such as walking on the spot to move in the game, leaning left, leaning right, jumping and moving my right arm forward to use the sword.
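The clever part of this rig is that FAAST translates recognised body gestures into ordinary keyboard and mouse events, so Skyrim needs no modification at all. Here's a minimal, purely illustrative Python sketch of that gesture-to-input mapping idea (this is not FAAST's actual binding syntax or API; the gesture names and key assignments are invented to mirror the setup described above):

```python
# Illustrative sketch of FAAST-style gesture bindings (not FAAST's real
# config format): each recognised gesture maps to a key/mouse event that
# the game treats as normal input.

GESTURE_BINDINGS = {
    "walk_in_place":     ("hold", "w"),           # walking on the spot -> move forward
    "lean_left":         ("hold", "a"),
    "lean_right":        ("hold", "d"),
    "jump":              ("press", "space"),
    "right_arm_forward": ("press", "mouse_left"),  # swing the sword
}

def translate(gesture):
    """Translate a recognised gesture into an (action, key) event, or None."""
    return GESTURE_BINDINGS.get(gesture)

# A stream of gestures from the skeleton tracker becomes input events:
events = [translate(g) for g in ("walk_in_place", "jump", "right_arm_forward")]
print(events)
```

The appeal of this design is that the game-facing side is just synthetic keypresses, which is exactly why an off-the-shelf title like Skyrim works with it unmodified.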