
In several months, Microsoft will unveil its most ambitious undertaking in years: a head-mounted holographic computer called Project HoloLens. At this point, though, even most people at Microsoft have never heard of it. The device, codenamed HoloLens, is still a prototype in development.

First Impressions:

At first look, the slate gray headset moves through a series of demo scenarios, from collaborating with coworkers on a conference call to soaring through the air. It’s bigger and more substantial than Google Glass, but far less boxy than the Oculus Rift.

The prototype is amazing. It amplifies the special powers that Kinect introduced, using a small fraction of the energy. The depth camera has a field of vision that spans 120 by 120 degrees, far more than the original Kinect, so it can sense what your hands are doing even when they are nearly outstretched. Sensors flood the device with terabytes of data every second, all managed by an onboard CPU, GPU, and a first-of-its-kind HPU (holographic processing unit).
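To make that 120-by-120-degree figure concrete, here is a minimal, hypothetical sketch (not Microsoft’s code) of how a depth camera with that field of view could decide whether a tracked hand is still in frame; the coordinate convention and every name below are assumptions for illustration.

```typescript
// Hypothetical sketch: is a tracked point inside a 120° x 120° field of view?
// Assumed convention: the camera looks down +z, x is right, y is up.
interface Point3D { x: number; y: number; z: number; }

const HORIZONTAL_FOV_DEG = 120; // figure quoted for the HoloLens depth camera
const VERTICAL_FOV_DEG = 120;

function isInFieldOfView(p: Point3D): boolean {
  if (p.z <= 0) return false; // behind the camera
  // Angles off the optical axis, horizontal and vertical.
  const yawDeg = Math.abs(Math.atan2(p.x, p.z)) * 180 / Math.PI;
  const pitchDeg = Math.abs(Math.atan2(p.y, p.z)) * 180 / Math.PI;
  return yawDeg <= HORIZONTAL_FOV_DEG / 2 && pitchDeg <= VERTICAL_FOV_DEG / 2;
}

// A hand held out well to the side is still inside the 60° half-angle:
console.log(isInFieldOfView({ x: 0.6, y: -0.2, z: 0.4 })); // true
```

A 60-degree half-angle in each direction is what lets the camera keep tracking hands that are nearly outstretched, where a narrower sensor would lose them.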

The computer doesn’t grow hot on your head because the warm air is vented out through the sides. On the right side, buttons let you adjust the volume and control the contrast of the hologram.

Tricking Your Brain Does the Job


Project HoloLens’ key achievement—realistic holograms—works by tricking your brain into seeing light as matter.

“Ultimately, you know, you perceive the world because of light,” Kipman says.

“If I could magically turn the debugger on, we’d see photons bouncing throughout this world. Eventually they hit the back of your eyes, and through that, you reason about what the world is. You essentially hallucinate the world, or you see what your mind wants you to see.”

To create Project HoloLens’ images, light particles bounce around millions of times in the so-called light engine of the device. Then the photons enter the goggles’ two lenses, where they ricochet between layers of blue, green and red glass before they reach the back of your eye. “When you get the light to be at the exact angle,” Kipman tells me, “that’s where all the magic comes in.”

Trip To Mars


Microsoft has teamed up with NASA to let scientists explore what Curiosity sees on Mars. Instead of panoramic imagery on a computer screen, Microsoft’s demo lit up a room and turned it into Mars. I walked around the rocky terrain, bumped into the Curiosity rover, and generally just checked out a planet I will never visit in my lifetime. It’s a totally new perspective: I felt immersed in a tour of Mars, though not necessarily present there. The field of view felt a little too limited to truly immerse me and trick my brain into thinking I was really on another planet, but what impressed me most is what Microsoft has built into this experience.

Another scenario lands me on a virtual Mars-scape. Kipman developed it in close collaboration with NASA rocket scientist Jeff Norris, who spent much of the first half of 2014 flying back and forth between Seattle and his Southern California home to help develop the scenario. With a quick upward gesture, I toggle from computer screens that monitor the Curiosity rover’s progress across the planet’s surface to the virtual experience of being on the planet. The ground is a parched, dusty sandstone, and so realistic that as I take a step, my legs begin to quiver. They don’t trust what my eyes are showing them. Behind me, the rover towers seven feet tall, its metal arm reaching out from its body like a tentacle. The sun shines brightly over the rover, creating short black shadows on the ground beneath its legs.

After exploring Mars, I don’t want to remove the headset, which has provided a glimpse of a combination of computing tools that make the unimaginable feel real. NASA felt the same way. Norris will roll out Project HoloLens this summer so that agency scientists can use it to collaborate on a mission.

Holo Studio

Microsoft’s next demo didn’t have us using the HoloLens prototypes directly. Instead, I watched as “Nick” manipulated objects in digital space to build a koala bear or a pickup truck. It was actually quite impressive: cameras filmed him, and screens showed both him and the virtual objects he was manipulating in the same space in real time.


The idea was to convince us that HoloLens would unleash a wave of creators able to dream up 3D objects with little to no training. It’s much easier to understand what a thing is when it’s sitting in your living room than when it’s a model in AutoCAD.

But sitting there after our whirlwind of actually experiencing HoloLens, my mind was elsewhere. For example, there are only a few ways to interact with this system so far:

  • Glance: you point your head at something.

  • AirTap: you make a “Number 1” sign with your hand, then move your finger down like you’re depressing a lever.

  • Voice: you can issue commands, usually to switch what “tool” you’re using.

  • Mouse: perhaps the neatest thing is that the objects you already use to interact with computers can also be used to interact with holograms.

That seems like enough, but it’s not nearly enough. It’s wildly impressive that these objects really do feel like they’re out there in your living room, but it’s equally depressing to know that you can’t treat them like real objects.
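To make that interaction model concrete, here is a minimal, hypothetical sketch of how an app might dispatch those four inputs; the type and function names are illustrative assumptions, not Microsoft’s actual SDK.

```typescript
// Hypothetical sketch of dispatching the four HoloLens-style inputs described above.
// None of these names come from Microsoft's SDK.
type InputEvent =
  | { kind: "glance"; targetId: string }        // head points at an object
  | { kind: "airTap"; targetId: string }        // "number 1" hand, finger drops like a lever
  | { kind: "voice"; command: string }          // e.g. "switch to the copy tool"
  | { kind: "mouse"; targetId: string; dx: number; dy: number }; // a real mouse dragging a hologram

interface Hologram {
  highlight(): void;                  // show the reticle / an outline
  activate(): void;                   // the "click" action
  drag(dx: number, dy: number): void;
}

function dispatch(
  event: InputEvent,
  scene: Map<string, Hologram>,
  selectTool: (name: string) => void,
): void {
  switch (event.kind) {
    case "glance":
      scene.get(event.targetId)?.highlight(); // glancing only points, it doesn't act
      break;
    case "airTap":
      scene.get(event.targetId)?.activate();  // an air tap is the equivalent of a click
      break;
    case "voice":
      selectTool(event.command);              // voice mostly switches which tool you're using
      break;
    case "mouse":
      scene.get(event.targetId)?.drag(event.dx, event.dy); // familiar peripherals still work
      break;
  }
}
```

Four event kinds cover everything you can currently do, which is exactly why the holograms still can’t be treated like real objects.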

Minecraft IRL


It is definitely one of the best demos. For this one, a small but heavy block hangs around your neck and carries all of the computing power. The rest of the setup consists of lenses, tiny projectors, speakers, and motion sensors.

A “screen in your field of view” is the right way to think about HoloLens, too. It’s immersive, but not nearly as immersive as proper virtual reality is. You still see the real world in between the virtual objects; you can see where the magic holograph world ends and your peripheral vision begins.

You definitely have a big stupid grin on your face even though the contraption that’s strapped to it is pressing your eyeglasses into the bridge of your nose.

Then it’s demo time. You can’t touch anything, but you can point a little circle at objects in the room by moving your head around. You learn that a “glance” is just you looking at things and pointing your reticle at them, and that an “AirTap” is the equivalent of clicking your mouse. The demo involves digging Minecraft holes and blowing up Minecraft zombies with Minecraft TNT. It’s basically incredible to see these digital things in real space.
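As a rough illustration of how a gaze reticle can pick out a block or a zombie, here is a hypothetical sketch that casts a ray along the head’s forward direction and tests it against sphere-shaped targets; the math is ordinary ray-sphere intersection, and all the names are assumptions for illustration rather than actual HoloLens code.

```typescript
// Hypothetical sketch: which object is the gaze reticle pointing at?
// Cast a ray from the head position along its forward (unit) direction and
// return the nearest sphere-shaped target it passes through.
interface Vec3 { x: number; y: number; z: number; }
interface Target { name: string; center: Vec3; radius: number; }

const sub = (a: Vec3, b: Vec3): Vec3 => ({ x: a.x - b.x, y: a.y - b.y, z: a.z - b.z });
const dot = (a: Vec3, b: Vec3): number => a.x * b.x + a.y * b.y + a.z * b.z;

function gazeTarget(head: Vec3, forward: Vec3, targets: Target[]): Target | null {
  let best: Target | null = null;
  let bestDist = Infinity;
  for (const t of targets) {
    const toCenter = sub(t.center, head);
    const along = dot(toCenter, forward);          // distance along the gaze ray
    if (along <= 0) continue;                      // target is behind the viewer
    const offAxisSq = dot(toCenter, toCenter) - along * along;
    if (offAxisSq <= t.radius * t.radius && along < bestDist) {
      best = t;                                    // the ray passes through this sphere
      bestDist = along;
    }
  }
  return best;
}

// Looking straight ahead at a block of TNT about two meters away:
const hit = gazeTarget(
  { x: 0, y: 0, z: 0 },
  { x: 0, y: 0, z: 1 },
  [{ name: "tnt", center: { x: 0.1, y: 0, z: 2 }, radius: 0.25 }],
);
console.log(hit?.name); // "tnt"
```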

You blow up a hole in the table and then you look through it to more digital objects on the floor. You blow up a hole in the wall and tiny bats fly out and you see that behind your very normal wall is a virtual hellscape of lava and rock. You peer into the hole, around the corner, and see that dark realm extend far into space.

 
