Even if you doubt this, just look at some of the particulars. Magic Leap is a Hollywood, Florida start-up. Rony Abovitz, the eccentric President, CEO & Founder of Magic Leap, was a co-founder of surgical robotics firm Mako Surgical, which was sold for $1.65 billion in December of 2013. Now he’s hit the ground running and apparently has tech that’s turning heads and garnering a lot of capital. Magic Leap received $50 million in a first round of funding in February. What’s left a lot of folks gaping is the $542 million they got in a second round in October – with a large part of that coming from Google (corporate, not the investment arms), plus Qualcomm, Legendary Entertainment, and some other VC firms.
So now they have nearly $600 million to play with – they are hiring like mad. But what are they doing? From their press release:
With our founding principles our team dug deep into the physics of the visual world, and dug deep into the physics and processes of our visual and sensory perception. We created something new. We call it a Dynamic Digitized Lightfield Signal™ (you can call it a Digital Lightfield™). It is biomimetic, meaning it respects how we function naturally as humans (we are humans after all, not machines).
In time, we began adding a number of other technologies to our Digital Lightfield: hardware, software, sensors, core processors, and a few things that just need to remain a mystery. The result of this combination enabled our technology to deliver experiences that are so unique, so unexpected, so never-been-seen-before, they can only be described as magical.
We are building a world-class team of experience developers, and are reaching out to application wizards, game developers, story-tellers, musicians, and artists who are motivated by just wanting to make cool stuff.
So what does this mean? According to some accounts from folks who’ve seen the technology, and if you glean info from their patents, it’s apparently a very realistic projection system that either projects onto something over the eyes or actually into the eyes. As creepy as that sounds, it’s called a Virtual Retinal Display, and I remember work being done on it at the Human Interface Technology Lab at the University of Washington in the ’90s. Couple that with some sort of head-mounted display (not only to mount the projectors, but a front-facing RGBZ camera, speakers, headphones, and a location/orientation tracker) and you’ve got a pretty awesome VR *or* AR system. It’s possible to calculate where (direction AND distance) the user is looking and tailor the display to be focused correctly – this from the reference to a lightfield signal (see Lytro). Throw in some input gloves (ideally haptic) and you’ve got a pretty awesome platform. The awesome part is that it crosses the line between very practical (think Google Glass on steroids – a HUD that’s able to interact with the real-world scene that your eyes are currently viewing) and very entertainment oriented – a totally immersive VR experience that has a Game/VR HUD that encompasses your total field of view, or can project high-def movies into your eyes.
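To give a sense of the "direction AND distance" part: one simple way a headset could estimate how far away the user is looking is from eye vergence – both eyes rotate inward to fixate a nearby point, and the convergence angle between the two gaze rays encodes depth. This is just a back-of-the-envelope sketch of that geometry, not anything confirmed about Magic Leap's actual system; the function name and numbers are illustrative.

```python
import math

def vergence_depth(ipd_m: float, vergence_deg: float) -> float:
    """Estimate fixation distance from eye vergence (illustrative only).

    Treat the two gaze rays as the sides of an isosceles triangle whose
    base is the interpupillary distance (IPD). Then:
        depth = (ipd / 2) / tan(vergence_angle / 2)
    ipd_m: interpupillary distance in meters (~0.063 is typical).
    vergence_deg: total convergence angle between the gaze rays, degrees.
    """
    half_angle = math.radians(vergence_deg) / 2.0
    return (ipd_m / 2.0) / math.tan(half_angle)

# With a 63 mm IPD, a 10-degree vergence angle corresponds to a
# fixation point about 0.36 m away; a 1-degree angle, about 3.6 m.
print(vergence_depth(0.063, 10.0))
print(vergence_depth(0.063, 1.0))
```

Once the display knows that depth, a lightfield-style system could in principle render the virtual imagery with matching focus, instead of the fixed focal plane of conventional head-mounted displays.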
I find the most fascinating thing to be the involvement/investment by Google corporate. Given the bad press Glass has garnered in the last year, and the lack of any mention of Glass at Google I/O this year, one starts wondering. The folks at Google are smart; perhaps it’s time to spin off the tech. Florida is pretty far from Silicon Valley.