Then two British software engineers – one at Canon Research, another at startup RenderMorphics, both raised on the famed BBC Micro and its outré graphics capabilities – created tight, highly performant libraries to do real-time 3D rendering in software.

When the first announcement of Canon's RenderWare made it onto the newsgroup sci.virtual-worlds, it was greeted with disbelief and disdain. Someone even quipped that a Canon researcher must have accidentally remained logged in over the weekend, allowing someone else to send off an obvious prank post.

But RenderWare was real. Along with RenderMorphics' Reality Lab (which hundreds of millions use today under its other name: Direct3D), it transformed the entire landscape of real-time 3D graphics. No one needed a half-million-dollar Silicon Graphics workstation for virtual reality anymore – a body blow from which the firm never recovered. Reflecting on SGI's unexpected collapse, one of my colleagues – who'd seen the future coming – delivered a quick eulogy: "Rendering happens," he said, "get used to it."

Yet it took virtual reality twenty years to catch up to that quantum leap in real-time 3D, because virtual reality is more than just drawing pretty pictures at thirty frames a second. It deeply involves the body – head and hand tracking are table stakes for any VR system. Tracking the body thirty years ago required expensive and fiddly sensors moving within a magnetic field. (For that reason, installing VR tracking systems in a building with a lot of metal components – such as a convention center held up by steel beams – was always a nightmare.)

An obvious solution for tracking was to point a camera at a person, then use computer vision techniques to calculate the orientation and position of the various body parts. While that sounds straightforward, computers in the 1990s were about a hundred times too slow to take on that task. Fortunately, by the mid-2010s, Moore's Law had given us computers a thousand times faster – more than enough horsepower to track a body, with plenty left over to run a decent VR simulation.

That's why I found myself in Intel's private demo suite at the 2017 Consumer Electronics Show in Las Vegas, wearing what was effectively a PC strapped to my forehead. This VR system had a pair of outward-facing cameras that digested a continuous stream of video data, using it to track the position and orientation of my head as I moved through a virtual world – and through the demo suite. Although not yet quite perfect, the device proved that a PC had more than enough horsepower to enable sourceless, self-contained tracking. I emerged from that demo convinced that I'd seen the next great leap forward in virtual reality, which I summarized in two words: "Tracking happens."

Yes, we can render any object in real time, and yes, we can track our heads and hands and bodies. Half a decade later, with multiple trillion-dollar companies working hard on augmented reality spectacles, we're ready to breach the next barrier.
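The optical tracking described above – working out the orientation and position of a rigid thing from camera data – reduces, at its core, to rigid-body pose estimation. The sketch below is purely illustrative, not Intel's actual tracker (real inside-out systems use far more sophisticated visual-inertial methods): assuming we already have a handful of 3D marker points matched between a headset's known geometry and an observed frame, the classic Kabsch/SVD algorithm recovers the rotation and translation that best align them.

```python
import numpy as np

def estimate_pose(model_pts, observed_pts):
    """Recover the rigid-body pose (rotation R, translation t) mapping
    model_pts onto observed_pts, via the Kabsch/SVD algorithm."""
    cm = model_pts.mean(axis=0)            # centroid of the model points
    co = observed_pts.mean(axis=0)         # centroid of the observed points
    H = (model_pts - cm).T @ (observed_pts - co)   # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T)) # guard against a reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = co - R @ cm
    return R, t

# Hypothetical example: a four-marker rig rotated 30 degrees about the
# vertical axis and translated to head height.
theta = np.radians(30)
R_true = np.array([[ np.cos(theta), 0, np.sin(theta)],
                   [ 0,             1, 0            ],
                   [-np.sin(theta), 0, np.cos(theta)]])
t_true = np.array([0.5, 1.6, -2.0])
markers = np.array([[0.0, 0.0, 0.0],
                    [0.1, 0.0, 0.0],
                    [0.0, 0.1, 0.0],
                    [0.0, 0.0, 0.1]])
observed = markers @ R_true.T + t_true
R_est, t_est = estimate_pose(markers, observed)
print(np.allclose(R_est, R_true), np.allclose(t_est, t_true))  # True True
```

In a real camera-based system the marker correspondences themselves come from feature detection in the video stream, and noise makes the fit approximate rather than exact – but the geometric heart of "point a camera at a person and compute pose" is this least-squares alignment.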