May 11, 2012 3:44 PM, By Cynthia Wisehart
Technology gives students a more real experience.
“One of the hardest things for architecture students to learn is how their designs will feel in three-dimensional space, and how it will feel to move through them. It’s something they learn when the buildings they design are constructed,” Anderson says, a little ironically, fully aware that he is talking about experience gained over years. So how to speed that experience? The virtual reality lab does not simulate walking through just any building; students are immersed in their own drawings and models. Paint and finishes are simulated, and a student can put on the Sensics headset and walk through an expansive courtyard as if walking through the 3D model of their own building. Other students can watch the walkthrough on a monitor, while the student in the headset can switch among multiple models preloaded into the virtual reality system's computer.
The installation is relatively simple: a series of cameras mounted on an overhead truss, wired via Cat-5 to the system's computer. The cameras read the headset sensors wirelessly, determining where the viewer is in the virtual space and reporting to the computer, which then plays the corresponding view back to the viewer's headset (via a beltpack), also wirelessly. The virtual reality computer lives on the LAN, so drawings and models can be readily uploaded. The headset and the beltpack are variations on the Sensics system elements. The tree-like configuration of LEDs that sticks up from the headset is an adaptation Anderson picked up from Mark Bolas, the director of the Mixed Reality Lab at the University of Southern California. And the beltpack receivers seem to have come from a consumer electronics store. It's pretty clear that for this application, tinkering with the wearable interfaces is irresistible and ongoing.
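The loop described above (cameras estimate the headset's position, the computer renders the matching view, and the frame goes back out to the headset) can be sketched roughly as follows. This is a hypothetical illustration, not the lab's actual software; the function names, the simple averaging of per-camera estimates, and the string "frame" standing in for a rendered image are all assumptions made for clarity.

```python
# Hypothetical sketch of an optical head-tracking render loop like the
# one described in the article. Names and logic are illustrative only.
from dataclasses import dataclass

@dataclass
class Pose:
    x: float   # position in the tracked volume (meters)
    y: float
    z: float
    yaw: float # heading in degrees

def estimate_pose(camera_estimates):
    """Stand-in for the camera system: fuse per-camera estimates
    of the headset's LED markers by simple averaging."""
    n = len(camera_estimates)
    return Pose(
        x=sum(e[0] for e in camera_estimates) / n,
        y=sum(e[1] for e in camera_estimates) / n,
        z=sum(e[2] for e in camera_estimates) / n,
        yaw=sum(e[3] for e in camera_estimates) / n,
    )

def render_view(model_name, pose):
    """Stand-in for the renderer: return a label describing the frame
    that would be sent wirelessly to the viewer's headset."""
    return (f"{model_name} @ ({pose.x:.1f}, {pose.y:.1f}, "
            f"{pose.z:.1f}) yaw {pose.yaw:.0f}")

# One tick of the loop: two cameras report, poses are fused, a frame goes out.
estimates = [(1.0, 1.6, 2.0, 90.0), (1.2, 1.6, 2.0, 88.0)]
pose = estimate_pose(estimates)
frame = render_view("courtyard_model", pose)
```

In a real system this loop would run continuously at the display's frame rate, and the pose fusion would involve camera calibration and triangulation rather than averaging, but the data flow (sense, fuse, render, transmit) is the same.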
The Virtual Reality Design Lab at Rapson Hall was seeded eight years ago with an alumni gift from Ted and Linda Johnson to foster a cooperative effort between architecture and computer science, where Anderson's counterpart is Victoria Interrante. Subsequent funding came from various sources, including NSF grants, and the lab was envisioned to support other disciplines within the College of Design beyond architecture. Anderson points out that landscape architects could walk through their designs, graphic artists could see their billboards, and the fashion department has already experimented with virtual fashion shows in which virtual models walk past your headset as if they were walking down a runway. Well, not the models, but the clothes themselves, animated as if they were on a model.
These applications have yet to be integrated into the daily life of the lab, but from the looks of things, that life is just getting started now that the lab is finished and open. Beyond the College of Design, the lab could contribute to other applications, such as testing and modeling health care procedures; for example, how long it takes a nurse to move among workstations. The lab will also contribute to the body of work on human perception and virtual reality. There is still much to be learned, Anderson says, such as why humans perceive virtual distances as shorter than real ones. Who knows, Anderson or one of his students may be a future presenter at VR 2015 or beyond.
The EXODesk by the Canadian company EXOPC (in partnership with ViewSonic) debuted as a prototype at CES this year and created a minor sensation. Some called it the poor man's Microsoft Surface; it's smaller at 32in. to 40in., and at a projected $1,000 to $1,500, it's a fraction of the cost.
The ViewSonic EXODesk, as it is now known, is an HD LCD touchscreen display that rests flat on the surface of your desk. It hooks up to a Windows or Mac computer and can be used to interact with what’s on the computer monitor. You can launch websites or use a virtual Microsoft Word keyboard. But it can also be used as a giant standalone tablet with a limited number of Surface-like touchscreen apps.
The EXODesk's first deployment will be in a Panamanian elementary school physics classroom. The pilot classroom will feature 20 touchscreen desks, a larger EXODesk for the teacher, and a larger-still interactive multitouch "blackboard." The screens are all connected via Wi-Fi to ease collaboration and help the teacher keep track of what kids are doing with the desks. In this deployment, the desk contains an Intel i5 processor and runs a version of Windows 7 and a custom HTML5 interface with the curriculum. Books, notebooks, writing utensils, and "ink" will be stored within the desk's memory and accessible via the cloud.