UCSD Computer Scientists Develop Video Application For 3D Environments

The RealityFlythrough software stitches together images and live video feeds to simulate the 3D environment.
  • Neil McCurdy discusses potential commercial applications for his new software, including virtual tourism.
  • 'Tele-Reality in the Wild' is a research video by Neil McCurdy about an earlier version of the technology.
  • San Diego CA (SPX) Jun 09, 2005
    Computer scientists at the University of California, San Diego have taken the wraps off a new technique for mixing images and video feeds from mobile cameras in the field to provide remote viewers with a virtual window into a physical environment.

    Dubbed 'RealityFlythrough,' the application constructs a 3D virtual environment dynamically out of the live video streams.

    "Instead of watching all the feeds simultaneously on a bank of monitors, the viewer can navigate an integrated, interactive environment as if it were a video game," said UCSD computer science and engineering professor Bill Griswold, who is working on the project with Ph.D. candidate Neil McCurdy.

    "RealityFlythrough creates the illusion of complete live camera coverage in a physical space. It's a new form of situational awareness, and we designed a system that can work in unforgiving environments with intermittent network connectivity."

    The researchers at UCSD's Jacobs School of Engineering have already begun testing the software for homeland security and emergency response, but they say that the technology has other potential consumer uses as well.

    "With virtual tourism, for instance, you could walk down the streets of Bangkok to see what it will be like before getting there," said McCurdy.

    "Another really cool application is pre-drive driving instructions. Imagine going to your favorite mapping website, where currently you get a set of instructions to turn left here or right there, and instead, you can 'fly' through the drive before doing it."

    On June 6 at MobiSys 2005 in Seattle, McCurdy presented a joint paper with Griswold about RealityFlythrough and a "systems architecture for ubiquitous video."

    MobiSys, the third international conference on mobile systems, applications and services, brings together academic and industry researchers in the area of mobile and wireless systems.

    Griswold and McCurdy are testing their new system as part of the WIISARD (Wireless Internet Information System for Medical Response in Disasters) project, which is funded by NIH's National Library of Medicine.

    During a May 12 disaster drill organized by San Diego's Metropolitan Medical Strike Team, the researchers shadowed a hazmat team responding to a simulated terrorist attack, wearing hardhat-mounted cameras, tilt sensors with magnetic compasses, and Global Positioning System (GPS) devices.

    Walking through the simulated disaster scene at the city's Cruise Ship Terminal, McCurdy and Griswold captured continuous video to be fed over an ad hoc wireless network to a makeshift command post nearby.

    The RealityFlythrough software automatically stitches the feeds together by integrating the visual data with each camera's location and the direction it is pointing.
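    The stitching step depends on pairing each frame with the pose metadata (position and compass heading) its camera reports. A minimal sketch of one piece of that idea - choosing the live feed whose pose best matches a requested viewpoint - with illustrative names and an assumed scoring weight, not the actual RealityFlythrough code:

```python
import math

def best_feed(feeds, view_pos, view_heading_deg):
    """Pick the live feed whose pose best matches a requested viewpoint.

    feeds: list of (camera_id, (x, y), heading_deg) pose reports sent
    alongside each video stream (an illustrative data layout, not the
    real wire format). Each camera is scored by its distance from the
    requested viewpoint plus its heading mismatch.
    """
    def score(feed):
        _, (x, y), hdg = feed
        dist = math.hypot(x - view_pos[0], y - view_pos[1])
        # Smallest angular difference between headings, in degrees (0..180)
        turn = abs((hdg - view_heading_deg + 180) % 360 - 180)
        return dist + turn / 10.0  # assumed weight: 10 degrees ~ 1 metre
    return min(feeds, key=score)[0]
```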

    "Our system works in ubiquitous and dynamic environments, and the cameras themselves are moving and shifting," said McCurdy, who expects to finish his Ph.D. in 2006.

    "RealityFlythrough situates still photographs or live video in a three-dimensional environment, making the transition between two cameras while projecting the images onto the screen. We're cheating and flattening space into two dimensions and then re-projecting the images in 3D space."
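    The flatten-and-reproject idea McCurdy describes can be sketched as placing each 2D frame on a quad positioned in front of its camera's reported pose, then texture-mapping the image onto it. The function and parameter names below are illustrative assumptions, not the project's code:

```python
import math

def image_quad_corners(cam_pos, yaw_deg, fov_deg=60.0, depth=5.0):
    """Place a flat frame as a quad 'depth' metres in front of the camera.

    cam_pos: (x, y, z) camera position in world space
    yaw_deg: compass heading of the camera (0 = +y axis)
    Returns the quad's four corners in world coordinates, onto which the
    2D image would be texture-mapped to re-project it in 3D space.
    """
    x, y, z = cam_pos
    yaw = math.radians(yaw_deg)
    fwd = (math.sin(yaw), math.cos(yaw))      # viewing direction (ground plane)
    right = (math.cos(yaw), -math.sin(yaw))   # horizontal span of the image plane
    half_w = depth * math.tan(math.radians(fov_deg) / 2)
    half_h = half_w * 3.0 / 4.0               # assume a 4:3 frame
    cx, cy = x + depth * fwd[0], y + depth * fwd[1]
    return [
        (cx - half_w * right[0], cy - half_w * right[1], z + half_h),  # top-left
        (cx + half_w * right[0], cy + half_w * right[1], z + half_h),  # top-right
        (cx + half_w * right[0], cy + half_w * right[1], z - half_h),  # bottom-right
        (cx - half_w * right[0], cy - half_w * right[1], z - half_h),  # bottom-left
    ]
```

    Transitioning between two cameras then amounts to moving a virtual viewpoint between two such quads while cross-fading their textures.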

    The UCSD researchers say the biggest research challenge was to overcome the limitation of incomplete coverage of live video streams.

    "Every square meter of a space cannot be viewed from every angle with a live video stream at any given moment," said Griswold, an academic participant in the California Institute for Telecommunications and Information Technology (Calit2).

    "We had to find a way to fill in the empty space that would give the user a sense of how the video streams relate to one another spatially."

    Their solution: RealityFlythrough fills in the gaps in coverage with the most recent still images captured during camera pans.

    The software then blends the imagery with smooth transitions that simulate the sensation of a human performing a walking camera pan - even when one of the images is a still frame. If older images are not desirable (e.g. in some security applications), the fill-in images can be omitted, shown in sepia, or overlaid with an icon indicating how old the photo is.
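    The gap-filling strategy can be sketched as a cache that keeps the newest still per compass sector and reports each still's age, so the display can fade, sepia-tint, or label stale imagery. This is a hypothetical sketch; the class and method names are not from RealityFlythrough:

```python
import time

class GapFiller:
    """Keep the most recent still per compass sector to fill coverage gaps."""

    def __init__(self, sector_deg=30):
        self.sector_deg = sector_deg
        self.stills = {}  # sector index -> (timestamp, frame)

    def record_pan_frame(self, heading_deg, frame, timestamp=None):
        """Cache a frame captured while a camera panned through a heading."""
        ts = time.time() if timestamp is None else timestamp
        self.stills[int(heading_deg % 360) // self.sector_deg] = (ts, frame)

    def fill(self, heading_deg, now=None):
        """Return (frame, age_seconds) for a heading, or None if never seen.

        The age lets the UI decide whether to show the still as-is,
        render it in sepia, or omit it entirely.
        """
        entry = self.stills.get(int(heading_deg % 360) // self.sector_deg)
        if entry is None:
            return None
        ts, frame = entry
        return frame, (time.time() if now is None else now) - ts
```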

    The fundamental research finding to date, according to McCurdy, is that some of the processing can be offloaded to the human.

    "We take advantage of a principle called closure, which allows our brains to make sense of incomplete information. The visual cortex does this all the time when it corrects for blind spots in our vision, for example," explained the graduate student.

    "RealityFlythrough supplies as much information as possible to the human operator, and the operator can easily fill in the blanks."

    Human input is especially important indoors, where GPS cannot provide adequate location information. McCurdy carried a 'dead reckoning' device on his back during the May 12 disaster drill. The device uses gyros and other components to track body movement directions and footsteps from the moment the user enters an indoor area.

    Since dead-reckoning systems lose accuracy over time, the researchers implemented a system that allows the camera operators to periodically correct their locations. "We created a Wizard-of-Oz approach to correcting inadequate location information," explained McCurdy.

    "Since we're combining this self-reporting technology with GPS or dead reckoning, it only has to be done occasionally. From all the footage we got from the May 12 drill, I only had to put in four corrections, and that was sufficient to give us pretty good accuracy indoors."
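    The interplay between drifting dead reckoning and occasional absolute fixes can be sketched as follows. Each step adds an estimated displacement (with its accumulated error), and a sparse correction - from GPS or a human operator, as in the drill - re-anchors the track. All names here are illustrative:

```python
class DeadReckoner:
    """Integrate estimated step vectors, with occasional absolute fixes."""

    def __init__(self, start=(0.0, 0.0)):
        self.x, self.y = start

    def step(self, dx, dy):
        # Drift accumulates here: each estimated step carries a small error.
        self.x += dx
        self.y += dy

    def correct(self, known_x, known_y):
        # An absolute fix (GPS or operator input) discards the drift at once.
        self.x, self.y = known_x, known_y

    @property
    def position(self):
        return (self.x, self.y)
```

    A handful of corrections over a long walk is enough to keep the track usable, which matches McCurdy's report of needing only four fixes for the entire drill.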

    McCurdy will work on refining the system for his dissertation. And if consumers start to show interest in RealityFlythrough, he holds open the possibility of starting up a company to commercialize the technology -- but only after finishing his Ph.D. in 2006.

    Related Links
    Jacobs School



