Build Your Own Borg: Sort of

NASA is developing the Wearable Augmented Reality Prototype (Warp), a personal communication device. The voice-activated wearable computer allows easy, real-time access to voice communication, pictures, video, people and technical reports. "It wasn't so much the electronics but the packaging that ended up being the big unknown..." - JPL engineer Ann Devereaux. Image Credit: JPL.
by Astrobiology Magazine
Moffett Field (SPX) Dec 02, 2004 - Most observers of the Mars missions think the rovers are driven, like a car, but in fact they are commanded.

While using a joystick or wheel to drive a rover might at first seem appealing, the 20-minute delay in transmitting each signal from Earth to Mars would make a true drive more like a very slow crawl. To counteract this delay, rover operations rely on greater autonomy: an entire day's worth of driving is loaded as a single set of command sequences.
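The arithmetic behind that delay is simple. Here is a minimal sketch, not mission software; the 360-million-km figure is an assumed Earth-Mars distance, since the actual separation varies widely over the two planets' orbits:

```python
# Toy calculation of the one-way Earth-Mars signal delay.
SPEED_OF_LIGHT_KM_S = 299_792.458

def one_way_delay_minutes(distance_km: float) -> float:
    """Return the one-way light-travel time in minutes for a given distance."""
    return distance_km / SPEED_OF_LIGHT_KM_S / 60.0

# At an assumed separation of ~360 million km, the delay is about 20 minutes
# each way, so a single joystick correction would round-trip in ~40 minutes.
print(round(one_way_delay_minutes(360e6)))  # -> 20
```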

The core of this capability is sophisticated hazard avoidance and remote decision-making. While avoiding bad spots ranks highest in a decision tree, the opportunity for a rover to direct itself to interesting places becomes important, particularly for longer drives. A team of scientists has set out to combine human mobility with some of the latest off-the-shelf hardware to study what a remote geologist might do on another planet. The team calls their system the "Cyborg Astrobiologist". The half-machine, half-human system seeks out and prioritizes changes in its survey.

Their recent accounts in the field are abridged as a case study of what cybernetics might deliver. Patrick McGuire is the lead author describing a mission that included robotics experts, geologists, and a wearable computer equipped with image analysis software as its pointing compass. What follows is an excerpt of his longer account of field experiences so far with the "Cyborg" project.

We have developed and field-tested a "Cyborg Astrobiologist" system that now can:

  • Use human mobility to maneuver to and within a geological site and to follow suggestions from the computer as to how to approach a geological outcrop;
  • Use a portable robotic camera system to obtain a mosaic of color images;
  • Use a 'wearable' computer to search in real-time for the most uncommon regions of these mosaic images;
  • Use the robotic camera system to re-point at several of the most uncommon areas of the mosaic images, in order to obtain much more detailed information about these 'interesting' uncommon areas;
  • Use human intelligence to choose between the wearable computer's different options for interesting areas in the panorama for closer approach; and
  • Repeat the process as often as desired, sometimes retracing a step of geological approach.
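The "most uncommon regions" step in the loop above can be illustrated with a toy model: score each pixel by the rarity of its value, so a rare feature stands out against a uniform background. This is a simplified sketch for illustration, not the project's actual interest-map algorithm, and the function names are hypothetical:

```python
from collections import Counter

def uncommon_score(image):
    """Toy 'uncommon map': score each pixel by how rare its value is
    in the image as a whole (rare values score near 1, common near 0)."""
    flat = [v for row in image for v in row]
    counts = Counter(flat)
    n = len(flat)
    return [[1.0 - counts[v] / n for v in row] for row in image]

def most_interesting(image, top_k=1):
    """Return (row, col) coordinates of the top_k most uncommon pixels,
    i.e. the candidate targets for re-pointing the camera."""
    score = uncommon_score(image)
    ranked = sorted(
        ((score[r][c], r, c)
         for r in range(len(image)) for c in range(len(image[0]))),
        reverse=True)
    return [(r, c) for _, r, c in ranked[:top_k]]

# A toy 'mosaic': uniform background with one rare feature at row 1, col 2.
mosaic = [[0, 0, 0, 0],
          [0, 0, 7, 0],
          [0, 0, 0, 0]]
print(most_interesting(mosaic))  # -> [(1, 2)]
```

In the real system, the human geologist would then choose among the top-ranked regions and walk closer, repeating the loop.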

The half-human/half-machine 'Cyborg' approach uses human locomotion and human-geologist intuition and intelligence to take the computer-vision algorithms into the field for training and testing on a wearable computer. This is advantageous because we can therefore concentrate on developing the 'scientific' aspects of autonomous discovery of features in computer imagery, as opposed to the more 'engineering' aspects of using computer vision to guide the locomotion of a robot through treacherous terrain.

This means the development of the scientific vision system for the robot is effectively decoupled from the development of the locomotion system for the robot.

The non-human hardware of the Cyborg Astrobiologist system consists of:

  • a 667 MHz wearable computer (from ViA Computer Systems in Minnesota) with a 'power-saving' Transmeta 'Crusoe' CPU and 112 MB of physical memory,
  • an SV-6 Head Mounted Display (from Tekgear in Virginia, via the Spanish supplier Decom in Valencia) with native pixel dimensions of 640 by 480 that works well in bright sunlight,
  • a SONY 'Handycam' color video camera (model DCR-TRV620E-PAL),
  • a thumb-operated USB finger trackball from 3G Green Green Globe, resupplied by ViA Computer Systems and by Decom,
  • a small keyboard attached to the human's arm,
  • a tripod for the camera, and
  • a Pan-Tilt Unit (model PTU-46-70W) from Directed Perception in California with a bag of associated power and signal converters.

The programming for this Cyborg Astrobiologist/Geologist project was initiated with the SONY Handycam in April 2002. The wearable computer arrived in June 2003, and the head mounted display arrived in November 2003.

We now have a reliably functioning Cyborg Geologist system of human, hardware and software, which is partly robotic thanks to its pan-tilt camera mount. This robotic extension allows the camera to be pointed repeatedly, precisely and automatically in different directions.

Based upon the performance of the Cyborg Astrobiologist system during the 1st mission to the outcropping cliffs near Rivas Vaciamadrid in March 2004, we concluded that the system was paying too much attention to the shadows cast by the 3D structure of the cliffs.

We hope to improve the Cyborg Astrobiologist system in the coming months so that it detects shadows and pays less attention to them. We also hope to upgrade our system to include: image segmentation based upon micro-texture; and adaptive methods for summing the uncommon maps in order to compute the interest map.
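The idea of summing several per-feature uncommon maps into one interest map can be sketched as follows. The fixed equal weights here are a placeholder standing in for the adaptive weighting the team hopes to develop, and the channel names are purely illustrative:

```python
def interest_map(uncommon_maps, weights=None):
    """Combine several per-feature 'uncommon maps' (equal-shaped 2D lists)
    into a single interest map by weighted summation."""
    if weights is None:
        weights = [1.0] * len(uncommon_maps)  # placeholder: equal weights
    rows, cols = len(uncommon_maps[0]), len(uncommon_maps[0][0])
    out = [[0.0] * cols for _ in range(rows)]
    for w, m in zip(weights, uncommon_maps):
        for r in range(rows):
            for c in range(cols):
                out[r][c] += w * m[r][c]
    return out

# Toy per-channel uncommonness maps (e.g. from hue and saturation channels).
hue_map = [[1, 9], [1, 1]]
sat_map = [[2, 1], [8, 1]]
print(interest_map([hue_map, sat_map]))  # -> [[3.0, 10.0], [9.0, 2.0]]
```

An adaptive scheme would adjust the weights as the survey proceeds, for instance down-weighting a channel (such as brightness) that turns out to be dominated by shadows.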

Based upon the significantly improved performance of the Cyborg Astrobiologist system during the 2nd mission to Rivas in June 2004, we conclude that the system is now sufficiently debugged to produce studies of the utility of particular computer-vision algorithms for geological deployment in the field.

We have outlined some possibilities for improving the system based upon the second field trip, particularly improvements to the systems-level algorithms needed to drive the approach of the Cyborg or robotic system toward a complex geological outcrop more intelligently.

These possible systems-level improvements include: a better interest-map algorithm, with adaptation and more layers; hardware and software for intelligent use of the camera's zoom lens; a memory of the image segmentation performed at greater distance or lower magnification of the zoom lens; and high-level image-interpretation capabilities.

Now that we have demonstrated that the software and hardware of the Cyborg Astrobiologist system can function for developing and testing computer-vision algorithms for robotic exploration of a geological site, we have some decisions to make about future directions for this project. Options include:

Performing further offline analysis and algorithm-development for the imagery obtained at Rivas Vaciamadrid: several of the parameters of the algorithms need testing for their optimality, and further enhancements of the algorithms could be made.

Optimizing the image-processing and robotic-control code of the current Cyborg Astrobiologist system for speed and memory utilization.

Further testing of the existing Cyborg geological exploration system at other geological sites with different types of imagery.

Speeding up algorithm development by changing the project from being partly a hardware project, with cameras, pan-tilt units and fieldwork, to being entirely a software project, without robotically obtained image mosaics and without robotic interest-map pointing. With such a change in focus, our algorithms could be significantly enhanced by studying many more types of imagery: for example, imagery from human geologists' field studies on Earth, from robotic field studies on Mars, and from orbiter or flyby studies of our solar system's moons.

What the Mars MER team has achieved is truly amazing:

  • Firstly, the rovers can move to points 50-150 meters away in one sol with autonomous obstacle avoidance enabled for the uncertain or dangerous parts of the journey.
  • Secondly, prior to a given sol, based upon information received after the previous sol, the MER team has the remarkable capability to develop a command sequence of tens or hundreds of robotic commands for the entire sol.
  • As of July 4, 2004, this was taking 4-5 hours per sol for the mission team to complete, rather than the 17 hours per sol that it took at the beginning of the MER missions.

Such capabilities for semi-autonomous, teleoperated robotic 'movement and discovery' are a significant leap beyond the capabilities of the previous Mars lander missions of Viking I and II and of Pathfinder and Sojourner.

Nonetheless, we would like to build upon this great success of the MER rovers by developing enhancing technology that could be deployed in future robotic and/or human exploration missions to the Moon, Mars, and Europa.

One future mission deserves special discussion for the technology developments: the Mars Science Laboratory, planned for launch in 2009.

Related Links
Mars Rovers at JPL
Mars Rovers at Cornell



The content herein, unless otherwise known to be public domain, are Copyright 1995-2016 - Space Media Network.