Researchers’ new system could study other worlds with a cell phone camera.

(ISNS) — An international team of researchers has developed a simple way to make a future planetary rover behave more like a human geologist, using just a cell phone camera and laptop. Dubbed “the cyborg astrobiologist,” the system is designed to pick out novel features in rocky landscapes to speed up exploration and identification of alien terrain.

The science missions of current rovers, like Curiosity on Mars, are slowed in part by their reliance on human operators, whose instructions take about 14 minutes to reach the rover from Earth. Despite Curiosity’s high-tech cameras, a human pair of eyes is still required to evaluate any images of Martian rocks, and even the rover’s navigation is mostly under external control. The goal of the cyborg astrobiologist is to automate the geological analysis portion of the decision-making for future rovers, said the project’s lead author, planetary scientist Patrick McGuire of Freie Universität Berlin.

McGuire and his colleagues report on the first field test of their computer vision system in an upcoming issue of the International Journal of Astrobiology. A former coal mine in West Virginia served as the study’s Mars-like backdrop. The scientists used a conventional cell phone to take pictures of rock outcroppings, lichens, shale, and sandstone. The pictures were then sent via Bluetooth to a nearby laptop that analyzed the images. Essentially, said McGuire, the cyborg astrobiologist “compares color and textures in images and looks for redundant color patches or repeating pixels.” Much like a human geologist, it’s looking for novelty, or attractive regions for further exploration, and similarity, to categorize and place images with already identified rock features.
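The idea of flagging novelty by comparing color patches can be illustrated with a toy sketch. The code below is not the authors’ actual algorithm; the histogram size, similarity measure, and threshold are illustrative assumptions. Each image patch is reduced to a coarse color histogram, compared against everything seen so far, and flagged as “novel” if it resembles nothing in memory:

```python
import numpy as np

def color_histogram(patch, bins=8):
    """Coarse RGB color histogram, normalized to sum to 1."""
    hist, _ = np.histogramdd(
        patch.reshape(-1, 3).astype(float),
        bins=(bins, bins, bins),
        range=((0, 256),) * 3)
    return hist.ravel() / hist.sum()

def similarity(h1, h2):
    """Histogram intersection: 1.0 means identical color distributions."""
    return np.minimum(h1, h2).sum()

def classify(patch, seen, threshold=0.5):
    """Label a patch 'novel' if it resembles nothing seen before,
    otherwise 'similar'; either way, remember its histogram."""
    h = color_histogram(patch)
    best = max((similarity(h, s) for s in seen), default=0.0)
    seen.append(h)
    return "novel" if best < threshold else "similar"

# Toy usage: a gray patch, a second gray patch, then a red patch.
seen = []
gray = np.full((16, 16, 3), 128, dtype=np.uint8)
red = np.zeros((16, 16, 3), dtype=np.uint8)
red[..., 0] = 255
print(classify(gray, seen))        # novel (nothing seen yet)
print(classify(gray.copy(), seen)) # similar (matches the first patch)
print(classify(red, seen))         # novel (no red seen before)
```

A purely color-based comparison like this also shows why the real system struggles when colors match but textures differ, as with the yellow lichens and yellow coalbed streaks described below: texture features would have to be added to the descriptor to tell them apart.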

Of the 55 images taken in just an hour, the cyborg software correctly classified 91 percent of images that geologists considered similar, and for novel images, the software’s verdict matched the geologists’ findings 64 percent of the time. After initial geological detection, more sophisticated sensors could be trained on novel areas to look for biochemistry or organics, said McGuire.

The system tended to have difficulty with images that contained similar colors but completely different textures, like lichens and sulfur-streaked coalbeds that were both yellow.

"Lighting and scale are perennial challenges," David Thompson, a computer vision expert at NASA’s Jet Propulsion Laboratory, told Inside Science via email. He has been working on similar questions in image analysis. "The human eye, backed with its billion-neuron computer, is adept at distinguishing important attributes from incidental ones like lighting or surface coatings. Teaching a silicon computer to make the ‘right’ distinctions is a challenge."

When he started this computer vision project 11 years ago, McGuire elected not to use robots to test the algorithms – they are “too complex, and break down. A human replaces a lot of robotic capabilities” at the software development and testing stage and acts as a control for judging its output, hence the astrobiology software is a “cyborg.” A cumbersome wearable computer system with a video camera gave way to a simpler phone camera for testing the skills of the software. A laptop and cell phone obviously won’t be in the arsenal of Curiosity’s successor; rather, the software would be integrated into the robot, whose cameras would also have more sophisticated imaging capabilities.

McGuire acknowledges that further field testing on Earth, plus improvements to the software’s speed, are necessary before the cyborg astrobiologist could be deployed to Mars. But the ability of robots to perform even simple geological analyses autonomously could make missions more efficient, and computer vision has now advanced to a stage where this is possible, said Thompson.

One advantage of the cyborg astrobiologist software is that it is unsupervised, meaning it does not have to learn image characteristics from prior datasets to work well. But that strength is also a limitation. As the researchers write in their paper, “the algorithm cannot really identify lichens or coal as being lichens or coal.”

A human must evaluate the software’s output, and for the time being at least, will also have a keener eye for discontinuities or small details in rock formations that could prove interesting.

"Robots are ultimately just tools," said Thompson, "and the real intelligence – for the long-foreseeable future – lies with investigators on Earth."

But until people are sent to other planets to have a look for themselves, a semi-independent system like the cyborg astrobiologist could prove very valuable for mapping planetary surfaces, and in the search for extraterrestrial life.
