by Staff Writers
Salt Lake City UT (SPX) Jan 21, 2014
If we've learned anything from post-apocalyptic movies, it's that computers eventually become self-aware and try to eliminate humans. BYU engineer Dah-Jye Lee isn't interested in that development, but he has managed to eliminate the need for humans in the field of object recognition. Lee has created an algorithm that can accurately identify objects in images or video sequences without human calibration.
"In most cases, people are in charge of deciding what features to focus on and they then write the algorithm based off that," said Lee, a professor of electrical and computer engineering. "With our algorithm, we give it a set of images and let the computer decide which features are important."
Not only is Lee's genetic algorithm able to set its own parameters, but it also doesn't need to be reset each time a new object is to be recognized; it learns new objects on its own.
Lee likens the idea to teaching a child the difference between dogs and cats. Instead of trying to explain the difference, we show children images of the animals and they learn on their own to distinguish the two. Lee's object recognition system does the same thing: instead of telling the computer what to look at to distinguish between two objects, the researchers simply feed it a set of images and it learns on its own.
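The article doesn't detail how the ECO-features algorithm works internally, but the general idea of a genetic algorithm choosing which features matter can be sketched in miniature. The toy data, the threshold classifier, and all parameter values below are illustrative assumptions, not BYU's actual method: candidate feature subsets are encoded as bit masks, scored by classification accuracy, and the best ones are recombined and mutated over generations.

```python
import random

# Toy data: each sample has 6 candidate features; only features 0 and 3
# actually separate the two classes (think cats vs. dogs).
random.seed(42)

def make_sample(label):
    feats = [random.random() for _ in range(6)]
    feats[0] += label  # informative feature
    feats[3] += label  # informative feature
    return feats, label

data = [make_sample(label) for label in (0, 1) for _ in range(40)]

def accuracy(mask):
    """Score a feature subset with a trivial classifier:
    sum the selected features and split at the overall mean."""
    if not any(mask):
        return 0.0
    scores = [(sum(f for f, m in zip(feats, mask) if m), label)
              for feats, label in data]
    mid = sum(s for s, _ in scores) / len(scores)
    correct = sum((s > mid) == (label == 1) for s, label in scores)
    return correct / len(scores)

def evolve(pop_size=20, generations=30, n_features=6):
    # Random initial population of feature masks.
    pop = [[random.random() < 0.5 for _ in range(n_features)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=accuracy, reverse=True)
        survivors = pop[:pop_size // 2]          # keep the fittest half
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = random.sample(survivors, 2)   # pick two parents
            cut = random.randrange(1, n_features)
            child = a[:cut] + b[cut:]            # single-point crossover
            if random.random() < 0.2:            # occasional mutation
                i = random.randrange(n_features)
                child[i] = not child[i]
            children.append(child)
        pop = survivors + children
    return max(pop, key=accuracy)

best = evolve()
print("selected features:", [i for i, m in enumerate(best) if m])
print("accuracy: %.2f" % accuracy(best))
```

Run directly, the sketch typically converges on a mask that includes the two informative features, since masks containing them classify the toy data well and so survive selection.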
In a study published in the December issue of the academic journal Pattern Recognition, Lee and his students demonstrate both the independence and accuracy of their "ECO features" genetic algorithm.
The BYU algorithm tested as well as or better than other published top object recognition algorithms, including those developed by NYU's Rob Fergus and Thomas Serre of Brown University.
Lee and his students fed their object recognition program four image datasets from CalTech (motorbikes, faces, airplanes and cars) and achieved 100 percent recognition accuracy on every dataset. The other well-performing published object recognition systems scored in the 95 to 98 percent range.
The team has also tested their algorithm on a dataset of fish images from BYU's biology department that included photos of four species: Yellowstone cutthroat, cottid, speckled dace and whitefish. The algorithm was able to distinguish between the species with 99.4% accuracy.
Lee said the results show the algorithm could be used for a number of applications, from detecting invasive fish species (think of the carp in Utah Lake) to identifying flaws in produce such as apples on a production line.
"It's very comparable to other object recognition algorithms for accuracy, but we don't need humans to be involved," Lee said. "You don't have to reinvent the wheel each time. You just run it."
Fellow BYU electrical and computer engineering professor James Archibald as well as graduate students Kirt Lillywhite and Beau Tippetts were coauthors on the research.
Brigham Young University