by Staff Writers
Singapore (SPX) Feb 17, 2017
Scientists from Nanyang Technological University, Singapore (NTU Singapore) have developed an ultrafast high-contrast camera that could help self-driving cars and drones see better in extreme road conditions and in bad weather.
Unlike typical optical cameras, which can be blinded by bright light and are unable to make out details in the dark, NTU's new smart camera can record the slightest movements and objects in real time.
The new camera records changes in light intensity between scenes at nanosecond intervals, much faster than conventional video, and stores the images in a data format many times smaller.
With a unique in-built circuit, the camera can do an instant analysis of the captured scenes, highlighting important objects and details.
Developed by Assistant Professor Chen Shoushun from NTU's School of Electrical and Electronic Engineering, the new camera named Celex is now in its final prototype phase.
"Our new camera can be a great safety tool for autonomous vehicles, since it can see very far ahead like optical cameras but without the time lag needed to analyse and process the video feed," explained Asst Prof Chen.
"With its continuous tracking feature and instant analysis of a scene, it complements existing optical and laser cameras and can help self-driving vehicles and drones avoid unexpected collisions that usually happens within seconds."
Asst Prof Chen unveiled the prototype of Celex last month at the 2017 IS&T International Symposium on Electronic Imaging (EI 2017) held in the United States.
It received positive feedback from conference attendees, many of whom were academics and top industry players.
How it works
High-speed video cameras that record up to 120 frames per second generate gigabytes of video data, which must then be processed by a computer before a self-driving vehicle can "see" and analyse its environment.
The more complex the environment, the slower the processing of the video data, leading to lag times between "seeing" the environment and the corresponding actions that the self-driving vehicle has to take.
To enable instant processing of visual data, NTU's patent-pending camera records changes in the light intensity of individual pixels at its sensor, which reduces the data output. This avoids the need to capture the whole scene as a photograph, increasing the camera's processing speed.
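The event-based readout described above can be sketched in a few lines. This is a simplified illustration of the general principle behind such sensors, not the Celex design itself; the log-intensity model, the threshold value and all function names are assumptions for the sketch.

```python
import numpy as np

def events_from_frames(prev, curr, threshold=0.15):
    """Emit (x, y, polarity) events for pixels whose log-intensity
    changed by more than `threshold` between two frames."""
    # Work in log space: change-detecting sensors respond to
    # relative brightness change, not absolute intensity.
    diff = np.log1p(curr.astype(float)) - np.log1p(prev.astype(float))
    ys, xs = np.nonzero(np.abs(diff) > threshold)
    polarity = np.sign(diff[ys, xs]).astype(int)  # +1 brighter, -1 darker
    return list(zip(xs.tolist(), ys.tolist(), polarity.tolist()))

# A mostly static 64x64 scene where only a small object appears:
prev = np.zeros((64, 64), dtype=np.uint8)
curr = prev.copy()
curr[10:14, 20:24] = 255  # a 4x4 object shows up

events = events_from_frames(prev, curr)
print(len(events))  # 16 events, versus 4096 pixels for a full frame
```

Only the 16 changed pixels produce output, which is why this style of sensor yields a far smaller data stream than a camera that re-captures all 4,096 pixels every frame.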
The camera sensor also has a built-in processor that analyses the data stream instantly to differentiate foreground objects from the background, a process known as optical flow computation. This gives self-driving vehicles more time to react to oncoming vehicles or obstacles.
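The foreground/background segregation mentioned above can be illustrated with a simple background-subtraction sketch. This is a common stand-in technique for illustration only, not the Celex sensor's actual on-chip optical flow circuit; the update rate, threshold and names are assumptions.

```python
import numpy as np

def update_background(bg, frame, alpha=0.05):
    """Slowly adapting background model (exponential moving average)."""
    return (1 - alpha) * bg + alpha * frame.astype(float)

def foreground_mask(bg, frame, threshold=30):
    """Pixels deviating strongly from the background model are foreground."""
    return np.abs(frame.astype(float) - bg) > threshold

# Static background; the current frame contains one bright object:
bg = np.zeros((64, 64))
frame = np.zeros((64, 64), dtype=np.uint8)
frame[30:34, 30:34] = 200  # the "oncoming obstacle"

mask = foreground_mask(bg, frame)
print(int(mask.sum()))            # 16 foreground pixels detected
bg = update_background(bg, frame)  # background slowly absorbs the scene
```

Doing this kind of segregation on the sensor itself, rather than on a downstream computer, is what removes the lag the article describes between "seeing" and reacting.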
Research into the sensor technology started in 2009 and has received $500,000 in funding from a Ministry of Education Tier 1 research grant and a Singapore-MIT Alliance for Research and Technology (SMART) Proof-of-Concept grant.
The technology has also been described in two academic journals of the Institute of Electrical and Electronics Engineers (IEEE), the world's largest technical professional organisation for the advancement of technology.
Asst Prof Chen expects the new camera to be commercially ready by the end of this year, as the team is already in talks with global electronics manufacturers.
Nanyang Technological University