CAR TECH
AI-powered autonomous driving vehicle
by Staff Writers
Toyohashi, Japan (SPX) Sep 13, 2022

Image caption: The AI model architecture is composed of a perception module and a controller module. The perception module perceives the environment from the observation data provided by an RGBD camera, while the controller module decodes the extracted information to estimate the degree of steering, throttle, and braking.

A research team consisting of Oskar Natan, a Ph.D. student, and his supervisor, Professor Jun Miura, both affiliated with the Active Intelligent System Laboratory (AISL), Department of Computer Science and Engineering, Toyohashi University of Technology, has developed an AI model that handles perception and control simultaneously for an autonomous driving vehicle. The model perceives the environment by completing several vision tasks while driving the vehicle along a sequence of route points.

Moreover, the AI model can drive the vehicle safely under diverse environmental conditions and scenarios. Evaluated on point-to-point navigation tasks, it achieves the best drivability among recent comparable models in a standard simulation environment.

An autonomous driving system is complex, consisting of several subsystems that handle multiple perception and control tasks. However, deploying multiple task-specific modules is costly and inefficient, as numerous configurations are still needed to form an integrated modular system. Furthermore, the integration process can lead to information loss, as many parameters must be adjusted manually.

Thanks to rapid progress in deep learning research, this issue can be tackled by training a single AI model in an end-to-end, multi-task manner. The model can then provide navigational controls based solely on the observations coming from a set of sensors. Because manual configuration is no longer needed, the model manages the information all by itself.
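
As a rough illustration of that idea (not the authors' published code), a single network with a shared encoder and separate task heads can be trained with one combined loss; the layer sizes, task heads, and loss terms below are placeholder assumptions written in PyTorch.

    # Toy end-to-end multi-task sketch (illustrative only; sizes and heads are assumptions)
    import torch
    import torch.nn as nn

    obs_dim, feat_dim = 512, 64                     # flattened sensor observation (placeholder size)
    encoder = nn.Sequential(nn.Linear(obs_dim, feat_dim), nn.ReLU())
    perception_head = nn.Linear(feat_dim, 10)       # stand-in vision task (e.g. coarse scene classes)
    control_head = nn.Linear(feat_dim, 3)           # steering, throttle, brake

    obs = torch.randn(8, obs_dim)                   # a dummy batch of observations
    scene_labels = torch.randint(0, 10, (8,))
    expert_controls = torch.rand(8, 3)              # imitation targets recorded from an expert driver

    feat = encoder(obs)
    loss = nn.functional.cross_entropy(perception_head(feat), scene_labels) \
         + nn.functional.l1_loss(control_head(feat), expert_controls)
    loss.backward()                                 # one backward pass trains perception and control together

Because the two heads share the same encoder, there is no hand-tuned interface between perception and control left to configure.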

The challenge that remains for an end-to-end model is how to extract useful information so that the controller can estimate the navigational controls properly. This can be addressed by providing plenty of data to the perception module so it perceives the surrounding environment better. In addition, a sensor fusion technique can enhance performance, since it combines data from different sensors that capture different aspects of the scene.
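
One common fusion pattern, sketched below purely as an assumption rather than the paper's specific design, encodes each modality with its own branch and concatenates the resulting features so later layers can draw on both appearance and geometry.

    # Generic feature-level sensor fusion sketch (not the paper's architecture)
    import torch
    import torch.nn as nn

    class FusionEncoder(nn.Module):
        def __init__(self, feat_dim=64):
            super().__init__()
            def branch(in_ch):
                return nn.Sequential(
                    nn.Conv2d(in_ch, 16, 3, stride=2), nn.ReLU(),
                    nn.AdaptiveAvgPool2d(1), nn.Flatten(),
                    nn.Linear(16, feat_dim), nn.ReLU(),
                )
            self.rgb_branch = branch(3)      # colour / appearance cues
            self.depth_branch = branch(1)    # geometry / distance cues

        def forward(self, rgb, depth):
            return torch.cat([self.rgb_branch(rgb), self.depth_branch(depth)], dim=1)

    fused = FusionEncoder()(torch.randn(1, 3, 64, 64), torch.randn(1, 1, 64, 64))
    print(fused.shape)                       # torch.Size([1, 128]) combined feature vector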

However, a huge computational load is inevitable, as a bigger model is needed to process more data. A data preprocessing step is also necessary, since different sensors often come with different data modalities. Furthermore, imbalanced learning during training could be another issue, since the model performs perception and control tasks simultaneously.

To answer those challenges, the team proposes an AI model trained in an end-to-end, multi-task manner. The model consists of two main modules: a perception module and a controller module. The perception stage begins by processing the RGB images and depth maps provided by a single RGBD camera. The information extracted by the perception module, together with the vehicle speed measurement and route point coordinates, is then decoded by the controller module to estimate the navigational controls.
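
The data flow described above might look roughly like the following sketch, in which a hypothetical perception encoder consumes the RGBD observation and a small controller network combines its features with the measured speed and the next route point; all layer shapes and output ranges are illustrative assumptions, not the published network.

    # Hedged sketch of the perception -> controller flow described in the article
    import torch
    import torch.nn as nn

    class Perception(nn.Module):
        def __init__(self, feat_dim=128):
            super().__init__()
            self.net = nn.Sequential(                 # consumes a 4-channel RGB + depth frame
                nn.Conv2d(4, 32, 5, stride=2), nn.ReLU(),
                nn.Conv2d(32, 64, 5, stride=2), nn.ReLU(),
                nn.AdaptiveAvgPool2d(1), nn.Flatten(),
                nn.Linear(64, feat_dim), nn.ReLU(),
            )

        def forward(self, rgbd):
            return self.net(rgbd)

    class Controller(nn.Module):
        def __init__(self, feat_dim=128):
            super().__init__()
            # perception features + scalar speed + (x, y) of the next route point
            self.net = nn.Sequential(nn.Linear(feat_dim + 1 + 2, 64), nn.ReLU(),
                                     nn.Linear(64, 3))

        def forward(self, feat, speed, route_xy):
            out = self.net(torch.cat([feat, speed, route_xy], dim=1))
            steer = torch.tanh(out[:, 0:1])           # [-1, 1]
            throttle = torch.sigmoid(out[:, 1:2])     # [0, 1]
            brake = torch.sigmoid(out[:, 2:3])        # [0, 1]
            return steer, throttle, brake

    perception, controller = Perception(), Controller()
    rgbd = torch.rand(1, 4, 128, 128)                                         # dummy RGBD observation
    steer, throttle, brake = controller(perception(rgbd),
                                        speed=torch.tensor([[8.3]]),          # m/s
                                        route_xy=torch.tensor([[4.0, 1.5]]))  # next waypoint (m)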

To ensure that all tasks can be performed equally well, the team employs an algorithm called modified gradient normalization (MGN) to balance the learning signals during training. The team uses imitation learning, as it allows the model to learn from a large-scale dataset and match a near-human standard. Furthermore, the model is designed with fewer parameters than comparable models to reduce the computational load and accelerate inference on devices with limited resources.
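
The article does not detail MGN itself, but gradient-normalization schemes of this family typically rebalance the task losses so that each task pulls on the shared layers with a comparable gradient magnitude; the snippet below is a generic sketch under that assumption, not the authors' exact algorithm.

    # Generic gradient-norm balancing sketch (an assumption about MGN-style methods,
    # not the published algorithm): reweight each task loss so its gradient on the
    # shared encoder has roughly the same magnitude as the others.
    import torch
    import torch.nn as nn

    encoder = nn.Linear(32, 16)                                     # shared layer
    heads = nn.ModuleList([nn.Linear(16, 10), nn.Linear(16, 3)])    # perception / control heads
    opt = torch.optim.Adam(list(encoder.parameters()) + list(heads.parameters()), lr=1e-3)

    x = torch.randn(8, 32)
    targets = [torch.randint(0, 10, (8,)), torch.rand(8, 3)]
    criteria = [nn.functional.cross_entropy, nn.functional.l1_loss]

    feat = encoder(x)
    task_losses = [c(h(feat), t) for h, t, c in zip(heads, targets, criteria)]

    # measure each task's gradient norm on the shared weights
    norms = torch.stack([torch.autograd.grad(l, encoder.weight, retain_graph=True)[0].norm()
                         for l in task_losses])
    weights = (norms.mean() / (norms + 1e-8)).detach()              # boost weak tasks, damp dominant ones

    opt.zero_grad()
    total = sum(w * l for w, l in zip(weights, task_losses))
    total.backward()
    opt.step()

In this toy version, the control head's L1 loss against the recorded expert commands plays the role of the imitation-learning signal mentioned above.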

Experimental results in CARLA, a standard autonomous driving simulator, reveal that fusing RGB images and depth maps to form a bird's-eye-view (BEV) semantic map boosts overall performance. Because the perception module gains a better overall understanding of the scene, the controller module can leverage that information to estimate the navigational controls properly. The team also states that the proposed model is preferable for deployment, as it achieves better drivability with fewer parameters than other models.
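
A bird's-eye-view semantic map of the kind described can be sketched by back-projecting each pixel with the depth map and (assumed) camera intrinsics, then dropping the resulting points onto a top-down grid that keeps each pixel's semantic label; the intrinsics, grid extent, and cell size below are illustrative values only.

    # Hedged sketch: depth map + per-pixel semantics -> top-down (BEV) semantic grid
    import numpy as np

    H, W = 120, 160
    fx, cx = 100.0, W / 2                                  # assumed focal length and principal point
    depth = np.random.uniform(1.0, 30.0, size=(H, W))      # metres, stand-in for the camera's depth output
    semantics = np.random.randint(0, 5, size=(H, W))       # stand-in per-pixel class ids from perception

    # back-project pixels: lateral offset x and forward distance z in camera coordinates
    x = (np.arange(W) - cx) * depth / fx
    z = depth

    # rasterise onto a grid covering 40 m ahead and 20 m to each side, with 0.5 m cells
    cell = 0.5
    bev = np.zeros((int(40 / cell), int(40 / cell)), dtype=np.int64)
    rows = (z / cell).astype(int)
    cols = ((x + 20.0) / cell).astype(int)
    valid = (rows >= 0) & (rows < bev.shape[0]) & (cols >= 0) & (cols < bev.shape[1])
    bev[rows[valid], cols[valid]] = semantics[valid]       # last write wins; real systems handle occlusion more carefully

    print(bev.shape)                                       # (80, 80) top-down semantic map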

Future outlook
The team is currently working on modifications and improvements to the model to tackle issues that arise when driving under poor illumination, such as at night or in heavy rain. The team's hypothesis is that adding a sensor unaffected by changes in brightness or illumination, such as a LiDAR, will improve the model's scene-understanding capabilities and result in better drivability. Another future task is to apply the proposed model to autonomous driving in the real world.

Research Report: End-to-End Autonomous Driving with Semantic Depth Cloud Mapping and Multi-Agent Sensor Fusion


Related Links
Toyohashi University of Technology
Car Technology at SpaceMart.com

