
Robot road-trip

Researchers at Queensland University of Technology are taking artificial intelligence technology on the road to prepare Australia for an autonomous future.

Earlier this year, Tesla CEO Elon Musk announced the launch of a new type of microchip, the FSD, which stands for full self-driving. At an event celebrating the launch he said it was “objectively the best chip in the world…by a huge margin”.

While the claim has been widely contested and seems tied more to theatrics than to reality, the truth is that, whatever the objective significance of Musk’s microchip, driverless car technology is developing at a rapid pace.

The robot or artificial intelligence (AI) system that powers a driverless car is usually trained through virtual simulation and deep learning algorithms.

Simulations teach the AI how to navigate using real-world road examples and the intelligence aggregated from them. The technology is undeniably impressive; however, it leaves room for blind spots that human developers may not notice. When confronted with a rural road that lacks lane markings, for example, a human being would use common sense and stick to the left – a robot needs to be told.
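As a rough illustration of why that matters, a camera-based lane keeper typically looks for painted markings using an edge detector and a line search. The sketch below is a generic OpenCV example, not QUT’s system: on an unmarked road it simply finds nothing to follow, so the vehicle needs an explicit fallback rule.

```python
# A minimal sketch (illustrative only, not QUT's code) of how a camera-based
# lane keeper finds lane markings: edge detection plus a Hough line search.
# When no lines are returned -- e.g. an unmarked rural road -- the system has
# nothing to follow and must fall back on rules a human takes for granted.
import numpy as np
import cv2

def find_lane_lines(frame_bgr):
    """Return detected line segments, or None if no markings are visible."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 50, 150)  # strong intensity edges only
    return cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180,
                           threshold=50, minLineLength=40, maxLineGap=20)

# Synthetic test frames: one with painted lane lines, one blank "rural" road.
marked = np.full((240, 320, 3), 80, dtype=np.uint8)
cv2.line(marked, (60, 239), (140, 0), (255, 255, 255), 4)   # left marking
cv2.line(marked, (260, 239), (180, 0), (255, 255, 255), 4)  # right marking
unmarked = np.full((240, 320, 3), 80, dtype=np.uint8)

for name, frame in [("marked road", marked), ("unmarked road", unmarked)]:
    lines = find_lane_lines(frame)
    if lines is None:
        print(f"{name}: no markings found -- default behaviour needed (e.g. keep left)")
    else:
        print(f"{name}: {len(lines)} line segments detected")
```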

While driving, automated vehicles use camera vision systems to detect, read and interpret roadside traffic signs and line markings in order to follow the road rules. However, a 2018 Austroads report on automated vehicles shows that traffic sign recognition systems have difficulty reading Australian signage.

According to the report, a key issue facing the technology is the installation and maintenance of signage. Traffic sign recognition systems performed well on standard speed signs in testing, but at the current stage of development the vision systems could not handle significant variations from a core standard.

Queensland University of Technology (QUT) Robotics Professor and Chief Investigator at the Australian Centre for Robotic Vision (ACRV), Michael Milford, has observed similar challenges.

“Autonomous car technology is developing incredibly quickly, however Australian roads are unique, and we need to conduct more research into how this new technology will be able to position itself within our infrastructure,” Prof. Milford says.

As its name suggests, ACRV is driven by a desire to develop technology that enables robots to see and, like humans, to understand their environment through vision.

Prof. Milford’s specific field of research models the neural mechanisms in the brain that underlie tasks like navigation and perception. He uses this to develop new ideas and systems for challenging applications such as all-weather, anytime positioning for autonomous vehicles.

Learning and recognition are fundamental processes for humans and, according to Prof. Milford, should be fundamental processes for robots and AI as well. The field of robotics centres on how machines interact with the physical world, while computer vision focuses on analysing and understanding the world through images. In an attempt to solve real-world challenges, ACRV takes a multidisciplinary approach and combines the two by applying computer vision to robotics.

Prof. Milford’s latest project involves a driver taking an electric car, which a QUT engineering team has fitted out with high-tech sensors and an onboard computer, on a 1200-kilometre road trip. Over the three-month road trip the research team will assess the car’s AI data each day and analyse how the car responds to certain obstacles and a wide range of road and weather conditions.

“It’s all about investigating how the artificial intelligence and sensors you would find on an autonomous vehicle will react and cope with the specific infrastructure on Australian roads,” says Prof. Milford.

The project is part of the Queensland Department of Transport and Main Roads’ Cooperative and Highly Automated Driving Pilot, which is also supported by the iMOVE Cooperative Research Centre. Prof. Milford says early testing revealed how a paint spill on asphalt could confuse a self-driving AI system into identifying it as a lane marking.

“When faced with everyday regional road conditions, current autonomous car systems get confused and refuse to work in autonomous mode,” says Prof. Milford.

“Our new research project will look at how a car’s AI system copes with lane markings, traffic lights and street signs and how to determine a vehicle’s exact position despite errors that occur with GPS systems in highly built-up urban areas, or poor reception areas like urban canyons.”
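One way vision can stand in for an unreliable GPS fix is appearance-based place recognition, an approach Prof. Milford is known for through systems such as SeqSLAM. The sketch below is a simplified illustration of the idea only, not the project’s actual code: a live, heavily downsampled camera frame is matched against frames recorded on an earlier mapping drive, and the stored position of the best match then anchors the car.

```python
# A minimal sketch of appearance-based place recognition, in the spirit of
# visual positioning work such as SeqSLAM (illustration only, not the
# project's code): match a live, downsampled camera frame against a bank of
# frames from a previous "mapping" drive using sum-of-absolute-differences.
import numpy as np

def normalise(img):
    """Zero-mean, unit-variance image so matching tolerates lighting changes."""
    img = img.astype(np.float64)
    return (img - img.mean()) / (img.std() + 1e-9)

def best_match(live_frame, mapped_frames):
    """Return (index, score) of the mapped frame most similar to the live one."""
    live = normalise(live_frame)
    scores = [np.abs(live - normalise(m)).mean() for m in mapped_frames]
    idx = int(np.argmin(scores))
    return idx, scores[idx]

# Toy example: 100 random "mapped" frames at 32x24 pixels; the live frame is a
# noisy copy of frame 42, standing in for the same place seen on a later day.
rng = np.random.default_rng(0)
mapped = rng.integers(0, 256, size=(100, 24, 32)).astype(np.uint8)
live = np.clip(mapped[42].astype(int) + rng.normal(0, 10, (24, 32)), 0, 255)

idx, score = best_match(live, mapped)
print(f"best matching mapped frame: {idx} (difference score {score:.3f})")
# The pose recorded at that mapped frame can then cross-check or replace GPS.
```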

The test car has been fitted with the multiple sensors and cameras typical of a driverless car. Rather than having the car drive itself, however, Prof. Milford and his team will use these sensors to collect data and observe how the system reacts to certain conditions.

“We will then process the data using powerful computing resources to develop high tech algorithms about what works well and what doesn’t in terms of how the car interacts with the various levels of Australian infrastructure.”

The car has been fitted with three different types of sensor. The first is a GPS system that can tell you where the car is positioned to within a few centimetres. Cameras designed to replicate what humans do with their eyes are the second type, and LiDAR, which stands for light detection and ranging, is the third. LiDAR uses light in the form of a pulsed laser to measure variable distances, and the LiDAR sensors are positioned on the car to allow 360-degree viewing and help the car judge the distances to surrounding objects.
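For readers curious about the arithmetic behind that third sensor, the sketch below (an assumed, generic illustration, not the project’s software) shows the two basic LiDAR calculations: turning a pulse’s round-trip time into a range, and turning a 360-degree scan of angles and ranges into points around the car.

```python
# A small, generic illustration of the two basic LiDAR calculations:
# range from pulse time-of-flight, and converting a 360-degree scan of
# (angle, range) pairs into x, y positions relative to the sensor.
import math

SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def range_from_time_of_flight(round_trip_seconds):
    """The pulse travels to the object and back, so distance is half the path."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

def scan_to_points(angles_deg, ranges_m):
    """Turn a polar scan into Cartesian points around the sensor."""
    return [(r * math.cos(math.radians(a)), r * math.sin(math.radians(a)))
            for a, r in zip(angles_deg, ranges_m)]

# Example: a return arriving 100 nanoseconds after the pulse is about 15 m away.
print(f"{range_from_time_of_flight(100e-9):.2f} m")
print(scan_to_points([0, 90, 180, 270], [15.0, 4.2, 8.0, 3.5]))
```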

“The primary goal of our research is to determine how current advances in robotic vision and machine learning enable our research car platform to see and make sense of everyday road signage and markings that we, as humans, take for granted,” Prof. Milford says. “Safety is an off-shoot, but that’s not the focus of this study. What’s important is understanding how AI performs and potential improvements to both the technology and physical infrastructure as the autonomous car revolution unfolds.”

On the same day he launched his new microchip, Musk said Tesla would have a million driverless cars on the road by the end of 2020. He added that the cars would be Level 5 without a geofence, or, in layman’s terms, able to travel without a human behind the wheel, anywhere and under any conditions. While Prof. Milford and his team might be more realistic about the pace of development than Musk, they’re still setting their sights high.

“Robotics and AI are ultimately about enhancing human life in some way – we’ll be on the roads day and night and in all weather conditions to be sure AI is put to the ‘real world’ test.”

