Driverless tech accelerated by robo racing cars
Johannes Betz is not your typical racing car driver.
For a start, he doesn't get in the vehicle - it's driverless. As a post-doctoral researcher, he is in charge of the Technical University of Munich's entry in the Roborace motorsport competition, now in its first competitive season.
All the competition's cars are electric and self-driving. "We started in early 2017, when my professor saw this in a newspaper," he says.
"Each month, we have to develop our software a little further, and then go to an event - yeah, like Formula 1," he laughs.
Each team - the University of Pisa and electric van start-up Arrival also compete - writes software for an identical racing car, currently the DevBot 2.0, which is capable of speeds over 200mph (322km/h).
It is guided by six cameras, two radars, 18 ultrasound sensors and five lidar (light detection and ranging) sensors. Its onboard computer can perform 24 trillion operations a second.
On 13 July it became the first racing car to set an official, fully autonomous record at the Goodwood hill climb, an important motorsport event in England's West Sussex.
The narrow, winding 1.86km (1.16-mile) track climbs 150m (492ft) through slippery bends, past hay bales and flint walls.
And DevBot navigated it in 66.96 seconds - eight seconds faster than an unofficial attempt last year.
Now "there's around twelve seconds left for the AI (artificial intelligence) to find" before it can match the best human drivers, says Bryn Balcombe, Roborace's chief strategy officer.
Of that, "six seconds we think is easily gained, and then you're starting to get into the unknowns."
But what is the point of racing driverless cars?
It's an important way to assess the quality of the sensors and cameras which autonomous vehicles (AVs) will rely on, explains Mr Balcombe.
And "testing performance limits on real roads is not something, as a member of society, I'm 100% comfortable with," he says.
He plans to introduce obstacles for the DevBots to navigate, such as slower-moving lorries and tractors. Overtaking is the hardest racetrack manoeuvre to automate, says Dr Betz.
The ultimate aim is to find out whether driverless cars can eventually "perform at a level so you can't detect it's an AI," Mr Balcombe says.
Your ability to be autonomous "is based on your ability to see the world around you", explains Glen de Vos, chief technology officer at Aptiv, a car electronics company headquartered in Dublin.
So sensors are getting most of the attention these days.
Tesla's Elon Musk insists cameras can do the job alone. Mobileye, owned by the giant chipmaker Intel, is making a camera-only AV, too.
But most automotive experts think you need plenty of other kit as well - lidar, radar, ultrasound, inter-car wireless communication and so on - in case cameras alone aren't up to the job.
Lidar is "really good at short distance, but gets really interfered with by weather", says Wael Elrifai, vice president of Hitachi Vantara, a subsidiary of the Japanese electronics multinational.
Cameras are "really good at managing shapes and colours to identify objects, and can even work at long distance", but rain and fog frustrate them, too.
And radar is "low resolution but works at long distances in bad weather, and is also good at determining the relative speed of something coming towards you", he says.
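Put together, those trade-offs are why engineers fuse several sensors rather than betting on one. As a rough illustration, here is a minimal Python sketch of confidence-weighted fusion; the sensor weights and readings are invented for the example and do not come from any real vehicle or supplier.

# A minimal, hypothetical sketch of confidence-weighted sensor fusion.
# All sensor weights and readings are illustrative assumptions.

CONFIDENCE = {
    "clear": {"camera": 0.9, "radar": 0.6, "lidar": 0.8},
    "rain":  {"camera": 0.4, "radar": 0.8, "lidar": 0.4},
    "fog":   {"camera": 0.2, "radar": 0.8, "lidar": 0.3},
}

def fuse_distance(readings, weather):
    """Combine per-sensor distance estimates (metres) into one value,
    weighting each sensor by how much it is trusted in this weather."""
    weights = CONFIDENCE[weather]
    total = sum(weights[s] for s in readings)
    return sum(readings[s] * weights[s] for s in readings) / total

# In fog the fused estimate leans towards radar, the sensor that
# copes best with bad weather in the trade-offs described above.
print(fuse_distance({"camera": 48.0, "radar": 52.0, "lidar": 47.0}, "fog"))
# -> roughly 50.2

Real systems use far more sophisticated methods, such as Kalman filters, but the principle is the same: trust each sensor in proportion to how well it performs in the current conditions.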
To be completely reliable, cameras would have to "match the ability of the human eye", argues Charles Boulanger, chief executive of Quebec lidar company LeddarTech. And cars would need "human brain" processing power to analyse all the images in real time.
But six lidars could add €15,000-20,000 (£13,700-18,300) to the price of a car today, says Mr de Vos, so this level of technology is more likely to be reserved for driverless ride-sharing, shuttle buses, automated deliveries and robo-taxis.
Once you remove the cost of the driver and can use the vehicle 24 hours a day, the sensors start to look affordable. And once commercial AVs become more common - within the next five years, some technologists predict - sensor manufacturing costs will fall.
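The affordability argument is easy to check with a back-of-the-envelope sum. The service life and driver cost below are illustrative assumptions; only the €20,000 sensor figure comes from above.

# Back-of-the-envelope sum: an expensive sensor suite versus a human
# driver for a commercial vehicle. All figures are illustrative
# assumptions except the 20,000-euro sensor cost quoted above.

sensor_suite_cost = 20_000     # euros, upper end of the lidar estimate
vehicle_life_years = 5         # assumed service life of a robo-taxi
driver_cost_per_year = 35_000  # assumed annual cost of one driver

sensor_cost_per_year = sensor_suite_cost / vehicle_life_years
print(f"Sensors: ~{sensor_cost_per_year:,.0f} euros/year; "
      f"one driver: ~{driver_cost_per_year:,.0f} euros/year")
# -> Sensors: ~4,000 euros/year; one driver: ~35,000 euros/year

And a vehicle running 24 hours a day would need several driver shifts, widening the gap further.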
"Optimising design, materials and manufacturing, this hasn't really begun in earnest," says Mr De Vos.
A big move forward in cost and performance will come from lidars on a chip, or solid-state lidars, says Raffi Mardirosian, vice president of San Francisco-based lidar maker Ouster. When a sensor system can be built on a single silicon chip, it can be manufactured far more cheaply.
You might have seen bulky lidars on top of prototype driverless cars, spinning round scanning the environment. Solid-state lidars won't have moving parts, so one solution is flashing an entire area with laser light, says Mr Boulanger.
Other sensor technologies besides lidar are also emerging, such as far-infrared heat sensing, which is useful for spotting pedestrians, says Kirsty Lloyd-Jukes, chief executive of Latent Logic, an Oxford autonomous systems spin-off.
But there is a long way to go before all these sensors are reliable and cheap enough to satisfy regulators and manufacturers, particularly when a simple sticker or bit of spray paint can fool an AI into mistaking a stop sign for a speed limit sign.
There is also the issue of how to maintain and repair all this complex sensor technology.
"We're used to yearly MOTs," says Ms Lloyd-Jukes, "but the existing ecosystem isn't going to be enough."
Insurers, repairers, and regulators will need to adapt to this new autonomous world.
On average, humans currently drive for at least eight million hours before misidentifying something that leads to an accident. AVs can manage only 10,000-30,000 hours.
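Expressed as a ratio, using those figures, the gap is stark:

# Hours driven between perception errors that lead to an accident,
# using the figures quoted above.
human_hours = 8_000_000
av_hours_low, av_hours_high = 10_000, 30_000

print(f"Humans are roughly {human_hours // av_hours_high}x to "
      f"{human_hours // av_hours_low}x more reliable by this measure.")
# -> Humans are roughly 266x to 800x more reliable by this measure.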
"When you have a number like that, it's sending big flashing signals there's a lot of work to be done," warns Hitachi Vantara's Wael Elrifai.
But robo racing cars are at least accelerating development of this autonomous future.