Autonomous driving - are we there yet? (Part 3)
In the third part of this user opinion series, Jack Creasey looks at what is missing from the current development mules.
It’s clear that huge advances in available compute capability, and the use of AI to solve the recognition needs of the driving task, are rapidly accelerating the development of driverless vehicles. However, these driving tasks are not the only requirements for a complete and safe driverless solution.
Here are some additional requirements that I think should be considered in any design for broadly available driverless vehicles providing TAAS/MAAS solutions:
- Sensor distances and FOV are restricted and need to be extended. For example, few lidars can see beyond 300 ft in degraded conditions, and cameras are subject to glare (from both sun and headlights) and are degraded by rain, snow, and dust. Since traffic conditions range from stop-and-go with restricted FOV through freeway speeds of up to 70 mph (and even higher in Europe), the perception software will be severely challenged. No one has yet demonstrated development vehicles with this breadth of capability.
- Vehicles are designed with extensive ADAS capability (ESC, ABS, etc.), and it has yet to be demonstrated that the driving-task controllers under development integrate well with these features. We already have clear indications that skipping this integration is problematic: one need look no further than Uber’s accident that caused a pedestrian fatality, where basic vehicle safety measures were turned off in order to test the automation controller. The goal here must be intimate integration of the base vehicle capability with the automation controller.
- Vehicles used as driverless platforms will need high automation everywhere. For example, all windows, doors, and trunks need to be automated; otherwise, simply leaving a door ajar disables the vehicle. Automating the doors carries its own requirements, such as how far they open (to avoid hitting another vehicle or a pedestrian) or whether they open on the non-pavement side, where other moving traffic may exist. For example, Waymo in their First Responder training advised that a first responder should leave a door open to prevent the vehicle from potentially attempting to drive away! This does not augur well for a vehicle in a crash scenario and highlights the lack of automation-controller perception.
- Seatbelts, child restraints, and seats are still required, but there is the possibility of a passenger undoing their seatbelt or opening a window inappropriately. What software will developers use to address such passenger-created problems?
- Passenger monitoring and interaction must be supported in depth. In addition to monitoring the passengers in the vehicle, it becomes necessary to provide perception (yet more AI?) to understand their actions and interactions with the vehicle. You may also need to detect and deal with a passenger in need of medical attention. Providing a touch screen in a driverless vehicle may not be enough to handle all situations.
- Interaction with pedestrians is an essential element. External displays are required not only for pedestrian interaction but also for picking up passengers (matching the passenger to the vehicle). How this will work for disadvantaged consumers such as the blind is not yet clear, and for driverless vehicles to be universal this communications issue must be resolved.
- Cities in which driverless vehicles are deployed as large fleets need a unified, city-wide navigation system. Management of navigation cannot be left to each individual service provider; large fleets of vehicles from various providers will need some form of global control. Cities allowing the deployment of driverless vehicles may have to step up and fill this need using V2X or similar technologies. Allowing the current developers to pursue a go-it-alone methodology simply postpones this inevitable requirement.
- Some form of off-vehicle backup direction is essential to cover all operational situations. The current press around human remote backup drivers seems misplaced to me, and unlikely to provide a reasonable solution for large fleets. For example, Waymo’s goal is to provide an Uber-like TAAS service but support a fallback capability over a wireless channel to a human backup driver (monitoring multiple vehicles). First, if Waymo intends to have a backup driver in a call center, isn’t this effectively an SAE Level 3 solution and not SAE Level 4? Second, what ratio of call-center backup drivers to vehicles on the road will be required to scale? A 10:1 ratio means potentially 8,000 backup drivers for their 80,000-vehicle fleet; even at 100:1, it would still require 800 backup drivers. This seems an unlikely way to scale successfully.
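Two of the points above rest on simple arithmetic, which can be made explicit. A quick sketch, using only the illustrative figures from the text (the 300 ft sensor range, 70 mph, and the hypothetical 80,000-vehicle fleet with 10:1 and 100:1 monitoring ratios; none of these are measured or published operational data):

```python
# Back-of-envelope arithmetic for the sensor-range and backup-driver points.
# All input figures are the illustrative numbers from the article.

MPH_TO_FPS = 5280 / 3600  # miles per hour -> feet per second

def perception_horizon_s(sensor_range_ft: float, speed_mph: float) -> float:
    """Seconds of warning a sensor with the given range provides at speed."""
    return sensor_range_ft / (speed_mph * MPH_TO_FPS)

def backup_drivers_needed(fleet_size: int, vehicles_per_driver: int) -> int:
    """Remote backup drivers required, rounded up so every vehicle is covered."""
    return -(-fleet_size // vehicles_per_driver)  # ceiling division

# A 300 ft lidar range at 70 mph leaves under 3 seconds of warning.
print(f"{perception_horizon_s(300, 70):.1f} s")  # ~2.9 s

# Staffing an 80,000-vehicle fleet at the two monitoring ratios discussed.
print(backup_drivers_needed(80_000, 10))   # 8000
print(backup_drivers_needed(80_000, 100))  # 800
```

The sub-3-second horizon is the crux of the sensor point: even a perfect classifier cannot act on an object it has not yet detected, and at freeway speeds today's degraded-condition ranges leave very little margin.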
So, are we there yet?
In my opinion, we are not there yet. The driverless vehicle industry must address all the requirements that make driverless vehicles safe.
Much more development is needed to produce a viable end product that will be safe under all operational scenarios. In my opinion, road accident fatalities caused by the early release of automated vehicles with inadequate driving capabilities are unacceptable. My hope is that within the next 2-3 years, the industry will step up and deliver well-integrated and managed driverless solutions that are truly consumer-safe.