Ford going further with autonomous data sharing
Technology and Business

Alice Salter
24-08-2020
In the world of driverless cars, the one thing we hear about time and time again is safety. And that comes down to testing. Even though we may take it for granted once we’ve got our licence, driving is an incredibly complex skill. For vehicles to manage the many processes involved in driving autonomously, they need to be exposed to every imaginable scenario and be programmed to deal with the outcomes. That’s precisely why some autonomous test models drive as many as 20 million miles a day in simulated environments.
On the open road, however, progress is a little slower. AV testing requires safety drivers, covers a limited distance each day and offers no guarantee of encountering the obstacles that provide useful data. So far, it’s been up to individual manufacturers to take on the burden of building up data for their own use, but now Ford has shared a new option.
After running a fleet of 2014 Ford Fusion hybrids outfitted with LIDAR, cameras and sensors around the Metro Detroit area through every season, Ford has gathered a highly detailed data set – and it is sharing it. The data answers a lot of questions about the performance, handling, decision-making and capabilities of its fleet of driverless cars – we explored the results.
Want to know more about how this data could help autonomous cars do a better job of driving in the dark than humans? Read our article "Prince of Darkness: Autonomous vehicles at night".
Why is Ford's self-driving data unusual?
In the past, driverless innovators like Waymo and Lyft have shared their data with the world, but Ford’s goes into greater depth than any before it. It documents various traffic patterns. It covers complicated freeways, built-up urban areas and even airport drop-offs. And it includes driving in all sorts of weather – something we know can be problematic for AVs.
Compared to existing performance data sets from other driverless technologies, which are quite narrow in scope (closed roads, set-piece situations and academic rather than commercial data), Ford’s offering shows how driverless cars react to ‘real world’ situations. We can now explore what happens when the sun interferes with the car’s camera or LIDAR systems, or when snow covers road markings.
Most importantly, Ford used multiple autonomous vehicle platforms to collect this data simultaneously. That means each car’s performance was captured both internally and from the outside, as the vehicles passed one another on the road. This is particularly exciting as it could open up new research avenues in the area of collaborative autonomous driving.

What can we learn from Ford's data?
Ford chose to release their collected data in order to “further spur innovation in this exciting field,” and the information they’ve shared contains much that will prove useful to engineers and researchers creating the software that dictates how AVs analyse their surroundings. It teaches us a few lessons too, the most useful being:
How weather conditions affect sensors
One of the biggest jumps in terms of new raw data is the performance of Ford’s driverless fleet in favourable and adverse weather conditions. Thanks to the length of the project, Ford were able to use Detroit’s cold winter, warm summer and wet spring and autumn to collect a huge amount of information across more weather types than we’ve seen before.
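To see why weather data matters to developers, consider a toy model of how adverse conditions degrade a sensor. The attenuation factors below are invented for illustration – they are not measured values from Ford’s data set:

```python
# Toy model: effective sensor range under different weather conditions.
# Attenuation factors are illustrative assumptions, not measured values.

NOMINAL_LIDAR_RANGE_M = 120.0

ATTENUATION = {
    "clear": 1.0,   # no degradation
    "rain": 0.8,    # light scattering from droplets
    "snow": 0.6,    # heavier scattering, obscured returns
    "fog": 0.5,     # strongest attenuation in this sketch
}

def effective_range(weather: str, nominal: float = NOMINAL_LIDAR_RANGE_M) -> float:
    """Scale the nominal sensing range by the weather's attenuation factor."""
    return nominal * ATTENUATION[weather]

for condition in ATTENUATION:
    print(f"{condition}: {effective_range(condition):.0f} m")
```

A real-world data set like Ford’s lets researchers replace invented factors like these with empirically observed sensor behaviour.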
The potential of collaborative driving
Driverless cars have limits to their ‘vision’ as sensors can’t always penetrate every area on every route. By running tests simultaneously, Ford’s data reveals how cars could work collaboratively to expand their own ‘vision’ with information shared between vehicles. This could open up new routes for multi-vehicle communication, localisation, perception and path planning.
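The idea of pooling ‘vision’ can be sketched in a few lines. In this hypothetical example (vehicle names and grid cells are invented, not drawn from Ford’s data), each vehicle observes a set of map cells, and sharing observations gives every car a wider effective field of view than it could manage alone:

```python
# Minimal sketch of collaborative perception: each vehicle observes a set
# of occupancy-grid cells; pooling observations extends overall coverage.

def pooled_coverage(observations: dict) -> set:
    """Union of all grid cells seen by any vehicle in the group."""
    seen = set()
    for cells in observations.values():
        seen |= cells
    return seen

# Two vehicles with partially overlapping fields of view (hypothetical cells)
observations = {
    "vehicle_a": {(0, 0), (0, 1), (1, 1)},  # sees the near lane
    "vehicle_b": {(1, 1), (2, 1), (2, 2)},  # sees past an occlusion ahead
}

combined = pooled_coverage(observations)
solo_best = max(len(cells) for cells in observations.values())
print(len(combined) > solo_best)  # pooled view beats any single vehicle's
```

Real multi-vehicle perception involves localisation, time synchronisation and bandwidth constraints, but the underlying principle is this union of partial views.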
Car handling in urban and commercial environments
Simple road layouts are understandably easier for AVs to navigate, but when elements like freeways, tunnels, urban neighbourhoods and airports are introduced, things quickly become more complex. Ford’s data covers each of these scenarios and includes information on construction zones and pedestrian activity too. Data on how cars performed in those situations will help researchers design algorithms robust enough to cope with dynamic environments in the future.
How 3D mapping can advance testing
This data set can now be mapped in 3D by anyone using it. Creating a virtual environment with ‘real world’ factors like cloud coverage, road reflectivity, brightness and other driving conditions that challenge humans will help developers run increasingly accurate virtual tests before testing begins on the road.
Can we expect more manufacturers to choose open source?
Though we have seen some autonomous innovators release data before, shared data on this scale from a major car manufacturer is quite revolutionary and could mark the start of a new, more collaborative, future for the industry.
By making their data freely available to all researchers, Ford have positioned themselves as thought leaders in the field. This move could be seen as a pivot towards working more like tech companies – the open source operating system Linux springs to mind – rather than traditionally secretive car OEMs. If researchers using the data share their findings in the same open way, information will flow more freely through the entire industry. This has many implications.
Should the pursuit of automated transport be a group endeavour where data is freely shared to spur on innovation, rather than one where any individual company builds up a technological advantage? It could certainly be a way to ensure faster development for the tech. But whether more developers will follow Ford’s lead remains to be seen.
Do you think more manufacturers will follow Ford and share their data? Will this speed up innovation in driverless tech? We’d love to hear your thoughts – share them with us in the comments section below.