Infographic: cars in 2025

Infographics

Gareth Watson

18-11-2015


The car of the future will be able to drive itself. To do that, it must be able to sense, memorize and communicate. In short: the car will become like a human. But that is not enough. It will need super-human qualities.

 

Picture yourself in a car on a highway in 2025. It is driving; you’re the passenger. You’re surfing the web on a tablet, not paying attention to the road. You’ve entrusted the car with your life.

 

But on what basis? You’d only let go of the steering wheel if the car could do everything you can do, and more. Therefore, the car of 2025 will be equipped with cutting-edge technology that not only emulates human capabilities but transcends them.

 

Cars of 2025

 

SEEING

Various cameras will be located around the car. There is, of course, the stereo camera at the front. It replicates binocular vision by merging two images, allowing it to perceive depth and the 3D shapes of obstacles – like pedestrians crossing the road. It can then inform the intelligent actuators in the system (brakes and steering) to avoid hitting them.
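To make the idea concrete, here is a rough, purely illustrative sketch of the geometry a stereo camera relies on: the further apart the same object appears in the two images (the disparity), the closer it is. The focal length and camera spacing below are made-up example values, not those of any real system.

```python
def depth_from_disparity(disparity_px: float,
                         focal_length_px: float = 1200.0,  # illustrative focal length in pixels
                         baseline_m: float = 0.3) -> float:  # illustrative spacing between the two cameras
    """Estimate the distance to an object seen by both cameras.

    Classic pinhole stereo relation: depth = focal_length * baseline / disparity,
    where disparity is the horizontal pixel offset of the same object
    between the left and right images.
    """
    if disparity_px <= 0:
        raise ValueError("object must be visible in both images with positive disparity")
    return focal_length_px * baseline_m / disparity_px


# A pedestrian shifted by 45 px between the two images would be about 8 m ahead
# with these example numbers.
print(f"{depth_from_disparity(45.0):.1f} m")
```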

 

Rear-view and wing mirrors will be a thing of the past. Cameras will replace them and will provide a 360° surround view. Other forward-facing mono cameras will even be able to read street signs.

 

There will be eyes not only outside the car but inside too. An infrared camera system in the cockpit will be able to track head movements to know where the driver’s attention is focused. LED strips will direct it back to the road if necessary.

 

 

HEARING

As with sight, if the car of the future is to hear, one would hope it sports better auditory capabilities than a human: the increased range of a dog, perhaps, or better still, the echolocation of a bat – another active sense. Indeed, ultrasonic sensors are already used to detect stationary objects in today’s park distance control features.
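The underlying arithmetic is pleasantly simple. As a minimal sketch (the timing value is invented for illustration), an ultrasonic park distance sensor times the echo of its own pulse and halves the round trip:

```python
SPEED_OF_SOUND_M_S = 343.0  # in air at roughly 20 °C


def echo_distance_m(time_of_flight_s: float) -> float:
    """Distance to an obstacle from the round-trip time of an ultrasonic pulse.

    The pulse travels out and back, so the one-way distance is half the
    round-trip time multiplied by the speed of sound.
    """
    return SPEED_OF_SOUND_M_S * time_of_flight_s / 2.0


# An echo returning after 8.7 ms puts the obstacle roughly 1.5 m away.
print(f"{echo_distance_m(0.0087):.2f} m")
```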

 

 

 

A SENSE OF MOVEMENT

It is often forgotten that our senses go beyond the five primary senses of seeing, hearing, touching, tasting and smelling. Besides detecting things like pain, vibration and temperature, we have a range of senses related to our movement: balance and kinesthesia (the sense of movement and position).

 

Needless to say, the car of 2025 will also possess these senses. Our sense of balance is akin to the Electronic Stability Control (ESC) system. Although not new in itself, this system, which detects and corrects for loss of traction, is fundamental to many advanced driver assistance systems. The car’s kinesthetic sense – its sense of movement – is provided by motion sensors that let it know its relative position: is it going uphill or downhill, for example?
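As a simplified, purely illustrative sketch of the idea behind ESC (real controllers are far more sophisticated, and the wheelbase and threshold below are invented numbers): compare the rotation the driver is asking for with the rotation the car actually measures, and flag a large mismatch as lost traction.

```python
import math


def intended_yaw_rate(speed_m_s: float, steering_angle_rad: float,
                      wheelbase_m: float = 2.7) -> float:
    """Yaw rate the driver is asking for, from a simple bicycle model."""
    return speed_m_s * math.tan(steering_angle_rad) / wheelbase_m


def traction_lost(speed_m_s: float, steering_angle_rad: float,
                  measured_yaw_rate_rad_s: float,
                  threshold_rad_s: float = 0.15) -> bool:
    """Flag a mismatch between intended and measured rotation.

    A real ESC controller would then brake individual wheels to pull the car
    back onto the driver's intended path; this sketch only detects the mismatch.
    """
    mismatch = intended_yaw_rate(speed_m_s, steering_angle_rad) - measured_yaw_rate_rad_s
    return abs(mismatch) > threshold_rad_s


# 25 m/s (90 km/h) with 5 degrees of steering, but the car barely rotates: understeer.
print(traction_lost(25.0, math.radians(5.0), measured_yaw_rate_rad_s=0.2))
```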

 

 

THINKING AND DECISION-MAKING

 

As with human senses, there are shortcomings. Like our eyes, cameras may be obscured by dirt or dazzled by bright sunlight. Like our touch, active sensors can tell us that something is there but not what it is. Like our hearing, ultrasonic sensors only work up to a certain distance.

 

Fusing all of this data together to paint a complete picture of the car’s environment is known as sensor fusion. But where is all this data processed? Our brain processes our sensory data, so surely the car will need a brain too – and a pretty powerful one?

 

The brain of the system will be the data processing unit – the number cruncher! The system will work on the redundancy principle: different signals are compared, and only when the data is consistent will the car act upon it. For example, a front-facing, light-based sensor (lidar) combined with a camera could tell the vehicle not only that there is something in front of it, but that it is a pedestrian and that emergency braking should be actuated immediately.
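As a toy illustration of that redundancy principle (the class names, distances and threshold are invented for the example, not taken from any production software): emergency braking is only requested when a lidar return and a camera detection agree on roughly the same range, and the camera identifies the object as a pedestrian.

```python
from dataclasses import dataclass


@dataclass
class Detection:
    sensor: str        # e.g. "lidar" or "camera"
    distance_m: float  # range to the object in the vehicle's path
    label: str         # camera classification; "unknown" for range-only sensors


def should_emergency_brake(detections: list[Detection],
                           max_range_gap_m: float = 2.0) -> bool:
    """Act only when independent sensors tell a consistent story.

    Braking is requested when a lidar return and a camera detection lie at
    roughly the same distance and the camera says the object is a pedestrian.
    """
    lidar_hits = [d for d in detections if d.sensor == "lidar"]
    camera_hits = [d for d in detections if d.sensor == "camera"]
    return any(abs(l.distance_m - c.distance_m) <= max_range_gap_m and c.label == "pedestrian"
               for l in lidar_hits for c in camera_hits)


readings = [Detection("lidar", 14.8, "unknown"),
            Detection("camera", 15.3, "pedestrian")]
print(should_emergency_brake(readings))  # True: both sensors agree
```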

 

 

MEMORIZING

 

“Ah yes, I’m definitely going the right way – I remember this from the last time.” These are words we have all muttered when driving. As humans, we move through our world and commit information to memory. Then if required, we can use those memories to validate what we see. In the future, cars will do the same.

 

However, your car will not only access its own memory. Crowd-sourced data from various vehicles will create a collective memory from which to draw information. The result: the whole world becomes familiar territory for the vehicle – thanks to cloud computing technology.

 

Digital maps are already being developed to log all relevant “road furniture”, and a secure backend connection will ensure that all data is always up to date. Dynamic mapping via a cloud connection will provide real-time updates of relevant road events for accurate route planning and navigation.
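As a small, purely hypothetical sketch of how such a collective memory might filter crowd-sourced reports (the event identifiers and threshold are made up): an observation only enters the shared map once enough different vehicles have reported it.

```python
from collections import Counter


def confirmed_events(reports: list[tuple[str, str]], min_vehicles: int = 3) -> set[str]:
    """Keep only road events reported by enough different vehicles.

    `reports` is a list of (vehicle_id, event) pairs. An event such as
    "roadworks@A9-km42" enters the shared map only once several distinct
    vehicles have observed it, which filters out one-off mistakes.
    """
    distinct_reports = set(reports)  # count each vehicle once per event
    vehicles_per_event = Counter(event for _vehicle, event in distinct_reports)
    return {event for event, count in vehicles_per_event.items() if count >= min_vehicles}


reports = [("car-1", "roadworks@A9-km42"), ("car-2", "roadworks@A9-km42"),
           ("car-3", "roadworks@A9-km42"), ("car-7", "ice@B12-km3")]
print(confirmed_events(reports))  # only the roadworks are confirmed
```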

 

 

COMMUNICATING

 

Sensors and cameras enable the car to recognize visible obstacles. But what if the danger lurks around the bend or beyond a hill? Sure, vehicles can cope with this situation by reacting, but knowing about it further in advance will make the whole process smoother. This is where communication comes in. Once cars are able to speak with each other, a car further up the road can warn cars behind of dangers they cannot yet see, and your car can react quite comfortably without you even spilling a drop of your coffee!

 

Cars in 2025 will be equipped to communicate with each other (V2V), with infrastructure (V2I) and with the driver. Multiple antennae will be integrated into a single unit and connected to the vehicle’s system architecture. This will provide a high-speed connection to the backend, the cloud and other cars.

 

When combined with motion information from the car’s sensors, this connection will allow the vehicle to send highly accurate data to other vehicles and to infrastructure. Thus, vehicles can warn each other of any road events.
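As an illustrative sketch only – real V2V messaging uses standardized message formats rather than this invented one – a warning could be a small broadcast containing the event and its position, and a following car could decide whether it is close enough to matter:

```python
import json
import math
from dataclasses import asdict, dataclass


@dataclass
class HazardWarning:
    event: str       # e.g. "obstacle-behind-bend"
    latitude: float
    longitude: float

    def to_json(self) -> str:
        """Serialise the warning for broadcast to nearby vehicles and infrastructure."""
        return json.dumps(asdict(self))


def is_relevant(warning: HazardWarning, own_lat: float, own_lon: float,
                radius_m: float = 2000.0) -> bool:
    """A following car reacts only to warnings within a couple of kilometres."""
    # Equirectangular approximation: good enough over such short distances.
    dlat = math.radians(warning.latitude - own_lat)
    dlon = math.radians(warning.longitude - own_lon) * math.cos(math.radians(own_lat))
    distance_m = 6_371_000 * math.hypot(dlat, dlon)
    return distance_m <= radius_m


msg = HazardWarning("obstacle-behind-bend", latitude=48.137, longitude=11.575)
print(msg.to_json())
print(is_relevant(msg, own_lat=48.145, own_lon=11.570))  # roughly 960 m ahead: relevant
```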

 

Let’s not forget that the car of 2025 will still have a human in the driver’s seat! It is imperative that the vehicle communicates clearly and effectively with the driver, too. A simplified human–machine interface will ensure that the driver is privy to all relevant information – at the right time, in the right place. Augmented reality windshield displays (also known as head-up displays, HUD) will superimpose digital information onto what the driver sees in the real world.

 

This will ensure that, no matter how smart the car gets, the human driver will always be in the know – and therefore ultimately remain in control.

 

 

What do you think? Would you trust “machine sense” over your own? 
