
SXSW: Another year into the intelligent future

SXSW: Becoming more than just a music festival as driverless cars take to the stage. (Photo: SXSW)

Stephan Giesler


Hello, automated driving community! Intelligent mobility a hit at SXSW, seeing around corners and driverless school runs: we bring you this week’s key stories from the world of automated driving!

I often find myself tipping my cap to the creativity of humankind when it comes to overcoming many of the challenges facing automated driving. The creative spirit pays no attention to discipline boundaries: it is as much alive in hard engineering problems as it is in music and arts. So it’s no surprise that future mobility fits right in at the SXSW (South by Southwest) Conference.

Back in 2016, Doug Newcomb, automotive tech journalist and founder of C3, wrote a piece for 2025AD.com about how connected mobility was motoring to prominence at a festival that started out as a way to showcase unsigned bands! In a Forbes article just one year later, the very same man explained why SXSW had in fact become a “can’t-miss mobility event”. That brings us to the present day and the 2018 edition, which kicked off last Friday in Austin, Texas.

Looking at the schedule for the ‘Intelligent Future’ track, I would say the selection of sessions and speakers has grown again, with a certain master marketeer kicking things off in typical fashion. That’s right, Mr Elon Musk apparently dropped in unexpectedly (really?) only to be interviewed in front of a 2750-strong crowd on the opening weekend. After spending some time on another planet (Mars, to be exact), Musk came back to earth and again updated his own autonomous timeline, stating that “by the end of next year, self-driving will encompass all modes of driving and be at least 100 to 200 percent safer than a person. We're talking 18 months from now.”

Whatever you think of the Tesla boss, he has continued to pique the public’s interest in automated driving. Statements like this sound ambitious but, on closer inspection, leave plenty of wiggle room when it comes to delivering on them. At the end of the day, there is a big difference between a test or prototype vehicle and one in serial production with genuine mass-market penetration. Also, I’m all ears as to how to quantify being “100 to 200 percent safer than a person” – maybe by the average number of accident-free miles?
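Taking the accident-free-miles idea literally for a moment, here is a minimal, purely hypothetical sketch of what such a comparison could look like. The human baseline figure and the reading of the percentages are assumptions for illustration only; this is one possible interpretation, not Tesla’s definition:

```python
# Back-of-the-envelope reading of "100 to 200 percent safer than a person",
# using miles driven per accident as the yardstick suggested above.
# The human baseline below is an assumed placeholder, not an official statistic.
HUMAN_MILES_PER_ACCIDENT = 500_000  # assumption for illustration only

def required_av_miles(percent_safer: float) -> float:
    """Miles per accident an automated vehicle would need to log
    to be `percent_safer` percent safer than the assumed human baseline."""
    return HUMAN_MILES_PER_ACCIDENT * (1 + percent_safer / 100)

for pct in (100, 200):
    print(f"{pct}% safer -> one accident every {required_av_miles(pct):,.0f} miles")
```

On that reading, “100 percent safer” would simply mean twice as many miles between accidents and “200 percent safer” three times as many – whether that is what was meant is anyone’s guess.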

One thing’s for sure: as the autonomous car gets closer to becoming part of the entertainment and cultural landscape, its prominence at SXSW is only set to increase. It says a lot about the direction mobility is headed when it is a bigger deal at tech shows (CES) and media shows (SXSW) than at the good old automotive shows.

Is imaging technology turning a corner?

“Expect the unexpected” is a mantra drilled into every learner driver to help us avoid collisions. It’s much easier to follow when you can actually see the unexpected! New work on non-line-of-sight imaging at Stanford might just make this possible someday.

A team of researchers at the Californian university have published work in the much-revered journal Nature describing how they were able to build 3-D models of objects hidden from the direct line of sight – in other words, to see around a corner. Building on Lidar technology, the team used a similar method of sending short laser pulses towards a surface and gathering information from how the photons reflect back. But they weren’t interested in the light that reflects back directly – rather, the light that is scattered by the hidden object. “We are looking for the second, and third and fourth bounces – they encode the objects that are hidden,” said Dr Matthew O’Toole, a co-author of the research.
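For the curious, here is a toy two-dimensional sketch of the underlying idea: the round-trip time of a multi-bounce pulse confines the hidden object to a circle around the illuminated spot on the wall, and intersecting the circles from several scan spots pins it down. This is a simplified back-projection illustration with made-up geometry, not the Stanford team’s actual reconstruction algorithm:

```python
import numpy as np

# Toy 2-D illustration of the multi-bounce idea behind non-line-of-sight imaging
# (a simplified back-projection with invented geometry, not the published method).
# A laser spot on a visible wall (y = 0) illuminates a hidden point; scattered
# light returns to the same spot, so each measured round-trip time confines the
# hidden point to a circle of known radius around that spot.

C = 3e8                                       # speed of light, m/s
hidden_point = np.array([0.6, 0.9])           # assumed ground-truth hidden object
spots = [np.array([x, 0.0]) for x in np.linspace(-1, 1, 9)]  # scan spots on the wall

# Simulate round-trip times for the spot -> hidden point -> spot path
times = [2 * np.linalg.norm(hidden_point - s) / C for s in spots]

# Back-project: every grid cell consistent with a measured range gets a vote
xs, ys = np.meshgrid(np.linspace(-1, 1, 201), np.linspace(0, 2, 201))
votes = np.zeros_like(xs)
for s, t in zip(spots, times):
    dist = np.sqrt((xs - s[0]) ** 2 + (ys - s[1]) ** 2)
    votes += np.abs(dist - C * t / 2) < 0.005  # within half a grid cell

iy, ix = np.unravel_index(np.argmax(votes), votes.shape)
print("estimated hidden point:", round(float(xs[iy, ix]), 2), round(float(ys[iy, ix]), 2))
```

The real measurement is, of course, far noisier than this toy setup, since only a tiny fraction of the scattered photons ever make it back to the sensor.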

A recent comment in response to our tech talk between Ralph Lauxmann and Ibro Muharemovic sprang to mind when I read this. There are two schools of thought as to where autonomous vehicles should get their information. One relies on full connectivity, where cars receive a data feed from external sources to complement their own sensor data. The other – held by this commenter and many others – insists that cars must be fully self-reliant, using only their own onboard sensors. This research adds weight to the latter view.

Mum, who’s taking us to school today?

Sedric is.

Initially unveiled at the Geneva Motor Show this time last year, the autonomous concept car from VW returned this year – in school bus yellow – to reveal its new proposed use-case: transporting kids to school without a bus driver.

But hold on. Just picture typical school bus routes. They are either urban, with all the complexity that brings, or rural, with winding roads and little in the way of infrastructure. So, will Sedric be doing the school run tomorrow? No. And anyway, it takes a whole new level of trust to send your beloved kids off to school in a driverless vehicle every morning. But that’s a topic for another day.

What VW are doing here, quite cleverly if you ask me, is pushing the boundaries and making us think. They know how difficult this use case would be when it comes to acceptance, but hey, things have to enter the discourse before they are put into practice. Consider the seed sown.

So long, drive safely (until cars are driverless), 

Stephan Giesler

Editor-in-Chief, 2025AD

