Last month’s CES featured countless autonomous pods from OEMs, suppliers and consumer electronics companies, all designed with great expectations for how passengers will eventually spend their time in them.
Judging by these concepts and initial research, we’ll spend that daily hour-long commute shopping, sleeping, working and watching movies in autonomous vehicles. This wide range of activities demands vehicles that can flexibly adapt to multiple uses.
For over a century, automakers have built, sold and delivered vehicles with a given purpose and character. While a Ford Mustang has one purpose, a Ram truck’s is wholly different. A German luxury sedan has one driving style, while a Ferrari has another.
Aside from modifications (think: “Fast ’N Loud”), these vehicles were defined on the desks of strategists, designers and engineers and then purchased to fulfill one overall need or desire. To give them personality, some consumers give their cars a name; my friend called his Toyota “Pedro.”
Besides pods, voice control also had a big presence at CES. We saw several examples of (more or less natural) dialogue between riders and their cars. Very soon, you might say, “Hey Pedro, find a parking spot at the movie theater” while en route. Over time, we expect Pedro, Alexa, Siri and others to proactively learn our needs and preferences – they will automatically book and pay for a parking spot when you reserve your movie ticket.
The vehicle itself may even adapt continually to suit the day’s events. Need a fast ride to the airport? All distractions will be muted, handling characteristics will be set to “firm” and the navigation will find the quickest route. Have some time before needing to pick up friends? Pedro will select a leisurely route, put on mood lighting and let you make calls during the journey.
When these connected vehicles become shared and fully autonomous, expect this variability to vastly increase, along with utilization rates. Rather than calling upon your personal car, you might have a fleet of vehicles available to you, regularly updating themselves to optimize for efficiency.
Once the fleet AI has learned that every Thursday at 11 a.m., autonomous pod 638 (perhaps still called Pedro) is requested by a group of individuals that require mobility assistance, it will proactively move the seats to accommodate six wheelchairs. When it picks up schoolchildren at 3 p.m., it will make 12 seats available.
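That kind of recurring-demand learning can be pictured in a few lines. The sketch below is illustrative only, with invented names (`PodScheduler`, layout strings like `"6_wheelchairs"`): it counts which cabin layout each (weekday, hour) slot has historically needed and proposes the dominant one once it has recurred often enough.

```python
from collections import Counter, defaultdict
from datetime import datetime

class PodScheduler:
    """Toy demand learner: remembers which cabin layout each
    (weekday, hour) slot needed and predicts the recurring one."""

    def __init__(self, min_observations=3):
        # (weekday, hour) -> Counter of observed layouts
        self.history = defaultdict(Counter)
        self.min_observations = min_observations

    def record_trip(self, when: datetime, layout: str):
        """Log the cabin layout a completed trip actually required."""
        self.history[(when.weekday(), when.hour)][layout] += 1

    def predicted_layout(self, when: datetime):
        """Return the dominant layout for this slot once it has
        recurred often enough, else None (no reconfiguration)."""
        counts = self.history[(when.weekday(), when.hour)]
        if not counts:
            return None
        layout, n = counts.most_common(1)[0]
        return layout if n >= self.min_observations else None

# Four Thursdays in a row, the pod carried six wheelchair users at 11 a.m. ...
sched = PodScheduler()
for week in range(4):
    sched.record_trip(datetime(2019, 2, 7 + 7 * week, 11), "6_wheelchairs")
# ...so it pre-configures the cabin for the next Thursday 11 a.m. slot.
print(sched.predicted_layout(datetime(2019, 3, 7, 11)))  # -> 6_wheelchairs
```

A real scheduler would of course weigh booking data, not just trip history, but the principle is the same: repetition becomes anticipation.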
Not just the handling, but the entire interior concept will be designed for automated variability in a shared model – no human required.
These fleets of autonomous vehicles – perhaps managed in a decentralized fashion via blockchain – will continually scan the internal and external environment for threats and opportunities and adjust their profile accordingly. Has a blackout disabled the wireless charging networks in a part of the city? Vehicles with low charge will avoid the area, while others will flock to serve riders unable to take the subway. A concert has just ended, and 3,000 visitors need rides to the main train station? Where one vehicle hits the limits of its space and variability, other vehicles may be called in from the wider mobility network.
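One toy way to sketch that rebalancing, with made-up data structures (a `fleet` of vehicle dicts, a `demand` map, a set of `degraded_zones`): serve the busiest zones first, and keep low-charge vehicles out of zones whose charging has failed.

```python
def dispatch(fleet, demand, degraded_zones, min_charge=0.5):
    """Greedy toy rebalancer: busiest zones first; a vehicle may enter
    a zone with disabled charging only if it holds enough charge.
    fleet: list of {"id", "charge"} dicts; demand: {zone: pods_needed}."""
    plan = {}
    free = list(fleet)
    for zone, needed in sorted(demand.items(), key=lambda kv: -kv[1]):
        usable = [v for v in free
                  if zone not in degraded_zones or v["charge"] >= min_charge]
        chosen = usable[:needed]
        plan[zone] = [v["id"] for v in chosen]
        for v in chosen:
            free.remove(v)  # pulled in from the wider network
    return plan

# A blackout hits zone "A": the low-charge pod p2 stays away,
# while charged pods are pulled in to cover the surge there.
fleet = [{"id": "p1", "charge": 0.9},
         {"id": "p2", "charge": 0.2},
         {"id": "p3", "charge": 0.8}]
print(dispatch(fleet, {"A": 2, "B": 1}, degraded_zones={"A"}))
# -> {'A': ['p1', 'p3'], 'B': ['p2']}
```

A production dispatcher would solve this as an optimization over travel times and energy, not a greedy pass, but the scan-and-adjust shape is the same.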
The network can incorporate data from any source. Well before we leave the house, our smartwatches may have aligned our individual biorhythm with our calendars and vehicles available in the mobility network. When you are feeling anxious and your calendar knows you have an important meeting coming up, perhaps a tranquil, single-seat pod will do. In turn, when you’re going out with friends, the social pod with drink service turns up at your door. Over time, both the vehicle and the network adjust for the optimal configuration to bring users to their destinations.
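A deliberately simple rule-based sketch of that matching, with hypothetical signals (`stress_level` from the watch, `party_size`, calendar `next_event_importance`) standing in for whatever the network would actually consume:

```python
def choose_pod(stress_level, party_size, next_event_importance):
    """Toy matcher: inputs are stand-ins for watch and calendar data.
    stress_level in [0, 1]; next_event_importance in {"low", "high"}."""
    if party_size > 1:
        return "social_pod"            # going out with friends
    if stress_level > 0.7 or next_event_importance == "high":
        return "tranquil_single_seat"  # anxious, big meeting ahead
    return "standard_pod"

print(choose_pod(stress_level=0.9, party_size=1, next_event_importance="high"))
# -> tranquil_single_seat
```

In practice these rules would themselves be learned and refined from feedback, which is exactly the adjustment loop the paragraph above describes.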
Too futuristic for 2019? Perhaps, but current product planning extends past 2030. Given the pace of change, OEMs and tech companies must organize data sets today to enable flexible integration into tomorrow’s use cases. Vehicles built today can and should be incomplete at deployment. They will self-learn new roles based on additional data and applications.
Even beyond self-learning networks is self-healing software, a process by which the vehicle proactively scans itself and its environment for up-to-date information; where it detects a fault, the software autonomously implements a solution – again, without human input. And where there is an unprogrammed opportunity – a flashmob at the coffee shop, perhaps – the network could create an algorithm to address the opportunity.
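As a rough illustration of that detect-and-repair loop, assuming invented fault names and a stubbed-out handler generator in place of real algorithm synthesis:

```python
KNOWN_FIXES = {
    "stale_map_tiles": lambda: "download_latest_tiles",
    "sensor_drift": lambda: "recalibrate_lidar",
}

def synthesize_handler(event):
    """Stub for the 'create an algorithm' step: in this sketch it just
    fabricates a named response and lets it be cached for next time."""
    return lambda: f"learned_response_to_{event}"

def heal(events):
    """Scan incoming events; apply a known fix, or synthesize one."""
    actions = []
    for event in events:
        fix = KNOWN_FIXES.get(event)
        if fix is None:
            # Unprogrammed opportunity: generate and remember a handler.
            fix = KNOWN_FIXES[event] = synthesize_handler(event)
        actions.append(fix())
    return actions

print(heal(["sensor_drift", "flashmob_at_coffee_shop"]))
# -> ['recalibrate_lidar', 'learned_response_to_flashmob_at_coffee_shop']
```

The point of the sketch is the control flow, not the fixes themselves: known faults resolve immediately, and novel events grow the fix table without human input.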
Initially, self-learning and self-healing software may serve to provide a safety net for OEMs in the context of shorter development and testing cycles. Ultimately, however, it will future-proof vehicles. Because we can’t know today what Pedro will do tomorrow.