CES used to be about electronics and gadgets, separate even from computers. Over the past 15 years or so it has absorbed other consumer technology, so much so that the Consumer Electronics Association has officially changed its name to the Consumer Technology Association – but the show is still CES. In that transition, automotive technology has also become a key part of the show. That shouldn’t be surprising, as cars pack more computing power and electronics than any other consumer product, and are the most complex of all consumer products to design and manufacture.
While cars are certainly not just smartphones on wheels – they’re much more – the smartphone experience is fundamentally changing consumers’ expectations about how they interact with their cars and their connection to the world around them. Auto manufacturers and suppliers know that, and they still want control over the driving experience, be it normal, semi-autonomous, or fully autonomous.
Human Machine Interface (HMI)
HMI, as designers and others call it, was one of the dominant themes of the automotive displays at the show. Every manufacturer demonstrated current infotainment and control systems alongside futuristic ones. On the infotainment side, the recently launched Android Auto and Apple CarPlay were present, but automakers want to offer a more customized experience.
Ford showed the previously announced Sync 3, but part of its importance is the new AppLink platform. In conjunction with partners QNX (the underlying OS for Sync 3), infotainment software developer UIEvolution, and Toyota, the new AppLink platform is now open source (called Smart Device Link, or SDL) and aims to be the third platform – alongside Apple CarPlay and Android Auto – for tying smartphone capabilities and applications into the car environment. With Toyota and Ford behind it, SDL offers developers the high volume of two full-line automakers as an incentive to adapt their apps to the platform. Ford also showed a couple of examples of tying the car into home automation, one with voice control of Wink applications from Sync 3 and another with an interface to Amazon’s Alexa to control lights and garage doors in the home.
One pervasive theme in HMI was the trend toward increasingly personalized experiences. Cars have long offered some personalization – seat and steering positions, radio presets, mirrors, and other settings depending on the model, usually tied to a key fob and sometimes to Bluetooth phone connections – but the trend is to tie this to the smartphone and deliver much more. Ford’s in-house Livio software unit is working on ways to determine, via sensors inside the car and users’ smartphones, where people are sitting in the car, offering them personalization and control over HVAC and infotainment systems in their own zones only – rear, front, and so on.
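The core idea of zone-scoped control could be sketched roughly as follows. This is purely illustrative – Livio’s actual implementation and interfaces are not public, and all names, zones, and defaults here are assumptions:

```python
# Hypothetical sketch of zone-based personalization: map each detected
# occupant (e.g., via in-car sensors and their phone) to a seating zone,
# then limit their HVAC/infotainment control to that zone only.
from dataclasses import dataclass

ZONES = {"front-left", "front-right", "rear-left", "rear-right"}

@dataclass
class Occupant:
    name: str
    zone: str
    hvac_temp_c: float = 21.0  # preference loaded from their smartphone

class CabinController:
    def __init__(self):
        self.occupants: dict[str, Occupant] = {}

    def seat(self, occupant: Occupant) -> None:
        """Register an occupant in the zone the sensors detected."""
        if occupant.zone not in ZONES:
            raise ValueError(f"unknown zone {occupant.zone}")
        self.occupants[occupant.name] = occupant

    def set_temperature(self, name: str, zone: str, temp_c: float) -> bool:
        """Apply a change only if the requester is seated in that zone."""
        occ = self.occupants.get(name)
        if occ is None or occ.zone != zone:
            return False  # outside their own zone: request is rejected
        occ.hvac_temp_c = temp_c
        return True
```

For example, a rear-left passenger could adjust the rear-left temperature, but a request aimed at the front-left zone would be refused.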
Kia’s i-Cockpit concept keyed off a smartwatch to adjust a variety of environment and infotainment features to the user’s preferences. Naturally, besides the vehicle settings, the user’s favorite in-vehicle apps would automatically load into the system and be ready to go. Another feature was a tie-in that enables payment for fuel (or electric charging) by scanning a fingerprint sensor in the car.
BMW showed off other aspects of personalization and its potential practicality. With BMW Connected and the Open Mobility Cloud tying into the user’s calendar and to-do lists, the car could make some decisions for the driver. In a future electric vehicle, the car’s knowledge of destinations based on the calendar could ensure that it gets enough charge in the garage to accomplish the day’s activities. In addition, it could optimize a route based on, for example, knowing that the driver has to go to work, pick up a child at school, and shop for groceries. Another customization touch lets the driver select a custom gesture for a function. In BMW’s demo with an i3 electric car, the driver got out and, with a selectable gesture, had the car park itself. It wasn’t clear to me if any specific gestures might be prohibited.
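The calendar-driven charging decision is essentially a small planning calculation. A minimal sketch, with illustrative numbers (consumption rate, reserve, and pack size are assumptions, not BMW specifications):

```python
# Hedged sketch of calendar-aware overnight charging: sum the energy
# needed for tomorrow's calendar trips, add a reserve, and charge the
# difference from the current state of charge.

def required_charge_kwh(trips_km: list[float],
                        consumption_kwh_per_km: float = 0.16,
                        reserve_kwh: float = 5.0) -> float:
    """Energy needed to cover all of the day's trips plus a safety reserve."""
    return sum(trips_km) * consumption_kwh_per_km + reserve_kwh

def charge_tonight(current_kwh: float, trips_km: list[float],
                   battery_kwh: float = 33.0) -> float:
    """kWh to add overnight, capped at the pack's capacity."""
    need = required_charge_kwh(trips_km)
    return max(0.0, min(need, battery_kwh) - current_kwh)
```

With trips of 30, 10, and 8 km on the calendar and 10 kWh already in the pack, the car would top up just enough to cover the day plus reserve, and would skip charging entirely if the pack already holds enough.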
Not to be outdone, VW’s Herbert Diess introduced the company’s Budd-e electric concept van, the modern successor to the iconic VW bus. It features futuristic multi-screen displays controlled by voice, gestures, and swipes on steering wheel controls and screens. Mercedes-Benz’s booth, aside from featuring excellent espresso and fancy Voss water, was themed “it’s all about me,” furthering the personalized-experience trend. The next-generation E-Class will feature a 1,920-by-720-pixel all-digital instrument panel and swipe controls on the steering wheel, bringing high-resolution visualizations and animations to the in-car experience.
To bring your work with you in the car, Microsoft and Harman demonstrated integration of Office 365 and Cortana productivity features in a custom Rinspeed concept car with multiple high-resolution dash displays. Cortana voice features would enable a driver or passenger to take their work anywhere by scheduling meetings, attending video conference calls, and accessing and responding to email while on the road. Just don’t try to redo your PowerPoint deck while driving, unless Cortana is doing it for you.
Car-to-Car and Car-to-Infrastructure Communication
Another recurring connected-car theme at CES was communication between cars, and between cars and intelligent road signs or sensors. At Ford’s SDL announcement, Amazon’s AWS talked about using the power of the cloud to take vehicle information and use it intelligently to avoid potential problems or dangerous situations. In one scenario, a variety of car data, including vehicle speed, location, and braking, is continuously analyzed in the cloud. Where visibility and perhaps traction are limited – in snow, ice, or fog – feeding back to cars what lies ahead could tell car systems to react to things their drivers or sensors can’t see yet. For example, if cars ahead are braking for an accident or obstacle on an icy, foggy road, the driver – or in an autonomous or semi-autonomous mode, the car – can be alerted to the problem ahead. Then the car can begin to react, such as braking, turning, or pulling over to an empty spot on the roadside, before it’s too late.
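The data flow in that scenario could be sketched as a simple cloud service: cars stream telemetry, the service flags a road segment once enough hard-braking events cluster there, and approaching cars get an alert. This is a toy illustration of the idea, not AWS’s or Ford’s design; the thresholds are invented:

```python
# Illustrative sketch of cloud hazard detection from car telemetry:
# count hard-braking events per road segment and alert cars that are
# approaching a segment where several cars ahead have braked hard.
from collections import defaultdict

HARD_BRAKE_MS2 = 4.0   # deceleration threshold, m/s^2 (assumption)
ALERT_THRESHOLD = 3    # hard-braking cars before a segment is flagged

class HazardService:
    def __init__(self):
        self.braking_events = defaultdict(int)  # segment id -> event count

    def ingest(self, segment: int, decel_ms2: float) -> None:
        """Record one car's telemetry sample for a road segment."""
        if decel_ms2 >= HARD_BRAKE_MS2:
            self.braking_events[segment] += 1

    def should_alert(self, approaching_segment: int) -> bool:
        """Warn a car heading toward a flagged segment."""
        return self.braking_events[approaching_segment] >= ALERT_THRESHOLD
```

A production system would of course decay old events, fuse weather and visibility data, and reason about direction of travel; the sketch shows only the core feedback loop.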
A somewhat different example of car-to-infrastructure connection – and of automakers branching out to enhance the mobility experience beyond the vehicle – is BMW’s combination LED street light and electric vehicle charging station. Planned to be marketed to cities and due to launch next year, it’s smart and connected, of course. The feature list includes WLAN connectivity, sensors that can detect vacant parking spaces, the ability to control the street light by time of day or remote control, and connections to payment apps on mobile phones for electric charging.
Panasonic and the city of Denver announced a pilot project for smart LED street lamps in the Denver International Airport area. The lights will employ HD cameras that detect pedestrian traffic to dim and brighten the lights, as well as feed traffic and parking space data to cloud back-end systems, which will provide analytics and data to enhance vehicle and mobile applications in the future.
Autonomous Driving
This is the area that generates the most attention, but in reality full autonomy is still years off, because it’s complex from both a technical and a policy perspective. Automakers and suppliers were all eager to tout their leadership in the area nonetheless. Ford talked about having 30 autonomous cars in road testing. Kia laid out its roadmap goals for semi-autonomous driving by 2020 and fully autonomous driving by 2030. Toyota announced that it is funding the Toyota Research Institute to the tune of $1 billion, with labs near MIT in Cambridge, Massachusetts, and near Stanford in Palo Alto. Part of the new labs’ mandate will be to solve the hard problems in machine learning and artificial intelligence, as Toyota claims only the easy parts of the autonomous driving puzzle are solved.
Nvidia announced the Drive PX 2, which it called the first supercomputer for cars. It features 12 CPU cores and, Nvidia says, the processing power of 150 MacBook Pros. With all the sensor and environmental information that needs to be processed in real time, the computing power needed for autonomous driving will be orders of magnitude higher than even the considerable amount in today’s most advanced models. The first automotive partner to use it will be Volvo. Nvidia’s DIGITS is a development tool for training the neural-network-based machine learning that will be required for safe autonomous driving.
The driving experience is fundamentally changing. As computers take over more driving functions, drivers will need to make fewer decisions. Car cameras, sensors, and computing power can see more and react faster than drivers – it might make us feel inferior, as if we’re losing control, but it’s happening. Perhaps we just need to embrace the new paradigm. Electric starters, power steering and brakes, automatic transmissions, and many other technological advances made driving much more comfortable, safer, and easier over the past century of motoring. Connected car technology, semi-autonomous driving aids, and ultimately fully autonomous driving will once again redefine the transportation experience. It’s just evolving into more of an information-centric experience, beyond the physical and sensory experience of controlling the machine.